
Beyond binary computing

Almost every computer in existence today is based on the binary system of 1s and 0s, and it might look like this is how things have always been. But a look at the history of computing shows that in the 60s there were computers based on our decimal system, so-called decimal computers. Binary computing won that race, however, and today many people assume that binary is the only way to build computers.
Binary computers dominate because it is very easy to build circuits that only need to distinguish two states, 1 and 0, not because there is anything special about the binary system or because nature can only function in two states.

There are many obvious advantages to building computers on the binary number system: multiplication by a power of two, for example, can be replaced with a bit shift. Other bit tricks apply to such a wide variety of computing scenarios that hardly anyone thinks it will ever be necessary to go beyond the binary system.
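As a quick illustration of that bit-shift trick (a small Python sketch, not specific to any particular machine), shifting left by k bits is the same as multiplying by 2 to the power k:

```python
def mul_pow2(x: int, k: int) -> int:
    """Multiply x by 2**k using a left shift instead of multiplication."""
    return x << k

# Shifting left by k appends k zero bits, which multiplies by 2**k.
assert mul_pow2(5, 3) == 5 * 8  # 5 * 2**3 == 40

# The reverse trick: dividing a non-negative int by 2**k is a right shift.
assert (40 >> 3) == 5
```

On real hardware a shift is a far simpler circuit than a general multiplier, which is part of why binary arithmetic units are so cheap to build.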

Even though we use the hexadecimal number system to view abstract pictures of computation, the fact remains that at the level of the circuit all we have are two states, 1 and 0, from which to build every possible computation.
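To make that concrete (a Python sketch): hexadecimal is just a compact, human-friendly view of the same underlying bits, with each hex digit standing for exactly four binary digits.

```python
value = 0b1101_0110          # one byte, written bit by bit

# The same bits rendered in different "views" for humans:
print(format(value, "08b"))  # binary view
print(format(value, "02x"))  # hexadecimal view
print(value)                 # decimal view

# One hex digit == four bits, so the views describe identical circuitry states.
assert int("d6", 16) == int("11010110", 2) == 214
```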

In the future we will have devices that operate on many more states than our current circuitry allows, which will let us perform more computation per unit of matter than is currently possible. Yet we are so attached to binary computing that we are building quantum computers around units that act like binary systems.

Even though nature is not limited to two states, the ease of building two-state systems has got us hooked, and we are barely exploring whether systems built on multiple states would give us more bang for our buck.

It is just like the QWERTY keyboard found on every computer today. QWERTY was designed to slow typists down in the typewriter era, because high-speed typing constantly jammed the typewriter keys, yet today we thoughtlessly make it the default on every system even though it is not the fastest layout available. Dvorak is much faster, but many people do not know about it, and I think that with deep learning we could analyse typing data and come up with even better layouts, if only people would take the time to learn them.

Most of computer science is built on analysing and constructing binary computers, and we have accumulated so much knowledge of these binary systems over the decades that we think nature must conform to our requirements. In truth, nature doesn't care, and offers far more opportunities than the human mind can utilize at any point in time. It is left to us to seek more out of nature rather than assume nature cares about our models.

This decade will be mostly about multicore systems: because of the fundamental difficulties we will face as we attempt to build chips with transistors below 7 nm, we will opt for more cores to keep driving growth in the processor business. Another trend we will witness over the next ten years is chips carrying many highly specialized cores.
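The multicore idea, in miniature, is splitting one job into independent chunks that separate cores can work on at once. A minimal Python sketch (illustrative, not a benchmark; the worker count is arbitrary):

```python
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(chunk: range) -> int:
    """CPU-bound work for one core: sum of squares over a range."""
    return sum(i * i for i in chunk)

def parallel_sum_of_squares(n: int, workers: int = 4) -> int:
    # Stride the range [0, n) into `workers` disjoint chunks, one per core.
    chunks = [range(k, n, workers) for k in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        # Each chunk is summed in a separate process; partial sums combine.
        return sum(pool.map(sum_of_squares, chunks))

if __name__ == "__main__":
    n = 100_000
    assert parallel_sum_of_squares(n) == sum_of_squares(range(n))
```

The hard part, of course, is that most software does not decompose this cleanly, which is why specialized cores for particular workloads are the other half of the trend.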

New computer hardware architectures will provide expansion slots that let us add more specialized cores to our systems alongside thousands of general-purpose cores. It will be possible to compile certain important software directly to hardware, using some kind of FPGA system or even evolvable hardware, then take the new chip, slot it into one of those expansion slots, and gain the speedup of doing the work in hardware rather than software.

But all this innovation will eventually stall, and beyond 2030 we will start looking seriously into utilizing nature in more powerful ways than building flip-flops that only switch between 1 and 0.

