Beyond binary computing

[Image: binary computing icon. Source: shareicon.net]
Almost every computer in existence today is based on the binary system of 1s and 0s, and it might seem that this is how it has always been. But if we look at the history of computing, we find that in the 1960s there were computers based on our decimal system, so-called decimal computers. Binary computing won the race, however, and today many people assume that binary is the only way to build computers.
The reason binary computers dominate is that it is very easy to build circuits that need only distinguish two states, 1 and 0, not that there is anything special about the binary system or that nature can only function in two states.

There are many obvious advantages to using the binary number system to build computers. For example, multiplication by a power of two can be replaced with bit shifting, and there are other bit tricks that apply to a wide variety of computing scenarios, so much so that hardly anyone thinks it will ever be necessary to go beyond the binary system.
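A minimal Python sketch of the kind of bit tricks meant here: shifts standing in for multiplication and division by powers of two, plus a classic mask test.

```python
x = 37

# Multiplication and floor division by powers of two become shifts.
assert x << 3 == x * 8    # shift left by 3 multiplies by 2**3
assert x >> 2 == x // 4   # shift right by 2 divides by 2**2

# The lowest bit tells us parity in a single AND.
assert x & 1 == 1         # 37 is odd

# A number n > 0 is a power of two iff it has exactly one bit set.
def is_power_of_two(n: int) -> bool:
    return n > 0 and n & (n - 1) == 0

assert is_power_of_two(64)
assert not is_power_of_two(96)
```

On binary hardware these operations map to single, very cheap circuit-level instructions, which is a big part of binary's practical appeal.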

Even though we use the hexadecimal number system to view compact pictures of computation, the fact remains that at the level of the circuit all we have is two states, 1 and 0, from which to build every possible computation.
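To make the point concrete, hexadecimal is only a notation layered over the bits: each hex digit is shorthand for a group of four binary digits, as this small Python snippet shows.

```python
# Each hex digit groups exactly 4 bits: 1101 -> d, 0110 -> 6, etc.
word = 0b1101_0110_1111_0000

assert word == 0xD6F0
assert hex(word) == "0xd6f0"
assert bin(word) == "0b1101011011110000"
```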

In the future we will have devices that operate on many more states than our current circuitry allows, letting us perform more computation per unit of matter than is currently possible. We are so addicted to binary computing that even our quantum computers are built around qubits, units that we read out as binary systems.

Even though nature is not limited to two states, the ease of building two-state systems has got us hooked, and we are not even exploring whether systems built on multiple states would give us more bang for our buck.
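One classical hint that multi-state systems might pay off is radix economy: if we model the hardware cost of representing a number N in base b as (states per digit) times (number of digits), base 3 comes out slightly cheaper than base 2. The cost model below is a textbook simplification, not a claim about any real device.

```python
def num_digits(n: int, base: int) -> int:
    # Count digits of n in the given base by repeated integer division
    # (exact, avoiding floating-point log rounding issues).
    d = 0
    while n:
        n //= base
        d += 1
    return d

def radix_economy(base: int, n: int) -> int:
    # Simplified hardware cost: states per digit * digits needed for n.
    return base * num_digits(n, base)

N = 10**6
for base in (2, 3, 4, 10):
    print(base, radix_economy(base, N))
```

For N = 10**6 this gives costs of 40 for base 2, 39 for base 3, 40 for base 4, and 70 for base 10: by this measure, ternary is the most economical integer base.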

It is just like the QWERTY keyboard you find on every computer today. QWERTY is widely said to have been designed to slow typists down in the days of the typewriter, because high-speed typing constantly jammed the typewriter keys, yet today we thoughtlessly make QWERTY the default on every system even though it is not the fastest layout available. Dvorak is claimed to be much faster, but many people have never heard of it, and I think that with deep learning we could analyse typing data and come up with even better layouts, if only people would take the time to learn them.

Most of computer science is devoted to analysing and building binary computers, and we have accumulated so much knowledge of these binary systems over the decades that we think nature must conform to our requirements. In truth, nature doesn't care, and it offers far more opportunities than the human mind can utilize at any point in time. It is left to us to seek more out of nature rather than assume nature cares about our models.

This decade will be mostly about multicore systems. Because of the fundamental difficulties we will face as we attempt to build chips with transistors below the 7 nm level, we will opt for multicore designs to keep driving growth in the processor business. Another trend we will witness over the next 10 years is many highly specialized cores on a single chip.

New computer hardware architectures will provide expansion slots that let us add more specialized cores alongside thousands of general-purpose ones. It will be possible to compile certain important software directly to hardware, using some kind of FPGA system or even evolvable hardware, slot the resulting chip into one of those expansion slots, and gain the speedup of doing the work in hardware rather than software.

But all this innovation will eventually stall, and beyond 2030 we will start looking seriously at utilizing nature in more powerful ways than building flip-flops that switch only between 1 and 0.
