At the edge of a cliff - Quantum Computing


Quantum computing reminds me of the early days of computer development (the 50s and 60s), when different companies came up with their own computer architectures. You had to learn one architecture, then another, and then another, as each system was better suited to some tasks than to others.
These days there are several quantum computing platforms, each offering its own approach to the field. Microsoft, IBM, Google and of course D-Wave all offer different systems.

Figure: Bristlecone, Google’s newest quantum processor (left). On the right is a cartoon of the device: each “X” represents a qubit, with nearest-neighbour connectivity. Source: Google

Many doubt that quantum computing will ever be a thing, but the truth is that it eventually will be “a thing”, and it will accelerate our efforts on certain very specialized problems whose structure can be encoded in a form ready for solving on the various quantum computing platforms.

Without cloud computing, quantum computing could not be delivered at the scale at which it is currently being delivered, which just shows how a technology invented for completely different purposes can eventually become the main delivery platform for something really new.

Without ignoring all the goodies that cloud computing gives us in and of itself, it is going to be the main supporting backbone enabling wider dissemination of quantum computing as the field expands and develops.

In the early days of computing, when machines were room-filling like current quantum computing technologies, the public didn’t have much access to them. Only “experts” in white lab coats were qualified to deal with these computers. But considering how primitive those machines were compared to the powerful mobile devices we have today, one cannot help but wonder why computers were so sacred in the early days.

The cost and delicateness of these machines were probably the main drivers behind their holy status, but as costs went down and robustness increased, they became playthings. These days it is normal for someone to drop a mobile phone on the ground about 5 times over its lifetime without much loss of functionality.
  
In these early days of quantum computing, we are back in that old world, where the machines are seen as holy devices and only the qualified priests dare approach the holy of holies. But thanks to the highly developed software infrastructure available in the world today and delivered via the cloud, ordinary people who are interested can access these machines through standard tools.

This is a good sign, because unlike classical computing, which began its early development in the 50s but only really kicked off for the public in the 80s, quantum computing will enter a high-drive exponential growth cycle from about 2020, driven by the flurry of investment and developer interest.

Some people are even predicting doubly exponential growth, but I am a bit cautious about this. One thing is clear though: we are going to witness a massive exponential growth cycle from about 2020 till 2030. By the end of it, some new non-computing field, most likely synthetic biology, will garner the most interest in the world, and quantum computing at near-instant speeds for many tasks will be taken as a given, a natural fact of life, nothing to be much excited about, just like the internet today.

One revolution powers another. If you observe the nature of technological development you can see a clear iterative structure, even though massive paradigm shifts from time to time switch the field of attention from one technology to another.

The mainframe computing revolution, lorded over by IBM and DEC, gave birth to the PC revolution. The PC revolution, lorded over by Microsoft, Apple, Dell, and HP, gave birth to the internet revolution. The internet revolution, lorded over by Google, Facebook, Amazon, etc., is preparing the world for the quantum computing revolution.

There was a mini-revolution that sparked the desire for more computation power, and it is the core driver behind many corporations’ desire to own quantum computing. Many large companies would say they want to solve energy problems, disease, climate change, and all the altruistic goals you see pledged in the promotional videos. But the true reason many companies are committing billions of dollars to the development of quantum computing is Artificial Intelligence.

With the astounding performance of non-deterministic systems like deep learning, it is clear that the winner of the AI race will have to use some kind of non-deterministic system that operates on probabilities, which is exactly what quantum computers are.
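As a toy illustration of “operating on probabilities”: a single qubit’s state is a pair of complex amplitudes, and measurement outcomes follow the squared magnitudes of those amplitudes (the Born rule). This is a minimal sketch in plain Python, not any quantum SDK:

```python
import math
import random

def hadamard(amp0, amp1):
    """Apply the Hadamard gate to a qubit state (amp0, amp1):
    maps |0> into an equal superposition of |0> and |1>."""
    s = 1 / math.sqrt(2)
    return s * (amp0 + amp1), s * (amp0 - amp1)

def measure(amp0, amp1, rng=random.random):
    """Collapse the state: return 0 with probability |amp0|^2, else 1."""
    return 0 if rng() < abs(amp0) ** 2 else 1

# Start in |0>, apply H: each measurement now yields 0 or 1 with
# probability 1/2, so repeated runs give roughly even counts.
a0, a1 = hadamard(1.0, 0.0)
counts = [0, 0]
for _ in range(10_000):
    counts[measure(a0, a1)] += 1
```

Real quantum algorithms arrange interference between amplitudes so that the useful answers end up with high probability, but the probabilistic read-out above is the essential flavor.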

Quantum computers are expected to provide the heavy lifting for AI workloads, which should give most of the companies investing in this field a great competitive advantage. This is the overarching motivation behind all the investment being poured into the development of quantum computing.

Google is very explicit about its intentions for quantum computers, appropriately branding its effort the Quantum AI lab, which I love for its sincerity of purpose. Others are still holding up the “research” placard when deep inside their quest for Quantum Supremacy is AI supremacy. With truly intelligent machines, all the other research problems become simple subproblems of the AI problem.

The next decade, from 2020 to 2030, will be one of breakneck speed in quantum innovation. The deeper reason companies are taking the quantum route, rather than hoping some novel algorithm will save them from the current AI plateau, is the unpredictability of creative innovation.

The success of deep-learning-powered modern AI has declared to the world that the way forward is scaling data and compute, not cleverness. I think this is not wholly the right approach, but big companies have shareholder expectations to meet, so they are not sitting around patiently waiting for some PhD to cough up a novel algorithm that improves deep learning by reducing compute requirements 1000x. The quality of papers being released is not even encouraging, so companies are now putting their entire will into the hope offered by scaling.

The hope is that with sufficient compute and data we will achieve Strong AI, that with quantum supremacy comes AI supremacy, and thus that the winner will take it all.

The future always has so much shock in store for us, but the sheer momentum with which the giant tech behemoths move will not give them time to rethink other strategies. Shareholders want results and CEOs have to deliver, so it’s quantum all the way.

This is actually good news, because it will lead us all the way into the next revolution beyond 2030: synthetic biology, which will require some really heavy compute, and which holds the promise of correcting most of the physical ills humanity has endured since the dawn of our appearance on this earth.
Personally, I think AI will come about from algorithmic ingenuity, much like Karatsuba disproved Kolmogorov’s conjecture by inventing a shorter algorithm for multiplying 2 numbers. Even though the consensus was that you couldn’t do multiplication in fewer than O(n²) steps, Karatsuba broke that “law” with an algorithm that operates in O(n^(log₂ 3)) ≈ O(n^1.585) steps.
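Karatsuba’s trick fits in a few lines of Python. This is a minimal illustration of the divide-and-conquer step, not production big-integer code:

```python
def karatsuba(x, y):
    """Multiply two non-negative integers using Karatsuba's scheme:
    3 recursive multiplications instead of the naive 4, giving
    O(n^log2(3)) digit operations instead of O(n^2)."""
    if x < 10 or y < 10:              # base case: single-digit operand
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    a, b = divmod(x, 10 ** half)      # split x into high/low halves
    c, d = divmod(y, 10 ** half)      # split y into high/low halves
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # (a+b)(c+d) - ac - bd == ad + bc, found with ONE extra multiplication
    mid = karatsuba(a + b, c + d) - ac - bd
    return ac * 10 ** (2 * half) + mid * 10 ** half + bd
```

The saving looks small for one step, but applied recursively it compounds into the sub-quadratic running time that overturned the consensus.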

I actually believe that a small research team or an individual, maybe not even a PhD or an academic, will soon come up with an algorithm that does all the stuff deep learning gives us and more. With the recent interest in other network topologies like Alan Turing’s B-type networks, it is clear that the search is on, and many are not just hoping that scaling alone will give us Strong AI (AGI).

I hereby advise developers to start getting their hands dirty with any available quantum computing cloud service of their choice. Learn multiple approaches from different providers, as you don’t know which will stick in the end. I am currently doing this myself because, whether we like it or not, quantum computers are coming, and denial will not help you when the field explodes the way deep learning did after 2010.

The people who got into deep learning early ate all the low-hanging fruit and have the best jobs around. Most people getting in now will just end up filling the enormous need for data scientists to deal with the massive amounts of data currently residing in large company vaults.

Although it is not a bad thing to be a data scientist in 2019, imagine being among the very first people who know how to program quantum computers, ready to fill new positions as soon as these systems enter primetime.

Will quantum computing be as pervasive as classical computing? This is like asking in 1999 whether webmail would become more pervasive than having your own personal mail client. We all know how Yahoo Mail and later Gmail took over the world. Without them, many people who depended on the early cyber cafes for internet access, myself included, would not have had email. Very few people today run personal mail client software, thanks to the reliability of webmail.

Same with cloud computing: sometime in 2003 I predicted that people would run companies from laptops. I didn’t know how this was going to happen, but it is now a reality. With some purchased compute power on Azure, Google Cloud, AWS, etc., you literally have access to all the computing power you need, not limited to what is available on your laptop (which is the new “thin” client). You can have this power without ever visiting the physical site where your compute devices actually reside.

The same goes for quantum computing: people with heavy algorithmic requirements will plug into a quantum computing service with just a few API calls, the same way you would plug into your AWS account, and gain that computing power. And “people with heavy algorithmic requirements” will soon include everyone, as more and more of our bodies become connected to computers.
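The “few API calls” pattern looks much like any other cloud job submission. The sketch below is purely illustrative: the endpoint, payload fields, and the `QuantumJobClient` class are all hypothetical, not any real provider’s API (IBM, Amazon Braket, and Azure Quantum each have their own SDKs, but the overall shape is similar):

```python
import json
import urllib.request

class QuantumJobClient:
    """Hypothetical client sketch: build a circuit job and POST it
    to a (made-up) quantum cloud endpoint, REST-style."""

    def __init__(self, endpoint, api_key):
        self.endpoint = endpoint
        self.api_key = api_key

    def build_job(self, circuit, shots=1024):
        """Package a circuit description into a job payload."""
        return {"circuit": circuit, "shots": shots, "format": "openqasm"}

    def submit(self, job):
        """POST the job; returns the provider's HTTP response."""
        req = urllib.request.Request(
            self.endpoint + "/jobs",
            data=json.dumps(job).encode(),
            headers={"Authorization": f"Bearer {self.api_key}",
                     "Content-Type": "application/json"},
        )
        return urllib.request.urlopen(req)

client = QuantumJobClient("https://quantum.example.com", "my-key")
# A Bell-state circuit in OpenQASM, queued for 1024 measurement shots.
job = client.build_job("OPENQASM 2.0; qreg q[2]; h q[0]; cx q[0],q[1];")
# client.submit(job) would send it off, just like any REST cloud API.
```

The point is that, from the developer’s side, renting quantum hardware feels no more exotic than renting a GPU instance.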

This symbiosis with computation will require computers to match the speed and complexity of the human organism, and that will indeed require quantum computing capabilities, because you cannot rent an entire supercomputer cluster just to model the nature of some protein interaction in your body after you have ingested some medication.

Personalized medicine in the future might require instant analysis of the state of your biology and rapid synthesis of molecules (drugs) to counteract and correct any developing abnormality. This will be done with quantum computers. So if you think quantum computers are not necessary, think again.

As a developer, get yourself educated about these new machines. As a software company, start thinking of ways to leverage quantum computing in your future processes, even though you will never have to purchase any equipment.

It is clear that many companies are yet to reap the full benefits of AI and machine learning, and I blame poor marketing for this. But the fact that many companies have not adopted the recent AI developments doesn’t mean that others haven’t. Others have, and they are already getting ready for the next step: quantum computing.

So get set, we are at the edge of a cliff, and by next year we might be making a massive jump towards a future powered by Quantum Computing.  
