

Showing posts from 2023

Will there be another tech boom soon? Writing from 2023

 Unless there is another Xerox PARC (Palo Alto Research Center) moment, we will not see a boom in tech for the next decade or so. I know this sounds pessimistic, but before you get your arrows out, know that I am a programmer who has never worked in big tech, so this issue affects me greatly as well.  While big tech companies, and many small tech companies following in copycat fashion, are posturing to investors by firing employees, which is a very bad sign for future growth, all this posturing will not produce a resurgence in tech, because there is a problem at the roots, the consequences of which we are observing now.  So what is the deep problem at the roots causing the current destruction of value in the tech market, ignoring the temporary green that pops up in the stocks from time to time? It is an excessive focus on value extraction and very little room given to wild creative exploration.  Sometimes too much structure can lead to restriction; tech companies, mostly the big guys, some…

Why you should write often

 Any creative activity, that is, activity that requires more than passively absorbing data, is very hard. That is because by default it requires far more energy to create something than to merely consume or, as the case may be, destroy.  Writing is a creative activity, and it is hard to do. It doesn't matter what you are writing, whether words, code, or equations. Bringing stuff out of your mind is very hard, and I believe it is the core part of what we call general intelligence, because it activates the processes that utilize knowledge rather than just passively absorbing and storing it.  You could augment memory by using a notepad to store some of the facts you record, but it is harder to augment the ability to create something new or express your solution to some problem, which is what all true creativity is about.  Writing also requires some of the highest levels of focus you can summon; for instance, writing computer programs or mathematical proofs requires the largest amount of ene…

Human intelligence is more about knowledge utilization

There are two significant phases of intelligence: the first is the knowledge acquisition phase and the second is the knowledge utilization phase. The knowledge acquisition phase is available by default in many animals, but what makes humans unique is that our ability to utilize knowledge is far higher than that of most mammals.  Even so, humans, both individually and as a whole, utilize only a very tiny fraction of the available knowledge. For an individual, your acquired knowledge base, that is, knowledge you have acquired either consciously through study or unconsciously through just existing, is far larger than what you are able to utilize.  The same holds for humanity as a whole: the knowledge we possess, whether in written form or otherwise, far outweighs our ability to utilize it.  The core of what we could call general intelligence is really more about knowledge utilization and less about acquisition, because what is useful is only what we can utilize; the rest is just memory.  For example…

Microsoft's Bing Killer feature and why people dunk on Bard

 All LLMs are equivalent. I have not done any statistical studies on the error rates, but I intuitively know that they are equivalent, meaning that they have similar rates of failure. So why is everyone dunking on Bard? Well, it's because people expected more from the search leader of the world.  But the problem with Bard is not really Google's fault; it is more about human cognitive bias, the very bias that kept Google at the top of the search engine race even as they added excessive amounts of advertising that degraded search results. The reason Google stayed on top for so long wasn't that their search engine was vastly better than the others, but that they had become a verb for anything that involves obtaining information from the internet. They also had the power of incumbency, so it did not matter how many ads people were flooded with; people just went to Google whenever they wanted to search for information, because humans are lazy and hate change. So wh…

New information interfaces: the true promise of ChatGPT, Bing, Bard, etc.

LLMs like ChatGPT are the latest and coolest innovation in town. Many people are even speculating, with high confidence, that these new tools are already generally intelligent. Well, as with every new hype cycle, from self-driving cars based on deep learning to the current "LLMs are AGI" claims, we often miss the importance of these new technologies because we are engulfed in too much hype, which gets investors hyper-interested and then makes them lose interest in the whole field of AI when the promises do not pan out. The most important thing about ChatGPT and co. is that they are showing us a new way to access information. Most of the time we are not interested in digging deep into typical list-based search engine results to get answers, and with the abuse of search results through SEO optimization and the general trend towards too many ads, finding answers online has become a laborious task.  Why people are gobbling up tools like ChatGPT is not really about AGI; it is about a new and novel way to rap…