

Showing posts from March, 2020

Is AI explainability important?

AI explainability, the art of explaining why an AI model makes the predictions it does, is all the buzz these days. Unlike traditional algorithms, we don't really understand what goes on inside AI systems: we can peer into these models as they operate, or log their actions, but in many cases we cannot explain exactly why they make the decisions they do.

The Threefold nature of Intelligence

Trefoil knot. I am constantly contemplating intelligence, not purely for the purpose of exercising my mind, but also to discover its secrets, so that I, or anyone I inspire, can create an artifact that embodies it. It is inevitable that humanity will eventually build synthetic intelligence to augment its current capacities.

Amorphous Computing

If there is a major defining feature of our current microprocessor engineering capabilities, it is that building these chips requires a great deal of transformation of matter, from a rough amorphous state with lots of impurities to a crystalline state at 99% purity.

Beyond binary computing

Almost every computer in existence today is based on the binary system of 1s and 0s, and it might look like this is how it has always been. But if we look at the history of computing, we find that in the 1960s there were computers based on our decimal system (decimal computers, as they were called). Binary computing won the race, however, and today many people think binary is the only way to go when building computers.