The informative power of Computation

Starting from simple lumped-matter abstractions like transistors, we have been able to build a huge global computing infrastructure that has improved our earthly existence in many ways. But what I want to draw our attention to here is that the process of abstraction through which we built this infrastructure can also inform us about the fundamental nature of the physical universe, and even of our own intelligence.

At the simplest possible level, computing can be seen as state transformation. When one state is transformed into another, we can say that computation has taken place. Thus, if we use the symbol 1 for the state of being ON and 0 for the state of being OFF, the transformation of the 1 state into the 0 state, and of the 0 state into the 1 state, can be seen as computation.

In actual computing hardware, a device that performs this bit flip is called a gate, and the simplest gate, the one that performs this bit-flipping operation, is the NOT gate. Many other gates exist, like AND, OR, and XOR, as well as compound gates like the NAND, which combines a NOT gate with an AND gate.

Practical computing machines are usually built mostly out of NAND gates, and from theoretical computer science we know that the NAND gate is universal, i.e. we can build any other gate using some combination of NAND gates. As we will see later, this equivalence between different structures is a very powerful thing indeed.
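The universality of NAND is easy to demonstrate in a few lines. The following is a minimal sketch of my own (not from the original post) in which every familiar gate is built purely out of NANDs:

```python
def nand(a, b):
    """The only primitive: outputs 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a):
    """NOT(a) = NAND(a, a)."""
    return nand(a, a)

def and_(a, b):
    """AND = NOT applied to NAND."""
    return not_(nand(a, b))

def or_(a, b):
    """OR(a, b) = NAND(NOT a, NOT b), by De Morgan's law."""
    return nand(not_(a), not_(b))

def xor(a, b):
    """XOR built from four NANDs."""
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))
```

Checking the truth tables of these functions confirms they match the gates they imitate, which is exactly the equivalence claimed above.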

We can build a physical gate using transistors, and combine those transistors with other circuit components to build what we call a computer. This is the first level of abstraction in the computation hierarchy.

Most of the time in computing we are not interested in just performing operations on bits; we want to do useful computation that exists at a higher level than simple gate operations. As an example, we may want to compute some arithmetic.

To compute arithmetic like adding 1 to 3, we must transform our integer representation into a binary representation, which is what the computing hardware can represent. To perform the actual addition, we also need circuitry that performs binary addition. Other basic arithmetic operations, like multiplication, have their own specialized circuits, and we can perform other simple binary operations, like bit shifting, which serve as atomic components for a whole range of other operations.
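The binary-addition circuitry mentioned above is typically a chain of full adders (a ripple-carry adder). Here is a minimal software sketch of the idea, with the bit widths and function names my own invention for illustration:

```python
def full_adder(a, b, carry_in):
    """One stage of the chain: add three bits, produce a sum bit and a carry."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add(x, y, width=4):
    """Ripple-carry addition: feed each carry into the next bit position."""
    carry = 0
    result = 0
    for i in range(width):
        a = (x >> i) & 1          # extract bit i of x
        b = (y >> i) & 1          # extract bit i of y
        s, carry = full_adder(a, b, carry)
        result |= s << i          # place the sum bit back at position i
    return result

add(1, 3)  # 0b001 + 0b011 = 0b100 = 4
```

Note how the `>>` and `<<` bit shifts, mentioned above as atomic operations, are doing the work of moving between bit positions.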

All these operations usually live in a specialized computing unit called the ALU (arithmetic logic unit), and there is usually another bunch of circuitry for synchronizing the ALU with other aspects of the machine, like memory (of course, we have to store things somewhere). Together, these make up what we call the CPU (Central Processing Unit).

From what I have described up to this point, we can clearly see two levels of abstraction emerge: the world of bits moving through the circuit, which are actually electrical signals, and the world of integers, in which we express operations like addition using the plus symbol. Don't take this view too far; there are a lot of details I am omitting, like floating point. What I am trying to build in your mind is a framework for viewing certain things.

We can perform arithmetic on integers on paper, in this case using the paper as memory for our operations, or we can transfer these operations onto some computing machinery. The latter requires that we perform a translation from one world to another: from the world of arithmetic to the world of bits and electronic gates.

This translation is only possible because of a deep equivalence between the world of arithmetic with decimal integers, which we are used to, and the world of binary, the world of the computer.
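This equivalence is so exact that the translation is mechanical in both directions. A tiny sketch using Python's built-in conversions:

```python
# The same number expressed in two "worlds":
decimal = 5
binary = format(decimal, "b")   # decimal world -> binary world: '101'
back = int(binary, 2)           # binary world -> decimal world: 5

# Nothing is lost in either direction; the two representations are equivalent.
assert back == decimal
```

The round trip losing nothing is precisely what makes it safe to think in one world while the machine operates in the other.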

What this simple two-layer abstraction system shows us is that activity in two worlds can proceed independently, yet be translated from one world to the other.

Apart from raw computation, it happens that we can also organize circuitry that does nothing but hold things; this is known as memory. A piece of memory should be addressable, i.e. we should know where it exists, and it should be able to hold its content for as long as it is needed. It also needs to be rewritable should we need it to be, much like writing on a piece of paper and erasing with an eraser.
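Those three properties (addressable, persistent, rewritable) can be captured in a toy model. This is a sketch of the abstract contract, not of any real hardware:

```python
class Memory:
    """A toy addressable store: each cell holds one value at a known address."""

    def __init__(self, size):
        self.cells = [0] * size          # persistent until overwritten

    def read(self, address):
        return self.cells[address]       # addressable: fetch by location

    def write(self, address, value):
        self.cells[address] = value      # rewritable: pencil and eraser

mem = Memory(16)
mem.write(3, 42)
mem.read(3)   # 42
mem.write(3, 7)
mem.read(3)   # 7 - the old content is erased
```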

This simple but powerful organization of memory is what extends computation from simple bit operations to the ability to build higher and higher levels of abstraction, until we reach the level of the browser I am using to write this post.

With memory, new things are possible. We can now not only execute one-off hardwired computations like arithmetic, but also build programs that execute new kinds of operations by manipulating memory according to rules.

With memory, we can create instructions that use the fixed circuitry in definite ways, and with these instructions we can build sequences of operations that orchestrate the hardware's capabilities to achieve higher-level goals than those the hardware explicitly exposes.

These native instructions that are directly executable on the computing hardware are known as machine instructions, and they are usually in binary. But we need a more expressive language that is more suitable for human minds to think in, so we build a system of mnemonics that map directly to binary operations, or to sequences of them.

This mnemonic system is known as assembly language. Although it is translated into machine language by a program (itself written in machine language) called an assembler, it exists at a layer of abstraction above the machine. We can therefore think exclusively in assembly language, without thinking of the underlying machine circuitry, and still have what we produce translated automatically into machine code by the assembler. The assembly world and the machine world are different but equivalent, because they can be translated into each other, or more precisely, they can communicate via a protocol maintained by the intermediary assembler program.
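The essence of that mnemonic-to-binary mapping fits in a few lines. Here is a toy assembler sketch; the mnemonics, opcodes, and instruction format below are invented for illustration and do not correspond to any real instruction set:

```python
# Hypothetical 8-bit instruction format: 4-bit opcode, 4-bit operand.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Translate one mnemonic line into one byte of 'machine code'."""
    mnemonic, operand = line.split()
    return (OPCODES[mnemonic] << 4) | int(operand)

program = ["LOAD 3", "ADD 5", "STORE 7"]
machine_code = [assemble(line) for line in program]
# e.g. "LOAD 3" -> 0b0001_0011 = 0x13
```

The human thinks in `LOAD` and `ADD`; the machine receives only bytes. The `assemble` function is the protocol between the two worlds.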

We do not stop at the assembly level but continue building up to the high-level language level, with languages like C. Using C we can build even higher-level languages like Python, and with Python we can build mini special-purpose languages as the need arises in our programming activities.

In fact, the world of application software is built from mini languages written in some lower-level language; these languages are the program code itself, written in some high-level language. To make this concept clear: if we write browser software in Python, we first have to define functions that perform specific tasks, like connecting to a URL, receiving data packets, etc. But the browser software is not these individual functions; rather, in the main function of the application we orchestrate all these functions, like individual language units, to create the browser. Each function thus acts like a programming language primitive.
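The browser example above can be sketched in miniature. Every function name and return value here is a stand-in invented for illustration; no real networking or rendering happens:

```python
def connect(url):
    """Primitive 1: pretend to open a connection."""
    return f"connection to {url}"

def fetch(connection):
    """Primitive 2: pretend to receive a page over the connection."""
    return f"<html>page from {connection}</html>"

def render(html):
    """Primitive 3: pretend to turn markup into displayable text."""
    return html.replace("<html>", "").replace("</html>", "")

def browse(url):
    """The 'main' function: the browser IS this orchestration of primitives."""
    return render(fetch(connect(url)))

browse("example.com")  # 'page from connection to example.com'
```

Notice that `browse` contains no low-level logic of its own; it is written entirely in the "mini language" formed by the three primitives beneath it.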

Now that we have seen this abstraction at work in computer systems, we might ask about the core topic of this post: what is the informative power of computation? The beautiful fact about computer systems is that all these different worlds (assembler, high-level languages, and application programs of all kinds) are all running on the same simple binary hardware, built out of lumped-matter abstractions like transistors.

The informative power of computation is this: from observing the way abstractions are built on top of one another in a computer system, while remaining translatable from layer to layer, we can see that the same abstraction process is at work in nature. This concept can inform our ideas about anything we are trying to build in the world, and even about the most challenging problem of our times, artificial intelligence.

In nature, ignoring the levels below the atom for the sake of clarity, we have atoms coming together as molecules. And in organic life we see these molecules come together as macromolecules, organelles, cells, tissues, organs, and systems, forming a human being, a tree, or a cat.

We can see that the abstraction hierarchy we observe in nature is equivalent to the abstraction hierarchy in computing, in the sense that simple units agglomerate into structural units, which in turn agglomerate into larger structural units, interacting as systems to create bigger systems.

When it comes to intelligence, we can see that even the brain is organized in layers, with the more primitive layers below the more recent ones, the neocortex being the most recent of all. If we look beyond the physical realization of the brain to the underlying structure, we can see that even our consciousness/intelligence could be organized in this same hierarchy, with base levels concerned with physical survival and higher levels concerned with more abstract pursuits like self-realization, which confer almost no physical advantage on the raw atomic agglomerations.
