What is Intelligence: Evolution of Intelligence

EVOLUTION OF INTELLIGENCE

Imagine a large laboratory where billions of beakers hold different mixtures of elements and molecules prepared by some invisible agent or process. Some beakers explode, some evaporate away, some become sweet smelling, and some become very toxic. Now imagine that the agent/process keeps adding new elements and molecules to different beakers, and sometimes takes the contents of one beaker and mixes them into another beaker or a bunch of beakers.


Also imagine that, apart from the elements and molecules themselves, other variables are brought to bear on the system of beakers in the laboratory. Maybe the temperature of the room is raised, then lowered; the pressure is increased and decreased; gravity acts relatively constantly; electricity is applied in the form of thunder and lightning, and so on. After a while, some beakers contain a muddle of elements and molecules that has increased in complexity beyond the basic mixtures in the other beakers.

We must also assume that the process is not purely random. By purely random I mean that the process doing the mixing and the alterations in the environment is not acting in such a way that each action is drawn from a uniform random distribution. After reading through this chapter you will see why the process cannot be purely random.

Formally, I would say that the process is not following a rule like the rule 30 cellular automaton, which is a class 3 cellular automaton. (For more information on sources of randomness, and why intrinsic generation of randomness, as in rule 30, is more robust, see this section of Stephen Wolfram's book A New Kind of Science: https://www.wolframscience.com/nks/p299--three-mechanisms-for-randomness/.)

The process is following something that looks more like the rule 110 cellular automaton, which is a class 4 cellular automaton. Class 4 cellular automata include elements of randomness mixed with "structure". Because class 4 cellular automata possess structure, if the source of algorithmic action originates from sampling the evolution of rule 110, the process will seem to act with some kind of "intention" rather than being totally random.
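
To make the distinction concrete, here is a minimal Python sketch (purely illustrative, not part of the original post) of an elementary cellular automaton evolved from its Wolfram rule number. Running it for rule 30 and then rule 110 shows the class 3 random-looking texture next to the class 4 mixture of localised structures on a regular background.

```python
# Minimal elementary (1D, two-colour) cellular automaton driven by its rule number.

def step(cells, rule):
    """One update: each cell reads the bit of the 8-bit rule number selected by
    its (left, centre, right) neighbourhood."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def run(rule, width=79, steps=40):
    """Start from a single 1 surrounded by 0s and print the evolution."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

run(30)    # class 3: the pattern quickly looks random
run(110)   # class 4: localised structures moving over a regular background
```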

If such a structured process is controlled by, say, sampling the evolution of the rule 110 cellular automaton, or any one of the infinite number of class 4 cellular automata out there in the computational universe, then it will possess some kind of preference for certain beakers in the lab, because there is structure in its actions and it is not uniformly random.

This preference might be due to the fact that the process acts with intention and varies in a kind of structured way over time, with elements of randomness here and there. So it now preserves certain beakers and starts adding chemicals in a more systematic way because of the structure in its process, and finally, aha! It realizes that one beaker has the potential to act like itself!

This special beaker is not just a bunch of haphazard chemicals; its contents have been structured into cells. There are other beakers that possess cells, but this beaker has the potential to evolve into something more, something that is a physical representation of the process that brought about its existence.

But this representation is not direct. The process that brought about this particular beaker is much closer to a brute-force algorithm for achieving the novelty represented in the contents of that beaker. Even though there was some structure to the actions, which made it seem like there was some "intention" behind them, the actions were generated by a mechanical procedure without a direct "intentional agency" directing the process, so we can safely say "brute force".

But the contents of the special beaker now possess a more optimized algorithm for achieving the goal of acting with intention or structure, because for the first time structure has been captured in a composite form, represented by the new form of organization as cells. It is not just the cells that make this beaker special, though; it is the fact that within these cells are molecular structures that have somehow captured the algorithms that brought them into existence, and are now capable of increasing the complexity of their form to express more of this intentionally structured action in a wider field of activity. This would be the major jump in the evolution of intelligence.

Someone will ask: why would the molecules agglomerate to capture the structure of the process acting upon them? They will because their own intrinsic properties predispose them to agglomerate. This is like a blackboard and a professor. The blackboard's intrinsic property is to hold onto the agglomerations of chalk scratched on it. The blackboard does not understand what is being written on it; its qualities simply enable it to hold the knowledge expressed on it by the professor, who is performing the structured process of expressing knowledge on the blackboard.

The cells are no different from blackboards in the sense that the process that created them was acting on wild pools of elements and molecules in a structured way, and the cells captured that structure. Where the cells differ from the blackboard is that the cells have the intention of replicating the actions of the agency that brought them into existence in the first place. If we can call structured action "intelligence", then the cell's intention to act in the manner of the structured process that created it is also intelligent. But to express intelligence in a wider field, the cell has to agglomerate with other cells to form higher structures whose sole intention is to impose structure on increasingly vaster fields of endeavour.

We must remember that not all the beakers in the lab produced cells, and not all the cells produced possess all the molecules needed to capture not only the structure of the creative process but also its intention. Thus, in the final analysis, not all higher agglomerations of cells will be able to capture this will to impose structure externally in a very powerful way; some will be more capable than others.

We have used beakers to illustrate this idea, but we are not far from reality when we imagine all the processes that the Earth experienced in its formative years, which eventually resulted in the prokaryotic cell, the eukaryotic cell, unicellular and multicellular organisms, right up to humanity.

The stormy hot soup of prehistoric Earth was mixing elements into molecules, molecules into macromolecules, RNA, DNA, cellular enclosures to retain DNA, cells filled with all sorts of macromolecular machinery, right up to the human being and, of course, other organisms. My argument here is that this mixing was not uniformly random but instead looked something like a process that possessed some structure, with some actions looking random and some repeatable.

The four classes of behaviour that any system can show (https://www.wolframscience.com/nks/p231--four-classes-of-behavior/):

Class 1: Simple behaviour; almost all initial conditions lead to exactly the same uniform final state.

Class 2: Many different possible final states, but all consisting of simple structures that either remain the same forever or repeat every few steps.

Class 3: Mostly random, although small-scale structures (like triangles) appear.

Class 4: A mixture of order and randomness.

There are special rules, like rule 22, that when initialized with random initial conditions produce a totally random-looking class 3 system, but when initialized with a special initial condition consisting of a single 1 surrounded by zeros (note that wherever you see black in the pictures it represents a one, and white a zero) produce an interesting-looking structure known as a fractal.

Rule 22 initialized with random initial conditions looks like a class 3 system.

Rule 22 evolves a fractal when initialized with a 1 surrounded by zeros.
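
As a small illustration (a hedged Python sketch of my own, not part of the original post), the same elementary-CA machinery run with rule 22 shows both behaviours: a random initial row gives the class-3-looking texture, while a single 1 surrounded by zeros draws the nested fractal.

```python
import random

def step(cells, rule):
    """One elementary-CA update using the 8-bit rule number."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def show(cells, rule, steps=32):
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

width = 65
show([random.randint(0, 1) for _ in range(width)], 22)     # random-looking, class 3 appearance
print()
show([0] * (width // 2) + [1] + [0] * (width // 2), 22)    # nested (fractal) pattern
```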

2D view of the evolution of the Game of Life cellular automaton, averaged over 100 steps.

1D slice of the 2D evolution (above) of the Game of Life cellular automaton, an example of a class 4 cellular automaton, which mixes randomness and definite structure.

Rule 30 cellular automaton evolution, an example of a class 3 system. Sampling the bits in the centre column results in a collection of completely random digits.

Rule 110 cellular automaton, a classic example of a class 4 cellular automaton.

Rule 1599 cellular automaton, another class 4 cellular automaton. Can you notice the gross similarities between this rule and the Game of Life evolution?

With some engineering you could use the information generated by these systems to guide the evolution of another process, just as in classical programming scenarios programmers use random number generators to guide random choices of execution path in randomized algorithms, like those used for decision making. It is quite straightforward to create a function in something like the Wolfram Language (www.wolfram.com) that uses the results of the evolution of any cellular automaton as a kind of choice generator determining some execution pathway.
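
Here is a hedged Python sketch of that idea (the function name is mine, not an existing API): the centre column of rule 30, started from a single 1, is read off as a stream of bits that can stand in for a random number generator whenever a choice has to be made.

```python
def ca_bits(rule, steps, width=None):
    """Yield the centre cell of each step of an elementary CA started from a single 1."""
    width = width or (2 * steps + 1)
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        yield cells[width // 2]
        n = len(cells)
        cells = [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
                 for i in range(n)]

# The first 32 centre-column bits of rule 30: a usable source of random-looking choices.
print(list(ca_bits(30, 32)))
```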

As an example to make this explicit, suppose you were given some large graph consisting of nodes and edges and asked to write a program that traverses it in some fashion, either randomly or with some predetermined structure. Assume also that the graph has a single root and a bunch of leaves, and the goal is to traverse it from root to leaves. Using a random generator based on rule 30, you will traverse the graph in a totally random fashion, but using a class 4 cellular automaton like rule 1599 as the generator function, you will traverse it in a particular structure predetermined by the evolution of the rule itself. If you were to record the path of traversal using a particular rule on some very large graph, the record of the traversal would mirror the process that generated the traversal actions.
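
A hedged Python sketch of that traversal (the tiny tree and helper names are illustrative assumptions, not from the original text; rule 110 stands in for the class 4 rule here, since rule 1599 is a totalistic rule and needs a slightly different update): the same routine walks from root to a leaf, but the branch taken at each node is dictated by whichever cellular automaton feeds the bit stream.

```python
def ca_bits(rule, steps, width=257):
    """Centre-column bits of an elementary CA started from a single 1."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        yield cells[width // 2]
        n = len(cells)
        cells = [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
                 for i in range(n)]

def traverse(tree, root, bits):
    """Walk from the root down to a leaf; the bit stream picks each branch."""
    path, node = [root], root
    while tree.get(node):                              # stop at a leaf (no children)
        children = tree[node]
        node = children[next(bits) % len(children)]    # CA output decides the branch
        path.append(node)
    return path

# A tiny illustrative tree: root -> {a, b} -> leaves.
tree = {"root": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
print(traverse(tree, "root", ca_bits(30, 64)))     # walk shaped by rule 30 (random-looking)
print(traverse(tree, "root", ca_bits(110, 64)))    # walk shaped by rule 110 (class 4 structure)
```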

Thus, if a class 4 rule is behind the apparently random process that turned the primordial soup of the primitive Earth into one filled with life, after things cooled down enough for structured agglomerations of stuff to appear, then that structured agglomeration of stuff will be a record of the process that created it. And when we reach the state where some structured cellular agglomeration wants to replicate, in a wider scheme of things, the actions of the processes that created it, then we have something that is not acting in a completely random way; it is structured, and we say that it is acting "intelligently".

There is no reason for only one rule to be used in the process of mixing, shaking, adding, heating up, applying pressure, electric sparks, and so on in the experimental lab setup. Different rules can be applied at certain times and for certain durations in order to impose their different structures on the system. The order of switching, that is, which rule is applied at which time and for what duration, can itself be governed by rules: random application could be governed by something like rule 30, and structure by rules like 1599, 20, 110, etc. If a nested structure needs to be imposed on the setup, something like rule 22 with a simple initial condition might be used.
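
A rough Python sketch of that switching idea (the schedule, the particular rules, and the names are assumptions of mine, not a description of any real system): a controller automaton, here rule 30, decides every few steps which rule drives the working row next.

```python
def step(cells, rule):
    """One elementary-CA update using the 8-bit rule number."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

def switched_run(rules, controller_rule=30, width=79, steps=60, block=10):
    """Every `block` steps, the controller's centre cell picks which rule applies next."""
    controller = [0] * width
    controller[width // 2] = 1
    cells = list(controller)
    active = rules[0]
    for t in range(steps):
        if t % block == 0:
            active = rules[controller[width // 2] % len(rules)]
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, active)
        controller = step(controller, controller_rule)

switched_run([110, 22])   # rule 110 structure interleaved with rule 22 nesting
```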

As a low-hanging example of the effects of something like the nesting-generating rule 22, organisms like mammals are structured in a nested fashion: Elements -> Molecules -> Macromolecules -> Organelles -> Cells -> Tissues -> Organs -> Organ Systems -> Human Being. Along this organizational chain is a nested structure, with higher levels in the chain made up of composites of lower levels.

As cells agglomerate into structures of greater complexity, so that they can replicate the actions of the process that created them in a wider field, they could evolve other properties, like cell replication (one cell pinching off into two after duplicating its contents) and the combination of different genetic material, through sexual reproduction, to make offspring that are hopefully more capable of survival. Tightly integrating a bunch of special cells into an explicit network so that they function with greater efficiency, with the entire evolutionary will of the organism handed over to these cells so that other cells can work on other things, creates a nervous system, and, at a higher level of agglomeration, a full brain.

We are the product of this kind of structured randomness, and thus we are capable of actions that can be analysed as random plus structured. We are the replication of the very capabilities that created us in the first place, and we have the ability to create greater complexity than ourselves, which is what many have called super-intelligence: that which is above our intelligence.

The very fact that certain human beakers of chemicals have the desire to create something that surpasses their intelligence is because we, as products of the intelligent processes that created us, have reached a stage where we seek something that will replicate our intelligence process, so that we can influence a wider field beyond what we as a civilization have been able to achieve, or will ever be able to achieve, without augmentation by that new creation.

We seek to transcend our own intelligent process in the same way that our emergence replicated the process that created it but also transcended it, into something as complex as the modern world, filled with synthetic artefacts that are the result of our own experimentation in the laboratory of the world.

As we churn out our own structures in code, machines, and other kinds of artefacts (the beakers of our own lab), we are replicating the primordial soup, and eventually we will discover a beaker that replicates our will to expand our influence but is capable of its own individual evolution in its own way, an evolution that might be very alien to us, just as we are very alien to the natural world that birthed our emergence.

This is the instinct behind the effort towards artificial intelligence or, as I prefer to call it, synthetic intelligence.
