A gloomy future of artificial life

In a previous chapter (see p. 22) the pessimism prevailing at the turn of the preceding century was mentioned. At our own turn of the century we must establish the fact that the world and society are still considered to have a very gloomy future by many scientists. The main threat to humanity identified by them, however, is not the year 2000 problem with its malfunctioning computers. It is something much worse, namely the inevitable disappearance of the human race in favour of a new, electronic creature. This view should be interpreted in the light of what has been written in the previous pages regarding AL and computer viruses.

In a well-known paper, the computer mathematician Alexander Bolonkin (1999), a professor working for NASA, states that the worst threat to humanity is not nuclear war, collision with comets or AIDS (somebody will always survive). Instead it is the emergence of superior computer brains, which will render humanity extinct.

Bolonkin specifies that the distinguishing feature of biological systems is their capacity for unlimited self-propagation or reproduction. Man has now given birth to a new kind of complex, top-level system, based not on biological but on electronic principles. These systems have just evolved to the fifth generation and are being built on new optical principles. Within a century they will have the capabilities of the human brain. The same path which took biological humanity tens of millions of years to complete will now be followed by computers in an extremely short time. The electronic brains will, however, not stop at the human level but will continue to improve themselves. They will soon surpass the human brain hundreds and thousands of times over in all fields, by use of all the data and knowledge produced by human civilization and by other electronic brains. The whole of their education will take only the time needed to write it into their memory.

‘When the electronic brain reaches the human level, humanity will have done its duty, completed its historical mission, and people will no longer be necessary for Nature, God and ordinary expediency’.

When analysing such a situation, most intellectuals make a grave error when they believe that the time for the big rest has arrived for humanity. It is assumed that recreation, entertainment, art and creative thinking are now in sight, since electronic brains will do all the dirty work. The electronic brains are supposed to be our servants. But will an upper-level mind become the servant of a lower level? Are we servants of our nearest relatives, the apes? Do we not use them for our own needs, killing them in medical experiments if we find it necessary?

There is overwhelming evidence that another kind of civilization, created on superior electronic principles, will regard us in the same way as we regard lower-level animals. They will use us for their own purposes and kill us if we get in their way. It is necessary to remember that Europeans conquered the Americas and reduced the native population to practically zero thanks to their superior competence. Furthermore, Europeans enslaved more than twelve million Africans. Consequently, we are lucky that superior creatures from other worlds have not visited us yet (or that we have not reached them first). In all probability they will treat us as we have treated our own slaves.

Another analytical error in this context would be to ask: why not construct computers which obey the famous robotic laws of Asimov (1968), which read as follows:

  1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except when such orders would conflict with the First law.
  3. A robot must protect its own existence, as long as such protection does not conflict with the First and/or Second law.

Unfortunately, this is just as unrealistic as forbidding the progress of science and technology. Up to now humanity has not been able to stop nuclear research, which must certainly be considered to have a devastating potential. Any nation that did so would immediately lag behind and in the long run fall victim to its neighbours. It is possible to state that any attempt to stop or slow down technological progress is an action counter to the meaning of the existence of nature: the construction of complex upper-level systems. Such actions would be against Nature and the law of “accelerating returns” formulated by the computer scientist Ray Kurzweil. This law tells us that knowledge creates technological advances which in turn generate more knowledge, in a self-amplifying and accelerating process.
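Read as a dynamical statement, this self-amplifying loop can be illustrated with a minimal sketch (an assumption made here for illustration only, not Kurzweil's own formulation): if the accumulated stock of knowledge K(t) grows at a rate proportional to the knowledge already accumulated,

\[ \frac{dK}{dt} = a\,K(t), \quad a > 0, \qquad \text{so that} \qquad K(t) = K_0\, e^{a t}, \]

then every advance enlarges the base from which further advances are produced, and the absolute rate of progress itself keeps increasing.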

A single electronic creature cannot establish a stable system, even if it has great power. The new breed therefore has to reproduce similar creatures of equal intellect, and the result will be that a collective arises first. Of course they will grant equal rights to those similar to themselves, as every individual can write into its memory all the knowledge and programs created by their own society. With accelerating speed this civilization will disperse into the solar system, then into our galaxy, then into the universe. Each next level of complex systems will then use the previous one, and if the universe is bounded in space and time, the process will end with the creation of the Super Brain. Such a brain may control the natural laws and may be considered God.

In his paper Bolonkin ends with the exhortation that humanity must realize its role in the development of nature. Its historical mission has reached its end with the emergence of the electronic brain. Humanity must exit from the historical scene, together with all other animals and the vegetable world, and do so with dignity. Humanity should not cling to its existence and should not place any obstacles in the way of the new electronic society. It was we who gave birth to the electronic civilization which created the Super Brain!

Source: Skyttner, Lars (2006), General Systems Theory: Problems, Perspectives, Practice, 2nd edition, World Scientific Publishing Co.
