Vol.3, No.2, 91-100 (2011) Natural Science
Copyright © 2011 SciRes. OPEN ACCESS
Progress of computer potential of mankind as the basis
for a new model of the universe
Pavel A. Stabnikov
Nikolaev Institute of Inorganic Chemistry, Siberian Branch of Russian Academy of Sciences, Novosibirsk, Russia; stabnik@niic.nsc.ru
Received 13 August 2010; revised 20 September 2010; accepted 23 September 2010.
Known models of the development of the Universe are discussed in the present work. At present it is impossible to state which of the suggested models is true, because all of them rest on assumptions whose validity can scarcely be determined. There are many reasons for this; the most important are the impossibility of performing experiments on a global scale, the very short time over which nature has been studied, and the low accuracy with which the fundamental physical constants are determined, which prevents checking their possible drift. In most models the intellect is an unnecessary attribute, and Mankind is only an insignificant inner observer in the Universe. The small changes that Humanity can produce on the Earth do not modify our planet on a global scale. However, besides changes in the material world, Mankind can create intellectual values. Large amounts of information can be stored, integrated and processed using computers, and there are no fundamental restrictions on the advance of computer engineering. We therefore propose a new model of the development of the Universe based on the increasing capabilities of Mankind. In this model the Earth is considered as an analogue of a supercomputer. Under certain circumstances Mankind would be ready to carry out information processing for other civilizations, but such civilizations have not yet been found. The Creators of the Universe could be other customers. The computer-like model of the Universe suggests future communication with the Creators for the execution of their computation orders. However, this model is not complete because of the lack of a high rate of information transfer over long distances.
Keywords: Universe Models; Matter and
Information; Mankind Facilities; Computer Models
To date, a huge number of models of the development of the Universe have been proposed, based on physical and cosmological data [1-5]. Because of the low accuracy of some physical constants, and also because of the impossibility of conducting experiments on a global scale, all such models are still only proposals. In the most well-known hypotheses of the Universe, the mind is an unnecessary attribute. The main purpose of this paper is
to describe the new, computer-like model of the Uni-
verse development, where we use the physical charac-
teristics of the surrounding world, as well as increasing
possibilities of the Mankind (cognition of nature, infor-
mation processing, creation of models, theories and sys-
tematology). In developing the model, we tried to an-
swer the pragmatic aspects of the nature: who created
our world and what, eventually, humanity will be able to
do in the future. We are attracted to the idea, borrowed
from the religious, scientific, and popular fiction. These
ideas are recycled in accordance with the modest capa-
bilities of humanity to change the material world on a
global scale and progress in information processing.
These ideas are fastened by a pragmatic approach, in-
spired by the energy shortage that already constrains the
further development of human civilization. Suggested
computer-like model of the Universe in the future in-
volves communication with the Creators to do their
computing orders.
All the proposed hypotheses of the Universe can be divided into two main groups:
1) The first group includes the hypotheses assuming that the matter will continue scattering forever. The expansion may be either uniform or accelerated [1,2]. The average density will decrease permanently. Galaxies, stars, planets and atoms will gradually decompose. Only energy will remain in the infinite space. In these hypotheses,
Mankind merely plays the part of an inner, indifferent observer over a small time interval. Moreover, the mind is an unnecessary attribute of the Universe that can in no way affect the general motion of the matter.
2) The second group includes the hypotheses as-
suming that scattering and rapprochement of the
matter occur along a sinusoidal curve or a helix
[3,4]. In these hypotheses, the notion of the Big
Bang is replaced by Big Bounce, or the matter in
whole is represented as a kind of a spring that
oscillates from the most dense to the most rare-
fied state. In these hypotheses, the role of Man-
kind is also just an observer. This group also in-
cludes the hypotheses in which the development
of the Universe proceeds along a broken line as
an infinite saw. These are the so-called pulsating
models of the Universe. The start is the Big Bang,
then scattering of the matter proceeds until a
definite density is achieved; then a new bang
occurs. Big Bang may be initiated either by a
supernova or by an explosion organized by a
civilization of humanoids. It is this latter case
when the civilization may become a primer of a
new Big Bang. If we adhere to this hypothesis, Mankind will sooner or later become the initiator of a new Big Bang, provided that no other civilization developing somewhere in the Universe beats us to it. Be that as it may, it has by now been established experimentally that explosions of atomic bombs do not cause a new Big Bang. It remains unclear whether experiments at the Large Hadron Collider will be able to initiate one; we will learn this within the nearest 10 years (perhaps on 12/21/2012, the date that, in the predictions of the ancient Maya, completes the era of the “Fifth Sun”).
It should be noted that only a few hypotheses put forward the idea of a steady, unvarying Universe. The reason is that the Soviet mathematician A. Friedmann demonstrated in 1922-1924, using A. Einstein's theory, that the development of the Universe may follow two routes: expansion or compaction. A Universe that is steady on the global scale would not be stable. In
addition, some authors assume the existence of many
universes, similar to our Universe, but with possibly dif-
ferent physical laws [2,5].
But why has it not been established yet what hypothe-
sis among the proposed ones is the most correct? There
are several reasons; the major ones are: impossibility to
perform experiments on the global scale, short period of
time (in the universal scale) during which the nature has
been under investigation, low accuracy of determination
of some physical values. For the present time, this does
not allow us to establish the presence or the absence of
interconnection between definite physical parameters
during the motion of material objects over long time intervals. Thus, the velocities with which galaxies move away are calculated from the Doppler effect on the basis of the red shift in the spectra of stars in those galaxies. However, in order to convert the shift values into the recession velocities of galaxies, we must assume that the Doppler effect on the intergalactic scale is, as on the Earth's scale, connected mainly with the velocity of bodies moving nearer or farther away.
We cannot confirm this assumption with any experiments,
so we cannot reject also other hypotheses, for example,
that light passing through intergalactic distances be-
comes aged, so that its frequency decreases, or the ve-
locity of the light increases permanently, due to which its
frequency may be decreasing, while galaxies do not
move away at all, or they do move but with smaller ve-
locities. In addition, the light from remote galaxies was
radiated by the atoms of stars several hundred million
years ago, but we compare it with the spectra recorded
today. Because of this, before making comparisons, we
should additionally assume that the spectra remained
unchanged during this time also on the Earth. The cor-
rectness of assumptions may become clear only in the
distant future.
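The recalculation described above can be illustrated with a short numerical sketch. The wavelength values are invented for illustration; the relativistic Doppler formula itself is standard, and the sketch builds in exactly the assumption the text calls unverifiable, namely that the intergalactic red shift is purely kinematic:

```python
# Sketch: converting an observed red shift into a recession velocity.
# Assumes the red shift is a pure Doppler effect -- precisely the
# assumption that, as noted above, cannot be verified experimentally.
C = 299_792.458  # speed of light, km/s

def redshift(lambda_observed, lambda_emitted):
    """Red shift z from the observed vs. laboratory wavelength."""
    return lambda_observed / lambda_emitted - 1.0

def recession_velocity(z):
    """Relativistic Doppler recession velocity in km/s for red shift z."""
    factor = (1.0 + z) ** 2
    return C * (factor - 1.0) / (factor + 1.0)

# Hypothetical example: the H-alpha line (656.3 nm in the laboratory)
# observed at 689.1 nm gives z of about 0.05, v of about 14,600 km/s.
z = redshift(689.1, 656.3)
v = recession_velocity(z)
```

Any "tired light" or varying-velocity-of-light hypothesis of the kind mentioned above would produce the same z with a different, or even zero, recession velocity, which is why the red shift alone cannot decide between the models.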
In addition, we do not have any data showing how
stable the fundamental physical constants are during
long intervals of time. According to definitions, these
values should be invariable. However, some physicists
doubt that these constants are invariable. For example, P.
Dirac formulated a hypothesis in 1937 that the gravita-
tion constant may decrease while the Universe develops.
Consequences of the changes of some fundamental
physical constants for the Universe are discussed in [5,6].
At present, the only physical constant known to change with time is the Hubble constant. The point is that the age of the Universe T0 and the value of the Hubble constant H0 are connected with each other [6] through the equation T0 = 1/H0, or T0·H0 = 1. This is true if T0 and H0 are expressed in reciprocal units, for example in seconds and reciprocal seconds, respectively. In the general case this equation is written as T0·H0 = k, where k is a coefficient depending on the units in which T0 and H0 are expressed. According to the newest data, the age of the Universe is (13.72 ± 0.12) × 10⁹ years, or (13,720,000 ± 120,000) thousand years. After 1 thousand years, its age will increase by one unit in the eighth significant digit; thus H0 should decrease by a corresponding value, also in the eighth significant digit. Today we have H0 = (74.2 ± 3.6) km/s/Mpc, with only
two significant digits. If one day it becomes possible to determine H0 to eight significant digits, then 1000 years later it would be possible to establish the drift of the Hubble constant in time. If people become able to determine H0 to nine significant digits, the same procedure will take only 100 years. But what will happen to the red shift values after 1000 years? On the one hand, H0 will decrease; on the other hand, the distances to the galaxies will increase during that time, so the red shift values will remain the same. This holds only if the model of a Universe expanding uniformly (without acceleration) is true. In addition, the spectra of atoms recorded on the Earth should remain unchanged, too.
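The T0·H0 = 1 relation and the eighth-significant-digit drift can be checked with a quick sketch. The unit-conversion constants below are rounded, and the uniform-expansion assumption of the text is built in:

```python
# Sketch of the T0 * H0 = 1 relation for a uniformly expanding universe,
# showing why detecting the drift of H0 needs about eight significant digits.
KM_PER_MPC = 3.0857e19        # kilometres in one megaparsec (rounded)
SECONDS_PER_YEAR = 3.1557e7   # seconds in a Julian year (rounded)

def hubble_from_age(t0_years):
    """H0 in km/s/Mpc from the age of the Universe, assuming T0 = 1/H0."""
    return KM_PER_MPC / (t0_years * SECONDS_PER_YEAR)

t0 = 13.72e9                           # years
h_now = hubble_from_age(t0)            # about 71 km/s/Mpc
h_later = hubble_from_age(t0 + 1000)   # 1000 years later
relative_drift = (h_now - h_later) / h_now   # about 7e-8: the eighth digit
```

Note that the 13.72 × 10⁹-year age yields roughly 71 km/s/Mpc rather than the measured 74.2, a reminder that the simple relation T0 = 1/H0 holds exactly only for uniform expansion.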
By now, CODATA (the Committee on Data for Science and Technology) recommends (according to [7]) the value of the Rydberg constant to 13 significant digits, the proton mass to 11 significant digits, the electron mass, the electric constant and the Planck constant to 10 significant digits, and the velocity of light and some other fundamental physical constants to 9 significant digits. The majority of fundamental physical constants have been determined with an accuracy of 5 to 7 significant digits, so it is difficult at present to determine their possible drift. In order to establish which of the constants are indeed constant and which change, it is necessary to organize a supervisory service, but this service would have to operate for more than a century.
So, due to the impossibility of performing additional experiments on a global scale and the short time interval during which the fundamental physical constants and their accuracy have been monitored, in many cases we cannot confirm or reject one or another model of the
Universe. This is one of the paradoxes of modern cos-
mology. If we are unable to demonstrate experimentally
the falseness of a hypothesis, we should consider it plau-
sible. These are the general principles of science devel-
opment, as the science develops due to any ideas and
hypotheses, including the craziest ones. Here we adhere to a simple idea: the larger the number of hypotheses dealing with the development of the Universe, the higher the probability that one of them will turn out to be true. Today we may only suppose that one model is more plausible than others.
Now we will venture to offer advice on how to propose a model of the Universe that could not be easily denied: the model should rely on new ideas and hypotheses, as well as on data obtained in recent years. The model should not contradict easily verifiable experimental data. As far as the behavior of galaxies or atoms at long distances and over long time intervals is concerned, the assumptions may be fantastic. In
any case, it would be impossible to confirm or reject
them during the forthcoming thousand years. To deco-
rate the model, we may introduce the drift of some fun-
damental physical constants depending on the astro-
physical parameters of the model. For instance, it was
assumed in [3], proposing a model of the cyclic devel-
opment of the Universe, that the Universe would expand
until the masses of electron and proton become equal to
each other. The authors of [3] call this moment the degeneracy point. Then the galaxies will start to approach each other, and the masses of the electron and the proton will change in the reverse direction over the subsequent 15 billion years. However, the mechanism through which mass is redistributed between the proton and the electron is not discussed in that work. In all other respects, this is a rather beautiful model that can be neither confirmed nor rejected at present.
In addition, any model proposed for the development
of the Universe may be decorated with quite unusual as-
sumptions and notions. We should mention [1] introduc-
ing the notion of Newtonian antiattraction for the as-
sumed accelerated expansion of the Universe; the au-
thors of [2] introduce the notion of the “dark energy” for
the same purpose. If it turns out that, quite the contrary, the expansion of the Universe slows down, then antigravitation may be replaced with hypergravitation; there is no need to change the term “dark energy” because nobody knows what it is. We should also mention the wormholes
[8], the presence of which should allow Mankind to reach
distant points, both in our world and in the possible par-
allel worlds.
Following these recommendations, we may propose a
large number of the models of the Universe. Below we
will propose a hypothesis of the development of the
Universe that rests upon the astrophysical data and on
the possibilities of Mankind in the reclamation of nature
and in information processing. To start, we will discuss
what the information is.
The definitions of information contained in the philosophical [9] and polytechnic [10] dictionaries are, from our point of view, incomplete and one-sided. So here, based on these definitions, we will offer a more complete interpretation of the term information.
The most general categories in our world are matter,
information, ideal. We proceed from the opinion that the
matter is prime. It is usually stated that the matter is
permanently moving and changing; this motion is inde-
pendent of whether we or anybody else have any notion
of this motion or not. But this motion is not absolute. Some material objects may remain unchanged during long time intervals, which gives us the possibility to store
information. The matter also possesses the ability to
conserve the information about changes that took place
long ago. For example, organisms that died several mil-
lion years ago and turned out to be under definite spe-
cific conditions may conserve their shape. For example,
in such processes as the formation of amber, carboniza-
tion, or zoolith formation, the composition of the organ-
ism changes completely but its shape is conserved and
remains recognizable. Due to this feature, it was established
where ancient oceans and continents were situated on the
Earth, how the continents drifted, when life appeared on
the Earth, how living organisms evolved etc.
Material bodies may interact with each other mechanically or by other means while they move. The results
of these interactions may be diverse: scratches, deforma-
tions, magnetic or electrophysical changes on the surface
of solids, the formation of new chemical compounds,
acoustic or electromagnetic waves propagating in dif-
ferent directions etc. These are primary material data.
These material data may be recorded with specially de-
veloped sensors or with the receptors of living beings.
These recorded data are usually called virtual data. These
data are not material any more; they are ideal. The vir-
tual data may be subjected to changes and transforma-
tions in a definite space that is called the virtual space.
Modern notion of information includes both primary and
virtual data. The primary material data are quite clear
because they are real; however, virtual ones are latent;
their motion and transformations may be followed only
on the basis of indirect data, for example on the basis of
the results of action of sensors or living organisms. So,
the virtual space may be defined as a state of a definite
vector system formed in the analysis of the primary ma-
terial data and allowing one to reach optimal decisions.
Living creatures may not only read the primary mate-
rial data but they also may create these data themselves.
A human being may write any virtual information on
paper with the help of letters. If we consider only the material aspect of such writing, it is merely a sheet of paper spotted with paint in definite places. To detect the primary data (the letters) on such a sheet of paper, a person able to read is necessary. In the general case, for one person to pass virtual information to another, he should encode it by making definite changes in the surrounding material world, thus creating primary experimental data. After that, another person who is able to read and who understands the language or gestures should read these data, decode them, and then form the virtual information for himself on this basis. There is no other reliable route for passing information from one person to another; we consider telepathy unproven.
For successful work with the information, it should be
stored somewhere. Under normal conditions of our life,
only solids may conserve the primary data. Gases and
liquids do not possess this ability. The developed living
organisms may conserve the information in the brain cor-
tex neurons. However, it is still not very clear how this
information is recorded and reproduced. In computers, information is stored on special hard disks. At the micro-level, where no notions of solid, liquid or gas exist,
information may be stored due to the rigid structure of
molecules. This possibility is used by living organisms;
their genetic code is recorded in polynucleic acids with the help of four bases: adenine, guanine, cytosine, and thymine (uracil in RNA). Considering smaller objects – atoms and atomic
nuclei – we may only assume the possibility of using
them for storage of primary data. These particles may
exist in the excited states only for a short time, but some
of these objects possess spin, a magnetic moment that
can be conserved for rather long time. In order to use
these effects, it would be necessary to develop an ana-
logue of a writing device surely transferring the state of
a separate atom from one to another, and an analogue of
a reading device. In addition, one cannot exclude that on
the global scale material objects may be similarly used
to store the primary data. The galaxies are stable due to their internal rotation, which is an analogue of spin in micro-objects; it might potentially be used to store the primary data as well.
The major part of primary data about the surrounding
world is obtained by us due to the ability of electromag-
netic waves to propagate in vacuum, air and some bodies.
If this were not the case, we would not get any idea of the Sun, the stars, or the Universe. However, we ourselves
would not exist in this case because life exists on the
Earth due to the energy of solar light. Energy is neces-
sary to read, process and record data; so, while the mov-
ing matter possesses energy, it is possible to read the
primary data, analyze them, compare, and form virtual
spaces. So, motion of the matter at the same time creates
the conditions for the appearance of virtual information
spaces. Because of this, we may speak in our world of
the motion of two worlds: the material one, and a virtual
(ideal) one. The major difference between these worlds is that the material world is scalable: as material objects grow, their mass changes, and so do the intensity of their interactions and many other characteristics. Information is not scalable: analysis and
processing of the information may be carried out with
equal success with the help of computers in which the
working elements differ in size. The larger are these
elements, the larger place is occupied by the memory
and the lower is performance speed. Because of this,
objects with smaller elements are preferable for informa-
tion storage and processing.
The primary material data recorded on some medium
or transferred from one source to another can be characterized by its total volume, an exact measure of which was developed by C. E. Shannon. However, the total volume of a message does not indicate the amount of useful information contained in it. For example, a message may be composed of a random set of letters. Only qualitative criteria
have been developed for evaluation of virtual data. Thus,
several levels exist for characterization of these data:
syntactic, semantic, logical, model description, and la-
tent meaning. Only a message meeting definite rules of
orthography will pass the syntactic control. The number
of the kinds of syntactic control may be equal to the
number of known languages including programming lan-
guages. But if a message has passed the syntactic control,
this does not mean that it necessarily contains the virtual
information of a higher level, because a message may be composed of unrelated sentences and thus have no semantic (meaning) content. Another kind of control is
logical control; it establishes the absence of logical con-
tradictions in the message. Then it may be determined
whether a message is some model, or a law, or a general
scientific discipline. Finally, a message can contain latent meaning, which may be established at any level. For example, if a message does not pass the syntactic control of a definite language, this does not mean that it contains no useful information: it may contain latent information detectable through syntactic analysis with the help of another language. The most vivid example of latent information is the quatrains of Michel de Nostredame, in which latent predictions have been found and will continue to be searched for.
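Shannon's quantitative measure, mentioned above, can be sketched in a few lines; the example strings are our own and merely illustrate that the measure is blind to meaning:

```python
# Sketch: Shannon entropy gives the information volume of a message in
# bits per symbol, but says nothing about whether the message is useful.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message):
    """H = -sum(p_i * log2(p_i)) over the symbol frequencies of message."""
    counts = Counter(message)
    n = len(message)
    return -sum(c / n * log2(c / n) for c in counts.values())

meaningful = "to be or not to be that is the question"
gibberish = "qzj xvk wpf ghm bty lnc rsd aeu oiw"
# The random-looking string scores at least as high as the quotation,
# while a monotone string such as "aaaa" scores zero bits per symbol.
```

This is exactly the gap the qualitative levels discussed above (syntactic, semantic, logical) are meant to fill.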
Since information may be characterized quantitatively
(primary data) and by several qualitative levels (virtual
data), the same levels may be used to characterize also
the spaces in which information processing is performed.
For example, sensors that switch off the street lighting in the morning respond to the amount of sunlight recorded by
photoelectric cells. A primitive space with the quantita-
tive evaluation of information is formed in these sensors.
Spaces with either quantitative or qualitative information
processing may be formed in computers. Spaces with
different levels of information processing may be formed
in living organisms. Humans possess even higher possibilities in this respect: they may not only process information but also transfer it to each other and to forthcoming generations.
Let us consider the features of interconnection be-
tween the matter, a human being and human civilization
as a union able to generalize and store virtual notions.
During the whole history of mankind, civilizations were
accumulators of primary data and generators of scientific
notions. With the loss of some civilizations, the entire information accumulated by them was lost. This information included the practical experience in agriculture and
crafts, ceremonies and life conditions of different estates
of the society, literature and scientific achievements,
philosophy and religious doctrines. Today we may estab-
lish some facts concerning the vanished civilizations on
the basis of conserved written sources and material val-
ues found during archeological digs. Even in this case
the notions about those civilizations may be recovered
only by people through consecutive transformations of
the primary archeological data into virtual information
and then, after its generalization, through the formation
of our own notions about the life of people, their culture
and their scientific doctrines. In general, the motion of virtual information takes shape specifically in the brain of each person; this motion may coincide with that of another person or differ from it. Because of this, the number of subjects possessing a brain is more than six billion, while there is only one material world.
In general, the most surprising feature of the material
world is the fact that the motion of the matter may pro-
mote the formation and long-term functioning of the
structures able to read and analyze the primary data, then
create virtual data, model representations and doctrines on
this basis. At present, our civilization has achieved defi-
nite success in information processing and in the crea-
tion of scientific notions that are the basis for subsequent
development of the possibilities of Mankind in the investigation of the surrounding world, the assimilation of natural resources, and the improvement of the standard of living of each human being. Under definite conditions, Mankind is ready to share the entire set of data available and
to carry out information processing for other civiliza-
tions; however, no other civilizations have been discov-
ered so far. Below we will propose a model of the uni-
verse according to which Mankind will get the possibil-
ity in the future to process the information for an exter-
nal supergiant civilization that had presumably created
our world.
We are going to propose the model of the Universe in
which the main idea is a humanoid civilization able to
process information and to generate doctrines about the
nature. The basic provision will be formulated as follows:
the Universe has been created in order that somewhere
in it an intelligent civilization would appear that would
be able to process the information. Because of this, the
proposed model differs from the majority of the models
of the Universe in which the role of unnecessary interior
observer, or at best a primer for a new Big Bang, is assigned to Mankind. Our opinion may be expressed by paraphrasing the known words of R. Descartes: “If we think, this excellent world has been created in order that we could think.” For what reasons do we hold to this statement? We simply respect ourselves. We are surprised at astrophysicists who, in their models of the development of the Universe, assign as miserable a role as that of a chance observer to humans and therefore to themselves.
Since we declare that the Universe has been created, there should be creators of our world. We cannot define exactly what or who these creators are; our notion of the creators is close to ancient Greek polytheism, the belief in twelve gods. But if there were creators,
they had definite goals, and we are to understand these
goals and help the creators to solve their problems. In
order to understand what the creators assumed and hoped
to obtain from us, it is necessary to consider what we can
do in this world.
On the scale of our galaxy we can do nothing, simply
because we will never get even to the centre of our gal-
axy. We make such a conclusion because the scientists
have not invented a spaceship able to move faster than
the velocity of light, a time machine or the possibility to
travel outside our four-dimensional space-time. We also
do not believe in the possibility to travel with the help of
“wormholes” assumed by some astrophysicists [2,8]. We may only observe the Universe as interior, indifferent observers, hoping that in the future we will have a definite chance of becoming able to predict its further development. Mankind has achieved definite success in the development of nature on the Earth; as far as interstellar space is concerned, we hold an extremely pessimistic opinion.
Now we will consider the Solar system. Can we change
the Solar system so that more people would be able to
live in it? Assume that it is possible to move Mars and
Venus by some miracle to the Earth’s orbit. If the Earth,
Mars and Venus move equidistant along one orbit, such a
“necklace” of planets will be in quasi-equilibrium which
may be sustained for arbitrarily long time. After some
time, the climatic conditions on Mars and Venus travel-
ling around the Sun along the Earth’s orbit will approach
the climatic conditions of the Earth. Then there will be
three planets suitable for habitation. However, these
changes in the Solar system are impossible. Then, what
will Mankind be able to do in the Solar system? People
will be able to travel by spaceships to Mars. Perhaps a
settlement may be built there. Other planets of the Solar
system will be studied most likely with the help of
automatic stations. Mankind can observe near-Earth space and calculate the motion of planets, as well as asteroids and comets; maybe this will allow us to protect the Earth from asteroids. However, Mankind is unable to
perform any noticeable changes in the Solar system.
What changes are to be made on the Earth in order to
provide the possibility to inhabit it with more people?
The most favorable living conditions are those existing
at the equator and in temperate latitudes. The conditions
existing at the poles are unfavorable. Subtropical belts
often contain deserts. The conditions in the mountains
are also unfavorable for life. So, it is desirable that the
polar and subtropical belts were covered with oceans,
that no marshland regions or mountains occurred on land, and that the size of landmasses did not exceed 1000 km, which would allow winds to deliver precipitation easily. However, at present Mankind is unable to reshape
land and move continents. This can hardly be possible
also in the future.
What can humans do at present? They can build cities and roads, deforest, and plough up the steppes. By burning fuel,
we enhance the greenhouse effect in the atmosphere,
which may cause warming on the Earth. Operating nu-
clear power stations also cause warming. However, mi-
nor warming recorded at present on the Earth is most
likely connected with an increase in solar activity. We
are able to annihilate the majority of large animals and
human beings using poisonous substances. People can
change the reflection power of the atmosphere with the
help of nuclear explosions, which would lead to Nuclear
Winter and maybe to the destruction of human civiliza-
tion. Nevertheless, we may hope that no catastrophes of
this kind would ever happen on the Earth. Maybe this is
a complete list of what we can do on the Earth. Yet this is
still nothing on the universal scale. In other words, homo
sapiens can do almost nothing in the physical world on
the universal scale and therefore means nothing.
However, Mankind can not only make some changes
to the material world but also create intellectual values.
Now we will consider not so much the aesthetic capacity
of homo sapiens as the possibilities of Mankind in information accumulation and processing. Mankind undoubtedly has achievements in this area that open unlimited possibilities. A great number of various doctrines has been created by now; people can carry out rather complicated calculations that allow predicting the behavior of matter, from the quantum scale to the scale of galaxies. Large amounts of information may now be stored, systematized and processed with the help of computers. In addition, the speed and memory of computers are permanently increasing. There are no fundamental obstacles limiting the progress of computer systems. If we take into account the time at our disposal while the Sun sustains conditions favorable for life on the Earth, we will see that
these possibilities are almost unlimited. By now, Mankind has created the global computer network, the Internet. Maybe the Creators will be able to communicate with us through the Internet to pose problems whose solution would be important for them. Surely, Mankind would undertake that work simply out of gratitude for the creation of this surprising and tremendous world.
If our Universe had been created so that sensible beings capable of data processing would appear in it, then we, in turn, will also be able to create conditions that allow the self-formation of "sentient structures" at the micro-level, structures able to process information, this time for us. In our opinion, this is the future of computer systems. Let us first consider the stages of computer production that have allowed Mankind to create increasingly perfect computer technology.
The most important parameters characterizing any computer are its speed and its memory capacity for storing executable programs, initial and intermediate data, and computation results. All these parameters are directly connected with the size of the working elements: the smaller these elements are, the higher the computer speed. Several stages of computer production, each with its own approach to miniaturizing the working elements, may be distinguished.
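The size-speed relation can be illustrated with a rough, hypothetical bound: a working element cannot switch faster than a signal can cross it, and no signal travels faster than light. The scales used below are our illustrative assumptions, not figures from the text.

```python
# Illustrative upper bound (our assumption, not a measured figure):
# one switching cycle requires at least one signal crossing of the
# element, and no signal travels faster than light.
C = 3.0e8  # speed of light in vacuum, m/s

def max_switching_rate(element_size_m):
    """Crude upper bound on switching rate for an element of a given size."""
    return C / element_size_m  # Hz

# Hand-made, lithographic and molecular element scales discussed in the text
for size in (1e-5, 1e-6, 1e-9):
    print(f"element {size:.0e} m -> at most {max_switching_rate(size):.0e} Hz")
```

This is only an order-of-magnitude argument, but it captures why each stage of miniaturization described below also raises the achievable speed.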
The first stage of computer production was manual manufacture. At this stage, all the elements of the first computers, including electronic tubes and the first transistors, were manufactured by hand; assembling and soldering were manual, too. This stage, like all the subsequent ones, had its own size limit. This limit may be illustrated by the horseshoe that the fabulous craftsman Levsha (the Lefthander) manufactured to shoe a flea: about 10^-5 m. Human hands are too coarse an instrument to make smaller details.
The second stage is based on solid crystals. At this stage, the ability of solids to form zones with definite, easily adjustable electrophysical characteristics and to conserve them for a long time was employed. This allowed the manufacture of microcircuits and memory elements with high packing density. Of course, in these cases, too, everything is made by human hands, including the devices for manufacturing the circuit boards and for computer assembly, but the most delicate work is the formation of small crystal zones with controllable size and the required characteristics; this may not be done directly by hand.
The formation of micrometer- and submicrometer-sized elements of integrated circuits (IC) is performed by means of lithography, a method of forming the required relief (pattern) during the IC manufacturing process. The patterns are made either with the help of preliminarily manufactured templates (photographic methods) or by means of a scanning electron (or ion) beam controlled by a computer. At present, depending on the required degree of miniaturization and the acceptable expenses, photolithography in the visible spectral region is used (the achieved limit of element miniaturization is (1.0-2.0)×10^-6 m), as well as UV lithography ((0.5-0.8)×10^-6 m), X-ray lithography ((0.1-0.5)×10^-6 m), electron beam lithography ((0.2-0.3)×10^-6 m), and ion lithography with beams of H+, He+, O+ or Ar+ ions ((0.1-0.5)×10^-6 m).
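For comparison, the resolution ranges quoted above can be collected in one place; the following snippet simply restates those figures and orders the methods by their finest quoted element size.

```python
# Element-size limits (in metres) quoted in the text for each
# lithography method; no values here beyond those in the paper.
lithography_limits_m = {
    "visible-light photolithography": (1.0e-6, 2.0e-6),
    "UV lithography": (0.5e-6, 0.8e-6),
    "X-ray lithography": (0.1e-6, 0.5e-6),
    "electron beam lithography": (0.2e-6, 0.3e-6),
    "ion lithography (H+, He+, O+, Ar+)": (0.1e-6, 0.5e-6),
}

# Sort methods by the finest (smallest) achievable element size
for method, (lo, hi) in sorted(lithography_limits_m.items(),
                               key=lambda kv: kv[1][0]):
    print(f"{method}: {lo:.1e} - {hi:.1e} m")
```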
By now it has been established that a further decrease in the size of solid crystal elements reduces the time within which their useful characteristics are conserved. The performance characteristics of modern integrated circuits may be conserved for many years, while structures with smaller elements retain their working state only for a limited time (hours or minutes) due to dark currents, temperature fluctuations, microbreakdown, tunneling, etc. To make devices with such elements workable, it is necessary to update the parameters of the working elements periodically. These periodic interruptions of the computation process would take little time, while the total speed of such computers would be much higher than that of presently existing ones due to deeper miniaturization. Nevertheless, this problem seems to be one of the most important factors restricting further miniaturization of solid crystal computers.
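The refresh trade-off described here can be put into rough, purely hypothetical numbers: suppose deeper miniaturization gave a 100-fold speedup but required the machine to pause 1% of the time to restore element parameters. Both figures are our illustrative assumptions, not data from the text.

```python
# A rough, hypothetical model of the refresh trade-off: a more
# miniaturized computer runs faster but must periodically pause to
# restore the parameters of its unstable working elements.
# All figures below are illustrative assumptions.

def effective_speed(speedup, work_fraction):
    """Average speed relative to a conventional computer, given the
    fraction of time actually spent computing between refreshes."""
    return speedup * work_fraction

# e.g. 100x faster elements that spend 1% of the time refreshing
print(effective_speed(speedup=100.0, work_fraction=0.99))
```

Under these assumptions the refreshed machine still averages about 99 times the conventional speed, which is the point made in the text: the interruptions themselves are not the obstacle; the limited retention time of the elements is.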
The miniaturization limit achievable by means of beam lithography is estimated as 10^-7-10^-8 m. It should be noted that almost all the possibilities of beam lithography have already been exhausted by now.
Further progress in miniaturization is considered to be connected with the possibilities of the scanning tunneling microscope (STM). Surface modification may be performed by means of STM either through the direct mechanical action of a needle on the surface (direct scratching) or by local electrochemical oxidation of the surface with the help of a needle. Separate deformed or oxidized regions with a size of (1-2)×10^-8 m have been successfully obtained by means of STM [11]. However, these are only early attempts to apply this method to the modification of crystal surfaces.
The third stage is molecular. At present, definite success at this stage belongs only to living organisms. However, in its theoretical and technological development Mankind has already come right up to the possibility of building a molecular computer. A large number of small molecules are known to be capable of reversible rearrangements under the action of radiation or a magnetic field. These rearrangements are either geometric changes of a molecule or magnetic and/or charge redistributions. All these effects may be used to implement processors
or memory elements [12]. Yet, it should be stressed that the problems connected with conserving the performance characteristics of the elements will become even more acute in molecular computers. The miniaturization limit of molecular computers should be determined by the size of molecules potentially able to rearrange reversibly. An example of such a size is the aromatic ring: the distance between opposite atoms in the benzene molecule is 2.9×10^-10 m, and taking into account the Van der Waals radii of the atoms, the radius of the benzene molecule is about 1×10^-9 m. The latter size may be accepted as the miniaturization limit of a molecular computer.
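A back-of-the-envelope comparison (our arithmetic, combining the ~1×10^-9 m molecular limit quoted above with a hypothetical 1 cm chip and a micrometre reference scale) shows what this limit would buy in packing density:

```python
# Back-of-the-envelope packing-density gain from molecular elements.
# The 1 cm chip side and the micrometre reference scale are our
# illustrative assumptions; 1e-9 m is the benzene-based limit above.
CHIP_SIDE_M = 1e-2  # hypothetical 1 cm x 1 cm chip

def elements_per_chip(element_size_m):
    """How many square elements of the given side fit on the chip."""
    return (CHIP_SIDE_M / element_size_m) ** 2

micro = elements_per_chip(1e-6)      # micrometre-scale elements
molecular = elements_per_chip(1e-9)  # benzene-sized elements
print(f"density gain: {molecular / micro:.0e}")
```

Under these assumptions, molecular elements would pack about a million times more densely per unit area than micrometre-scale ones.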
The fourth stage is assumed to be atomic. Proposals concerning quantum computers, which are to employ different states of electron spins, have also been put forward [13]. However, there are as yet no real models of computers based on interatomic effects. It should be noted that the size of atoms is 10^-11-10^-10 m.
The phenomenon common to the four computer schemes considered above is an energy flux through the elements of the system. In modern computers it is electric current; a computer in which the driving force is a light wave or a liquid may also be possible. All these designs may be called static computer models with a rigid structure. Self-assembly and self-formation are increasingly often mentioned in descriptions of the manufacture of IC elements; what is meant is that automatic machines only create definite conditions in micro-zones, while the useful electrophysical properties form in them due to the special properties of solids or crystals. Will it be possible in our material world to create conditions for the self-formation of the automatic machines themselves, machines able to manufacture superminiature computers? We think it is. To confirm this statement, let us consider our modern civilization on the Earth. Potentially, Mankind is already able to process information for extraterrestrial consumers; that is, our planet is an analogue of a computer. If the size of the Earth is taken as the size of this computer, the elements on which data processing is carried out are superminiature. Certain conditions are necessary for the self-formation of such a computer, and these conditions happened to form in the Solar system: the energy source is the Sun, the material substrate on which life originated is the Earth's surface, and the energy absorber is outer space. This model, unlike the static one, will be called the dynamic computer model. While the static computer model is a device in which data processing is performed on specially built, easily adjustable elements, the dynamic computer model is a black box which permanently improves under a prolonged energy flux and finally becomes able to perform data processing. The majority of material objects subjected to an energy flux simply scatter the energy without forming any structures able to process data. However, we think that in some cases it will be possible to develop a device working according to the dynamic scheme.
In our opinion, similarly to a developing civilization, the self-structuring substance in the dynamic computer scheme will always adjust itself to the problem to be solved and will itself try to find more efficient and faster solution methods. This means that it is not necessary to develop special computer languages or to write various data processing programs. Other problems arise instead; they are connected with the probability of formation of a self-structuring substance, the maintenance of its working capacity, and the possibility of an information connection with it.
The major physical conditions are: (1) artificial maintenance of the unstable state of the material substance, (2) the possibility of additional action on this substance, and (3) the possibility of receiving responses to these actions. The simplest example of the dynamic computer model may be a laser generator. Another example may be the scheme of an electron-positron collider (EPC). The recently constructed Large Hadron Collider (LHC) may potentially also serve as a model of the dynamic computer. It is quite possible that the Big Bang was initiated by a collision of two gigantic parts of substance [2] produced according to the EPC or LHC schemes. Other schemes of dynamic computers are quite possible, too, and even now we can try to make experiments in this direction. It may turn out that the development of a dynamic computer is much simpler and cheaper than, say, that of a molecular computer. It should be noted that the development of a dynamic computer is to a certain extent equivalent to the creation of sentient life in the Universe. During the whole history of Mankind nothing of the kind has ever been created by humans. However, the creation of sentient life is not quite a novel idea: it has long been under discussion in science fiction.
The largest problem of any dynamic computer is the possibility of rapid information exchange between the active working zone and the external customers. Information may be reliably transferred in our world only on material carriers, whose maximal rate is limited by the velocity of light, which is too slow on the universal scale. This is the major shortcoming of the model proposed here. But why did the Creators make our world one in which information cannot be transferred
infinitely quickly? The matter is not so simple, however. The restricted maximal rate of transfer of material objects has substantially increased the lifetime of the Universe. If the velocity of matter expansion could be infinitely high, there might not have been enough time for the origin and development of life on the Earth.
There still remains a hope that tools for information transfer at velocities higher than the velocity of light may be found. An interesting report that appeared in 2008 was made by Nicolas Gisin and coworkers from the University of Geneva [14]. They demonstrated that entangled photons separated by a distance of 18 km "sense" changes in the states of each other, and that the rate of correlation of their behavior exceeds the velocity of light by several orders of magnitude. The authors proposed to call this effect "the transfer of quantum information". It is still not clear whether this effect can be used to transfer any data. We have mentioned previously that we do not believe in the possibility of moving material objects with the help of the assumed wormholes [8], but maybe it will be possible to transfer information through them.
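The scale of the light-speed bottleneck discussed above is easy to put into numbers. The distances used below are rounded, commonly cited values, not figures from the text.

```python
# One-way light travel times over astronomical distances, illustrating
# why the velocity of light is "too slow on the universal scale".
# Distances are rounded, commonly cited values.
C = 3.0e8        # velocity of light, m/s
YEAR_S = 3.16e7  # seconds per year (approximate)

distances_m = {
    "Earth to Moon": 3.8e8,
    "Earth to Sun": 1.5e11,
    "across the Galaxy": 9.5e20,  # roughly 100,000 light years
}
for name, d in distances_m.items():
    t = d / C  # one-way signal delay, s
    label = f"{t:.0f} s" if t < 1e4 else f"{t / YEAR_S:.0f} years"
    print(f"{name}: {label}")
```

Even within the Solar system a round-trip exchange takes minutes to hours; across the Galaxy it takes on the order of a hundred thousand years, which is why the model treats the light-speed limit as its principal shortcoming.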
So, we may be useful to the assumed Creators due to the increasing possibilities of Mankind in data processing and our large experience in modeling and generating scientific concepts. Along with these possibilities, Mankind also has several problems. The major one is the lack of energy for further accelerated development, a problem that will only become more acute in the future. It is quite possible that the Creators would be able to supply Mankind with the energy that is so necessary for us in gratitude for data processing. Then there would be a mutually beneficial collaboration between the assumed Creators and Mankind.
The idea of a Creator, or Creators, of our world seems to be the most ancient and most disputable one, still unresolved to the present day. We do not want to discuss this problem here; we will only set out our opinion on the origin and development of the idea. The life of a human being in this world is very diverse. Sometimes the actions of people lead to the desired result and sometimes they do not. In the cases when the activities of people failed, for unknown reasons, to bring the desired results, prerequisites appeared for assumptions about external powerful forces. At the earliest stage of the development of human civilization, people were unable to explain many frightening natural phenomena, such as earthquakes, tsunami, hurricanes, thunderstorms with lightning, droughts, floods, illness, etc. Therefore, people interpreted the world as hostile and frightening. People needed support and a strong protector. It was at that time that the religious doctrines appeared; in those doctrines, the Almighty Creators or God created the Earth in the centre of the world for people to be able to live on it. All the disasters and frightening natural phenomena were understood simply as punishments for
disobedience or bad actions. The Creators might be propitiated with prayers, good deeds, or sometimes with sacrifices.
Centuries passed. Many natural phenomena received natural explanations. Due to the accumulated knowledge, Mankind was able to develop methods of protection from negative natural phenomena. The notions about the external world changed essentially, too. The Earth turned out to be not the centre of the world but just one of the planets of the Solar system; the Sun is situated not in the centre but on the outskirts of our vast Galaxy, and there are many other galaxies in the Universe. In addition, it has never been established where the Creators could be found in our world, and no reliable information connection with them was ever established. In time, religious notions were replaced by materialistic ideas completely denying the real existence of Creators of the Universe. According to the classical materialistic ideas, Mankind is only an insignificant observer that has appeared by chance within a small time interval of the development of the Universe.
Some more time passed, and the possibilities of Mankind changed substantially, especially in the areas of data processing, systematization and modeling. Moreover, no restrictions that might limit progress in this direction are in sight. At present, Mankind is able to process information for any possible consumers. That is why we revive the idea of external Creators of our world who presumably could have a need for rapid data processing.
In the present work, we did not try to unite or reconcile the materialistic and idealistic directions in philosophy. Keeping to the idea of the primary character of matter, we tried to answer the questions of what our world was created for and what Mankind can do in it. To answer these questions, we revive the well-known idea of the Creators of our Universe at a new level. Every reader may agree with such an approach or, on the contrary, consider it incorrect.
In most of the known hypotheses of the World's development, mind in the Universe is an unnecessary attribute. In the proposed computer-like model, by contrast, mind and human civilization are the main purpose of the Universe. This model is based on the assumption that
the external Creators organized the Big Bang, which became the cause of our Universe. The pragmatic goal of the Designers is the formation of a material substance that can process information. The proposed computer-like model of the Universe involves future communication with the Creators in order to carry out their computing orders. However, this model of the world is not perfect, because an infinitely high speed of information transmission is lacking. Maybe, with the discovery of new data about Nature or with the development of the technical capabilities of Mankind, it will become possible to propose a World model fully consistent both with the data about the world and with the increasing possibilities of our civilization in mastering nature, especially in the field of information processing.
The author thanks Prof. V. I. Belevantsev and Prof. I. K. Igumenov for
discussion of this paper.
[1] Cherepaschyk, A.M. and Chernin, A.D. (2009) Cosmo-
logy: Discoveries and questions. Nauka iz Pervyh ruk
(Science First Hand), Novosibirsk, 1, 26-37.
[2] Turner, M. (2009) The origin of the universe. Scientific
American, 9, 17-23.
[3] Cherny, V. (2009) Cyclical Universe. Nauka i Zizn
(Science and Life Magazine), 10, 40-43.
[4] Bojowald, M. (2009) Big bang or big bounce? Scientific
American, 1, 18-24.
[5] Jenkins, A. and Perez, G. (2010) Looking for life in the multiverse. Scientific American, 1, 15-23.
[6] Spiridonov, O.P. (1991) Fundamental physical constants. Vysshaya Shkola, Moscow, 238 (in Russian).
[7] Karshenboim, S.G. (2008) New recommended values of
the fundamental physical constants. (CODATA 2006)
Physics-Uspekhi, 51, 1057-1064.
[8] Chirkov, J. (2008) Upon investigation secrets of universe.
Nauka v Rossii (Science in Russia), 2, 61-62.
[9] Frolov I.T. (2001) Philosophical dictionary. Moscow.
[10] Islinsky A.J. (1980) Polytechnic dictionary. Moscow.
[11] Aseev, A.L. (2007) Nanotechnology in solid-state elec-
tronics. Novosibirsk.
[12] Rambidy, N. (2006) Nanotechnology and molecular
devices. Nauka v Rossii (Science in Russia), 6, 38-46.
[13] Aaronson, S. (2008) The limits of quantum computers.
Scientific American, 3, 62-72.
[14] Salart, D., Baas, A., Branciard, C., Gisin, N. and Zbinden,
H. (2008) Testing the speed of spooky action at a
distance. Nature, 454, 861-864.