In this chapter we will attempt to answer the following questions:
(1) What force propels evolution in the informational space?
(2) What is the nature of the process of evolution?
(3) Are all informational combinations possible?
To answer the first two questions, we need to clarify the factors that determine man's evolutionary path in the informational space – that is, the universal information path. This path is determined by several factors, which can broadly be grouped under two headings: HaShem’s Plan and man's choice. HaShem’s Plan is revealed through prophecies, which inform us of such events as the enslavement of the Jews in Egypt, the resurrection of the dead, the advent of the Mashiach and the World to Come. These events are unavoidable, although their timing is not specified. This means that at a certain point in time the universal information path will enter the region of the informational space corresponding to these events.
As for our choice, it is the consequence of our informational interaction with the surrounding world, which is a pulsating process (reviewed in the preceding chapter). The process works in the following way: we receive information from the world, we process it and perform certain acts that change the world's informational landscape; thus, we advance along the universal information path. Our choice hinges on numerous factors: the goals we set, the anticipated consequences, the information available to us, the structure of our soul and our acceptance of certain laws and restrictions (the laws of the Torah and others).
As previously noted in the introduction, we cannot change the fundamental laws of Creation, but we are granted the liberty to alter the course of events.
The guiding factor in our choice is our evolution within the informational space, measured by our progress in the study of the Torah and the cognition of HaShem. The further we advance along this dimension, the better and more effective the choices we make and the better the quality of our interactions with the surrounding world. As we have noted previously, HaShem’s Plan is the driving force behind our evolution within the informational space. However, the paths may vary. Let us imagine that the entire population of our planet has suddenly recognized the Torah and the commandments. I leave it to the reader to think through how that scenario would unfold.
With that scenario in mind, let us try to understand the nature of our evolution along the universal information path. Modern science employs the concepts of Markov and non-Markov processes, named after the Russian mathematician Andrei Markov, to describe the evolution of systems. A Markov process is one in which the next event depends solely on the event that immediately precedes it, independently of the path by which the system arrived at that event. Conversely, any evolution in which the next event depends on the cumulative history of the system is a non-Markov process.
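The distinction between the two kinds of processes can be illustrated with a minimal sketch. Both update rules below are invented purely for illustration (they do not come from the text): one consults only the last state, the other consults the entire path so far.

```python
import random

def markov_step(current):
    """Markov property: the next state depends only on the current state.
    From state 0 flip to 1 with probability 0.5, and vice versa."""
    return 1 - current if random.random() < 0.5 else current

def history_step(history):
    """Non-Markov rule: the next state depends on the whole path so far.
    Here the process flips only if it has spent a strict majority of its
    past in the current state."""
    current = history[-1]
    return 1 - current if history.count(current) > len(history) / 2 else current

# Evolve both processes from the same starting state for ten steps.
markov_path = [0]
non_markov_path = [0]
for _ in range(10):
    markov_path.append(markov_step(markov_path[-1]))       # looks one step back
    non_markov_path.append(history_step(non_markov_path))  # looks at all history

print(markov_path)
print(non_markov_path)
```

The point of the sketch is only structural: `markov_step` receives a single state, while `history_step` must receive the full trajectory to decide its next move.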
This writer is of the opinion that evolution within the informational space cannot be described in terms of either process, since both presuppose that the future is not established in advance – that is, that there is no backward causation. On the other hand, as also noted above, evolution within the informational space includes prophesied events that are certain to happen in the future. At least one future event is certain even in the life of an individual: death, and foreknowledge of death often defines our actions in the present.
The prevailing opinion in the academic community today is that human behavior cannot be described within a cause-and-effect framework. In his book At Home in the Universe, Wheeler writes that insofar as human behavior is at any given moment aimed at and dictated by goals set in the future, it is unpredictable.
This raises the logical question: What happens when our choice shifts the universal information path to a region incompatible with G-d's plans? We will try to answer this question in the next chapter.
Addressing our third question, it is quite obvious that the system evolving within the informational space is characterized by enormous combinatorial complexity. In theory, the number of combinations is colossal. The question is as follows: Are they all possible and feasible? Or does the universal information path traverse a rather limited area of the informational space?
Clearly, not all combinations are possible. The Jews, for example, could not have avoided the slavery in Egypt, the Mashiach cannot fail to come, and so on. As for the second part of the question, this writer maintains that in reality only a negligible percentage of the theoretically possible combinations is realized – which is to say that the universal information path traverses a fairly limited region of the informational space.
To understand this idea (understand, not prove), the author recommends pondering the known examples of complex combinatoric systems (the author refers to the information given in the book From Strange Simplicity to Complex Familiarity by Nobel Prize-winning scholar Manfred Eigen).
1. Thermodynamics. Any physical system consisting of N particles may be described by a point within a phase space of 6N dimensions, where the factor 6 is the sum of three spatial coordinates and the three corresponding momenta. The state of such a system at a specified moment in time is known as a 'micro-state.' In theory, there exists an enormous number of micro-states. The aggregate of micro-states is known as a 'macro-state.' Calculations have demonstrated that only an infinitesimal portion of all the possible micro-states is actually realized during the entire life cycle of the Universe. Thus, despite the system's exponential combinatorial complexity, the number of states actually realized constitutes only a vanishingly small fraction of the total.
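The scale of that claim can be checked with a rough back-of-envelope estimate. Every figure below (the particle count, the number of distinguishable cells per axis, the one-state-per-Planck-time visiting rate) is an illustrative assumption of this sketch, not a number taken from the text:

```python
import math

# A "small" system of 100 particles, each with 3 position and 3 momentum
# coordinates, i.e. a 600-dimensional phase space.
N = 100
dimensions = 6 * N
cells_per_dimension = 10  # assumed distinguishable cells along each axis

# log10 of the total number of micro-states: 10^600 for this tiny system.
total_exp = dimensions * math.log10(cells_per_dimension)

# Generous upper bound on the micro-states the system could ever visit:
# one new state per Planck time (~5.4e-44 s) for the entire age of the
# Universe (~4.3e17 s).
visited_exp = math.log10(4.3e17 / 5.4e-44)  # roughly 61

print(f"total micro-states:  ~10^{total_exp:.0f}")
print(f"visitable at most:   ~10^{visited_exp:.0f}")
print(f"fraction realized:   at most 10^-{total_exp - visited_exp:.0f}")
```

Even with these deliberately modest assumptions, the visitable fraction is at most about one part in 10^539 – the combinatorial space utterly dwarfs anything the system can explore.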
2. Language. In 1948 Claude Shannon published his Mathematical Theory of Communication (the foundation of what is now known as information theory), the goal of which was the effective encoding and transmission of information. The cornerstone of Shannon's theory was the concept of information entropy, determined for equally probable states by the following formula: E = log2 P,
where P is the number of a system's possible information states.
Let me explain this with a simple example: if you keep tossing a coin with only heads on both sides, you will keep getting just one information state. In this case Shannon's entropy will be zero as the logarithm of one equals zero. If we toss a regular coin with heads and tails, we will get two information states. In this case Shannon's entropy equals one, and that is one bit.
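The coin example can be reproduced directly. The sketch below uses the general form of Shannon's entropy, H = -Σ p·log2 p, which reduces to the text's E = log2 P when all P states are equally probable:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: -sum(p * log2(p)) over outcomes with
    nonzero probability."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A two-headed coin has a single possible outcome: log2(1) = 0 bits.
two_headed = shannon_entropy([1.0])

# A fair coin has two equally likely outcomes: log2(2) = 1 bit.
fair = shannon_entropy([0.5, 0.5])

print(two_headed)  # 0.0
print(fair)        # 1.0
```

One bit is thus exactly the information gained from a single fair coin toss, and zero bits from a toss whose outcome is already certain.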
When applied to information encoding, Shannon's theory looks like this: if you take the English alphabet of 26 letters plus one space (27 symbols in all), the number of possible letter combinations in the absence of any limitations is vast. However, once the known linguistic limitations on word formation and meaningful letter sequences are taken into account, the number of combinations that can actually be realized constitutes only the tiniest share of the total – just as in our thermodynamics example.
Linguistic limitations lead to a drastic reduction in Shannon's entropy and in the number of combinations that can actually be realized. Manfred Eigen cites the following example in his book. Taking into account the specific limitations existing in the English language, it can be demonstrated that Shannon's entropy falls from approximately 4.76 bits per letter to 2.16 bits per letter. This means that a phrase one hundred characters long will have 10^65 possible sequences instead of 10^143. The reduction factor equals 10^78, which is a colossal value.
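The arithmetic behind Eigen's figures is easy to verify: a sequence of 100 letters at b bits per letter admits 2^(100b) sequences, which the short calculation below converts into powers of ten:

```python
import math

unconstrained = 4.76  # bits per letter, no linguistic constraints (~log2 27)
constrained = 2.16    # bits per letter with English-language constraints
length = 100          # a phrase one hundred characters long

# Number of sequences = 2^(bits per letter * letters); convert to 10^x
# by multiplying the exponent in bits by log10(2).
free_exp = unconstrained * length * math.log10(2)         # ~143
constrained_exp = constrained * length * math.log10(2)    # ~65

print(f"unconstrained: ~10^{free_exp:.0f} sequences")
print(f"constrained:   ~10^{constrained_exp:.0f} sequences")
print(f"reduction:     ~10^{free_exp - constrained_exp:.0f}")
```

The result matches the figures quoted from Eigen: roughly 10^143 unconstrained sequences, 10^65 constrained ones, and a reduction factor of about 10^78.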
3. Genetic Science. Manfred Eigen gives a fairly representative example of the following nature. If each cell holds a single gene or a single protein, the number of such cells that could be packed into the visible Universe equals approximately 10^102. At the same time, just one gene consisting of a thousand nucleotide monomers has a number of possible combinations equal to approximately 10^600. It follows that in genetics, too, only the tiniest portion of the total number of theoretically possible combinations can be realized. Notably, a gene also constitutes an informational code.
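Eigen's genetic figure can likewise be checked in a few lines. The sketch assumes the standard four-letter nucleotide alphabet (A, C, G, T) for a gene of one thousand monomers:

```python
import math

sequence_length = 1000  # nucleotide monomers in one gene
alphabet = 4            # the four bases: A, C, G, T

# log10 of the number of possible sequences: 4^1000 ~ 10^602.
combinations_exp = sequence_length * math.log10(alphabet)

# Eigen's estimate: ~10^102 gene-sized cells fit in the visible Universe.
universe_cells_exp = 102

print(f"possible gene sequences:    ~10^{combinations_exp:.0f}")
print(f"room in the visible Universe: ~10^{universe_cells_exp}")
print(f"realizable fraction:        at most 10^-{combinations_exp - universe_cells_exp:.0f}")
```

So even if every gene-sized cell in the visible Universe held a distinct sequence, at most about one in 10^500 of the theoretically possible genes could ever be realized.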
The examples above may suggest the operation of some Divine Principle whereby a certain number of combinations, which sustain the evolution of Creation, are realized with an inconceivable level of precision. The writer deems it possible to extend that principle to the evolution of Creation in the informational space.