In previous articles (Red Button of Adam, Part 1; Red Button of Adam, Part 2; and The first creation of Adam - Where, When, Why?, https://www.jpost.com:kabbala ), I put forward the idea that the creation of Adam took place in the informational zone of the Sefira Malchut of the world of Atzilut, and that the subsequent creation of the worlds Beriyah, Yetzira and Asiya (BYA) was caused by the sin of Adam. Before the sin, the creation of the worlds BYA was not necessary.
Two of the main underlying principles supporting my conclusion were:
- G-d never does unnecessary work for the execution of His Plan.
- Adam had to be created in a zone of the information space where the exchange of information between G-d and Man, and vice versa, required a minimum amount of work. This zone corresponds to the Sefira Malchut of the world of Atzilut.
In this and subsequent articles, I will consider the following:
I. The expression of the above-mentioned principles in our reality, namely the family of “minimal” principles:
- Fermat Principle
- Principle of Least Action - PLA
- Ockham’s Razor
- Principle of Sufficient Reason - PSR
- Theories of Algorithmic Complexity
- Minimum Energy Principle
- Ground State
- Minimum Work Principle
II. I will attempt to answer the question, ‘What is the nature of G-d’s work?’
III. I will consider the connection between the “minimal principles,” causality and Divine Providence, the essence of Shabbat and the path of the Torah in the information space.
IV. The influence of Man on the amount of G-d’s work.
V. The information nature of the ‘physical’ laws and their connection with the punishment of Adam.
Fermat’s Principle
Fermat’s Principle was proposed in 1658 by Pierre de Fermat, a French mathematician. In brief, the principle states that the path taken by a ray of light between two given points is the path that can be traveled in the least time. Later, “least time” was refined to “stationary time” with respect to variations of the path.
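As a toy illustration of Fermat’s Principle (the geometry, speeds and endpoints below are invented for the example), the following sketch minimizes the travel time of a ray crossing the interface between two media and checks that the least-time path satisfies Snell’s law of refraction:

```python
import math

# A ray travels from A = (0, 1) in medium 1 (speed v1) to B = (1, -1) in
# medium 2 (speed v2), crossing the interface y = 0 at some point (x, 0).
# Fermat's Principle: the actual crossing point minimizes total travel time.
v1, v2 = 1.0, 0.5  # hypothetical speeds in the two media

def travel_time(x):
    t1 = math.hypot(x, 1.0) / v1          # time spent in medium 1
    t2 = math.hypot(1.0 - x, 1.0) / v2    # time spent in medium 2
    return t1 + t2

# Brute-force grid search for the crossing point of least time.
x_min = min((i / 100000 for i in range(100001)), key=travel_time)

# At the least-time point, Snell's law holds: sin(a1)/v1 == sin(a2)/v2,
# where a1 and a2 are the angles measured from the normal to the interface.
sin1 = x_min / math.hypot(x_min, 1.0)
sin2 = (1.0 - x_min) / math.hypot(1.0 - x_min, 1.0)
print(sin1 / v1, sin2 / v2)  # the two ratios agree to within the grid step
```

The least-time condition reproduces the refraction law without ever invoking a force or a local equation of motion.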
Principle of Least Action - PLA
In simple language, this principle is ingeniously described in the book of Nobel prize winner Richard Feynman “The Character of Physical Law:”
“When you have a number of particles, and you want to know how one moves from one place to another, you do it by inventing a possible motion that gets from one place to the other in a given amount of time. Say the particle wants to go from X to Y in an hour, and you want to know by what route it can go. What you do is to invent various curves and calculate on each curve a certain quantity. (I do not want to tell you what the quantity is, but for those who have heard of these terms, the quantity on each route is the average of the difference between the kinetic and the potential energy.) If you calculate this quantity for one route, and then for another, you will get a different number for each route. There is one route which gives the least possible number, however, and that is the route that the particle in nature actually takes.”
The PLA was first proposed by Pierre Louis Maupertuis in 1744 and elaborated by the great mathematician Leonhard Euler in the same year. In 1760, Joseph-Louis Lagrange proposed the mathematical description of mechanical systems known today as the Lagrangian. The Lagrangian is the difference between the kinetic and potential energies, and the action is the integral of the Lagrangian over time. It is important to note that the action is a scalar: it does not describe what happens at a particular moment in time; it is a property of the trajectory as a whole.
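Feynman’s description can be sketched numerically (the discretization, parameter values and iteration counts below are my own choices, with the mass set to 1): pin both ends of a particle’s one-dimensional path in a uniform gravitational field, and repeatedly move each intermediate point to the value that makes the discrete action stationary. The result is the same parabola that Newton’s law y'' = -g produces:

```python
# Unit-mass particle, height y(t), gravity g; endpoints pinned at y = 0
# (a particle thrown upward that returns after time T).
# Discrete action: S = sum over segments of (kinetic - potential) * dt.
g, T, N = 9.8, 2.0, 50
dt = T / N
y = [0.0] * (N + 1)  # initial guess: the straight-line path

for _ in range(20000):  # sweep interior points toward stationary action
    for i in range(1, N):
        # dS/dy[i] = 0  gives  2*y[i] - y[i-1] - y[i+1] = g*dt**2
        y[i] = 0.5 * (y[i - 1] + y[i + 1] + g * dt * dt)

t_mid = T / 2
exact_mid = 0.5 * g * t_mid * (T - t_mid)  # parabola from Newton's y'' = -g
print(y[N // 2], exact_mid)  # the midpoint heights agree
```

The stationarity condition at each point is exactly the discrete form of Newton’s second law, which is why the two descriptions single out the same trajectory.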
The Principle of Least (stationary) Action is central to modern physics. It has been applied in Thermodynamics, the General Theory of Relativity, Quantum Electrodynamics (QED), Particle Physics and String Theory.
It is well described in the article “The Principle of Least Action” (David Dalrymple, Edge): “Nature is lazy. Scientific paradigms and “ultimate” visions of the universe come and go, but the idea of “least action” has remained remarkably unperturbed.”
Max Planck, founder of quantum physics, wrote:
“The least-action principle introduces a completely new idea into the concept of causality: The efficient cause, which operates from the present into the future and makes future situations appear as determined by earlier ones, is joined by the final cause for which, inversely, the future - namely, a definite goal - serves as the premise from which there can be deduced the development of the processes which lead to this goal.”
Also, he wrote, “the most adequate formulation of this law creates the impression in every unbiased mind that nature is ruled by a rational, purposive will.”
From the point of view of the mathematical description of physical systems, the PLA and formulations based on physical laws - for example, Newton’s second law - give the same result. We again refer to Richard Feynman’s book. Feynman gives three equivalent descriptions of gravity - the famous Newton’s Law of Gravity, the local field method and the PLA. He writes:
“This is an example of the wide range of beautiful ways of describing nature. When people say that nature must have causality, you can use Newton’s law; or if they say that nature must be stated in terms of a minimum principle, you talk about it this last way; or if they insist that nature must have a local field - sure, you can do that. The question is: which one is right? If these various alternatives are not exactly equivalent mathematically, if for certain ones there will be different consequences than for others, then all we have to do is to experiment to find out which way nature actually chooses to do it.
“But in the particular case I am talking about, the theories are exactly equivalent. Mathematically, each of the three different formulations, Newton’s law, the local field method and the minimum principle, give exactly the same consequences. What do we do then? You will read in all the books that we cannot decide scientifically on one or the other. That is true. They are equivalent scientifically. It is impossible to make a decision because there is no experimental way to distinguish between them if all the consequences are the same. But psychologically, they are very different in two ways. First, philosophically you like them or do not like them; and training is the only way to beat that disease. Second, psychologically they are different because they are completely inequivalent when you are trying to guess new laws.”
The principal difference between describing a system by the PLA and by physical laws lies in causality. The PLA implies teleology (finality) - an explanation of something in terms of its end, its purpose or its goal - in effect, backward causality.
Formulation by a law implies an efficient cause - straightforward causality.
The answer to the question, “Which type of causality actually takes place in the Creation?” is of utmost importance since it affects our understanding of G-d’s plan, Divine Providence and freedom of choice.
Both approaches demand certain explanations.
In the case of PLA, the standard question is, “How does the particle ‘know’ which trajectory to choose?”
When Niels Bohr, a founding father of quantum mechanics, visited Ernest Rutherford in Cambridge and told him that when an electron emits a photon it jumps to a lower-energy orbit, Rutherford inquired, “How does an electron know to which orbit to jump?”
Any type of causality is effected through the transfer of energy, information or momentum.
Since PLA implies backward causality, the mechanism should be explained. Also, we should ask a question about the connection between PLA and Divine Providence.
In the case of the “law approach,” there are many unanswered questions. Where do the laws come from? What is their nature? How can we account for the mutability of the laws?
In his book “At Home in the Universe,” John A. Wheeler writes:
“No laws. So far as we can see today, the laws of physics cannot have existed from everlasting to everlasting. They must have come into being at the big bang.”
He also writes: “How can one see the lesson of gravitational collapse - whether big bang or black hole or big crunch - in a larger framework? No concept puts itself forward with greater force than ‘mutability.’”
There exists a distinction between the attitude of different schools of thought towards the problem of causality.
One of the most ardent supporters of teleology (final causes) was Gottfried Leibniz, with his famous Principle of Sufficient Reason (PSR), which in concise form can be stated as “nothing happens without a reason.” By reason, Leibniz meant the reason of G-d. He writes:
“I grant that particular effects of nature could and should be explained mechanically… but the general principles of physics, and even of mechanics, depend on the conduct of a sovereign intelligence and cannot be explained without taking it into consideration.”
Actually, the essence of the principle is that the Creation is a “goal-directed activity,” an idea put forward by William of Auvergne, Bishop of Paris, long before Leibniz formulated his PSR. Long before William of Auvergne, the Talmudic sages expressed the idea of backward causality in a very concise form: “What was conceived first was created last.”
According to Descartes, Divine reasons exist, but are forever hidden from the human mind. (Yemima Ben-Menahem, “Causation in Science”). She writes: “According to Descartes, the natural laws are decreed by G-d and are humanly inexplicable. G-d’s reasons cannot be used for scientific explanations. The Cartesian school of thought rejected teleology and accepted the principles of straightforward causality.”
It is an established tradition in contemporary science to follow the Cartesian line of thought and to reject teleology, which is not surprising given the principle that “G-d is not the business of science, and He cannot be measured or computed.”
The “law approach” has a number of weak points. Scientists can observe a pattern, express it in mathematical form and call it a law, but science cannot explain the law. Isaac Newton admitted that he had no explanation for the Law of Universal Gravitation. He also acknowledged the problem of initial conditions, which are not known to science. As John A. Wheeler put it (“At Home in the Universe”):
“Never has physics come up with a way to tell with what initial conditions the Universe was started.”
The problem of initial conditions has a general character. In his seminal work “From Strange Simplicity to Complex Familiarity,” Nobel prize winner Manfred Eigen writes: “Darwin excluded any application of his theory to the origin of life, insisting that it dealt exclusively ‘with the manner of succession.’ Darwin derived his principle from observations in nature. Only once, many years later, did he think of a more general theoretical foundation. In a letter written in 1882 (during the last month of his life) to the zoologist and surgeon George C. Wallich, Darwin confirmed that his work was concerned only with the manner of succession, and he ‘left the question of the origin of life uncanvassed as being altogether in the present state of our knowledge.’”
William of Ockham (c.1287-1347) was one of the most radical thinkers in the history of philosophy. He was born in the village of Ockham in Surrey, England. The scope of his work covers logic, physics, natural philosophy and the theory of knowledge, but he remained in the history of science as a creator of the famous “Ockham’s Razor” principle, which is formulated as follows: ‘Entities are not to be multiplied beyond necessity.’ (“Stanford Encyclopedia of Philosophy,” by Alan Baker)
The principle of simplicity permeates all areas of human thought: theology, philosophy and science. It was supported by Aristotle, Thomas Aquinas and many other thinkers.
Immanuel Kant noted that simplicity is a “regulative idea of pure reason.”
Isaac Newton said, “Nature is pleased with simplicity and affects not the pomp of superfluous causes.” Galileo maintained that “Nature does not multiply things unnecessarily; that she makes use of the easiest and simplest means for producing her effects; that she does nothing in vain, and the like.”
The attitude of modern science toward the principle of simplicity could be described by the following quotation from the writings of Albert Einstein:
“The grand aim of all Science… is to cover the greatest possible number of empirical facts by logical deductions from the smallest possible number of hypotheses or axioms.”
Beauty, elegance and simplicity are considered to be the criteria of the truthfulness of scientific theories. In Physics, beauty is often associated with symmetry and with invariance.
In the scientific literature, a distinction is made between two kinds of simplicity: syntactic simplicity (elegance) - the number and complexity of hypotheses - and ontological simplicity (parsimony) - the number and complexity of things postulated.
The algorithmic complexity of a bit string is a measure of how much a given bit string can be compressed by a computer. A long, but algorithmically simple bit string can be compressed into a much shorter bit string. (“Information Processing and Thermodynamic Entropy,” Stanford Encyclopedia of Philosophy, Owen Maroney)
There are a number of quantitative theories of information.
Claude Shannon introduced the notion of information entropy as a measure of uncertainty about the value of certain variables. In practical terms, he was attempting to solve the problem of efficient coding: transmitting a message using a minimum number of symbols and without errors.
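Shannon’s quantity can be computed in a few lines. The sketch below (a minimal illustration, not Shannon’s coding construction itself) estimates the entropy of a message from its symbol frequencies; H is the lower bound, in bits per symbol, achievable by any lossless code:

```python
from collections import Counter
from math import log2

def entropy(message):
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("aabb"))  # two equiprobable symbols: 1.0 bit per symbol
print(entropy("aaaa"))  # a certain outcome carries no information: 0 bits
```

The more predictable the message, the lower its entropy, and the fewer symbols are needed to transmit it.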
According to Kolmogorov, the complexity of a binary string “x” is the length of the shortest program “p” that produces “x” on a universal Turing machine. (Pieter Adriaans, “Information,” Stanford Encyclopedia of Philosophy.)
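Kolmogorov complexity itself is incomputable, but any general-purpose compressor gives a computable upper bound on it. The sketch below (using zlib as the compressor; the two strings are arbitrary examples) shows that an algorithmically simple string compresses far below its raw length, while a random-looking one barely compresses at all:

```python
import random
import zlib

simple = b"ab" * 5000  # 10,000 bytes with an obvious short description
random.seed(0)
messy = bytes(random.randrange(256) for _ in range(10000))  # no visible pattern

# Compressed length is an upper bound on Kolmogorov complexity
# (up to the fixed size of the decompressor itself).
print(len(simple), len(zlib.compress(simple)))  # huge reduction
print(len(messy), len(zlib.compress(messy)))    # almost no reduction
```

The compressed size of the regular string is a tiny fraction of its raw length, mirroring the short program "repeat 'ab' 5000 times" that generates it.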
The Solomonoff Induction method was proposed by Ray Solomonoff. It deals with hypotheses that could explain a given set of data. The assumption is that the data set was generated by some algorithm; hence the hypotheses that explain the data are themselves algorithms and can be used as inputs to a universal Turing machine. All possible inputs that generate the data set are, at the same time, all possible hypotheses. But which is the correct, or most probable, one? According to Solomonoff, it is the one represented by the shortest input algorithm. The Solomonoff Induction is, in effect, a mathematical formalization of Ockham’s Razor; it is jokingly called “Solomonoff’s Lightsaber.”
It must be noted that theories of algorithmic complexity have some deficiencies:
- They are incomputable, although in practical cases they can be approximated.
- Algorithmic complexity is an asymptotic measure (it gives a value that is correct up to a constant).
- It has never been proven that physical reality works like a Turing machine.
- There is no way to know whether all possible hypotheses have been found.
- The Church-Turing thesis, which states that a function on the natural numbers can be calculated by an effective method if and only if it is computable by a Turing machine, has never been proven.
All the theories of algorithmic complexity deal with a quantitative measure of information, not with semantics (meaning). Two sentences can contain the same number of bits of information and have completely different meanings.
The question – can meaning be reduced to computation – is very complex and pertains to the problem of the nature of human consciousness and Divine Providence.
Principle of Minimum Energy (or potential energy)
The Principle of Minimum Energy states that, for a closed system with constant external parameters, the internal energy will approach a minimum value at equilibrium.
Ground state (vacuum state)
The ground state of a quantum mechanical system is its stationary state of lowest energy.
Minimum Work Principle
The Minimum Work Principle states that a given result must be obtained with the minimum possible expenditure of energy (resources) under the given binding conditions.
This principle is general and pertains to every action in the universe.
A good example of the application of the above-mentioned principle is given in the article “The Minimum Work Principle in the Universe” (Ihsan Kose):
“A nice example in the living world of the minimum work principle is the similarity of ant behavior to the maximum economy principle. When some groups of the ant colony set out to forage, they communicate with pheromone hormones among each other. An ant that has found food leaves pheromones on the ground - indicating the quantity and quality of food - to guide others.
“Another ant that follows this pheromone trace reaches the food and marks the surface on the path back to the nest with the pheromone by assessing the quantity and quality. Pheromones in spots that are not renewed by the ants within a certain time frame evaporate. Upon investigation of the ant routes, they are always found to follow the shortest path between the food and nest and leave pheromone tracks accordingly. For instance, when an asymmetric obstacle is positioned on the ant route, ants after a certain time are able to locate the shortest route again.”
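The pheromone mechanism described above can be sketched as a deterministic, mean-field toy model (the route lengths, deposit rule and evaporation rate are invented parameters): shorter round trips lay down pheromone at a higher rate, evaporation removes a fixed fraction each step, and the colony’s traffic converges onto the shorter route:

```python
# Two routes from nest to food; trips on the shorter route finish sooner,
# so its pheromone is reinforced more per unit time.
lengths = {"short": 1.0, "long": 2.0}
pher = {"short": 1.0, "long": 1.0}  # initial pheromone: no preference

for _ in range(2000):
    total = pher["short"] + pher["long"]
    share = {r: pher[r] / total for r in pher}  # fraction of ants per route
    for r in pher:
        deposit = share[r] / lengths[r]       # deposit rate ~ 1 / trip length
        pher[r] = 0.99 * (pher[r] + deposit)  # reinforce, then evaporate 1%

print(pher["short"], pher["long"])  # the short route ends up dominant
```

The positive feedback between traffic and pheromone, balanced by evaporation, is enough to select the shortest path without any ant computing it.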
Conclusions
- ‘Minimal principles’ are expressed in all aspects of our reality.
- According to Kabbalah of Information, the information space of Creation is built on the principle of hierarchy of the concepts according to complexity, dimensionality and correspondence.
- The “minimal principles” of our reality are the lowest-dimensional prototypes of the highly dimensional, complex concepts of minimality and simplicity imparted to the Creation by G-d.
To be continued.
To purchase Eduard Shyfrin’s book ‘From Infinity to Man: The Fundamental Ideas of Kabbalah Within the Framework of Information Theory and Quantum Physics’ please click here. To purchase Eduard Shyfrin’s book ‘Travels with Sushi in the Land of the Mind’ please click here.