
Negative energy, syntropy and living systems

Antonella Vannini¹

4.1 Introduction

The description of the qualities of entropic and syntropic phenomena can be traced back to the work of Luigi Fantappiè, one of the major Italian mathematicians. While working on the equations of quantum physics and special relativity, Fantappiè discovered that all physical and chemical phenomena determined by causes located in the past are governed by the law of entropy, whereas all phenomena attracted towards causes located in the future (attractors) are governed by a symmetrical law which he named syntropy.


The law of entropy is described by the second principle of thermodynamics, which states that when energy is transformed (for example from heat to work) part of it is lost to the environment. Entropy is a measure of the quantity of energy which is lost to the environment. When the energy lost to the environment is distributed in a uniform way (i.e. when no temperature differences exist), a state of equilibrium is reached and it is no longer possible to transform energy into work. Entropy measures how close a system is to this state of equilibrium.
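In the standard notation of thermodynamics (not Fantappiè's own), the second law can be stated compactly as the Clausius inequality; the formula below is added here only as a textbook reference point:

\[ dS \;\ge\; \frac{\delta Q}{T}, \qquad \Delta S_{\mathrm{isolated}} \;\ge\; 0 \]

Equality holds only for reversible transformations; the extra entropy produced by irreversible transformations corresponds to energy that is no longer available to do work.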

Syntropic phenomena, on the contrary, are characterized by the tendency towards the concentration of energy, differentiation, order, and the ability to keep the system away from heat death. Fantappiè immediately noted the coincidence between these properties of syntropy and the qualities of living systems.

¹ antonella.vannini@gmail.com

4.2 Fantappiè: a short biography

Luigi Fantappiè was born in Viterbo, Italy, on 15 September 1901. He graduated in higher mathematics at the age of 21, on 4 July 1922, at the most selective Italian university, the “Scuola Normale di Pisa”, where he was a room-mate and close friend of Enrico Fermi. In 1926 he became professor of algebraic mathematics at the University of Florence, and in 1927 of infinitesimal analysis at the University of Palermo. His important mathematical research was recognized with the Mathematical Medal of the Italian Science Society in 1929, with the prize in mathematics of the Accademia dei Lincei, and with the Volta prize of the Accademia d’Italia in 1931. In 1931-32 he taught at the universities of Berlin, Göttingen, Munich, Cologne, Freiburg and Leipzig, and in 1932 he became Director of the Institute of Mathematics at the University of Bologna. After six years in Brazil, where he founded and directed the Mathematical Institute of São Paulo, he became vice-president of the National Institute of Higher Mathematics at the University of Rome (founded and directed by Francesco Severi), where he taught higher mathematical analysis. In 1954 he was named Accademico dei Lincei, and in 1955 he was awarded the gold medal for merit in culture as an emeritus professor. He died in Bagnaia (Viterbo) on 28 July 1956.

4.3 The unified theory of the physical and biological world

At the end of 1941, Luigi Fantappiè was working on the equations of relativistic and quantum physics when he noted the dual energy solution:

+E, which describes waves diverging from a source, from causes located in the past (retarded potentials), is governed by the law of entropy;

–E, which describes waves converging towards a source, a cause located in the future (advanced potentials), is governed by a symmetrical law which Fantappiè named syntropy.
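A standard way to see these two families of waves is the classical wave equation for a potential generated by a source, which admits both a retarded and an advanced solution. The formula below uses conventional electromagnetic notation (charge density ρ, potential φ) purely as an illustration of what “diverging” and “converging” waves mean mathematically; it is not Fantappiè's own derivation:

\[ \nabla^2 \varphi - \frac{1}{c^2}\frac{\partial^2 \varphi}{\partial t^2} = -\frac{\rho}{\varepsilon_0}, \qquad \varphi_{\pm}(\mathbf{r},t) = \frac{1}{4\pi\varepsilon_0} \int \frac{\rho\!\left(\mathbf{r}',\, t \mp |\mathbf{r}-\mathbf{r}'|/c\right)}{|\mathbf{r}-\mathbf{r}'|}\, d^3 r' \]

The minus sign (retarded potential) describes waves diverging from causes located in the past; the plus sign (advanced potential) describes waves converging on a source located in the future.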

What Fantappiè discovered can be considered a natural evolution of the discoveries of the beginning of the 20th Century.


1.  In 1905, Einstein introduced the special theory of relativity, which:

a.  described the universe using four dimensions: three dimensions of space and a fourth dimension relative to time, paving the way for descriptions in which past, present and future co-exist;

b.  had a dual solution of the energy/momentum/mass equation, leading to Übercausalität, supercausality: mechanical causation and retrocausality (the relativistic energy relation with its two signs is recalled after this list).

2.  In 1928, Dirac's equation predicted the existence of the positron as the anti-particle of the electron.

3.  In 1932, Anderson experimentally proved the existence of the positron, thereby producing the first empirical evidence of retrocausality and of the negative energy solution.

4.  In 1951, Bell discovered the famous CPT symmetry (Charge, Parity and Time reversal symmetry), according to which the laws of quantum physics are invariant when charge, parity and time are simultaneously reversed.
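The dual solution mentioned in point 1b is the familiar relativistic energy–momentum relation, whose square root admits both signs; in conventional notation:

\[ E^2 = p^2 c^2 + m^2 c^4 \quad\Longrightarrow\quad E = \pm\sqrt{p^2 c^2 + m^2 c^4} \]

The +E branch is the one usually retained; Fantappiè's reading, described below, is that the –E branch corresponds to the converging, syntropic waves rather than being an unphysical artefact.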

Fantappiè added a new element to these discoveries. He showed that diverging waves, in which causes are located in the past, are governed by the law of entropy, while converging waves, in which causes are located in the future, are governed by the law of syntropy (Fantappiè, 1942). According to Fantappiè, the main properties of +E and –E are:

1.  +E:

a.  Causality: diverging waves exist as a consequence of causes located in the past.

b.  Entropy: diverging waves tend towards leveling (and heat death).

2.  –E:

a.  Retrocausality: converging waves exist as a consequence of causes located in the future.

b.  Syntropy:

i. converging waves concentrate matter and energy in smaller spaces (for example, this principle is well illustrated by the large quantities of energy accumulated by living systems of the past and now available in the form of coal, oil and natural gas).

ii. Entropy diminishes. Entropic phenomena are governed by the second law of thermodynamics, according to which a system tends towards homogeneity and disorder. Inverting the arrow of time also inverts the second law of thermodynamics, so that a reduction in entropy and an increase in differentiation are observed.

iii. Final causes, attractors, which absorb the converging waves, are observed. These final causes give rise to syntropic systems; in this way it becomes possible to introduce a scientific finalism, where finalism means causation by final causes.

iv. Because syntropy leads to the concentration of matter and energy, and this concentration cannot continue indefinitely, entropic processes are needed to compensate for the syntropic concentration. These processes take the form of exchanges of matter and energy with the environment.

Studying the properties of syntropy, Fantappiè found that they coincide with the qualities of living systems: finality, differentiation, order and organization. These findings were presented on 30 October 1942 at the Accademia d'Italia, in the form of a volume titled “Principles of a Unified Theory of the Physical and Biological World”.

4.4 Albert Szent-Gyorgyi

Albert Szent-Gyorgyi (Nobel Prize, 1937), the discoverer of vitamin C, used to underline that “One major difference between amoebas and humans is the increase in complexity, which presupposes the existence of a mechanism capable of counteracting the second law of thermodynamics. In other words, a force must exist which is capable of counteracting the universal tendency of matter towards chaos and of energy towards heat death. Life processes continuously show a decrease in entropy and an increase in inner complexity, and often also in the complexity of the environment, in direct opposition to the law of entropy.” In the 1970s Szent-Gyorgyi concluded that in living systems there was wide evidence of the existence of the law of syntropy, even though he never managed to derive it from the laws of physics. While entropy is a universal law which leads towards the disintegration of every form of organization, syntropy is the opposite law, which attracts living systems towards forms of organization that are ever more complex and harmonious (Szent-Gyorgyi, 1977).

The main problem, according to Szent-Gyorgyi, is that “a profound difference between organic and inorganic systems can be observed … as a man of science I cannot believe that the laws of physics lose their validity at the surface of our skin. The law of entropy does not govern living systems.”

Szent-Gyorgyi dedicated the last years of his life to the study of syntropy and its conflict with the law of entropy.

4.5 Science of chaos

Supercausality leads to processes which are chaotic, but at the same time responsive to attractors. In 1963 the meteorologist E. Lorenz discovered the existence of chaotic systems which react, at each point of their state, to small variations. Studying a simple mathematical model of meteorological phenomena, for example, Lorenz found that a small perturbation could generate a chaotic state which would amplify, making weather forecasting impossible. Analysing these unforeseeable events, Lorenz found an attractor, now known as the Lorenz attractor: it causes microscopic perturbations to be amplified until they interfere with the macroscopic behaviour of the system. Lorenz described this situation with the words: “The flap of a butterfly's wing in Brazil can set off a tornado in Texas.” Lorenz's discovery started the science of chaos, which is centred on attractors.

Here a terminological contradiction arises. In thermodynamics, disorder is a property of mechanical, deterministic systems, governed by entropy and by causes located in the past, while order is a property of syntropy and attractors, whose causes are located in the future. In the science of chaos, by contrast, order is associated with deterministic (entropic) systems, whereas disorder is associated with attractors (syntropic systems). The origin of this contradiction lies in the fact that in the science of chaos “ordered” systems are those which can be predicted (a property which holds only for entropic systems), whereas “disordered” systems are those which cannot be predicted (a property of syntropic systems). The science of chaos therefore links order to entropy and disorder to syntropy; but, as we have already seen, as a consequence of the second law of thermodynamics entropy is linked to disorder and syntropy to order.

The fact that syntropic phenomena are attracted by the future and cannot be predicted in a precise, mathematical way is associated, at the micro level, with chaos and disorder. It is interesting to note that the forms of order which syntropy generates at the macro level are accompanied, at the micro level, by chaotic, non-deterministic processes.
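As an illustration of this sensitivity to initial conditions, the short numerical sketch below (in Python) integrates the classical Lorenz system with the usual parameters (sigma = 10, rho = 28, beta = 8/3) from two starting points that differ by one part in a million, and prints how quickly the two trajectories separate. The simple Euler integrator, the step size and the starting values are arbitrary illustrative choices, not part of Lorenz's or Fantappiè's original work.

# Sensitivity to initial conditions in the Lorenz system:
# two trajectories that start almost identically diverge rapidly.

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one explicit Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def separation(a, b):
    """Euclidean distance between two states."""
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

dt = 0.001
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)   # perturbed by one part in a million

for step in range(1, 30001):
    a = lorenz_step(a, dt)
    b = lorenz_step(b, dt)
    if step % 5000 == 0:     # report every 5 time units
        print(f"t = {step * dt:5.1f}   separation = {separation(a, b):.6f}")

Over a few tens of time units the separation grows from about one millionth to the size of the attractor itself, which is exactly the amplification of microscopic perturbations described above.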

4.6 Chaos and fractals

Fractal geometry was founded in the 1970s by Mandelbrot, and shows that by inserting attractors into a geometrical system, complex and ordered figures are generated. In fractal geometry an attractor is an operation, a function which tends to a limit that is never reached. For example, if we repeatedly take the square root of any positive number larger than one, the result tends to one but never reaches it. The number one is therefore the limit (attractor) of repeated square roots of positive numbers larger than one. In the same way, if we repeatedly square a number greater than one the result tends to infinity, and if we repeatedly square a positive number smaller than one the result tends to zero. Fractal figures result from the interaction of attractors introduced into a geometrical figure; fractals show, in a visual way, what happens when syntropy and entropy interact.
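The square-root example above can be checked numerically in a few lines; the Python sketch below iterates the square root (whose attractor is 1) and the squaring operation (which drives numbers away from 1, towards 0 or towards infinity). It is only a numerical illustration of what “attractor as a limit” means in this context.

import math

def iterate(f, x, steps):
    """Apply f repeatedly and return the sequence of values produced."""
    values = [x]
    for _ in range(steps):
        x = f(x)
        values.append(x)
    return values

# Repeated square roots of any number > 1 approach (but never reach) 1:
print(iterate(math.sqrt, 100.0, 8))      # 100, 10, 3.16..., 1.77..., -> 1

# Repeated squaring drives numbers > 1 towards infinity
# and positive numbers < 1 towards zero:
print(iterate(lambda v: v * v, 1.1, 8))  # grows without bound
print(iterate(lambda v: v * v, 0.9, 8))  # shrinks towards zero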

Fractal geometry reproduces some of the most important structures of living systems, and many researchers have shown that life processes follow fractal geometry: the outline of a leaf, the growth of corals, the form of the brain and of nerve endings. A remarkable number of fractal structures have been discovered, for example:

1.  Blood arteries and coronary veins show ramifications which are fractal: veins divide into smaller veins, which in turn divide into smaller ones. These fractal structures seem to play an important role in contraction and in the conduction of electrical stimuli: the spectral analysis of the heart rate shows that the normal rhythm resembles a chaotic structure;

2.  Neurons show fractal structures: if neurons are examined at low magnification, ramifications can be observed from which other ramifications depart, and so on;

3.  Lungs follow fractal designs which can easily be replicated with a computer (a minimal recursive sketch follows this list). They form a tree with multiple ramifications, with configurations which are similar at both low and high magnification.
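As point 3 suggests, this kind of bifurcating structure is easy to replicate with a computer. The Python sketch below generates a toy self-similar “tree” in which every branch splits into two shorter branches, and prints how many branches appear at each generation; the scaling factor and the depth are arbitrary illustrative choices, not biological data.

def bifurcating_tree(length, depth, scale=0.7):
    """Return (branch_count, total_length) for each generation of a
    self-similar tree in which every branch splits into two shorter ones."""
    generations = []
    count, branch_length = 1, length
    for _ in range(depth):
        generations.append((count, count * branch_length))
        count *= 2                 # every branch bifurcates
        branch_length *= scale     # children are shorter by a fixed ratio
    return generations

for gen, (n, total) in enumerate(bifurcating_tree(length=10.0, depth=8)):
    print(f"generation {gen}: {n:4d} branches, total length {total:7.2f}")

The same simple rule, applied at every scale, produces configurations that look similar at low and high magnification, which is the defining property of the fractal structures listed above.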

These observations have led to the hypothesis that the organization and evolution of living systems (tissues, nervous system, etc.) may be guided by attractors (causes placed in the future), in a way similar to fractal geometry.

Figure: fractal images whose structure resembles that of the brain (http://fractalarts.com/).

Fractal structures of the human body have grown in complexity over the course of evolution. Such structures probably develop from a limited amount of information which guides living organisms in their evolution.

At present, medicine seems to be one of the leading fields in the study and development of fractal science and the science of chaos. Researchers need to understand more deeply how evolution is guided by attractors, and how apparently chaotic systems are part of higher forms of order. For instance, brain processes are characterized by the co-presence of chaos and order: chaos is observed at the micro level, where non-deterministic processes take place, whereas order is observed at the macro level, where attractors inevitably lead to an increase in syntropy.
