Review | La Complexité, vertiges et promesses. Histoires de sciences.
An excellent compilation of interviews with some of the great minds that shaped complex systems science (and its various holistic approaches) in the late twentieth century and the early twenty-first.
Introduction
Science, like any other human activity, is embedded in a social context that defines it. The world today is no longer simple: as a corollary of the hyperconnectivity in which we are immersed, life in contemporary societies has been transformed in sophisticated and irreversible ways. Although curiosity and the quest for knowledge are biological traits that may well have been key in evolutionary terms, there is no doubt that external factors such as geopolitics and economics directly shape the paradigmatic direction of science and its progress.
Historically, the reductionist approach has taken the lead: an epistemological stance holding that to understand the complex, one must explain it in terms of its simplest components. Without underestimating the technological progress that reductionism has brought us over recent decades, specialization alone is no longer sufficient to address the international crises facing humanity. Since the second half of the last century, an alternative to this bottom-up narrative has been developing: the science of complex systems, which seeks to explain nonlinear, emergent and evolutionary phenomena. Seen through the prism of complexity, today’s global problems appear less intractable.
Today I will review a brand-new book that helped me practice my French over the summer. In La Complexité, vertiges et promesses, Réda Benkirane compiles a series of interviews conducted at the end of the 1990s with eighteen specialists whose research seeks to unravel the paradigm of complexity. In this way, the author offers a comprehensive overview of a diverse ensemble of scientific perspectives: from the physics of turbulence to information theory, from genetic biology to artificial life, from the mathematics of chaos to algorithmic complexity, from fractal space-time to the inflationary universe, and beyond.
Among the interviewees we find people like Christopher Langton, a founder of artificial life who organized the first workshop on the subject at the Santa Fe Institute. Or Gregory Chaitin, who at the age of fifteen laid the basis for algorithmic information theory and later developed a mathematical model of evolution. The structure of each chapter is the same: first we are given some biographical context about the researcher, with pointers to their most important publications; then a quote from the interviewee related to complexity; and finally the dialogue between Réda and the scholar in question. To make this review easier to write, I have grouped the interviews by theme. In each section I will synthesize the ideas of the thinkers and mention some historical aspects that seemed relevant to me.
Part I: Physicists, Physicists Everywhere
Neil Gershenfeld, Bernard Derrida, Laurent Nottale & Andrei Linde
The 20th century was perhaps one of the most interesting for physics. On the one hand, we began to decipher the mysteries of the microscopic world, which led to the development of one of the pillars least understood by physicists themselves: quantum mechanics. Simultaneously, one of Albert Einstein’s great contributions saw the light of day: the theory of relativity, in both its special and general versions, which allows us to understand the astrophysical and cosmological realm. In the second half of that prosperous century, however, a third revolution began. Physicists became interested in predicting the unpredictable, in controlling the uncontrollable. Notable figures such as Ludwig von Bertalanffy and Norbert Wiener pioneered general systems theory and cybernetics, while Henri Poincaré’s much earlier work on the three-body problem laid the foundations of what would become chaos theory. It was from this intellectual wealth that the science of complex systems was engendered.
One of the fundamental problems facing modern physics is to unify the microscopic and macroscopic worlds through what is known as quantum gravity. Among the many proposals that exist today, in one corner we find Laurent Nottale’s scale relativity, which generalizes Einstein’s principle of relativity by introducing fractal geometry into the dimensions of space and time. Since 1992 this approach has caused much controversy, but it does allow us to approach the problem of unification quite elegantly, placing quantum and classical realities within the same theoretical apparatus. Another way to approach the problem is to study the cosmos at very early times. In 1983 Andrei Linde postulated a colossal explosion prior to the big bang itself, one that since then has not ceased to reproduce itself, generating an infinite number of universes. This hypothesis, known as chaotic inflation, grounds the origin of the universe not in the theory of high-temperature phase transitions in the early universe, but in chaotic initial conditions.
Phase transitions are ubiquitous in physics: whenever a system changes abruptly between two states, or passes smoothly from one equilibrium state to another, we are facing a phase transition. In 1986 Bernard Derrida—who should not be confused with his cousin, the philosopher Jacques Derrida—characterized a phase transition between order and chaos in a model proposed almost twenty years earlier by Stuart Kauffman (of whom more later). In this dynamical transition we find criticality, a phenomenon widely studied in complex systems science. At the mesoscopic level it is this balance between stability and disorder that dictates the robustness, variability, evolvability and adaptability of complex systems. But at the microscopic level things change. John Wheeler was among the first to propose that reality is made of bits, of binary decisions, rather than of particles, forces and fields. The person who has materialized this narrative, however, is Neil Gershenfeld, who has used quantum phenomena to unravel the secrets of the physics of information. Understanding information handling at the quantum level could help us decipher biological mechanisms linked to evolution that we do not yet understand, or even know about.
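The order–chaos transition Derrida characterized can actually be felt in a few lines of code. Below is a minimal sketch (my own illustration, not from the book) of a Kauffman-style random Boolean network: flip a single bit of the state and watch whether the perturbation dies out (ordered phase, low connectivity) or spreads through the system (chaotic phase, high connectivity).

```python
import random

def random_boolean_network(n, k, rng):
    """Each of n nodes reads k random inputs through a random Boolean function."""
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every node from its inputs' current values."""
    new_state = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for i in ins:
            idx = (idx << 1) | state[i]
        new_state.append(table[idx])
    return new_state

def final_hamming(n=200, k=2, steps=50, seed=7):
    """Hamming distance between two trajectories started one bit apart."""
    rng = random.Random(seed)
    inputs, tables = random_boolean_network(n, k, rng)
    a = [rng.randint(0, 1) for _ in range(n)]
    b = list(a)
    b[0] ^= 1  # single-bit perturbation
    for _ in range(steps):
        a, b = step(a, inputs, tables), step(b, inputs, tables)
    return sum(x != y for x, y in zip(a, b))
```

Averaging `final_hamming` over many seeds for connectivities like K = 1, 2 and 5 reproduces, in miniature, the ordered, critical and chaotic regimes that Derrida mapped out analytically.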
Part II: Order From Chaos and Chance
Ilya Prigogine, Yves Pomeau, Ivar Ekeland & Gregory Chaitin
But what is chaos, after all? We have mentioned the concept repeatedly without so much as a definition. Chaos is usually associated with disorder, but mathematically it can be characterized precisely. In 1963 Edward Lorenz observed unexpected behavior in a simplified weather model he was running on a computer: a very small change in initial conditions led to drastically different weather patterns over time, demonstrating what is now known as the “butterfly effect”. This accidental discovery highlighted the inherent unpredictability of certain systems even when they are governed by deterministic equations. Although such a characterization is adequate for operational purposes, it is incomplete for other scenarios. Through observation of a diverse range of physical phenomena, chaos has been found to arrive by three classic routes: period doubling, intermittency and quasi-periodicity. It was Yves Pomeau (another physicist) who in 1980 discovered the second of these routes, characterizing chaos as the art of forming the complex from the simple. A visionary, Pomeau imagined chaos at work in biological, sociological and economic phenomena, expanding the frontiers of multidisciplinary science.
As we have seen so far, physicists have made great contributions to the science of complex systems. It is not for nothing that the 2021 Nobel Prize in physics was dedicated to this science. The importance of the field had been recognized as early as 1977, however, when Ilya Prigogine won the Nobel Prize in chemistry for the discovery of dissipative structures, which demonstrated the many facets of self-organization. It was Prigogine who consecrated the school of complexity in Brussels, bringing with him a new post-Newtonian, evolutionary and historical perspective of the universe, in which time emerges from non-equilibrium thermodynamics. Many consider Prigogine a philosopher rather than a chemist, but if we are talking about philosophy we must mention Ivar Ekeland, who in the 90s became a regular columnist for Nature and was known for always mixing mathematical and philosophical culture. By investigating why so-called chaotic systems are so sensitive to initial conditions, the philosopher and mathematician laid bare the mechanisms of linear and nonlinear causality, leaving behind the view of chaos as a black box. Taking up the legends of his Scandinavian origins, he combined them with the mathematics of chaos and chance to produce a singular body of work, disturbingly coherent with the most sophisticated theories of calculus.
Since ancient times philosophy has been a fundamental part of scientific work. Mathematics, for its part, is an essential tool for the development and progress of scientific theories, so understanding its limits has been one of the most important tasks of the last century. Kurt Gödel and Alan Turing are both considered foundational figures in exploring the limits of logic and computation. In simple terms, Gödel proved that there are true statements in mathematics that cannot be proven within a given formal system, while Turing demonstrated that there are problems no computer can solve in general. Along these lines we find Gregory Chaitin, who at the end of the 20th century extended Gödel’s work toward the uncomputable. By deepening the relationship between information, randomness and incompleteness, Chaitin became one of the founders of algorithmic information theory (AIT). AIT provides a mathematical framework for quantifying the inherent complexity of individual objects by measuring the minimum amount of information needed to describe them, allowing researchers to study simplicity, complexity and randomness in a rigorous way.
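Chaitin’s central quantity, the length of the shortest program that produces an object, is uncomputable, but any real compressor gives a usable upper bound on it. A small sketch of that idea (my own, not from the book):

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: an upper bound (up to an
    additive constant) on the algorithmic information content of data."""
    return len(zlib.compress(data, 9))

# A highly regular string has a short description ("repeat 'ab' 500 times")...
regular = b"ab" * 500
# ...while pseudo-random bytes admit essentially no shorter description.
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))
```

Here `compressed_size(regular)` comes out dramatically smaller than `compressed_size(noisy)`, even though both inputs are 1000 bytes long: the operational meaning of Chaitin’s definition of randomness as incompressibility.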
Part III: Towards Holistic Thinking
Edgar Morin, Brian Goodwin, John Barrow & Michel Serres
Complex systems science is not the first scientific paradigm to seek an integral understanding of multiple phenomena. One of the first modern approaches to the study of life was systematics. Developed in the 19th century, this unifying biological field used both taxonomy and phylogenetics to characterize the diversity of living organisms. A century later cybernetics was born, a transdisciplinary approach to how systems regulate themselves, with a focus on the flow of information. In parallel but independently, a new stream of knowledge called general systems theory sought to build tools for describing how systems are made of interacting parts, and for carrying that knowledge across disciplines. But it was not until 1984, with the founding of the Santa Fe Institute, that the science of complex systems flourished as a semantic field of its own. From those years until today, sub-branches such as complex adaptive systems and artificial life have emerged, all with the same goal: to describe collectivity.
It is no coincidence that over the last century several schools of thought have emerged around the concepts of emergence and self-organization. Historical phenomena such as the world wars dramatically accelerated technological advances, bringing with them, for example, the Internet, which has revolutionized communications, information processing and commerce on a global scale. Among the many proponents of systems thinking we find Edgar Morin, whose intuition accompanied the appearance of the sciences of complexity from very early on. Morin shows us that abstraction need not be hermetic: for him, complexity is not a mode of thought to be applied only to machines or scientific metaphors, but the symbiotic culmination of the individual and their society. Comparably, in his political facet, Brian Goodwin claims complexity science as a tool at the service of society, whose development must be respectful of nature and its diversity. Known as the poet of theoretical biology for his forceful metaphors, Goodwin makes it clear that the limits in the history of evolution are those imposed by physicochemical laws on the emergence of holobiontic structures.
But what would science be without excellent writers? The human ability to store and transfer knowledge is unique: while other species possess sophisticated communication systems of their own, it is still almost impossible to find a convincing analogue of writing. From 1980 onward John Barrow positioned himself as a thinker by studying the role of the observer in the study of the universe, but it was books such as Pi in the Sky, New Theories of Everything and Impossibility that earned him recognition as an excellent writer, able to place us on the border between science and philosophy. In turn, with an exuberant language, difficult for the translator and stimulating for the reader, Michel Serres has dedicated his career to intermingling the exact and social sciences. For Serres, language serves as a tool, not an object. We can therefore use writing as a social benefactor: through the flow of ideas it is possible to awaken a spark of curiosity that may later become a fire of creativity and ambition. The butterfly effect at its best.
Part IV: When the Whole Is Greater Than the Sum of Its Parts
Jean-Louis Deneubourg, Francisco Varela & Stuart Kauffman
Due to its enormous polysemy and diverse interpretations, emergence has become perhaps the most esoteric concept within complex systems science. During the 19th century, physicists were concerned with describing collective phenomena, those formed by an immense number of randomly interacting particles. Think of a water molecule: although we can characterize intrinsic properties such as its polarity and the cohesion between its atoms, it is impossible to deduce from them the properties of water, that substance formed by an unimaginable quantity of molecules indistinguishable from one another. Societies can also be characterized by emergent properties. A nation is nothing more than the result of a collection of institutions made up of a huge number of individuals who share a certain degree of identity. Unlike water, however, it is almost impossible for us to predict social behavior, even if we manage to characterize individual human behavior. Jean-Louis Deneubourg is a specialist in collective intelligence in animal societies, particularly ant and bee colonies. His studies focus on in-vivo examples of emergent processes, with potential applications in robotics and artificial intelligence. For Deneubourg, these global phenomena reveal the mechanisms of social behavior, which are certainly complex but have nothing to do with the existence of an «élan vital».
Élan vital is a French term meaning “vital force” or “vital impetus”. It was coined by the French philosopher Henri Bergson in his 1907 book Creative Evolution, where he used it to describe a hypothetical, non-physical force responsible for the evolution and development of organisms. But if life has no metaphysical nature, why have we not been able to synthesize it in the laboratory, despite our ability to create all the components that make it up? As one of the great exponents of autocatalytic sets, Stuart Kauffman seeks to revolutionize biology by endowing it with its own laws. In his research, Stu develops a hermeneutics of evolution, explaining the constructivist logic behind living beings, a logic that, according to him, stems from both natural selection and self-organization. But as we have mentioned, there are many self-organizing phenomena that could not be characterized as living. Straddling the theoretical, epistemological and philosophical dimensions of the problem, in the early 1970s Francisco Varela proposed, together with Humberto Maturana, the concept of autopoiesis, which explains the structure of living organisms and their dynamic organization, endowing them at the same time with autonomy, individuality and unity. His writings have also prompted new research programs seeking formal logical systems capable of expressing self-reference, something that for both Varela and Kauffman is fundamental to describing life.
Part V: In Search of An Ideal Theory of Life
Daniel Mange, Luc Steels & Christopher Langton
The origin of life is perhaps one of the greatest unsolved enigmas. Although today we can synthetically construct the elements that may well have formed life in its primordial state, it is still impossible to reach the level of dynamism we observe in vivo. In the 1950s Stanley Miller and Harold C. Urey tested the Oparin–Haldane conjecture and successfully produced organic molecules from inorganic compounds that must have been present on the prebiotic Earth. From then on one thing was clear to us: life is an emergent property. With the rise of personal computers in the second half of the 20th century, the great minds of the time had a new world of possibilities at their fingertips. Computers were to cyberneticians what telescopes were to astrophysicists, or microscopes to cell biologists. By organizing the first international Workshop on the Synthesis and Simulation of Living Systems in 1987, Christopher Langton consecrated himself as the founder of Artificial Life, which condenses theoretical and empirical, mathematical and computational studies of phenomena commonly associated with life, such as replication, morphogenesis, metabolism, learning, adaptation and evolution.
Although it was not yet called artificial life, similar research had been going on since the middle of the last century. One of the greatest examples is von Neumann’s work on self-reproducing automata: a theoretical machine capable of creating an exact copy of itself from available resources, replicating both its own design and its functionality. Within this same line of thought we find Daniel Mange, who has tried to build bio-inspired machines capable of adapting to their environment, both evolving with it and modifying it. For Mange it is structurally impossible to identify a hypothetical “active agent” within a static, non-evolving architecture; we must accept the ambiguous nature of life, the intrinsic impossibility of identifying which component does what when activity unfolds in a parallel and delocalized way. Engaged at the highest level in both the theory and practice of artificial life, Luc Steels, by the rigor and intelligibility of his approach, is one of the most brilliant representatives of multidisciplinarity in contemporary science. Like Mange, Steels seeks to endow robots with autonomy and a learning capability that can evolve with their environment. One such project is known as Talking Heads (yup, like the band), and with it Steels was able to reproduce cultural evolution in the laboratory.
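Von Neumann’s key insight was that a self-reproducer must contain a description of itself and use it twice: once interpreted as instructions, once copied as raw data. The minimal software analogue is a quine, a program that prints its own source. The sketch below is my illustration of that dual role, not anything from Mange’s or Steels’ work.

```python
# Echoing von Neumann's automaton, the template is used twice:
# interpreted (format() builds the program text) and copied (embedded via {!r}).
template = 'template = {!r}\nsource = template.format(template)\nprint(source)'
source = template.format(template)
print(source)
```

Running this prints exactly its own three lines of source; feeding that output back into the interpreter prints the same text again, a fixed point of execution, just as von Neumann’s automaton yields a fresh copy of itself on every cycle.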
Conclusions
Across these interviews and interviewees we glimpse a great convergence of ideas, one that betrays Réda’s fascination with the world of complexity science. The late twentieth century brought with it an academic diaspora that came to replace the school of cybernetics founded by Norbert Wiener in 1948. While the science of complex systems has brought us great eurekas in the first twenty years of the current century, it is quite likely that it is but one more line of holistic thinking among the many that humanity has carried along since its origins. What is the next step? What does the second half of the 21st century hold in store for us? Perhaps the great minds that will lead us to a deeper understanding of the universe’s great mysteries have not yet been born. Perhaps some have already died. Perhaps some will never see the light. With more unknowns than answers, Benkirane’s book leaves us with a clear message: to buy insight, you must first pay with confusion.