Review | How Chemistry Computes: Language Recognition by Non-Biochemical Chemical Automata. From Finite Automata to Turing Machines
Computation is one of the most influential concepts of the last century. However, I believe we must reformulate it in terms of its substrate, that is, the physical medium that performs the computation. Today I am going to review a paper in that direction, one that opens up avenues toward understanding chemical computation.
Introduction
Computation, in its modern form, was first formalized by Alan Turing in 1936. His abstract machine (today called the Turing machine) laid the groundwork for a robust theory of effective calculability by defining computation in purely syntactic terms: operations on symbols governed by rules, entirely detached from any physical substrate. This syntactic view has proven incredibly powerful, forming the backbone of computer science and formal logic. Yet, as fields such as artificial intelligence, artificial life, and unconventional computing have matured, the limitations of this abstraction have become more pronounced. Turing’s formalism, while foundational, treats computation as divorced from time, energy, and matter—oversights that are becoming increasingly difficult to ignore when discussing systems embedded in the real world.
Indeed, computation is fundamentally substrate-dependent. The feasibility and efficiency of implementing a given model of computation can vary drastically depending on the physical medium. What is straightforward to simulate in silicon may be infeasible in a biological neuron, a biochemical cascade, or a chemical reactor—and vice versa. This asymmetry is not merely a matter of engineering convenience; it is a conceptual clue. It suggests that understanding computation “in practice” may require engaging more seriously with the nature of the substrate that performs it. When we say that all models of computation are equivalent in theory, we implicitly invoke Turing equivalence in a Platonic realm. But do these equivalences hold when we descend into the material world? Can every computation that is theoretically feasible be physically instantiated, and what does that imply for how life computes?
In this context, the paper How Chemistry Computes: Language Recognition by Non-Biochemical Chemical Automata. From Finite Automata to Turing Machines by Marta Dueñas-Díez and Juan Pérez-Mercader offers a groundbreaking and highly original contribution. By realizing abstract automata (finite automata, pushdown automata, and linear bounded automata) within purely non-biochemical chemical reactions, the authors challenge us to rethink what it means to compute. Their work goes beyond metaphor or simulation: they show that chemical systems can recognize formal languages in the Chomsky hierarchy using nothing but native chemistry, without the aid of DNA, enzymes, or any of the machinery associated with life. In doing so, they open a new avenue for understanding the physical foundations of computation, and by extension, the possible pathways toward prebiotic chemical information processing.
How Does Chemistry Compute?
The authors begin by offering a precise operational definition of computation, rooted in formal language theory: computation is the process by which a device (an automaton) receives a sequence of symbols from a given alphabet and applies internal rules to determine whether the sequence belongs to a particular language. Their goal is to explore whether such computation, typically associated with digital circuits, silicon processors, or biological machinery, can be achieved using only inorganic chemical reactions. Crucially, they restrict themselves to one-pot chemical reactors with no external spatial constraints or biochemical components, thereby isolating the computational capabilities of purely chemical systems.
This conceptual leap is grounded in the well-known Chomsky hierarchy of languages and automata: Finite Automata (FA) recognize Regular Languages, Pushdown Automata (PDA) recognize Context-Free Languages, Linear Bounded Automata (LBA) recognize Context-Sensitive Languages, and Turing Machines (TM) recognize Recursively Enumerable Languages. The hierarchy is inclusive, meaning that more powerful automata can recognize all the languages recognized by those below them. The authors set out to construct a physical realization of each type of automaton using chemical systems, thereby demonstrating the computational power of chemistry in practice.
To implement this idea, they define a novel and elegant methodology. Each input sequence is translated into a series of chemical aliquots (specific amounts of selected reactants) added one at a time into a well-mixed chemical reactor. The symbols in the alphabet are mapped to particular chemicals, and the sequential input is delivered by timed additions. The output is a macroscopic, physically measurable change in the reactor, such as a color shift, pH change, or oscillatory pattern. If the sequence belongs to the language accepted by the automaton, the system’s response reflects this by reaching a characteristic final state, an Acceptance. If not, the reactor settles into a qualitatively different pattern, a Rejection.
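To fix ideas, the aliquot-in, observable-out loop described above can be caricatured in a few lines of code. This is a deliberately crude sketch, not the authors' chemistry: the species names ("A", "B", "P"), the 1:1 stoichiometry, the instantaneous-reaction assumption, and the "no leftover reactant" readout rule are all invented for illustration.

```python
# Toy model of the methodology: symbols map to chemical aliquots, the
# reactor updates after each addition, and Acceptance/Rejection is read
# off a final macroscopic observable. Illustrative only; species, amounts,
# and the readout rule are invented, not the authors' actual chemistry.
from dataclasses import dataclass, field


@dataclass
class ChemicalAutomaton:
    """A well-mixed one-pot reactor driven by timed aliquot additions."""
    alphabet: dict                              # symbol -> (species, moles per aliquot)
    reactor: dict = field(default_factory=dict)  # species -> moles currently present

    def add_aliquot(self, symbol):
        species, moles = self.alphabet[symbol]
        self.reactor[species] = self.reactor.get(species, 0.0) + moles
        self.react()

    def react(self):
        # Instantaneous bimolecular reaction A + B -> P (a crude idealization).
        a, b = self.reactor.get("A", 0.0), self.reactor.get("B", 0.0)
        extent = min(a, b)
        self.reactor["A"] = a - extent
        self.reactor["B"] = b - extent
        self.reactor["P"] = self.reactor.get("P", 0.0) + extent

    def run(self, word):
        for symbol in word:
            self.add_aliquot(symbol)
        return self.readout()

    def readout(self):
        # Macroscopic observable: accept iff neither reactant is left over,
        # i.e. every "a" aliquot was consumed by a matching "b" aliquot.
        return self.reactor.get("A", 0.0) == 0.0 == self.reactor.get("B", 0.0)
```

The point of the sketch is the architecture, not the chemistry: the "program" lives entirely in the reaction rules and the readout, while the "input tape" is just the order of additions.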
Before advancing to more complex automata, the authors demonstrate the feasibility of their approach with a Finite Automaton implemented via a simple bimolecular reaction. This reaction is capable of accepting a regular language by appropriately consuming or accumulating a chemical species based on the input sequence. Extending this, they realize a Pushdown Automaton using a pH-sensitive reaction network where the state of the stack is encoded in the acidity of the solution. The system maintains memory through the cumulative effect of acid/base additions, and language acceptance corresponds to a return to the initial pH, indicating that the stack has returned to its empty state. Their results are robust, reproducible, and grounded in fundamental chemistry.
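The pH-as-stack idea admits an equally compact numerical caricature: an acid aliquot "pushes" by lowering the pH below its starting value, a base aliquot "pops", and acceptance means the reactor has returned exactly to the initial pH without ever overshooting it (which would correspond to popping an empty stack). The starting pH and per-aliquot shift below are invented numbers, and the linear pH response is a stand-in for real titration curves.

```python
# Minimal caricature of the pH-as-stack mechanism. All numbers are
# invented for illustration; real acid/base chemistry is nonlinear.
START_PH = 7.0
DELTA = 0.5  # pH shift per aliquot (crude linear stand-in for titration)


def ph_stack_recognize(word):
    """Accept balanced acid ('a') / base ('b') sequences in which the
    'stack' (acidity below START_PH) never underflows and ends empty."""
    ph = START_PH
    for symbol in word:
        if symbol == "a":        # push: add acid, pH drops
            ph -= DELTA
        elif symbol == "b":      # pop: add base, pH rises
            ph += DELTA
        else:
            return False         # symbol outside the alphabet
        if ph > START_PH:        # popping an empty stack: reject at once
            return False
    return ph == START_PH        # Acceptance: reactor back at initial pH
```

Read as a formal language, this recognizes balanced (Dyck-style) sequences of pushes and pops, a canonical context-free language, which is exactly the class a PDA is supposed to handle.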
Chemical Turing Machines
To construct a chemical analog of a Turing Machine, arguably the most ambitious part of the paper, the authors turn to a well-known nonlinear chemical oscillator: the Belousov-Zhabotinsky (BZ) reaction. This choice is not incidental. Turing Machines require the ability to store and manipulate multiple interacting sequences (tapes or stacks), and the BZ reaction, with its multiple interdependent observables (such as redox potential and oscillatory frequency), provides a natural substrate for this task. The authors note that a bounded Turing Machine, or Linear Bounded Automaton, suffices for physical realizability, as no actual system can implement an infinite tape. They therefore focus on this subclass, which still retains the capacity to recognize context-sensitive languages.
To encode the input, each symbol in the alphabet is assigned a specific chemical species whose presence perturbs the oscillatory dynamics of the BZ reaction in a known way. These symbols are introduced sequentially, just as with the simpler automata, and the reaction evolves through multiple phases. The key insight is that the sequence-dependent changes in the oscillatory profile (e.g., frequency, amplitude, duration) embody the machine’s computation. Words that are accepted by the automaton result in a specific and reproducible pattern of oscillations, while rejected words diverge in their dynamics.
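To see what this buys in formal-language terms, consider a purely digital recognizer for the context-sensitive language {aⁿbⁿcⁿ}, the standard example of a language beyond both FA and PDA power. The sketch below makes no attempt to model the BZ dynamics; it only illustrates the phase-ordered, bounded-memory style of recognition that a Linear Bounded Automaton performs, with memory proportional to the input length.

```python
# Bounded-memory recognizer for {a^n b^n c^n, n >= 1}, a context-sensitive
# language. Purely digital illustration of LBA-level recognition; this does
# not model the BZ reaction's oscillatory dynamics.
def recognize_anbncn(word):
    counts = {"a": 0, "b": 0, "c": 0}
    order = "abc"
    phase = "a"
    for s in word:
        if s not in counts:
            return False
        # Symbols must arrive in ordered phases: all a's, then b's, then c's.
        if order.index(s) < order.index(phase):
            return False
        phase = s
        counts[s] += 1
    # Acceptance: the three blocks are non-empty and equally long.
    return counts["a"] == counts["b"] == counts["c"] > 0
```

No stack can check the three-way count equality, which is precisely why this language sits strictly above the context-free level that the pH-stack PDA can reach.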
In order to ground this behavior in thermodynamic terms, the authors introduce an equivalent energetic criterion: they show that the total free-energy difference associated with processing accepted words is invariant across all such words. This is because the oxidation and reduction contributions in the BZ reaction exactly cancel out over the course of the computation. In contrast, rejected words lead to unbalanced reactions, resulting in different net energy dissipation. Remarkably, this means that word recognition can be mapped onto a free-energy threshold, making Acceptance or Rejection a physically measurable thermodynamic signature.
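The logic of that thermodynamic criterion can be sketched abstractly: assign each processed symbol a signed free-energy increment, and accept exactly when the running total lands on the invariant value shared by all accepted words. The per-symbol numbers below are invented, and a real implementation would pair this energetic signature with the dynamical (oscillatory) one; the sketch only shows the threshold test itself.

```python
# Sketch of the free-energy acceptance criterion: each symbol contributes a
# signed increment, and accepted words share one invariant net total (here 0,
# standing in for exactly cancelling oxidation/reduction contributions).
# All numbers are invented for illustration.
TOL = 1e-9  # tolerance for comparing net free energy to the invariant


def free_energy_accepts(word, dG, target=0.0):
    """Accept iff processing `word` dissipates exactly the invariant
    net free energy `target` (within tolerance)."""
    total = sum(dG[s] for s in word)
    return abs(total - target) < TOL


# Hypothetical per-symbol increments (kJ/mol, invented): 'a' oxidizes,
# 'b' reduces by the same magnitude, so balanced words sum to zero.
dG = {"a": -12.5, "b": +12.5}
```

What makes the criterion striking is that Acceptance becomes a single scalar threshold on a physically measurable quantity, rather than an inspection of the machine's internal state.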
This energetic criterion not only reinforces the credibility of the chemical Turing Machine but also points toward a broader principle: that computation can be understood as a process of energy transformation and entropy production. The authors emphasize that their Turing Machine performs its computations isothermally, with no need for external control or intervention, relying solely on the intrinsic properties of the chemical system. Moreover, the system operates with Avogadro’s number of parallel processors (each molecule acting as an independent unit), offering a scale of parallelism that is unimaginable in conventional silicon-based systems.
Finally, the authors demonstrate that their chemical automata respect the inclusivity of the automata hierarchy: the chemical Turing Machine, for example, can also recognize the regular and context-free languages accepted by simpler chemical automata. This fidelity confirms that their approach is not only physically sound but also theoretically rigorous.
Conclusion
The conclusion of the paper circles back to a foundational question: is biochemistry necessary for chemical computation? The authors answer resoundingly in the negative. Why? Because they have shown that purely inorganic chemical reactions, operating in a single well-mixed vessel, can implement the full hierarchy of classical automata. These systems require no biological molecules, no spatial patterning, and no external guidance, just the inherent capacity of matter to organize and process information through chemical reactions.
This finding has profound implications. First, it recasts computation as a general property of chemical matter, not a special feature of life. The information processing observed in biological systems may not be unique to DNA, proteins, or neural networks but may instead be one instantiation of a broader class of chemical computation. Second, it suggests that primitive forms of information handling may have preceded life, providing a plausible bridge between prebiotic chemistry and the origins of biological computation. In fact, I recently proposed a relational model that accounts for the empirical observations reported in this paper, showing how we can go from simple chemical reactions to complex, autonomous systems.
Moreover, the authors propose that the use of oscillatory systems, such as the BZ reaction, is not unique. Alternative chemical reactions could be engineered to perform similar computational tasks. By increasing the size of the alphabet, the number of stacks, or the number of interacting oscillators, it may be possible to construct even more powerful networks of chemical computation—approaching, in a distributed way, the capacities of programmability in modern digital machines.
Of particular significance is the thermodynamic framing of computation. By designing acceptance criteria based on free-energy balances, the authors provide a pathway for understanding the cost, efficiency, and control of chemical computation. This could inform the energetic analysis of natural biochemical pathways or guide the engineering of synthetic molecular computers. Importantly, these systems are not brittle: because they rely on ensemble effects at molecular scales, they benefit from a robustness and parallelism rarely seen in conventional computation.
Yet some limitations remain. While the paper convincingly shows that one-pot chemical reactions can perform formal language recognition, the scalability and programmability of such systems are still open questions. How adaptable are these systems to changing inputs? Can they perform arbitrary computation beyond sequence recognition? How does noise at the chemical level influence the reliability of outputs? These are exciting avenues for future exploration. In fact, I have begun to investigate some of these aspects in my master’s thesis.
In this way, How Chemistry Computes offers a bold and empirically grounded vision of computation beyond the confines of biochemistry or digital electronics. It forces us to reconsider the nature of symbols, machines, and substrates. In doing so, it brings us one step closer to understanding not only how life computes, but how chemistry itself might have once begun to compute, long before life emerged.