Quantum Physics and AI: Continuous is to Analog what Discrete is to Binary

Jan Krikke
8 min read · Sep 28, 2018

Artificial Intelligence and quantum physics will cross paths in the 21st century. When they do, it may change our view of Nature. The text is followed by a list of resources showing the various domains in which this issue plays a role. The list will be updated as new material becomes available.

In the early 20th century, the pioneers of quantum physics debated whether nature is continuous or discrete — whether particles are waves or waves are particles. The pioneers of binary computing confronted a similar and related question: the difference between analog and digital. The latter was a major issue during the legendary Macy Conferences on Cybernetics between 1946 and 1953.

The Macy Conferences were a series of meetings of scholars from various disciplines, held in New York. Their aim was to promote meaningful communication across scientific disciplines and “restore unity to science.” Among the participants were some of the most influential thinkers of their day: William Ross Ashby, Gregory Bateson, Julian H. Bigelow, Ralph Waldo Gerard, Margaret Mead, Arturo Rosenblueth, Claude Shannon, John von Neumann, and Norbert Wiener.

The debate on the analog-digital question had a predecessor in European philosophy long before the advent of digital computing. Immanuel Kant and Søren Kierkegaard addressed the notion of the analog. Not being familiar with the modern distinction between analog and digital (or binary), they discussed the analog in terms of ontology and epistemology. The word analog is typically defined as “something having the property of being analogous to something else,” suggesting that for them the analog pertained to perception and aesthetics.

The analog-digital divide became a contentious issue at the Macy Conferences, where it centered mostly on the human brain. Ralph Waldo Gerard, a neurophysiologist and behavioral scientist, claimed that the brain’s operations are much more analog than digital. He called into question the digital logic-based model developed in 1943 by neuroscientist Warren S. McCulloch and logician Walter Pitts, authors of an influential paper entitled “A logical calculus of the ideas immanent in nervous activity.”

More analog than digital

McCulloch and Pitts tried to understand how the brain could produce highly complex patterns using many interconnected basic cells (neurons). The paper was an important contribution to the development of digital artificial neural networks, which model key features of biological neurons.

Gerard’s claim that the brain’s operations are “much more analog than digital” set off an animated debate that frustrated many of the conference participants. Mathematicians like von Neumann spoke in favor of the digital perspective, while others (especially the psychologists) favored an analogical orientation.

A long argument ensued over the distinction between the discretely coded digital orientation adopted by McCulloch and Pitts and the continuous, analog character of Wolfgang Köhler’s Gestalt model. Köhler was a German psychologist and a key figure in the development of Gestalt psychology, which seeks to understand learning and perception as structured wholes.

Gregory Bateson, the multidisciplinary English anthropologist, social scientist, and cyberneticist, called for clarification of the distinction between analog and digital to remove ambiguities. None was forthcoming. The “analogical versus digital” debate remained a pesky item of “old business unresolved.”

In the end, and with the support of Norbert Wiener, digital computing won the day, with analog computing mostly confined to specific computing tasks. In recent years the analog-digital issue has been mostly discussed in the context of continuous and discrete mathematics.

Mathematician Freeman Dyson revisited the issue in 2014, in his lecture “Is Life Analog or Digital?” He pointed to the complexities of understanding brain functions like memory.

“It seems likely that memories are recorded in variations of the strengths of synapses connecting the billions of neurons in the brain with one another. But we do not know how the strengths of synapses are varied. It could well turn out that the processing of information in our brains is partly digital and partly analog. If we are partly analog, the downloading of a human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities.”

Quantum processes

The latter point may very well be an elegant understatement. In biology and other sophisticated processes, let alone the human brain, missing information can be decisive. Professor Dyson points to a third possibility: the processing of information in our brains is done with quantum processes, and the brain is the biological equivalent of a quantum computer. He adds that this is speculation:

“Quantum computers are possible in theory and are theoretically more powerful than digital computers. But we don’t yet know how to build a quantum computer, and we have no evidence that anything resembling a quantum computer exists in our brains. Whether a universal quantum computer can efficiently simulate a physical system is an unresolved problem in physics.”

Digitizing an audio signal “samples” the wave and each sample is given a binary number.

We rely on the analog-digital dichotomy every time we use a digital device, whether a computer, a mobile phone, or a digital sound system. At the heart of all these devices is the conversion between analog and digital. There is no such thing as digital music; there is only digitized audio.

To digitize audio, the sound wave is sampled about 44,000 times a second (44,100 times for CD-quality audio). Each sample is given a binary number, and the numbers are stored on an electronic medium. Playback devices decode the binary samples and convert them back into an analog signal we can hear.
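A minimal sketch of this sampling-and-quantization step, assuming a 440 Hz sine tone as the stand-in “analog” signal and 16-bit samples (the tone, duration, and variable names are illustrative, not drawn from the article):

```python
import numpy as np

SAMPLE_RATE = 44_100   # CD-quality audio: 44,100 samples per second
BIT_DEPTH = 16         # each sample becomes a 16-bit binary number

# A 440 Hz sine tone stands in for the continuous "analog" wave.
t = np.arange(0, 0.01, 1 / SAMPLE_RATE)      # 10 ms worth of sample times
analog = np.sin(2 * np.pi * 440 * t)         # amplitudes between -1 and 1

# Quantize: map each amplitude onto one of 2**16 discrete levels.
max_level = 2 ** (BIT_DEPTH - 1) - 1
digital = np.round(analog * max_level).astype(np.int16)

# Playback reverses the mapping (digital-to-analog conversion).
reconstructed = digital / max_level

print(digital[:5])                               # the first binary-coded samples
print(np.max(np.abs(analog - reconstructed)))    # small, but never exactly zero
```

The round trip recovers a close copy of the wave, but the rounding in the quantization step means the copy is never exact.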

When we visualize the sampling process of an analog wave, we see that it is identical to a histogram. We use a histogram, or a “coordinate grid,” to show movement or rate of change over a given unit of time, such as the change in temperature over a year or the movement of stock markets and currency exchange rates.

In a histogram, the horizontal coordinate (x) denotes time or moment; the vertical coordinate (y) denotes the amplitude of change. The rate of change is “analogous” to the amplitude as it moves through the grid set up by the binary parameters (x) and (y).
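As a concrete (and entirely invented) illustration of reading rate of change off such a grid, the sketch below treats months as the horizontal coordinate and temperature as the vertical one:

```python
# Invented example data: twelve monthly average temperatures (°C) for one year.
months = list(range(1, 13))                          # x-axis: time
temps = [2, 3, 7, 12, 17, 21, 24, 23, 19, 13, 7, 3]  # y-axis: amplitude

# The rate of change is the month-to-month difference in amplitude.
changes = [later - earlier for earlier, later in zip(temps, temps[1:])]

for month, delta in zip(months[1:], changes):
    print(f"month {month:2d}: change of {delta:+d} °C")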

The histogram or coordinate grid measures or samples dynamic movement within the boundaries of a static binary grid.

Like many other tools used in science, the histogram is a human construct. We use it to manipulate, control, or understand specific aspects of reality. But the histogram can also be used to speculate on the analog-digital divide and the role it plays in nature. Note that the analog wave running through a coordinate grid is dynamic. The grid itself is stable. The grid sets the boundaries within which the wave can move so that it does not go “off the charts.”

The relation between the two becomes clear when we think of the dynamic wave as “force” and the stable coordinate grid as “equilibrium.” Force and equilibrium are mutually dependent: equilibrium without force leads to petrification, and force without boundaries leads to chaos.

The structure of a histogram, with 1 and 0 defining the stable binary grid and A representing the dynamic, analog force.

The distinction between force and equilibrium may very well be at the heart of the dichotomies between analog and digital and between wave and particle. If we understand the distinction between force and equilibrium, we may have the key to understanding the wave-particle dichotomy. Moreover, it could also shed light on the limits of artificial intelligence, where the analog-digital distinction plays a key role.

In recent years, computer scientists have recognized that binary computing has its limits. Unless computer science makes a quantum leap forward and shows us otherwise, the sampling of analog information for conversion into digital format will always result in “missing information,” no matter how high the sampling rate and processing power. The decoding and uploading of the human brain to a computer, thought to be possible by some in the AI community, would not tolerate missing information, no matter how small.
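As a rough sketch of that “missing information” argument (a numerical illustration only, with invented values, not a claim about brains), the rounding error of quantization shrinks as the bit depth grows but never reaches zero:

```python
import numpy as np

rng = np.random.default_rng(0)
analog = rng.uniform(-1, 1, size=100_000)   # stand-in "analog" values

for bits in (8, 16, 24):
    max_level = 2 ** (bits - 1) - 1
    digital = np.round(analog * max_level) / max_level   # quantize, then restore
    worst_error = np.max(np.abs(analog - digital))
    print(f"{bits}-bit samples: worst-case round-trip error ≈ {worst_error:.1e}")
```

Each extra bit roughly halves the worst-case error, but no finite number of bits drives it to zero.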

How will AI deal with the complexities of nature that we do not yet fully understand? Many AI experts believe next-generation AI, or artificial general intelligence (AGI), will have the ability to reason, use strategy, solve puzzles, make judgments under uncertainty, represent knowledge, even plan, learn, and communicate in a natural language, and integrate all these skills for achieving common goals.

Developing such wide-ranging abilities and skills will rely on both science and the humanities. To be more than a general “expert system,” AGI requires social and emotional intelligence, the differentiation between male and female sensibilities, and an understanding of social differences that vary from culture to culture and can reflect entirely different, if not opposite, world views.

Further reading:

Is Life Analog or Digital? — Freeman Dyson

Perhaps the processing of information in our brains is partly digital and partly analog. If we are partly analog, the down-loading of a human consciousness into a digital computer may involve a certain loss of our finer feelings and qualities.

Back to analog computing — Columbia University

The discrete, step-by-step methodology of digital computing was never a good fit for dynamic or continuous problems. A better approach may be analog computing, which solves the ordinary differential equations at the heart of continuous problems.

Being Analog — Carol Wilder

The concepts of analog and digital were known only to scientists and scholars, but suddenly they have become part of the daily general discourse about communication technologies.

Does the Brain Store Information in Discrete or Analog Form? — MIT Technology Review

It is not easy to answer the question of how the brain stores information. Neuroscientists have long pondered this issue, and many believe that it probably uses some form of analog data storage. But the evidence in favor of discrete or analog data storage has never been decisive.

Discrete and Continuous: A Fundamental Dichotomy in Mathematics — James Franklin

Discrete mathematics has a set of concepts, techniques, and application areas largely distinct from continuous mathematics (traditional geometry, calculus, most of functional analysis, differential equations, topology).

Evolution Saves Species From ‘Kill the Winner’ Disasters — John Rennie

Goldenfeld and Xue refer to this problem as a lack of “stochastic noise” because the calculations do not reflect the mathematically arbitrary discontinuities that the real world’s limitations impose.

The Riemann Hypothesis — Michael Atiyah

The fusion between the work of Hirzebruch and that of von Neumann involves a passage from the discrete to the continuous, the transition from algebra to analysis.

Neuromorphic Engineering — Wikipedia

Neuromorphic engineering, also known as neuromorphic computing, describes the use of very-large-scale integration (VLSI) systems containing electronic analog circuits to mimic neuro-biological architectures present in the nervous system.

A new brain-inspired architecture could improve how computers handle data and advance AI — American Institute of Physics

This analog storage better resembles nonbinary, biological synapses and enables more information to be stored in a single nanoscale device… The team continues to build prototype chips and systems based on brain-inspired concepts.

The Third Law: The Future of Computing is Analog — George Dyson

Alan Turing wondered what it would take for machines to become intelligent. John von Neumann wondered what it would take for machines to self-reproduce. Claude Shannon wondered what it would take for machines to communicate reliably, no matter how much noise intervened. Norbert Wiener wondered how long it would take for machines to assume control.


Jan Krikke

Author of Creating a Planetary Culture: European Science, Chinese Art, and Indian Transcendence