What is the future of digital technology?

Luca Gammaitoni
Published in Geek Culture
8 min read · Jun 3, 2021

From it to bit and back: a new generation of inventors is needed

It appears (to me) that the future of digital technology will not be so digital, after all. To see why, please follow me for a few minutes.

Digital technology was born at a time when noise was an enemy and electronics was still in its cradle. In the early 1940s, sending a message was not an easy task, and the quality of the received message often differed greatly from the original, due to the unwanted action of noise. The natural solution was to reduce the variability of the signal, caused by fluctuations, by imposing a finite number of amplitude levels. Clearly, the most efficient reduction was to squeeze it down to just two levels: the binary representation was born.

It is important to remember that such a reduction, called digitalization, i.e. the operation of transforming a continuous (analog) signal into a discrete (digital) one, is not free of errors. It mainly consists of two different procedures: sampling and quantization. Sampling is the periodic measurement of the signal in time. If this procedure is performed properly, according to the work of the Swedish-American electronic engineer Harry Nyquist (1889–1976), no detail of the original signal is lost. Quantization, on the other hand, is the transformation of a continuous, smooth signal into a stepwise one (see Figure 1). This operation cannot be accomplished error-free and, no matter how carefully we perform it, there will always be an error. This implies that the digitalised signal will always differ from the original one or, as we would say following Claude Shannon (1916–2001), the father of information theory, some information will always be lost in the process.

Figure 1: Original (blue) and quantized (red) signals. The difference is evident. Source: Wikipedia.
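The two procedures can be sketched in a few lines of code. Here is a minimal Python illustration (the sampling rate, signal frequency and bit depth are arbitrary choices for the example, not taken from the text): a sine wave is sampled well above its Nyquist rate, so sampling loses nothing, and is then quantized to a handful of levels, so the quantization error is unavoidable but bounded by half a quantization step.

```python
import numpy as np

# Sample a 5 Hz sine at 100 Hz -- well above the Nyquist rate of 10 Hz,
# so the sampling step itself loses no detail of the original signal.
fs = 100.0
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 5 * t)

# Uniform quantization to 3 bits (8 levels) over the range [-1, 1].
bits = 3
levels = 2 ** bits
step = 2.0 / levels
quantized = step * np.round(signal / step)

# The quantization error can never be eliminated, only bounded:
# it stays within half a quantization step of the original.
error = quantized - signal
print(f"max |error| = {np.max(np.abs(error)):.4f} (step/2 = {step / 2:.4f})")
```

Adding more bits shrinks the step and hence the error, but no finite number of levels makes the error vanish, which is exactly the point made above.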

Notwithstanding this limitation, scientists have used digitalization to build enormous societal progress based on the extensive use of digital tools. Undoubtedly, the most popular of them is the digital computer. In addition, we have had digital communication systems, such as digital phones, digital radios and digital TVs, as well as digital industrial machines and most of the present robots that operate in the manufacturing sector. Going digital has become the new mantra, and most countries have invested large amounts of money to promote the so-called digital transformation, i.e. the massive introduction of these tools into our daily life.

In order to perform their tasks, these digital tools need input data captured from the physical world through digitalization procedures, performed by special devices called ADCs (Analog-to-Digital Converters). Data production in present-day society has become enormous, and the large quantity of data available has triggered some concern about its maintenance, ownership, control and use (see e.g. Analog society vs Digital society).

The impact of digital technology has affected not only the way we work and communicate but also our understanding of the world around us. It was Rolf Landauer (1927–1999) who, building on the work of Shannon and John von Neumann (1903–1957), first advanced the idea that the amount of information stored in a computer memory, typically expressed in bits (i.e. binary digits), could play a role in the physics of computing itself, being interpretable as part of the entropy of the computer. "Information is physical" was the motto that popularised this idea, advanced by Landauer for the first time in a controversial paper entitled Irreversibility and Heat Generation in the Computing Process (IBM Journal of Research and Development, 5, 183–191, 1961). Although that statement was long debated in the physics community, and I myself, among others, worked to show that his conclusion was wrong (Computing study refutes famous claim that ‘information is physical’), the idea that the information theory paradigm, i.e. the digital perspective underlying it, was a potentially useful key to interpreting the fundamental laws of the universe flourished. The most famous representative of this idea was probably John Archibald Wheeler (1911–2008), the physicist who in 1967 coined the name “black hole” for the mysterious celestial objects recently observed. During a conference held at the Santa Fe Institute in the spring of 1989, Wheeler proposed the expression “It from Bit” to exemplify the process we go through when we deal with the reality of things. In Wheeler’s idea, expressed also in “Information, Physics, Quantum: The Search for Links” (Proc. 3rd Int. Symp. Foundations of Quantum Mechanics, Tokyo, 1989, pp. 354–368), our experience is the result of binary decisions (thus expressed by bits: 0 or 1) that we take during each observation. The underlying fabric of the universe is based on these decisions, and its nature is intrinsically digital.

More recently, a similar point of view has been advanced by Seth Lloyd, presently at MIT. You can check the nice chat he gave on YouTube entitled “Is Information the Foundation of Reality?”. He also published a popular book entitled “Programming the Universe” (2006), where he discusses this matter. When I visited him at MIT just a few years ago, he took me to lunch and we had a nice conversation about the role of information as a physical quantity. By the end of the lunch, I can say, we politely disagreed, but it was nice talking to Seth.

My point of view is that information theory, although a nice and useful theoretical framework for dealing with computer programming, is actually not adding much to our understanding of physics. In some cases it may offer an interesting perspective, but still, the fundamental laws of physics need a more general point of view than those based on the idea that everything, in the end, is a mere act of computation. Is this the end of it? Not really. Recent progress in Artificial Intelligence (AI) studies has found a profitable application for the large amount of data produced by our digital tools. Deep Learning (DL) algorithms, very powerful statistical classification schemes, have proved useful in image and speech recognition. Although progress in other aspects of AI is still lagging behind, some scientists are applying DL to the task of guessing new laws of physics, and new science schemes in general. In a recent article (Why it is difficult to find something if you do not know what you are looking for) I discussed how this recent objective of AI is plagued by some fundamental problems that might limit its efficiency in this field.

Overall, there is no doubt that digital technology has revolutionized our society by promoting a progressive loss of weight, through a widely diffused dematerialization process. Let’s take a look at the way we enjoy music, for example. In the eighteenth century, a wealthy person who wanted to listen to music at will had to host a quartet in his or her house. At the end of the nineteenth century, thanks to Edison’s invention, it was sufficient to own a phonograph with waxed cardboard cylinders: undoubtedly a device much lighter than the quartet. Subsequent inventions have all proceeded along this dematerialization path, to the point that, currently, digital music is downloaded as electromagnetic radiation onto portable devices weighing a few grams. As is nowadays common experience, dedicated digital tools have also dematerialized newspapers, films, books, paintings, sculptures and, of course, money.

This progressive loss of weight did not come unannounced. One of the famous Six Memos for the Next Millennium by Italo Calvino (1923–1985), probably the best, is devoted to lightness. The lecture was scheduled to be presented in the Charles Eliot Norton Lectures at Harvard in the fall of 1985, but it was never delivered because Calvino died unexpectedly on September 19th of that year. In this first lecture, devoted to the opposition between lightness and weight, the author upholds the virtues of lightness: “This does not mean that I consider the virtues of weight any less compelling, but simply that I have more to say about lightness. (…) my working method has more often than not involved the subtraction of weight. I have tried to remove weight, sometimes from people, sometimes from heavenly bodies, sometimes from cities; above all I have tried to remove weight from the structure of stories and from language”.

The work-program announced by Calvino has been taken quite seriously by those who, in the following years, designed, built and distributed the digital tools that we use so widely today, with the unexpected result of flooding us with an enormous amount of digital data.

The dematerialization process we have witnessed over the past forty years was aimed at removing information from atoms and transferring it into bits. In my humble opinion, it is time to change things. If we want to stop the data deluge and the risks associated with it, if we want to preserve our democracies and protect our freedom, it is time to invert the process and defend the reasons of the atoms against those of the bits. Remarkably, a few initial steps in this direction have already been taken by the pioneers of 3D printing. In a 2008 book entitled Fab: The Coming Revolution on Your Desktop — from Personal Computers to Personal Fabrication, Neil Gershenfeld, the director of the MIT Center for Bits and Atoms, discussed the potential of the fabricators: a new generation of nerds interested in making things, real things, not virtual ones. A clear sign that the atoms’ revenge has begun.

What, then, is going to be the future of digital technology? To answer this question, Italo Calvino again comes to our aid. In the same lecture, Calvino speaks about the important legacy of the Roman poet Lucretius (c.99–c.55 BC) and cites a beautiful verse from his description of the material world: the little motes of dust swirling in a shaft of sunlight in a dark room (II.114–124). What a beautiful description of something that is light without being empty. What an unlimited source of information there is in this swirling dust, impossible to contain in a finite digital representation and, still, so fundamental in our description of reality. Fluctuation is the word that comes to my mind when I listen to such a description. Fluctuation and noise: exactly what we wanted to get rid of at the beginning of the digitalization process. If it is true that the future belongs to those who can imagine it, we need more noise and fluctuation to get a grasp of the un-simplified reality we live in. No digital representation will ever be able to embrace it. Get over it. Analog technology will eventually take over from digital technology, and we will need a new generation of inventors.



Luca Gammaitoni is Professor of Experimental Physics at the University of Perugia in Italy and the director of the Noise in Physical Systems (NiPS) Laboratory.