Signals and Noises

Thoughts and Theory

First off, the name of my web domain, Signals and Noises, is half pun, half tip-of-the-hat to Claude Shannon and Warren Weaver's 1949 publication The Mathematical Theory of Communication, which essentially laid the groundwork for what is today commonly referred to as information theory, in which the concepts of 'signal' and 'noise' play central roles. Simply put, the term 'signal' refers to any message, regardless of medium: spoken dialog, music, telephony, internet stream, text, images, radio or television broadcast, photons from a distant galaxy, etc. The term 'noise' includes anything that obscures that message: static, transmission gaps and errors, interference, cancellation, echo, harmonic distortion, scratches on a phonograph record, coffee stains, typos, torn paper, missing pages, cloud cover, etc. The signal-to-noise ratio (SNR), a term familiar to all audiophiles, is a measure of the fidelity of the received message to the original signal. Noise is intimately related to the universal concept of entropy, while signals often include redundancy as a form of error correction, which is to say, a way of increasing the signal-to-noise ratio. For example, natural language (including writing) is highly redundant: yu cn rd ths sntnce evn f mst f th vwls r mssng. James Gleick's The Information: A History, a Theory, a Flood is a very readable introduction to information theory and its significance.
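As a rough illustration of that redundancy (my own sketch, not anything drawn from Shannon or Gleick), a few lines of Python can estimate the single-character entropy of an English sentence and compare it with the entropy of uniformly random letters; English comes out lower, and the gap widens further once longer-range context is taken into account.

```python
# Sketch: single-character Shannon entropy of English text vs. a uniformly
# random alphabet. The sentence and the comparison are illustrative only.
import math
from collections import Counter

def char_entropy(text):
    """Zeroth-order Shannon entropy, in bits per character."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

sentence = "you can read this sentence even if most of the vowels are missing"
print(f"English sentence:        {char_entropy(sentence):.2f} bits/char")
print(f"Uniform letters + space: {math.log2(27):.2f} bits/char")
```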


Snippet of notes generated from a chaotic system


Let's move on to the last phrase of the epigraph on my home page: "sources of semi-autonomous authority." By which I mean authorship, or in a broader sense, agency: the act of making essential decisions about any or every aspect of a work under construction. Perhaps we might say that authors are humans, while agents are algorithms; in either case, someone or something that causes something to be done. Autonomous agents do what they want all by themselves; semi-autonomous agents not so much: constraints are placed upon them, and they operate within a small realm.

Here's a quick example illustrating the difference, provided by Lali Barrière. The creative agent in this case is an algorithm that draws circles of random diameters and colors within a rectangular boundary. In the first image below (left), the algorithm is allowed to "run wild": there are virtually no constraints on the size, color, or position of the circles. In the second image (right), small constraints have been imposed. The range of permitted diameters is smaller, the colors never stray from the line between green and black, and perhaps most importantly, the locations of the circles are relegated to a grid. To my eye, the second composition is far more arresting and aesthetically pleasing, perhaps because the imposition of constraints effectively transforms a random process into a loose kind of pattern, one which is more relatable than pure noise. Humans like patterns.


Image with random circles in a plane   Image with less random circles on a grid
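For concreteness, here is a minimal Python sketch of the two modes pictured above (my own reconstruction using matplotlib, not Lali Barrière's actual code): on the left, circles with unconstrained size, color, and position; on the right, circles snapped to a grid, with small radii and colors confined to the green-to-black line.

```python
# Sketch of unconstrained vs. constrained circle drawing (illustrative only).
import random
import matplotlib.pyplot as plt
from matplotlib.patches import Circle

fig, (ax_wild, ax_grid) = plt.subplots(1, 2, figsize=(10, 5))

# Left: "run wild" -- size, color, and position are all unconstrained.
for _ in range(60):
    center = (random.uniform(0, 10), random.uniform(0, 10))
    radius = random.uniform(0.1, 2.5)
    color = (random.random(), random.random(), random.random())
    ax_wild.add_patch(Circle(center, radius, color=color, alpha=0.7))

# Right: positions snap to a grid, radii stay small, and every color lies
# on the line between green (0, 0.5, 0) and black (0, 0, 0).
for i in range(8):
    for j in range(8):
        radius = random.uniform(0.2, 0.5)
        t = random.random()
        ax_grid.add_patch(Circle((i + 1.5, j + 1.5), radius, color=(0, 0.5 * t, 0)))

for ax in (ax_wild, ax_grid):
    ax.set_xlim(0, 10)
    ax.set_ylim(0, 10)
    ax.set_aspect("equal")
    ax.set_xticks([])
    ax.set_yticks([])

plt.show()
```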


This might be a good spot for a clarification. The term 'random' is actually a hot topic in science and math. Is anything truly random? Or is there an ultimate cause—albeit one which remains beyond our ability to perceive? The random number generator available on most computers is, in fact, a pseudo-random number generator: the algorithm that produces the sequence of apparently uncorrelated numbers is itself completely deterministic. In fact, if you "seed" the algorithm with the same number every time you run it, it produces exactly the same "random" sequence. Which is why, in most applications, the algorithm is seeded with the current time and date, so that, for all intents and purposes, you cannot predict what it will produce. Prediction, or more precisely the lack thereof, is key to what we understand as randomness: prediction is not possible, and patterns are not discernible.
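A few lines of Python make the seeding point concrete: the same seed reproduces exactly the same "random" sequence, while the default seeding (from the clock or the operating system's entropy source) makes the output effectively unpredictable.

```python
import random

random.seed(42)
first_run = [random.random() for _ in range(5)]

random.seed(42)                     # same seed ...
second_run = [random.random() for _ in range(5)]

print(first_run == second_run)      # True: an identical "random" sequence

random.seed()                       # default: seeded from time / OS entropy
print([random.random() for _ in range(5)])   # different on every run
```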

I encountered my first computer in 1981, a DEC VAX-11/780 that was the size of three refrigerators standing side by side. It also required an air-conditioning unit slightly larger than the computer itself, and had two cake-platter disk drives, each the size of a washing machine. The analog-to-digital/digital-to-analog converter occupied fully half of a 6-ft tall, 19-inch rack. It had less computing power than your average smartphone today. I was enchanted, and I remember my overwhelming desire being nothing more than to teach it to sing.

But how to do so? Computers excel at producing mechanically perfect bleeps and blops, but who's going to dance (or cry) to that? So you start thinking about undermining the rigid, mechanical perfection in an effort to "humanize" or "naturalize" the computer's output. One of the easiest and most obvious ways (on a computer at least) to make the perfection imperfect is to inject randomness. That does shake things up a bit, and applied judiciously it can definitely enrich whatever it is that's being synthesized. But are we to suppose that randomness is a good model, or substitute, for the human touch, for a natural feel?
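As a sketch of what injecting randomness judiciously might look like (a hypothetical humanize function, not any particular synthesis system I used), one can nudge mechanically perfect note onsets and amplitudes by small random amounts:

```python
import random

# A mechanically perfect line: (start_time_sec, frequency_hz, amplitude)
quarter = 0.5
perfect_notes = [(i * quarter, 440.0, 0.8) for i in range(8)]

def humanize(notes, time_jitter=0.01, amp_jitter=0.05):
    """Nudge each note's onset and loudness by a small random amount."""
    out = []
    for start, freq, amp in notes:
        start += random.uniform(-time_jitter, time_jitter)
        amp = min(1.0, max(0.0, amp + random.uniform(-amp_jitter, amp_jitter)))
        out.append((start, freq, amp))
    return out

print(humanize(perfect_notes))
```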

No, probably not. Humans don't really behave (or perform music, or create art) randomly. Randomness is not what distinguishes humans from machines. Randomness certainly does exist in nature, but to my way of looking at things, even casual observation indicates that randomness is not the fundamental organizing principle of the universe. Instead, everywhere we look, what we see are patterns and hierarchies. There's something more complex going on. All Douglas-firs resemble each other, but they are not all alike.

What I was looking for was a generating agent that matched, or at least approached, the types of structures and behaviors that we observe in nature: an agent capable of a kind of randomness, but with a bit more nuance and inner cohesion. As it happened, the meteorologist Edward Lorenz had discovered, during the 1960s, that precise long-term weather prediction is impossible, because we can never have perfect knowledge of the current state of the atmosphere at all places around the globe; thus any model of atmospheric behavior will always diverge from the behavior of the real atmosphere over time. This principle is known as sensitive dependence on initial conditions: even minute differences in the initial starting point can (and will) lead to large and unpredictable changes in long-term behavior. Over the next couple of decades, this insight—that deterministic nonlinear systems are capable of unpredictable, or chaotic, behavior—began to permeate research across many disciplines—mathematics, physics, biology, chemistry, medicine, economics, sociology—as researchers became aware of the ubiquity of nonlinear dynamical systems (a.k.a. chaotic systems) and fractal structures throughout nature, including human behavior, society, and physiology. I had the good fortune to meet Henry Abarbanel, at that time the director of the Institute for Nonlinear Science at UCSD, who introduced me to the families of chaotic systems. Benoit Mandelbrot's highly influential book The Fractal Geometry of Nature had also just come out, and eventually it came to be understood that chaos and fractals are two sides of the same coin: fractals are the footprints left behind by chaotic processes. Mountains and coastlines—quintessential examples of fractals in nature—are the result of the combined nonlinear dynamics of plate tectonics and the weather. One of the most useful textbooks I had on the subject was J.M.T. Thompson and H.B. Stewart's Nonlinear Dynamics and Chaos: Geometrical Methods for Engineers and Scientists, published in 1986. James Gleick made chaos theory accessible to non-engineers and non-scientists with the publication of Chaos: Making a New Science in 1987.
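To make sensitive dependence on initial conditions concrete, here is a minimal Python sketch (crude Euler integration of the classic Lorenz system with its standard parameters; illustrative only, not code from Lorenz or Abarbanel) that starts two trajectories differing by one part in a million and prints how quickly they drift apart.

```python
# Two Lorenz trajectories whose starting points differ by one part in a million.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One crude Euler step of the Lorenz equations."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.000001)    # a one-in-a-million perturbation of z

for step in range(5001):
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}   |x_a - x_b| = {abs(a[0] - b[0]):.6f}")
    a = lorenz_step(*a)
    b = lorenz_step(*b)
```

Run it and the difference stays microscopic at first, then grows until the two trajectories are effectively unrelated, even though every step of the computation is perfectly deterministic.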


Snippet of notes generated from a chaotic system


In retrospect, the employment of chaos, fractals, stochastic procedures, and information theory (Markov chains, data manipulation) in music and art can be seen as simply the latest manifestation of what has been a hallmark of art practice since the early twentieth century: the use of chance operations and numeric procedures as generators of the new, the unexpected, and the provocative—to both the artist and the public. We see this beginning with the Dadaists, Duchamp's readymades, the frequent use of found objects, the serial techniques of the Second Viennese School and their modern-day disciples, the splatter paintings (or "action" paintings) of Pollock and de Kooning, the "indeterminate music" of Cage and Feldman, and on and on. Artists consciously employ chance operations as a foil to their own ingrained tendencies, as a means of forcing themselves out of their own comfort zones. Or just to make something that they would otherwise never think of.

