
Error, Noise, Glitch

Phillip Stearns

The Art of the Algorithmic Unconscious[1]

This article is a meditation on the underlying substrate (the material) of artworks produced by human/machine collaborations in which the uniqueness of the machine is an integral element; the "imaginative" or creative properties of its algorithms are vital to the completion of the artwork. The idea that the machine can function as a collaborator rests on the premise that humans have embedded their thoughts, ideas, and abstract notions into these machines; that they are a kind of funny mirror in which we see reflected a distorted image of ourselves is central to the ideas presented within.


Somewhere on the far side of the known universe, a wrinkle in the fabric of space-time ripples through empty space, traveling at the speed of light towards what from its perspective appears to be a tiny speck of dust in the vastness of the cosmos. This wrinkle, a highly energized photon, a gamma ray, speeds towards its destination: a lonely blue planet orbiting a tiny yellow star drifting in the void between the arms of a spiral galaxy. The world rushes up before it, and high in the upper atmosphere, our traveller collides with an oxygen molecule. The impact creates a fantastic explosion, echoing the events of the big bang, but on a much smaller scale. Particles are produced from the energy of the impact, translating momentum into matter, exotic matter lasting for the briefest of instants: gluons, quarks, proton and anti-proton pairs, electron and positron pairs, pions, muons, neutrinos, anti-neutrinos, each a unique wrinkle in the fabric of space-time. As the particles scatter, the neutrinos and their counterparts leave the scene of the accident unimpeded by normal matter; gluons coalesce into quarks, which coalesce into highly energized protons and neutrons. One of those neutrons ejected from the heart of the subatomic mini-big-bang collides a short time later with a nitrogen atom, jettisoning a proton from the nucleus, resulting in an alchemical transformation of nitrogen into carbon. Particles produced in the micro furnace of the gamma ray collision meet similar alchemical fates, smashing into DNA molecules of the terrestrial beings below, driving random mutations in the evolutionary development of life. Another of those neutrons produced in the original collision tears off at near light speed, penetrating the hull of an aircraft and slicing into the heart of its navigational computer, where it reacts with the nucleus of an atom in the ceramic casing of the core processor. Another cascade of transformations occurs, producing a shower of positrons which annihilate electrons in the silicon of the processor itself. This neutralization of electrons is interpreted by the navigational computer as a legitimate piece of information: a bit flipped from zero to one (an alchemy of data?). It doesn't know that the changed information is anomalous; the algorithms responsible for taking in data, processing and analyzing it, and ultimately guiding the aircraft run it through the mill. The aircraft rolls sharply in the night sky and descends into the clouds below.

A Failure of Materials

To err is human and to glitch is machine. Do these two characterizations of behavior parallel one another; is there a metaphorical relationship between them? If so, can it be hypothesized that there is an innate connection between human and machine which explains this parallel? A possible explanation which may allow for such a connection to be inferred is that a tool or technological object is a projection of human intentions, desires, and ideas onto an object. If the fashioning of a complex tool can be understood as the manifestation of the dreams guiding those desires and intentions into a technological object, then it can be inferred that encoded and inscribed within the physical form of the technological object are the ideas, the bodies of knowledge, and deeper still the cultural values and structures of belief which form a dynamic relationship between a society and its environment (conditions of existence). What follows is an understanding of the glitch as an inevitable feature of technology, the result of imperfect machines building imperfect machines in the pursuit of perfection (from what we can tell, a uniquely human ideal). The glitch therefore becomes a way of examining the fallibility of what is essentially a human desire for perfection; the pursuit of this goal through infinite improvement and revision, however, implies that perfection is an unobtainable goal.

If a specific tool or technological object fails, it can be agreed that it has encountered some kind of limit---environmental, functional, or otherwise---which has caused it to function short of or outside the scope of its intended utility. Although failure itself is a foreseeable consequence of any undertaking (though many unwisely curtail preparatory actions in the face of its eventuality), the specific outcome of a failure has the potential to breach the limit of what is humanly imaginable. The most catastrophic example in recent memory demonstrating the limits of human imagination is the eruption of "Mt. Fukushima" following the massive magnitude nine Tohoku-Taiheiyou-Oki earthquake and resulting tsunami, which flooded the backup generators at the Fukushima Daiichi nuclear plant, disabling the emergency cooling systems. It was not a question of the magnitude of the earthquake breaching the limits of the conceivable (the seismic activity resulting from the off-shore earthquake did not exceed Fukushima Daiichi's design limits), but the sequence of events in the wake of the tsunami that followed, which led to the full meltdown of reactors 1, 2, and 3 and the explosion of reactors 1, 3, and 4[2]. The notion of failure is thus a question of limits, both of material: the actual physical substrate, the arrangement of that substrate to represent codified actions, the specific relationships with its environment (its situation); and of imagination: the confluence of perception and practical knowledge with insight, interpretation, intention and agency.

As artists working within a much larger tradition of material interrogation, it is important to ask ourselves: "What are the materials we are working with? What role do they have in creating a space where metaphorical relationships can give rise to meaning?" For the time being, we'll overlook the question of meaning along with output media---net art, displays on video monitors, projections, prints, texts, performed actions, etc.---and zero in on the primary material. To do this we will have to set aside materials in the traditional sense, the primary materials of the so-called "classical" arts---wood, paper, stone, metal, textiles, paint, canvas, photographic paper, film, light, sound, movement (this list is by no means exhaustive). In the established field of digital art, our primary materials may be considered to be data and code. The generalization that I'm making (which may prove to be a dangerous one) is that any artworks which utilize digital systems (based on binary logic) in their production involve the generation or acquisition, and processing, of numerical information, which is dictated by instruction sets or code (which we will see is indistinguishable from numerical information) contained in programs. The conclusion drawn from this is that digital art has a "material" which distinguishes it from other disciplines, and any discussion of the material of digital art (of which glitch art is a subgenre) must include not just the data (as information) and its source, but the means by which that data is generated and processed: the code or underlying instructions/algorithms.

Re-imagining the Architecture of Error

Assuming that to err is human, is to glitch really the machine equivalent of erring?

Our brains are massively parallel biological interpretive engines built from a diverse array of neurons. They contain roughly 100 billion neurons with somewhere near 1 quadrillion connections between them. Although they can be classified into distinct types---according to function, structure, and other parameters---each neuron is completely unique. A single neuron may contain hundreds of synapses, or connections from other neurons, and may also connect to dozens of others. At these synapses, neurotransmitters, or chemical messengers, open or close ion channels, portals through which ions such as sodium or calcium may pass. Opening and closing these ion channels has the effect of altering the electrical properties of the neuron, which is normally polarized with respect to its surroundings, typically resting at a slightly negative voltage. It is only after sufficient "stimulation" (de-polarization) that the neuron releases neurotransmitters at connections with other neurons. This is a gross simplification of the process---the number of different neurotransmitters and ion channels, together with their effects on different neurons, produces myriad ways for "information" to be gathered, processed, analyzed, and stored---but it serves to illustrate that computation in the brain is not a simple binary operation. Despite the temptation to look at the computational devices we create as metaphors for the brain, the reality is that they are vastly different in both architecture and function. However, as a consequence of these machines having been built and designed by us, there are detectable fragments of our logic, language, and imagination embedded within.
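To make the contrast concrete, here is a minimal sketch of a leaky integrate-and-fire model, one of the simplest abstractions of the process described above. It is not the biology itself, and every parameter value is an arbitrary illustration; the point is only that the neuron's state is continuous, and that it "fires" only after sufficient accumulated stimulation.

```python
# A minimal sketch of a leaky integrate-and-fire neuron: a drastic
# simplification used here only to contrast continuous, threshold-based
# "computation" with the two-state logic of transistors.
# All parameter values below are illustrative, not physiological.

def simulate_neuron(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Accumulate synaptic input; fire only when the membrane
    potential crosses the threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # continuous, leaky integration
        if potential >= threshold:               # sufficient depolarization
            spikes.append(1)                     # the neuron "fires"
            potential = reset
        else:
            spikes.append(0)
    return spikes

print(simulate_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# -> [0, 0, 0, 1, 0, 0, 1]
```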

The basic building blocks of our present-day computing systems are junctions between two pieces of silicon with specially engineered properties. These junctions are used to build transistors, the smallest computational unit from which our most complex computational machines are built. A single processor may contain a billion or more transistors, each functioning as a switch: it is either on or off. Although transistors can be designed to provide a continuously variable output (as in analog electronics), here their function has been limited to providing an unambiguous two-state output. Two benefits of designing a two-state system are that information encoded in a sequence of ons and offs is highly immune to noise, and that Boolean algebra, a system of mathematics represented by true/false logical statements, can easily be built from configurations of transistors functioning as on/off switches.
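As a sketch of how Boolean algebra can be built from two-state switches, the toy code below composes the familiar logic gates from a single NAND primitive (in hardware, a NAND gate is itself a small arrangement of transistors). This illustrates the principle, not any particular chip.

```python
# A sketch of how two-state switches compose into Boolean logic.
# Every other gate can be built from NAND alone, which is all this toy shows.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):      return nand(a, a)
def and_(a, b):   return not_(nand(a, b))
def or_(a, b):    return nand(not_(a), not_(b))
def xor(a, b):    return and_(or_(a, b), nand(a, b))

# Truth table for XOR built entirely from the NAND primitive.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```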

Microprocessors are collections of vast numbers of transistors configured in such a way that an instruction formed out of a sequence of on and off messages (1s and 0s, or bits) can select commands, which perform certain operations on a data set. Said another way, instructions are arrangements of bits which correspond to certain commands embodied and represented by physical arrangements of transistors. Code, then, is the arrangement of instructions which form a program. Today, code more closely follows linguistic structures; we use programming languages to instruct computers to perform certain tasks. This development occurred out of necessity, as a string of 1s and 0s representing a set of instructions is difficult for most humans to read and understand at a glance. A solution was to group 4 bits (a nibble) together and represent them in a hexadecimal counting system, where the symbols 0-9 represent zero through nine and A-F represent ten through fifteen. Machine code is this binary machine language, which appears most commonly in its hexadecimal representation. Upon this is built an assembly language, where instructions and commands in machine code are called using a mnemonic code which resembles actual words. Built on this basic foundation are more complex programming languages: FORTRAN, Pascal, Basic, and C, to name only the smallest fraction of the many languages existing today. What this means is that despite the complexities of today's programming languages, all code refers back to a set of instructions (specific to the processor), which at the level of the machine is simply a set of numbers. Everything done inside the computer is then a mathematical operation represented by two-state (binary) logic, and it is because instruction sets and data sets are both arrangements of bits that, at the most basic level, code becomes indistinguishable from data---the two can be interchanged at will.
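A small sketch of the layering described above: the same arbitrary run of bytes viewed as binary digits, as hexadecimal nibbles, as a single number, and as text. Nothing about the bytes themselves says which reading is "correct"; that is the sense in which code and data are interchangeable.

```python
# The same raw bytes read as binary, as hexadecimal nibbles, as an integer,
# and as text. The byte values are arbitrary, chosen only for illustration.

raw = bytes([0x48, 0x65, 0x6C, 0x6C, 0x6F])

binary = ' '.join(f'{b:08b}' for b in raw)     # the machine's two-state view
nibbles = ' '.join(f'{b:02X}' for b in raw)    # hex: one symbol per 4-bit nibble
as_number = int.from_bytes(raw, 'big')         # the same bits as one number
as_text = raw.decode('ascii')                  # the same bits as characters

print(binary)     # 01001000 01100101 01101100 01101100 01101111
print(nibbles)    # 48 65 6C 6C 6F
print(as_number)  # 310939249775
print(as_text)    # Hello
```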

File formats effectively keep information or data and instructions separate and allow us to distinguish between data types. By overriding file formats, the potential interchangeability of code and information can be actualized, enabling the production of interesting mis-interpretations or re-imaginings of previously established data sets. A crude example of this process of disregarding formats and protocols is best illustrated by connecting an audio amplifier directly to points on a computer's motherboard while it is performing a set of instructions (do not attempt unless you are willing to sacrifice your computer!). Here data is sonified in a direct one-to-one fashion: a 1 pushes the speaker out and a 0 causes the speaker to return to its resting position. Other possibilities include manually re-wiring an output pin to an input pin on a microprocessor, which may result in any number of outcomes (one of which may be converting your computer into a doorstop). This manual re-wiring or short-circuiting is the hallmark of the practice of circuit bending. By converting data sets from one format to another, it is possible to render instruction sets (program files) as images, images as sound files, sound files as incomprehensible strings of characters, and those characters back into images. The practice of data-bending takes advantage of this technique of forced data processing by opening image or video files in text or hexadecimal editors, changing a few characters, and then opening the resulting file in an image viewing program. Rosa Menkman, in her "A Vernacular of File Formats", demonstrates the potential of various data-bending techniques performed on a wide range of file formats[3]. When these transformations are performed using standardized file formats, the results take on the signature noise of the algorithms used in the translation from one format to another. This forced rendering of "unconventional" (altered, corrupted, format-inappropriate, or mismatched) data can reveal the architecture of the machine; the grid work of the algorithmic unconscious is revealed.
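The hex-editor procedure above can be sketched in a few lines of code. The file names are hypothetical, the number of altered bytes arbitrary, and the "header safety" offset a crude guess; depending on where the changes land, the result may glitch beautifully or simply refuse to open.

```python
# A minimal data-bending sketch in the spirit of the hex-editor approach:
# read an image file as raw bytes, disturb a few of them past the header,
# and write the result out as a "new" image. File names are hypothetical;
# results depend entirely on the format.

import random

with open('input.jpg', 'rb') as f:
    data = bytearray(f.read())

header_safety = 512                      # crude guess to avoid killing the header
for _ in range(20):                      # disturb twenty bytes at random
    i = random.randrange(header_safety, len(data))
    data[i] = random.randrange(256)

with open('glitched.jpg', 'wb') as f:
    f.write(data)
```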

From Error to Noise

Mis-interpretation from the perspective of machines has no meaning without the context of conventions devised by their human operators.  Error is relevant only in the context of an intended purpose.  To dive further into the nature of machine error (if we can even call it that anymore), we now turn to the introduction of noise---here taken to be anomalous or undesired data.

The example in the opening paragraph illustrates one natural process capable of introducing noise into a digital system by changing the state of a bit from 0 to 1. Were this to happen in a data set representing a bitmap image, the effect might be as subtle as slightly changing the color of a single pixel, or as drastic as corrupting the file in such a way that it is no longer recognizable as an image file; it's all a matter of what that bit's function is. On the level of code, a change in the instruction set could cause any number of effects, ranging from incomprehensible output to the entire system grinding to a halt. What is important here is that errors do not appear to machines as errors at all; all that is really happening are mathematical transformations, numbers acting on numbers via different physical configurations of transistors. Unless a device has been designed to detect and suppress anomalous output or return an error message, the logic gates are perfectly capable of churning out bits as fast as they can be pumped in. Garbage in, garbage out, or so they say.
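A sketch of that single bit flip, applied here to a small raw pixel buffer rather than a real file (the values are illustrative): the same operation can be imperceptible or glaring depending entirely on which bit it lands on, and in a real file a flip landing in the header can render the whole image unreadable.

```python
# A sketch of a single bit flip in raw pixel data (values are illustrative).
# Flipping a low-order bit nudges one colour channel by 1; flipping the
# high-order bit of another byte shifts it by 128, a visible blemish.

pixels = bytearray([200, 30, 64] * 4)    # four RGB pixels

def flip_bit(buf, byte_index, bit_index):
    buf[byte_index] ^= (1 << bit_index)  # XOR toggles exactly one bit

flip_bit(pixels, 0, 0)   # 200 -> 201: imperceptible shift in red
flip_bit(pixels, 3, 7)   # 200 -> 72: the next pixel's red channel jumps
print(list(pixels))
```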

By designing a system built around two-state logic and numerical representation of information, the effect of interference and noise---random electrical variations introduced by thermal fluctuations---on signal fidelity is minimized. In this sense, digital systems are by design anti-noise. In the shift from analog (or rather physical or chemical) forms of art making---where physical agents operated on physical materials---to digital, the inherent noise of physical material and its impact on signal fidelity is controlled and managed according to algorithms (mathematical operations). Anything that is to be generated or processed by a digital system must be represented in numerical form, even the program generating or processing the data. This does not mean that it is impossible to capture noise or generate a sequence of numbers that appear random, rather that noise becomes represented in sets of discrete values. The random variations that characterize noise become limited by the complexity of the mathematics used to represent or reproduce them.
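A sketch of what it means for noise to become "represented in sets of discrete values": a stand-in for continuous random variation is quantized to 4 bits, collapsing it onto just sixteen possible levels. The bit depth is an arbitrary choice for illustration.

```python
# Once a signal enters a digital system, even its noise exists only as
# discrete numbers. Here a "continuous" noisy value is quantized to 4 bits
# (16 possible levels), so fine-grained variation collapses onto a few steps.

import random

def quantize(x, bits=4):
    levels = 2 ** bits - 1
    return round(x * levels) / levels          # snap to the nearest level

signal = [random.random() for _ in range(8)]   # stand-in for analog noise
digital = [quantize(s) for s in signal]

for s, d in zip(signal, digital):
    print(f'{s:.5f} -> {d:.5f}')
```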

Hearing Voices in the Noise

Despite the elimination and control of noise in the form of random fluctuations, other forms of noise become inherent features of digital technologies. Encoding continuously variable values as discrete numbers reduces the impact of noise in the form of transmission errors, but introduces its own signature in the form of quantization errors and other artifacts. To reduce these basic forms of error, we can increase bit depth and sampling rates, but this leaves us with a massive amount of data. Streaming media over the Internet requires us to transmit digitized signals through a system with limited data rates. A standard audio CD has a data rate of 1,411 kbps, which theoretically could be streamed one-to-one on today's high-speed Internet connections, but if you wanted to send two hours of audio recordings to a collaborator elsewhere, you'd have to wait two hours for that transfer to complete, assuming your ISP isn't lying to you about upload rates (which as of 2011 are still only a fraction of download rates). Video is a whole other beast, with data rates of 24 MB/s for uncompressed SD video and five to six times that rate for HD formats[4]. Streaming this data or transferring it across the Internet today at a one-to-one rate is out of the question.
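The figures above follow from simple arithmetic, sketched here. The CD rate is derived from the standard 44.1 kHz, 16-bit, stereo parameters; the video figure simply re-uses the 24 MB/s value quoted from the Final Cut Pro manual rather than deriving it.

```python
# The arithmetic behind the figures above, as a sketch.

cd_bits_per_second = 44_100 * 16 * 2        # sample rate x bit depth x channels
print(cd_bits_per_second / 1000)            # 1411.2 kbps, the "1,411 kbps" above

two_hours = 2 * 60 * 60                     # seconds
cd_megabytes = cd_bits_per_second * two_hours / 8 / 1_000_000
print(round(cd_megabytes))                  # roughly 1270 MB for two hours of audio

sd_video_bytes_per_second = 24_000_000      # ~24 MB/s uncompressed SD (quoted value)
sd_gigabytes_per_hour = sd_video_bytes_per_second * 3600 / 1_000_000_000
print(round(sd_gigabytes_per_hour))         # on the order of 86 GB per hour
```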

Lossy compression schemes allow large volumes of data to be represented by a much smaller amount of data. This is achieved by analyzing a file and removing data that, according to perceptual models, is not perceived by a human viewer. In the case of audio, the spectrum of a signal is analyzed, and based on psycho-acoustic phenomena such as spectral and temporal masking (appropriate for the average human listener, of course), data representing information that would not be perceived by the listener is removed. This loss of information introduces noise in places where it is likely to be masked by the content, so that we are less likely to perceive it. A similar approach is taken with the encoding of image and video files. The overall color palette for a file may be reduced according to its content, patches of very similar tones consisting of hundreds of pixels may be represented as a few overlapping squares of color, and in the case of some video compression schemes, these squares will move according to vectorized paths.
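One of these reductions, collapsing an image onto a small palette, can be sketched with the Pillow library. Real codecs are far more elaborate (perceptual models, block transforms, motion vectors); the file names and palette size here are illustrative choices, not part of any particular scheme.

```python
# A sketch of palette reduction: map every pixel onto a small set of colours.
# File names and the 16-colour palette are illustrative choices.

from PIL import Image

im = Image.open('input.png').convert('RGB')
reduced = im.quantize(colors=16)        # map every pixel to one of 16 colours
reduced.convert('RGB').save('reduced.png')
```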

What characterizes these forms of digital noise, as opposed to the fine-grained variations of analog noise, is that they are highly controlled; noise is only introduced where its impact is minimized, and it is suppressed according to very specific algorithms and mathematical formulas. It doesn't appear as noise because it is designed to take on the appearance of the original signal. Ironically, this desire to mitigate the impact of noise can actually amplify its effects. When a digital signal becomes degraded, the algorithms responsible for decoding a data stream and reconstructing the original information produce artifacts that bear little resemblance to the original content. These artifacts---from fragmented and disjointed images, scrambled geometric patterns, melting color fields, and atonal melodic whistles, to bursts of static---bear the marks of the compression/decompression algorithms which otherwise operate undetected, in the background. These signatures are the products of interpretive algorithms designed to discard information based on human limits of perception. Though I would hardly characterize compression artifacts themselves as the product of machine-based creative improvisation, by repeatedly compressing a file, compression algorithms begin to produce new forms of content (within their limited vocabulary) which can be interpreted as metaphors for hallucinations, active imagination, creativity. In a strange sense, we have encoded ourselves into the machines, imbuing them with a crude form of imagination or creativity.
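The repeated-compression idea can be sketched with Pillow as well: re-encoding a JPEG over and over lets the codec's own vocabulary of artifacts (block edges, smeared color) accumulate into something like new content. File name, quality setting, and generation count are all arbitrary; in practice artists often nudge the image slightly between passes so the artifacts keep compounding.

```python
# A sketch of generation loss: repeatedly re-encode the same image as JPEG
# at low quality so the codec's artifacts accumulate. Names and settings
# are illustrative.

from PIL import Image

im = Image.open('input.jpg').convert('RGB')
for generation in range(100):
    im.save('generation.jpg', format='JPEG', quality=20)   # lossy re-encode
    im = Image.open('generation.jpg').convert('RGB')        # read back the loss

im.save('final.jpg', format='JPEG', quality=20)
```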

A Premature Closing

This appearance of a possible machine creativity, of the machine collaborator, has its roots in the dynamic relationship between digital technologies and their human creators. The production of highly complex processors and the instruction sets which govern their operations involves a collaboration between the humans who specify the design requirements and the computer algorithms they've designed to make decisions on how to execute those designs. The problem of compressing billions of transistors into arrangements that best utilize the surface area of the silicon wafers out of which they're made is nearly infinitely complex, and an incredible challenge for human or computer alone to solve. The necessary collaboration between human and machine enabling the development of more advanced digital technologies is at the core of digital art making practices. As algorithms become a metaphor for human thought encoded in machine language, we are seeding these machines with crude, limited, and highly specific ideas in the form of series of instructions and commands. In light of all this, McLuhan's notion of technology being an extension of ourselves may not be far from the mark[5]; though, far from being autonomous, our machines are dependent upon our survival for theirs. The algorithmic unconscious may not yet be something that we can clearly define or identify; however, we may be able to look at the products of glitch art, circuit bending, and other related forms and identify within them a revised metaphor for ourselves and our relationship to our technology and the environment.

This meditation has focused its attention on the material basis for digital art making practices, touching upon the numerical systems of representation and the algorithms employed by digital technologies. In much the same way that structuralist film abandoned the conventions of cinema in pursuit of working with the material essence of the medium of film, glitch art and circuit bending---and other related practices which force digital systems and algorithms into limit performances---represent a set of practices seeking to work beyond the traditional scope of software or hardware tools, seeking within them essential characteristics and using effects inherent to the medium to explore new avenues of artistic production. It is my hope that this meditation will contribute to the enrichment of the discussion surrounding the work of artists who are working outside of conventional practices, violating not only the physical enclosures of the devices they work with, but the very data structures and architectures of the processors operating within. Through a more refined understanding of the material basis for an artistic practice, it becomes possible to more concisely define the potential conceptual metaphors entailed by the application of specific techniques, and how they can be used to compose a situation that produces a physical effect on the viewer which reinforces the production of meaning on the subjective level.


[1] A yet undefined term, algorithmic unconscious appears independently in the writing of Carl Diehl and of Matthew Fuller & Andrew Goffey.

[2] JAIF, "Tepco Nuclear Power Plants and Earthquakes," September 2011.

[3] Menkman, Rosa. A Vernacular of File Formats. August 2010.

[4] Final Cut Pro 7 User Manual: "Data Rates and Storage Devices," March 2010.

[5] McLuhan, Marshall. Understanding Media: The Extensions of Man. 1964.