Listening To Cellular Automata’s Natural Habitat (or am I…?)
As I have mentioned previously, I am currently working on the OpenLabs Game of Life project. A persistent point I raised during discussions was why we were having the machines communicate with one another via light and sound; both seem like concessions to human phenomenology. I am really taken by the way code inhabits spaces and temporalities inaccessible to the human sensorium, and I can't deny Valentina Vuksic's influence on me in this regard. To that end, I decided to try out my newly purchased pick-up mics to hear what cellular automata software sounds like (the results can be seen in the video below).
(in the above video, two contact mics were placed on the RAM and CPU of the computer running the cellular automata program, in this case the example sketch bundled with Processing)
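For readers unfamiliar with what the machine was actually computing while being listened to: the Processing example implements Conway's Game of Life. As a rough illustration of those rules (not the Processing sketch itself, which is written in Processing's Java dialect and includes drawing code), here is a minimal sketch in Python:

```python
from collections import Counter

def step(grid):
    """Return the next generation of a set of live (x, y) cells."""
    # Count how many live neighbours every candidate cell has
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in grid
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next tick if it has exactly 3 live neighbours,
    # or exactly 2 and was already live (Conway's standard rules)
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in grid)}

# A "blinker" oscillates between a horizontal and a vertical bar
blinker = {(0, 1), (1, 1), (2, 1)}
assert step(blinker) == {(1, 0), (1, 1), (1, 2)}
assert step(step(blinker)) == blinker  # period-2 oscillation
```

The voltage activity the contact mics pick up is, of course, the physical residue of exactly this kind of loop ticking over in RAM and CPU.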
After doing this I came to realise that attuning to software in this manner isn't much different from the light/sound scenario proposed by OpenLabs. The main difference is that the sounds heard above are equally meaningless to both human and software, being the residue of voltage exchanges rather than the significant information those exchanges convey. I suppose the contrarian and software exoticiser in me liked that aspect of it.
In either scenario the following point remains pertinent: by bending the machines into our apprehendable sensorium, we are being disingenuous to the actual ecology that humans and software/hardware assemblages inhabit. The point being that the means by which software affects humans (and vice versa) doesn't have to be confined to sensory registers in order to be significant.