Sensing Systems was a multidisciplinary exhibition and event series celebrating the majesty of system dynamics in nature, which took place between February and March 2020 in Nottingham. The debut exhibition by Matt Woodham at Bonington Gallery was a live, interactive system of light and sound. Installations throughout the gallery space harnessed mechanisms commonly found within the dynamics of nature, producing a perceptual experience that was organic and textural, yet at once familiar and uncanny. The processes utilised were inspired by Matt Woodham's educational background in cognitive neuroscience and his further research into complexity and dynamical systems, such as feedback, reaction-diffusion and chaos theory. Visitors to the exhibition could interact with the real-time generative works via a custom-built sculptural interface in the centre of the space. The interface also enabled visitors to bend light around the room and trigger analogue video synthesiser animations on angled video screens.
The event series included a symposium at Nottingham Contemporary which brought together artists and scientific researchers to deliver talks and discussions around the shared goals and challenges of artistic and scientific pursuits. There were also 6 arts and tech workshops delivered at Near Now, sharing tools and techniques for creating real-time generative art, and artistic processes which utilise scientific concepts such as cymatics. The finale of the project was to be a live audio-visual performance at Metronome with musicians and artists; unfortunately this event, along with the final 2 weeks of the exhibition, was curtailed by the Covid-19 pandemic. In response, an online interactive stream of the exhibition was created on Twitch, allowing users to control the generative animations via a chat bot, much like the interface in the exhibition. It was a fitting end for a project themed around system dynamics and their vulnerabilities to be halted by a cascade failure of fragile global human systems.
EXHIBITION MECHANICS
Below I outline how the technical challenges of the exhibition were approached. I will walk through the hardware and software systems we utilised and built to produce a 20 minute, real-time, generative experience comprised of 5 different 'scenes' running on 7 projections, 6 ESP32 microcontrollers, 4 Raspberry Pis, 750 LED pixels and 1 computer. Most of the custom software was written by developer and maker Benjamin Shirley-Quirk.
An audio mix is being made in response to the Covid-19 lockdown that followed, to provide an atmospheric companion to this page, and will be uploaded soon.
1. INTERACTIVE PROJECTIONS
HARDWARE
COMPUTATION
The first consideration was figuring out the most efficient and affordable way to run up to 8 concurrent real-time interactive video streams. As my live visual software of choice, VDMX (more on this later), runs only on macOS, this restriction became a tricky initial challenge. After briefly considering stacking multiple Mac minis, the conclusion was to build a custom system with a powerful multi-output GPU. The Aorus RX 5700 XT GPU had just been released before the exhibition and proved suitable for running 7 generative 1080p streams in real time. This GPU only has 6 outputs, however, so an AJA HA5-4K was used to split one 4K output and distribute it among 4 separate 1080p projectors. The HA5-4K had the added benefit of SDI output – allowing for hassle-free long cable runs of HD video signals.
CPU & MOTHERBOARD
GPU & CASE
CUSTOM SCULPTURAL INTERFACE
The next biggest challenge to consider was how an audience could interface with the real-time systems. A custom sculpture was designed, comprising 15 arcade-style buttons modified to fit individual WS2812 addressable LEDs, and 10 encoder assemblies with 144/m density WS2812 LED strip mounted on a laser-cut panel to display the encoder values. The 3 sides of the pyramidal tip each corresponded to a projection, and the top panel controlled both the caustic reflections around the room and the 'screen garden' installation. This large number of inputs and LEDs was handled by an ESP32 microcontroller inside the structure, which sent OSC data over WiFi to VDMX running on the custom machine. The interface also housed a Fane Colossus 18XB speaker driver from a Funktion-One sound system, which provided omnidirectional rumbling bass frequencies from all the audio tracks.
INTERFACE
INTERNALS
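To give a sense of the control data involved, here is a minimal sketch of the kind of OSC messages the interface produced, written as a desktop test sender in Python rather than the actual ESP32 firmware. The IP address, port, address patterns and value ranges below are placeholders for illustration – the real mappings were configured in VDMX.

```python
# Desktop test sender approximating the interface's OSC output, useful for
# checking VDMX mappings without the sculpture present. The ESP32 firmware sent
# equivalent messages over WiFi; all names and values here are placeholders.
# Requires: pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

VDMX_IP = "192.168.1.10"   # placeholder address of the machine running VDMX
VDMX_PORT = 1234           # placeholder OSC input port configured in VDMX

client = SimpleUDPClient(VDMX_IP, VDMX_PORT)

# One message per arcade button: 1 on press, 0 on release (hypothetical address scheme)
client.send_message("/interface/button/7", 1)
client.send_message("/interface/button/7", 0)

# An encoder value, normalised 0.0-1.0 so VDMX can map it straight onto a slider
client.send_message("/interface/encoder/3", 0.42)
```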
SOFTWARE
VDMX
The beating heart of the entire operation was VDMX. This one piece of software was both the source of all the generative video and the hub for OSC message routing. The cue list plugin was used extensively to create the sequence of events, scheduling when the projections should play, when the pendulums should activate, when the screen garden should trigger a random video, when (and on which channel) to play audio, when specific LEDs should be active, and so on. The cue list was over 150 items long and sequenced the entire show, including everything up to the 'identifying signal', which primed audiences about which scene would happen next. The interface controlled several programmed 'macros' in VDMX, which adjusted a wide variety of parameters and cycled through preset states. Many of the parameters and FX were contained within feedback loops, causing unpredictable behaviours and emergent visual textures. The cue list also triggered random events and parameters, so that no two 20 minute cycles were ever the same.
2. AUTO DOUBLE PENDULUMS
The double pendulums presented a multitude of challenges and were developed over several prototypes. The final construction for the exhibition was engineered as precisely as possible. Precision is of absolute importance for this piece, as the double pendulums demonstrate chaos theory through their high sensitivity to initial conditions. Both pendulums were constructed identically, and a high-resolution stepper motor automatically raised each pendulum to the same position, which was also confirmed with an accelerometer. When the electromagnets released the pendulums, the traced paths very quickly diverged due to slight discrepancies and this sensitivity to initial conditions.
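To make that sensitivity concrete, here is a small numerical sketch (not part of the exhibition software) of two ideal double pendulums started a millionth of a radian apart, using the standard equations of motion. The masses, lengths and time step are arbitrary; the point is simply how quickly the two trajectories separate.

```python
# Two ideal double pendulums, identical except for a 1e-6 rad offset in the
# first joint, integrated with small semi-implicit Euler steps.
import math

def accelerations(t1, t2, w1, w2, m1=1.0, m2=1.0, l1=1.0, l2=1.0, g=9.81):
    """Angular accelerations of an ideal double pendulum (standard equations)."""
    d = t1 - t2
    den = 2 * m1 + m2 - m2 * math.cos(2 * d)
    a1 = (-g * (2 * m1 + m2) * math.sin(t1)
          - m2 * g * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * m2 * (w2 * w2 * l2 + w1 * w1 * l1 * math.cos(d))) / (l1 * den)
    a2 = (2 * math.sin(d) * (w1 * w1 * l1 * (m1 + m2)
          + g * (m1 + m2) * math.cos(t1)
          + w2 * w2 * l2 * m2 * math.cos(d))) / (l2 * den)
    return a1, a2

def simulate(theta1, theta2, steps=20000, dt=0.0005):
    w1 = w2 = 0.0
    path = []
    for _ in range(steps):
        a1, a2 = accelerations(theta1, theta2, w1, w2)
        w1 += a1 * dt
        w2 += a2 * dt
        theta1 += w1 * dt
        theta2 += w2 * dt
        path.append((theta1, theta2))
    return path

a = simulate(math.radians(120), math.radians(120))
b = simulate(math.radians(120) + 1e-6, math.radians(120))  # tiny offset in one joint

# The angular separation of the first joint grows by orders of magnitude over the run
print("difference after one step:", abs(a[0][0] - b[0][0]))
print("difference at the end:    ", abs(a[-1][0] - b[-1][0]))
```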
A custom aluminium mount was CNC milled to affix the pendulum to the keyway coupler of the stepper motor. This assembly sandwiched and clamped the CNC-cut luminescent circular back panel at the motor axle. The panel had CNC-cut holes to place the two electromagnets, which engaged to lock the pendulums in place as they were raised to position, and disengaged to release them. There was also an engraved groove for placement of the wireless charging coil, which charged a LiPo battery, keeping the UV LED illuminated and tracing the pendulum's path. A prototype pendulum assembly with receiver coil, charger PCB, battery and UV LED is pictured to the right. The pendulums were laser-cut and CNC-milled from 10mm thick acrylic, with high-quality bearings and affixed metal washers to lock onto the electromagnets. Transmitting power across a free-moving pendulum to illuminate the LED, and clamping/raising the pendulums to a precise position, were two of the greatest challenges of this installation.
The lock, raise, release, return sequence was initiated by an OSC message from VDMX and handled by an ESP32 on the local WiFi network.
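As a rough illustration of how that sequence might be structured, here is a Python sketch with stand-in hardware functions. The real version was ESP32 firmware driving the stepper, electromagnets and accelerometer, and the "/pendulum/run" address below is hypothetical.

```python
# Sketch of the lock / raise / release / return sequence, triggered by OSC.
# All hardware functions are placeholders; the actual install ran this on an ESP32.
# Requires: pip install python-osc
import time
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def engage_magnets(on):   print("electromagnets", "on" if on else "off")          # placeholder
def raise_to_position():  print("stepper: raising pendulum to start position")    # placeholder
def return_to_rest():     print("stepper: returning to rest")                     # placeholder
def position_confirmed(): return True  # placeholder for the accelerometer check

def run_sequence(address, *args):
    engage_magnets(True)          # lock the pendulum arms in place against the back panel
    raise_to_position()           # raise both pendulums to the same start position
    if not position_confirmed():  # verify the start position with the accelerometer
        return_to_rest()
        return
    time.sleep(1.0)               # let the assembly settle
    engage_magnets(False)         # release: the chaotic swing and light trace begin
    time.sleep(30.0)              # swing duration (arbitrary here)
    return_to_rest()              # return for the next cycle

dispatcher = Dispatcher()
dispatcher.map("/pendulum/run", run_sequence)
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```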
PENDULUMS IN ACTION
PROTOTYPE PENDULUM ASSEMBLY
3. SCREEN GARDEN
Six LCD screens were installed in angled pairs on plinths. Each pair contained a Raspberry Pi 4 with its own playlist of videos. When a specific OSC message was sent from VDMX – either triggered by the cue list or by visitors pressing a 'secret' button on one of the encoders – a Python script played two videos at random from the playlist, creating an ever-changing composition of videos.
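A minimal sketch of that kind of trigger script is below, assuming the python-osc library and a command-line player such as omxplayer; the folder, OSC address and port are placeholders rather than the script actually used in the exhibition.

```python
# Screen-garden trigger sketch for a Raspberry Pi: wait for an OSC message from
# VDMX, then launch two randomly chosen clips, one per attached screen.
# Requires: pip install python-osc
import random
import subprocess
from pathlib import Path
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

PLAYLIST = sorted(Path("/home/pi/videos").glob("*.mp4"))  # placeholder playlist folder

def play_random_pair(address, *args):
    clip_a, clip_b = random.sample(PLAYLIST, 2)  # two different clips each time
    # One player process per screen; routing each player to a specific HDMI
    # output depends on the player (omxplayer/VLC) and the Pi's configuration.
    subprocess.Popen(["omxplayer", str(clip_a)])
    subprocess.Popen(["omxplayer", str(clip_b)])

dispatcher = Dispatcher()
dispatcher.map("/screengarden/trigger", play_random_pair)
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```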
The videos themselves were produced using a DIY analogue video synthesiser, built primarily from LZX PCB designs. Simple functional blocks combine to create complex geometric patterns. Emergent, organic, 'reaction-diffusion-like' texture and motion is achieved using multiple layers of feedback loops between the analogue hardware and digital recording software.
4. CUSTOM LIGHTING CONTROL
A custom LED driver was developed to control five 5m lengths of neon flex attached to scaffolding structures, with embedded 12V WS2811 addressable LED pixels. A modified version of the 'fadecandy' firmware was installed on ESP32s; the modification was needed to overcome the 64 pixel limit on each of the 8 fadecandy channels, as each length of flex contained 150 pixels. The ESP32 microcontroller and power supply were installed in a laser-cut enclosure on the scaffolding structure.
Video animations were generated and controlled by VDMX, then sent via Syphon to Processing, which converted the frames to pixel data and transmitted it as OPC over WiFi, via a server, to the individual microcontrollers. This real-time feed of pixel data was sequenced by the VDMX cue list plugin, signalling the upcoming projection, and the animation could be controlled from the top panel encoder of the interface.
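For anyone unfamiliar with Open Pixel Control, the wire format is very simple. Here is a small Python sketch of composing and sending a single OPC frame for one 150 pixel length of flex; the host address and test pattern are placeholders, and the original pipeline sent frames like this from Processing rather than Python.

```python
# One Open Pixel Control (OPC) frame: a 4-byte header (channel, command 0 =
# set 8-bit colours, 16-bit big-endian data length) followed by RGB triples.
import socket
import struct

HOST, PORT = "192.168.1.20", 7890   # placeholder server address; 7890 is OPC's default port
NUM_PIXELS = 150                    # one length of neon flex, as in the install

def opc_frame(channel, pixels):
    data = bytes(c for rgb in pixels for c in rgb)       # flatten [(r, g, b), ...]
    header = struct.pack(">BBH", channel, 0, len(data))  # channel, command, data length
    return header + data

pixels = [(80, 30, 0)] * NUM_PIXELS  # a dim orange wash across the strip

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(opc_frame(0, pixels))  # send to channel 0
```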
5. AUDIO
Each projection had its own stereo audio track, which again was routed to the correct channel by VDMX. Stereo speakers were placed beside each projection to provide a spatial orientation to the sound and direct the audience around the room. The audio was routed via the audio interface so that the 18″ bass driver within the sculptural interface in the middle of the room resonated only the bass frequencies of each audio track throughout the room.
The audio was created using a Eurorack modular synthesiser, utilising chaotic circuits and feedback to create unpredictable, textural sounds. These live synthesiser recordings were then arranged and produced in collaboration with musician Matt McMahon.
If you would like to talk about anything to do with this project (or other projects!) – please feel free to get in touch with me at matt[at]multimodal[dot]live.
Thank you to collaborators:
Benjamin Shirley-Quirk, Alex Pain, Rich Maskey, Hattie Coupe, Ulrike Kuchner & Matt McMahon
Partners: