Innovation & Technology: Sound for the Moving Image
Rob Ferguson

I am a multi-disciplinary sonic and visual artist most keenly interested in place and emotion. My work explores the emergent properties of sound and vision, using both experimental and robust digital techniques to captivate the listener's imagination. I look forward to collaborating with artists and designers in disparate fields and am excited to apply my skill set in post-production, video game SFX, advertising, field recording, and music. Having built up my own studio over several years, I am a steadfast Ableton user well versed in Premiere, Max/MSP, Dreams PS4, Eurorack, synthesis, piano, foley/field recording, and Elektron sequencers. I produce music under the moniker rabalan and also enjoy photography and writing. I am always on the lookout for meaningful opportunities to deepen my knowledge and am largely uninterested in zeitgeist movements that take place across the internet. I am not a user of social media, but I love a good email, so please get in touch if you have any queries, especially in relation to the course I undertook at GSA.

Sonorous Body
An exploration of generative environmental sound recessed into an esoteric, spatialised interface. You are transported into solemn vignettes of something that resembles a place in time. The only thing to do here is listen and leave. This interactive piece encourages the player to immerse themselves in a series of ephemeral sonic milieus, expanding on the acousmatic threads of Francisco López and Pierre Schaeffer.
The piece is assembled in Dreams PS4 (an increasingly obscure yet magical intuition-based game engine) and makes extensive use of its experimental audio and visual systems. Sonorous Body echoes the decay of obsolete technology and the intrigue it imparts to its users.
The soundscapes are composed of obfuscated and representational bespoke field recordings. The custom audio generation system binds and manipulates them into ambisonic sound fields that attempt to challenge our perceptions of bucolic nature. The procedural generation techniques allow for a multitude of ghostly sonic snapshots.
Project Links
Without Destinations
“You know we don’t want to return to Unthank. I asked for a town with sunlight.”
“Would Provan suit you? The inhabitants there speak a kind of English.”
…
“Industrially speaking, you see, Unthank is no longer profitable, so it is going to be scrapped and swallowed. In a piecemeal way we’ve been doing that for years, but now we can take it en bloc and I don’t mind telling you we’re rather excited. We’re used to eating towns and villages, but this will be the first big city since Carthage and the energy gain will be enormous.”
(Alasdair Gray, Lanark, 1981)
This imagery of a Scottish council swallowing cities for energy gain, some of those cities without sunlight, was so vivid and so bizarre I had to follow the thread.
I aimed to extrapolate from Gray's ideas and reframe them in a futuristic world, asking: what if this council never stopped, building bigger and bigger cities in untold pocket dimensions, and what would this feel like for ordinary people? I liked the subtext Gray was alluding to: the Thatcher-esque dissolution of cities no longer profitable, combined with the unpredictable and dreich Scottish winters where we live without depending on the sun. This short animation pursues a Glaswegian narrative based on Gray's works while aiming to subvert the typical screen-cities of mega New York/Tokyo, bringing it home to Scotland and incorporating varying degrees of high-future and bog-standard Glaswegian city planning. The film explores something sensory and dizzying, peering into a world that creates more questions than it answers. It stands alone as a story but also acts as a slice of a larger tale.
The CGI involves building scenes in Dreams PS4 with an amalgamation of architecture drawing from Scottish municipal, brutalist, and cyberpunk styles. Visuals were inspired by films like Fallen Angels (1995) and Blade Runner 2049 (2017), recreating some of their iconic shots. Drones were designed simply, to be characterised by sound. Final shots were processed in Premiere Pro.
The SFX, VO, and music were composed and designed by me and produced using Ableton 12 and Pro Tools. A key technique was experimenting with the diegetic/non-diegetic barrier to destabilise the visuals and obfuscate known sounds, particularly the car. The general sound design aimed to be low-tech and naturalistic like Dune (2021) and Drive (2011) but draws from the clinical high-tech fantasy of Metal Gear Solid 4 (2008). The dialogue, a phone call, features a dejected driver and his excitable brother. Although bewildering, the performances are intended to feel authentic, importantly accommodating real Glaswegian accents. The dialogue expands the world-building.
Awumbuk
Awumbuk: There is an emptiness after visitors depart. The walls echo. The space which felt so cramped while they were here now seems weirdly large. Sometimes everything seems a bit pointless. The indigenous Baining people who live in the mountains of Papua New Guinea are so familiar with this experience that they name it awumbuk. They believe departing visitors shed a kind of heaviness when they leave, so as to travel lightly. This oppressive mist hovers for three days, leaving everyone feeling distracted and apathetic. To counter it, the Baining fill a bowl with water and leave it overnight to absorb the festering air. The next day, the family rises very early and ceremonially flings the water into the trees, whereupon ordinary life resumes.
(T. Watt Smith, 2015)
Working in a multi-disciplinary team of students, we set out to capture the raw emotion of this small death in an audio-visual piece. The team worked cooperatively to create one cohesive work, outlining the key physical and emotional stages of the ceremony, in which the tribe sets out a bowl of water to capture the cloud of dismay and thick emptiness, then empties the bowl to cleanse themselves of awumbuk. Each team member worked in layers to compose a component of a rich, full representation of the experience: sound compositions pulled from noise-making sessions; analogue visuals created in-studio; animated sequences created in Adobe After Effects; and composite visual overlays in Max/MSP. The outcome is four minutes of collective footage taking the audience on an emotional journey similar to that of the tribe. The beginning sets the tone of the tranquil peace of day-to-day life, before a harsh and disruptive build of activity disturbs the otherwise placid waters. A flurry of vivid colour and emotion prevails throughout the parting of loved ones, leaving a muddied, thick, and heavy atmosphere. This swamped emotional disarray is then collected, in ceremony, and cleared from the atmosphere to restore the original, undisturbed tranquillity.
Situated Piano
The Situated Piano takes inspiration from Philippe Manoury's Pluton (1988), which combines piano and computer in a way that balances control and generative chaos. The aim is to follow this path by imposing transformative rules on the piano itself to explore novel musical styles and uncover hidden tones and textures. Historically, the piano, invented by Bartolomeo Cristofori, surpassed earlier instruments like the clavichord and harpsichord by offering greater dynamic range. Initially accessible only to the European elite, it became associated with high status and cultural capital. Despite its complex history, the piano became ubiquitous partly due to its functional simplicity and the player's perceived perfect, immediate control over time, amplitude, and frequency. This control instils a sense of predictability, and the instrument's design, refined over centuries, has remained functionally similar for 300 years.
The system specifically challenges two core “rules” of the piano’s canon: its established frequency and amplitude range and its predictability in time.
Firstly, to break the frequency and amplitude limits, the project creates the illusion of strings extending beyond the physical bounds of the instrument. This is influenced by the prepared piano, a technique pioneered by John Cage in which objects are placed on the strings to disrupt tuning and create unstable harmonics, and by artists like Volker Bertelmann (Hauschka), who use devices such as vibrators on the strings. The project combines these ideas with digital processing, feeding the lower regions into a spectral resonator for the sound of long, rattly strings and the upper regions into a physical resonator suggesting tiny strings. A ramped tremolo unpredictably excites the upper resonator.
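The actual signal chain lives in Ableton's Racks, but the register-split idea can be sketched offline. In this minimal Python sketch (using NumPy and SciPy), the crossover frequency, delay times, feedback amounts, and tremolo ramp are all hypothetical stand-ins for the real patch settings, and the two "strings" are approximated with simple feedback comb filters:

```python
import numpy as np
from scipy.signal import butter, sosfilt

SR = 44100  # sample rate

def comb_resonator(x, delay_s, feedback):
    """Feedback comb filter: a crude stand-in for a resonating string."""
    d = max(1, int(delay_s * SR))
    y = np.zeros_like(x)
    for n in range(len(x)):
        y[n] = x[n] + (feedback * y[n - d] if n >= d else 0.0)
    return y

def split_and_resonate(x, crossover=300.0):
    """Split the piano signal at a crossover and resonate each band differently."""
    lo = sosfilt(butter(4, crossover, 'lowpass', fs=SR, output='sos'), x)
    hi = sosfilt(butter(4, crossover, 'highpass', fs=SR, output='sos'), x)
    # long, rattly "string" below the keyboard's range
    low_res = comb_resonator(lo, delay_s=1 / 40.0, feedback=0.97)
    # tiny "string" above it
    high_res = comb_resonator(hi, delay_s=1 / 2500.0, feedback=0.90)
    # ramped tremolo on the upper path: rate glides from 2 Hz to 8 Hz
    t = np.arange(len(x)) / SR
    rate = 2.0 + 6.0 * (t / t[-1])
    trem = 0.5 * (1.0 + np.sin(2 * np.pi * np.cumsum(rate) / SR))
    return low_res + high_res * trem
```

The tremolo here modulates amplitude rather than physically re-exciting a resonator, which is the closest offline analogue to the ramped tremolo described above.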
Secondly, to challenge the dimension of time, the project employs a delay circuit that recalls past events. Inspired by Bugge Wesseltoft’s work with loopers that chop and layer sonic fragments, this project uses a pseudo-random injection of past sounds. A delay system freezes and opens based on a sequencer, splicing in microsound loops from several seconds ago, adding unpredictability that encourages engaging decision-making. In essence, like other artists challenging the piano canon, this project introduces unpredictability and uses external forces to expand the sonic possibilities of the instrument. It works within the established framework of the piano, recognising its perfected design while pushing its boundaries.
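A toy version of that sequencer-gated splice can be written down directly. In this Python sketch the history length, grain size, step interval, and gate probability are hypothetical values rather than the settings of the actual system; each time the "sequencer" opens, a Hann-windowed microsound grain from a pseudo-random point in the recent past is mixed into the present:

```python
import random
import numpy as np

SR = 44100  # sample rate

def splice_delay(x, history_s=4.0, grain_s=0.08, step_s=0.25, p_open=0.35, seed=1):
    """Sequencer-gated delay: at each step, maybe splice in a microsound
    grain grabbed from a pseudo-random point several seconds in the past."""
    rng = random.Random(seed)
    hist, grain, step = int(history_s * SR), int(grain_s * SR), int(step_s * SR)
    y = x.copy()
    for start in range(hist, len(x) - grain, step):
        if rng.random() < p_open:                 # the sequencer opens the gate
            offset = rng.randint(grain, hist)     # how far back to reach
            past = x[start - offset : start - offset + grain]
            env = np.hanning(grain)               # smooth the splice edges
            y[start:start + grain] = 0.5 * y[start:start + grain] + 0.5 * past * env
    return y
```

Lowering p_open pushes the result back toward the untouched signal; raising it thickens the stream with recalled fragments, which is roughly the continuum the live system plays in.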
I made use of Ableton 12's Racks to design a malleable and generative effect system that situates the piano within the field I have outlined. Moreover, the piano itself is provided by Modartt's Pianoteq, an expertly designed physical model of the instrument. I opted for this instead of a live recording as I was able to further contort the piano into something unpredictable and elusive, making use of microtuning and MIDI control.
The final composition is available to listen to, but I have also included some of the test pieces to give a better sense of how this system can warp its sonic input.
Sound Alone [Sonifying the Emotional Core of the Cityscape]
I am well acquainted with the ruckus and tenderness of Glasgow, having made a number of field recordings in the city over the years. For this project, I concerned myself with capturing new tonality and rhythm from an environment I am familiar with. To me, this piece is about the microscopic, silent emotional world of urban spaces. By blending three types of sound (transparent field recordings, vague textures, and synth arpeggios) I was able to explore a new perspective on urban spaces and on our own biological sounds, and to imply emotional context through music.
There were several factors that guided the direction and aesthetic of the track: namely, the nature of the field recordings I captured and the reactive composition techniques I explored. The contact mic (CM) samples translated well into biological noises like breathing, heartbeats, and swallowing but remain sympathetic to the mechanical noises of urban environments. The underlayers of transparent and warped, looped field recordings ground the soundscape in a real but unfamiliar urban setting. The swelling reactive synth lines help to emotionally contextualise the soundscape with ascending arpeggios and woozy vibrato. In this brief conceptual statement, I will discuss the thinking behind the contact mic field recordings and their relationship to the themes presented in the soundscape.
I explored several concepts in sound to inform my production of the soundscape, namely John Cage's use of chance and reactivity to create music. He noted that, given our decreasing dominance over our own complex societies and over the earth, music or art that blindly promotes control is in opposition to our actual place in reality. He suggests that natural flow and impermanence serve us better than the unflinching, unchanging structures that Western ideology concerns itself with. Cage promotes 'letting sounds be themselves': sound can be heard purely as itself and 'not depend for its value on its place within a system of sounds'. This informed my use of musique concrète to arrange the CM samples. I wanted to avoid tempo or rhythm to better emphasise the alien nature of the recordings, and felt they thrived when placed sparsely, as in early electronic music. However, to ground the piece emotionally I employed a synth part that reacted to the density of sound in the main musique concrète part, letting the synth breathe life into it and instantly making it more relatable.
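The density-reactive coupling described above can be approximated with an envelope follower. This Python sketch is a simplification of the actual Ableton routing: a short-term RMS measurement of the musique concrète layer (the window length and drone frequency are hypothetical) directly scales the loudness of a sine drone with a slow vibrato:

```python
import numpy as np

SR = 44100  # sample rate

def density_follower(concrete, window_s=0.5):
    """Short-term RMS of the musique concrete layer, normalised to 0..1,
    used as a rough 'density of sound' control signal."""
    w = int(window_s * SR)
    padded = np.concatenate([np.zeros(w), concrete ** 2])
    csum = np.cumsum(padded)
    rms = np.sqrt((csum[w:] - csum[:-w]) / w)
    return rms / (rms.max() + 1e-12)

def reactive_synth(concrete, freq=220.0):
    """Sine drone whose loudness swells with the density of the concrete part."""
    density = density_follower(concrete)
    t = np.arange(len(concrete)) / SR
    phase = 2 * np.pi * freq * t + 2.0 * np.sin(2 * np.pi * 5.0 * t)  # woozy 5 Hz vibrato
    return density * np.sin(phase)
```

When the concrète layer falls silent, the drone fades with it, so the synth only "breathes" where there is material to respond to.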
The CM samples exist entirely in the realm of anthrophony. Consider typical field recordings, where some level of biophony, anthrophony, and geophony is present, even just briefly. When field recording with a contact mic there is very little capacity to capture the sound of birds or wind, making the recordings completely alien. This kind of recording acts as a way of accessing 'hidden or microscopic' sounds that exist in their own hard-to-access parallel world. This feeds into Cage's argument that there is no such thing as silence: the sound of our bodies, of our listening environments, of the worlds we inhabit emits constant noise that pours into our musical work. He argues that to ignore the noise within musical pauses, within attempted silence, is to deny our place within the real world. I also considered the beauty bias in acoustic ecology, whereby urban field recordings are typically regarded as unwieldy or unpleasant.
For the soundscape I attempted to subvert this notion, framing the quite harsh and chaotic CM recordings as ethereal and intimate, similar to the ambient works of Burial (2007), while also highlighting the hidden, silent world of sound they come from. I also drew from Michel Chion, who suggested that when sounds lose their physical context they become ambiguous, and through reduced listening can be heard as something new. Through what he calls synchresis, one sound can fill the gap of another if the context is right: in this case, the sounds of metal scraping became breathing, and deep metallic donks became heartbeats. Field recordings of the city are typically defined by traffic, pollution, and layers of indistinct, indifferent chatter. The dulled, resonant world of contact vibrations is, to me, one without words or pollution. It is a simple and dense world with an intimacy not usually found in the wide-spectrum soundscapes of typical urban field recordings. This piece focuses on the warmth and interior of the city: not so much interior spaces as safe spaces away from the decay and gaudy layering of modern cities. For me, it is as if you can hear a hidden and intimate part of the city gently stirring, as if it is alive. At times you can hear raspy breathing or metallic heartbeats, and when the synth swells it feels like pockets of light illuminating this hidden world. Its nervous, timid, elated energy contrasts with my previous works in the same field.
Alien: Romulus Opening Sequence [Audio Redesign]
Please note: the video component of this piece has been copyright-claimed by Disney Enterprises and is unavailable to view until the claim is disputed. I have included the sound work alone for review.
There are two areas where I want to highlight the sound design strategies I employed. The opening interior shot of the unmanned spacecraft booting up required me to design a number of fictional computer tones and textures, spatialise them, and ground them in a believable reality. To do this, I reviewed the source material but also referenced similar sequences in Scott's Alien and Cameron's sequel Aliens. I also looked at sequences in Neon Genesis Evangelion for a more modern, stylised take on fantasy computers. I synthesised these sounds using a specialised ring modulation feedback system I designed, which can generate wonderful and unpredictable tones. I then sculpted these into specific sequences for each shot, carefully matching the visual cues, and spatialised them using a combination of Plogue's Chipcrusher and Ableton's ML.Distance. Chipcrusher was used to simulate the sound coming from different pieces of vintage electronic equipment and is core to the rugged analogue aesthetic I attempted to situate the work in. ML.Distance simulates distance and panning (effectively a linked filter and amplifier with an intuitive interface) and was used throughout the SFX design to quickly and effectively assign each sound to a space on screen. These keynote sounds were combined with ambiences and impacts I designed using contact mics and field recordings to further the claustrophobic and hauntological atmosphere of the main deck.
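The actual ring modulation feedback system was built as a studio patch; as a rough illustration of the principle, the following Python sketch ring-modulates a carrier against a fixed modulator and feeds the soft-clipped output back into the carrier's frequency (all frequencies and the feedback amount here are hypothetical):

```python
import numpy as np

SR = 44100  # sample rate

def ringmod_feedback(dur=1.0, carrier=180.0, mod=137.0, fb=0.6):
    """Ring modulator whose output nudges its own carrier frequency,
    producing drifting, inharmonic 'computer' tones."""
    n = int(dur * SR)
    t = np.arange(n) / SR
    m = np.sin(2 * np.pi * mod * t)           # modulator oscillator
    out = np.zeros(n)
    prev, phase = 0.0, 0.0
    for i in range(n):
        f = carrier * (1.0 + fb * prev)       # feedback detunes the carrier
        phase += 2 * np.pi * f / SR
        prev = np.tanh(np.sin(phase) * m[i])  # ring mod + soft clip, fed back
        out[i] = prev
    return out
```

The tanh keeps the loop stable while still letting the feedback push the tone around unpredictably, which is the quality that makes such systems useful for fictional computer voices.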
The other area I wanted to highlight was utilising psychoacoustics and critical listening bands to the fullest extent. To begin with, I wanted to uphold Files and Gilmore's silent opening. At first I thought some ambience or music could fill the opening minute, but I quickly realised that the silence is by design: with absolute silence, the viewer's ears completely reset and wait in anticipation. This long minute is suddenly interrupted by a cacophony of noise as the computer-guided ship comes to life, and this contrast is crucial in maximising the impact. In every other designed sound I made sure that, where appropriate, I was making use of the whole frequency spectrum to fully sell the size and reality of the fictional world. Two examples: at 2:30, when the alien rock is being lowered onto the apparatus, I created a low-end impact to convey its mass and cinematic importance, but to properly convey the sheer size and reality of the sequence I added high-frequency detail sounds like rocks cracking and chains wavering. Without these sounds the full impact is completely lost. Moreover, at 2:38, when the cutting laser is working, I added some mid-range rock cracking and low-range room ambience. Although the laser is the focus of that sequence and is overwhelming in the mix, adding these tiny detail sounds into other parts of the spectrum brings it to life and sells the illusion.
Flooded Grounds [Implementing Dronescapes in Unity with Google Resonance]
Throughout the environment, soundscapes emit from various objects. Some of these lean towards the realistic, whereas others fall into the supernatural. For example, the radio masts and boats have a generally representational sound, but the audio source is slightly warped and distorted, and they resonate in an unnatural way. The dark ambiences from other objects, like the windows in the first cabin and the wardrobe in the next, are completely non-representational: they emit a loop of a warped underworld. Processed samples of wind, contact mic recordings, piano, and waves are some of the textures used to create these loops. The technical intention was to experiment with Google Resonance emitters, as they can create a very focused field of audio: you can tune them to the point where, in the shotgun configuration, they emit a spatialised field only a few inches wide and a few feet forward. My aesthetic intention was to create sounds inspired by games known for their dark ambiences. Notably, Bloodborne, for its use of audio in the sequences where you talk to characters behind locked doors: a cheap but effective trick for adding depth to a game space, the acousmatic sound being your only means of comprehending an unknowable sequence. Moreover, I looked at games like Alien: Isolation, Inside, and the Stalker series, which all rely heavily on environmental audio to inject horror and tension into their respective styles of game. Alien: Isolation, for example, consistently keeps you on your toes with the constant rumbling and whirring of the space station, distorting the player's experience into one where no environment feels fully safe. A similar approach was used in Inside, but in that case the story is told almost entirely through the mood of the environmental audio, with each section introducing new layers of cosmic industrial horror.
I found the bridge and underwater sequences particularly effective in conveying this, and tried to replicate some of the distorted, subdued, and brutal textures Martin Stig Andersen created for the game.
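For reference, the focused "shotgun" fields mentioned above come from Resonance Audio's source directivity controls. As I understand the plugin, the pattern is shaped by an alpha parameter (omni to figure-eight) and a sharpness exponent; a sketch of that gain curve in Python:

```python
import math

def directivity_gain(alpha, sharpness, theta):
    """Resonance-Audio-style source directivity gain at angle theta (radians).
    alpha=0 is omnidirectional, 0.5 is cardioid, 1.0 is figure-eight;
    raising sharpness narrows the lobe towards a shotgun beam."""
    return abs((1.0 - alpha) + alpha * math.cos(theta)) ** sharpness
```

With alpha around 0.5 and a high sharpness, the emitter is only audible inside a narrow cone, which is what makes the "few inches wide" tuning possible.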
The design of the footsteps was partly inspired by Escape from Tarkov, a hyper-realistic game built in Unity. The visual style prompted me to revisit the game, where I drew inspiration from the sparkly, crunchy player SFX. For the water, I opted for a spline system that runs the audio along the coastline relative to the player. This is a much more efficient implementation with regards to performance, and it also negates the risk of phasing or dead spots where the water could not be heard, factors that could break the immersion. I also implemented an interior audio system based on FMOD snapshots, a simple but effective method of differentiating between the various game spaces. When the player walks into a building, a snapshot event is triggered which smoothly brings down the outside ambience, brings up a room-tone wind, and sends the player SFX to a reverb. Two sizes of reverb are available to better match the different buildings in the level. The outside ambience becomes uncannily quiet to better accentuate the presence of the dark ambience.
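The spline-driven water emitter boils down to a closest-point query: each frame, a single emitter is moved to the nearest point on the coastline curve relative to the player. The game-side version runs in Unity; this Python sketch shows the same geometry with the coastline approximated as a polyline of (x, z) points (the function and variable names are my own illustration, not from the project):

```python
def closest_point_on_polyline(player, points):
    """Return the point on the coastline polyline nearest to the player,
    by clamping the projection of the player onto each segment."""
    best, best_d2 = points[0], float('inf')
    for (ax, az), (bx, bz) in zip(points, points[1:]):
        abx, abz = bx - ax, bz - az
        denom = abx * abx + abz * abz or 1e-12   # guard zero-length segments
        t = ((player[0] - ax) * abx + (player[1] - az) * abz) / denom
        t = max(0.0, min(1.0, t))                # stay within the segment
        px, pz = ax + t * abx, az + t * abz
        d2 = (player[0] - px) ** 2 + (player[1] - pz) ** 2
        if d2 < best_d2:
            best, best_d2 = (px, pz), d2
    return best
```

Because one emitter tracks the player, there is no bank of overlapping water sources to phase against each other, which is the dead-spot and phasing risk mentioned above.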