
Alexander Dupuis
2021

That Which is About Us
Alexander Dupuis
The audiovisual engine behind this piece uses a digital adaptation of the "raster scanning" method. The digital implementation initially ran into an unexpected limitation: the number of pixels onscreen for the duration of one frame significantly exceeds the number of audio samples played in the equivalent duration. However, I found the resulting distortions to be an especially interesting part of the process, and eventually built the program for That Which is About Us around the distortions that arise from translating a digital sound into an image and back in this particular fashion. Much like That Which Pulls, this piece develops in relation to points of harmonic resonance, but in this case we witness the sounds and images circling these resonances together.
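The mismatch described above can be illustrated with a quick back-of-the-envelope sketch. The resolution, frame rate, and sample rate below are assumptions for illustration, not the piece's actual settings.

```python
# Sketch of the pixel/sample mismatch in a raster-scan translation.
# All constants here are hypothetical, chosen only to show the scale.
WIDTH, HEIGHT = 1280, 720   # assumed frame size
FPS = 30                    # assumed frame rate
SAMPLE_RATE = 48_000        # assumed audio sample rate

pixels_per_frame = WIDTH * HEIGHT        # pixels shown per frame
samples_per_frame = SAMPLE_RATE // FPS   # audio samples per frame

# Each audio sample must stand in for many pixels (or vice versa),
# so the translation has to stretch, repeat, or decimate data,
# which is the source of the distortions the piece is built around.
ratio = pixels_per_frame / samples_per_frame
print(f"{pixels_per_frame} pixels vs {samples_per_frame} samples per frame "
      f"(ratio {ratio:.0f}:1)")
```

At these assumed settings, each frame holds hundreds of times more pixels than audio samples, so a one-to-one mapping is impossible by construction.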
4JM
Alexander Dupuis
4JM applies techniques from drawn-on-film animation to a digital context. The visual content was created by a video feedback program that wrote out a single line of pixels at a time, beginning at the top of the first frame and working down. When it reached the bottom of a frame, it saved the image and continued on, with its next output becoming the first row of pixels of the following frame. The changes in the video were created by manipulating the feedback parameters as the frames rendered one line at a time. The piece was rendered over the course of several days - each frame took about twelve seconds, and I had to slowly change the parameters and monitor the output, occasionally going back a few frames to re-render something with the right effect. The sound was created with a synthesis program I've been developing which simulates the effects of amplified feedback in a virtual space.
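The scanline-by-scanline rendering loop can be sketched as follows. The feedback function here is a stand-in, and the tiny frame size is an assumption; only the row-carrying structure mirrors the description above.

```python
import numpy as np

# Minimal sketch of writing one scanline at a time and carrying the
# row position across frame boundaries. The "feedback" rule below is
# a placeholder, not the actual rendering program.
HEIGHT, WIDTH = 4, 8  # tiny hypothetical frame size

def feedback_row(prev_row, t):
    # placeholder feedback: shift the previous row and perturb it
    return np.roll(prev_row, 1) + (t % 3)

def render(n_rows_total):
    frames = []
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.int64)
    row = np.arange(WIDTH)  # seed scanline
    for t in range(n_rows_total):
        row = feedback_row(row, t)
        frame[t % HEIGHT] = row
        if t % HEIGHT == HEIGHT - 1:
            # bottom of the frame reached: save it and keep going,
            # so the next row written becomes row 0 of the next frame
            frames.append(frame.copy())
    return frames

frames = render(n_rows_total=10)
print(len(frames))  # 10 rows at 4 rows per frame gives 2 complete frames
```

Because the row counter never resets, parameter changes made mid-frame smear across the frame boundary, which is what produces the continuous line-by-line evolution described above.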
No-Input Pixels
Alexander Dupuis
"While designing and testing a video feedback rig, I noticed the system started creating repetitive patterns without any input from me. Under certain settings, the system demonstrates emergent behaviors where related gestures form, collapse, and reconfigure themselves. Over time, these same gestures rotate through different combinations of primary and secondary digital colors, with the collapses becoming longer and more intricate. The sound is a sonification of the video based on digital extension of optical soundtrack idea." (A.D.)
That Which Pulls
Alexander Dupuis
That Which Pulls resynthesizes sounds and images by using differential motion to shuffle their data. In this piece I apply the process to the pixels of a single image and the samples of a single note to generate the audiovisual material. The pixels arrange and rearrange themselves, alternately descending into chaos and resurfacing in a new order. An unexpected side effect was the Moiré patterns that emerge as the pixels realign near the different nodes, emphasizing these moments of resolution. In the audio, the single note quickly becomes multiple notes as the samples realign near the nodes, with complex syncopated rhythms appearing in the later iterations. These two streams of raw material were cut up, layered, and recombined to create a piece that explores the dynamic interplay between the emergent auditory and visual gestures, focusing on the counterpoint between their patterns of chaos and resolution.
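One plausible reading of "differential motion" realignment, offered purely as an illustrative sketch rather than the piece's actual algorithm: each element drifts toward a target slot at a rate proportional to its remaining displacement, so scrambled data gradually resolves into order.

```python
import numpy as np

# Illustrative sketch (an assumption, not the piece's method): each
# element's position moves part-way toward its target index on every
# step, so the displacement shrinks geometrically and the data
# realigns, passing through intermediate, partially ordered states.
def realign_step(positions, targets, rate=0.5):
    return positions + rate * (targets - positions)

positions = np.array([3.0, 0.0, 2.0, 1.0])  # scrambled order
targets = np.array([0.0, 1.0, 2.0, 3.0])    # resolved order
for _ in range(20):
    positions = realign_step(positions, targets)
resolved = np.round(positions).astype(int)  # converges to [0, 1, 2, 3]
```

In such a scheme, the intermediate states where elements are nearly but not exactly aligned are exactly where interference patterns (Moiré in pixels, beating and syncopation in samples) would appear.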
Three Paths
Alexander Dupuis
An audiovisual piece in which sounds and animations are generated in real time from simulations of microphone feedback. Mic feedback is a useful paradigm for generating audiovisual works because it involves an inextricable relationship between the sonic qualities of the feedback and the positioning of the microphone - or, to put it another way, it links (visual) spatial information with audible results. By digitally simulating these microphones, their locations can be animated in any number of ways to provide a synchronized counterpart to the generated sounds, and the motion of these animations corresponds with the resulting sonic changes as the microphones move closer to and further from the virtual speakers. Small undulations become vibrato, twists and turns lead to Doppler effects, and sudden movements generate percussive attacks. Other parameter changes, such as distortion, autotuning, and video feedback, are programmed alongside these movements to help establish the arc of the piece.
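The core spatial-to-sonic link can be sketched as a delay-line feedback loop whose delay is set by the mic-to-speaker distance: moving the virtual mic changes the delay, and hence the pitch and timbre of the feedback. All constants and the loop structure below are assumptions for illustration, not the piece's actual synthesis program.

```python
import numpy as np

# Hedged sketch of simulated mic feedback: the speaker's output
# reaches the mic after a delay proportional to their distance and
# is re-amplified, forming a feedback loop. Moving the mic changes
# the delay sample by sample (the basis for vibrato and Doppler).
SR = 48_000             # assumed sample rate
SPEED_OF_SOUND = 343.0  # metres per second

def feedback(mic_distances, gain=0.99):
    n = len(mic_distances)
    buf = np.zeros(n)
    buf[0] = 1.0  # impulse to start the loop
    for i in range(1, n):
        # delay in samples from the current mic-to-speaker distance
        delay = int(mic_distances[i] / SPEED_OF_SOUND * SR)
        if 0 < delay <= i:
            # the mic picks up the speaker's earlier output, re-amplified
            buf[i] += gain * buf[i - delay]
    return buf

# mic slowly approaching the speaker: the recirculation period shrinks,
# so the feedback's repetition rate (and perceived pitch) rises
distances = np.linspace(1.0, 0.5, 2000)
out = feedback(distances)
```

Animating `mic_distances` along any path then yields both the sound and a ready-made visual counterpart, since the same trajectory drives both.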
First Contact
Alexander Dupuis
The video component of First Contact grew out of a couple of studies in video feedback, one exploring ways of steering feedback towards specific color palettes, the other bringing in simple 3D animations as texture masks. Combining these feedback systems led to the emergence of colorful, blobby objects which reacted in a somewhat explosive manner when colliding with each other or with the bounds of the video space. When putting the piece together, I structured its form around these moments of meeting, and the final video was ultimately recorded in one take through the sequencing of feedback parameter changes. The music in the film was put together after the video was complete, and uses a number of instrumental samples, including flutes, clarinets, and wind chimes.