
Programming

My passion for interactive audio has led me to use audio programming to design and implement audio systems for video games using Wwise, UE4 Blueprints and C# scripting in Unity. I also frequently use the Max/MSP programming language to create a range of audio tools and realise various audio experiments.

Wwise & C#

This short video documents my approach to implementing audio assets into a Unity game using Wwise. It shows how I used C# and Wwise to script audio events and send RTPCs from Unity to Wwise.
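As a rough illustration of the kind of script this involves (not the exact code from the video), here is a minimal C# sketch using the Wwise Unity integration. The event name "Play_Footstep" and RTPC name "Player_Speed" are placeholder assumptions.

```csharp
using UnityEngine;

// Minimal sketch: post a Wwise event and drive an RTPC from a Unity script.
// "Play_Footstep" and "Player_Speed" are placeholder names, not the actual
// events/RTPCs shown in the video.
public class FootstepAudio : MonoBehaviour
{
    [SerializeField] private Rigidbody playerBody;

    // Called from an animation event or a character controller.
    public void PlayFootstep()
    {
        AkSoundEngine.PostEvent("Play_Footstep", gameObject);
    }

    private void Update()
    {
        // Send the player's current speed to Wwise so it can, for example,
        // crossfade between walk and run footstep layers.
        float speed = playerBody.velocity.magnitude;
        AkSoundEngine.SetRTPCValue("Player_Speed", speed, gameObject);
    }
}
```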

MAX/MSP

These are some recent projects of mine created with the MAX/MSP programming language. I have created stand-alone applications as well as Max for Live devices to be used as audio plug-ins in Ableton Live.

X:delaY

“X:delaY” is a delay effect based on polyrhythms. It can be used for long, evolving delay patterns or, with frequency modulation engaged, for noisy FM tones.

 

With ease of use and performability in mind, I kept the patch simple while still doing what I wanted well. The polyrhythm ratio is displayed clearly and can be modified either by clicking on the sliders or via the drop-down menu.

My initial inspiration for this patch was the rhythmic phrasing heard in Steve Reich’s Piano Phase (1967), although the final patch ended up moving in a slightly different direction, with the use of polyrhythms to determine delay length. When developing the device I often used an arpeggiated synth line to test out the sound.
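Max/MSP is a visual patching language, but the core idea of deriving delay times from a polyrhythm can be sketched in a few lines of C#. This is only an illustration of the concept, assuming each side of the ratio divides one 4/4 bar into equal delay taps; the 3:4 ratio and 120 BPM values are examples, not settings from the patch.

```csharp
using System;

// Sketch of the idea behind X:delaY: derive two delay times from a
// polyrhythm ratio by dividing one bar into equal subdivisions per side.
class PolyrhythmDelay
{
    static void Main()
    {
        int left = 3, right = 4;                // example polyrhythm ratio
        double bpm = 120.0;                     // example tempo

        double barMs = 4 * 60000.0 / bpm;       // one 4/4 bar in milliseconds
        double leftDelayMs = barMs / left;      // delay time for the "3" side
        double rightDelayMs = barMs / right;    // delay time for the "4" side

        Console.WriteLine($"Left delay:  {leftDelayMs:F1} ms");
        Console.WriteLine($"Right delay: {rightDelayMs:F1} ms");
    }
}
```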

Video demo: https://youtu.be/Vb5bTDf0Vsw


GranuTouch

“GranuTouch” is a touch-based trackpad granular synthesizer. It uses the position, size and velocity of the user's fingers on a MacBook trackpad to create and manipulate audio samples.

Input data from the MacBook trackpad is sent to a granular synthesizer I created. Parameters such as feedback, stereo width and grain size are controlled by how much of each finger is in contact with the trackpad, so the sound becomes more or less dense depending on how hard you press.

The velocity of each finger modulates the audio sample's pitch, and up to three fingers can be tracked at once.
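To make the mapping concrete, here is a small C# sketch of the kind of finger-to-grain-parameter mapping described above. The struct fields, value ranges and scaling curves are all assumptions for illustration, not the actual Max/MSP patch.

```csharp
using System;

// Illustrative mapping: finger contact size scales grain size and feedback,
// finger velocity bends the sample's pitch. Ranges are invented for the sketch.
struct Finger
{
    public double X, Y;       // normalised trackpad position (0..1)
    public double Size;       // contact area, 0..1
    public double Velocity;   // movement speed, arbitrary units
}

class GrainMapping
{
    // Map one finger to a set of grain parameters.
    static (double grainSizeMs, double feedback, double pitchRatio) Map(Finger f)
    {
        double grainSizeMs = 20 + f.Size * 180;        // bigger contact -> longer grains
        double feedback    = Math.Min(0.95, f.Size);   // more contact -> more feedback, clamped
        double pitchRatio  = 1.0 + f.Velocity * 0.5;   // faster movement -> higher pitch
        return (grainSizeMs, feedback, pitchRatio);
    }

    static void Main()
    {
        var finger = new Finger { X = 0.4, Y = 0.6, Size = 0.7, Velocity = 0.2 };
        var (size, fb, pitch) = Map(finger);
        Console.WriteLine($"grain {size:F0} ms, feedback {fb:F2}, pitch x{pitch:F2}");
    }
}
```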

Computational Composition

These are some further projects of mine which use MAX/MSP programming to produce generative pieces of music, often embracing the computer music aesthetic.

This is a patch which composes a short piece of music structured in classical sonata form. The user inputs a selection of notes with corresponding probabilities, and the patch outputs the piece as MIDI data. In this video, I then sent the MIDI to a DAW to trigger orchestral samples.
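The note-selection idea can be sketched as weighted random choice: notes are drawn in proportion to the probabilities the user supplies. The C# below is only a conceptual illustration, with an example note set and example weights rather than anything taken from the patch.

```csharp
using System;

// Sketch of weighted random note selection: each MIDI note has a weight,
// and notes are drawn in proportion to those weights (assumed to sum to 1).
class WeightedNotePicker
{
    static readonly int[] Notes = { 60, 62, 64, 67, 69 };          // example note set
    static readonly double[] Weights = { 0.3, 0.1, 0.2, 0.3, 0.1 }; // example probabilities
    static readonly Random Rng = new Random();

    static int NextNote()
    {
        double r = Rng.NextDouble();
        double cumulative = 0.0;
        for (int i = 0; i < Notes.Length; i++)
        {
            cumulative += Weights[i];
            if (r < cumulative) return Notes[i];
        }
        return Notes[Notes.Length - 1];   // fallback for rounding at the top end
    }

    static void Main()
    {
        for (int i = 0; i < 8; i++)
            Console.Write(NextNote() + " ");
        Console.WriteLine();
    }
}
```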

This is a performance device which uses granular synthesis along with randomised sample playback and manipulation to perform an ambient electronic piece of music.

This is a self-contained software composition which uses a Fast Fourier Transform to split an audio source into individual frequency bands, each of which is sent through its own delay. The moving sliders in the UI represent the frequency bands and their corresponding delay times and feedback amounts.

Chance and randomisation are then used throughout the patch to create evolving drum patterns and glitch textures.
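As a simplified sketch of the per-band "spectral delay" idea, the C# below feeds each frequency band through its own delay line with feedback, one value per FFT frame. The FFT split itself is not shown, and the band count, delay lengths and feedback amounts are illustrative assumptions rather than values from the piece.

```csharp
using System;

// Per-band delay with feedback, operating on band magnitudes that an FFT
// (not shown) would produce once per frame.
class SpectralDelayBands
{
    const int Bands = 8;
    static readonly int[] DelayFrames = { 2, 3, 5, 7, 11, 13, 17, 19 };          // per-band delay, in frames
    static readonly double[] Feedback = { 0.3, 0.4, 0.5, 0.5, 0.6, 0.6, 0.7, 0.7 };
    static readonly double[][] Buffers = new double[Bands][];
    static readonly int[] WritePos = new int[Bands];

    static void Init()
    {
        for (int b = 0; b < Bands; b++)
            Buffers[b] = new double[DelayFrames[b]];
    }

    // Process one frame of band values; returns the delayed bands.
    static double[] ProcessFrame(double[] bandsIn)
    {
        var outBands = new double[Bands];
        for (int b = 0; b < Bands; b++)
        {
            int pos = WritePos[b];
            double delayed = Buffers[b][pos];                      // oldest value in this band's buffer
            Buffers[b][pos] = bandsIn[b] + delayed * Feedback[b];  // write input plus feedback
            WritePos[b] = (pos + 1) % DelayFrames[b];
            outBands[b] = delayed;
        }
        return outBands;
    }

    static void Main()
    {
        Init();
        var impulse = new double[Bands];
        for (int b = 0; b < Bands; b++) impulse[b] = 1.0;

        // Send an impulse through and print a few frames of delayed output.
        for (int frame = 0; frame < 4; frame++)
        {
            var outFrame = ProcessFrame(frame == 0 ? impulse : new double[Bands]);
            Console.WriteLine(string.Join(", ", Array.ConvertAll(outFrame, v => v.ToString("F2"))));
        }
    }
}
```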
