Interfaces 1 – MIDI
As they are widely available and well known, MIDI interfaces can serve as a starting point for exploration by translating physical gestures into synchronized data streams that simultaneously control both sound generators and visual rendering systems.
MIDI CCs can quite easily be mapped to parameters of sound as well as visual elements. When MIDI is received in TouchDesigner or Processing, the data can be broadcast via OSC to suitable sound software such as SuperCollider, Max, or PureData.
Modern MIDI interfaces often incorporate additional sensor data through MPE (MIDI Polyphonic Expression) or OSC integration, allowing for more nuanced and expressive audiovisual performances that respond to subtle gestural variations.
Standard MIDI data can also be processed further in software to build more complex data structures and computations.
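For example, the stepped 7-bit values of a CC stream can be smoothed into a continuous signal before they drive a visual parameter. The following is a minimal sketch, assuming the themidibus library and the controller name used in the examples below; the smoothing factor and the mapping to a circle diameter are purely illustrative.
import themidibus.*; // Assumed: the same themidibus library used in the examples below
MidiBus myBus; // The MidiBus
float target = 0; // Most recent raw CC value (0-127)
float smoothed = 0; // Value that eases towards the target every frame
void setup(){
  size(400, 400); // Window for the visualization
  MidiBus.list(); // Print available MIDI devices to the console
  myBus = new MidiBus(this, "MIDI Mix", "MIDI Mix"); // Device name is an assumption - replace with your controller
}
void draw(){
  background(255); // Clear the window each frame
  smoothed += (target - smoothed) * 0.1; // Ease towards the latest fader position (factor chosen arbitrarily)
  float diameter = map(smoothed, 0, 127, 10, width); // Scale the MIDI range to pixels
  fill(0);
  ellipse(width/2, height/2, diameter, diameter); // The circle grows and shrinks smoothly with the fader
}
void controllerChange(int channel, int number, int value) {
  target = value; // Store the raw value; the smoothing happens in draw()
}
The same smoothed value could equally be packed into an OSC message, as in the SuperCollider example further down.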
Interactive systems with MIDI in Processing
Receiving MIDI in Processing
import themidibus.*; //Import the library that lets us talk to MIDI devices
MidiBus myBus; // Create a variable that will connect to our MIDI controller
void setup(){
  size(400, 400); // Create a window that is 400 pixels wide and 400 pixels tall
  MidiBus.list(); // Show all MIDI devices connected to the computer in the console
  myBus = new MidiBus(this, "MIDI Mix", "MIDI Mix"); // Connect to a device called "MIDI Mix"
  // The first "MIDI Mix" is for input (receiving messages)
  // The second "MIDI Mix" is for output (sending messages)
}

void draw(){
  background(255); // Fill the window with white color (255 means white)
  // This happens 60 times per second to keep the window clear
}

void controllerChange(int channel, int number, int value) {
  // This function runs automatically when you move a fader or dial on the MIDI controller
  println("Fader/dial: channel "+channel+", number "+number+", value "+value);
  // Print information about which control was moved and its new value
  // channel = which group the control belongs to
  // number = which specific fader or dial was moved
  // value = position of the control (0-127, where 0 is all the way down/left and 127 is all the way up/right)
}

void noteOn(int channel, int pitch, int velocity) {
  // This function runs automatically when you press a button on the MIDI controller
  println("Button on: channel "+channel+", number "+pitch+", velocity "+velocity);
  // Print information about which button was pressed
  // pitch = which button was pressed
  // velocity = how hard the button was pressed (usually 127 for full press)
}

void noteOff(int channel, int pitch, int velocity) {
  // This function runs automatically when you release a button on the MIDI controller
  println("Button off: channel "+channel+", number "+pitch+", velocity "+velocity);
  // Print information about which button was released
}
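Running this sketch prints the names of all connected MIDI devices to the console via MidiBus.list(); replace both occurrences of "MIDI Mix" in the constructor with the name of your own controller exactly as it appears in that list.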
Passing MIDI on via OSC
MIDI data to SuperCollider
import oscP5.*;
import netP5.*;
import themidibus.*; //Import the midibus library
MidiBus myBus; // The MidiBus
OscP5 oscP5; // Declare an instance of the OscP5 class to handle OSC communication (sending and receiving messages)
NetAddress myRemoteLocation; // Define the destination for outgoing OSC messages
int dotX; // X-coordinate for the visualization dot
int dotY; // Y-coordinate for the visualization dot
void setup(){
  size(400, 400); // Set the size of the window
  MidiBus.list(); // List all available MIDI devices to the console - useful for debugging connections
  myBus = new MidiBus(this, "MIDI Mix", "MIDI Mix"); // Connect to the MIDI Mix device for both input and output
  oscP5 = new OscP5(this, 12000); // Initialize oscP5 to listen for incoming OSC messages on port 12000
  myRemoteLocation = new NetAddress("127.0.0.1", 57120); // Target SuperCollider's language port for OSC communication
  dotX = width/2; // Initialize dot position to center of the window
  dotY = height/2; // Using width/height properties avoids hardcoded values
}

void draw(){
  background(255); // Refresh the canvas each frame to prevent trails
  fill(0); // Set fill color to black for the dot
  ellipse(dotX, dotY, 10, 10); // Draw the dot at current position with fixed diameter
}

void controllerChange(int channel, int number, int value) {
  println("Fader/dial: channel "+channel+", number "+number+", value "+value); // Log MIDI CC data for debugging
  switch (number){ // Switch based on the controller number rather than multiple if statements
    case 16: // Controller 16 (likely a specific fader on the MIDI Mix)
      dotX = (int) map(value, 0, 127, 0, width); // Map MIDI range (0-127) to window width, cast to int
      OscMessage oscX = new OscMessage("/pitch"); // Create OSC message with address pattern "/pitch"
      oscX.add(value); // Add the raw MIDI value as the first argument in the OSC message
      oscP5.send(oscX, myRemoteLocation); // Send the OSC message to the specified address/port
      break;
    case 17: // Controller 17 (another fader/knob on the MIDI Mix)
      dotY = (int) map(value, 0, 127, 0, height); // Map MIDI value to window height
      OscMessage oscY = new OscMessage("/pitch"); // Same address pattern used for Y position
      oscY.add(value); // Use the unmodified MIDI value for the OSC message
      oscP5.send(oscY, myRemoteLocation); // Send to the same destination (SuperCollider)
      break;
  }
}

void noteOn(int channel, int pitch, int velocity) {
  println("Button on: channel "+channel+", number "+pitch+", velocity "+velocity); // Log MIDI note events
  // Function triggered by MIDI note-on messages, currently just logs data without taking action
}

void noteOff(int channel, int pitch, int velocity) {
  println("Button off: channel "+channel+", number "+pitch+", velocity "+velocity); // Log MIDI note-off events
  // Function triggered by MIDI note-off messages, currently unused but available for expansion
}
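Note that the sketch forwards the unmodified 0-127 values and leaves any musical scaling to the receiving side. On the SuperCollider side, sclang listens on port 57120 by default, so an OSCdef (or OSCFunc) registered for the address "/pitch" is needed there to pick up the values and map them onto synth parameters.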
Interactive systems with MIDI in TouchDesigner
Process MIDI inputs from the Touché device

Send the processed data to SuperCollider via OSC
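One possible route, sketched here as an assumption rather than a fixed recipe, is a MIDI In CHOP feeding Filter or Math CHOPs for smoothing and scaling, with an OSC Out CHOP or OSC Out DAT sending the result to SuperCollider on port 57120, mirroring the Processing example above.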


Example: AV System
The type of interface and its underlying technology affect what interactions are possible and how performers can manipulate digital materials. We can explore this notion by building an audiovisual system that employs two very different interfaces:

