🎧
Audiovisual Design
WS 2024 Audiovisual Design

More examples of OSC-Links between TD and SC


Last updated 9 months ago


Example: triggering sounds from video data

This example goes in the other direction: generating sound that depends on image data.

The TD network analyzes temporal changes (movement) in the video stream and triggers a sound when the amount of change exceeds a threshold.

The frame-comparison approach to motion detection involves several steps. First, both frames (current and previous) are converted to grayscale; a blur can also be applied to reduce noise and small irrelevant changes caused by lighting or camera sensor noise, which makes the detection more robust. Then the difference between the pixel values of the current and previous frame is calculated: areas with motion show up as non-zero values (grey), while static areas stay close to zero (black). Optional thresholding: applying a threshold to the difference image creates a binary mask in which significant changes are marked white (1) and insignificant changes black (0). This helps eliminate minor variations that aren't actual motion.
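The steps above can be sketched in Python with NumPy. This is a minimal illustration, not the actual TD network (which does the same thing with TOPs); the function names are made up for this example, and the optional blur step is omitted:

```python
import numpy as np

def motion_mask(prev_rgb, curr_rgb, threshold=25):
    """Binary motion mask from two video frames via grayscale differencing.

    prev_rgb / curr_rgb: uint8 arrays of shape (H, W, 3).
    """
    # 1. Convert both frames to grayscale (simple channel average)
    prev_gray = prev_rgb.mean(axis=2)
    curr_gray = curr_rgb.mean(axis=2)
    # 2. Absolute per-pixel difference; static areas stay near zero
    diff = np.abs(curr_gray - prev_gray)
    # 3. Threshold into a binary mask: 1 = significant change, 0 = static
    return (diff > threshold).astype(np.uint8)

def motion_amount(mask):
    """Fraction of pixels flagged as moving; compare this against a
    trigger level to decide whether to fire an OSC message."""
    return mask.mean()
```

The `motion_amount` value is what would be compared against the trigger level before sending an OSC message to SuperCollider.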

Limitations: this basic approach has some drawbacks. Changes in lighting can be misinterpreted as motion. It is also sensitive to camera movement in live situations: shaking the camera can trigger false detections. And slow-moving objects may go undetected if their frame-to-frame difference stays below the threshold.

s.boot;
s.quit;

// see available audio devices
ServerOptions.outDevices;

// define output device
Server.default.options.outDevice_("");

// Create the SynthDef
(
SynthDef(\crackleMotion, {
    arg freq=440, amp=0.3, density=30;
    var sig, env, crackle, filtered;
    
    // Create envelope with short attack and 1 second total duration
    env = EnvGen.kr(
        Env.new(
            [0, 1, 0],          // levels
            [0.01, 0.99],       // times (adding to 1 second total)
            \lin               // curve shape
        ),
        doneAction: 2          // free synth when done
    );
    
    // Create a crackling sound: Dust2 generates random bipolar impulses
    // (Dust is unipolar; scaling it with * 2 - 1 would add a DC offset)
    crackle = Dust2.ar(density);
    filtered = BPF.ar(
        crackle,
        freq,   // center frequency
        0.3,    // rq (reciprocal of Q: bandwidth / center frequency)
        4       // mul: boost the filtered impulses before distortion
    );
    
    // Add some distortion and apply envelope
    sig = (filtered * 5).clip(-1, 1) * amp * env;
    
    Out.ar([0, 1], sig);
}).add;
)

// Create a function to spawn new synth instances when receiving OSC
(
OSCdef(\motionListener, {|msg, time, addr, recvPort|
    var randomFreq = exprand(200, 2000);  // Random frequency between 200 and 2000 Hz
    ["Received OSC message:", msg, "at time:", time].postln;  // Print incoming message
    Synth(\crackleMotion, [\freq, randomFreq]);
}, '/motion');
)

// To stop listening
OSCdef(\motionListener).free;

// Test it
// n = NetAddr("localhost", 57120);
// n.sendMsg("/motion", 440);
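The `/motion` message can also be triggered from outside SuperCollider, e.g. from a Python script on the TD side. As an illustration of what travels over the wire (a minimal sketch using only the standard library, not the method used in the course; in practice a library such as python-osc would do this), here is a hand-encoded OSC message sent to sclang's default port 57120:

```python
import socket
import struct

def osc_message(address, *args):
    """Encode a minimal OSC message (int, float and string arguments only)."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # 32-bit big-endian float
        else:
            tags += "s"
            payload += pad(str(a).encode())
    return pad(address.encode()) + pad(tags.encode()) + payload

# Fire a /motion trigger at the OSCdef defined above (sclang listens on 57120)
packet = osc_message("/motion", 1)
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, ("127.0.0.1", 57120))
```

Running this while the OSCdef is active should spawn one `\crackleMotion` synth per packet, just like the `n.sendMsg` test above.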

Creating sounds from image data