Session 8 (presentation)
Assignment
Compose material for an audiovisual installation. The visual part will be projected and mapped onto a 3D-printed geometric structure with several surface planes; the sound will be stereo. The piece can translate visual elements into sound, or sound into visual shapes and colors. It may involve random processes, run autonomously according to the rule set defined when it was created, or be driven by interaction. The audiovisual events can also emerge entirely from the code, and any combination of these approaches is possible. Which connections between auditory and visual material emerge? Make sure the installation is feasible with your current level of knowledge.
NAIK - 'STELLATIONS' PROJECTION MAPPING TEST
How the physical object was coded in Processing

This Processing sketch uses the PeasyCam library to create a 3D camera and the OBJExport library from Nervous System (nervoussystem.obj) to export a 3D object. The setup() function sets the size of the window, initializes the PeasyCam object, and sets the background, stroke, and fill colors. The draw() function clears the background and draws the object. The keyPressed() function checks for key input and performs a specific action depending on which key is pressed: 'r' starts recording the OBJ file and 'n' creates a new instance of the Shaper class.
This sketch requires:
The PeasyCam library for camera control
The OBJExport library from Nervous System
A custom class (see next code listing) that defines the geometry
Controls:
Mouse: Use PeasyCam's default controls for rotating/zooming
'r' key: Export current geometry as OBJ file
'n' key: Generate new geometry
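The following is a minimal sketch of how this structure can look in Processing. It uses the documented Nervous System export pattern (beginRecord with "nervoussystem.obj.OBJExport") and PeasyCam for camera control; the Shaper interface assumed here (a no-argument constructor and a display() method) and the output file name "stellation.obj" are assumptions for illustration — the actual class is given in the next code listing.

```
import peasy.*;            // PeasyCam: mouse-driven 3D camera
import nervoussystem.obj.*; // OBJExport: record drawn geometry to an .obj file

PeasyCam cam;
Shaper shaper;             // custom geometry class, defined in the next code listing
boolean record = false;    // set to true by 'r'; exports the next frame as OBJ

void setup() {
  size(800, 800, P3D);
  cam = new PeasyCam(this, 400);  // orbiting camera, 400 units from the origin
  shaper = new Shaper();
  stroke(0);
  fill(255);
}

void draw() {
  if (record) {
    // route all geometry drawn this frame into an OBJ file
    beginRecord("nervoussystem.obj.OBJExport", "stellation.obj");
  }
  background(255);
  shaper.display();  // assumed: the Shaper class draws its geometry here
  if (record) {
    endRecord();
    record = false;
  }
}

void keyPressed() {
  if (key == 'r') record = true;         // export current geometry as OBJ
  if (key == 'n') shaper = new Shaper(); // generate a new geometry
}
```

The export is flagged rather than done directly in keyPressed() so that exactly one full draw() pass is captured between beginRecord() and endRecord().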