3D Environments

Basic Render Setup

Elements of 3D Environments

The fundamental components of 3D environments remain consistent across different applications, whether they're realised in game engines, animation software, rendering systems, CGI production, or creative coding for audiovisual art.

Camera

The camera represents the viewer's perspective into the 3D world and determines what portion of the scene is visible at any given time. It defines crucial parameters like field of view (FOV), aspect ratio, and near/far clipping planes that control how the 3D scene is projected onto a 2D screen. Cameras can be configured with different projection types (perspective or orthographic) and can be animated or controlled interactively to create different viewpoints and movements through the scene.
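
As a quick sketch of how these parameters surface in practice, here is some TouchDesigner-style Python; the operator name 'cam1' and the exact parameter and menu names are assumptions to check against the current documentation:

# Hypothetical: configuring a Camera COMP named 'cam1' from a script
cam = op('cam1')

cam.par.projection = 'perspective'  # projection type; menu names may differ ('ortho', etc.)
cam.par.fov = 45                    # field of view, in degrees
cam.par.near = 0.1                  # near clipping plane
cam.par.far = 1000                  # far clipping plane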

Objects

Objects, also known as meshes or models, are the visible elements that populate the 3D scene. They consist of vertices, edges, and faces that define their geometric shape, along with properties like materials, textures, and shaders that determine their appearance. Objects can be static (unchanging) or dynamic (animated/interactive), and typically include attributes for position, rotation, and scale that determine their transformation within the scene's coordinate system.
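
A minimal sketch of those transform attributes in TouchDesigner-style Python, assuming a Geometry COMP named 'geo1' and the standard tx/ty/tz, rx/ry/rz, sx/sy/sz parameter names:

geo = op('geo1')  # hypothetical Geometry COMP

# Position, rotation (in degrees) and scale within the scene's coordinate system
geo.par.tx, geo.par.ty, geo.par.tz = 0, 1.5, -3
geo.par.ry = 45
geo.par.sx = geo.par.sy = geo.par.sz = 2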

Textures

Textures are image-based or procedurally generated data that add surface detail, color, and visual properties to 3D objects without requiring additional geometric complexity. They are mapped onto the 3D geometry using UV coordinates, which establish how the 2D texture wraps around the 3D surface. Beyond basic color (diffuse) textures, modern 3D systems use multiple texture types working together in material definitions:

  • normal maps simulate fine surface detail such as bumps and grooves

  • roughness and specular maps control reflectivity and shininess

  • displacement maps alter the actual geometry

  • alpha maps define transparency

  • ambient occlusion maps add subtle shadowing

  • metallic maps specify how light interacts with metallic surfaces

These texture maps can be combined in complex shaders to create highly realistic or stylized surface appearances while keeping rendering efficient, as textures are generally less computationally intensive than modeling equivalent detail through geometry.
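
UV coordinates are simply a 2D address into the texture for every point on the surface. As a package-independent illustration, the classic spherical projection below derives (u, v) from a point on a unit sphere:

import math

def spherical_uv(x, y, z):
    """Map a point on a unit sphere to (u, v) texture coordinates in [0, 1]."""
    u = 0.5 + math.atan2(z, x) / (2 * math.pi)  # longitude -> horizontal texture axis
    v = 0.5 - math.asin(y) / math.pi            # latitude  -> vertical texture axis
    return u, v

print(spherical_uv(0.0, 1.0, 0.0))  # the 'north pole' samples the top edge: (0.5, 0.0)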

Lights

Lights are essential components that illuminate the 3D scene and objects within it. Common types include directional lights (like sunlight, casting parallel rays), point lights (emanating light in all directions from a single point), spotlights (creating focused beams), and area lights (emitting light from a surface). Each light source can have properties like color, intensity, range, shadows, and falloff that affect how objects are rendered and how shadows are cast.
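
Falloff, for instance, is commonly modelled with the inverse-square law; the sketch below is a simplified version (real engines often add constant and linear terms to the denominator):

def point_light_intensity(base_intensity, distance):
    """Inverse-square falloff: brightness drops with the square of the distance."""
    return base_intensity / (distance ** 2)

print(point_light_intensity(100, 2))   # 25.0 -- twice as far, a quarter as bright
print(point_light_intensity(100, 10))  # 1.0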

3D in TouchDesigner

COMPs: Camera, Light, Geometry; TOP: Render

In TouchDesigner, several key COMP (Component Operator) types, together with the Render TOP, form the core of any 3D workflow; the scripting sketch after this list shows how they fit together.

  • The Camera COMP defines viewpoints and projection parameters for rendering 3D scenes.

  • The Light COMP provides various illumination types (point, spot, directional, area) to light the 3D environment.

  • The Environment Light COMP enables image-based lighting and ambient light settings.

  • The Geometry COMP acts as a container that combines SOPs with materials and transformations to create renderable 3D objects.

  • The Render TOP is crucial as it combines all these elements (cameras, lights, and geometry) to produce the final rendered output, managing aspects like anti-aliasing, shadows, and post-processing effects.
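
The sketch below assembles such a network in TouchDesigner's Python API; the operator class names and the Render TOP parameter names are written from memory and should be verified against the current documentation:

# Hypothetical: building a minimal render network by script
container = op('/project1')

cam   = container.create(cameraCOMP, 'cam1')
light = container.create(lightCOMP, 'light1')
geo   = container.create(geometryCOMP, 'geo1')
rend  = container.create(renderTOP, 'render1')

# The Render TOP references the other components by name/path
rend.par.camera = cam.name
rend.par.lights = light.name
rend.par.geometry = geo.name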

Surface Operators (SOPs)

SOPs (Surface Operators) in TouchDesigner are nodes specifically designed for 3D geometry manipulation and generation. They form a network that allows for creating, modifying, and controlling 3D geometry in real time, operating within TouchDesigner's "surface family" of operators. SOPs can generate primitive shapes (like spheres, boxes, or lines), modify existing geometry (through operations like extrusion, deformation, or subdivision), combine multiple geometries, apply transformations (such as rotation, scale, or translation), and convert between different geometry types.

They work similarly to the node systems of other 3D software but are optimized for real-time processing and can be dynamically controlled through parameters, animations, or data from other TouchDesigner operator families (like CHOPs for animation or TOPs for texture mapping). This makes SOPs particularly powerful for creating interactive 3D content, generative art, and live visual performances where geometry needs to respond to real-time inputs or algorithmic controls.
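
Because SOP parameters accept Python expressions, geometry can be driven procedurally without keyframes. A small, hypothetical example, assuming a Transform SOP named 'transform1':

# Bind the rotation parameter to an expression that is re-evaluated every frame,
# so the geometry spins continuously at 30 degrees per second
op('transform1').par.ry.expr = 'absTime.seconds * 30'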

Hierarchies between SOPs and Geometry COMPs allow for building more complex render networks.

Materials (MATs)

MAT (Material) operators in TouchDesigner are nodes that define how 3D surfaces appear and interact with light in the rendering process. They function as a sophisticated material system that combines multiple elements including textures, shaders, and various visual properties. Materials can be either physically based, using the PBR MAT, or more stylized, using the Phong MAT, with each type offering different parameters for controlling surface properties like diffuse color, roughness, metalness, reflection, transparency, and emission. These operators can process multiple texture inputs (through TOP operators) to create complex materials, handle UV mapping, and can be animated or modified in real-time. MATs are typically connected to Geometry COMPs to apply materials to 3D objects, and they can be shared across multiple geometries or instances.
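
A sketch of wiring this up by script; the operator names ('geo1', 'pbr1', 'moviefilein1') and the 'basecolormap' parameter name are assumptions:

# Hypothetical: apply a PBR MAT to a Geometry COMP and feed it a texture
geo = op('geo1')
mat = op('pbr1')

geo.par.material = mat.name            # assign the material to the geometry
mat.par.basecolormap = 'moviefilein1'  # a TOP supplying the base color texture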

Material Types

1. Phong materials rely on a lighting model that calculates pixel illumination by combining three components to create realistic-looking surfaces:

Ambient lighting - represents indirect, scattered light in the environment that provides base illumination. This is a constant level of light applied evenly across the surface.

Diffuse reflection - calculates how light bounces off matte/rough surfaces by using the angle between the surface normal and light direction. This creates basic shading that helps define the object's shape. Surfaces facing the light appear brighter while surfaces angled away appear darker.

Specular reflection - simulates shiny highlights by calculating reflections based on the viewing angle and light direction. This creates bright spots on surfaces where light reflects directly toward the viewer, making materials appear glossy or metallic.
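
Put together, the three components sum to the classic Phong reflection equation (a standard formulation, with k denoting material coefficients and i light intensities):

I = k_a i_a + \sum_{m \in \text{lights}} \left[ k_d (\hat{L}_m \cdot \hat{N}) \, i_{m,d} + k_s (\hat{R}_m \cdot \hat{V})^{\alpha} \, i_{m,s} \right]

where \hat{N} is the surface normal, \hat{L}_m the direction from the surface to light m, \hat{R}_m the reflection of \hat{L}_m about the normal, \hat{V} the direction toward the viewer, and \alpha the shininess exponent controlling the size of the specular highlight.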

The Phong model is empirical (based on observation rather than physical accuracy) and relatively computationally efficient, making it popular for real-time graphics. While more advanced physically-based rendering (PBR) models have largely replaced it in modern engines, Phong shading remains useful for learning shader basics and situations where performance is prioritized over perfect physical accuracy.

2. PBR (physically based rendering) materials

Physically Based Rendering (PBR) materials are a set of standardized parameters and rules that create realistic surface properties in 3D graphics by accurately simulating how light interacts with different materials in the real world. The core properties of PBR materials typically include:

  • base color (albedo)

  • metalness (whether a material is metal or non-metal)

  • roughness (how smooth or rough a surface is)

  • normal maps (surface detail)

  • emission (whether the surface emits light)

These parameters work together to create physically accurate reflections, highlights, and shadows – for instance, a rough metal surface will have more diffused reflections than a smooth one, while non-metallic surfaces like plastic or wood will handle light differently than metallic ones. This standardized approach has become the industry standard in game development, visual effects, and architectural visualization because it provides consistent, predictable results across different lighting conditions and ensures materials look realistic regardless of the environment they're placed in.
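
One concrete rule from the metalness workflow: the base reflectivity F0 used for specular highlights is interpolated between a small dielectric constant (roughly 4% for most non-metals) and the base color itself for metals. A minimal Python sketch:

def base_reflectivity(albedo, metalness):
    """Interpolate F0 between the dielectric default (~0.04) and the albedo.

    albedo: (r, g, b) base color in [0, 1]; metalness: 0 = non-metal, 1 = metal.
    """
    return tuple(0.04 * (1 - metalness) + a * metalness for a in albedo)

print(base_reflectivity((0.9, 0.6, 0.2), 0.0))  # plastic-like: (0.04, 0.04, 0.04)
print(base_reflectivity((0.9, 0.6, 0.2), 1.0))  # gold-like:    (0.9, 0.6, 0.2)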

Procedural 3D

1. Animation: Camera Movements

Following Bézier paths

The SuperCollider sketch below can drive such camera movement from sound: it plays a slowly fluctuating drone and sends an OSC message (/move, to port 12000 on localhost) whenever the drone's amplitude changes significantly. TouchDesigner can receive these messages (for example with an OSC In CHOP) and use them to trigger or advance the camera along its path.

s.boot; // boot the audio server (run this line first)
s.quit; // quit the server when finished

// see available audio devices
ServerOptions.outDevices;

// define output device
Server.default.options.outDevice_("Scarlett 8i6 USB");

// Initialize OSC client
n = NetAddr("localhost", 12000);

// SynthDef for drone with random fluctuations
SynthDef(\droneSynth, {
    arg out=0, freq=55, ampThreshold=0.1;
    var sig, amp, trigger;
    
    // Create base frequencies with slight detuning
    var freqs = freq * [1, 1.01, 0.99, 1.02, 0.98];
    
    // Generate basic drone sound using saw waves
    sig = Saw.ar(freqs) * 0.2;
    
    // Add random fluctuations
    sig = sig * LFNoise2.kr(
        freq: [0.1, 0.15, 0.12, 0.08, 0.11],
        mul: 0.1,
        add: 0.7
    );
    
    // Mix down to mono and apply filtering
    sig = Mix(sig);
    sig = LPF.ar(sig, LFNoise2.kr(0.05).range(400, 1200));
    
    // Calculate amplitude for triggering
    amp = Amplitude.kr(sig);
    
    // Create trigger when amplitude changes significantly
    trigger = Changed.kr(
        amp,
        ampThreshold
    );
    
    // Send OSC message when triggered
    SendReply.kr(
        trigger,
        '/move',
        amp
    );
    // Output
    Out.ar(out, sig ! 2);
}).add;

// OSCdef to handle the amplitude triggers and forward them over the network
OSCdef(\moveSender, {|msg|
    var amp = msg[3]; // amplitude value reported by SendReply
    n.sendMsg("/move", 1); // forward a simple trigger; substitute amp if the receiver needs the value
}, '/move');


// Function to start the synth
~startDrone = {
    ~drone = Synth(\droneSynth, [
        \freq, 40,
        \ampThreshold, 0.1
    ]);
};

// Function to stop the synth
~stopDrone = {
    ~drone.free;
};

~startDrone.value;  // run to start the drone
~stopDrone.value;   // run to stop it
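
On the receiving side, TouchDesigner can listen on port 12000 with an OSC In CHOP, which turns the /move messages into a channel. A hypothetical CHOP Execute DAT could then advance the camera each time a message arrives; the callback signature and parameter names below are assumptions to verify against the documentation:

# Hypothetical CHOP Execute DAT attached to an OSC In CHOP listening on port 12000
def onValueChange(channel, sampleIndex, val, prev):
    cam = op('cam1')
    # nudge the camera forward on every /move trigger; in a real network this
    # would more likely drive a path-position parameter than raw translation
    cam.par.tz = cam.par.tz.eval() + 0.1
    return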
