PROC / SANDBOX
System Documentation
What is this

A sandbox for procedural motion

Procedural animation means motion generated by rules, mathematics, and state — not keyframes or recorded paths. Each frame is computed fresh from parameters: time, forces, constraints, and emergent interactions. The simulation never repeats exactly, yet remains internally consistent.
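
A minimal illustration (a standalone sketch, not code from this sandbox): position is a pure function of time and parameters, so any frame can be computed without knowing the previous one.

// Procedural motion: state derived from rules, not stored keyframes.
// Position at time t depends only on the parameters passed in.
function position(t, { amplitude, frequency, phase }) {
  return {
    x: amplitude * Math.cos(frequency * t + phase),
    y: amplitude * Math.sin(2 * frequency * t)  // a Lissajous-style curve
  };
}

// Any frame is reproducible from (t, params) alone:
console.log(position(1.25, { amplitude: 80, frequency: 2, phase: 0 }));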

This sandbox demonstrates five distinct procedural principles, each operating through the same engine loop. Click or tap the canvas to influence the simulation. Move your pointer to inject energy. Watch how each mode responds differently to the same input.

Interaction

How to explore

Click or tap anywhere on the canvas to interact with the current simulation — attract particles, perturb orbits, or inject noise. On desktop, clicking and dragging creates continuous influence. On touch devices, press and drag across the canvas.

Keyboard shortcuts: 1–5 switch modes, R resets the simulation, S toggles sound, D toggles demo mode. Leave the sandbox idle for 10 seconds and demo mode activates automatically, cycling through all modes.
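
The idle handoff could be wired like this. A hypothetical sketch: setDemoMode and the event list are assumptions; only the 10-second threshold comes from the behavior described above.

// Any input cancels demo mode and re-arms the idle timer;
// 10 s without input hands control back to the auto demo.
const IDLE_MS = 10000;
let idleTimer = null;

function armIdleTimer() {
  clearTimeout(idleTimer);
  idleTimer = setTimeout(() => setDemoMode(true), IDLE_MS);  // setDemoMode: hypothetical
}

for (const type of ['pointerdown', 'pointermove', 'keydown']) {
  window.addEventListener(type, () => {
    setDemoMode(false);  // the user takes control
    armIdleTimer();
  });
}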

The Five Modes

Distinct principles, one engine

01 — Sine Field
Periodic motion through superimposed sinusoidal waves. Demonstrates how frequency, phase, and amplitude combine into complex but fully deterministic patterns (sketched in code after this list).
02 — Noise Flow
Pseudo-random value noise drives a vector field. Particles follow continuous, organic trajectories with no true repetition across the plane.
03 — Force Web
Attraction, repulsion, and spring constraints between nodes. Classical mechanics in a bounded environment — energy conservation and dissipation made visible.
04 — Orbital
Gravitational n-body simulation. Small perturbations cascade into orbital resonance, chaotic trajectories, or stable ellipses depending on initial conditions.
05 — Swarm
Boids-style emergent flocking. No agent has global knowledge — cohesion, separation, and alignment rules alone produce collective motion that appears coordinated (see the second sketch after this list).
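
The Sine Field idea fits in a few lines. A minimal sketch, not the sandbox's actual code: each sample point's displacement is a sum of sine terms, so the pattern looks complex yet is fully determined by (x, y, t).

// Superimposed sine waves: a deterministic displacement field.
// Each wave contributes amp * sin(freq * input + phase).
const waves = [
  { freq: 0.020, amp: 12, phase: 0.0 },
  { freq: 0.045, amp: 6,  phase: 1.7 },
  { freq: 0.011, amp: 20, phase: 3.1 }
];

function fieldOffset(x, y, t) {
  let dy = 0;
  for (const w of waves) {
    dy += w.amp * Math.sin(w.freq * (x + y) + t + w.phase);
  }
  return dy; // the same (x, y, t) always yields the same offset
}

The Swarm rules reduce just as far. A sketch under assumed data structures (each boid carries x, y, vx, vy; the weights are illustrative): every agent steers from its neighbors alone, with no global coordinator.

// Boids: per-agent steering computed from local neighbors only.
function steer(boid, neighbors) {
  const coh = { x: 0, y: 0 }, sep = { x: 0, y: 0 }, ali = { x: 0, y: 0 };
  for (const n of neighbors) {
    coh.x += n.x; coh.y += n.y;                    // cohesion: seek local center
    sep.x += boid.x - n.x; sep.y += boid.y - n.y;  // separation: avoid crowding
    ali.x += n.vx; ali.y += n.vy;                  // alignment: match heading
  }
  const k = neighbors.length || 1;
  return {
    ax: 0.01 * (coh.x / k - boid.x) + 0.05 * sep.x + 0.02 * (ali.x / k - boid.vx),
    ay: 0.01 * (coh.y / k - boid.y) + 0.05 * sep.y + 0.02 * (ali.y / k - boid.vy)
  };
}
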
Under the Hood

The architecture

The engine separates state update from rendering inside a single requestAnimationFrame loop. Each mode implements four methods and is registered with a mode manager that handles transitions cleanly.

// Mode interface — every simulation implements this
const mode = {
  init(w, h) { /* spawn particles, set initial state */ },
  update(dt, input) { /* integrate physics, advance time */ },
  draw(ctx, w, h) { /* render current state to canvas */ },
  energy() { /* return scalar 0–1 for audio mapping */ }
}
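
A mode manager satisfying that contract might look like this. A hedged sketch: the registry shape, stub mode, and loop wiring are assumptions, not the sandbox's actual identifiers.

// Hypothetical mode manager: owns the registry and the frame loop.
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');
const input = { x: 0, y: 0, active: false, vx: 0, vy: 0 };

const stubMode = {  // stands in for sine, noise, etc., all registered the same way
  init(w, h) { this.t = 0; },
  update(dt, input) { this.t += dt; },
  draw(c, w, h) { c.clearRect(0, 0, w, h); },
  energy() { return 0; }
};
const modes = { stub: stubMode };
let active = modes.stub;

function switchMode(name) {
  active = modes[name];
  active.init(canvas.width, canvas.height);  // fresh state on every transition
}

let last = performance.now();
function frame(now) {
  const dt = (now - last) / 1000;
  last = now;
  active.update(dt, input);                       // state update...
  active.draw(ctx, canvas.width, canvas.height);  // ...decoupled from rendering
  requestAnimationFrame(frame);
}

switchMode('stub');
requestAnimationFrame(frame);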

Input is unified across mouse, touch, and keyboard into a single input object with x, y, active, vx, vy fields. Modes never access raw events — they receive a normalized coordinate and activity flag each frame.
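
Pointer events already unify mouse and touch, so the normalization can stay small. A sketch assuming the field names listed above; the 0–1 coordinate convention is an assumption.

// One normalized input object; modes read this, never raw events.
const canvas = document.querySelector('canvas');
const input = { x: 0, y: 0, active: false, vx: 0, vy: 0 };

function onMove(clientX, clientY) {
  const r = canvas.getBoundingClientRect();
  const nx = (clientX - r.left) / r.width;   // normalize to 0–1
  const ny = (clientY - r.top) / r.height;
  input.vx = nx - input.x;                   // crude per-event velocity
  input.vy = ny - input.y;
  input.x = nx;
  input.y = ny;
}

canvas.addEventListener('pointermove', (e) => onMove(e.clientX, e.clientY));
canvas.addEventListener('pointerdown', (e) => { input.active = true; onMove(e.clientX, e.clientY); });
canvas.addEventListener('pointerup', () => { input.active = false; });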

The Web Audio API oscillator tracks the active mode's energy() value, mapping it to frequency and gain. You hear the simulation's state — not a soundtrack.
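
The mapping can be this direct. A sketch: the frequency range and smoothing constant are assumptions; energy() is the interface method shown above.

// One oscillator tracks the active mode's energy each frame.
const audio = new AudioContext();
const osc = audio.createOscillator();
const gain = audio.createGain();
osc.connect(gain);
gain.connect(audio.destination);
gain.gain.value = 0;
osc.start();

function updateAudio(mode) {
  const e = mode.energy();  // scalar in 0–1
  // setTargetAtTime eases parameter changes, avoiding clicks.
  osc.frequency.setTargetAtTime(110 + e * 440, audio.currentTime, 0.05);
  gain.gain.setTargetAtTime(e * 0.2, audio.currentTime, 0.05);
}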

Further Directions

Where this leads

Every mode here is a simplified gateway. Real-time fluid simulation, cloth physics with Verlet integration, reaction-diffusion systems for biological patterns, cellular automata — all operate through the same rules → state → render pipeline this sandbox uses.
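
Verlet integration, the method behind that cloth physics, drops straight into the update() slot. A minimal sketch for a single particle: velocity is implicit in the difference between current and previous position.

// Verlet step: no stored velocity; it lives in (x - px, y - py).
const p = { x: 100, y: 50, px: 100, py: 50 };  // px, py: previous position

function verletStep(dt, ax, ay) {
  const nx = 2 * p.x - p.px + ax * dt * dt;
  const ny = 2 * p.y - p.py + ay * dt * dt;
  p.px = p.x; p.py = p.y;
  p.x = nx; p.y = ny;
}

// Gravity only; cloth would relax spring constraints after each step.
verletStep(1 / 60, 0, 9.8);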

GPU-accelerated versions using WebGL or WebGPU can run these simulations with hundreds of thousands of particles by moving the update step to the graphics hardware. The architectural separation you see here — update logic decoupled from rendering — is exactly the contract a compute shader expects.

The emergent complexity you observe in the Swarm mode is the same principle behind neural population dynamics, market microstructure, and distributed robotics. Procedural thinking scales from pixels to systems.