Procedural animation means motion generated by rules, mathematics, and state — not keyframes or recorded paths. Each frame is computed fresh from parameters: time, forces, constraints, and emergent interactions. The simulation never repeats exactly, yet remains internally consistent.
This sandbox demonstrates five distinct procedural principles, each running through the same engine loop. Move your pointer to inject energy, and watch how each mode responds differently to the same input.
Click or tap anywhere on the canvas to interact with the current simulation — attract particles, perturb orbits, or inject noise. On desktop, clicking and dragging creates continuous influence. On touch devices, press and drag across the canvas.
Keyboard shortcuts: 1–5 switch modes, R resets the simulation, S toggles sound, D toggles demo mode. Leave the sandbox idle for 10 seconds and demo mode activates automatically, cycling through all modes.
The engine separates state update from rendering inside a single requestAnimationFrame loop. Each mode implements three methods and is registered with a mode manager that handles transitions cleanly.
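The contract described above can be sketched as follows. This is a hedged illustration, not the sandbox's actual source: the names `ModeManager`, `init`, `update`, and `render` are assumptions standing in for the unnamed "three methods" the text mentions.

```javascript
class ModeManager {
  constructor() { this.modes = new Map(); this.active = null; }
  register(name, mode) { this.modes.set(name, mode); }
  switchTo(name) {
    const next = this.modes.get(name);
    if (!next) return;
    this.active = next;
    this.active.init(); // fresh state on every transition
  }
  tick(dt, input, ctx) {
    if (!this.active) return;
    this.active.update(dt, input); // advance simulation state only
    this.active.render(ctx);       // draw current state, mutate nothing
  }
}

// A minimal mode: all state lives on the mode object, never in the renderer.
const driftMode = {
  t: 0,
  init() { this.t = 0; },
  update(dt, input) { this.t += dt; },
  render(ctx) { /* draw using this.t */ },
};

const manager = new ModeManager();
manager.register('drift', driftMode);
manager.switchTo('drift');

// In the browser, tick() runs inside a single requestAnimationFrame loop:
if (typeof requestAnimationFrame !== 'undefined') {
  let last = performance.now();
  const frame = (now) => {
    manager.tick((now - last) / 1000, { x: 0, y: 0, active: false }, null);
    last = now;
    requestAnimationFrame(frame);
  };
  requestAnimationFrame(frame);
}
```

Keeping `update` and `render` as separate calls is what makes transitions clean: switching modes only swaps which object receives the tick, and `init` guarantees no stale state leaks across the change.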
Input is unified across mouse, touch, and keyboard into a single input object with x, y, active, vx, vy fields. Modes never access raw events — they receive a normalized coordinate and activity flag each frame.
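A minimal sketch of that normalization, assuming pointer events as the unifying layer. The field names `x`, `y`, `active`, `vx`, `vy` come from the text; the event plumbing and the fixed 1/60 s frame estimate are illustrative assumptions.

```javascript
function makeInput() {
  return { x: 0, y: 0, active: false, vx: 0, vy: 0 };
}

// Convert a client-space point into canvas-relative coordinates and
// derive per-frame velocity from the previous position.
function applyPointer(input, clientX, clientY, rect, dt) {
  const nx = clientX - rect.left;
  const ny = clientY - rect.top;
  input.vx = dt > 0 ? (nx - input.x) / dt : 0;
  input.vy = dt > 0 ? (ny - input.y) / dt : 0;
  input.x = nx;
  input.y = ny;
  return input;
}

// Browser wiring: mouse and touch both arrive as pointer events,
// so one handler funnels every device into the same object.
if (typeof document !== 'undefined') {
  const canvas = document.querySelector('canvas');
  const input = makeInput();
  canvas.addEventListener('pointerdown', () => { input.active = true; });
  canvas.addEventListener('pointerup',   () => { input.active = false; });
  canvas.addEventListener('pointermove', (e) =>
    applyPointer(input, e.clientX, e.clientY,
                 canvas.getBoundingClientRect(), 1 / 60));
}
```

Because modes read this one object each frame instead of subscribing to events, the same mode code works unchanged on desktop and touch.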
The Web Audio API oscillator tracks the active mode's energy() value, mapping it to frequency and gain. You hear the simulation's state — not a soundtrack.
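One way that mapping might look. The `energy()` method is from the text; the frequency range (110–880 Hz), the gain ceiling (0.2), and the smoothing constant are illustrative assumptions.

```javascript
// Map a normalized energy value onto audible parameters.
function energyToAudio(energy) {
  const e = Math.min(1, Math.max(0, energy)); // clamp to [0, 1]
  return {
    frequency: 110 + e * (880 - 110), // low hum at rest, bright tone at peak
    gain: e * 0.2,                    // keep the overall level gentle
  };
}

// Browser wiring: one oscillator whose parameters ease toward the
// simulation's current energy every frame.
if (typeof AudioContext !== 'undefined') {
  const audio = new AudioContext();
  const osc = audio.createOscillator();
  const amp = audio.createGain();
  osc.connect(amp).connect(audio.destination);
  osc.start();
  const follow = (mode) => {
    const { frequency, gain } = energyToAudio(mode.energy());
    osc.frequency.setTargetAtTime(frequency, audio.currentTime, 0.05);
    amp.gain.setTargetAtTime(gain, audio.currentTime, 0.05);
    requestAnimationFrame(() => follow(mode));
  };
  // call follow(activeMode) once sound is toggled on
}
```

Using `setTargetAtTime` rather than setting values directly avoids clicks when the energy jumps between frames.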
Every mode here is a simplified gateway to a broader family of techniques. Real-time fluid simulation, cloth physics with Verlet integration, reaction-diffusion systems for biological patterns, cellular automata — all operate through the same rules → state → render pipeline this sandbox uses.
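Of the techniques listed above, Verlet integration is compact enough to sketch here. This is a generic position-Verlet step, not code from any particular cloth engine; the point structure and gravity constant are illustrative.

```javascript
// Position Verlet: the new position extrapolates from the current and
// previous positions, so velocity is implicit in (x - px) and never
// stored explicitly. Cloth solvers pair this with distance constraints.
function verletStep(p, accel, dt) {
  const nx = 2 * p.x - p.px + accel.x * dt * dt;
  const ny = 2 * p.y - p.py + accel.y * dt * dt;
  p.px = p.x; p.py = p.y; // current position becomes the previous one
  p.x = nx;  p.y = ny;
  return p;
}
```

Because velocity is implicit, constraints can simply move points to valid positions and the integrator absorbs the correction — which is why Verlet is the usual choice for cloth.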
GPU-accelerated versions using WebGL or WebGPU can run these simulations with hundreds of thousands of particles by moving the update step to the graphics hardware. The architectural separation you see here — update logic decoupled from rendering — is exactly the contract a compute shader expects.
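That contract can be seen in miniature even in JavaScript: an update step written as a pure loop over flat typed arrays, with no rendering calls, ports almost line-for-line to a compute shader. This sketch is illustrative, not the sandbox's actual update code.

```javascript
// Structure-of-arrays update step: positions and velocities live in
// flat Float32Arrays (x0, y0, x1, y1, ...), exactly the layout a GPU
// buffer would use. Nothing here touches the canvas.
function updateParticles(pos, vel, n, dt) {
  for (let i = 0; i < n; i++) {
    pos[2 * i]     += vel[2 * i]     * dt;
    pos[2 * i + 1] += vel[2 * i + 1] * dt;
  }
}
```

On the GPU, the loop body becomes the shader and each invocation handles one index; the renderer then reads the same buffer, which is the decoupling the text describes.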
The emergent complexity you observe in the Swarm mode is the same principle behind neural population dynamics, market microstructure, and distributed robotics. Procedural thinking scales from pixels to systems.