Webcam Phase Simulation
This interactive web application uses your device's camera to explore concepts of phase, timing, and perception. By manipulating the sliders, you can apply wave-like distortions and temporal effects to the live video feed, creating an immediate and intuitive connection between your input and the visual output.
How to Use
- Start Simulation: Click the "Start Simulation" button. You will be prompted to allow access to your camera.
- Sliders: Adjust the sliders to see their effects in real time.
  - Phase Shift: This controls the starting point of the wave distortion. As you move it, the entire wave pattern shifts up or down, demonstrating a change in phase.
  - Wave Amplitude: This determines the intensity or "height" of the wave. Higher values create a more pronounced horizontal displacement.
  - Wave Frequency: This adjusts how many waves appear vertically. A higher frequency means more, tighter waves.
  - Motion Blur: This slider controls a temporal effect. It leaves a faint trail of previous frames, simulating motion blur or persistence of vision. A value of 0 means no blur.
- Canvas Interaction: Click or tap anywhere on the video canvas to generate a visual ripple effect. If sonification is enabled, this will also produce a sound.
- Sonification: Toggle the "Sonification" checkbox to enable or disable audio feedback for your interactions. Sound is off by default.
Technical Details
This application is built entirely with vanilla JavaScript, HTML, and CSS, without relying on any external frameworks or libraries. This approach prioritizes performance and low-level control over the rendering process.
- Video Capture: The `navigator.mediaDevices.getUserMedia()` API is used to request access to the user's webcam. The live feed is streamed into a hidden `<video>` element (a minimal capture sketch follows this list).
- Real-time Rendering: A `<canvas>` element is the drawing surface. The main animation loop is driven by `requestAnimationFrame()`, which ensures smooth, efficient updates synchronized with the browser's refresh rate (the render-loop sketch after this list shows how these pieces fit together).
- Visual Effects:
  - The wave effect is achieved not by slow pixel-by-pixel manipulation, but by drawing thin horizontal slices of the video source onto the canvas, each with a calculated horizontal offset. The offset for each slice is determined by a sine function: `offset = sin(y + phase) * amplitude`. This is a performant way to create complex distortions.
  - Motion blur is simulated by drawing a semi-transparent black rectangle over the entire canvas before rendering the new frame. The opacity of this rectangle is controlled by the slider and determines how quickly previous frames fade away.
  - Ripples from user clicks are simple geometric shapes (circles) whose radius and opacity are updated each frame to create an expanding, fading animation.
- Sonification: The Web Audio API is used to generate sound. When the user interacts with the canvas, an `OscillatorNode` (to create a tone) and a `GainNode` (to control volume) are created dynamically. The gain is quickly ramped down to create a "pluck" or "drip" sound effect. The audio context is initialized on the first user interaction to comply with browser autoplay policies (see the interaction sketch after this list).
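A minimal sketch of the capture step described above, assuming a hidden `<video id="webcam">` element (the element id and the lack of error handling are illustrative, not taken from the app):

```js
// Request the webcam and pipe the stream into a hidden <video> element.
// Assumes <video id="webcam" autoplay playsinline hidden> exists in the page.
async function startCamera() {
  const video = document.getElementById('webcam');
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,   // video only; no audio track is needed for this simulation
    audio: false,
  });
  video.srcObject = stream;
  await video.play(); // start decoding frames so the canvas can sample them
  return video;
}
```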
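A sketch of the render loop combining the slice-based wave distortion and the motion-blur fade. The canvas id, the `params` object, and the way the frequency slider scales `y` are assumptions layered on top of the documented `offset = sin(y + phase) * amplitude` formula; it also assumes the canvas is sized to match the video:

```js
const canvas = document.getElementById('stage'); // assumed canvas id
const ctx = canvas.getContext('2d');

// Illustrative defaults; in the app these values come from the sliders.
const params = { phase: 0, amplitude: 20, frequency: 0.05, blur: 0.2 };

function render(video) {
  // Motion blur: instead of clearing the canvas, paint a translucent black
  // rectangle so leftovers from earlier frames fade out over several frames.
  // blur = 0 gives a fully opaque wipe (no trail); higher blur fades slower.
  ctx.fillStyle = `rgba(0, 0, 0, ${1 - params.blur})`;
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  // Wave distortion: copy the video one 1px-tall slice at a time, shifting
  // each slice horizontally by a sine offset based on its vertical position.
  for (let y = 0; y < canvas.height; y++) {
    const offset =
      Math.sin(y * params.frequency + params.phase) * params.amplitude;
    ctx.drawImage(
      video,
      0, y, video.videoWidth, 1,   // source slice (one row of the video)
      offset, y, canvas.width, 1   // destination slice, shifted horizontally
    );
  }

  requestAnimationFrame(() => render(video));
}

// With a playing video element (e.g. from the capture sketch above):
// startCamera().then((video) => requestAnimationFrame(() => render(video)));
```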
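Finally, a sketch of the click interaction, combining the expanding ripple with the Web Audio "pluck". It reuses the `canvas` from the sketch above; the pitch, ramp times, and ripple growth rate are illustrative values, not taken from the app:

```js
const ripples = [];              // each ripple: { x, y, radius, alpha }
let audioCtx = null;             // created lazily to satisfy autoplay policies
let sonificationEnabled = false; // mirrors the "Sonification" checkbox

canvas.addEventListener('click', (event) => {
  const rect = canvas.getBoundingClientRect();
  ripples.push({
    x: event.clientX - rect.left,
    y: event.clientY - rect.top,
    radius: 0,
    alpha: 1,
  });
  if (sonificationEnabled) playPluck();
});

function playPluck() {
  // The AudioContext must be created (or resumed) inside a user gesture.
  if (!audioCtx) audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  const osc = audioCtx.createOscillator(); // tone source
  const gain = audioCtx.createGain();      // volume envelope
  osc.frequency.value = 440;               // illustrative pitch
  osc.connect(gain);
  gain.connect(audioCtx.destination);

  // A fast ramp down gives the short "pluck"/"drip" character.
  const now = audioCtx.currentTime;
  gain.gain.setValueAtTime(0.3, now);
  gain.gain.exponentialRampToValueAtTime(0.001, now + 0.3);
  osc.start(now);
  osc.stop(now + 0.3);
}

// Called once per frame from the render loop: grow and fade each ripple.
function drawRipples(ctx) {
  for (let i = ripples.length - 1; i >= 0; i--) {
    const r = ripples[i];
    r.radius += 2;
    r.alpha -= 0.02;
    if (r.alpha <= 0) { ripples.splice(i, 1); continue; }
    ctx.strokeStyle = `rgba(255, 255, 255, ${r.alpha})`;
    ctx.lineWidth = 2;
    ctx.beginPath();
    ctx.arc(r.x, r.y, r.radius, 0, Math.PI * 2);
    ctx.stroke();
  }
}
```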
Future Directions
This simulation serves as a foundation for more advanced explorations:
- Advanced Effects: Implementing more complex shaders using WebGL for pixel-level manipulation, such as chromatic aberration, lens distortion, or real-time color filtering.
- Frame Buffering: Adding a true frame-delay effect by storing a buffer of past video frames (as `ImageData` or drawn to offscreen canvases) and displaying a frame from the past (a rough sketch follows this list).
- Data Sonification: Tying audio parameters (like pitch or volume) directly to the simulation parameters, or even to the video data itself (e.g., the average brightness of a screen region).
- More Interaction Modes: Using mouse/touch dragging to "paint" effects onto the canvas or to control parameters in a 2D space (e.g., X/Y position controls amplitude/frequency).
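For the frame-buffering idea, one possible shape is a ring of offscreen canvases; the buffer length and function names below are hypothetical, not part of the current app:

```js
const DELAY_FRAMES = 30;   // hypothetical delay: ~0.5 s at 60 fps
const frameBuffer = [];    // oldest frame first

// Call once per rendered frame: snapshot the live video into an offscreen canvas.
function pushFrame(video) {
  const frame = document.createElement('canvas');
  frame.width = video.videoWidth;
  frame.height = video.videoHeight;
  frame.getContext('2d').drawImage(video, 0, 0);
  frameBuffer.push(frame);
  if (frameBuffer.length > DELAY_FRAMES) frameBuffer.shift();
}

// Draw the frame from DELAY_FRAMES ago instead of the live feed.
function drawDelayedFrame(ctx) {
  if (frameBuffer.length === DELAY_FRAMES) {
    ctx.drawImage(frameBuffer[0], 0, 0, ctx.canvas.width, ctx.canvas.height);
  }
}
```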