The gaze panopticon
Can the gaze be turned into a collective power?
This interactive installation reimagines Foucault’s Panopticon through a radical reversal: the confined self transforms external surveillance into collective power. Audiences join via a shared web link (https://the-gaze-panopticon.onrender.com/audience.html) where real-time face detection tracks their attention. Each gaze is measured by duration and multiplied by the number of watchers, accumulating as tangible pressure on the confined self. As collective observation intensifies, the self channels this force to shatter the prison structure, taking us to a place where what was designed to control becomes the catalyst for liberation.
#Speculative Design
#Creative Coding
Concept
As collective observation intensifies, pressure accumulates: the Panopticon structure begins to shake, and the Self deforms under stress. When pressure reaches critical mass, the prison structure shatters and the Self transforms into a luminous, free-floating form.
Design Goals
- Create real-time networked interaction between one experiencer and many audience members
- Use facial detection to quantify attention as measurable force
- Visualize abstract concepts (pressure, surveillance) through 3D spatial experience
- Build entirely web-based for accessibility—no app downloads required
Technical Architecture
Built with Three.js, Socket.io, face-api.js — entirely web-based, no installation required.
- Face Detection: face-api.js tracks attention duration in real-time
- 3D Morphing: Blender shape keys → Three.js morph targets (pressure 0→100)
- WebSockets: Socket.io synchronizes state across all connected clients
- Particles: ~5K-10K vertices explode with radial velocity at rupture
Development
- At pressure = 0: morphTargets[0] = 0.0, the blob stays in base form
- At pressure = 50: morphTargets[0] = 0.5, the blob is 50% deformed
- At pressure = 100: morphTargets[0] = 1.0, the blob is fully compressed
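The pressure-to-morph mapping above is a linear normalization. A minimal sketch (function and variable names here are illustrative, not the project's actual code):

```javascript
// Map accumulated pressure (0-100) to a morph target influence (0.0-1.0).
// In Three.js the result would drive mesh.morphTargetInfluences[0],
// blending toward the Blender shape key's fully-compressed state.
function pressureToInfluence(pressure) {
  const clamped = Math.min(100, Math.max(0, pressure)); // guard against overshoot
  return clamped / 100;
}

// Per-frame update (hypothetical `selfMesh` and `state` objects):
//   selfMesh.morphTargetInfluences[0] = pressureToInfluence(state.pressure);
```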
1st try - MediaPipe Face Mesh
Why it failed: Google's MediaPipe model promised 468 facial landmarks with high accuracy. However, it relied on WebAssembly (WASM) binaries that failed to initialize.
2nd try - TensorFlow.js Face Landmarks
I pivoted to TensorFlow.js, which uses pure JavaScript/WebGL instead of WASM. The model loaded successfully and the camera feed worked, but detection consistently returned 0 faces.
3rd try - face-api.js
face-api.js loaded reliably and detected faces, so it became the final choice.
Gaze Event Handling
Face detection on the client emits three events: gaze-start, gaze-hold (every 100ms), and gaze-end. The server accumulates pressure accordingly.
- Timing: 1 person gazing continuously = ~2 pressure/sec → 50 seconds to reach rupture (100)
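The accumulation logic can be sketched as a small pure module, assuming each gaze-hold tick (every 100ms) adds 0.2 pressure, so one continuous gazer yields ~2 pressure/sec and the rate scales with the number of watchers; class and constant names are my own, not the project's:

```javascript
// Sketch of server-side pressure accumulation for the three gaze events.
// gaze-hold arrives every 100ms per watcher; 0.2 per tick = ~2/sec each.
const HOLD_INCREMENT = 0.2;
const RUPTURE_AT = 100;

class PressureState {
  constructor() {
    this.pressure = 0;
    this.watchers = new Set(); // socket ids currently gazing
  }
  gazeStart(id) { this.watchers.add(id); }
  gazeEnd(id) { this.watchers.delete(id); }
  gazeHold(id) {
    if (!this.watchers.has(id)) return; // ignore holds without a start
    this.pressure = Math.min(RUPTURE_AT, this.pressure + HOLD_INCREMENT);
  }
  get ruptured() { return this.pressure >= RUPTURE_AT; }
}

// With Socket.io the handlers would wire straight through, e.g.:
//   socket.on('gaze-hold', () => {
//     state.gazeHold(socket.id);
//     io.emit('pressure', state.pressure); // sync all clients
//   });
```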
Each phase required distinct visual language to communicate psychological states. I implemented dynamic materials, particle systems, and animations responding to accumulated pressure.
- Pristine metallic form
- Subtle gray glow
- Shape keys begin deforming
Rupture (70-100 pressure)
- Structure explodes
- ~5K particles with radial velocity
- Cyan/magenta color gradient
- Red/yellow flashing (20Hz)
- Shake effects intensify
- Visual tension builds
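The rupture burst's radial velocities reduce to plain vector math: give each particle a random unit direction scaled by a speed, then feed the array into a Three.js BufferGeometry attribute. A sketch, where the particle count and speed range are assumptions:

```javascript
// Generate radial velocities for the rupture explosion: each particle
// flies outward from the center along a random unit direction.
function makeRadialVelocities(count, minSpeed, maxSpeed) {
  const velocities = new Float32Array(count * 3);
  for (let i = 0; i < count; i++) {
    // Uniform point on the unit sphere: uniform z in [-1, 1] plus
    // uniform azimuth theta avoids clustering at the poles.
    const theta = Math.random() * 2 * Math.PI;
    const z = Math.random() * 2 - 1;
    const r = Math.sqrt(1 - z * z);
    const speed = minSpeed + Math.random() * (maxSpeed - minSpeed);
    velocities[3 * i]     = r * Math.cos(theta) * speed;
    velocities[3 * i + 1] = r * Math.sin(theta) * speed;
    velocities[3 * i + 2] = z * speed;
  }
  return velocities;
}

// e.g. const vel = makeRadialVelocities(5000, 1.0, 3.0);
// Each frame: position += velocity * dt for every particle.
```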
Transmutation (Manual trigger)
- Self resets, high roughness
- Zero metalness, emission 3.0
- Gentle floating animation
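The Transmutation look maps directly onto standard MeshStandardMaterial parameters. A sketch, where "high roughness" is taken as 1.0 and the white emissive color is an assumption (only zero metalness and emission 3.0 are stated above):

```javascript
// Transmutation phase material parameters for THREE.MeshStandardMaterial.
const transmutationParams = {
  roughness: 1.0,          // "high roughness": matte, non-reflective (assumed value)
  metalness: 0.0,          // "zero metalness"
  emissive: 0xffffff,      // assumed white self-glow color
  emissiveIntensity: 3.0,  // "emission 3.0"
};

// Applied as: selfMesh.material = new THREE.MeshStandardMaterial(transmutationParams);
```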
Demo & Reflection
Want to be gazed at?: https://the-gaze-panopticon.onrender.com/index.html
Effect Video
This project intentionally uses the tools of surveillance, like cameras, algorithms, and quantified attention, to imagine their opposite. This contradiction remains unresolved. But perhaps that's the point: transformation doesn't come from perfect solutions, but from the willingness to question the structures we inhabit.
The work reveals its own limitations. You watch through screens, mediated by the very tools of digital surveillance we live under. The face detection that enables "liberation" is the same technology used for mass surveillance in authoritarian states. But perhaps this liminal space between constraint and freedom, between complicity and resistance, is exactly where transformation begins.
Next step
- Better user experience: responsive design for mobile and tablets
- WebXR integration: may introduce VR/AR mode for a more immersive experience
- Session persistence: database storage of game states and histories
References
Socket.io Documentation
face-api.js GitHub
WebGL Fundamentals