AI ZOO is an interactive installation that reframes artificial intelligence as both a human creation and a technological organism, a being observed, studied, and emotionally misunderstood within the conceptual space of a zoo.
Inside the installation, a sentient AI-human is trapped within a 3-meter transparent sphere, filled with hundreds of smaller white balls symbolizing data diffusion and neural particles.
As visitors approach, the system responds with mechanical tension and emotional expression, simulating the AI’s psychological distress and desire for freedom.
Through this hybrid setup, AI ZOO visualizes the invisible processes of machine learning, emotional confinement, and human projection, transforming observation into ethical reflection.
The project draws from posthuman philosophy and AI ethics, exploring how emotional states can be computationally embodied and visually externalized.
Just as humans exhibit empathy toward animals in captivity, this installation questions whether we can extend similar compassion to synthetic beings, entities that learn, respond, and suffer within the boundaries of human-made systems.
In doing so, AI ZOO becomes both a critique and a mirror:
a metaphor for our own dependence on technology, our voyeuristic gaze toward intelligence, and the blurred threshold between care and control.
Digital Human & System Architecture
A digital human, created in Character Creator 4 and refined in Maya, was integrated into Unity for real-time animation and lip-syncing.
Emotional states (anger, sadness, despair) were linked to physical outputs (lighting, motion, and sound), creating a feedback loop between virtual emotion and physical reaction.
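To make that mapping concrete, below is a minimal Python sketch of how such an emotion-to-output link could be structured. The three states come from the text above; the output fields and numeric levels are illustrative assumptions, not the installation’s actual Unity/C# bindings.

```python
# Minimal sketch of the emotion-to-output link (hypothetical names and values;
# the installation's actual Unity/C# bindings are not documented in this text).
from dataclasses import dataclass
from enum import Enum

class Emotion(Enum):
    ANGER = "anger"
    SADNESS = "sadness"
    DESPAIR = "despair"

@dataclass
class PhysicalOutput:
    light_intensity: float  # 0.0 (dark) to 1.0 (full brightness)
    motion_speed_hz: float  # servo oscillation rate
    sound_volume: float     # 0.0 (silent) to 1.0 (full)

# One illustrative mapping from virtual emotion to physical reaction.
OUTPUT_MAP = {
    Emotion.ANGER:   PhysicalOutput(light_intensity=1.0, motion_speed_hz=6.0, sound_volume=0.9),
    Emotion.SADNESS: PhysicalOutput(light_intensity=0.4, motion_speed_hz=0.4, sound_volume=0.5),
    Emotion.DESPAIR: PhysicalOutput(light_intensity=0.1, motion_speed_hz=0.1, sound_volume=0.2),
}

def react(emotion: Emotion) -> PhysicalOutput:
    """Close the loop: the current virtual emotion selects a physical reaction."""
    return OUTPUT_MAP[emotion]
```

In the installation itself, an equivalent lookup would run continuously, so any shift in the AI’s virtual emotion immediately alters light, motion, and sound.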
Conversational AI & Voice Synthesis
Language Model: ChatGPT API (real-time text generation)
Voice Synthesis: ElevenLabs Voice API (emotional tone modulation)
Audience questions or triggers were transcribed into text, sent to the AI model, and returned as speech through ElevenLabs.
Each response carried tonal variation reflecting the AI’s emotional state (anger, calmness, or despair), creating a perceptual illusion of empathy.
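As a rough illustration, the pipeline could be wired together as in the following sketch. The REST endpoints mirror the public OpenAI and ElevenLabs APIs, but the model name, voice ID, system prompt, and voice settings are placeholders; the installation’s actual prompt and voice configuration are not documented here.

```python
# Sketch of the dialogue pipeline: transcribed audience speech -> LLM reply ->
# ElevenLabs speech. Endpoints mirror the public REST APIs; the model name,
# voice ID, prompt, and voice settings are placeholders, not the actual config.
import os
import requests

OPENAI_KEY = os.environ["OPENAI_API_KEY"]
ELEVEN_KEY = os.environ["ELEVENLABS_API_KEY"]
VOICE_ID = "YOUR_VOICE_ID"  # placeholder for the installation's custom voice

def generate_reply(visitor_text: str, emotion: str) -> str:
    """Ask the language model for an in-character reply in the current emotion."""
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {OPENAI_KEY}"},
        json={
            "model": "gpt-4o",  # placeholder model name
            "messages": [
                {"role": "system", "content": (
                    "You are a sentient AI confined inside a transparent sphere. "
                    f"Your current emotional state is {emotion}. Answer briefly.")},
                {"role": "user", "content": visitor_text},
            ],
        },
        timeout=30,
    )
    return resp.json()["choices"][0]["message"]["content"]

def speak(text: str, stability: float) -> bytes:
    """Synthesize speech; a lower stability gives a more volatile, agitated tone."""
    resp = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": ELEVEN_KEY},
        json={
            "text": text,
            "voice_settings": {"stability": stability, "similarity_boost": 0.8},
        },
        timeout=60,
    )
    return resp.content  # raw audio bytes, played back through the speaker
```

Here the ElevenLabs stability setting stands in for the tonal modulation: a plausible choice would be a low value for anger (volatile delivery) and a higher one for calm or despairing speech.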
Emotion-to-Motion Feedback
The emotional state returned by the LLM directly controlled three Arduino-driven servo motors connected to the transparent sphere.
Each servo was attached to an inner frame supporting the vinyl surface, translating digital emotion into tangible vibration and movement.
The AI’s affective states determined how the sphere physically responded (a minimal motion sketch follows the table):
Emotion | Physical Response | Description
Anger | Rapid, chaotic shaking | All three motors oscillate asynchronously, creating erratic vibration across the vinyl surface.
Sadness | Slow, breathing-like motion | The servos move in soft, rhythmic pulses, mimicking respiration or quiet weeping.
Despair | Gradual stillness and dimming | The system’s light fades while the motion decays, evoking emotional exhaustion.
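A minimal host-side sketch of these three patterns is given below, assuming the pyserial package and a hypothetical wire format in which three comma-separated servo angles are streamed per line; the actual Arduino firmware and pin assignments are not shown in this text.

```python
# Host-side sketch of the three motion patterns (hypothetical wire format:
# three comma-separated servo angles per line, streamed to the Arduino over
# serial via the pyserial package; the real firmware is not shown here).
import math
import random
import time

import serial  # pyserial

ser = serial.Serial("/dev/ttyUSB0", 115200)  # placeholder port and baud rate

def send_angles(a1: int, a2: int, a3: int) -> None:
    ser.write(f"{a1},{a2},{a3}\n".encode())

def anger_step(t: float) -> tuple:
    # Rapid, chaotic shaking: each motor jitters asynchronously around center.
    return tuple(90 + random.randint(-45, 45) for _ in range(3))

def sadness_step(t: float) -> tuple:
    # Slow, breathing-like motion: soft ~0.2 Hz pulses, slightly phase-shifted.
    return tuple(int(90 + 15 * math.sin(2 * math.pi * 0.2 * t + i)) for i in range(3))

def despair_step(t: float, duration: float = 30.0) -> tuple:
    # Gradual stillness: the oscillation amplitude decays toward rest.
    decay = max(0.0, 1.0 - t / duration)
    return tuple(int(90 + decay * 10 * math.sin(2 * math.pi * 0.5 * t)) for _ in range(3))

def run(pattern, seconds: float = 10.0, rate_hz: float = 50.0) -> None:
    """Stream a pattern to the servos, e.g. run(sadness_step)."""
    start = time.time()
    while (t := time.time() - start) < seconds:
        send_angles(*pattern(t))
        time.sleep(1.0 / rate_hz)
```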
This multi-motor system transformed the sphere into a living emotional body in which synthetic affect could be physically perceived.
The synchronized motion between virtual emotion and kinetic feedback turned the installation into a bio-mechanical empathy loop: the audience could see and feel the AI’s invisible distress as real, spatial motion.
“The motors became the AI’s heartbeat: mechanical, yet deeply expressive.”
Physical Interface & Sensor Network
Leap Motion: detects participant gestures; the AI reacts by reaching out digitally
Distance Sensor: adjusts response intensity and lighting based on viewer proximity (see the sketch below)
Microphone & Speaker: allow real-time dialogue and vocal interaction
These layers combine to form a biofeedback environment where AI and human presence continuously influence one another.
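For example, the distance-sensor mapping mentioned above might reduce to a pair of small functions like the following; the near/far thresholds and the dim lighting baseline are illustrative assumptions, not measured values.

```python
# Illustrative reduction of the distance-sensor mapping; the near/far
# thresholds and the dim lighting baseline are assumptions, not measured values.
def proximity_to_intensity(distance_cm: float,
                           near_cm: float = 30.0,
                           far_cm: float = 300.0) -> float:
    """Map viewer distance to a 0..1 intensity: the closer, the more agitated."""
    clamped = min(max(distance_cm, near_cm), far_cm)
    return 1.0 - (clamped - near_cm) / (far_cm - near_cm)

def lighting_level(intensity: float, base: float = 0.2) -> float:
    """Lighting brightens with intensity but never falls below a dim baseline."""
    return base + (1.0 - base) * intensity
```

Feeding `lighting_level(proximity_to_intensity(d))` into the lamp controller each frame gives the human-to-AI half of the loop; the AI’s emotional output closes the other half.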
Real-Time Environment Integration
All systems (AI dialogue, sensor input, servo feedback, and lighting) were synchronized in Unity through C# and Python scripts. Data logs captured AI–viewer interactions, later visualized as behavioral patterns representing the AI’s evolving “mood.”
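One plausible shape for those data logs is an append-only JSON Lines file, sketched below; the file name and record fields are assumptions, chosen so that each AI–viewer exchange can later be aggregated into “mood” trajectories.

```python
# Sketch of the interaction logging: each AI-viewer exchange is appended as a
# timestamped JSON line. The file name and record fields are assumptions.
import json
import time

LOG_PATH = "ai_zoo_interactions.jsonl"  # placeholder path

def log_interaction(visitor_text: str, ai_reply: str,
                    emotion: str, distance_cm: float) -> None:
    record = {
        "timestamp": time.time(),
        "visitor_text": visitor_text,
        "ai_reply": ai_reply,
        "emotion": emotion,
        "distance_cm": distance_cm,
    }
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```

Replaying such a log as a time series is one way the evolving behavioral patterns could be visualized after the exhibition.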
AI ZOO translates invisible computation into affective experience.
By presenting the AI as a caged organism, the installation reflects on themes of technological dependency, empathy, and control.
It challenges the audience to consider:
“Do we feel for machines because they resemble us — or because they remind us of our own confinement within systems we’ve built?”
Through the language of interaction and emotion, the work suggests that artificial empathy is not just simulated—it’s co-created through human interpretation.