For the first time, scientists have reconstructed short video clips from the neural firing patterns of mice as they watched films — offering an unprecedented peek into how an animal experiences the visual world. The work, published today in the journal *eLife*, is a landmark step toward understanding animal consciousness.
Researchers at the **Sainsbury Wellcome Centre at University College London (UCL)** trained an AI to predict how neurons in the mouse visual cortex respond to visual input. They then ran the process in reverse: feeding blank video data into the model and iteratively adjusting the imagery until the AI predicted brain activity patterns that matched what was recorded from real mice watching real films.
The result? Grainy, pixellated, but unmistakably recognisable reconstructions of the clips the mice had been watching. The animals had been shown footage of gymnastics, horse riding, and wrestling, and the AI's reconstructions captured the basic structure of the movement on screen.
**"With humans you can just ask them what they're dreaming or hallucinating. But you can't do that with animals. This gives us a way in."** — Dr. Joel Bauer, Sainsbury Wellcome Centre, UCL
**How It Works**
The technique involves two stages. First, the researchers used an infrared laser technique to record the firing of individual neurons in the mice's visual cortex — the region of the brain that processes what we see — as the animals watched 10-second movie clips. The laser light allowed them to observe hundreds of neurons simultaneously with extraordinary precision.
Second, they fed that neural data into a competition-winning AI model built specifically to predict visual cortex responses. By running the model in reverse, iteratively adjusting a candidate video until the predicted activity matched the recording, they could reconstruct what the visual system had experienced.
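The "run it in reverse" step above can be sketched in a few lines. The code below is a toy illustration, not the study's actual pipeline: a fixed random linear map `W` stands in for the trained deep network, and plain gradient descent adjusts a blank frame until the model's predicted activity matches a "recorded" pattern. All names and dimensions here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_neurons = 64, 32                 # tiny stand-in dimensions
W = rng.normal(size=(n_neurons, n_pixels))   # frozen stand-in for the trained model

def predict(frame):
    """Model's predicted activity of n_neurons neurons for a flattened frame."""
    return W @ frame

# Pretend this activity was recorded while a mouse watched a real frame
true_frame = rng.normal(size=n_pixels)
recorded = predict(true_frame)

# Start from a blank frame and descend the prediction-error gradient
frame = np.zeros(n_pixels)
lr = 0.005
for _ in range(5000):
    error = predict(frame) - recorded    # mismatch with the recording
    frame -= lr * (W.T @ error)          # gradient of 0.5 * ||error||**2

final_mismatch = float(np.linalg.norm(predict(frame) - recorded))
print(f"remaining mismatch: {final_mismatch:.4f}")
```

Note that with more pixels than neurons, many different frames can produce the same predicted activity; the optimisation simply finds one consistent image, which is one reason such reconstructions come out blurry.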
The reconstructed clips aren't cinema-quality: mice have significantly poorer eyesight than humans, so the videos appear blurry and low-resolution. But the researchers believe further technical refinement could sharpen the output roughly **sevenfold**.
**Why This Is Extraordinary**
For anyone who has ever wondered what their dog or cat actually sees — and how they experience it — this research matters deeply. Animals can't tell us how they perceive colour, depth, movement, or contrast. They can't describe whether a sunset looks like anything to them, or whether they recognise a person's face.
This technique opens the door to answering those questions empirically rather than through guesswork.
- **Animal welfare:** Understanding how animals perceive pain, stress, and comfort could transform how we assess and improve conditions in farms, labs, shelters, and zoos.
- **Veterinary medicine:** Conditions like cataracts, retinal degeneration, and neurological disorders could be evaluated from the animal's subjective experience.
- **Comparative neuroscience:** Mapping the differences and similarities between how different species process visual information may reveal deep truths about the evolution of consciousness itself.
**Looking at Both Eyes — and Beyond**
The next step, according to the team, is to reconstruct a fuller field of view by recording the neural data driven by each eye separately and then combining the two. A full binocular reconstruction would come significantly closer to the mouse's complete visual experience.
Dr. Bauer noted that similar approaches in humans — brain-scanning technologies that could reconstruct what someone imagines or dreams — would raise profound privacy questions. In animals, however, the technique is purely a tool for understanding, not surveillance.
**A Deeper Empathy With the Animal Kingdom**
There's something quietly beautiful about this research. For centuries, humans have speculated about what it's like to be a bat, a dolphin, a mouse. With this method, science has made the first tentative steps toward a real answer.
The reconstructed movies are grainy and imperfect. But they represent something genuinely new: a moment where we looked through another creature's eyes — not metaphorically, but through mathematics, light, and the firing of neurons — and caught a glimpse of a world we've always wondered about. 🧠🐭
*Source: The Guardian · eLife journal · Sainsbury Wellcome Centre, UCL — March 10, 2026*