Every animal owner has experienced it: a gut feeling that something is wrong, before any clear symptom appears. The slight posture change. The ears held differently. Something in the eyes.
Now scientists are training artificial intelligence to catch what humans can only intuit.
**The HoliWell Project**
Researchers at the **Centre for Machine Vision** at the **University of the West of England (UWE Bristol)** have launched **HoliWell** — a project to teach AI systems to recognise emotional cues in animals using 3D imaging and deep learning.
The goal is to build a system that can detect not just physical symptoms of illness, but the subtler signals of **stress, discomfort, pain, and positive emotional states** — the full emotional vocabulary that animals express through body language, posture, and facial expression.
**Built on a Decade of Research**
HoliWell isn't starting from scratch. The UWE Bristol team has spent over a decade developing 3D imaging technology and machine learning models for livestock health monitoring. They can already detect conditions like lameness and respiratory disease from camera footage alone.
Now they're going deeper — trying to understand not just whether an animal is sick, but how it **feels**.
The science behind this is robust. Decades of animal behaviour research have established that animals show consistent, measurable physiological and postural responses to emotional states — from ear positions in cattle and sheep to facial action units in horses, pigs, and even fish. The **Grimace Scale**, already used in veterinary medicine to assess pain, shows that emotion leaves physical traces.
AI, trained on thousands of hours of animal footage, could learn to read those traces faster, more consistently, and at a scale no human observer could match.
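Grimace scales make this concrete: an observer scores a small set of facial "action units" (each 0 = absent, 1 = moderately present, 2 = obviously present) and sums them into a pain score. A minimal sketch of that scoring scheme is below; the unit names and example values are illustrative, not taken from any one published scale.

```python
# Illustrative sketch of grimace-scale scoring: each facial action unit
# gets a score of 0, 1, or 2, and the total is the pain indicator.
# Unit names here are hypothetical examples, not a specific species' scale.
ACTION_UNITS = [
    "orbital_tightening",
    "ear_position",
    "cheek_tightening",
    "nostril_shape",
    "whisker_change",
]

def grimace_score(unit_scores: dict) -> int:
    """Sum per-unit scores (0 = absent, 1 = moderate, 2 = obvious)."""
    for unit, score in unit_scores.items():
        if unit not in ACTION_UNITS or score not in (0, 1, 2):
            raise ValueError(f"invalid score for {unit}: {score}")
    return sum(unit_scores.values())

example = {
    "orbital_tightening": 2,
    "ear_position": 1,
    "cheek_tightening": 0,
    "nostril_shape": 1,
    "whisker_change": 0,
}
print(grimace_score(example))  # → 4
```

An AI system would replace the human observer in the scoring step, estimating each action unit from footage, while the aggregation into a welfare indicator stays just as simple.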
**The Implications Are Vast**
If HoliWell succeeds, the applications span almost every domain where humans interact with animals:
- **Farmers** could receive alerts when livestock are stressed — catching problems before they become costly
- **Vets** could assess post-surgical pain or recovery progress without invasive tests
- **Shelter workers** could identify anxious or depressed animals and adapt their care
- **Wildlife researchers** could monitor wild populations for stress responses to habitat change or human disturbance
- **Pet owners** could, eventually, use an app to understand what their dog or cat is experiencing
**Why It Matters**
The world contains hundreds of billions of animals in human care — livestock, pets, zoo animals, lab animals. For most of them, welfare assessment relies on the training and intuition of individual humans. That's a fragile system, prone to gaps and blind spots.
A reliable, scalable AI tool that can read animal emotional states would fundamentally shift the relationship between humans and the animals they share the world with.
We've spent centuries learning to speak to animals. HoliWell wants to teach machines to truly listen. 🐄🤖💚
*Sources: Bristol 247 · UWE Bristol Centre for Machine Vision · Animal Behaviour Research (published March 2026)*