
Scientists Just Shrunk an AI Brain to 1/6000th Its Original Size — Small Enough to Email


A human brain consumes less power than a light bulb. The AI systems trying to match it guzzle electricity by the megawatt.

Now scientists have built an AI model so compact it hints at how nature solved this efficiency problem hundreds of millions of years ago — and the answer is almost absurdly small.

Researchers at Cold Spring Harbor Laboratory, working with teams from Carnegie Mellon and Princeton, started with a standard AI vision model containing 60 million variables. Using data from macaque monkey neurons and statistical compression techniques similar to those used for digital photos, they shrank it to just 10,000 variables.

That's a 6,000-fold reduction. The compressed model is small enough to fit in an email attachment.
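The study's exact compression method isn't spelled out here, but the basic idea of statistical compression can be sketched with a standard technique: low-rank approximation, which discards redundant structure in a model's weights much as lossy image codecs discard redundant detail in a photo. This is an illustrative example only (the matrix size and rank are arbitrary choices, not figures from the study):

```python
import numpy as np

# Hypothetical dense layer with a 1000 x 1000 weight matrix
# (1,000,000 parameters).
rng = np.random.default_rng(0)
W = rng.standard_normal((1000, 1000))

# Truncated SVD: keep only the top-k singular components.
k = 8
U, s, Vt = np.linalg.svd(W, full_matrices=False)
W_approx = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storing U_k, the k singular values, and Vt_k replaces the
# original matrix at a fraction of the parameter count.
original = W.size                                  # 1,000,000
compressed = U[:, :k].size + k + Vt[:k, :].size    # 16,008
print(f"{original} -> {compressed} parameters "
      f"({original / compressed:.0f}x smaller)")
```

Here a rank-8 factorization cuts one million parameters to about sixteen thousand; pushing a whole network down 6,000-fold, while preserving performance, is a much harder version of the same game.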

"That is incredibly small," said Ben Cowley, an assistant professor at Cold Spring Harbor Laboratory and lead author of the study, published in Nature. "This is something we could send in a tweet or an email."

The remarkable part? The tiny model performed nearly as well as the original.

The team focused on V4 neurons — cells in the visual system that process colours, textures, curves, and 'proto-objects.' By studying what real monkey neurons responded to and stripping away everything redundant in the AI model, they revealed something fascinating about how biological brains might achieve their efficiency.

Some artificial neurons in the compressed model responded strongly to shapes with curves and edges. "When you go into the supermarket and you see the arranged fruit, your V4 neurons love that," Cowley explained. Others responded specifically to small dots, which the researchers suspect relates to primates' innate attention to eyes.

The implications ripple outward. If brains run on models this compact yet outperform AI systems running on supercomputers, something fundamental is different about how they're designed. Self-driving cars might run on far less powerful computers. AI assistants might work offline on phones. And scientists might finally crack open the black box of how we see.

"If our brains have less complex models and yet can do more than these AI systems, that tells us something about our AI systems," Cowley said.

Smaller, simpler, closer to nature. That's the direction. 🧠


