There is a limit to how small you can make a display pixel before it stops working properly.
For years, that limit has been one of the key barriers standing between the smart glasses we have and the smart glasses we imagine — the ones that feel like ordinary eyewear but project a full, vivid, high-resolution image directly into your field of view.
Researchers at Julius-Maximilians-Universität Würzburg (JMU) in Germany have just shattered that limit.
Their team has created the world's smallest functioning OLED pixel, measuring just 300 by 300 nanometres. To put that in perspective: a human hair is around 70,000 nanometres wide, which makes the pixel roughly 233 times narrower than a hair. And it works, without sacrificing brightness.
The implications are staggering. At that pixel density, a full HD display would fit on an area the size of a grain of sand.
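A quick back-of-the-envelope check bears these numbers out. The short Python sketch below uses only the figures stated above (300 nm pixel, 70,000 nm hair, 1920 × 1080 Full HD); the roughly half-millimetre size assumed for a grain of sand is my own ballpark, not a figure from the article.

```python
# Sanity-check the article's arithmetic.
PIXEL_PITCH_NM = 300      # reported pixel size: 300 x 300 nanometres
HAIR_WIDTH_NM = 70_000    # typical human hair width cited in the text

# How many times wider a hair is than one pixel
ratio = HAIR_WIDTH_NM / PIXEL_PITCH_NM
print(f"hair / pixel ratio: {ratio:.0f}")   # -> 233

# Footprint of a Full HD (1920 x 1080) pixel array at this pitch
width_mm = 1920 * PIXEL_PITCH_NM * 1e-6     # nm -> mm
height_mm = 1080 * PIXEL_PITCH_NM * 1e-6
print(f"Full HD footprint: {width_mm:.2f} mm x {height_mm:.2f} mm")
# -> 0.58 mm x 0.32 mm, comparable to a ~0.5 mm grain of sand (assumed size)
```

At that pitch the whole 1920 × 1080 array spans barely half a millimetre on its long side, which is where the grain-of-sand comparison comes from.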
The engineering challenge wasn't just making the pixel small. Any OLED element at that scale faces a fundamental problem: current injected into a nanoscale pixel tends to distribute unevenly, leading to dim, inconsistent output. Previous attempts to shrink OLED pixels below a certain threshold resulted in significant brightness loss that made them practically unusable for real-world display applications.
The JMU team solved this with two innovations. First, they developed a novel metallic contact structure that injects current efficiently while simultaneously amplifying and emitting the generated light — a single element doing double duty. Second, they designed a specially manufactured insulation layer to ensure even current distribution at nanoscale dimensions, preventing the uneven output that had frustrated previous attempts.
The current prototype emits orange light and operates at approximately one percent efficiency — which the team acknowledges is not yet commercially viable. But the physics is proven. The constraint that made ultra-miniaturised OLED displays impossible has been removed. What remains is optimisation: improving efficiency and extending the technology to produce red, green, and blue (RGB) colour emission.
Smart glasses have been 'almost there' for years. The processing power exists. The AI integration is advancing rapidly. What has lagged is display technology — specifically, the inability to produce high-resolution visuals in a form factor small enough to sit unnoticeably in a standard lens.
A 300-nanometre pixel changes that calculation entirely.
The team's findings have been published in scientific literature and are already attracting attention from the display industry. The path from a university lab to a commercial AR lens is long — but it just got considerably shorter.