Google AI glasses prepare to take center stage

Mar 2, 2026

11:44pm UTC

Nine months after my first demo, Google's AI glasses still feel like they could change everything. And my second demo at MWC 2026 this week only confirmed it.

I wasn't allowed to take photos during the demo since these were prototypes and not the final product. Even so, the promise is clear: like the classic Meta Ray-Bans, they look strikingly similar to regular glasses. The final product will be produced in collaboration with popular eyewear brands Warby Parker and Gentle Monster, likely making them more stylish than the typical geek glasses.

Google demoed AI glasses at MWC 2026. Photo by Sabrina Ortiz

The in-lens display is the biggest highlight, as it opens up a whole new range of capabilities. Smart glasses are gaining momentum largely through AI integration and the ability to fuse the physical and digital worlds, but there are also practical, everyday wins, like reading messages or following turn-by-turn navigation without pulling out your phone.

The in-lens display is well-positioned and easy to read. During the five-minute demo, I asked Gemini multiple questions, watched my words get accurately transcribed and sent to the chatbot, and received responses in real time.

I also tried the Nano Banana integration, which let me ask Gemini to take a photo of what I was looking at and modify it. I asked it to add a space-themed background. While it wasn't the most practical everyday scenario, the image quality was impressive, and the processing was fast (around 15 seconds, I was told). Last time, I demoed Google Maps turn-by-turn navigation and came away equally impressed.

Following the surprise success of the classic Meta Ray-Bans, last year Google announced that it was re-entering the category with its own smart glasses. When worn, Google's AI glasses feel much closer to the original Meta Ray-Bans, which owe their popularity largely to their comfort and the fact that they look like normal glasses. However, Google's version is more functionally similar to the bulkier and more expensive Meta Ray-Ban Displays, which look less like normal glasses and more like a tech product.

Our Deeper View

There's already growing acceptance of AI glasses, and since Google's glasses look so much like regular glasses while adding so much functionality through the in-lens display, I think they are poised to push smart glasses adoption to the next level. Important details that will shape their appeal, such as battery life and responsiveness, are still to be determined. However, all the pieces may be coming together. Qualcomm just unveiled Snapdragon Wear Elite, a platform designed to power next-gen AI wearables with always-on, low-power, on-device AI processing. The next year will be pivotal for the AI wearable category, and Google's take on smart glasses is likely to redefine it by making in-lens displays mainstream.