
Watch Google demo its Android XR glasses in front of a live crowd

See Google's Android XR glasses work like magic in this live demo.
Published on April 17, 2025

Android XR glasses at the TED conference (Jason Redmond/TED)
TL;DR
  • Google recently demoed prototype Android XR smart glasses during a TED Talk.
  • The video of that demo is out now, showing the glasses in action.
  • These glasses have features like live translation, real-time object recognition, navigation, and more.

The next hot product from the tech world may be smart glasses. Meta has already had success with its Ray-Ban collaboration, and Apple and Samsung are hot on its heels with their own projects. Google recently demoed prototype Android XR glasses, and now the video of this demo is out for everyone to see.

Google has previously shown off Android XR-powered smart glasses in a few teaser videos and has allowed a handful of publications to try them out. Taking things a step further, Google performed a live demonstration in front of a crowd last week during a TED Talk, where it revealed a few details about the smart glasses, which you can read about in our earlier article.

Footage of this demo has now been shared by the TED Talk X (formerly Twitter) account.

Misplace your things often? These AI glasses could help. In this live demo at #TED2025, computer scientist @izadi_shahram debuts Google’s prototype smart glasses, powered by the new @Android XR system. Watch the full demo here: https://t.co/F3Rugyig4C pic.twitter.com/7SPlrzThKW
— TED Talks (@TEDTalks) April 17, 2025

In the demo, the presenter asks Gemini to help her remember the title of the white book on the shelf behind her, and the AI answers both correctly and quickly. Next, she asks Gemini to find her missing hotel key card, which the AI is able to pinpoint. She finishes the demo by asking for directions, which are then shown on the in-lens display.

Given that this is prototype hardware and software, what was shown off is impressive, and the overall experience seems smooth and intuitive. It also helps that these smart glasses aren't much bigger than a normal pair of glasses; they are definitely sleeker than Meta's Orion XR glasses.

Got a tip? Talk to us! Email our staff at news@androidauthority.com. You can stay anonymous or get credit for the info, it's your choice.