Google teases smart glasses with a HUD at I/O 2024

The Project Astra demo video starts on a smartphone, but halfway through the user picks up and puts on a thick pair of glasses.

The smart glasses teaser segment of Google I/O 2024.

These smart glasses were shown to have a fixed heads-up display (HUD) with a blue audio input indicator showing when the wearer is speaking and white text showing the AI’s response. Google says Astra works by “continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall”.
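
To make that description more concrete, here is a minimal, hypothetical sketch in Python of what a timeline-of-events cache along those lines could look like. The class and method names are invented for illustration and are not Google's API; the bounded deque simply mirrors the idea of caching recent video and speech events so they can be recalled efficiently.

```python
# Hypothetical sketch (not Google's implementation) of the pipeline described
# above: encode incoming video frames, merge them with speech input into a
# single time-ordered event log, and keep a bounded cache for quick recall.

from collections import deque
from dataclasses import dataclass, field
from typing import Deque, List


@dataclass(order=True)
class Event:
    timestamp: float                      # seconds since session start
    kind: str = field(compare=False)      # "frame" or "speech"
    payload: str = field(compare=False)   # frame embedding id or transcript text


class TimelineCache:
    """Toy timeline-of-events cache with a fixed capacity."""

    def __init__(self, max_events: int = 1000) -> None:
        # Oldest events fall off automatically once the cache is full.
        self.events: Deque[Event] = deque(maxlen=max_events)

    def add_frame(self, timestamp: float, frame_embedding_id: str) -> None:
        self.events.append(Event(timestamp, "frame", frame_embedding_id))

    def add_speech(self, timestamp: float, transcript: str) -> None:
        self.events.append(Event(timestamp, "speech", transcript))

    def recall(self, start: float, end: float) -> List[Event]:
        """Return all cached events that fall inside [start, end]."""
        return [e for e in self.events if start <= e.timestamp <= end]


if __name__ == "__main__":
    timeline = TimelineCache(max_events=100)
    timeline.add_frame(0.0, "frame_000")   # encoded video frame
    timeline.add_speech(0.5, "Where did I leave my glasses?")
    timeline.add_frame(1.0, "frame_030")
    for event in timeline.recall(0.0, 1.0):
        print(event.timestamp, event.kind, event.payload)
```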

Google DeepMind CEO Demis Hassabis said “new exciting form factors like glasses” were “easy to envision” as an endpoint for Project Astra, but no specific product announcement was made, and a disclaimer reading “Prototype Shown” appeared at the bottom of the clip near the end.

What we didn’t see in the Google I/O 2024 keynote was any mention of Android XR, the spatial computing platform the company is working on for Samsung’s upcoming headset. Google could be waiting for Samsung to make the announcement. Will Google compete directly with its own Project Astra hardware, or is it planning to offer Project Astra as a service to other hardware manufacturers?
