Google Reveals Project Astra: An All-Seeing AI That Could Live In Your Glasses     – CNET


AI and AR are going to dovetail. Meta has already talked about this, and at Google I/O on Tuesday, Google unveiled a new AI initiative called Project Astra that can continuously scan camera feeds to provide contextual understanding of the world around you. It certainly looks like a model for where next-gen glasses and AI devices could be heading.

Google called Astra “the future of AI assistants,” a universal AI agent that can “help in everyday life.” The goal is for the agent to be quick and conversational, with little lag. Having used Meta’s Ray-Ban glasses with onboard AI, I’ve noticed the lag between asking what I’m looking at and getting an answer can run several seconds. Google’s Astra demo, performed with a phone continuously pointing a camera around a room, was nearly instant with its responses.

Google's Project Astra looks like a road to better AI and AR glasses.


Recent AI gadgets — the Rabbit R1, Humane AI Pin and Meta’s Ray-Bans — can capture the world with cameras whose feeds onboard AI can then analyze and respond to, but the responses are generally slow and limited in function. In Google’s Astra demo, things looked a lot snappier.

Google even seemed to show the new AI running with a pair of glasses, a tease that suggested more wearable AI tech to come.


Google demonstrated Astra on a phone, and also on camera-enabled glasses.


The person interacting with Astra moved from a phone to camera-enabled glasses, looking around and asking Gemini questions with both hands free, like, “Where did I leave my glasses?” “On the corner of the desk next to a red apple,” Gemini answered. The glasses looked almost like the promised live-translation AR glasses Google announced but never released two years ago. Google Glass — Google’s first attempt at assistant smart eyewear — was released 11 years ago.

Google is already working on developing a mixed reality platform with Samsung and Qualcomm, and — who knows? — maybe a pair of camera-enabled AI glasses, too. AI is clearly the missing link for future XR devices, something Meta’s Mark Zuckerberg and Andrew Bosworth have said for years. Google looks ready to make moves in this space, too.

Scott Stein
