Google's response to OpenAI: Project Astra announced

Six years ago, at its I/O developer event, Google showed an artificial intelligence demo called Duplex booking a haircut appointment over the phone. Now, years later, Google has announced Project Astra. In the demo it shared, Project Astra can see its surroundings through a phone's camera and answer questions about them.

Google DeepMind CEO Demis Hassabis said in his keynote that his team is working to develop universal AI agents that can help in daily life, and that Project Astra is a step toward this goal.

In the demo, the person holding the phone walks through parts of an office, pointing the device's camera, and Project Astra, at various objects and asking questions about them. In one example, the user says aloud, “Tell me when you see something making a sound,” and the Gemini-powered AI responds, “I see a speaker making a sound.” When the user then points at a part of the speaker and asks what it is, the AI answers, “That is the tweeter. It produces high-frequency sounds.”

In its statement, Google emphasizes that the video was recorded in a single take and in real time. Later in the video, Gemini identifies and explains a piece of code shown on a monitor, and tells the user which neighborhood they are in based on the view from the window.

The most impressive part comes when the user asks, “Do you remember where you saw my glasses?” Nothing had been said to the AI about glasses up to that point, and no glasses were even visible in the scene the user was showing through the phone's camera at that moment. Despite this, Gemini replies, “Yes, I remember. Your glasses are next to a red apple.”