Google showed an assistant of the future that sees and understands everything around

Demis Hassabis, head of Google DeepMind, spoke at the annual Google I/O developer conference about an early version of what the company calls a universal AI assistant. The system, codenamed Project Astra, is a multimodal AI assistant that works in real time: it can “see” its surroundings, recognize objects, and help with a variety of tasks.

“I have had this idea for a long time. We will have this universal assistant. It is multimodal, it is always with you. This assistant is simply useful. You get used to the fact that it is always there when you need it,” Hassabis said during the presentation.

Alongside the announcement, Google published a short video demonstrating some of the capabilities of the early version of Project Astra. An employee at Google’s London office activates the AI assistant and asks it to say when it “sees” something that can make sounds. She then pans the smartphone around, and when a speaker standing on the table comes into the camera’s view, the algorithm points it out. Next, she asks it to describe the colored crayons in a glass on the table, and the algorithm replies that they can be used to create “colorful creations.” The phone’s camera is then pointed at a section of a monitor displaying program code; she asks what exactly that part of the code is responsible for, and Project Astra almost instantly gives the correct answer. Later in the video, the assistant identifies the location of the Google office from the view out the window and performs several other tasks. All of this happened in near real time and looked very impressive.

According to Hassabis, Project Astra comes much closer than previous products to how a true real-time AI assistant should work. The algorithm is built on the Gemini 1.5 Pro large language model, Google’s most powerful neural network at the moment. To improve the assistant’s responsiveness, however, Google had to optimize the system to speed up request processing and reduce the latency of generated responses. Hassabis said the developers spent the last six months working precisely on accelerating the algorithm, including by optimizing the entire infrastructure around it.

In the future, Project Astra is expected to appear not only in smartphones but also in camera-equipped smart glasses. Since this is still an early version of the AI assistant, no date for a public launch has been announced.
