
Google’s Project Astra: All You Need to Know About This New AI Assistant

Highlights
  • Project Astra is Google’s answer to OpenAI’s GPT-4o.
  • Astra stands for ‘advanced seeing and talking responsive agent’.

This year’s Google I/O 2024 was all about artificial intelligence, with the company’s Gemini assistant occupying a central position in the keynote. Alongside it, Google also showcased Project Astra, a real-time, multimodal AI assistant that can hear, see, and identify objects using a phone’s camera or AR glasses. Project Astra appears to be Google’s answer to OpenAI’s GPT-4o.

So what exactly is Project Astra, and how will it differ from Gemini? Let’s take a look at the details.

What is Google’s Project Astra?

Google says that Project Astra represents the future of AI assistants; the name stands for ‘advanced seeing and talking responsive agent’. The Mountain View tech giant claims that its AI assistant can understand and respond to the dynamic and complex world much like humans do.

According to the company, the AI assistant can process information faster by continuously encoding video frames and combining the video and speech input into a timeline of events. This lets the assistant cache the information for efficient recall. Google claims that the assistant can therefore better understand the context in which it is being used and respond promptly.
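
To picture how such a timeline might work, here is a purely illustrative Python sketch, not Google’s actual implementation: encoded video frames and speech inputs are appended to a bounded, time-ordered cache that can later be searched for recall. The Timeline and Event names, and the plain-text descriptions standing in for encoded representations, are hypothetical.

```python
# Illustrative sketch only (not Google's implementation): encoded video
# frames and speech snippets are merged into one chronological "timeline"
# and cached so they can be recalled later.
import time
from collections import deque
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Event:
    timestamp: float   # when the input was observed
    modality: str      # e.g. "video_frame" or "speech"
    description: str   # stand-in for an encoded representation


class Timeline:
    def __init__(self, max_events: int = 1000):
        # Bounded cache: the oldest events are dropped once the limit is hit.
        self.events: deque[Event] = deque(maxlen=max_events)

    def add(self, modality: str, description: str) -> None:
        self.events.append(Event(time.time(), modality, description))

    def recall(self, matches: Callable[[Event], bool]) -> Optional[Event]:
        # Search newest-first for an event that satisfies the query.
        for event in reversed(self.events):
            if matches(event):
                return event
        return None


# Example query in the spirit of "Do you remember where you saw my glasses?"
timeline = Timeline()
timeline.add("video_frame", "glasses on the desk near a red apple")
timeline.add("speech", "tell me when you see something that makes sound")
found = timeline.recall(lambda e: "glasses" in e.description)
print(found.description if found else "not seen")
```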

This means people could use Astra as a real-time assistant of sorts, asking it questions and getting instant answers. The difference is that Astra could appear on various devices, such as glasses, and might not be limited to just phones.

In a demo video, Google showcased some of Astra’s abilities. Take a look at the video below.

Here’s What Project Astra Answered in the Demo Video

  • The AI assistant could identify a speaker using the command ‘tell me when you see something that makes sound’. When asked, it could even point out the speaker’s tweeter and describe that it ‘produces high-frequency sounds’.
  • The AI assistant could understand the functionality of a piece of code and provide more specific details about it, identifying that the code defines encryption and decryption functions using the AES-CBC encryption mode (a rough sketch of such functions appears after this list).
  • It could identify the neighbourhood when the phone’s camera was pointed at nearby buildings, providing the user with useful details about the location.
  • Responding to the command ‘Do you remember where you saw my glasses?’, the AI assistant replied, ‘Your glasses were on the desk near a red apple,’ showcasing its ability to recall objects it had previously seen.
  • The Project Astra AI assistant also provided suggestions on how to make a workflow faster.
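
For context on the coding example in the demo, here is a minimal, hedged sketch of what AES-CBC encryption and decryption functions of the kind Astra described might look like in Python, using the third-party cryptography package. The demo did not show this exact code; function names and the IV-prepending convention are assumptions for illustration.

```python
# Hedged sketch only: typical AES-CBC encrypt/decrypt helpers using the
# third-party "cryptography" package (pip install cryptography).
import os

from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes


def encrypt(plaintext: bytes, key: bytes) -> bytes:
    iv = os.urandom(16)  # fresh random IV for every message
    padder = padding.PKCS7(128).padder()
    padded = padder.update(plaintext) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return iv + encryptor.update(padded) + encryptor.finalize()


def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    iv, body = ciphertext[:16], ciphertext[16:]  # IV was prepended above
    decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    padded = decryptor.update(body) + decryptor.finalize()
    unpadder = padding.PKCS7(128).unpadder()
    return unpadder.update(padded) + unpadder.finalize()


key = os.urandom(32)  # AES-256 key
assert decrypt(encrypt(b"hello Astra", key), key) == b"hello Astra"
```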

In summary, Google claims that the AI assistant can do almost everything, from identifying what it is looking at and locating an item left somewhere to answering your questions.

When is Project Astra Coming to Users?

Project Astra is not yet publicly available and is still being tested. According to the official blog post, some of Astra’s “capabilities are coming to Google products, like the Gemini app and web experience, later this year.”

So, we will have to wait and see when, and if, Google rolls this out to all users. However, Gemini will likely be able to do many of the things Project Astra showcased. And going by the blog post, Astra might eventually arrive on other kinds of devices, not just regular smartphones.
