Meta Platforms has added new features to its Ray-Ban smart glasses, including real-time AI and Shazam integration, on top of the glasses' existing capabilities.

This comes as the social media giant continues to upgrade its AI-powered glasses, with major updates aimed at handling more complex tasks and responding more naturally.

Meta Makes Major Upgrades to Ray-Ban Smart Glasses

Earlier this year, Meta revealed that it was integrating its next-generation Llama 3 AI model into the virtual assistant, Meta AI, in the Ray-Ban smart glasses to improve performance. The multimodal features, made available last December, can perform translations and identify objects, animals, and monuments.

Now, Meta has brought another major upgrade to the smart glasses. According to CNET, the always-on AI assistant went live on Ray-Ban glasses on Monday for owners with access to Meta's early access features. That's in addition to built-in translation and Shazam integration, which are currently only available in the US and Canada.

The latest features added to the AI-powered glasses include continuous audio and camera recording, rather than responses to individual prompts. This, according to CNET, allows the glasses to be used for extended periods with the AI features turned on.

When the always-on Live AI is activated, the glasses' LED indicator stays on. Meta keeps a recording of the conversation that can be referred to throughout the AI session.

Translation is expected to work automatically while the wearer is speaking. However, although the translation is delivered through the glasses, it arrives with a slight delay.

The live AI assistance is said to be similar to what Google showed off this month with its Gemini-powered concept glasses and Android XR, which will arrive next year.

According to Meta, as reported by CNET, the always-on AI affects battery life: users can expect up to 30 minutes of use before the device needs to be recharged.

The company also makes it clear that this kind of always-on, camera-powered AI is exactly what more tech companies will be exploring in the coming year.

AI Glasses Will Hit Their Stride in 2025

The upgrades come as big tech companies are pushing AI assistants as the raison d’être of smart glasses. Last week, Google unveiled Android XR for its new smart glasses, specifically touting Gemini AI as its killer app.

Meanwhile, Meta CTO Andrew Bosworth said in a blog post: “2024 was the year AI glasses made a big breakthrough.”

In the same blog post, Bosworth described smart glasses as the best possible form factor for a "truly original AI device." He added that AI glasses could be the first category of devices to be "fully defined by AI from the start."

With Meta's Ray-Ban smart glasses, users can activate the virtual assistant by saying "Hey Meta" and then asking a question or issuing a prompt; the assistant responds through the frames' built-in speakers.

According to Meta, users can also live stream from the glasses to social media platforms like Facebook and Instagram, using "Hey Meta" to interact with the company's "advanced conversational assistant," Meta AI.

The Ray-Ban smart glasses feature improved sound and cameras, come in more than 150 combinations of frames and custom lenses, and are, according to Meta, "lighter and more comfortable."