Meta Adds Major Upgrades to Its Ray-Ban Glasses
Meta Platforms has introduced significant upgrades to its Ray-Ban smart glasses, enhancing their AI capabilities with real-time features like live AI assistance, Shazam integration, and continuous audio-visual recording.
These updates build on earlier improvements designed to handle more complex tasks and provide natural, responsive interactions.
Meta's Ray-Ban smart glasses just got a major upgrade. Now featuring real-time AI capabilities, these stylish shades can translate spoken language on the fly and provide contextual info about your surroundings.
Imagine traveling and effortlessly communicating in any language—how… pic.twitter.com/hVTHP1cHzy
— Audax Flux (@AudaxFlux) December 16, 2024
Earlier this year, Meta announced plans to integrate its next-generation AI model, Llama 3, into Meta AI for the smart glasses, enabling advanced functions such as object, animal, and landmark recognition, alongside real-time translation.
The latest enhancements include always-on AI assistance, which began rolling out to users in the US and Canada on Monday.
Unlike previous versions, which required a specific prompt for each request, the new continuous AI can see and respond to the wearer's surroundings in an ongoing session, with the glasses' LED indicator staying lit whenever live AI is active.
The upgrade improves usability but takes a toll on battery life, with live AI sessions lasting up to 30 minutes before the glasses need recharging.
Additionally, real-time translations, while functional, come with a slight delay.
David Woodland, Product Lead for Ray-Ban Meta Glasses, shared more details on X (formerly Twitter).
Today we start the rollout of one of the biggest updates of the year to Ray-Ban Meta Glasses. Here are some of my favorite things in this update:
1) "Hey Meta, start Live AI." You can now have a conversation with AI that streams video of what you see into the context of the… pic.twitter.com/fvh6K763vA
— David Woodland (@DavidSven) December 16, 2024
These advancements align Meta with growing trends in smart eyewear, paralleling Google's recent demonstration of its prototype glasses powered by Gemini AI and Android XR.
As Meta emphasizes, continuous camera-assisted AI is expected to be a key focus for tech companies moving forward, signalling the potential transformation of wearable technology into an integral part of everyday life.
Will AI Glasses Be the Future of Eyewear?
Big Tech is increasingly positioning AI assistants as the cornerstone of smart glasses.
Last week, Google introduced Android XR, highlighting its Gemini AI assistant as the pivotal feature for next-generation smart eyewear.
Introducing Android XR, our new platform for headsets and glasses built for the Gemini era pic.twitter.com/CBLaFGUwez
— Google (@Google) December 12, 2024
Meanwhile, Meta CTO Andrew Bosworth described 2024 as "the year AI glasses hit their stride," suggesting in a recent blog post that smart glasses may be the ideal form factor for a "truly AI-native device."
He further noted that AI-powered glasses could be the first hardware category entirely defined by AI from inception, and predicted that smart glasses will replace TVs within years, not decades.
Meta CTO Andrew Bosworth says smart glasses will replace TVs in years not decades and learning to use wrist-based neural interfaces will enable devices to be controlled with your mind pic.twitter.com/sswPRRrKkl
— Tsarathustra (@tsarnick) October 3, 2024
Meta's Ray-Ban smart glasses exemplify this vision, allowing users to activate the Meta AI virtual assistant with a simple "Hey Meta" voice command to ask questions or issue prompts.
Responses are delivered through speakers built into the frames, allowing hands-free interaction.
The glasses also support livestreaming directly to Facebook and Instagram, integrating AI-enhanced capabilities with social media engagement.
With improved audio, upgraded cameras, and over 150 customisable frame and lens combinations, the Ray-Ban smart glasses are lighter, more comfortable, and purpose-built to merge style with advanced AI functionality.