Meta lets Ray-Ban smart glasses share view while on video calls

24 Apr 2024

Image: Meta

The company is also rolling out Meta AI, an assistant that greatly enhances the device’s overall functionality, in beta mode for users in the US and Canada.

Just in time for sunglasses season, Meta is rolling out new features for its Ray-Ban smart glasses such as sharing one’s view on a video call and access to an AI assistant.

First released in 2021, when Meta was pushing hard for its dominance in the metaverse, the Ray-Ban smart glasses are wearable devices that can perform a range of tasks, from capturing photo and video on voice command to acting as a headset that can play music from one’s phone.

Meta said yesterday (23 April) that the device – now in its second generation – has been “flying off the shelves” thanks to high demand for its growing list of features and improved specs, including integrated audio and an ultra-wide 12MP camera.

The company is now rolling out an update that will allow users to share their view with other people through WhatsApp and Messenger video calls in real time and in a “hands-free” manner.

Users of the Ray-Ban smart glasses in the US and Canada now also have access to Meta AI, an assistant powered by artificial intelligence that enhances the device’s functionality. First released in December as a test, Meta AI is now available to all US and Canadian users in beta.

“You can ask your glasses about what you’re seeing, and they’ll give you smart, helpful answers or suggestions. That means you can do more with your glasses because now they can see what you see,” Meta wrote in its announcement.

To activate Meta AI, users just have to say “Hey Meta” and follow it up by asking a question or giving a command, similar to how Siri or Alexa works.

“Say you’re travelling and trying to read a menu in French,” the company added. “Your smart glasses can use their built-in camera and Meta AI to translate the text for you, giving you the info you need without having to pull out your phone or stare at a screen.”

Last week, Meta unveiled Llama 3, its latest batch of large language models that it said comes with new capabilities such as “improved reasoning”.


Vish Gain is a journalist with Silicon Republic

editorial@siliconrepublic.com