Meta recently announced three new features for its Ray-Ban smart glasses: real-time AI, real-time translation, and Shazam. Real-time AI and real-time translation are currently limited to members of Meta's early access program, while Shazam is available to all users in the United States and Canada.
The real-time AI and real-time translation capabilities were first previewed at the Meta Connect 2024 conference earlier this year. Real-time AI lets users hold a natural conversation with Meta's AI assistant while the glasses continuously observe their surroundings. For example, while browsing the produce section of a grocery store, you could in theory ask Meta AI to suggest recipes based on the ingredients in front of you. Meta says a full charge provides roughly 30 minutes of real-time AI use at a time.
The real-time translation feature, meanwhile, lets the glasses translate speech in real time between English and Spanish, French, or Italian. You can either hear the translation through the glasses themselves or view the translated text on your phone. You need to download the relevant language pairs in advance and specify which languages you and your conversation partner speak.
The Shazam feature is more straightforward: when you hear a song, just prompt Meta AI and it should tell you what's playing. Meta CEO Mark Zuckerberg demonstrated the feature in an Instagram video.
If your glasses aren't showing the new features yet, check that they are running v11 software and that the Meta View app is on version v196. If you are not already in the early access program, you can apply through Meta's early access website.
The update arrives as technology giants increasingly position AI assistants as the core selling point of smart glasses. Just last week, Google unveiled Android XR, a new operating system for smart glasses and headsets, and pitched its Gemini AI assistant as the killer app. Meanwhile, Meta Chief Technology Officer Andrew Bosworth wrote in a blog post that 2024 was the year AI glasses made major strides, arguing that smart glasses may be the best form factor for a "truly AI-native" device and the first hardware category to be defined by AI from the start.
Taken together, this update further demonstrates the potential of AI in wearable devices, and competition among technology giants should continue to accelerate smart glasses technology. It not only improves the experience of using Ray-Ban smart glasses, but also points toward glasses that are more capable and more deeply woven into daily life, with more innovative features worth watching for.