Meta, the parent company of Facebook, is rolling out significant upgrades to its Ray-Ban Meta smart glasses. The improvements center on Meta's AI assistant, adding real-time information support and a new capability the company calls "multimodal AI."
Real-Time Information Access:
Previously, Meta AI was constrained by a knowledge cutoff of December 2022, leaving it unable to answer time-sensitive questions about current events, game scores, traffic conditions, and similar topics. Meta Chief Technology Officer (CTO) Andrew Bosworth has announced that all Meta smart glasses users in the United States will now have access to real-time information, powered in part by Bing, Microsoft's search engine.
Multimodal AI Capability:
Meta is also testing a feature called "multimodal AI," which lets the assistant answer contextual questions about the user's surroundings, responding to queries based on visual input captured through the glasses' camera. The capability was first previewed at Connect, Meta's annual conference.
Early Access Beta and Future Expansion:
While these upgrades should make Meta AI more practical and less gimmicky, access to the multimodal functionality will initially be limited. According to Bosworth, the early-access beta will be available in the United States to a select group of users who opt in, with the company anticipating a broader rollout sometime in 2024.
Command-Based Interaction:
Mark Zuckerberg showcased the new capabilities in videos posted to social media. Users invoke the feature with commands such as "Hey Meta, look and tell me." In the demonstrations, Zuckerberg had the assistant identify clothing items and suggest matching outfits; the feature can also identify objects in images and translate text within memes.
Enhanced User Interaction:
According to Bosworth, users can now ask Meta AI questions about their immediate surroundings, fostering a more interactive and engaging experience. This includes creative queries, such as generating captions for recently captured photos.
The upgrades reflect Meta's continued investment in its augmented reality offerings. With real-time information access and multimodal AI, Meta aims to make its smart glasses a genuinely useful tool rather than a technological accessory. As these features roll out, they are likely to reshape how users interact with and derive value from wearable technology.