The best Ray-Ban Meta update for smart glasses yet adds Live AI tools in early access
- The Ray-Ban Meta glasses just got new Live AI features in Early Access
- Shazam music recognition is also included
- All new features are only available in the US and Canada
Meta rounds out the year with a major update to its Ray-Ban smart glasses, adding two Live features it teased at Meta Connect 2024. It also adds Shazam integration to help you identify the tunes you hear while wearing your glasses.
The only downside to the great-sounding Live features is that they’re still in early access, so expect them to be less reliable than your typical AI tools. They’re also only available to Early Access Program members in the US and Canada; you can register on Meta’s official site.
But if you’re in the Early Access program, you can try Live AI and Live Translation now.
Live AI is like a video version of Look and Ask. Instead of snapping a quick photo, your glasses continuously capture what you see so you can talk to them about your surroundings – or other topics. Moreover, you don’t have to say “Hey Meta” over and over again during a Live AI session.
Meta adds: “Ultimately, live AI will provide helpful suggestions at the right time, even before you ask for them.” So be prepared for the AI to come up with ideas without you directly asking for them.
The babel fish comes closer
Live Translation is another real-time AI tool. This time the AI can automatically translate between English and Spanish, French, or Italian.
When you speak with someone using one of these three languages, you’ll hear what they say in English through the glasses’ open-ear speakers, or see it as a transcription on your phone – and they can hear or read a translation of what you say in their language.
Luckily, the update isn’t just about early access features.
If you’re at an end-of-year party and like the sound of a tune, you can also ask your glasses, “Hey Meta, Shazam this song,” and it will tell you what song is playing via the Shazam music recognition tool.
While this feature isn’t limited to the Early Access Program, it’s unfortunately once again restricted to the US and Canada – so people in the UK and beyond can’t access it yet.