Test-driving Meta's Ray-Bans, with added AI smarts
Published Date: 2/9/2024
Source: axios.com

When I used Meta's second-generation Ray-Ban smart glasses to broadcast my interview with Sam Altman at Davos live on Instagram, Altman called the experience "a little weird" — but I think the glasses pack some neat tricks for a device that doesn't feel much bigger or heavier than standard eyewear.

Why it matters: The design of many cutting-edge devices, like Apple's Vision Pro, starts with a long list of tech features and then tries to make everything smaller and lighter. With the Ray-Bans, Meta is taking a different approach — choosing a size, weight and price that people actually want, then seeing what the device can do within those limits.

What's happening: The new glasses launched in October, and I've been using them (with my prescription) on and off for a few months. While they're largely a refinement of the initial version, they also include the ability to livestream and a built-in, if nascent, AI assistant.

Of note: My favorite use of the second generation of these specs is the same as the first time around: They let me capture first-person photos and videos without having to pull out my phone.

  • This is especially great for sports. During my recent trip to Switzerland, for example, I used the glasses to capture video of myself skating on the ice and shooting a puck at a hockey net outdoors, with snow-covered mountains in the background.

Details: The new model's main addition is access to Meta AI, a chatbot that resembles a less capable ChatGPT. The bot can also provide some real-time data — such as weather and sports scores — via Microsoft's Bing.

  • Simply saying "Hey, Meta" summons the bot's voice in your ear. It's still hit-or-miss which queries it will answer, and how well.

Another feature, currently only available to a small group of testers, is the ability to ask the AI chatbot questions based on what's in a user's field of view.

  • Saying "Hey Meta, look at this" triggers the feature and prompts the glasses to take a picture, which the assistant then analyzes to answer the question. It can write fun captions, translate a menu or simply describe what it sees.

As for that Sam Altman interview live-stream: I'll admit it — it was a bit weird for me, too.

  • I was conscious that anything I looked at — including my notes — could end up in the broadcast.
  • Also, I wasn't sure whether I had properly turned off the setting that reads viewers' live comments aloud through the speakers near my ears.

Zoom in: I took the glasses on an hour-long walk through San Francisco's Mission District on Thursday, enjoying a Tracy Chapman playlist on Spotify, at least when I wasn't taking pictures or peppering the AI with questions.

  • The AI assistant accurately told me that I didn't need to worry about rain, but it struggled to say when the Warriors or Sharks play next. It also correctly answered who won the 2020 presidential election, though it padded the answer with extraneous information.
  • The experimental multimodal feature was able to translate a sign in Spanish for me and correctly deduce that I needed to pay if I wanted to park at a meter. It also wrote some amusing captions for the photos I took with the glasses.
  • At a neighborhood thrift store, the AI didn't do so hot at translating passages from a French novel. But it was able to look at various shirts and suggest which pants and shoes might pair well with them — something I usually have to rely on my partner for.
  • When I got home a little over an hour after I left, the glasses were at 49 percent battery, though I had been using them fairly heavily. They can be recharged a couple of times by placing them in their case — though for me, that means carrying an extra pair of prescription glasses for when they are charging.

Between the lines: The glasses have also gotten much better at the key features carried over from the first version.

  • Pictures are sharper and more detailed, while the audio is louder and clearer than on the first go-round.
  • Tracy Chapman always sounds great, but in the first version she was easily drowned out by street noise.

One troubling aspect persists from the first version: many people, especially in a large group or crowd, have no idea I'm wearing glasses that can capture photos and video until I tell them.

  • The glasses have a white LED that lights up when they're taking a picture or recording video. While it's brighter than on the last version, it's still easy to miss.

The big picture: A number of companies are betting on AI wearables — including Humane, with its AI Pin, and Brilliant Labs, with its just-announced Frame glasses.

  • Unlike some rivals, the Meta glasses don't require a separate wireless account, but you do need an iPhone or Android phone to retrieve photos and to provide the connectivity the AI features rely on.

The bottom line: Even this second version of the Meta Ray-Ban glasses still feels like early days, especially for the AI capabilities.

  • I think of Meta's glasses as more of a hint at the types of hardware to come than a device that's fully ready for the mass market.
  • At the same time, they aren't super pricey (they start at $299), feel like the glasses I already wear and do a few interesting things relatively well.

Go deeper: Check out my video review of the Meta Ray-Bans — made while wearing the glasses.