CNN's Clare Duffy takes a selfie with the new Meta Ray-Ban Display glasses during the Meta Connect conference on September 17, 2025.



Menlo Park, Calif. — Have you ever wished you could quickly and quietly reply to a text while sitting in a movie theater, without pulling out your phone? Or that you could see directions to the nearest coffee shop without having to look down at your maps app? A new pair of Meta smart glasses with a tiny display inside the lens makes those things possible.

Meta (META) unveiled the Meta Ray-Ban Display glasses Wednesday at its annual Connect event at its headquarters in Menlo Park, California, along with several updated versions of its more basic, audio-only Ray-Ban and Oakley smart frames.

It’s all part of Meta’s push to develop devices for the artificial intelligence era. In demos of the new glasses at the event, I saw firsthand how they could make it possible to spend less time looking down at your phone, thanks to a tiny screen an inch from my eyeball.

“Glasses are the only form factor where you can let AI see what you see, hear what you hear, talk to you throughout the day… so it is no surprise that AI glasses are taking off,” Meta CEO Mark Zuckerberg said at the Connect event. He added that the company is seeing consumers adopt smart glasses at a rate comparable to that of “some of the most popular consumer electronics of all time.”

But are consumers ready to shell out $799 for Meta’s smart glasses, which represent an early step toward augmented reality? Here are my takeaways after trying them out.

Unlike earlier versions of Meta’s smart glasses, which users could interact with only by talking to them and getting audio responses, the Displays provide visual feedback via a small display in the inside, righthand corner of the right lens.

When you’re wearing the frames, the display looks like it’s projected a few feet in front of you. You have to focus on it to really see what’s there; otherwise, it sort of floats in your peripheral vision while you’re looking at things in the real world. (You can also turn the display off if you want to do focused work or have a conversation without the distraction of the screen.)

Only the wearer can see the display. If you were talking to someone wearing the Ray-Ban Displays, you’d have no indication that they had messages popping up or Instagram Reels scrolling off to the side, except that they might break direct eye contact with you to look at the display.


That would probably bother me if I encountered the Displays in the real world. Who wants to have a heartfelt conversation with a friend and not know whether they’re watching texts pop up at the same time? But then again, for many, technology has long been a part of interpersonal interactions.

One plus of the Displays compared with Meta’s audio-only Ray-Bans: They come with a “neural” wristband, which lets you navigate the display with subtle hand gestures, so you don’t have to touch the frames or say “Hey, Meta!” out loud to get the device to do something.

And while they’re slightly heavier and thicker than regular glasses, or earlier Meta Ray-Ban models, it’s easy enough to forget you’re wearing them when you’re not engaging with the tech.

The display lets you do a whole lot of things you might normally pick up your phone for, including taking and viewing photos and videos, reading and responding to messages, scrolling Instagram Reels and even taking video calls.

The display is small, so it’s probably not the first place you’ll turn to scroll through your vacation photos. But if you wanted to take a photo, do a quick quality check and post it right away, you could knock all of that out on the Displays.

Perhaps the most useful feature is one that can show you where you are on a map in real time. It’s a step up from earlier versions of the Meta smart glasses, which I’ve previously asked for directions to the nearest grocery store, for example: While they’ll give you the street address, they can’t tell you how to get there.

There are also live captioning and translation features that let you read the words your conversation partner is saying in real time, on the screen. And when you ask Meta AI a question, it can respond with written information cards on the display, in addition to an audio reply. As a plant nerd, I found this tool very useful for identifying foliage and then asking follow-up questions, like whether the plant would survive in a pot in a New York City apartment.

Still, it’s not yet a perfect system. The glasses misspelled a word when I was trying to reply to a text by dictating my response and, with no keypad, I basically just had to start over to fix it.

Similarly, in a demo of the Meta Ray-Ban Displays, Zuckerberg struggled to answer a video call from Chief Technology Officer Andrew Bosworth because the button to accept the call didn’t show up on the display. “We’ll debug that later,” Bosworth said.

The Meta Ray-Ban Display glasses are undoubtedly impressive. It’s a technology that Google first envisioned with Google Glass more than a decade ago, but now it’s more useful, stylish and accessible via intuitive hand gestures.

But I still have some big questions about whether this will really catch on. Namely: At a time when so many people I know are looking to spend as much time as possible away from screens, will consumers really want a screen they can keep on at all times, inches from their retinas?

The Meta Ray-Ban Displays are the first glasses from the company to include a display inside the lens.

But ask Meta executives and they say that concern was actually a motivating factor behind the development of the company’s newest glasses.

“We built this product to help protect presence,” Ankit Brahmbhatt, Meta’s director of AI glasses, told CNN. “At first, that might sound counterintuitive but we designed this to be a glanceable display, so that it’s there for you when you need it, you get in for seconds at a time, you get the information, then it’s kind of out of your way.”

He added: “This idea of being more heads up and not having our heads buried in our phones, I think, is really a big part of the kind of experience that we’re trying to unlock here.”

As with earlier generations of the Meta Ray-Bans, I also wonder how comfortable a broad swath of users will be wearing glasses that can see and hear their surroundings while in their home or office, or while they’re spending time with kids, especially given Meta’s tainted history as a steward of our private information.

Likewise, I’m not sure whether those who don’t wear smart glasses will like being around someone wearing a device with a camera and microphones. In 2013, some Google Glass wearers earned the nickname “glasshole” because other people took issue with the prospect of being filmed without their consent.

Like earlier versions, the Meta Ray-Ban Display glasses have an LED light indicating when the device is recording.

And Brahmbhatt said “building responsibly” is a focus for the company, adding that “there’s just education that’s needed when you have a new category” for the public to understand built-in safety features like the recording light.

The first evidence of just how much convincing Meta may have to do will come on September 30, when the glasses go on sale in a limited set of retail stores in the United States, before they roll out globally next year.


