If Meta Connect wasn’t proof enough that VR is taking a back seat to smart glasses, Apple may soon hammer that point home. According to a report from Bloomberg’s Mark Gurman, Apple is sidelining work on a cheaper Vision Pro to focus on putting out multiple pairs of smart glasses ahead of schedule.
Per the report, Apple has refocused staff who were working on a cheaper, lighter version of the Vision Pro to instead pursue prototypes of smart glasses that sound awfully similar to the ones Meta is making right now. One pair of the in-the-works Apple glasses, according to Bloomberg, will be display-less, much like the Ray-Ban Meta AI glasses (Gen 2) that I just reviewed. Those smart glasses still rely on a connection to a phone and focus primarily on camera capabilities, open audio, and an infusion of AI for computer vision and voice commands.
Apple is also reportedly going after the Meta Ray-Ban Display glasses, which have a built-in screen and officially launched at the end of September. Apple's version will likely have more going on; the Meta Ray-Ban Display, for example, can already handle turn-by-turn navigation, message notifications, video calls, and more.
Just as telling as the staffing shift is that Apple is reportedly looking to expedite the release of its smart glasses: the display-less pair is slated for 2027, while the pair with a display was previously slated for 2028, though Apple is reportedly working to bump that release up. In other words, it looks like Apple took a sober look at what Meta is doing on the smart glasses front and said, “Oh shit.” Having demoed Meta’s Ray-Ban Display myself at Connect, I agree with that assessment.
Smart glasses might still be in their early stages, but the appeal of a pair with a screen in them is clear. Like the Apple Watch and other wearables, smart glasses augment the phone experience in a way we haven’t seen before, and, for Apple, the potential is even greater. While Meta’s smart glasses have struggled with phone compatibility in some cases, Apple’s entry would have no such problem.
Things like messaging and calling, for example, could be seamless and natively integrated on a pair of smart glasses made by Apple, while Meta has to rely on app integrations through Instagram and WhatsApp. The tighter the integration between your phone and a pair of glasses, the more useful the frames become: transferring photos and videos is easier, using voice assistants becomes more practical, and connectivity between devices is lightning fast. What I’m saying is, Meta may have a very big lead in the field, but Apple, which still has a vise grip on its hardware ecosystem, has a chance to really hammer things home. And the sooner, the better, since Meta’s lead in the smart glasses space seems to be growing steadily.

I’m eager to see what Apple has to offer in smart glasses, and for lots of reasons. On top of tighter integration, Apple has a better track record on privacy, which will undoubtedly be a major topic of conversation as smart glasses push further into the mainstream. Apple also has years of experience developing user interfaces that feel a tad easier to use, and it could really put its iOS flourish on smart glasses. And speaking of UI, I’m curious to see how Apple envisions its customers actually using a pair of smart glasses.
Meta’s first-gen Ray-Ban Display glasses come with a Neural Band worn on your wrist that reads the electrical signals in your arm and hand, letting you navigate the glasses’ UI with finger pinches and thumb swipes. Will Apple also go with a hardware approach, or will it try to cram a camera-based, visionOS-like hand-tracking system into a smart glasses-sized device? All of this is still conjecture, obviously, but it looks like we may be inching ever closer to some real answers. And you know what? I’m ready. The time for smart glasses is nigh, and Meta can’t be the only player in the game.