Not long ago, I found myself in a room staring at a colleague who, at first glance, looked like she was simply wearing a stylish pair of Ray-Ban glasses. But when I leaned in, I realized she was doing something remarkable. Through a small, nearly invisible display inside the right lens, she was scrolling WhatsApp messages, snapping photos using the glasses as a viewfinder, and even adjusting Spotify’s volume with a subtle twist of her hand, almost like turning an invisible knob in the air.
It was my first close encounter with Meta’s Ray-Ban Display smart glasses, and the experience left me a little stunned. While the frames were slightly bulkier than your average pair of sunglasses, they could easily pass for everyday wear. The screen itself wasn’t visible to me as an onlooker, even though I could tell she was looking at something. For the first time, smart glasses didn’t feel like a gimmick — they felt like a product you could actually wear out in the real world.
And now, it looks like Apple wants in on the game, too.
Apple Steps Into the Arena
According to Bloomberg, Apple has decided to pause work on a lighter version of its Vision Pro headset to focus instead on smart glasses. The report suggests that Apple is developing two types: one with a built-in display and one without.
At first, the non-display glasses might sound underwhelming — why buy “smart” glasses if they don’t show you information? But think about it: imagine AirPods in the form of sunglasses. Hands-free music, calls, voice commands, maybe even integration with Siri — all packed into a wearable accessory you’d already use every day. That’s the kind of “simple but sticky” product Apple has a knack for creating.
And then there’s the version with a display. Tie that into the Apple ecosystem — your iPhone, Apple Watch, Mac, and iCloud — and suddenly you’ve got a wearable that can handle messages, directions, music, and notifications without pulling out your phone. Even if the first generation only acts as a “visual extension” of the iPhone, it could instantly become one of Apple’s most compelling new products.
Why Apple’s Entry Matters
Apple has built its empire on perfecting hardware most people didn’t think they needed. Remember the first iPod? The iPhone? Even AirPods were laughed at in their early days. But in each case, Apple came in late, refined the concept, and completely redefined the market.
If Apple does the same with smart glasses, the impact could be huge. Unlike Meta, which struggles with iOS limitations, Apple could seamlessly sync messages, maps, contacts, and music into its glasses. That’s a game-changer, especially for iPhone users who already live inside Apple’s tightly integrated ecosystem.
It doesn’t hurt that Apple already has years of experience building tiny, powerful hardware — just look at the Apple Watch’s sensors or the AirPods’ miniaturized chips. Translating that into eyewear feels like the next logical step.
Meta’s Head Start
Of course, Apple isn’t walking into an empty room. Meta has already been experimenting aggressively with smart glasses, building on its Orion augmented reality prototypes and refining its Ray-Ban partnership.
CEO Mark Zuckerberg has been clear about his vision: smart glasses are about breaking free from the smartphone’s dominance, especially Apple’s. Last year, he even admitted one of his “formative experiences” was being restricted by what Apple allowed Meta to build on its platforms. In other words, Meta doesn’t just want to innovate — it wants to compete head-to-head with Apple’s grip on mobile computing.
And right now, Meta has a lead. Its glasses already combine voice commands, cameras, and displays into a device that looks almost like regular eyewear. By the time Apple’s rumored models arrive (the earliest being 2027, according to Bloomberg), Meta could be several iterations ahead.
Other Challengers in the Race
Meta and Apple aren’t alone here. Samsung and Google are both said to be working on their own AR glasses, with varying degrees of secrecy. Smaller startups are testing unique concepts, from ultra-light glasses that project text onto your vision to AI-powered assistants that live in your eyewear.
And then there’s the curveball: Jony Ive, Apple’s legendary former design chief, is rumored to be collaborating with OpenAI on a pair of AI-driven smart glasses. If true, that would bring one of the most influential designers of our era back into the spotlight — and potentially give OpenAI a way to move beyond text and chatbots into hardware that sits directly on your face.
The point is: this isn’t just a two-horse race. It’s a full-on sprint involving nearly every major tech company, plus a few dark horses.
The Timing Problem
Here’s the catch: Apple may be too late — or exactly on time.
Bloomberg’s report suggests that Apple won’t announce its non-display glasses until 2026 or 2027, with display-equipped models coming later, around 2028. That leaves years of runway for Meta and others to refine their products, gain users, and establish dominance.
But Apple has been in this position before. The company wasn’t the first to make an MP3 player. It wasn’t the first to make a smartphone. Yet when it finally entered those markets, it didn’t just compete — it rewrote the rules.
The question is whether Apple can do it again in an era where AI and AR are advancing faster than ever.
The Bigger Picture: Why Smart Glasses Matter
At first glance, smart glasses can sound like a luxury. Who really needs a floating notification in front of their eyes when they could just check their phone?
But step back, and you can see why tech giants are pouring billions into this space. Glasses aren’t just about convenience — they’re about redefining how we interact with the digital world.
- Instead of looking down at a phone, you glance at the horizon and see your map directions.
- Instead of fumbling with your camera, you blink or tap to snap what you’re seeing.
- Instead of carrying a screen in your pocket, the screen is simply part of your daily view.
It’s a vision of technology that feels less like a gadget and more like a natural extension of human experience. That’s why so many companies are chasing it, and why the first one to really nail the formula could shape the next decade of computing.
My Take: The Race Is Just Beginning
Having seen Meta’s glasses firsthand, I can say this: we’re closer than most people think to smart glasses becoming mainstream. They’re still bulky, sure, and the software isn’t perfect. But the core experience — being able to access digital tools without pulling out a device — is surprisingly powerful.
Apple entering this race doesn’t guarantee success, but it does guarantee attention. Millions of people who might never consider Meta glasses will line up to try Apple’s version, even if it’s just a fancier extension of the iPhone. That’s the weight of the Apple brand.
At the same time, Meta’s persistence shouldn’t be underestimated. The company has a clear vision and is willing to iterate, even if its early attempts are imperfect. In fact, that persistence might give it a critical edge before Apple even shows up.
Here’s What This Really Means
The smart glasses race isn’t about who makes the coolest demo — it’s about who defines the future of how we interact with technology. Right now, Meta is in the lead, Apple is preparing its move, and a dozen others are waiting in the wings.
For everyday users, this means we could soon have a choice: do we keep pulling out our phones every few minutes, or do we let information live in our line of sight?
When you step back, it feels like we’re standing at the edge of a new computing era — one where glasses could replace the smartphone as our primary interface. It might sound ambitious, but so did the idea of a phone replacing your computer once upon a time.
If Apple really does launch “iGlasses,” don’t be surprised if they once again take a late entry and turn it into the product that defines the category. Until then, Meta, Samsung, Google, and maybe even Jony Ive are making sure the race stays very, very interesting.