Have smart glasses finally hit an inflection point?

The display in a pair of smart glasses
Photo via Urban Optiks

I know what many of you may be thinking as you read the headline of this post. You might be remembering the photo reproduced below, in which Silicon Valley venture capitalist Marc Andreessen (who, it must be said, has a large and rather egg-shaped head, no offense intended) is standing with Bill Maris, the head of Google Ventures, and veteran Silicon Valley investor John Doerr of Kleiner Perkins Caufield & Byers — all three smiling while wearing pairs of Google Glass, the thin frames with the tiny square camera on one corner. (The three men were launching something called the Glass Collective, to invest in offshoots of the undoubtedly soon-to-be-ubiquitous smart glasses.) Within minutes of appearing, I would argue, the photo came to symbolize a kind of socially inept techno-utopianism that often seems endemic to Silicon Valley (remember the Segway?), and in the process Google Glass almost instantly superseded the previous emblem of nerd-dom, the pocket protector.

Google put the full weight of its corporate branding behind the launch — in a demo presented by co-founder Sergey Brin at the Google I/O conference in June 2012, four skydivers wearing Google Glass jumped from a blimp and landed on the roof of the Moscone Center in San Francisco while livestreaming their descent. Google said the glasses would be available for pre-order to those in attendance for $1,500. Sales to the public started in 2013 and the product was discontinued in 2015, making it one of the shortest product runs in recent memory (the company pivoted to an Enterprise Edition for use in factories). I would be willing to bet that the vast majority of those sold wound up on bookshelves somewhere, or in personal museums — like the one I have that contains a working Palm Pilot, an original Motorola flip phone, a CueCat handheld bar-code reader, and an endless parade of the USB gadgets that get handed out at trade shows.

Marc Andreessen, Bill Maris and John Doerr
Photo via Forbes

I doubt that the photo of Marc Andreessen and his venture-capital pals in their geekwear put the final nail in the Google Glass coffin, but it probably didn't help (neither did the picture that infamous tech blogger Robert Scoble posted of himself wearing Google Glass in the shower while grinning maniacally). Not only did the headgear scream "I am a giant nerd with no social skills," but the camera square at one end also seemed vaguely threatening to some, suggesting that the wearer was probably going to record you without your knowledge at some point, or use it for facial recognition, or something equally nefarious. More than one person compared the look of someone wearing Glass to the Borg, the half-human, half-computer race from Star Trek, whose members wear a kind of steampunk-looking outfit with a viewfinder/camera gizmo locked over one eye.

Combine all that with the fact that Google Glass cost over a thousand dollars each and you have a toxic combination for what was supposed to be a consumer item. And there was one other factor that I haven't mentioned yet, which is that there seemed to be no real purpose for having them, or at least not enough of a purpose for people to put up with 1) looking like a gigantic idiot with no friends and 2) paying over a thousand dollars to look like a gigantic idiot with no friends. You could take photos with them, and short video clips, but not enough to make it worthwhile — at least not without using up all of the battery — and the creepy factor outweighed most of that. There were no compelling "augmented reality" features like image recognition or language translation.

It may (or may not) surprise you to learn that Google Glass came out a decade ago. In some ways it feels like a hundred years ago — to me anyway. So has anything relevant changed in those ten years? In a sense, everything has changed. Moore's Law has continued its inexorable advance, so chips are smaller and faster; camera lenses and technology have improved fairly dramatically; WiFi and Bluetooth are more ubiquitous and work better than they did a decade ago; image recognition — including facial recognition — is light years ahead of where it was; and most importantly, we have ubiquitous artificial-intelligence engines that can do a significant amount of heavy lifting in the computation department. And whether it is good or bad, the idea of being recorded or having facial-recognition software identify you is a lot less strange — after all, even airlines are doing it.

Note: In case you are a first-time reader, or you forgot that you signed up for this newsletter, this is The Torment Nexus. You can find out more about me and this newsletter in this post. This newsletter survives solely on your contributions, so please sign up for a paying subscription or visit my Patreon, which you can find here. I also publish a daily email newsletter of odd or interesting links called When The Going Gets Weird, which is here.

Smart glasses for non-nerds

Smart glasses on a table
Photo via PocketLint

All of these things have contributed to a quiet resurgence of the idea of smart glasses. No one is rappelling down a building wearing them — instead, average people (or at least more average than the typical Google Glass buyer) are buying and wearing them. And there is one thing I didn't mention in the above list that has definitely changed the way people think of them: they are being designed by the same people who make Ray-Bans and Oakleys and other stylish eyewear, and they are almost indistinguishable from their non-smart brethren, with their smart features either hidden or visible only as a tiny lens on the corner of the stylish sunglasses frame. Here's how Mat Honan described it in a recent essay for MIT's Technology Review:

Beyond that — and this is key — is EssilorLuxottica’s dual strength in both retail and fashion. Let’s start with retail. EssilorLuxottica has a massive footprint — locations like Target Optical, LensCrafters, and Sunglass Hut give its smartglasses instant access to Apple Store-level retail distribution. (Seriously, is there a mall in America without a Sunglass Hut?) What’s more, unlike just about every other previous attempt in the category, because the company has great designers with genuine fashion sense, the glasses look not only normal but cool. Even on the likes of Mark Zuckerberg. It’s amazing what you can accomplish when you don’t make your customers look like a bunch of buffoons.

EssilorLuxottica recently reported that sales of its Ray-Ban Meta smart glasses rose by 200% in the first half of the year. While the company did not break out unit sales, it stated in its interim financial report that it has sold “millions” of pairs of the glasses, and described the demand for them at Sunglass Hut as "insatiable." Sales are apparently going so well that the company added a new line of even sportier smart glasses under its Oakley brand in June, and is set to release a high-fashion version under the Prada brand. As Honan and others have pointed out, the glasses available now don't just take photos and videos: you can also use artificial intelligence to tell you things about the world around you, and even to translate foreign languages being spoken around you in real time.

Meta is excited enough that it recently spent almost $4 billion to buy 3% of the company. Zuckerberg said in his recent memo on personal AI that he believes "personal devices like glasses that understand our context because they can see what we see, hear what we hear, and interact with us throughout the day will become our primary computing devices." And Meta is not the only company betting on this future: Snap has released a line of AI-enabled smart glasses that provide an "augmented reality" experience, and Bloomberg has reported that Apple is planning to release AI-powered smart glasses in 2026, after the somewhat lackluster sales of the company's Vision Pro virtual-reality headset. Even Google is planning to launch a new line of smart glasses built on its Android XR platform, with augmented-reality features powered by the company's Gemini AI engine.

According to Martin Peers at The Information, Google's new glasses will use software developed with Samsung and be made in conjunction with Warby Parker and Gentle Monster, an edgy Korean eyewear brand. And the prototype models worn at Google's I/O developer conference were unlike the original Google Glass in every way, he said — in other words, they were stylish. And the demo at least suggested that Google's glasses might be smarter than Meta's: the person doing the demo used Gemini to call up directions in the form of a map overlaid on the lenses, as well as emails and texts. None of these things are available yet on Meta's glasses (although the company has promised this and more for future models) and Google said that its operating system and/or an API would be publicly available for other smart-glasses manufacturers to use.

A smartwatch for your face

Smart glasses displaying information
Photo via Enviz

Is everyone suddenly going to start wearing glasses, even if they don't already, just because they can ask an AI about what they are looking at, or in case they run into someone who speaks a different language? Ryan Broderick, who writes a newsletter called Garbage Day, isn't so sure. Responding to Mark Zuckerberg's prediction that glasses will become the must-have interface for AI, Broderick wrote: "Let’s not dwell on how inherently insane it is to assume that we’re going to be living in a world soon where everyone is just wearing glasses all the time." But is it so crazy? If you had suggested before the iPhone came out that in the near future millions of people would carry around a tiny rectangle of glass and stare into it for hours at a time, let alone use it to read books or watch movies or find someone to date, you would have been called insane as well.

Casey Newton, meanwhile, wrote in his Platformer newsletter about trying Meta's prototype Orion AI glasses, which aren't yet available for sale but offer some next-generation features, including augmented reality. "Most of the apps that I used were fairly mundane: watching a video, making a video call, answering a text message with my voice, or using the glasses’ cameras and Meta AI to take a picture of some smoothie ingredients and ask for a recipe," Newton wrote. "The interface is extraordinary — you’re watching fairly bright, high-quality images projected in front of your eyes in a pair of glasses." The differences between the Orion and something like Apple's Vision Pro or the Meta Quest headset were dramatic, Newton said — no feelings of claustrophobia or nausea. Smart glasses, he wrote, "feel like the obvious path forward for mixed-reality computing."

A writer for the Washington Post wrote about trying a set of smart glasses from a company called Even Realities, saying he "interpreted different languages in conversation (which works passably), teleprompter-ed my way through a presentation (which was even better) and asked for advice from popular chatbots that I could read in a flash. (If I had fewer scruples, they would make me an ace at trivia night.)" Mainly, though, he said that the smart glasses "feel like a smartwatch for your face: They are best for quick interactions such as checking your notifications and leaving yourself reminders and kind notes." Also, he added: "what I appreciate most about these glasses is how seriously Even Realities tried to make them look normal." In other words, not Google Glass.

The Post writer also said that people responded differently to him when they realized he was wearing smart glasses, and many assumed that he was recording them, even when he said he wasn't. "Mercifully, no one felt so put off by the glasses that they left the conversation entirely, but a few asked me to take them off — an option my nearsightedness makes a nonstarter. Most of the time, people chose to take me at my word and the conversation continued (if a little icily.) Even in tech-heavy San Francisco, casual chats with people I have known for years sometimes turned tense after the glasses’ true nature was revealed." Perhaps some of these uncomfortable conversations happened because of news items like a recent one in which an agent with U.S. Customs and Border Protection was filmed wearing what appeared to be Meta's smart glasses.

Are there going to be downsides to this kind of ubiquitous AI and facial recognition, etc.? Undoubtedly. And we will have to figure out new social conventions and manners to deal with them, just as we are still trying to do with smartphones, and as we did with Bluetooth headphones before that, and portable CD players before that, and so on. But as someone who loves to travel, I can honestly say that I am excited at the idea of having a pair of glasses that can not only recognize things like the Roman Colosseum and tell me about them, but can also translate my Airbnb host's Italian — not to mention showing me directions without forcing me to take out my phone and stare at it while I'm trying to walk somewhere. And if I can do so while wearing what appear to be a pair of unremarkable Ray-Bans or Oakleys, so much the better! What about you — could you see yourself wearing a pair of smart glasses?

Got any thoughts or comments? Feel free to either leave them here, or post them on Substack or on my website, or you can also reach me on Twitter, Threads, Bluesky or Mastodon. And thanks for being a reader.
