Is the World Ready for Google Glass Now? (No.)
10 years later, the future is wearing shades again.
Welcome back to Social Signals
This week we're talking about smart glasses and what we can learn from Google Glass. I made an AI-generated short film with Google Veo 3 (ahem, Faxzilla LIVES!!). And I got my irises scanned by World for digital identity verification in LA this week and wrote about the experience.
For paid subscribers this week:
Chart of the Week: How CMOs are using influencers to build trust + drive revenue
AI Mandates Report: Which brands are really doing the work, and which are just peacocking
19 Social Signals I'm Tracking: From AI-optimized YouTube ads to holographic drones putting ads in the sky everywhere
Let's get into it. Keep going!
Greg
Is the World Ready for Google Glass Now? (No.)
10 years later, the future is wearing shades again.
Remember when the idea of carrying a phone everywhere sounded ridiculous? Like, who needs email in their pocket at all times? And yet… here we are, checking our screens 300 times a day.
So when I say we'll eventually leave our phones at home and wear smart glasses as our primary device, it might sound funny now. But so did smartphones: right up until they weren't. The future always feels far-fetched... until it's just what we do.
More than a decade ago, I was one of the first to wear a computer on my face. Not metaphorically, but literally. As an early member of the Google Glass Explorer Program, I wandered the world with a heads-up display hovering just above my right eye, fielding side-eyes from strangers and questions from friends like: "Wait, are you recording me right now?" and "Can you please take those off?"
And honestly, fair questions.

It was early days. We didn't know that cameras would end up everywhere. Not just on our faces, but in our doorbells, dashboards, and even our fridges. We didn't know that talking to an AI assistant would become more normal than calling a human.
But many of us did believe that companies like Meta, Google, Apple, Amazon, and OpenAI would one day be racing to build a screenless future powered by ambient intelligence.
That's part of what led to my quote in Julio's story…
"Rather than shun and fear-monger away something I don't understand, I'd rather spend my time experimenting and innovating around it."
Back then, Glass felt like science fiction. Today, it feels like the rough draft of something inevitable.
At one point, I even figured out how to send a fax through Glass. Yes, a fax. Why? Because that's what innovation looked like in the early 2010s: clunky, ambitious, and just a little ridiculous. But I believed in the potential, not just the prototype.
Fast forward to 2025, and Google's giving smart glasses another go. This time with actual momentum and modern AI muscle behind it.
And folks are here for it? Look at this Fast Company headline: Google's second swing at smart glasses seems a lot more sensible.
At Google I/O, we saw Android XR prototypes showing off live translations, camera-enabled real-time search, and Gemini Live integration. It's a much more confident take on the "your face is the interface" future. Read my take on this here (reminder: paid subscribers to Social Signals have access to the full archives. Upgrade your subscription here).
From Glassholes to POV Heroes
Remember the Glass backlash? People wearing the device were dubbed "glassholes." That came partially from privacy concerns (there was no clear indicator when it was recording), and partially because... well, early adopters can be intense and foolish, and Google didn't do any PR around this weirdo tech bomb they dropped on society.
Google Glass prioritized utility over fashion. It was expensive, it was dorky, and it didn't seem to solve a problem, despite lots of signals of where it could.
Meanwhile, when Snap released Spectacles, they leaned hard into aesthetics, privacy, and accessibility: hot colors of 2016, a built-in recording light, and a $130 price point. That contrast said a lot about how the market perceived wearability vs wearable tech.
But today, with the rise of ambient computing and generative AI, utility is getting cool again.
Google's XR glasses aren't arriving in a vacuum. Meta's Ray-Ban smart glasses are already in the wild. Apple has planted its spatial computing flag with the Vision Pro. Samsung and Google are teaming up on Project Moohan. And then there's Project Aura from Xreal, aiming to turn your glasses into a personal movie screen.
Apple, Jony Ive, and the Ambient AI Race
Apple is in this game, too. Apple's rumored plan to launch AI-powered smart glasses by the end of 2026 isn't about rushing to catch up: it's the same playbook we've seen time and time again. From the iPod to the iPhone to the Vision Pro, Apple waits for a product category to simmer before it swoops in with something polished, premium, and paradigm-shifting.
The Vision Pro might have launched as a developer unit, but it showed Apple's hand: it's not just building XR hardware; it's building the future of human-computer interaction. With glasses next on the roadmap, the expectation isn't just elegance. It's usefulness. And ecosystem integration. And that classic Apple "oh wow" moment.
But while Apple waits, OpenAI is already sprinting.
OpenAI's collaboration with legendary designer Jony Ive (you know, the guy behind the iPhone, iPad, and MacBook) is one of the most intriguing wildcard entries in this race. Their goal? To build a post-smartphone device for ambient AI interaction. Something screenless. Something intuitive. Something... delightful.
"What it means to use technology can change in a profound way," Sam Altman said. "I hope we can bring some of the delight, wonder and creative spirit that I first felt using an Apple Computer 30 years ago."
Whether this becomes a pair of glasses, a wearable pin, or something entirely new, it signals a sea change in how we'll engage with AI. The interface won't be an app. It'll be everything around us.
Here's the Foursquare founder's take on what Altman and Ive are cooking up for this AI assistant hardware…
Early AI Assistant Tech
It reminds me of my Humane AI Pin, my Limitless AI pendant, and my Bee Pioneer, all in one.
AND! We could end up with some combination: neural-interface tech on your wrist (like Meta's Orion system and the Mudra Band, a favorite of mine at CES the last two years), some kind of smart glasses, and some kind of wearable, all working in concert to replace (and improve on) the phone you carry today.

While we're waiting to see what this device will be, the march to glasses continues. I actually think smart glasses may be more realistic than using three devices to replace a phone. Give me one device to replace my one device, please.
So back to Google Glass: What's Changed in 10 Years?
Just a few things:
Design: Today's prototypes actually look like something you'd wear in public. Ray-Ban and Warby Parker are involved now.
Privacy: There's more transparency (hello, LED indicators) and cultural readiness around being "always on." I think the most affluent and educated are genuinely skeptical of who has that data and how it's being used, but many folks are still woefully uninformed about how data + ads work. As long as the utility is there…
Utility: AI makes these glasses useful in real time. Translating signs, summarizing articles, answering questions about the painting you're staring at in a museum, giving you access to your calendar, contacts, and email, and more. Solve problems and the tech sells itself.
Connectivity: Whether tethered to your phone or powered by their own cell signal, they're no longer standalone gadgets with limited functionality. They're extensions of the AI assistants we already use.
Phone Addiction: We don't feel good about how much we're scrolling. We want to be heads-up and more present. But the technology isn't there. Yet. Glasses give us both.
The Real Hurdle: Culture
Here's the thing: the technology might be (closer to) ready, but culture is still catching up.
Google Glass flopped not just because of its price or limited features, but because society wasn't ready for people wearing always-on cameras in public. It was weird. It was invasive. It raised questions no one had good answers for yet. As mentioned, Google didn't do its job on PR.

Fast-forward to today, and those questions are still around, but they feel slightly less uncomfortable. We talk to our phones in public. We wear AirPods everywhere. Some of us even wear smart rings (I'm in year two of my Oura Ring and love it). We've become more accustomed to tech blurring the line between personal and public.
Still, the leap to glasses as a primary interface is a big one. The devices have to blend in. The features have to be genuinely helpful. And the experience can't feel like a Black Mirror episode.
Mass adoption won't come from specs or novelty. It'll come when the glasses solve real problems in non-weird ways.
So... Are We Ready for Google Glass Yet?
No, not quite. But we're getting closer.
We're seeing the platform wars form in real time: Meta, Apple, Google, OpenAI, Microsoft, Amazon, and others are all building in the same general direction, but with very different visions. We're still a few killer apps and price drops away from widespread adoption. But this time, unlike 2013, it's not just about curiosity or cool factor.
Final Thought:
I've seen this story before, and I still believe in the plot. The future is getting a new lens. And this time, we might just be able to see it coming. (I've got puns!)
Will the future look like Keiichi Matsuda's short film Hyper-Reality (2016), experienced through smart glasses? We'll find out soon enough.
RELEVANT LINKS from the Social Signals Archives:
The most successful Social Media ads are going to be in Augmented Reality (2021)
Exploring the Tech Trends of SXSW 2024: A Journey into the Future (2024)
Faxzilla Returns!
I spent some time experimenting with Google's new AI video tool, Veo 3, with experimental audio. These are eight short clips, all generated via text prompts and edited together in CapCut; the whole thing took less than an hour.
FAXZILLA!!! Also, this is my real fax number if you want to fax me what you think of this.
Good News: I've Verified My Humanity.
Last month I wrote about the coming identity crisis online: where AI-generated content is indistinguishable from real humans, and the need for a new kind of infrastructure to prove we're… well, us.
This week I went to World's new LA flagship location and got my irises scanned. Yes, seriously.
It was simple. There were 5 people working there, and they told me they get about 20 folks who come in each day. Most of that is walk-by traffic on Melrose (expensive real estate!), and it's safe to say I was the only person to take a 45-minute Lyft, lugging a carry-on suitcase around West Hollywood, just to get my irises scanned.
The entire process took about 3 minutes. Get on the wifi. Open the app. Scan a code. Quit taking selfies because you're making it time out, Greg. Get scanned. Wait for your confirmation. The whole time I was asking questions and taking pictures.
Because there's more than just World's pop-up here…
World's Orb isn't just scanning eyeballs: it's scanning for solutions.
Their bold bet? Biometric "proof of personhood" stored on the blockchain: a World ID you can use to prove you're a unique human, not a bot or synthetic persona. It's like CAPTCHA, evolved for the generative AI age. And they already have partnerships with Visa and Match/Tinder.
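The core mechanism behind "one ID per unique human" can be sketched in a few lines. This toy Python registry is a hypothetical illustration, not World's actual system (a real deployment like World ID uses zero-knowledge proofs so the raw biometric never reaches a server): the idea is that an irreversible digest, sometimes called a nullifier, lets you check that a person hasn't enrolled before without storing the iris itself.

```python
import hashlib

class PersonhoodRegistry:
    """Toy sketch of proof-of-personhood deduplication (hypothetical)."""

    def __init__(self):
        self._nullifiers = set()  # one entry per verified human

    def enroll(self, iris_code: bytes) -> bool:
        """Return True if this is a new, unique human."""
        # Irreversible digest: uniqueness can be checked without
        # keeping (or being able to reconstruct) the biometric.
        nullifier = hashlib.sha256(iris_code).hexdigest()
        if nullifier in self._nullifiers:
            return False  # already verified: blocks duplicate IDs
        self._nullifiers.add(nullifier)
        return True

registry = PersonhoodRegistry()
print(registry.enroll(b"alice-iris"))  # True: first enrollment
print(registry.enroll(b"alice-iris"))  # False: same human again
print(registry.enroll(b"bob-iris"))    # True: a different human
```

The interesting design choice is what this makes impossible: the registry can answer "has this human enrolled before?" but can never answer "whose iris is this?", which is the property that makes it CAPTCHA-for-the-AI-age rather than a surveillance database.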
I'm not endorsing this as the answer. There are still big questions:
Why won't they let me ever delete my iris scan from their system?
Is your iris scan the most private part of you? Was it worth $42 in crypto?
When does humanity verification become pay-to-play?
How do we feel about OpenAI's leadership both breaking what is real in 2025 and also monetizing it?
But here's what I do know:
Verifying real humans online is no longer optional. It's critical infrastructure.
As generative AI accelerates, we're going to need new ways to rebuild trust, reduce fraud, and make sure people (ahem, actual people) aren't erased in the noise.
I got my eyes scanned. Would you? Read more from last month: Proving you're a real human online will be a problem from now on, so what do we do about it?
And on my flight home I learned the Orb is on the cover of Time magazine this month… maybe this will bring more than 20 folks a day in to get scanned?