Apple Vision Pro | Mirrorworld vs. Metaverse
Why did Apple build a headset and is the world ready for it?
Yesterday, Apple announced the all-new Apple Vision Pro, its mixed-reality headset competing with Meta's Quest and Microsoft's HoloLens.
As expected, Apple packed it with tons of "Apple-esque" features like:
auto-screencasting for your Mac
eye tracking (point) and hand gestures (click) for navigation and interaction
3D video capture and immersive replay
I considered writing about all of the features and specs but I know there will be many other writers covering those in detail so I want to zoom out and attempt to answer two bigger and more complicated questions.
Why did Apple build a headset and is the world ready for it?
This question is particularly potent considering that Meta has (very) recently lost $13.7B on its investments in the Quest headsets and has since pivoted its focus from the Metaverse to AI under mounting pressure from shareholders.
Let's get into it.
Welcome to Making Product Sense
Why a headset?
The Vision Pro is Apple's first new product category since the HomePod in 2018. So why this? Why not a car, or a camera, or a home security system? Surely, given consumers' repeated rejection of headsets, this is NOT the play... right?
To answer this question, I think we have to look at the trajectory of computing.
Over the last 50 years, we've evolved from room-sized computers to a desktop at the office, to a laptop in our home, to a phone in our pocket, to a watch on our wrist, to an earbud in our ears - each step bringing technology closer to us in a more personal, more powerful way. A headset is the natural evolution of computing, bringing the screen from our hands to our face. Bringing the digital world from our screens into our real world.
Some, like Humane, have argued that the next computing platform is wearable AI. Screen-less and almost invisible. They're moving away from the visual aspects of computing in an attempt to help us engage more deeply with the world around us.
An admirable mission to be sure, but I believe, doomed to fail. We won't let go of our Instagrams and our TikToks so easily, will we?
But they have gotten one thing right... whatever we build next should be a shift away from the glowing portals of our screens. Something that allows us to swing the pendulum back toward the real world. And while there's something romantic about Humane's vision of a world without screens, I'm afraid they've swung the pendulum too far too soon.
Perhaps one day in the future we'll break our reliance on screens altogether, though I tend to think probably not. Until that day comes (if it comes), there's a better option. A middle ground that begins to shift us back toward a more present IRL experience without taking away our precious pictures. A digital layer superimposed onto our world. Even in the fictional world of Marvel, Iron Man's J.A.R.V.I.S. projected holographic visuals that Tony Stark could interact with. Audio alone isn't enough - we're sensory beings who like to see and touch.
A headset fits in the in-between. A space between one end of the pendulum with its laptops and phones and the other end with its screenless, ever-present AI. It's the next computing platform. The next evolution of our parallel world.
I believe that's the "why" for Apple, but of course, that begs a broader question. Is the world ready for it?
Are we ready for it?
I'll be the first to admit that I was both excited and shocked when the rumors began surfacing that Apple was going to announce a headset at WWDC. Excited because, as a regular Quest user, I've been looking forward to the day when Apple would release a competing product that would pair with the rest of my devices, sync with iCloud and provide a more immersive, higher resolution experience. And shocked because the world seemed to have all but rejected the headset.
Virtual Reality has gone through multiple hype cycles and intense skepticism. And Augmented Reality's spark seemed to die out into nothingness with Google Glass. When Meta bought Oculus and began to invest in the games and worlds of the Metaverse, I was genuinely hopeful that it would make a comeback. I even wrote an article about why I thought Meta could turn around the public perception, but alas, I watched it get rejected once again by the fickle masses.
So yeah. I was a bit confused when Tim Cook began teasing his love for AR in interviews. But when I looked through my Apple-colored lenses (pun intended), things began to make more sense. Despite Meta's attempt and debatable mainstream "failure" with the Quest, there are a few things Apple has going for them that gives them a leg up.
Let's break 'em down.
Apple's Secret Sauce
1. App Store
Use case has always been an uphill battle when inventing a new modality. Back in the Personal Computing Era, people got a spreadsheet, eventually a word processor, and then email - the use cases came slowly.
During the Mobile Computing Era it was a bit easier since the iPhone replaced something consumers already used - a phone. But with the introduction of the App Store, suddenly a million use cases were at their fingertips.
In the Spatial Computing Era, you once again have to give the people something to do - a use case that justifies a $3,499 price tag. This was part of the downfall of Google Glass: for the price, it just didn't do anything your phone couldn't. An app store is table stakes, of course, which is exactly what Meta built, but without a strong community of developers building for Quest, the everyday use cases haven't added up quickly enough. The Meta Quest Store has only around 1,000 apps.
Apple's App Store, on the other hand, is already seeded with 14,000+ AR apps that customers have been using on their iPhones and iPads for years. Plus, you can run your non-AR apps as 2D floating screens within your heads-up display, offering millions of use cases that consumers already rely on. Not to mention, Apple's partnership with Disney brings an entire world of entertainment options you've never seen before, like undersea excursions and NBA games rendered on your tabletop as a stunning 3D miniature.
2. Developer Ecosystem
Speaking of use cases, how do you generate new ones? That brings us to the developer ecosystem. If you're opening up your new computing platform for folks to build for, they obviously will want to know there's a distribution advantage that's worth their time. Since augmented/virtual reality is a new platform, you have to give the builders a reason to build.
Since Meta's apps are relegated to the VR world that Meta built, developers are relying on Meta to build the hype for Quest so consumers buy the headset and there are people who will actually use their app or play their game. As we've mentioned, Apple owns two of the most ubiquitous devices in the world (the iPhone and iPad), which already support AR apps without the headset. That boosts app distribution, encourages more developer engagement, and spins the consumer flywheel.
3. Device Interoperability
Speaking of the iPhone/iPad, let's talk about device ecosystem. Apple is world-class at building a walled garden of connectivity. Within their garden, their devices seamlessly work together creating an almost magical experience. And the Vision Pro is no different. From the moment you put on your headset, your photos, notes, contacts, passwords etc. are piped in and ready for you to use just as you would expect.
The inability to easily sync your digital world into your new "space" makes using the Quest a bit of a cold start, while Apple makes your first experience in their spatial world feel like home.
The Metaverse
Now, let's talk about what I believe is the biggest advantage Apple has going for it. While Meta focused on a VR-first play (i.e., the Metaverse), Apple is focusing on an AR-first play (i.e., the Mirrorworld). Rather than escaping reality, Apple wants to enhance it.
Before we jump into that, I want to first give some kudos to Meta because, while it may have been a misstep, it wasn't an obvious one in my opinion. The idea of the Metaverse is a grand one. It promises endless worlds to explore, games that defy the laws of physics and an immersive landscape that creates a breathtaking backdrop for work and play. I get it - I really do. There's an inescapable allure to the Metaverse.
But while Meta's pursuit of VR was admirable, the tech simply wasn't good enough to bring the vision to life and, to some degree, I just don't think the world was ready for it. Folks have been talking about the addictive, anti-social, and depressive qualities of staring at screens for years now. If we're honest, strapping a screen to your face and blocking out every speck of the real world was a non-starter.
That being said, there is a definitive "first step" toward virtual reality and that's augmented reality. For a computer on your face to run, it first has to walk. But thankfully, AR is an end game all to itself. How many of us have watched in awe as films like Iron Man and Minority Report depicted high-tech, holographic desks or H.U.D.s?
In terms of the cultural zeitgeist, AR is a better representation of "the future" than VR which most often is associated with a dystopian world (à la Wall-E). So naturally, when Apple decides to build version zero of the next great computing platform, they're going to build something that feels like a dream, not a nightmare.
The Mirrorworld
With all of that as context, I want to visit an idea that legendary futurist, Kevin Kelly, wrote about back in 2019. The Mirrorworld.
Despite the incredible technological innovations that Apple brought to this new headset modality, there's still a long way to go. I got hyped about the super high-def resolution and impressive hand tracking, but a $3,499 headset with a short two-hour battery life and that creepy "EyeSight" external eye display is very much version zero.
But that doesn't faze me in the least because, with the help of Kevin Kelly, I can see (pun intended) the future that Apple is building toward. Here's how Kevin describes the Mirrorworld.
"At first, the mirrorworld will appear to us as a high-resolution stratum of information overlaying the real world. We might see a virtual name tag hovering in front of people we previously met. Perhaps a blue arrow showing us the right place to turn a corner. Or helpful annotations anchored to places of interest. (Unlike the dark, closed goggles of VR, AR glasses use see-through technology to insert virtual apparitions into the real world.)
Eventually we’ll be able to search physical space as we might search a text—“find me all the places where a park bench faces sunrise along a river.” We will hyperlink objects into a network of the physical, just as the web hyperlinked words, producing marvelous benefits and new products.
These examples are trivial and elementary, equivalent to our earliest, lame guesses of what the internet would be, just after it was born—fledgling CompuServe, early AOL. The real value of this work will emerge from the trillion unexpected combinations of all these primitive elements."
—Kevin Kelly, AR will spark the next big tech platform - call it Mirrorworld
The Mirrorworld, teased in its infancy with Google Glass, has begun to evolve and take on the form and function of our personal and mobile computing devices with Apple's Vision Pro. It looks new and exciting but also familiar, with the array of icons denoting our messages and photos. Except now, we see some of those things appearing as digital objects inserted into the real world around us: an x-ray view of a car letting you see the inner workings of the engine, a 3D heart, a digital turntable.
As the technology gets smaller, lighter, and cheaper to produce, heavy aluminum goggles with an external battery pack will turn into slimmer, lighter, standalone goggles. And those will, in time, turn into glasses that feel natural, though perhaps a little bulkier than usual. And maybe someday, when over-the-air charging, holographic technology, and chip production have reached an unimaginable maturity, we'll all be wearing glasses that look and feel as natural as glasses do today, casting an ever-present digital overlay on the world around us.
When that day comes, a bulky laptop that you have to carry around in a backpack will feel like a relic, and a phone in our pockets might even feel unnecessary.
That's the future Apple is building.
But I'm curious to hear your thoughts!
What’s your favorite feature of the Vision Pro?
Do you think it has what it takes to go mainstream?
What do they have that the Quest doesn’t and vice versa?
That’s all for this one - I’ll catch ya next week.
—Jacob ✌️