Apple Vision Pro: Computing Without Borders
- The Experience (Hands-on Demo)
- The Market Opportunity
For the entire history of software development, developers have been constrained by the screen size of the device they are creating apps for. Removing that constraint is the paradigm shift in software creation Apple hopes to accomplish with Apple Vision Pro. Apple has intentionally stayed away from terms like VR or metaverse and is firmly positioning Apple Vision Pro as a spatial computer. In their own words, Apple Vision Pro is the most advanced personal electronics device ever created.
From a pure technology standpoint, Apple Vision Pro is absolutely one of the most impressive pieces of technology I have ever experienced. Every Apple product, innovation, and breakthrough that came before Apple Vision Pro served as a lesson that made this product possible. Apple loves to talk about things that “only Apple can do,” and Apple Vision Pro may be the pinnacle of “only Apple.”
Carolina Milanesi and I made a short video sharing our experience getting a demo of Apple Vision Pro, which you can watch below.
I’m hesitant to write too much about the experience with Vision Pro because I won’t do it justice. There are some parts of the device you just have to experience to believe. But I’ll make a few points I think are important.
First, Apple nailed eye tracking. I was extremely skeptical of this when it was announced, having tried many products that claimed eye tracking but failed miserably and were highly frustrating to use. Apple’s eye tracking was essentially flawless, always letting me pinch to tap exactly where I was looking. When you first use Apple Vision Pro, you go through a brief eye-tracking setup where you follow dots across a screen, and the device learns how to track your eyes accurately. Before this, I had never used a device that nailed eye tracking, and Apple Vision Pro did just that.
The second thing was the pass-through video experience. I have tried all the modern headsets, and many more in labs, and I always encounter visual latency when the device passes through video from the external cameras. This creates instability in balance, dizziness, and some disorientation. Apple Vision Pro had little to no latency, and I could easily walk around the room and interact with objects with no disorientation. Beyond the lack of latency, it was also extremely high resolution. While you know you are looking at video and not through an actual lens, it feels extremely close, thanks to the more-than-4K resolution delivered to each eye. Credit here goes to yet another custom-designed piece of Apple silicon, the R1, which processes all the camera and sensor inputs to eliminate latency, the main cause of the dizziness, nausea, and other problems most people have had when trying VR.
Lastly, I wanted to highlight EyeSight. This technology is critical to the experience: it gives people around you context, letting them see your eyes or letting them know you are in a simulated environment, while still letting you keep the context of others and your surroundings. What strikes me about this is that Apple did not have to put an outward-facing display on Vision Pro. That adds cost, innovation, and compute strain, but it was necessary to deliver the experience Apple envisioned and to solve one of the main critiques of VR: that you are too disconnected from the real world. The most mind-blowing part of the demo to me was when you are in a fully immersed environment and someone starts talking to you: they enter the scene out of nowhere in a blended-reality experience. It was one of the most amazing things I’ve ever experienced technologically.
I need to say it plainly: I have been one of the biggest skeptics of Apple’s timing in this category. I knew how challenging the technology hurdles were for Apple to release something that met Apple’s standards for the experience. I struggled greatly with why Apple would want to launch something now, knowing this product would take a long time to reach broad market adoption. But I walked away from the demo in a real sense of shock and awe. The hardware and the experience were so much more mature than I thought technologically possible at this moment. The price is high because so many technological breakthroughs went into this product. And my experience with Vision Pro is the baseline experience I think is necessary to remotely believe tens of millions of people will want to use a product like this for hours per day at some point in the future.
In the next section, I’ll provide some analysis of what I think the market opportunity is. But first, speaking as a technology enthusiast at heart, I just want to say that Vision Pro is a remarkable innovation.
While I acknowledge there are still many unknowns about this segment and any prediction or forecast is likely to be wrong, I think it is worth wrestling with the question of market opportunity and maintaining a thesis as it develops. I remain convinced this will be a very slow adoption cycle, and that Apple’s end goal is not a goggle form factor but something more like glasses, but there are a few important points to make about this market opportunity.
The first is to recognize the potential for a new computing paradigm with spatial computing. Apple is intentional about using this term and is attempting to pique developers’ creativity with the idea of an infinite canvas as a software development opportunity. This mixed reality market will ultimately be defined by the software developers who create apps and experiences not found on other pieces of hardware. The only way a mixed reality device becomes mainstream, in any significant way, is if developers (en masse) create a rich and vibrant software/services ecosystem for it. That point is true of Apple Vision Pro, Meta’s Quest, and any other company that has ambitions to enter the space.
This, to me, is the most interesting part of the market development to watch. We know the hardware and software will only improve, but developers need to embrace it if this category is to scale. And that is a primary reason so many in the industry are betting on Apple in this space. Apple has the developer advantage when it comes to third-party app ecosystems, and there is no reason to believe that advantage won’t carry over to their spatial computing platform.
Competitively, Meta is at a strategic crossroads. Either they decide to go toe-to-toe with Apple in hardware, something I do not think they can do, or they focus on the platform and let other companies build hardware for it. In that case, they become the Android or Windows of mixed reality. The trade-off in this scenario is that they will cede the high end to Apple, just as happens in every category in which Apple competes. Apple will absorb the most valuable part of the market and the most valuable customers, and others will compete for the rest, whatever size that ends up being. Apple has 1.3-1.4 billion customers and growing, and that is essentially their potential market size someday, BEST case scenario.
Fully recognizing that any market prediction will be wrong, I’ll at least lay out how I think about this category. For any real short-term adoption, potential customers for a mixed-reality device need to see what use cases the device can absorb from things they are already familiar with. For example, let’s take TVs. If Apple, Meta, or anyone else can convince consumers this device can replace their TV, then we can use the TV market size as a comparable. Roughly 200 million TVs are sold every year, far fewer at prices above $1,500. But using TVs as a comparison is at least a thesis.
The other is game consoles. Could a mixed-reality headset be the next gaming platform? If so, the gaming hardware market is ~50-60 million units a year, most at prices below $600. Lastly, for now, what if it can replace your laptop or desktop someday? If that is the comparable, then we are talking about 270-280 million units a year.
But here is the real idea behind the thesis: what if this device can do all those things? If we believe it will absorb tasks from a number of the best-selling consumer and enterprise hardware devices, then the market potential is much larger, even at a higher price point. If it can replace your TV, game console, and laptop/desktop, the combined ASP of those categories is $1,400, and the total unit potential in annual sales of those combined categories is roughly 530-540 million units a year.
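As a sanity check on the arithmetic, here is a quick back-of-the-envelope sketch in Python that sums the upper ends of the per-category unit figures above and multiplies by the combined ASP. The unit counts and the $1,400 ASP come from the figures in this piece; the implied revenue number is my own multiplication, not a claim from any forecast.

```python
# Best-case unit figures from the categories discussed above
# (upper ends of the stated ranges, in millions of units per year).
categories = {
    "TV": 200,              # ~200M TVs sold per year
    "game console": 60,     # ~50-60M units per year
    "laptop/desktop": 280,  # ~270-280M units per year
}

total_units_m = sum(categories.values())  # millions of units per year
combined_asp = 1400                       # combined ASP in dollars

# Implied best-case annual revenue if one device absorbed all three
revenue_b = total_units_m * combined_asp / 1000  # billions of dollars

print(f"Combined units: ~{total_units_m}M per year")
print(f"Implied revenue: ~${revenue_b:.0f}B per year")
```

Summing the upper ends gives roughly 540 million units a year, which is the combined-categories ceiling this thesis rests on.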
Now, that is a BEST case scenario, and one that is many, many years away. But that is how I think about this market and how it can evolve if every upside scenario we can think of comes true. It may not, and this may be a much smaller market, but if that’s the case, it isn’t good for anyone but Apple.