The Apple Vision Pro has a lot of hype, a lot of negativity, and a lot of snarky hot takes on social media surrounding it. Depending on who you listen to, it’s either the AR/VR headset that’s going to revolutionize the industry and the way we work, or it’s going to be Apple’s biggest failure. The truth is that it’s neither of those things. You might not have noticed, but I got my hands (or more like my face) on the Apple Vision Pro to experience it for myself. My take is that everyone, including Apple, is a bit wrong about the Apple Vision Pro. The age of spatial computing is coming, but is that future now? How will it impact our lives? And what’s it REALLY like?

From Apple’s perspective, the Apple Vision Pro is ushering in “the era of spatial computing.” They’re saying it’s here now. But is it? This is supposed to be the future we were promised for so long. I’m not talking about the hype of Oculus when they hit the scene in the early 2010s. I’m not even talking about the Virtuality 1000 that I got to experience back in the early 1990s in an arcade. The first references to the possibility of a VR headset go back to 1935 in Stanley Weinbaum’s short story Pygmalion’s Spectacles. Starting in the 1960s, the first head-mounted displays were developed and tested. One of the first was called “The Sword of Damocles” and is often cited as one of the first augmented reality systems. And of course, there’s the popular novel, Ready Player One, that speculated where this kind of tech may go in the future. The reason I mention all of this is because the basic premise of the technology isn’t new … at all. It’s been in the works for decades and has always excited people’s imagination for the future possibilities. But is that future here today?

I’m not going to rattle off the specs of the device. If you care about things like foveated rendering, pancake vs. fresnel lenses, and framerates (especially when you get over about 90 frames per second), there are other videos already out there that get into all of that. My background is in user interface and experience design, and that’s what I’m going to focus on. I’d argue that for devices like these, specs don’t matter. The final user experience is what matters. Are you able to do what you want to do? Are you able to work faster with a new technology? Does it get out of the way and let you work and play better? Does it solve a problem for you? Not others … you.

Well, here we are with the Apple Vision Pro. I think the best way we can break this down is into three categories: the hardware, the user interface and experience, and finally, the utility, which is where I think most people are getting this wrong.

The Hardware

The build quality of the Vision Pro is absolutely nuts. On one hand, it’s incredibly solid and feels great in the hand. Part of the reason for that is the aluminum frame that gives the headset heft and the feel of quality. It looks and feels like an Apple product. The downside is that it’s heavy because of things like the aluminum frame and all the guts inside. This matters in the user experience because it’s strapped to your face, and depending on which head strap you’re using, you’ll feel it. Every headset on the market struggles with this too, so it’s not just an Apple problem.

The strap that you see in all the product photography is also what comes attached to the device out of the box. It’s a single loop that gently wraps around the back of your head. It’s extremely well designed and makes it easy to put the headset on and take it off, but there’s a massive compromise … it tightens around the circumference of your head to hold the headset in place. Basically, the Vision Pro is smashing into your face with compressive force. It’s only comfortable for short periods of time. In my experience that would be something under 30-45 minutes, so if you’re only going to use these for short stints, then that’s the way to go. However, I far prefer the double loop strap, which takes all that pressure away because the weight of the unit is suspended across the top of your head. The back loop just helps to hold things in place. The single loop strap looks better and will mess up your hair less, but it’s uncomfortable. The double loop is more fiddly to adjust, will mess up your hair, and looks utilitarian, but it’s far more comfortable. On the hair issue … I … uh … don’t have that problem. I’ve been able to wear the Vision Pro with the double strap for hours without any issue.

The screens that are just centimeters from your eyes are the best I’ve ever seen in a VR/AR headset … ever. Yes, better ones do exist, but they’re not consumer-level devices and cost dramatically more than the Apple Vision Pro. Compare this to the Sony PSVR2 or Meta Quest 3 and there’s no contest. Black is truly black. Colors are bright and vivid … not real-life vivid, but incredible-screen-technology vivid. The external cameras that pass through the external world to make it feel like you’re “looking at reality” are also dramatically better than anything else out there.

However, they aren’t perfect. There’s no mistaking that you’re watching a camera feed, especially in low light. Contrast is diminished, you’ll get blurring during motion, and you’ll see a lot of video noise. It’s still very usable, but you feel separated from reality by a screen. The responsiveness is spot-on. There are no issues walking around, picking up objects, or even throwing and catching objects. The perspective directly matches what you’d be seeing without a headset on, and there’s no perceivable delay.

One important note on the screen is the field of view. When you see the screen captures and other videos of what it looks like in the headset, it’s deceptive. Not deliberately so, but it’s an issue all headsets struggle with. It’s more like looking through a pair of binoculars. There’s vignetting around your peripheral vision that really narrows your field of view. It’s like looking through a tunnel or large tube. The Vision Pro’s exact field of view isn’t detailed on Apple’s website, but it’s slightly narrower than the very popular Meta Quest 3’s.

I got a separate light shield for my wife and gave her the headset to try. It was a great “average person” experiment because she doesn’t care about tech like this at all. Her impression was very positive, but the two big negatives were: 1) how uncomfortable the single loop head strap is, and 2) feeling separated from the real world by cameras and a screen. But the experience of apps, games, and movies got high marks thanks to the UI, the responsiveness, and the quality of those screens. She was impressed by the hardware, too.

The spatial audio coming out of the earphone slits on the side straps sounds fantastic. It’s not audiophile-level sound, but it’s surprisingly good and immersive. The downside is, because it’s just blasting audio at your ear holes, everyone around you can hear it too. If you don’t want to bother people around you, you’ll need to wear headphones like AirPods, which work great.

The User Interface and Experience

This is when I put on my UI/UX designer hat. As amazing as the tech is inside this device, it both solves many of the problems other headsets suffer from and introduces new ones.

The onboarding experience, which calibrates the device for your eyes and hands and teaches you how to click with two fingers, is phenomenal. It’s kind of a magical experience looking at an object and feeling like you’re almost willing it to kick into action. Other headsets have similar functionality, like the PSVR2, but it’s not system-wide and is implemented at the app level. Some games use it and it works well enough, but it’s nowhere near as good as the Vision Pro. The Meta Quest 3 also has a hand tracking system, but I’ve found it very fiddly. This is something some critics are using to dunk on the Vision Pro because the Meta Quest 3 is so much cheaper and gets you some of the same feature set. That’s something else I think is wrong, but I’ll get to that in a bit.

There are two main problems with the Vision Pro’s UI, though: 1) unlike on a regular computer, you have to be looking directly at the thing you’re clicking on, and 2) it’s not accurate 100% of the time. Because this is the fundamental way to navigate around the UI and every single app inside the headset, it kind of needs to be perfect. On the first issue, it’s not a dealbreaker, but it does take some rewiring of your brain to get a handle on it. There have been quite a few times where I’m double tapping my fingers to select something as my eyes are already starting to glance away. That can cause unintended side effects like windows moving around on you, an app launching when you didn’t mean it to, etc. Again, it’s not as bad as it sounds, but it’s a tiny road bump for actually feeling comfortable in the headset. BUT … it makes doing something like hitting the delete key on the virtual keyboard a pain. You have to be looking directly at the delete key while holding it down if you’re deleting a bunch of text … but that means you can’t glance over at the text to see how much has been deleted. As soon as you do, the text stops deleting. When it comes to doing a lot of writing in the headset, you definitely will want to attach a Bluetooth keyboard.

The second issue is more of a roadblock at times. I’ve found that it’s highly dependent on external lighting. If you’re sitting in a room that’s not dark, but just dim, it’ll start to struggle and register false finger taps. To be fair, it sometimes gives you a warning about this. If you’re sitting with your hand on your lap or desk and your fingertips are close, but not touching, enjoy some phantom clicks. I think it’s from subtle head movements and changes in perspective that make it look like your fingertips are touching in a tap position when they haven’t moved at all. It’s a matter of perspective and low light. If you turn up the lights when that happens, the phantom clicks go away.

When it is happening, though … it can be maddening. While I wore the headset writing this script in Google Docs using Safari, I kept having sentences get auto-highlighted and cut or moved on me. It was infuriating, until I turned the lights up in my office. That makes me wonder how useful this will actually be on a plane to get some work done or watch movies. If you’re on a night flight, they often turn the lights off in the cabin, which means you might have to be “that guy” who turns on your overhead light while everyone else is trying to rest.

There’s also no mistaking that this is a 1.0 product when using the home screen. You can’t rearrange the apps in the list at all right now. Any iPad app you install ends up in a “Compatible Apps” folder (again, with no ability to rearrange them). It seems obvious that it’s a feature that will come in an update later, but that they had to cut due to time constraints getting this out for launch.

The separate battery pack with a tether to the headset is an understandable compromise. It helps to keep the weight on your head down, but that tether is a concern. Most of the time I just had it slipped in a pocket and forgot about it, but the cable has a tendency to get twisted up after a bunch of use and moving the headset around after storing it. If you’re someone without pockets, you’ll have to get something like Belkin’s external battery clip and sling. Another concern is cats … yes, cats. I had my Vision Pro for less than two days when my cat discovered the USB-C charging cable and let her opinion of the headset be known. Thankfully, she didn’t find and chew through the battery cable. It is replaceable, but it’s a proprietary cable. You can’t remedy any accidents with a stash of spare USB-C cables on hand.

Speaking of batteries, I’ve found the battery lasts for about 2 ½ to 4 hours before needing to be recharged. I got in the habit of plugging in while I’m sitting down doing work and only using the battery standalone when up and about. While the battery is a compromise, it’s serviceable.

But where the experience becomes pure joy is in how it works with other Apple devices. Being able to seamlessly connect your Mac to the Apple Vision Pro to use it as a virtual, massive monitor is incredible. The fact that you can surround your virtual Mac screen with other Vision Pro and compatible iPad apps and then bounce your mouse between all of those windows is … like nothing else out there. You can do the whole virtual monitor thing with the Meta Quest 3, but it’s not as seamless and user friendly as this. The entire Apple platform and app ecosystem unlocks a lot of benefits.

The Utility

And that brings us to category three: the utility. This is where the real story starts for the Apple Vision Pro. It’s also where I think most people, including Apple, have gotten the Vision Pro wrong. Many are writing this off because they don’t see the utility. Why would I want something like this when I’ve got my phone, tablet, or laptop? And in some cases, they’re correct. Apple says in its marketing copy, “Welcome to the era of spatial computing.” They’re declaring that the utility of this type of device is here … today. Implying that everyone should get on board. And many of the Apple fans are right there cheering this on. And in some cases, they’re also correct.

But the truth isn’t black or white, it’s shades of gray. It’s also a matter of perspective. Nilay Patel at The Verge put out a great review that hit on the point that he’d rather be in the real world vs. separated by screens. That’s making the assumption that you’re intended to wear the Apple Vision Pro the majority of the time, or in social situations where the human connection is more important. I don’t know about you, but when I’m buried in my phone looking at what’s happening on social media, or on my computer writing a script for a video, it’s a very solitary experience … on a screen. The Vision Pro’s ability to mix the virtual with the space around you isn’t “more isolating.” It’s just … a different way to do many of the same things. As I wrote this script, I was no more isolated than normal, but I was comfortably sitting on my couch, feet kicked up, with a myriad of screens floating around me making it easier to do my work versus using just a laptop. When it came time for dinner and some TV watching with my wife, off came the Vision Pro. It’s just a tool to be used in new ways.

There are gimmicky videos of people trying to wear the Vision Pro out in the world everywhere, all the time. While that may be good for clicks, that’s not the intended use of this device. It’s a tool to get a job done or to help you unwind and relax.

Some of the first things I did once I got it set up was to watch some TV and movies … and … oh my … it’s one of the best video-watching experiences you can have. It made my 85” OLED TV feel small and weak in comparison. Some of the scenes I watched from Dune felt like I was actually in a theater looking at a massive screen. No pixels visible at all. When I popped on Avatar: The Way of Water, not only did it look pristine and massive, but the 3D looked even better than it did when I saw it in IMAX.

So, movie watching? Check. Killer experience and even better than what I’ve experienced on other headsets. No contest. But is that worth almost $4,000 when you get a decent enough immersive movie experience with something like the Quest 3 or even Nreal headset? It’s a matter of perspective.

What about working? Well, most of the apps and services I use either have apps or good web apps. For instance, I use Discord to work with my team, Notion to manage our projects, Gmail for email, Google Docs for writing scripts, like this one, and Fantastical for managing my calendar. I got those all up and running and did actual work … with ease … for the most part.

But this is where you start to bump up against some walls. It’s very similar to the walls you’ll hit if you try to work completely off an iPad in some cases. You’ll get 80-90% of the way there, but then come to a screeching halt if you need to do a lot of file management. Using professional apps like Final Cut, Adobe Audition, etc. is a non-starter because they either aren’t on the App Store or appear there as very hobbled versions.

However, this is where some of the Apple magic kicks in. You can use screen sharing and the feature Apple calls Continuity to seamlessly link multiple devices together. I was able to use my Mac in a virtual 50” screen hovering in front of me running Final Cut. Hovering off to the sides I had iPad-compatible apps like Mastodon, Notion, and Discord up, and Vision Pro apps like Fantastical. Being able to bounce around between them was effortless … but … you’re also talking about a $4,000 headset streaming an equally expensive Mac, which isn’t exactly economical. Again, perspective.

But this is where I think we all have a lack of imagination. There are use cases for what the Vision Pro could do with this exact version of the hardware. In some cases, the apps might not exist yet, but it’s absolutely possible. The spatial aspect of spatial computing is something I haven’t seen spoken about much. You can literally place a window or an app in a specific location in one room or area, then walk to another area and set up other windows and apps. The easiest example of this is cooking. Place a virtual timer floating above each pot on your stove top. Another might be a work setting where you have physical objects you have to interact with, but virtual controls and readouts located next to each one of them. The headset keeps your hands free to do work, so there’s no juggling a phone at the same time or carrying around a laptop from place to place. It allows for contextual computing in whole new ways.

It unlocks new and interesting things for education. The app JigSpace is a little bit of a light experience right now, but it’s clear what they’re trying to do. They have instructional walkthroughs of a fuel cell. How to replace a solenoid. Jet engines that you can break apart to see how they work. This kind of thing will be phenomenal for education … I wish I had it when I was in school.

Then there’s accessibility. You can configure the Vision Pro to work with alternative input devices, long gazes to click, screen readers, and more. There are people who can’t easily access the world around them that could have amazing new experiences unlocked by the Vision Pro. Believe it or not, there are a lot of accessibility options in there for people who can’t see well, which seems counterintuitive for a device that’s all based around your eyes. But it’s in there. Imagine someone who’s bedridden being able to have experiences out in the world again virtually. It could also help deepen online connections with friends and family in ways that feel more real and grounded. I have friends located around the world that I’ve never met face-to-face, and this could unlock new, fun ways to connect. The best part is that it isn’t a feature for the future; it’s already able to do that stuff today.

Oh, that reminds me … Apple’s personas. Now, this is a beta feature, so we need to cut it a little bit of slack, but you scan yourself with the headset and a computer generated version of yourself is created for video calls. Any app that needs a video feed of you can use this persona. It’s both incredible and horrifying. My brother’s reaction was perfect the first time he saw it. He and I do a Star Trek podcast together (yes, we’re both geeks), so I knew he might appreciate the tech.

“So, uh, Sean, what do you think?” -Matt Ferrell

“This is the creepiest thing I’ve ever seen. Oh my god. Yeah, the fact that your … that your avatar keeps smiling the way it does. It’s not you, but it has elements of you that look like you. This is wild. I would say figure out a way to do a call with mom and dad, but then I thought no.” -Sean Ferrell

“This would freak them out.” -Matt Ferrell

“Yeah, don’t do that to them.” -Sean Ferrell

Final Thoughts

The Apple Vision Pro shows us a very clear path for where this technology is heading. It’s clear there will be a plain Apple Vision and maybe Apple Vision Air down the road. The future of spatial computing is coming, but is it here today? The short answer is … no … and yes. It depends on your use cases … and yes, perspective. For instance, if the apps and services you use to work or play are here, then you could start living the spatial computing future right now.

For everyone else, until there are more apps available, the UI quirks get ironed out, and the device itself gets smaller and lighter, it’s not quite here yet. But it’s damn close. And of course … there’s the price. If you want to dip your toe in the spatial computing pool, there’s always the Meta Quest 3, which is much cheaper, but that has always felt more like a toy to me. It’s a gaming-first product that can also do other stuff. The Vision Pro is the other stuff first that can also play games. It doesn’t feel like a toy.

Why do I think Apple is also a bit wrong about the Vision Pro? They don’t seem to know how to market it. I get why the marketing has to say, “the era of spatial computing is here,” but that’s definitely not the case for most people with this first iteration of it. This reminds me a little bit of the Apple Watch launch. They had a pretty muddled message around that one too, but that got refined pretty quickly when they saw how people were actually using the device. That might happen again. In all my experience working in the tech sector and releasing products, you can user-test the heck out of something and think you have it nailed down, but users will always surprise you in how they actually use it. This is Apple getting a product (even an imperfect one) out into the wild to see how we actually use it, so they can evolve it over time.

Apple has an incredible ecosystem of devices and apps that all work seamlessly. It’s something Meta is lacking with the Quest devices, which means this is Apple’s game to lose. I’m pretty excited for our spatial computing future and have high hopes for how this will evolve with new apps in the coming year or two. It’s the first VR headset I’ve bought that I really do think I’ll still be using 6 months from now. But I thought the same thing about every VR headset I’ve bought, most of which ended up on a shelf collecting dust. I guess I’ll have to report back in half a year.
