Passthrough is massively oversold, even if it's technically impressive. They set expectations way too high.
The lack of comfort is the #1 reason I don't own a Vision Pro anymore. I feel exactly as the author put it, "relieved," when I take it off.
I feel significantly more disconnected from my family when I have it on because I have no easy way to share my content with them. A massively improved guest mode, better casting, or something else would go a long way here.
The eye tracking + tap is incredible, but Apple tried to shoehorn it into everything. It should have been the primary mode of interaction, with a detailed/precision interaction mode available when needed. Eye tracking + tap is simply not good enough for power-user use cases. It was such a relief to go back to my Quest after the Vision Pro because the controllers were so precise and easy to use.
And finally, I'll mention that the OS + standardization of UX is HUGE. The Quest feels like a crappy Chinese clone in comparison. Every single window has a completely different way of moving, adjusting, etc. Sometimes you click in the center, sometimes you click under, sometimes you can't move it at all. On the Vision Pro, everything is standardized. I'd love to see Meta fix this.
This whole paragraph sounds downright dystopian.
“A family that shares content together, stays together”
"Hey, let's play a game!" What sort of game? "One we can play together." Welp, looks like this multi-thousand-dollar toy is out, then; let's go back to traditional games.
[1] https://developer.apple.com/documentation/gamecontroller/
[2] https://www.apple.com/shop/smart-home/accessories/gaming
Strangely enough, I don't own one, but just assumed this guy had hit the nail on the head with everything. Everything else I've read felt like it was pushing an agenda, this guy's writing just seemed like it was accurately weighing the pros and cons as he saw them.
How about flutter apps like https://flutter.github.io/samples/web/material_3_demo/
I've been complaining that Flutter is the next Flash, meaning that by rendering their own UX instead of using the platform's, they're basically making dead-end apps. They'll claim devs just need to reship their apps with the latest version, but that means all old Flutter apps are abandonware. It also ignores that they'll run into the same problem again on the next platform.
As a Vision Pro owner, kinda, but kinda not?
I have a really low-power glasses prescription (-1/-1.5) and I've come to the conclusion that everyone saying this has the privilege of being born with perfect (IRL) vision. It's intentionally a little bit out of focus to obscure the screen-door effect, which I understand might be disorienting to people who've never had to wear glasses.
But taking that into account, it's incredibly clear. If you've used any other passthrough, it's night and day. I will frequently put on my headset having forgotten that the cover is on, and when I take it off my brain genuinely reads that as uncovering a transparent lens. The slightly reduced HDR, slight desaturation, and almost imperceptible visual snow, even all added up, don't detract enough to say it isn't close to "lifelike".
It's not perfect, but it's incredibly close - and probably better than a good chunk of the population's uncorrected vision.
I mean, the comparison images in the article almost make the point for me. The left image, when adjusted for brightness/contrast (which the human vision system does automatically absent other stimulus, e.g. when all you can see is the headset) has a greater resemblance to what a cloudy day looks like IRL compared to the completely oversaturated, amateurly HDR'd image on the right.
On a whim I went to the optometrist, got examined and a prescription for glasses (I use over-the-counter 1.25x readers from CVS), then uploaded the Rx to Zeiss using their iPhone app, and ponied up $149 for their optical inserts.
Since the inserts arrived and I put them in, my experience overall with VP has been better: still way too difficult and confusing, but movies and TV shows, which were the only redeeming factors in my initial three weeks of use, seem really, really crisp and vivid, more so than initially.
Take these observations with a grain of salt; after all, they're the experiences of one 75-year-old retired neurosurgical anesthesiologist.
But if you've already dropped $4k, well, that's a lot of sunk cost to write off without trying everything possible to make it work.
At least that was my thinking.
oooof
> Every app on Quest has to reinvent how buttons work, how a scroll view works, how far away from the user the content should be etc.. and every app works differently.
kinda shocked there's not a Material Components equivalent for Quest. I guess it's designed like a game system (where custom menus are standard), but that's a reinforcing loop. As long as there's no standard component system, it'll continue to be game-centric.
This is 101 of product design - put in place a standard UI Guidelines. This is particularly important for a 'platform' player like what Meta is aiming for (Ex. from other platform/OS - Windows UI Guidelines, Apple's Human Interface Guidelines, Google's Material Design, etc..)
I think there's an emphasis on shipping a feature over shipping a cohesive product. Shipping a simple feature vs. a functional one.
and churn seems to be getting in the way of mature products.
That should be all but standardized, other than exceptions in games where that could be a problem.
It’s not so bad when you’re used to the system, but giving it to someone and trying to explain what button does what isn’t exactly fun. You usually end up needing to move their fingers for them, and playing a weird version of that scene from Ghost with my father-in-law isn’t my version of a good time.
Oh god this.
The last 10 years of UX trends, where everything has to be two or three different actions behind a single tap or tap-and-hold, do NOT translate well onto AVP. Most iPad apps are nigh unusable, especially those with high information density. Vision is not quite as precise as a mouse cursor, but it is definitely not as imprecise as a tap, though only when the UI is predictable. Because where you are looking is also how you read, every informational field needs to either include both the information and the action, or have them separated out.
So far there is no bigger culprit than the native Music app. The bottom of the player has like three nested buttons that all do different shit. The same place you look to see what song is playing is also a hidden progress/playback scrubber and also a shortcut to switch to the miniplayer if you happen to look at the eye catching album art icon directly next to the title of the song. It’s maddening.
Or click "Connect to Bluetooth headset" in Windows. Congratulations, your headset had just connected itself, so now you've disconnected it.
Just brilliant, overall.
I couldn't blame her. You touch ANYTHING, even the spam caller in recent calls, and it immediately calls.
Why oh why can't there be a setting, on by default, to confirm before dialing?
I do this daily and it infuriates me.
I am also a fast typist on both touch and physical keyboards. As a result I often type into an autocomplete box (e.g. Kagi, Gmail, etc.) and see the result I want come up midway through typing. As I go to click or tap it, I am often in the 200ms window where that result is replaced by another or the list is reordered in some way.
Dynamically loading webpages are also a plague of the last decade for the same reason. They load an initial layout, often interactable. Then as you're going to click or tap, they dynamically load other elements or content, reordering and moving things around the page!
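One mitigation for the reordering problem described above can be sketched as a suggestion list that defers incoming updates while the pointer is over it, so items never move out from under a click. This is a hypothetical sketch; the class and method names are invented, not any real framework's API:

```typescript
// Hypothetical sketch: hold back fresh autocomplete results while the
// pointer hovers the list, applying them only once it is safe to do so.
type Item = string;

class StableSuggestions {
  private shown: Item[] = [];
  private pending: Item[] | null = null;
  private pointerOverList = false;

  onPointerEnter(): void {
    this.pointerOverList = true;
  }

  onPointerLeave(): void {
    this.pointerOverList = false;
    this.flush(); // safe to update now: nothing can be mid-click
  }

  // Called when fresh results arrive from the backend.
  receive(results: Item[]): void {
    this.pending = results;
    if (!this.pointerOverList) this.flush();
  }

  private flush(): void {
    if (this.pending !== null) {
      this.shown = this.pending;
      this.pending = null;
    }
  }

  get visible(): Item[] {
    return this.shown;
  }
}
```

The same idea generalizes to dynamically loading pages: reserve layout space up front, and never reflow interactive elements inside a human reaction window.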
If you don't have anything n̵i̵c̵e̵ ̵t̵o̵ ̵s̵a̵y̵,̵ ̵d̵o̵n̵'̵t̵ ̵s̵a̵y̵ ̵a̵n̵y̵t̵h̵i̵n̵g̵ ̵a̵t̵ ̵a̵l̵l̵ ready to load, don't load anything at all.
The Home tab in the music app shows exactly 1 tile of information on my 15 Pro Max. That's 3.6 million pixels and they couldn't be bothered to fit more than 1 item without scrolling.
It's immensely frustrating to be watching a video in the Photos app, swipe a tiny bit wrong, and lose all current progress in the video even after you swipe back. Even scrubbing controls for long videos (1+ hours) can be super finicky, and it's tough to select accurately. The traditional scrubbing method of holding while moving vertically to scrub at slower speeds doesn't seem to work either.
As a result, I often catch myself in frustrating situations where I'd like to just __UNDO__ whatever action I previously took: jumping back to the previous timestamp in a video, reopening the Safari tab whose close button I didn't mean to select, or any number of other things I accidentally press because the options are just a little too close to each other.
It's just a mild annoyance, but it's something that would massively improve my experience with using the device, as someone who has spent a LOT of time in it nearly every day since release.
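The generic __UNDO__ wished for above is essentially a command stack: each user action records a closure that reverses it, so any accidental tap can be rolled back with one gesture. A minimal hypothetical sketch (names invented, not a real OS API):

```typescript
// Hypothetical sketch of a system-wide undo stack: each action is stored
// with a closure that reverses it (restore a playback position, reopen a
// tab, and so on).
type UndoableAction = { label: string; undo: () => void };

class UndoStack {
  private actions: UndoableAction[] = [];

  record(label: string, undo: () => void): void {
    this.actions.push({ label, undo });
  }

  // Reverse the most recent action, if any, and return its label.
  undoLast(): string | null {
    const last = this.actions.pop();
    if (!last) return null;
    last.undo();
    return last.label;
  }
}
```

For the video case, the app would record the pre-scrub timestamp before applying a seek, so one "undo" gesture restores exactly where you were.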
I always disable this feature.
I remember the days when EVERY application in Windows had "File" in the upper left hand corner. You could make a good bet that "Edit" was next and "Preferences" was somewhere in there as well.
Being "good with computers" back then had a lot to do with knowing the standard layout of applications to help guide you through where things ought to be or, at least, could plausibly be found.
I applaud Apple for keeping this going in this modern world we live in of SaaS websites that each do their own thing.
Meta does have standardized utilities for translating movement to touch/drag/etc. interactions on arbitrary virtual surfaces:
https://developers.facebook.com/blog/post/2022/11/22/buildin...
https://developer.oculus.com/documentation/unity/unity-isdk-...
But it doesn't seem (AFAIK) to answer the other side of this, which is the UI design system so apps have a consistent look and feel. Which is perhaps more common coming from a game development perspective, but ever since the Mac OS shareware days, Apple's understood that it's empowering to a certain kind of developer if you make it easy/the default path for them to build experiences that match a standardized look and feel. I'm honestly surprised that Meta didn't at least make an optional SDK for this.
I doubt a UI toolkit or standard OS primitives were even on their radar.
But, historically, it was a misstep not to build these kinds of developer tools and design systems the moment https://forwork.meta.com/quest/business-subscription/ became a glimmer in their eye.
I never felt this was remotely an issue with the Quest, any more than it is with desktop games.
The page you linked to is full of links to Unity APIs / SDKs. So Meta doesn't have them, Unity does.
Same thoughts for Quest 3. I use mine exclusively without the light seal now. It is a huge improvement in the ability to just casually use the thing among friends and not seem like a weirdo, and having your peripheral vision massively adds to the overall comfort of the experience. I've found paradoxically that it increases the feeling of presence for passthrough when it feels like you're just looking through a pair of glasses instead of something suction-cupped to your face. This whole idea of "locking in" to VR and closing out the outside world needs to go away. True AR that doesn't remove you from the world is the only future for these devices.
This goes ditto for controllers. The vast majority of people have never held a game controller in their lives. We (as gamers and nerds) take it as second nature, but I've seen it as the single biggest barrier to entry with demoing VR to random folks. Sticking with hand/finger based gesture tracking and rejecting controllers was the absolute best decision Apple made for Vision Pro.
Designing the UI around hand/eye tracking was smart but not supporting VR controllers at all is stupid.
Reminds me of how stubborn they were about bringing mouse support to iPads.
It was a strategic decision. It says "This is not a game console, it's a general purpose computer", and paves the way for opening more people up to the idea. Controllers are a crutch that no one actually wants. We want to be able to just naturally do things in VR/AR the same as we would in reality. And vision based hand/finger tracking has gotten to a point where controllers really are nothing more than an input device now. There's no need for them at all to have 6DOF control, the way there was in the Lighthouse/Constellation days.
Why?
>True AR that doesn't remove you from the world is the only future for these devices.
Again, why? What if I want to be removed? I bought my VR headset during the pandemic precisely for indoor escapism.
I want to be transported and immersed into another universe, not see AR stuff floating around between the same four walls of my tiny apartment that I see all day everyday. It would drive me nuts and I can do that stuff on the cheap with my phone/Ipad.
AR should be without the light seal, VR should be with the light seal, simple as that.
Yes, because you are the target demographic for the current crop of headsets that are essentially just expensive niche gaming peripherals. There will always be a market of a couple hundred million people for that. But the other 7 billion people on earth do not want that. They want spatial computing. They want something that fits as seamlessly into their lives as a smartphone, and can be used in public without looking like a weirdo. They'll never even consider VR/AR until that is achieved.
I am still puzzled as to how to implement navigation in WebXR for the AVP.
If you had a true optical passthrough AR headset that is one thing but it is absurd for a $3500 device which is hardware capable of immersive apps to be kneecapped by software and the control scheme.
Meta is able to offer immersive experiences where a much larger world is mapped into your virtual space and you can move around and grab things and use tools through controllers. Apple is offering hardly anything in comparison except for a $3500 replacement for a $350 TV. Or, “boy I just flew in from an AR experience for 15 minutes and boy are my arms tired” or “I am in terror at looking at anything because it might trigger an irreversible action”.
Which is the ideal balance since they are an inconvenience if you have a keyboard/mouse in front of you and are using the Vision Pro as a replacement for multiple monitors. After all not everyone is using it for gaming or entertainment.
Heck, the PS5 VR headset seems to be having trouble selling, and it is very good at what it is designed for. Anyone buying the PSVR2 knows they are getting it to play games. Solo games, at that. Yes, you can do a party mode for Beat Saber and a few other games. No, it isn't much more compelling than any other "group" version of video games. It is very immersive, though.
So, what did the non-gamer community hope for from this? What is there to feed those hopes?
Since it runs most iPad apps, it can be used for some light productivity workflows even without a Mac. You can use a Bluetooth keyboard if you need to type a lot, but the virtual keyboard is surprisingly decent for quick replies.
There are still so many people who have never touched VR who are somehow hyped up and convinced this is somehow more magical or more special than everyone else's headset.
Basically it's just the normal reality distortion effect. Sooooooo many comments of "the original iPhone wasn't perfect either". So?
VR has come a LONG way and it works surprisingly well for anything where you aren't expecting to walk. Gran Turismo could legit work as a teaching tool for driving. Yes, you lose some force feedback from the likes of braking; but a good steering wheel goes a long long way to the rest of the experience. (Heck, reality is the wheel alone is probably good for an immersive experience.)
I still don't get why folks would be so pumped over the rest of the ideas. Especially with how cheap large decently high resolution monitors have become.
Or, what besides VRChat will keep AR/MR/VR/XR/"Spatial Computing" devices from going flatline and folding?
The home repair idea is certainly a neat one. Guessing it will be a while before the content is readily available in large quantities?
It's a shame, as it could have been pretty great.
And I didn't have the original, so I can't compare. I do like the thing, and I use it about as much as I thought I would. Any game where you "walk" is a bit much for me. Any game where you are controlling something that moves, or you have things fly at you is fine. Toss, I think, is an in between that also works really well.
if anyone else is wondering like I was, this was a fun read: https://superuser.com/questions/1205451/how-can-i-display-th...
> In text, don’t write the name Apple Vision Pro by combining the symbol with Vision Pro.
> Correct: Get started with Apple Vision Pro.
> Incorrect: Get started with Vision Pro.
https://help.apple.com/pdf/applestyleguide/en_US/apple-style... (p. 28)
I know that I might be a little biased, as I really organised my workflow around using the iPad, selecting apps that work for me.
The transition to visionOS is much easier then, as most of the apps are simply there in 3D space. I have tried connecting my MacBook, but then you have this weird mix of UX concepts (mouse, keyboard, look+tap); that's not for me.
In my experience I find it comfortable enough that I forget sometimes that I am looking at digital content. Here on X I added a few photos to give you a little extra context of how I work: https://x.com/wlmiddelkoop/status/1769765197948850463?s=20
Just silly. If you cannot easily input data without an extra accessory, then it's just a fail in this case. Imagine building the headset, then hooking up a keyboard, and then sitting next to an outlet because it doesn't hold a charge very long... almost like using a laptop may be more efficient.
Honestly looks like we are still a few years away from cool cyberpunk level VR/AR.
Notes on the experience...
- On the solo loop band I had to turn the knob to loosen the strap a lot to get enough clearance to position my glasses within the device. The dual loop band probably would not work at all (I didn’t try).
- My glasses are just barely narrow enough to fit within my light seal (25W). It's definitely not a guarantee everybody's glasses will fit into their own light seal.
- Once in, the eye tracking was very off, basically unusable. However, I did not do gaze calibration with my glasses on. That would probably improve things, but I suspect the device will always have issues correctly tracking gaze through glasses because normal glasses lenses can distort your eyes more than it’s expecting.
- There's really not much space between your glasses and the inside lenses on the vision pro. It seems like your glasses would start rubbing up against the inside lenses very easily, causing permanent scratches on one or the other.
So technically you can, but I would not recommend it. I’d rather just spend the $150 for the official lenses.
1. There’s very little space for your glasses and you’ll end up scratching both your glasses and the optics
2. You’ll partially obscure eye tracking.
3. Eye tracking will be significantly warped across your glasses. You’ll have drastically reduced quality. This is extra bad if you’re astigmatic.
Prescription inserts or contacts are the correct way to use these kinds of devices for strong reasons.
Doesn't feel like it would be difficult to fix this for video playback; sorta thing I'd expect Apple to solve.
We really need a new Steve Jobs to shake things up. Someone to say “this is stupid, show me again in 15 years”. When they’re actual glasses and cost less than $1000
That being said, video experience, other graphics are a lot better than I expected.
Hopefully someone at Apple reads this and takes it to heart when deciding what to improve for a V2. I might actually buy one if they address some of these problems, despite having been avoiding the hype until a coworker bought one and let us try it.
Passthrough was really disappointing. That part actually kind of shocked me, so many reviews online stating they forgot they were wearing the headset. I can't imagine how, though - it's like looking through drunk goggles. The rendered content was unbelievable though.
I own a Valve Index and the Vision Pro blew it out of the water for pixel density. Mind-blowing.
My take, however, is that yes, at this point visionOS has failed to hit the aspirational goal of launching XR into the mainstream. There's no App Store gold rush; after the initial couple of weeks of hype most people forgot it exists, and I'm truly surprised that Apple seems to have fumbled the ball even on content. That seemed like the easy part to me, but there are no new environments, nothing new to watch, and the apps hitting the store are mostly toys, with really zero things that haven't been done before or fully explored on other devices. Nobody is raving about watching sports on it. Etc., etc.
What I really hoped was that the novel features of the Vision Pro would kick off a new wave of creativity in terms of what is possible. When I saw people pulling 3D objects out of web pages and onto their desk it really seemed like this could be a true start of something.
The one upside I think is that the slightly anemic launch is probably going to bias Apple to be more open and less controlling than they would otherwise. I hope so.
Apple will have an uphill battle until they show experiences that cannot be replicated in other devices, rather than “you can do X here as well, maybe slightly better”.
I held off from ordering for the same reason: bought a Quest, tried it for a week, and it's been sitting on my shelf since last year. Every advertised thing for AVP I can already do on my phone or laptop, without the need to strap an object on and off my face.
I really hope they come up with cool stuff and make it more comfortable to wear, but until then, I don’t think it’ll gain a mainstream adoption.
My unvalidated assumption is that if I can take screenshots of a movie on my Mac, I can also record the full content and duplicate the movie. Thus, this is a copy protection measure.
A better way to think of this might be 'voluntary commercial agreements entered by gigantic corporations'. The results are some mildly annoying misfeatures inflicted on consumers (the 'dystopia' in the GP comment) and wide, cheap, on-demand availability of unfathomable quantities of digital content (the 'utopia' we actually live in).
In my opinion, there are two kinds of people who work in Hollywood:
- People who make movies, who probably don't care if you pirate the movie much because the studio is going to screw them over in their pay anyway. These people want as many folks to see the movie in all its different ways as possible.
- People who run the studios, whose salaries, bonuses, etc, are all attached to investors, who want to see the film returns increase. These people want you to see the movie, but only you, only after you pay, for each viewing, in the time, place, and conditions they set.
Anyway, the one thing that is even more user-hostile is sending app-makers screenshot-was-made events without telling the users. This is a thing on Android.