>>119330
For once, I really want to go to an Apple Store and try one out. Generally speaking, though, the reviews I've seen have been pretty underwhelming. Apple has pushed this as a platform for "spatial computing," but they haven't put any particular focus into software that would actually make that a reality. At the moment, the headset has the typical apps you would find on a new smartphone: photos, settings, messages, a browser, music... Nothing that would really draw anyone into using it. Even what should be a major selling point, being able to connect it to a MacBook, seems underwhelming. You can spawn any number of other app windows, but you're limited to a single screen from an actual computer. What apps do seem available mostly come from cross-compatibility with iPad apps. Obviously a $3,500-4,000 headset isn't particularly focused on gaming, but from what I've seen there aren't really any games available at the moment either.
The biggest flaw I see is that the input method of using your eyes to select UI items is fairly inconvenient. Sitting in front of a normal PC, you can look around freely while you type and click on things, but on the Vision Pro, to select anything you have to be looking directly at what you want to select. This is why every other sensible VR headset on the market ships with controllers. Or, as is the case on Oculus/Meta headsets, you can emulate controllers with your hands and either physically interact with windows by tapping them or use a virtual pointer with hand gestures.
The other thing I've noticed is that much less care has gone into actually using applications and organizing them within your view. On the Quest, for example, multitasking windows spawn attached to the main taskbar. From that taskbar you can launch other VR applications, or close windows by tapping the X in the corner. On the Apple headset there's no equivalent taskbar, and windows can be placed anywhere. That naturally allows much more freedom, but its unstructured nature means your immediate vicinity ends up cluttered with windows on top of windows, with no real management to speak of.
This is honestly the weirdest thing about the Apple Vision Pro to me. A considerable amount of time clearly went into the styling of the UI, but little effort seems to have been put into thinking about how people will actually use it. It feels like they expect users to only ever open a single window, and when they're done with that application, to close it and return to the app drawer. You can see the level of attention that was put in from the realistic shadows that windows cast onto the real, physical environment around you. Shadows respect depth rather than being a flat overlay, and a lot of care went into masking out the user's hands and arms so they can interact with menus and the UI. But then you see other things: in "immersive mode," for example, objects in your immediate surroundings are not highlighted or masked out the way your hands and arms are. This has become a fairly standard feature on Quest headsets, which highlight the contours of nearby objects to keep the user from bumping into furniture or stepping on their cat, but it's seemingly nowhere to be found on the Apple Vision Pro. At most, the Vision Pro seems to decrease the opacity of your virtual surroundings to show you the real world, which frankly looks kind of amateurish. When the headset can seamlessly scan your surroundings, with the LiDAR sensor mapping out the physical contours of the room so the user never has to set up a play space, it seems like a massive oversight to do nothing with that data beyond tracking the user's hands and arms and generating an AR persona by scanning the user's face outside the headset.
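For what it's worth, that room data really does seem to be sitting right there, even for third-party developers, let alone for Apple. Here's a rough Swift sketch, based on the public visionOS ARKit API, of how an app can subscribe to the reconstructed room mesh via SceneReconstructionProvider; the function name and the logging are mine for illustration, and it assumes the app is running in an immersive space with world-sensing permission granted:

```swift
import ARKit

// A minimal sketch (not Apple's own code) showing that visionOS already hands
// apps the LiDAR-built room mesh through ARKit's SceneReconstructionProvider.
// Must run inside an immersive space; rendering and error recovery are omitted.
func watchRoomMesh() async {
    guard SceneReconstructionProvider.isSupported else { return }

    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()

    // Request world-sensing authorization before starting the provider.
    _ = await session.requestAuthorization(for: [.worldSensing])

    do {
        try await session.run([sceneReconstruction])
    } catch {
        print("Failed to start scene reconstruction: \(error)")
        return
    }

    // Each MeshAnchor carries a chunk of reconstructed room geometry
    // (mesh.geometry holds the vertex and face buffers), which in principle
    // is exactly the data a Guardian-style obstacle warning would need.
    for await update in sceneReconstruction.anchorUpdates {
        let mesh = update.anchor
        print("Room mesh \(mesh.id): \(update.event)")
    }
}
```

If an app can get at the mesh that easily, a built-in boundary or obstacle warning hardly seems like it was out of reach for Apple themselves.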
The more I think about the Apple Vision Pro, the more the operating system itself seems almost embarrassing and unlike Apple, despite phenomenal specs that far surpass anything else on the market at the moment. That may in part be due to the headset's long development time, and its tenuous status during that time; I recall one anecdote from an article written before the headset was announced, describing how the project leads had to personally go before executive-level management to vouch for the project. On the other hand, perhaps it's not surprising. Over the last decade Apple has become far more focused on their mobile products, iPhone and iPad, their golden children, while their Mac products languish and increasingly feel like an afterthought. Although their initial ARM M1 processor was heralded as revolutionary, how much has actually been said about MacBooks, iMacs, and Mac Minis since? According to some metrics I can find online, Apple made up as much as ~13% of all PC shipments in Q3 2023, but that feels incredibly inflated. In the early 2010s, Apple's OS X devices felt far more prevalent than they do now. Furthermore, most people are not out buying a PC with any regularity. The Steam hardware survey seems far more grounded, and the story it tells is that Apple devices make up less than 1.5% of devices surveyed. That's even less than Linux, which is near 2%! It's really no wonder, then, that there's iPad app compatibility and not OS X application compatibility, but at the same time, everything one would likely need or want to do productivity-wise is something that would have to be done on an OS X device, which is what makes this feel so short-sighted.