I tried to use the Vision Pro for work, and I'm not sure if it was just my eyes or what, but looking at code inside of that thing was just...exhausting. When I took it off, I looked at my regular monitors with a newfound love.
I'd love for this thing to reach its full potential as this 'work from a mountaintop, but really your garage' device, but I feel like until the resolution matches existing monitors (no small task, I know) it's just...not as good for the vast majority of use cases.
I returned mine today after attempting to use it for work for a few hours at a time over the last week. I felt the same eye strain with my Mac as a mirrored display.
Zoom calls were cool, but nobody could take the Persona seriously.
After a few days the eye strain seemed to get worse and worse, until yesterday it gave me such a bad headache that I decided that was enough.
> Zoom calls were cool, but nobody could take the Persona seriously.
This is going to be another of their socially awkward gimmicks, like Memoji, that they'll double down on until it inevitably fails.
I really feel like Apple just doesn't get it, and every time they push their weird geeky ideas onto their users they lose a bit of coolness factor. And if kids decide Apple got too cringe, while someone else manages to spin that into momentum (think e.g. Nokia respawning by riding the 90s nostalgia wave), they may actually start to seriously struggle.
> This is going to be another of their socially awkward gimmicks, like Memoji, that they'll double down on until it inevitably fails.
I don't necessarily agree. Some have talked about how the weirdness starts to fade after getting into a conversation and focusing on the discussion or collaboration at hand. It seems that once the brain has adjusted a bit, it can start to fill in for the badness somewhat.
The feature clearly has a long way to go before it's good, but I think it's premature to dismiss it. Future iterations will only improve, so if some people are finding some success with it now, that will only grow.
No, I absolutely get that. I still remember when talking to yourself on the street while wearing a wired headset seemed weird.
But it’s been over 5 years since Apple started pushing Memoji, they were still pushing them as recently as last year, and as far as I'm aware adoption is still low.
Those who watched WWDC will remember how cringey those Memoji bits were; that's the aspect of their being increasingly out of touch I'm specifically referring to.
I dunno. I know people who use their Memojis regularly and seem to enjoy it. I personally do not, but I think I’m not their target user. Not sure what the broader adoption looks like.
But I think these are different enough capabilities that the success or failure of one is not necessarily predictive of the other.
As a method of estimating adoption, that seems fairly suspect. It doesn't exactly seem like a capability that would generate much news.
Google Trends seems more likely to be instructive, and as a topic, it has shown fairly steady (if low) interest over time after peaking on release with some periodic spikes most likely correlated with major updates.
But as a feature, I wouldn't expect even Google trends to be very instructive (for understanding adoption), since people who know how to use them aren't likely to be out there searching. More use = more familiarity = less searching. Who knows; maybe they're barely used, but there isn't good data to back up that claim, and there are a number of other ways to interpret the data that does exist.
With all of that said, I'll maintain that I don't see any real connection between them and Personas, or any predictive value in comparing them.
Uh, of the 100 or so acquaintances I have who use iphones, way more than half of them use their memoji for their public-facing visual representation. Nobody writes news articles about that for the same reason they no longer write news articles about the ability to use your voice to dictate messages, or charge your phone without plugging in a cable.
I’ve wondered the same. Apple has been a “given” for 20 years because they somehow keep shipping great stuff and avoid the Microsoft trap of looking like total dorks by existing in an echo chamber.
Even if they made a small misstep or had an awkward moment in a launch announcement, it was seen as endearing and forgivable.
But there seems to be an increase in moments where Apple comes across as behind the curve, or less aware of where the public stands relative to them, compared to back then.
>avoid the Microsoft trap of looking like total dorks by existing in an echo chamber.
Lolwhut?? Microsoft wishes it had a fraction of the echo chamber Apple fans create. It's what Apple is known for.
>Even if they made a small misstep or had an awkward moment in a launch announcement, it was seen as endearing and forgivable.
Uhhhh... "You're holding it wrong" was an absolute unmitigated PR disaster for Apple. It was one of the worst kinks ever in the "reality distortion field". People were rightly pissed. It was smug and stupid, not endearing.
I believe the premise is that Microsoft--the employees and management or whatever: the entity, not the ecosystem or the users--exists in an echo chamber... as in, they keep thinking their users want stuff but their users actually don't.
Don't worry, most people on Hacker News have no real idea how normal everyday people use tech products. This person thinks Memojis are unused, but they just lack perspective.
Apple hasn't been cool for a long, long time. Everyone has an iPhone, so it's no longer special. It's just a waste of money when you can get basically the same Android phone for half the price.
> After a few days the eye strain seemed to get worse and worse, until yesterday it give me such a bad headache I decided that was enough.
Curious if you've ever gotten your vision tested?
I don't need glasses in everyday life, but I did go to an optometrist and got a pair anyway after getting annoyed one night that a friend could read a faraway sign and I couldn't quite. They make things a little bit sharper, but not so much that it ever makes a difference for anything I actually need.
But then I discovered that if I wear them, zero eye strain in VR. Without them my eyes hurt after 20 minutes. With them, I can use VR for hours, zero problem.
No idea why. I can't seem to find much information on it, but I asked my optometrist and they said it's a whole thing -- some people wear glasses not to see better, but to reduce eye strain and headaches.
Other vision issues can be relevant too. I have a very slight lazy eye - I can still see 3D video but my stereo vision is worse than average. I suspect it affects the eye tracking because for me it feels a little tedious and imperfect, but others don’t seem to feel that way.
Interesting. I'm also nearsighted, so I've always assumed that I don't need glasses when wearing VR headsets. It's easier to just take them off before putting on the headset (MQ3) and I've not noticed a difference in clarity — but I do experience eye strain and visual exhaustion if I wear the headset for too long, so it might be worth comparing longer sessions with and without glasses.
Could be the motion blur. Vision Pro is in this really weird cross section of insanely good visuals but really bad motion blur. Generally a little motion blur is OK, but the better your visuals get, the worse and more apparent motion blur can be.
Personally I found AVP to be most draining if I'm moving my head around a lot and experiencing this motion blur. If I'm just looking at the screen in front of me, I get fatigued less
I’m not extremely productive, but I have 2 use-cases I’m excited about.
1. Having more screen real estate in my small home office. I often dive into spaghetti code, and seeing more of it helps me maintain context.
2. Working in my RV. I can’t take my extra screens with me (it’s a multi-use family RV, so I’m not mounting anything. Plus, there isn’t room.), and I’m so, so excited about having more screen real estate in there. We lived/worked in it for 3 months last summer, and it was really nice coming home to more screens at the end of the trip.
Similarly, I'm intrigued by the possibility of using it for coding whilst aboard a sailboat, where having multiple large physical screens isn't a possibility due to limited space and lack of suitable mounting surfaces.
Very curious to see how tolerable the UX is in an environment that's almost always experiencing some degree of motion independent of the user.
> Similarly, I'm intrigued by the possibility of using it for coding whilst aboard a sailboat, where having multiple large physical screens isn't a possibility due to limited space and lack of suitable mounting surfaces.
It’s amazing to me what people will do to avoid being present in the moment. You’re on a sailboat, enjoy it, smell the ocean, feel the water on your face, breathe deeply and take it all in. If you’re just going to strap goggles on your face to blind you to the beauty around you, then why did you even get a sailboat to begin with?
I don’t have unlimited PTO, but it sure is nice to shut my laptop at 5 and be somewhere beautiful. I don’t have that luxury at home and it’s a huge perk with my current job.
When I’m on PTO, my laptop doesn’t go with me. I absolutely use all my PTO every year.
As permanently mobile as a wife and young kid allow. We spent 3 months on the road 2 summers ago and had a great time!
I expect our radius to get smaller as our child gets older and goes to school, but we are very passionate about being outdoors as much as possible.
For example - I helped build a mutually accessible structure in a national forest, and my favorite work days are out there. We installed solar last summer, so it’s workable year round now.
In this model, the work pays for the cruising. There's plenty of time to enjoy the elements, local sights, boat maintenance, what have you, after taking care of business.
Besides the ocean and the elements, there's also a certain beauty in using the latest technology to operate a SaaS company from almost anywhere on the planet.
Same. I returned it the other night because I finally just gave up on trying to make it comfortable. My body was just rejecting it. I could make it 20-30 minutes.
It's a bunch of factors. Heavy, lots of pressure on a few specific points, the dangling cable to a slippery battery which I had to leave plugged in all day, the grainy passthrough... But most of all it's just too heavy. And connection would sometimes get janky between Mac and Vision Pro.
I've got high hopes for generation 2 and 3 but it needs time to cook.
When I haven't worn contacts for a while (just normal glasses), wearing them throughout the day also makes me a bit "tired", even though the "resolution" is basically the same. But after wearing them for a few days in a row, they start to feel the same as glasses.
I wonder if something similar is going on here. Since we rely so heavily on vision for everything (especially balance), any difference from our normal perception will cause "exhaustion" of sorts. But maybe wearing it continuously for days could cause our bodies to adapt to it? (Which is obviously impossible with an AVP.)
That might have to do with the eye "enlarging" a little to make space for the lenses, or maybe more likely with you learning to "lubricate" the lenses with tearing.
(Pure conjecture; I last used lenses fifteen or twenty years ago.)
I haven't tried them, but I imagine that despite the high resolution, text that is badly aligned (and probably constantly imperceptibly wobbling) strains the eyes.
The software should probably force text to be aligned on whole real pixels, even if that detracts a little from the realism.
And the best would probably be to keep virtual screens completely fixed until you move by a certain, largish degree (as an option).
Then again, maybe this has nothing to do with the straining.
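A purely illustrative sketch of that "stay fixed until you move by a largish degree" option, reduced to a yaw-only hysteresis (the threshold value and the function itself are made up, not anything visionOS actually exposes):

```python
def update_anchor(anchor_yaw: float, head_yaw: float,
                  threshold_deg: float = 15.0) -> float:
    """Dead-zone hysteresis: the virtual screen's anchor stays fixed for
    small head movements and only re-centers once the head has turned
    past the threshold. This trades a visible 'snap' for rock-steady
    text in between snaps."""
    if abs(head_yaw - anchor_yaw) > threshold_deg:
        return head_yaw   # big movement: snap the screen to the new direction
    return anchor_yaw     # small movement: screen stays put, text stays aligned

# Small head motions leave the anchor untouched; a large one moves it.
anchor = 0.0
anchor = update_anchor(anchor, 5.0)   # within dead zone, anchor stays 0.0
anchor = update_anchor(anchor, 20.0)  # past threshold, anchor becomes 20.0
```

The same idea generalizes to pitch and position; the point is just that text only gets resampled at snap moments instead of continuously.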
I've watched a few reviews where the claim is "multiple 4k monitors", and even on Apple's site it says "More pixels than a 4K TV. For each eye.". But any virtual monitor is going to be scaled down, and with "spatial computing" being the desired interaction, it's not going to be projected at a fixed point on the embedded screens. Sure, when you have a 4k monitor across the room, it's smaller because it's further away, but the full resolution is there (reality is much higher fidelity than "retina display" ever was, and than even the Vision Pro is). When a virtual display is projected into a space further away, it's going to take up fewer pixels and be down-sampled. It's kind of annoying that the term "4k" is being used to refer to "physical space the display takes up" or "size reported to the operating system" rather than the physical pixel density.
While the resolution is high, the PPD is very low (pixels per degree of vision). It's lower density than a classic monitor, and nowhere close to a modern high density 4k or retina display.
Also, your eyes can't really focus the same way: anything within about 12 ft causes them to struggle to focus, leading to eye strain. This is an unfortunate reality of the lenses.
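As a back-of-the-envelope sanity check on the PPD point (the per-eye pixel count and field of view below are rough public estimates, not official Apple figures, and I'm assuming a typical desk viewing distance):

```python
import math

def ppd(h_pixels: float, fov_deg: float) -> float:
    """Rough pixels-per-degree: horizontal pixels spread over horizontal FOV."""
    return h_pixels / fov_deg

def monitor_fov_deg(width_in: float, distance_in: float) -> float:
    """Horizontal angle a flat monitor subtends at the viewer's eye."""
    return math.degrees(2 * math.atan((width_in / 2) / distance_in))

# Headset: assume ~3660 horizontal pixels per eye over a ~100 degree FOV
headset_ppd = ppd(3660, 100)                 # roughly mid-30s PPD

# 27" 4K monitor (~23.5" wide panel) viewed from 24 inches away
monitor_ppd = ppd(3840, monitor_fov_deg(23.5, 24))  # roughly 70+ PPD
```

Under those assumptions the monitor ends up around double the headset's angular density, which is why "more pixels than a 4K TV per eye" and "as sharp as a 4K monitor" are very different claims.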
High-DPI monitor text effectively can't be misaligned, especially since Apple's text rendering has always dilated characters instead of trying to fit them onto pixel grids.
That's an interesting point. You potentially also have alignment issues with the window being placed spatially, i.e., rendering text that is not perpendicular to the plane of the screens. When moving my head in the AVP I'm moving the screens, unlike when I move my head to look at a different part of a monitor.
Text in natively rendered apps is perspective corrected before rendering and incredibly sharp as a result. It’s been mentioned in a few interviews in passing.
Text in streamed displays from a Mac may suffer from pixel misalignment.
Well ok, but is it a problem to have the text aligned on both screens, if the user is fine with some "stuttering" when moving or with keeping the virtual screen fixed?
I couldn’t stand the constant glare when looking at code. Everything just felt hazy, and combined with the awful passthrough it was a downgrade. I was more productive at first since I could block the world out, but it didn't last.
VR headsets’ resolution is still a couple orders of magnitude away from being indistinguishable from normal vision, and that doesn’t even account for motion.
Eye strain has been a huge issue for me with earlier VR goggles.
Some people seem to be suggesting the higher quality of the Vision Pro overcomes this, but I'm starting to really wonder. Would even more resolution actually solve it? Or maybe there's really no way around the discomfort for some of us.
When you move your head around in Vision Pro, do the windows stay where you put them, or do they follow your head? If they stay put, is that perfect, or do they jitter a little bit?
My NReal Air are nice glasses, but the fact that I can’t look away from the huge ass screen is annoying af. Eye movements can only get you so far, and the intuitive head movement does not work. Argh!
Not who you responded to, but I tried both. I mean, I never seriously thought the passthrough would be good enough, and it wasn't, but it was just barely legible and useful when I was trying to pair my AVP with my MBP. But I really tried to like MVD and I just couldn't do it. It wasn't clear enough, it felt like an added "tax" on my mind, and I felt very limited compared to using my external monitors.
Is the virtual display feature as it's presented now likely to be a stopgap or fallback? A bit like emulation or Rosetta apps when Apple silicon was new, or running iPhone apps on an iPad. Those were things that seemed core when each was first introduced and then quickly disappeared for most people in most cases.
I wonder if it could largely be replaced by native AVP apps or a better way for them to send out data from the Mac to the headset once there is broader software support?
Many people work on remote desktops all day long, and I spend my fair share of time in SSH sessions as well. It's not like it improves the experience compared to working locally, but for me it works fine as long as you're within a few hundred kilometers without much jitter. On the VR side, the screen has to move as you move your head, since head position isn't part of what's being streamed, so that can't be the difference either. I don't understand (without having a device myself) how or whether this is worse than normal video streaming over LAN.
If I'm not mistaken, judge2020 meant that what's not suitable is looking through your AVP at your real monitor. The virtual display (glorified VNC or whatever) that you're describing as OK, I think they agree, is also OK.
And I'd assume that yes, it would be torture to try to view a monitor through the AVP just due to the resolution loss. It would be like poorly downscaling the 4k/5k resolution of your 27" monitor to like 1366x768, but much worse, since the pixels aren't even staying lined up on a level grid; they get resampled at slightly diagonal angles as your head moves even a couple of degrees. I am pretty sure setting up 2 more big $1000 monitors left and right would be better than "center monitor + AVP with virtual apps left and right" (and it would save about $2000 lol).
I agree with you completely myself -- I bought a 4k 27" HP Envy monitor in the $400ish range about 5 years ago, and that plus a 16" laptop on a double arm setup is great for me. But I deliberately 2.5x'd the numbers for the sake of the people I've heard of (certain deeply observant Apple devotees) who wax poetic about how using anything but a perfect multiple pixel ratio is painful to their eyes (they believe a 27" has to be 5k and don't believe in using a scaled screen resolution). Although tbh, I should have said "$1600 monitors" in that case, as there is nearly zero competition in that resolution+size combo, so Apple's hilarious one is the only option.
You misunderstood, the video passthrough of your surroundings is not good enough. Using the macOS virtual display is fine, there is some noticeable streaming latency, maybe 30ms (that would be solved if it could just take DP over USB-C or Thunderbolt in) but it's suitable for long term use.
Not including a plain DisplayPort input of some sort seems like such a mistake, given that there's already the battery cord/pack and that the virtual display latency is so bad.