
I tried to use the Vision Pro for work, and I'm not sure if it was just my eyes or what, but looking at code inside of that thing was just...exhausting. When I took it off, I looked at my regular monitors with a newfound love.

I'd love for this thing to reach its full potential as this 'work from a mountaintop, but really your garage' device, but I feel like until the resolution matches existing monitors (no small task, I know) it's just...not as good for the vast majority of use cases.



I returned mine today after attempting to use it for work for a few hours at a time over the last week. I felt the same eye strain using my Mac as a mirrored display.

Zoom calls were cool, but nobody could take the Persona seriously.

After a few days the eye strain seemed to get worse and worse, until yesterday it gave me such a bad headache that I decided that was enough.


> Zoom calls were cool, but nobody could take the Persona seriously.

This is going to be another of their socially awkward gimmicks, like Memojis, that they will double down on until it inevitably fails.

I really feel like Apple actually just doesn’t feel it, and every time they push their weird geeky ideas onto their users they lose a bit of coolness factor. And if kids decide Apple got too cringe, while someone else manages to use that to spin their momentum (think e.g. Nokia respawning by riding the 90s nostalgia wave), they may actually start to seriously struggle.


> This is going to be another of their socially awkward gimmicks, like Memojis, that they will double down on until it inevitably fails.

I don't necessarily agree. Some have talked about how the weirdness starts to fade after getting into a conversation and focusing on the discussion or collaboration at hand. It seems that once the brain has adjusted a bit, it can start to fill in for the badness somewhat.

The feature clearly has a long way to go before it's good, but I think it's premature to dismiss it. Future iterations will only improve, so if some people are finding some success with it now, that will only grow.


No, I absolutely get that. I still remember how weird it seemed to talk to yourself while wearing a wired headset on the street.

But it’s been over 5 years since Apple started pushing Memojis, they kept at it as recently as last year, and they still have little adoption, as far as I am aware.

Those who watched WWDC remember how cringey those Memoji bits were; I'm specifically referring to that aspect of their being increasingly out of touch.


I dunno. I know people who use their Memojis regularly and seem to enjoy it. I personally do not, but I think I’m not their target user. Not sure what the broader adoption looks like.

But I think these are different enough capabilities that the success or failure of one is not necessarily predictive of the other.


> Not sure what the broader adoption looks like.

Just search Google News for Memojis and see what you get. Hardly any content, and if you navigate 2-3 pages further, you quickly reach 2022 and 2021 articles.

If that doesn't scream "low adoption" then I don't know what does.


As a method of estimating adoption, that seems fairly suspect. It doesn't exactly seem like a capability that would generate much news.

Google Trends seems more likely to be instructive, and as a topic, it has shown fairly steady (if low) interest over time after peaking on release with some periodic spikes most likely correlated with major updates.

But as a feature, I wouldn't expect even Google trends to be very instructive (for understanding adoption), since people who know how to use them aren't likely to be out there searching. More use = more familiarity = less searching. Who knows; maybe they're barely used, but there isn't good data to back up that claim, and there are a number of other ways to interpret the data that does exist.

With all of that said, I'll maintain that I don't see any real connection between them and Personas, or any predictive value in comparing them.


> More use = more familiarity = less searching.

For the record, my point here was: More use = more familiarity = better writing subject for portals and journalism


Uh, of the 100 or so acquaintances I have who use iPhones, way more than half use their Memoji as their public-facing visual representation. Nobody writes news articles about that for the same reason they no longer write news articles about the ability to dictate messages with your voice, or charge your phone without plugging in a cable.


Good thing it's not Memoji.


I’ve wondered the same. Apple has been a “given” for 20 years because they somehow keep shipping great stuff and avoid the Microsoft trap of looking like total dorks by existing in an echo chamber.

Even if they made a small misstep or had an awkward moment in a launch announcement, it was seen as endearing and forgivable.

But there seems to be an increase in moments where Apple comes across as behind the curve, or not as aware of where the public is relative to them, compared to before.


>avoid the Microsoft trap of looking like total dorks by existing in an echo chamber.

Lolwhut?? Microsoft wishes it had a fraction of the echo chamber Apple fans create. It's what Apple is known for.

>Even if they made a small misstep or had an awkward moment in a launch announcement, it was seen as endearing and forgivable.

Uhhhh... "You're holding it wrong" was an absolute unmitigated PR disaster for Apple. It was one of the worst kinks ever in the "reality distortion field". People were rightly pissed. It was smug and stupid, not endearing.


> "You're holding it wrong"

I cringe every time I hear someone say this to malign stupid users. Yes, I hear it at work.

Some people legit only read the headlines about that story, not the articles.


I believe the premise is that Microsoft--the employees and management or whatever: the entity, not the ecosystem or the users--exists in an echo chamber... as in, they keep thinking their users want stuff but their users actually don't.


You're misunderstanding. Apple exists in an echo chamber, Microsoft wishes they did.


I imagine version 2 will likely have uncanny-valley-crossing AI filters to make it indistinguishable from your real face, background and expressions.


> they’re pushing their weird geeky ideas onto their users

This thing is like $5k all-in and even then you have to get on a waiting list. I wouldn’t really say that they’re pushing this on people.


Not talking about the device itself here.


Damn. My family use Memojis heavily!


Don't worry, most people on Hacker News have no real idea how normal everyday people use tech products. This person thinks Memojis are unused, but they just lack perspective.


> how normal everyday people use tech products

What do you mean? Obviously normies all browse the web in emacs and write their own plugins in elisp.


> This person thinks Memojis are unused, but they just lack perspective.

Or maybe you lack perspective, because maybe some people use it in your circle, but otherwise it’s very unpopular?

Observational fallacies work both ways.


100% of my social circle works outside tech, and nobody uses memojis


The world is bigger than just the US of A. Memojis are big among the teen set elsewhere.


If the amount of internet content regarding Memojis is to be a judge, Memojis might as well not exist at all.

From my broad circle in the US and Europe, I know one person who sends one maybe once a quarter.


Wait till you find out where Apple makes >50% of its revenue ;)


Pray tell, which markets should we be talking about?


The point is that Apple trades in part on being fashionable.

Lots of people use Facebook but virtually nobody considers Facebook to be fashionable.

Meta doesn't care if Facebook is fashionable, but the more Apple looks like Meta, the bigger the opening for being disrupted on that front gets.


They’re wildly popular with the kids in my orbit, especially ages 6-12.


What's the alternative? You have goggles strapped to your face. Personas are the only thing you -can- do to have any sense of presence on a call.

It's not perfect. It's not even good. But it's better than nothing.


> coolness factor

Apple hasn't been cool for a long, long time. Everyone has an iPhone, so it's no longer special. It's just a waste of money when you can get basically the same Android phone for half the price.


By definition it isn't a gimmick.

You either have some rendered 3D model or you take the headset off when making a call.

Physics dictates that those are your only two choices.


I've never heard of memojis before...


Same, and now I feel old again : - \


We're so close to having AI just re-render your face in the videos.


> After a few days the eye strain seemed to get worse and worse, until yesterday it gave me such a bad headache that I decided that was enough.

Curious if you've ever gotten your vision tested?

I don't need glasses in everyday life, but I did go to an optometrist and got a pair anyway after I got annoyed one night that a friend could read a faraway sign and I couldn't quite. They make things a little sharper, but not so much that it ever makes a difference for anything I actually need.

But then I discovered that if I wear them, zero eye strain in VR. Without them my eyes hurt after 20 minutes. With them, I can use VR for hours, zero problem.

No idea why. And I can't seem to find much information on it, but I asked my optometrist and they said it's a whole thing -- people who wear glasses sometimes not to see better, but to reduce eye strain and headaches.


Other vision issues can be relevant too. I have a very slight lazy eye - I can still see 3D video but my stereo vision is worse than average. I suspect it affects the eye tracking because for me it feels a little tedious and imperfect, but others don’t seem to feel that way.


You may be able to help this with toric contacts. The inserts don't do prism correction so can't help with that specifically.


Interesting. I'm also nearsighted, so I've always assumed that I don't need glasses when wearing VR headsets. It's easier to just take them off before putting on the headset (MQ3) and I've not noticed a difference in clarity — but I do experience eye strain and visual exhaustion if I wear the headset for too long, so it might be worth comparing longer sessions with and without glasses.


VR headsets work like you're focusing at least a few feet away, so if you're nearsighted you need vision correction.


The problem with the Vision Pro is that you can't wear glasses to use it, and Zeiss doesn't make all prescriptions available as inserts either.


You can wear contacts for VR as long as they're not colored. (Well, if there's no eye tracking even that's fine.)


With my Xreal airs, something is just always slightly off with focus. It's fine for movies and games, but it sucks for reading text.


Could be the motion blur. The Vision Pro is in this really weird spot: insanely good visuals but really bad motion blur. Generally a little motion blur is OK, but the better your visuals get, the worse and more apparent motion blur becomes.

Personally, I found the AVP most draining when I was moving my head around a lot and experiencing this motion blur. If I was just looking at the screen in front of me, I got fatigued less.


I had the same exact experience.

It felt more like I was playing a simulator game of myself coding, and it was not enjoyable at all.

I often wonder how people who actually work at fast paced places as engineers are claiming to be more productive with this strapped on.

If I just ingested and read email and slack messages all day, or talked in zoom all day, sure (maybe), but I don’t.


I’m not extremely productive, but I have 2 use-cases I’m excited about.

1. Having more screen real estate in my small home office. I often dive into spaghetti code, and seeing more of it helps me maintain context.

2. Working in my RV. I can’t take my extra screens with me (it’s a multi-use family RV, so I’m not mounting anything. Plus, there isn’t room.), and I’m so, so excited about having more screen real estate in there. We lived/worked in it for 3 months last summer, and it was really nice coming home to more screens at the end of the trip.


Similarly, I'm intrigued by the possibility of using it for coding whilst aboard a sailboat, where having multiple large physical screens isn't a possibility due to limited space and lack of suitable mounting surfaces.

Very curious to see how tolerable the UX is in an environment that's almost always experiencing some degree of motion independent of the user.


Seems like a recipe for motion sickness.



> Similarly, I'm intrigued by the possibility of using it for coding whilst aboard a sailboat, where having multiple large physical screens isn't a possibility due to limited space and lack of suitable mounting surfaces.

It’s amazing to me what people will do to avoid being present in the moment. You’re on a sailboat, enjoy it, smell the ocean, feel the water on your face, breathe deeply and take it all in. If you’re just going to strap goggles on your face to blind you to the beauty around you, then why did you even get a sailboat to begin with?


I don’t have unlimited PTO, but it sure is nice to shut my laptop at 5 and be somewhere beautiful. I don’t have that luxury at home and it’s a huge perk with my current job.

When I’m on PTO, my laptop doesn’t go with me. I absolutely use all my PTO every year.


Sounds like you have the right idea! Are you permanently mobile?


As permanently mobile as a wife and young kid allow. We spent 3 months on the road 2 summers ago and had a great time!

I expect our radius to get smaller as our child gets older and goes to school, but we are very passionate about being outdoors as much as possible.

For example - I helped build a mutually accessible structure in a national forest, and my favorite work days are out there. We installed solar last summer, so it’s workable year round now.


In this model, the work pays for the cruising. There's plenty of time to enjoy the elements, local sights, boat maintenance, what have you, after taking care of business.

Besides the ocean and elements, there's also a certain beauty in using the latest technology to operate a SaaS company from almost anywhere on the planet.


Life on a sailboat is mostly pretty boring.


Same. I returned it the other night because I finally just gave up on trying to make it comfortable. My body was just rejecting it. I could make it 20-30 minutes.

It's a bunch of factors. Heavy, lots of pressure on a few specific points, the dangling cable to a slippery battery which I had to leave plugged in all day, the grainy passthrough... But most of all it's just too heavy. And connection would sometimes get janky between Mac and Vision Pro.

I've got high hopes for generation 2 and 3 but it needs time to cook.


When I haven't worn contacts for a while (just normal glasses), wearing them throughout the day also makes me a bit "tired", even though the "resolution" is basically the same. But after wearing them for a few consecutive days, it becomes the same as glasses.

I wonder if there's something similar going on here. Since we rely so heavily on vision for everything (especially balance), any difference from our normal perception will cause "exhaustion" of sorts. But maybe wearing it continuously for days could cause our bodies to adapt to it? (Which is obviously impractical with the AVP.)


That might have to do with the eye "enlarging" a little to make space for the lenses, or maybe more likely with you learning to "lubricate" the lenses with tearing.

(Pure conjecture; I last used lenses fifteen or twenty years ago.)


I haven't tried them, but I imagine that despite the high resolution, text that is badly aligned (and probably constantly imperceptibly wobbling) strains the eyes.

The software should probably force text to be aligned on whole real pixels, even if that detracts a little from the realism.

And the best would probably be to keep virtual screens completely fixed until you move by a certain, largish degree (as an option).

Then again, maybe this has nothing to do with the straining.


I've watched a few reviews where the claim is "multiple 4k monitors", and even on Apple's site it says "More pixels than a 4K TV. For each eye." But any virtual monitor is going to be scaled down, and with "spatial computing" being the desired interaction, it's not going to be projected at a fixed point on the embedded screens.

Sure, when you have a 4k monitor across the room, it's smaller because it's further away, but the full resolution is there (reality is much higher fidelity than "retina displays" ever were, and than even the Vision Pro is). When a virtual display is projected into a space further away, it's going to take up fewer pixels and be down-sampled. It's kind of annoying that "4k" is being used to refer to "physical space the display takes up" or "size reported to the operating system" rather than actual pixel density.


While the resolution is high, the PPD (pixels per degree of vision) is comparatively low. It's lower density than a classic monitor, and nowhere close to a modern high-density 4k or retina display.

Also, your eyes can't really focus the same way: anything within about 12 ft causes you to struggle to focus, leading to eye strain. This is an unfortunate reality of the lenses.
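For intuition, here's a rough back-of-envelope PPD comparison. The per-eye horizontal resolution (~3660 px) and ~100° FOV figures are unofficial estimates (Apple doesn't publish an official FOV), and the monitor numbers are just illustrative:

```python
import math

def ppd(h_pixels, h_fov_deg):
    """Average pixels per degree across a headset's horizontal FOV."""
    return h_pixels / h_fov_deg

def monitor_ppd(h_pixels, width_cm, distance_cm):
    """PPD of a flat monitor viewed head-on from a given distance."""
    fov = 2 * math.degrees(math.atan((width_cm / 2) / distance_cm))
    return h_pixels / fov

# Assumed figures, not official specs:
# Vision Pro: ~3660 horizontal pixels per eye over a ~100 degree FOV.
print(round(ppd(3660, 100)))             # ~37 PPD
# 27" 4K monitor (~60 cm wide panel) viewed from ~60 cm away.
print(round(monitor_ppd(3840, 60, 60)))  # ~72 PPD
```

So even under generous assumptions, a desk monitor delivers roughly double the angular density of the headset.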


> The software should probably force text to be aligned on whole real pixels, even if that detracts a little from the realism.

I think that'd be difficult given that pixels are effectively "non-rectangular" given the warping from the lenses.


Well they might be non-rectangular, but text aligned on their borders should still be sharper than misaligned text, no?


High DPI monitor text effectively can't be misaligned, especially since Apple's text rendering always dilated characters instead of trying to fit them onto pixel grids.


That's an interesting point. You potentially also have alignment issues with the window being placed spatially, i.e., rendering text that is not perpendicular to the plane of the screens. When moving my head in the AVP I'm moving the screens, unlike when I move my head to look at a different part of a monitor.


Yeah with slanted screens it might well be best to not align anything, at least beyond certain angles


Text in natively rendered apps is perspective corrected before rendering and incredibly sharp as a result. It’s been mentioned in a few interviews in passing.

Text in streamed displays from a Mac may suffer from pixel misalignment.


> aligned on whole real pixels

Remember that you're working with two screens, not one, and they have to have coordinated projections that will also depend on the user's IPD.



Well ok, but is it a problem to have the text aligned on both screens, if the user is fine with some "stuttering" when moving or with keeping the virtual screen fixed?


It's a problem because a given piece of text isn't going to match the same pixel boundaries on both screens.


Windows aren’t perpendicular to your view, though. They shear because of perspective.


Yes I was mostly considering exactly perpendicular windows; it might be beneficial to let go of some realism and perspective to work more comfortably


I couldn’t stand the constant glare when looking at code. Everything just felt hazy, and combined with the awful passthrough it was a downgrade. I was more productive at first since I could block the world out, but it didn’t last.


Who knows what the long-term consequences of using these devices will be for the eyes. Better to be wary.


There shouldn't be any as long as you're not a child. Children need to spend a lot of time outside to avoid developing myopia.


VR headsets’ resolution is still a couple orders of magnitude away from being indistinguishable from normal vision, and that doesn’t even account for motion.


Eye strain has been a huge issue for me with earlier VR goggles.

Some people seem to be suggesting the higher quality of the Vision Pro overcomes this, but I'm starting to really wonder. Would even more resolution actually solve it? Or maybe there's really no way around the discomfort for some of us.


When you move your head around in the Vision Pro, do the windows stay where you put them, or do they follow your head? If they stay put, is this perfect, or do they jiggle a little bit?


They really seem to be glued to the real world, not moving when I move.


At least one less reason for headaches.

My NReal Airs are nice glasses, but the fact that I can’t look away from the huge-ass screen is annoying af. Eye movements can only get you so far, and the intuitive head movement does not work. Argh!


Were you using the macOS virtual display? Using the passthrough to look at monitors is passable but not suitable for even a few minutes of use.


Not the person you responded to, but I tried both. I never seriously thought the passthrough would be good enough, and it wasn't, but it was just barely legible and useful while I was pairing my AVP to my MBP. I really tried to like MVD, though, and I just couldn't do it. It wasn't clear enough, felt like an added "tax" on my mind, and I felt very limited compared to using my external monitors.


Is the virtual display feature as it's presented now likely to be a stopgap or fallback? A bit like emulation or Rosetta apps when Apple silicon was new, or running iPhone apps on an iPad. Those were things that seemed core when each was first introduced and then quickly disappeared for most people in most cases.

I wonder if it could largely be replaced by native AVP apps or a better way for them to send out data from the Mac to the headset once there is broader software support?


> not suitable for even a few minutes of use

Because of lag, or why?

Many people work on remote desktops all day long, and I spend my fair share of time in SSH sessions as well. It doesn't improve the experience compared to working locally, but for me it works fine as long as the server is within a few hundred kilometers and there isn't much jitter. On the VR side, the screen should still move correctly as you move your head, because head position isn't part of what's being streamed, so that can't be the difference either. I don't understand (without having a device myself) how or whether this is worse than normal video streaming over a LAN?


If I'm not mistaken, judge2020 meant that what's not suitable is looking through your AVP at your real monitor. The virtual display (glorified VNC or whatever) that you're saying is OK, I think they would agree is also OK.

And I'd assume that yes, it would be torture to try to view a monitor through the AVP, just due to the resolution loss. It would be like poorly downscaling the 4k/5k resolution of your 27" monitor to something like 1366x768, but much worse, since the pixels don't even stay lined up on a regular grid; they get resampled at slightly diagonal angles as your head moves even a couple of degrees. I'm pretty sure setting up two more big $1000 monitors left and right would be better than "center monitor + AVP with virtual apps left and right" (and it would save about $2000 lol).
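That downscaling intuition can be sketched as a back-of-envelope: multiply the monitor's angular width by the headset's pixels per degree to get the headset pixels available to render it. The ~35 PPD headset figure here is an unofficial estimate, and the monitor dimensions are just illustrative:

```python
import math

def effective_pixels(monitor_width_cm, distance_cm, headset_ppd):
    """Horizontal headset pixels covering a real monitor seen through
    passthrough at the given viewing distance (best case, ignoring
    resampling losses from head motion)."""
    fov = 2 * math.degrees(math.atan((monitor_width_cm / 2) / distance_cm))
    return round(fov * headset_ppd)

# 27" monitor (~60 cm wide panel) from 60 cm, headset at an assumed ~35 PPD:
print(effective_pixels(60, 60, 35))  # ~1860 horizontal pixels at best
```

So a 3840-pixel-wide panel gets squeezed into under 2000 headset pixels before any resampling blur, which is roughly the half-resolution picture described above.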


Eh?

For coding you don't need a $2k monitor setup; you can get by comfortably on $300-400.

It's an interesting argument but they are going to need to try a lot harder for it to be a compelling desktop replacement.


I agree with you completely myself -- I bought a 4k 27" HP Envy monitor in the $400ish range about 5 years ago, and that plus the 16" laptop on a double arm setup is a great setup for me. But I definitely 2.5x'd the numbers for the sake of the people I've heard of (certain deeply observant Apple devotees) who wax poetic about how using anything but a perfect multiple pixel ratio is painful to their eyes (they believe a 27" has to be 5k and don't believe in using a scaled screen resolution). Although tbh, I should have said "$1600 monitors" in that case, as there is nearly zero competition in that resolution+size combo, so Apple's hilariously priced one is the only option.


You misunderstood: the video passthrough of your surroundings is not good enough. Using the macOS virtual display is fine; there is some noticeable streaming latency, maybe 30ms (which would be solved if it could just take DP in over USB-C or Thunderbolt), but it's suitable for long-term use.


Not just providing a DisplayPort input of some sort seems like such a mistake, given that there's already the battery cord/pack and that the virtual display latency is so bad.


I think it's a PWM issue.


Since I didn’t know the acronym.. “Pulse Width Modulation”:

https://appleinsider.com/inside/apple-vision-pro/tips/why-ap...



