I think my eye strain is a software issue. And other issues

If the lenses are easily removable in 5K/8K units, it seems that it wouldn’t be impossible for Pimax to implement some basic focus control by moving the lenses in and out.
This may affect the field of view a little, but it seems a small price to pay to sell units to people like me who normally need to wear glasses.

1 Like

Indeed. Even 3D-printed spacers might be possible. One fellow used a wire or something as a spacer.

1 Like

Cheers. I believe the benefit I get from the PSVR is directly related to the infinite focus keeping my eyes in a relaxed state. As explained to me by my optometrist, the role of my glasses is simply to reduce strain on my eyes, as my double vision occurs when my eyes become fatigued due to excessive strain. An infinite focus is the ultimate relaxed state, so I think this is why the PSVR helps.

My understanding (which is possibly wrong) is the lenses used in VR headsets have a very short focal length specifically designed for the distance to the display. Wouldn’t moving the lenses only be of benefit if the mounting distance is currently incorrect? And if the lenses are currently mounted at the correct distance from the display wouldn’t moving them give a worse result?

This is why I’m struggling to wrap my head around this canted lens design being anything other than a bad idea. From what little I know about human optics, we process our vision in a variety of ways.

For example, stereo vision is not required for depth perception, and people with vision in one eye can still perceive distance, just not as well as people with two good eyes. My understanding, from what I recall reading somewhere at some time, is that one of the ways this is achieved IRL is by our brains calculating the difference in speed of the different colours in the spectrum. The thing is, our vision is far more complicated than VR headsets currently provide for, and that is where systems like eye tracking will no doubt prove beneficial to VR in the future, depending on how they are employed. It really is early days.

Getting back to my thoughts on this kind of canted lens design. The wall-eyed state is not natural. What is natural is for our vision to be parallel at infinite focus and to converge on objects as they become closer. As such, focal convergence is a distance cue. And this is part of my earlier comment about how everyone is not the same: the ability to adapt to unnatural conditions will vary from person to person. Clearly I lack the ability to adapt at all to the wall-eyed state, because my eye muscles are under strain to maintain normal vision, and increasing this strain overworks my eyes so that they become so fatigued they are unable to maintain normal focus. I do worry about the long-term effects of using a canted-design headset on the average person.

I think especially as optical VR technology moves forward the ultimate optical VR experiences will be those that provide the most natural visual experience. I don’t think we are close at all to achieving that with consumer technologies at this point and I have no idea what is available at the cutting edge of VR research and technology at this time.

For me and clearly others, at this time we get our best experience with the Pimax by using it incorrectly, setting it up in a way that provides parallel vision. As explained, this will result in an incorrect or distorted image. And that is where this gets interesting for me, because the distortion I am seeing in 90hz vs 120hz is completely different. The distortion I see in 120hz is limited to object perspective changing as it moves to the edges of the display, but the vision looks normal other than this. 90hz is unbearable for me for more than a few minutes as it causes severe eye strain, and my vision is also more like a fish bowl over my head in 90hz (even in the small FOV setting). In 90hz there is a curvature to my vision that is not there in 120hz.

The reason this is so interesting to me is we have been informed there is an issue that Pimax is aware of and trying to fix that affects 5700 owners like me, but we have not been told exactly what this issue affects. This means I do not know if what I am seeing in either mode is intended or a mistake, and I have no idea how my personal Pimax experience will change with a future update that corrects the current issues with this card. Unfortunately I have a suspicion, because I have used my headset briefly with a GTX 1060 and that was not a good experience. So at this time I suspect there is an issue with 120hz that is currently making the Pimax usable for me.

Ultimately I think in a headset that has canted displays, the lenses should be designed to keep the eyes parallel. I don’t understand the decision to intentionally design a headset with canted vision, and honestly, had I understood this previously it would have prevented me ever trying a wall-eyed VR headset. That said, this experience has been for the best, because it has been educational and allowed me to understand what to look for in future headsets, with a priority on parallel vision and infinite focus.

Thankfully, with the current firmware the Pimax is usable for me in 120hz mode with a reduced manual IPD setting. After playing around with IPD offset, it seems to me this setting helps correct world perspective but also assists in increasing comfort. For now I think I have settled on -1 for my IPD offset while using a manual IPD adjustment that is 3mm smaller than my measured IPD. But this is only approximate, as I have noticed how tight I have the straps affects the optimal adjustment, obviously. In 120hz mode with this “incorrect” adjustment, the distortion in the image is minimal and no worse than what the average Pimax user is reporting.

So, as I have said previously, I’ll just keep using it like this for now while waiting for an update. After the update, when I can see whether the fix improves or worsens my experience, I will decide if I will remain stuck on this current release or use the fixed software. Then I will see an optometrist for my overdue eye check, where I will discuss creating a pair of glasses specifically for Pimax use. I don’t wish to do this now, knowing the Pimax experience might change for me in the future.

2 Likes

The original Pimax 4k was designed to allow for a certain amount of myopia up to 20/20. So depending on one’s sight, increasing the distance from lens to screen may help some to use the headset with better focus.
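That matches the thin-lens picture: with the screen sitting just inside the lens’s focal length, the lens-to-screen distance sets how far away the virtual image appears, so adding a spacer pushes the image out toward infinity. A small sketch of this, using a made-up 50mm focal length (none of these numbers are real Pimax figures):

```python
def virtual_image_distance_mm(focal_mm: float, screen_mm: float) -> float:
    """Thin-lens model, 1/f = 1/u + 1/v: with the screen (the object) at
    distance u inside the focal length f, v comes out negative, i.e. a
    virtual image on the viewer's side of the lens.  Returns its distance
    as a positive number in mm, or infinity with the screen at the focus.
    (A screen beyond the focus would form a real image; not modelled here.)"""
    if screen_mm >= focal_mm:
        return float("inf")
    v = 1 / (1 / focal_mm - 1 / screen_mm)  # negative => virtual image
    return -v

# Hypothetical 50mm lens: nudging the screen a few mm closer to the focal
# plane pushes the virtual image from under half a metre out toward infinity.
for u in (45, 48, 49.5):
    print(f"screen at {u}mm -> virtual image at {virtual_image_distance_mm(50, u):.0f}mm")
```

So a 1mm washer, as mentioned below, is enough to move the virtual image a long way, which is why such a small spacer can noticeably change focus.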

LukeB, I believe, used 1mm washers to bring the headset into better focus.

1 Like

Very likely. Could still be more than one factor benefitting you, though. :7

Not if the offset compensates a fixed focussing error in the viewer’s own eye lens, just like a pair of glasses would.

Ok, off to work now… Late… I’ll respond to the rest when I get back, unless somebody else beats me to it, or you expressly do not want me to. :slight_smile:

1 Like

So… picking back up…

Hmm… I do have doubts about that, specifically. We had a discussion right here, only the other week, where one among us (neal_white_iii) explained to us how the reason one often gets a false sense of depth where there shouldn’t be any (contrasting bright red areas on a surface appearing as if floating a bit off that surface, closer to the viewer than colder-colored neighbouring ones, even though they’re all on the same flat plane) is due to chromatic aberration in the human eye, with higher-frequency blue light being more strongly refracted by the lens than red light, which oscillates slower (not travels slower), the blue thus covering a smaller part of the retina than the red. Maybe that is what you were referring to?
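For what it’s worth, that wavelength-dependence of refraction can be sketched with Snell’s law plus Cauchy’s empirical dispersion formula. The coefficients below are ballpark values in the range of optical glass, chosen purely for illustration; they are not measured eye-lens data:

```python
import math

# Cauchy's empirical dispersion formula, n(wl) = A + B/wl^2.
# A and B here are assumptions for the sketch, not eye-lens values.
def refractive_index(wavelength_nm: float, A: float = 1.50, B: float = 4200.0) -> float:
    return A + B / wavelength_nm ** 2

def refraction_angle_deg(incidence_deg: float, wavelength_nm: float) -> float:
    """Snell's law, air into the medium: sin(t) = sin(i) / n."""
    n = refractive_index(wavelength_nm)
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

for name, wl in (("blue", 450), ("green", 550), ("red", 650)):
    print(f"{name:>5}: n = {refractive_index(wl):.4f}, "
          f"a 30 deg ray refracts to {refraction_angle_deg(30, wl):.2f} deg")
```

The shorter (blue) wavelength sees the higher index and bends more strongly, so blue comes to a focus in front of red, which is the chromatic aberration being described.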

Lots of different depth cues in use, though – stereopsis is not particularly effective past a not-all-that-long range, anyway, as the degrees-of-angle-per-metre-away becomes smaller and smaller with distance. :7
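To put a rough number on that falloff: under a simple thin-triangle model (assuming an illustrative ~63mm IPD), the vergence angle between the two eyes shrinks roughly inversely with distance, and past a few metres there is very little angle left for stereopsis to work with:

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Angle between the two eyes' lines of sight when converged on a
    point straight ahead, using a simple thin-triangle model."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# The per-metre change in angle collapses quickly with distance:
for d in (0.5, 1, 2, 5, 10, 20):
    print(f"{d:5.1f} m -> {vergence_angle_deg(d):6.3f} deg")
```

At half a metre the eyes converge by about 7 degrees, but going from 10m to 20m changes the angle by only a fraction of a degree, which is why stereopsis stops being a useful depth cue at range.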

Oh yes! :slight_smile:

Yep! …although I believe the vergence-accommodation reflex (albeit presumably being something of a mutual feedback loop) tends to begin with convergence, and then accommodating with a degree of focussing of the lens that we have learned should correspond with that amount of vergence, only doing the heavier “image analysis” jobs of correcting for sharpness, and deducing depth from degree of un-focus in the surroundings, after this.

And this is of course the general annoyance that makes every user’s eyes tired from VR with current commercially available headsets, vision deficiencies or no – that singular fixed-distance virtual focal plane, which defies our natural reflex to accommodate, forcing us to suppress it and keep the lens constantly under a certain (or no, as the case may be) amount of tension, as well as ignoring that everything in view is sharp and crisp, even where it shouldn’t be; and to which setting that fixed focus to infinity, or to arm’s length +/- some, are both tradeoffs. :7

There have been various kinds of lightfield display HMD prototypes for a long time (…and some that purport to be, but are not - looking at you, ML :P), and such would solve a whole host of disparate issues with current solutions in one fell swoop.

The (not un-mitigatable) problem with those is two-fold: It takes a lot of resolution to beam out different information for every angle out of every spot on the display plane, and it takes a lot of rendering to give you the right view from every position and direction your pupil can possibly be. :7
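A quick back-of-the-envelope calculation shows how fast that first cost blows up; all the grid sizes below are made-up illustrative numbers, not figures from any real prototype:

```python
# Back-of-the-envelope pixel budget for a light-field display: every
# spatial sample must emit distinct light along a grid of directions,
# so the spatial and angular resolutions multiply.
def lightfield_subpixels(spatial_w: int, spatial_h: int,
                         angular_w: int, angular_h: int) -> int:
    return spatial_w * spatial_h * angular_w * angular_h

# Even a modest 1000x1000 spatial grid with only a 10x10 directional
# grid per point already needs a hundred million addressable sub-pixels:
print(f"{lightfield_subpixels(1000, 1000, 10, 10):,} sub-pixels per eye")  # 100,000,000
```

And a 10x10 directional grid is coarse; the budget scales quadratically again as you refine the angular sampling.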

We may possibly see some intermediate lower resolution LF displays in not too long, before sufficiently detailed ones… -Better in many ways, than what we have today, but likely exhibiting some blurring/ghosting effects when looking around, and a new flavour of SDE… :7

First of all, I would argue that a headset being designed with canted optics does not mean it is not designed for you to keep your eyes parallel. It is just that a tradeoff has (as usual) been made, for better and for worse.

We are limited by a number of things - mainly stemming from how far from the axis of the lens you can maintain focus and a low fresnel reflection index (…by which I do not refer to fresnel lenses, but to the way a surface reflects more, the more oblique the angle you view it at). That limitation is governed greatly by the fact that our display panels are typically flat, whereas the focal distance of the lens, away from its axis, curves, making things become more and more blurry toward the edges. There are solutions to this, but we want to do it with a single, light, and cheap lens that does not add “thickness” to the device… :7

We can obviously also not draw a picture that surrounds us on a flat plane, and even when we’re still only doing 160°, as with Pimax in Large FOV mode, we are so close to 180 that to achieve that FOV without canted screens, barring some concentrating/redirecting optics, the device would have had to be much wider than it is, and those last degrees toward the periphery would be so far away from the lens (…and also extremely oblique - try to look at an LCD monitor at 10° from its plane) that they would have been all lost in blur.

The lens needs to face the screen. The two can not cant relative to one another, or you’d have the same planar depth of field effect, as when you do not hold a magnifying glass parallel with the page you are trying to use it to read.

Ideally, the lens and screen would zip around physically, so that no matter which direction you are looking, they are always aligned with your eyes, skewered to their axes, but that is not practical - at least not today - and when/if it becomes so (one could imagine a few not too complicated ways, using current technology :7), we may already have trodden down a different tech path. :slight_smile:

With the limited radius of focus that the lens and screen combo gives us (a curved screen could have ameliorated this greatly, btw, but they are still expensive, and not yet as high res), a tradeoff has been made. The lenses are allowed to sit to the sides, giving us a longer span out to the right that we can see sharply with our right eye, and longer to the left with the left eye, but this is of course taken from when we look “inwards”, and so turning one’s gaze to the right brings the right eye into focus, but the left one goes out of focus, unlike with an in-line-lens HMD, where when you look around, the change in focus is the same between the eyes, and the falloff from the centre is identical in every direction.

Even when one scrunches up the Pimax lenses, the way you and others do, this effect still happens to a degree, producing uncomfortable binocular rivalry.

Believe me: I’ve been nagging about this issue once or twice around these parts. :7

It is quite possible that one could make another little tradeoff: Canting the screens just a little bit less than the lenses, to favour the “inner” FOV with better adherence to the lens’ field curvature, at the direct cost of the “outer”, but that would bring with it some additional concerns, such as making the distortion asymmetrical.

The Valve Index also has canted optics, but it matters less there, because its two-piece lenses attempt to “flatten the field”, making their field curvature better follow the flat screen, so that things are pretty sharp all the way out to the edges, provided you have managed to position your eyes in the tiiiny focal spot for each lens, where this happens.

I see some claim to have a similarly large radius of clarity with the p8k/5k – or even larger. Alas, I for one, with my eyes, am far from one of those lucky ones – I get about the same focus falloff as with my old HTC Vive and Oculus Rift DK1 and CV1, and that’s before taking into account the canting trading inner-FOV clarity for outer, looking at optimal optical alignment with a single eye.

One thing I will not entertain, is the notion a few who bunch up their lenses try to nurture, that it does not matter that the view is only clear straight ahead (and possibly fairly undistorted (only) straight ahead, with the right amount of software compensation), because in their minds this is the natural state of things; Humans, by this notion, do not in any way look around with their eyeballs - only by turning their entire heads, with the eyes nailed in place.

At that, I can only shake my head (the eyeballs may be rolling back in sympathy). :stuck_out_tongue:

I am really curious what it is that differs in the driver between 90Hz and 120Hz mode (assuming it is an image issue, and not something about the smoother experience that makes it easier for you, and your case of strabismus, to track your surroundings) – it would be regrettable if a bug fix turned out to make both refresh rates unusable to you. :confused:

I’ll grant that there could be some cause for concern about how we are psychovisually affected, “reprogrammed”, by all the imperfect visual cues in VR, but I do rather believe that such worries give too little credence to our neuroplasticity for good, rather than for bad – I’d even go so far as to say that children should likely be even better at adapting to and fro, rather than “learning-in an irreversible bad behaviour at a formative age”, so to speak, unless they live in VR more than not. :stuck_out_tongue:

EDIT: Ok… Stop me, somebody, next time I seem to be barfing up a wall of text. :stuck_out_tongue:

2 Likes

In a vacuum they all travel at the same speed; my understanding is that when light interacts with media, this affects the speed of different colours.

But I think we agree there are many cues other than stereo vision.

Surely this is one of the primary issues eye tracking can address while also improving VR performance.

I think you misunderstood me. By canted optics I meant canted in the way Risa2000 said the Pimax works. I imagine the mounting of the lens is irrelevant if the design of the lens is such as to keep your eyes parallel. So by canted optics I mean any optic solution that requires anything other than parallel vision; I am not referring to the mounting of the lens. That is why I wrote canted optics instead of canted lens.

I’m looking at a bottle right now that is sitting to the left of my guitars but on the left edge I can see my guitars that are to the right of it.

I have no doubt you have a greater understanding of optics than I do. That said, I’m not convinced a curved lens can’t be used with a flat panel in a high-FOV application to provide natural surround vision while maintaining parallel vision. In saying that, I am not aware of what such a lens would cost or if distortion can be eliminated completely. I’d doubt the distortion can be eliminated; my wrap-around glasses are very expensive and even they have an element of distortion.

As far as the matter of looking around, it sounds like we agree. I’ve had a number of arguments regarding the issue of eye box. I can assure you, if I was not able to look around in the Pimax I would not be interested in it. The claim others have made is that they do not look around with their eyes and naturally turn their head to look at objects. I’m not like that: when I turn my head I always first move my eyes and then my head catches up, and I constantly move my eyes around to look at things without moving my head IRL. FOV is an important factor for me, but more important is the eye box, or the clear FOV. And this is an area where I think Pimax gets a pass.

To put this in perspective, the Lenovo Explorer has a smaller FOV and full stereo overlap, and the resolution is not too high. Yet the render resolution required to achieve a decent eye box is ridiculous: I need to run Reverb resolution for a decent eye box, but my PC can’t run that resolution and hold 90 fps, so I have always had to sacrifice visual clarity, and not being able to look around by moving my eyes really annoys me. So even before I purchased the Pimax I was worried, because the Pimax has a greater FOV and a greater resolution, and I feared I would have to compromise the resolution so much to run it that the visual experience would be terrible in any FOV mode bigger than my current headsets’. In that regard I am pleasantly surprised. My visual clarity in the Pimax is better than the Lenovo, my eye box is larger, and I have more FOV, and this is using settings that allow me to maintain 90fps for iRacing. Yes, I need to compromise my settings for some tracks, but I also need to do this with the Lenovo. The result is the Pimax is always superior to the Lenovo, even with my 4790k/5700XT combination.

I should mention 3 things. Ideally I would like clear vision to the edge of the field of vision; I knew this was never going to happen before I bought the Pimax. Increasing my IPD setting does not improve my focus at the edges of my vision, it only increases eye strain. Now the thing that annoys me: for me to have the clearest vision at the edges I would need to go back to the standard unmodified foam and strap the headset on very tight. For the sake of comfort I must sacrifice some vision quality at the edges. Currently, in the largest FOV setting in 120hz mode with my slightly modified foam and the straps pulled tight, the compromise is minimal. With a more comfortable fit the edges are more blurred, but the eye box is actually slightly better.

I’m also curious, and yes, I hope whatever the fix is does not make that software unusable. In that case I’ll just keep using this software, so I’m not too concerned, although I would like the option of full functionality. I’d really like to be able to use 90hz with the larger FOV options, even if that extra FOV really isn’t very useful.

As far as what it is, it isn’t the smoothness, that I’m sure of. Keep in mind I was told there is a known issue the Pimax engineers are trying to fix, and all I was told is that it has something to do with the wrong json being loaded, and that it only affects Navi users. But he did not mention which issue this is related to. The issue he mentioned might have nothing to do with the difference in rendering I am seeing; he might have only been talking about the frame rate issue. I don’t know, but I’m keen to see what the future fix actually fixes. I don’t even know whether, when the headset is in 120hz, it is actually in 120hz, or if PiTool is just changing the FOV and rendering differently for some reason. What I do know is that in multiple games now my frames are being capped at 90fps when I am in 120hz mode, and if I force reprojection my fps will drop to 45. I can’t see a difference in smoothness in the Pimax between 90hz and 120hz currently. I regret that I have just checked this again: with the same IPD settings, only changing from 120hz to 90hz, the only difference aside from a slightly different FOV is that 90hz mode is not visually as clear and causes eye strain. I just spent less than a minute trying 90hz and I’m now having vision issues.

For what it is worth, I have tried racing in the PSVR at 120hz, 90hz and 60hz previously. 120hz is clearly smoother in it, and so nice. 60hz was an uncomfortable slide show. 90hz was not as smooth as 120hz, but it is the setting I settled on when using the PSVR because it allowed me to run higher graphics settings. In the PSVR none of those caused me eye strain. In the Lenovo Explorer I also do not get eye strain. Even at 60hz in my other headsets I don’t get eye strain; I just don’t enjoy the stuttering at 60hz. I find 60hz uncomfortable in VR because it has a stop/go feeling to it, but it does not affect my eyesight in other headsets. And the lower refresh/frame rate is only really an issue when there is fast movement. I actually tried turning up my settings in PCars2 yesterday enough to drop my frame rate so that stuttering started; it was interesting how I got stuttering as I looked to the side, but as I looked forward I did not really notice the stuttering. This also did not affect my eyes.

I’ll disagree with this. My belief is based on anecdotal evidence, but I believe I have seen cause and effect for how activity can have detrimental effects on vision. I am absolutely convinced by the bookworms-wear-glasses connection. And I’d even cite the case of people who live on small islands, like the people of Pitcairn Island, as an example of how activity affects vision: despite their diabesity-inducing diet they have superior long-distance vision, because they look out to sea constantly. I also think visual therapies that retrain our vision are evidence that supports my beliefs. I don’t think activity is the single factor, but I am convinced we can adversely affect our vision with our activities, beyond the obvious “don’t look directly at the sun”. I don’t even think it is debated that looking at a monitor all day can adversely affect our vision. If this is known, surely the concern that the issues we are discussing here could potentially affect our vision is justified.

It would seem we have a similar problem.

Also if this comes off as disjointed or some of it doesn’t make sense. I got interrupted a few times and went back and added things, I have not read it back since writing it.

2 Likes

Well, we did have a report some months ago, where some experimenters had managed to detect a measurable slowdown of light in a certain medium, under certain circumstances (albeit that did not concern itself with differentiating frequencies AFAIR), and I have heard before the explanation that refraction occurs due to travel/propagation slowdown/speedup when going between differently dense media. I don’t know, at the end of the day - too many seemingly conflicting sources of information. :7

Mmmmnnn… You can use the eyetracking together with the Z-buffer to simulate the effect of accommodation, by applying a bokeh filter around the depth at the spot the user is looking/converging on. Everything will still be on a single flat plane, but for the most part the stuff that should be out of focus will be blurred, not entirely unlike how it would have been for the natural reason, which could be “good enough” most of the time (regardless of the quality of the filter, which is a post effect on the already rendered frame, it could e.g. not differentiate between a pane of glass and the farther-away things you see through it).
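A toy sketch of that idea (my own deliberately naive version, not anyone’s actual rendering pipeline): sample the Z-buffer depth under the gaze point, then give every pixel a blur radius proportional to how far its depth is from that focal depth:

```python
import numpy as np

def gaze_depth_of_field(image, zbuffer, gaze_xy, focus_tolerance=0.1, max_blur=5):
    """Toy gaze-contingent depth-of-field pass: look up the depth under
    the gaze point, then box-blur each pixel with a radius that grows
    with |depth - focal depth|.  Grayscale image, same-shape zbuffer."""
    gx, gy = gaze_xy
    focal_z = zbuffer[gy, gx]          # depth the user is converging on
    out = image.astype(float).copy()
    h, w = zbuffer.shape
    for y in range(h):
        for x in range(w):
            # crude circle-of-confusion: proportional to depth mismatch
            coc = min(max_blur, int(abs(zbuffer[y, x] - focal_z) / focus_tolerance))
            if coc > 0:
                y0, y1 = max(0, y - coc), min(h, y + coc + 1)
                x0, x1 = max(0, x - coc), min(w, x + coc + 1)
                out[y, x] = image[y0:y1, x0:x1].mean()
    return out
```

A real implementation would run as a shader with a proper bokeh kernel and depth-aware weighting (this naive box blur happily bleeds across the pane-of-glass case mentioned above), but the control flow is the idea: gaze sample, focal depth, per-pixel blur radius.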

Around the time of the Rift DK1, there came another successful kickstarter (although they are gone now, AFAIK), for an HMD that had eyetracking already back then - the “Fove” - and the people who built that one swear by the technique, but I still have my doubts – it is not quite the same, for a variety of reasons, and I could imagine situations where you might get into simulated and actual accommodation “fighting one another”, so to speak. -Would absolutely love to be proven wrong, though. :7

It’s very true that I am not quite following.

No VR HMD optics are designed to require one to “rip one’s eyes apart”, so to speak - you are supposed to be able to look in any direction you choose - parallel, converging, or heck, even wall-eyed if you absolutely want to :P - and things should look as right as they can, under the limitation that without eyetracking we can not account for one’s pupil moving around in front of the lens, forcing us for now to resort to a single projection/lens distortion compensation profile, which has to make tradeoffs one way or another (and also the limitations of the lens).

It is possible that you could design a lens that will give you good results when looking through it a little “sideways”, and I am sure the StarVR, Pimax, XTal (these in particular allow themselves the expense of a hefty piece of glass :7), and Valve guys, and most who have not yet released a canted HMD too, have all spent considerable effort trying to achieve just that, with varying degrees of success.

If it is just about the projection: the projection is not a concern. Whether you have two screens that sit in-line right in front of you, or two that sit at an angle to either side, or in any other configuration, a house corner that is right in front of you in the virtual world will appear in the exact same spot right in front of you in either case, as produced by the respective projection, as long as you are correctly aligned, in correspondence with the game camera – just like if one uses three monitors for a wide-FOV pancake experience, one needs to sit at the right distance from them, close up, or the side screen views will appear just as stretched out as the projection naturally makes them.

I have those doubts - enough for the both of us, I suppose. :7

Oh, I’ve been waiting eagerly for lenses that curve around you since Rift DK1 days! Focussing the entirety of them on a single flat surface is not an easy optical problem to solve, though, without complicated, expensive, and bulky constructions. I am convinced curving screens could help a lot, but… :7

…as for distortions: the whole conceit of this “era” of VR is that we use simple and cheap optics, and have enough processing power that we can compensate for distortion in software, instead of with fancy optics, without making a dent in the computing budget – assisted by eyetracking, wonders should be possible there. No preprocessing can make something that is out of focus sharp, though (although with a lightfield display, you could refocus in software).

(I’m going to have to read up on “eye box”, but I believe I get the general gist.)

Hmm, I did hear that there were some problems with quality being lower than it should by all rights be with the HP Reverb, some time back, but I thought that problem, presumably somewhere along the path device_driver-WMR-SteamVR-and-back, had since been fixed… So this could possibly be a larger, unresolved issue that affects more WMR devices, then, like your Lenovo Explorer…? Hmm…

I wonder whether this meshes in any way with a very aggressive Reverb owner over on the Elite Dangerous forums, who claims he can do massive amounts of SteamVR (but not in-game) supersampling, with enormous visual benefits, but at zero computing costs… :stuck_out_tongue:

Hmm… If there are now .json files in the file hierarchy, one should be able to read and edit them. :7

We had the discussion only the other day, of how this occurring is supposedly even in the form of one’s eyeballs changing shape (I kind of speculated idly that maybe the designer of the P8k lenses tailored them to his own myopic eyes, and that that would then have been why they seem to have such a small radius of sharpness for so many others, who are not nearsighted :P). I can’t help, though, thinking that if that change can happen, surely it can happen in the opposite direction, too, should one e.g. quit one’s office job, and migrate to that ol’e island o’ ye mutineers, arrr! :stuck_out_tongue_winking_eye:

Happy funeral for 2019, you, and anybody else who had the misfortune of stumbling across this post! :slight_smile:

I read it all and survived.

Some people incorrectly refer to it as the sweet spot.

The eye box is the area you can look around by moving your eyes and see clearly without drastic drop off of clarity.

Sweet spot is simply the physical location you need to locate your eyes in the headset for the clearest image.

As I just said, eye box is often also incorrectly described as sweet spot.

As far as the size of the eye box in different VR headsets, there are many factors that come into play. One of those is simply the individual’s own physical limitations. But in the case of the Pimax the lens is a factor, as well as the distance you wear the headset from your eyes, while in the Lenovo one of the biggest factors is the distortion profile. This is the reason you need to run the resolution so high in the Lenovo.

The thing I really like about the Pimax is how big the eyebox is.

I don’t think it is a WMR specific issue, I believe there are WMR headsets that are better than others. I’m pretty sure it is just a design issue.

Reminds me of a Reverb owner on the iRacing forum. The guy I’m thinking of seems pretty clueless.

Trust me I’ve looked. I’m pretty sure there is more to the story. I have a suspicion but it is one of those high effort things to investigate and like I have mentioned previously, 120hz mode is currently enjoyable to me even with the issues and should carry me through until the engineers fix what needs fixing.

I’m convinced it can happen in both directions, but I would not advise considering moving there, and these days they are better known for something more nefarious than mutineers. But for people like us, more pressing than a population of 50, nowhere to go, and the island’s morality issues would be the terrible internet. If I’m on an island I’ll need my online racing more than ever, and I fear that in such a place, if I did have online racing, I’d be less inclined to spend time outside after the first day of seeing all there is to see.

1 Like

Ok. Googling the term, I got some hits in the contexts of head-up displays and scope sights, describing it as the cylindric volume of collimated light from the lens, within which you can move around, as you say, and still see the extent of the view.

I guess this brings your PSVR back up too, since its distinguishing itself as one of the rare ones that stuck with infinite focussing makes it one of the few still with collimated light on the user side, and as such rather insensitive to how much eye relief one has.

I’ll add yours to the count of testimonies from people who say they get a great amount of “tolerance away from the sweet spot”, or whatever :stuck_out_tongue: (not quite the same as eyebox, if I get this right-ish), and presumably along with that (by the sound of it) what I have earlier in this conversation referred to as a large “radius of focus” (…and which it irks me no end to hear people call “sweet spot”), but that does strike me as directly contradictory with your need to move the lenses closer together.

(Wish I, too, could have gotten as wide a cone of sharpness per lens out of my now sold 5k+, as others have claimed here, recently – I got something not dissimilar to the Vive and Rift CV1…)

Thing is, I am pretty sure the guy in my case must be seeing something - it can’t all just be placebo, and I’d really like to know what it is (the high resolution of the Reverb should have it quickly bumping up against SteamVR’s default 4096 pixel limit, if nothing else).
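(As a rough illustration of that limit – assuming the Reverb’s 2160-pixel-wide per-eye panel, and that SteamVR’s supersampling setting scales pixel *count*, so each linear dimension grows with the square root of it – a quick sketch:)

```python
import math

PANEL_WIDTH = 2160   # assumed per-eye panel width (HP Reverb)
STEAMVR_CAP = 4096   # SteamVR's default max render-target dimension

def render_width(ss_percent: float, base_width: int = PANEL_WIDTH) -> int:
    """Render width at a given SS percentage; SS scales pixel count,
    so linear size scales by sqrt(ss)."""
    return int(base_width * math.sqrt(ss_percent / 100.0))

# SS percentage at which the render width hits the cap.
cap_ss = 100.0 * (STEAMVR_CAP / PANEL_WIDTH) ** 2
print(render_width(100))   # prints 2160
print(round(cap_ss))       # prints 360
```

(By this toy calculation the width only hits the 4096-pixel cap somewhere above 350% SS – but the real base render target includes distortion overhead and is wider than the panel, so in practice the ceiling arrives sooner.)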

Unfortunately the fellow went outright crazy and began trying to blatantly troll me; by the looks of it projecting all of his own tetchiness and insecurities, and demanding that I “believe”, amidst all the sad, misdirected, ill-baited hooks. So he doesn’t seem particularly approachable, to put it mildly – it would be nice if one could persuade him to run HMDQ with different amounts of supersampling, to isolate at least one point in the chain where things could be happening, and see what is what there. Oh well… :7

2 Likes

I think there may still be some confusion about the use of eye box vs sweet spot. They are not the same.

I think you know what the sweet spot is. It refers to where you need to position the headset relative to your eyes in order to see clearly. Some headsets have a very small sweet spot and need constant adjustment; some have a bigger, more forgiving one. Although the Pimax’s isn’t the smallest I have experienced, it’s not big.

But this has nothing to do with eye box. Assuming you are already wearing your headset correctly, in the sweet spot, your eye box refers to the ability to look around within your field of view and still see an in-focus, clear image.

The Pimax is surprisingly good for this. I can’t put an exact figure on it because at the moment I have no way to measure it, but I’d estimate the eye box FOV is about 45-55 degrees. Ideally I’d love this to be closer to 90+.

The Lenovo Explorer, with the settings I must run on my PC, needs you to be looking pretty much directly at things for them to be clear. I can improve its eye box by increasing the resolution to approximately the Reverb’s native resolution, but then I can’t play games at that resolution on my PC. With the Pimax I can run iRacing at between 0.75 and 1 in Pitool and 100% in SteamVR, and this gives me 90 fps. The Pimax 5k+ gives me better vision with a wider FOV.

The PSVR is a bit different, although it has been a long time since I have used it on this PC. I remember it having very little SDE and a massive eye box, but because it is such low resolution it doesn’t have great clarity for things like reading text and seeing fine details. What I did like about it was how natural it felt to scan with your eyes because of the massive eye box – but no matter where you look, it is like you have terrible vision because of the low resolution. The colours and blacks, however, look amazing.

2 Likes

Although I really don’t want to make this into a semantics debate, and will try my best to refrain from arguing any further after this: that is not really what I get from descriptions I find online, such as:

  • Eyebox – The optical collimator produces a cylinder of parallel light so the display can only be viewed while the viewer’s eyes are somewhere within that cylinder, a three-dimensional area called the head motion box or eyebox. Modern HUD eyeboxes are usually about 5 lateral by 3 vertical by 6 longitudinal inches. This allows the viewer some freedom of head movement but movement too far up/down left/right will cause the display to vanish off the edge of the collimator and movement too far back will cause it to crop off around the edge (vignette). The pilot is able to view the entire display as long as one of the eyes is inside the eyebox.[11]

…from: Head-up display - Wikipedia

Note the absence of mention of focus, which would be less relevant anyway, to a degree, thanks to the collimation (…which is not strictly true in itself anyway, for an HMD, with its wide FOV, where we do look more at an angle through the lens, to catch that stuff in the periphery).

Ok, so you have the “sweet spot” for the entire HMD as a whole - positioned “correctly” on your head, so that both eyes are in optimal optical position in front of their respective lenses…

…But I see nothing wrong in drilling down to more specifically designating that “sweet spot” optimal optical position/orientation, for each lens individually, which is more what I typically refer to, when I use the term, than to the above.

So what happens when we look around, is that the pupil, 12-ish mm distant from the origin of one’s rotating eyeball, is moving around in space. This coupled translation and rotation occurs within the eyebox, yes - in front of the lens, but the reason things go out of focus is that we move it away from the sweet spot - not necessarily out of the eyebox (…although we do get occlusion by the edge we swivel to look toward – less so laterally with the Pimax, thanks to those wide lenses), so I really feel quite justified in saying: “same difference”, to a degree, here.

(EDIT: …and that drop-off in focus, when moving out of the sweet spot (however much tolerance it has in that regard), is of course on top of the usual fixed falloff due to the field curvature of the lens, and related to the “pupil swim” effect that makes the image warp when one looks around. (The Rift CV1 “famously” was such that you could wear it pretty much off, and still get a fairly sharp view, but you’d still suffer the consequences to geometric verity.))
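(To put rough numbers on that coupled translation – a sketch using the 12 mm figure above, which is itself an approximation:)

```python
import math

PUPIL_RADIUS_MM = 12.0  # assumed distance from eyeball rotation centre to pupil

def pupil_offset(gaze_deg: float, r: float = PUPIL_RADIUS_MM):
    """Lateral and backward displacement (mm) of the pupil when the
    eye rotates gaze_deg away from straight ahead."""
    t = math.radians(gaze_deg)
    lateral = r * math.sin(t)           # sideways, across the lens
    recede = r * (1.0 - math.cos(t))    # away from the lens
    return lateral, recede

lat, rec = pupil_offset(25)  # e.g. glancing 25 degrees to the side
print(round(lat, 1), round(rec, 1))  # prints 5.1 1.1
```

(So even a modest glance moves the pupil around 5 mm across the lens – plenty to leave a small sweet spot while staying well inside a HUD-sized eyebox.)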

As for your Reverb… I cannot imagine any way in which the display or render resolution would affect hard optical matters, so I am not sure cause and effect are quite correctly attributed in this case – you see something, clearly, but I am not sure the conclusion, such as it comes across to my ears, about just what that is, is entirely on target…

…and the PSVR… Your great eyebox there sounds really plausible, given the infinite focus - it should, by my comprehension (EDIT2: …such as it is… :P), be a full cylinder, as opposed to the presumed more restrictive cone, with closer focus HMDs.
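(A toy model of that cylinder-versus-cone intuition – the aperture and convergence-distance figures here are made up purely for illustration:)

```python
# Usable eyebox width vs eye relief: collimated output (focus at
# infinity) vs light converging toward a virtual exit pupil at some
# finite distance behind the lens. All figures are assumptions.
LENS_APERTURE_MM = 40.0    # assumed clear aperture of the lens
EXIT_PUPIL_DIST_MM = 60.0  # assumed convergence distance (finite-focus case)

def eyebox_width(relief_mm: float, collimated: bool) -> float:
    if collimated:
        return LENS_APERTURE_MM  # cylinder: width constant with relief
    # cone: width shrinks linearly toward the exit pupil
    frac = max(0.0, 1.0 - relief_mm / EXIT_PUPIL_DIST_MM)
    return LENS_APERTURE_MM * frac

for d in (10, 20, 30):
    print(d, eyebox_width(d, True), round(eyebox_width(d, False), 1))
```

(The collimated column stays flat as eye relief grows, while the finite-focus one narrows – which would match the PSVR feeling forgiving no matter how the headset sits.)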

So you used it with a computer, you say, so that you were free to supersample to your heart’s content, and that didn’t help? I seriously wouldn’t expect it to be too significantly worse than the HTC Vive when it comes to resolution; it is 1080 next to 1200, yes, but that is full RGB next to the Vive’s pentile…
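(Back-of-the-envelope subpixel counts per eye, assuming the commonly quoted panel figures – PSVR’s shared 1920×1080 RGB panel giving 960×1080 per eye at 3 subpixels per pixel, versus the Vive’s 1080×1200 PenTile at roughly 2 per pixel:)

```python
# Per-eye subpixel counts: PSVR (full-RGB stripe) vs HTC Vive (PenTile).
# Panel figures are assumptions based on commonly quoted specs.
psvr_per_eye = (960, 1080, 3)   # half of the shared 1920x1080 RGB panel
vive_per_eye = (1080, 1200, 2)  # per-eye PenTile panel, ~2 subpixels/pixel

def subpixels(w: int, h: int, per_px: int) -> int:
    return w * h * per_px

print(subpixels(*psvr_per_eye))  # prints 3110400
print(subpixels(*vive_per_eye))  # prints 2592000
```

(By this crude count the PSVR actually pushes *more* subpixels per eye than the Vive, despite the lower nominal resolution – which fits the “not too significantly worse” expectation.)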