There is something we must understand about this app (and about any similar solution): we can’t lock either the user’s head or their feet. That’s why I didn’t put in fixed teleport points. As much as you intend to keep your head or body still, in the end you will move a little… so although I thought about keystrokes and some other events, my bet right now is to accept the human error and leave as much freedom as possible to the user, giving them the responsibility to remain as still as they can while performing the test. Until eye-tracking technology (e.g. Tobii) comes to the headsets, there’s still some margin for improvement. Let me think about it and try some other concepts as well.
Just to be clear, I’m not suggesting any locking of the user’s head or feet. The idea is to provide a simple way (keystrokes or button presses) to move both horizontal (or vertical) FOV limit markers at the same time and keep them perfectly vertical (or horizontal). That would make it easier to get a good reading.
Basically, + (or whatever key/button) would move the markers outward. The user would keep holding the key/button until the limit bars “just barely” disappear, and press - to move them back inward if they went too far. One or the other limit bar should pop into view as soon as the user turns their head even a little, but both should vanish when looking straight ahead. Then you just step back and read the marked angle positions on the floor (or wall, for vertical FOV).
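Something like this, in plain C++ terms (a minimal sketch of the idea only; the step size, clamp limits and the simulated key presses are my own assumptions, not anything from the app):

```cpp
#include <algorithm>
#include <cstdio>

// Both limit markers share a single half-angle, so they always stay
// symmetric about the view axis and perfectly vertical.
struct FovMarkers {
    double halfAngleDeg = 40.0;              // current half-angle of both markers
    static constexpr double stepDeg = 0.25;  // nudge applied per key repeat

    void nudgeOutward() { halfAngleDeg = std::min(halfAngleDeg + stepDeg, 110.0); }
    void nudgeInward()  { halfAngleDeg = std::max(halfAngleDeg - stepDeg, 0.0); }
    double totalFovDeg() const { return 2.0 * halfAngleDeg; }
};

int main() {
    FovMarkers markers;
    // Simulate holding '+' until the bars "just barely" disappear...
    for (int i = 0; i < 40; ++i) markers.nudgeOutward();
    // ...then one '-' press because we overshot.
    markers.nudgeInward();
    std::printf("Measured horizontal FOV: %.2f deg\n", markers.totalFovDeg());
}
```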
Thanks a lot for this tool. Can you explain a bit more about how to do the test and read the result? Where should one stand, and how can everyone’s results be made comparable?
This is a work in progress; I’m still trying to figure out how to deal with the freedom of the user. Like you, I’m guessing, talking with others in VR, and changing my mind a lot.
Today I’ll jump into scripting. Having some response from the user’s actions will allow the app to interact with them.
About how to use it:
Horizontal: standing where the feet are, look at the red spot, grab the sticks and move them along the arcs on the floor until they disappear from your sight (the geometry behind those arcs is sketched after these steps).
Vertical: again, put yourself on the spot, move your head a little until the middle bar disappears, and keep the second bar, the third bar and 0º at the same level (the third should be hidden by the second at all times). Then just read the value at the very top of the lens and at the bottom line.
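For anyone curious about what the floor arcs encode, here is my own illustration of the geometry (the pole positions are made-up numbers, not values from the app):

```cpp
#include <cmath>
#include <cstdio>

// With the test spot at the origin and +z pointing straight ahead, a pole
// standing at (x, z) on the floor sits atan2(|x|, z) degrees off-centre.
// The left and right off-centre angles added together give the horizontal FOV.
int main() {
    const double kPi = 3.14159265358979323846;
    double leftX = -1.9, leftZ = 0.6;   // hypothetical final pole positions (m)
    double rightX = 1.9, rightZ = 0.6;
    double leftDeg  = std::atan2(-leftX, leftZ) * 180.0 / kPi;
    double rightDeg = std::atan2(rightX, rightZ) * 180.0 / kPi;
    std::printf("HFOV ~ %.1f deg (%.1f left + %.1f right)\n",
                leftDeg + rightDeg, leftDeg, rightDeg);
}
```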
I’m still working on how to display the values on the screen. Meanwhile, it’s a matter of writing them down using the spray tool or remembering them.
The other tests are easier: check the position where you can read the smallest text (11pt), do the same with the 14pt text and finally, find the position where the 20/20 line is the last one readable.
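As a back-of-envelope check on those readability distances (my own sketch, not part of the app): text of height h viewed from distance d subtends an angle of 2·atan(h / 2d), and the 20/20 Snellen line subtends 5 arcminutes, so a readable distance translates into a rough acuity figure.

```cpp
#include <cmath>
#include <cstdio>

// Angle subtended by a glyph of height glyphHeightMm at distanceMm, in arcmin.
double subtendedArcmin(double glyphHeightMm, double distanceMm) {
    const double kPi = 3.14159265358979323846;
    double thetaRad = 2.0 * std::atan(glyphHeightMm / (2.0 * distanceMm));
    return thetaRad * (180.0 / kPi) * 60.0;  // radians -> arcminutes
}

int main() {
    // 11 pt nominally means 11/72 inch = ~3.9 mm of line height
    // (assuming 1 pt = 1/72 in; actual glyph height depends on the font).
    double h = 11.0 / 72.0 * 25.4;
    std::printf("11pt at 1 m subtends %.1f arcmin (20/20 line = 5 arcmin)\n",
                subtendedArcmin(h, 1000.0));
}
```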
Glare is impossible to measure without taking a picture through the lenses. The same goes for the aliasing aberration (the circle). The spots you are going to see are just for you; there is no way to quantify them beyond your own experience.
Have fun with it and expect some more changes… for better or worse.
That’s the difference between having Carl Zeiss lenses and plastic toy lenses, and I’m glad Pimax is testing a little more and investing considerably in the 8K lens design.
No No No! You don’t actually put the headset on while you’re doing your testing. You don’t need to wear the thing to do science on the device. Why would you want to do a thing like that?? VR? Yuck!! Just get yerself a fancy camera and lens measuring tool son. Here. Here’s a billion dollars. Now go make me a VR Headset. But remember! Don’t ever wear it!
Thanks for the link @Cdaked. This was actually my first attempt at through-the-lens recordings, and I will work hard to improve them. I will probably buy a better camera for it as well, as I feel the 1080p camcorder might not be enough to make Pimax 8K through-the-lens videos look sharp enough!
Flickering is also a big problem, as the camera can’t compensate for it with exactly the same shutter speed. I had to use a $140 anti-flicker filter/plugin for Adobe Premiere Pro to get rid of 90% of the flicker, but some of it is still there…
Just a little update on what I think is important about this app: in Unreal (Unity as well) you have all the HMD data available: coordinates, location, roll, pitch, yaw… whatever info you ask for, the SDK will answer it. So, with all that info in your hands, there’s a lot of stuff you can do.
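As a taste of what that looks like in UE4 (a minimal sketch of mine, not the app’s source; `GetOrientationAndPosition` is a real engine call, the surrounding function is hypothetical, and the project needs the HeadMountedDisplay module enabled):

```cpp
#include "CoreMinimal.h"
#include "HeadMountedDisplayFunctionLibrary.h"

void LogHmdPose()
{
    FRotator HmdRotation;  // pitch, yaw, roll in degrees
    FVector  HmdPosition;  // tracking-space position in centimetres
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(HmdRotation, HmdPosition);

    // With the pose in hand the app can, for example, warn the user when
    // they drift off the test spot, or log how still they managed to stay.
    UE_LOG(LogTemp, Log, TEXT("Yaw=%.1f Pitch=%.1f Roll=%.1f Pos=%s"),
           HmdRotation.Yaw, HmdRotation.Pitch, HmdRotation.Roll,
           *HmdPosition.ToString());
}
```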
I’m very focused on finding a common testing place for everybody, sharing the exact same location, no matter the headset, your size or any other personal detail. With this version, the app will tell you exactly where the place is for both FOV tests, and by pressing the thumbstick you will “lock” the location and rotation of the HMD (it’s very nausea-inducing if you move around while locked).
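One way such a view lock could be implemented in UE4 (my guess at an approach, not necessarily the app’s code; `bLockToHmd` is a real `UCameraComponent` flag):

```cpp
#include "Camera/CameraComponent.h"

void SetViewLocked(UCameraComponent* Camera, bool bLocked)
{
    if (Camera)
    {
        // With bLockToHmd off, the camera stops following the headset each
        // frame, freezing the rendered viewpoint. Head motion then no longer
        // moves the world, which is exactly why walking around while locked
        // feels so nauseating.
        Camera->bLockToHmd = !bLocked;
    }
}
```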
I’ve designed a better, more precise way to move the poles around. Now it’s just a matter of moving them left or right with the thumbstick until you get the correct value (remember that your head’s view is locked).
I have added an “error tolerance” to the sweet spot. This is just so you know that those 87º you get (for example) will have an error tolerance of about Xº.
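For a rough idea of where such a tolerance could come from (this is my own assumption about the method, not necessarily how the app computes it): if the head can drift a small distance e from the nominal spot and the poles stand at radius r, the reading can be off by roughly e/r radians.

```cpp
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    double poleRadius = 2.0;   // metres from the test spot to the poles (assumed)
    double headError  = 0.02;  // metres of allowed head drift (assumed)
    // Small-angle approximation: angular error ~ headError / poleRadius.
    double toleranceDeg = (headError / poleRadius) * (180.0 / kPi);
    std::printf("An 87 deg reading -> roughly +/- %.2f deg per side\n", toleranceDeg);
}
```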
The app will know the headset you are using right at the beginning, so things will function accordingly. For instance, I’m building another test space inside the app for comparing against the Oculus best practices for text rendering (VR Compositor Layers), in case you use them.
I’m placing the results on a table to allow you to capture a screenshot, or maybe I can export a .txt with the values.
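If the .txt route happens, it could be as simple as this UE4 sketch (the function and file name are hypothetical; `FFileHelper::SaveStringToFile` is a real engine call):

```cpp
#include "Misc/FileHelper.h"
#include "Misc/Paths.h"

void SaveFovResults(float HorizontalDeg, float VerticalDeg)
{
    // Format the measured values and drop them into the project's Saved folder.
    const FString Report = FString::Printf(
        TEXT("HFOV=%.1f deg\r\nVFOV=%.1f deg\r\n"), HorizontalDeg, VerticalDeg);
    const FString Path = FPaths::Combine(FPaths::ProjectSavedDir(),
                                         TEXT("FovResults.txt"));
    FFileHelper::SaveStringToFile(Report, *Path);
}
```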
Again, if you have advice or an idea for any other test that could fit inside the app, just tell me and we can talk about how to do it.
Just one comment on the FOV. In the original marketing video from Pimax, the FOV is shown as the diagonal over the rectangular viewport. This is a rather “original” Pimax approach, similar to the 4K, 5K and 8K branding.
In 2D gaming, the FOV was always considered in the horizontal direction, because it was assumed that the vertical FOV remained the same. This is no longer true in VR.
In VR, the vertical FOV is another independent characteristic which helps define the whole FOV. While Oculus and Vive use more square-like displays for each eye, Pimax uses 16:9 panels. Now, to assess whether this aspect ratio will add “more to see”, or improve the resolution, or both, we need to measure both the horizontal and vertical FOVs.
The reason is that we can have a higher-resolution panel but still show a lower FOV, all depending on how the optical properties of the whole system are defined; in that case, though, the picture will have a much higher resolution. On the other hand, an optical system with a large FOV may suffer from low resolution if the FOV is stretched too far.
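To make the diagonal-vs-horizontal point concrete: for an idealized flat (rectilinear) viewport the three angles are linked by tan²(d/2) = tan²(h/2) + tan²(v/2). Real HMD optics are not rectilinear, so treat this as a rough illustration only (the numbers below are made up, not Pimax specs):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    auto deg2rad = [=](double d) { return d * kPi / 180.0; };
    auto rad2deg = [=](double r) { return r * 180.0 / kPi; };

    double h = 100.0, v = 77.0;  // illustrative horizontal/vertical FOVs
    double th = std::tan(deg2rad(h / 2.0));
    double tv = std::tan(deg2rad(v / 2.0));
    double d  = 2.0 * rad2deg(std::atan(std::sqrt(th * th + tv * tv)));
    // Prints ~110 deg: the diagonal figure is always the largest of the three.
    std::printf("H=%.0f V=%.0f -> diagonal ~%.0f deg\n", h, v, d);
}
```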
When I was trying to send some videos to Pimax support about strange lens effects, I had great difficulty with the flicker; it was nearly impossible even with my Sony smartphone, and Sony makes some of the best, fastest and highest-resolution smartphone sensors on the current market. So the best solution would probably be a mid/high-end 4K camera with a high-speed shutter, as smartphone cameras are totally inadequate for these things.
Update: The pre-beta of the app is here. It’s more or less functional in both vertical & horizontal FOV. The text is in English but it can be changed to Spanish as well (for the community) and… I’ve only tested it with the Rift. Tomorrow I’ll test it with the Vive and I’ll try to export a GO version later.
Some early instructions:
-Vertical FOV is very easy-> just place yourself at the vertical spot zone, look at the red dot and then press the right thumbstick to lock the HMD. Once locked, press the (R) trigger to calibrate all the axes to 0. Finally, move the dots UP and DOWN with the left thumbstick.
-Horizontal-> place yourself at the horizontal spot zone, try to stay within <2 on all axes, lock the HMD with the right thumbstick, and move the poles LEFT and RIGHT with the (L) thumbstick while keeping the (R) thumbstick pressed (locked view).
The rest is as usual, just some panels around. You can press the (L) trigger at any time to place a mark at your spot for whatever reference you might need.
I think it’s better to lock the user’s position for both tests (vertical and horizontal). So… now, when you step into either the vertical or horizontal spot, the app will lock you until you press the (R) trigger. That way, everybody will test from exactly the same coordinates.