Version 3 prototype review

We just sent our co-CEO and a new teammate to the NYVR demo; we need to get the right information from them and then give the correct answer.

8 Likes

Glad to hear that, I think we are all looking forward to it :slight_smile:

Listen to this guy. This is important.

It’s a limitation of the laptop’s DP 1.2. I don’t know why people are getting worked up over this; the Pimax will do 90 Hz (180 Hz perceived, with Brainwarp).

It’s already been clarified that the v3 demo was on a desktop 1080. The desktop 1080 is equal to a laptop 1080 (I don’t know why anyone thinks there is still a difference between desktop and laptop GPUs… they did away with the separate mobile GPUs). The only thing holding back a mobile GPU is its ability to dissipate heat effectively; generally speaking, base/boost clock rates are exactly the same (not taking into account thermal throttling or overclocking).

Edit: Ok I was wrong, not exactly the same base clock but boost clock is the same.

1 Like

The majority of laptops only have DisplayPort 1.2, which has been around for many years. Only a handful have 1.3, and I’m not sure if any have 1.4.

I’m giving Pimax the benefit of the doubt here.

I believe that, for them, it’s been a steamroller of a month: ever-increasing interest, people demoing, and new people discovering their products.

Errors happen as they try to keep up with the influx of “need info” requests and new questions and so on (so many of them already answered in the FAQ).

Give them a little time. The main OBJECTIVE here is to reach prototype v5 and begin mass production of their CV1.

I was not there (I wish I could have been, believe me), but I can bet it happened like this:

They start demoing the v2
People are wowed!
More people come, in ever greater numbers
The stand is way too small for everyone
So many people are demoing that the prototype begins to crack (we’ve seen the head strap glued and taped)
The main problems reported are distortion near the borders and the lack of IPD adjustment
V3 is made in China
Everyone is eager to test the v3
It’s coming
It finally arrives, but the notebook lacks a DP port (how did anyone not see this!?)
They buy a new laptop (for them, I believe it’s not that easy; they must transfer funds, etc.)
The new laptop does have a DP port, but it’s a mini DP (someone must have heard quite a lot of swearing over this, LOL)
They buy an adapter, but it limits them to 75 Hz - FFS!
Lots of reviewers are there, just to test the V3
A new meeting is set
They buy a DESKTOP PC (most probably a pre-built one, like Falcon Computers)
It does have a DP port, but it’s a 1.2, limited to 75 Hz (someone must have gone deaf from the swearing)
Now they are looking to address this

And that’s that.

3 Likes

I guess that makes sense, but it’s already been confirmed that the test was done on a desktop. I think the idea that it’s a bandwidth issue can be squashed at this point; that’s all I was trying to point out. Again, I’m speculating, and only an official response from Pimax will be sufficient, lol.

The desktop 1080 definitely supports DP 1.4? That’s the graphics card @evertec said they had in the desktop, at least.
“The GeForce GTX 1080 is DisplayPort 1.2 certified and DP 1.3/1.4 Ready, enabling support for 4K displays at 120Hz, 5K displays at 60Hz, and 8K displays at 60Hz (using two cables)”

1 Like

Yep, it seems to be some kind of issue with whatever chip/chips they are using to receive, upscale, or display the images within the HMD itself. It isn’t possible to push a 5K image over a DP stream at 90 Hz (except DP 1.4 with DSC)… so it must be 2× 1440p (which DP 1.3 and higher can handle at up to 240 Hz)…
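The bandwidth claim above is easy to sanity-check with back-of-the-envelope numbers. A minimal sketch (my own figures and helper names; it ignores blanking intervals, which add roughly another 10–20% on top of the raw pixel payload):

```python
# Usable payload after 8b/10b line coding (raw link rates are
# 21.6 Gb/s for HBR2 and 32.4 Gb/s for HBR3).
DP_PAYLOAD_GBPS = {
    "DP 1.2 (HBR2)": 17.28,
    "DP 1.3/1.4 (HBR3)": 25.92,
}

def required_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed video payload in Gb/s, ignoring blanking overhead."""
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "5K (5120x2880) @ 90 Hz": required_gbps(5120, 2880, 90),
    "2x 1440p (5120x1440) @ 90 Hz": required_gbps(5120, 1440, 90),
}

for name, need in modes.items():
    for link, have in DP_PAYLOAD_GBPS.items():
        verdict = "fits" if need <= have else "does NOT fit"
        print(f"{name}: needs {need:.1f} Gb/s; {link} ({have} Gb/s): {verdict}")
```

This matches the post: a true 5K (5120×2880) stream at 90 Hz needs about 31.9 Gb/s and exceeds even HBR3 payload without DSC, while two 2560×1440 streams need only about 15.9 Gb/s.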

2 Likes

If we knew the exact model of the video card, we could verify for sure.

I’m saying this because all the standard (Founders Edition) 1080s that I saw were DP 1.2.

1 Like

The picture from VR Commando suggests it is rather something like an RGBW PenTile matrix. What bothers me more is that I asked explicitly about the subpixel arrangement here and did not get any answer from Pimax.

It also seems it has not been stated anywhere (as far as I looked), and the only claims about an RGB stripe here and on the KS forum are usually just assumptions from people who equate LCD panel with RGB stripe.

1 Like

It would be a pretty bad oversight for them to be using a DP 1.2 card… but then again, it is Pimax…

1 Like

Brainwarp should be just a minor improvement. It would not change the rendered picture size; instead of sending one composite image of 5120×1440, the card would send two images, each half the size. This may reduce the strain on the framebuffer (or the final compositing stage), but the workload remains essentially the same.
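The “same workload” point above is simple arithmetic; here is a sketch using the 5120×1440 composite size mentioned in this thread (my own helper, purely illustrative):

```python
# Splitting one composite frame into two per-eye images changes the
# packaging, not the total pixel throughput the GPU must produce.

def pixels_per_second(width, height, hz):
    """Raw pixel throughput for a given mode."""
    return width * height * hz

composite = pixels_per_second(5120, 1440, 90)   # one combined image per tick
split = 2 * pixels_per_second(2560, 1440, 90)   # two half-width eye images

assert composite == split
print(f"{composite:,} pixels/s either way")  # → 663,552,000 pixels/s either way
```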

Getting the prototype to work at the level it does (according to the reviews) takes some pretty clever people, so this Hanlon’s-razor excuse seems too far-fetched for me to fully buy.
But there is still time to set things right; we’ll see.

That’s the thing, though: the workload is the same, but you double the perceived frame rate. That means you can have a lower frame rate, like 75, be processed by the brain as a great deal smoother than a Vive. Edit to add: I’m guessing the lower limit for this effect is around 75 Hz, and somewhere below that the effect of rendering each eye in sequence becomes discernible.

The thing that’s disturbing to me is that this is apparently all just on paper; it seems that none of the reviews we’ve seen have had the Brainwarp technology active. That is a huge part of this technology, and it could potentially make the headset sickening if it isn’t implemented correctly.

1 Like

Simply put, it’s still early. They need to have things stable before implementing a feature like Brainwarp; they need 75 Hz to 90 Hz running stably first. Otherwise, if they dumped Brainwarp in, it might seem to most people that it works well, while those whose eyesight is better than average might choose to turn it off.

If they then turn Brainwarp on and it has unforeseen problems… well, you know.

The Tested review is an example: the v2 needed some tweaking, as later reviews demonstrated compared to Tested’s preliminary review. Folks are still confused about the rendering of FoV in game (the in-game camera) versus how a game renders the environment. Games need to render more than you can see, so that when you look left you don’t see the environment being drawn in.

As for hardware, yes, it could be a number of components:

Laptop
- not supplying enough amperage for the cable length
- cable not delivering enough bandwidth per spec
- DP on the laptop being limited
- scaler not delivering its specs

Then, of course, software. One needs to eliminate possible hardware causes first before investing time in software. The software has likely had synthetic testing, which is why they are evaluating possible hardware issues.

2 Likes

Thank you. If I could make a suggestion: you should really hire a native English speaker as a translator to be present for interviews with staff. I have seen a number of other things brought up on the forums that seem like the result of translation/communication errors too. It might cost a grand a week to hire one, but they would pay for themselves in the benefits to the company’s English PR communication.

1 Like

“Doubling the perceived rate” is still a rather vague claim (from Pimax). Imagine that the scene is calculated every 1/90 of a second and both eyes are rendered from this calculation. Then the images rendered in Brainwarp mode and “normal” mode are exactly the same; the only difference is that in Brainwarp the right- and left-eye images are offset from each other by half a “tick”, i.e. 1/180 of a second.

This will, however, not improve fluidity, as all inputs (from motion sensors or the game engine) are only taken into account every 1/90 of a second.

To get a response every 1/180 of a second, the game engine would have to support it, i.e. the whole logic would have to render the frames for the left and right eye at discrete, different time points. But then the right and left eye would never show the same scene (the same time point), which may produce other problems in 3D perception.
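The timing argument above can be sketched as a toy timeline (my own illustration of the described behavior, not Pimax’s implementation): scene state is sampled at 90 Hz, and Brainwarp only staggers the *display* times by 1/180 s without adding new scene samples.

```python
TICK = 1 / 90    # scene update interval (s)
HALF = TICK / 2  # Brainwarp per-eye display offset (s)

def normal_mode(n_frames):
    """(display_time, scene_time) pairs: both eyes shown together."""
    return [(i * TICK, i * TICK) for i in range(n_frames)]

def brainwarp_mode(n_frames):
    """Left eye at t, right eye at t + 1/180 s, from the SAME scene state."""
    frames = []
    for i in range(n_frames):
        t = i * TICK
        frames.append(("L", t, t))         # left eye, scene sampled at t
        frames.append(("R", t + HALF, t))  # right eye shown later, same scene
    return frames

bw = brainwarp_mode(3)
display_events = [t for _, t, _ in bw]
scene_states = sorted({s for _, _, s in bw})
# The display updates 180 times/s, but the scene state behind each
# pair of eye images still advances only 90 times/s.
print(len(display_events), len(scene_states))  # → 6 3
```

In other words, the screen refreshes twice per tick, but fresh sensor/engine input still enters only once per tick, which is why fluidity of motion does not actually double.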

1 Like

One thing I forgot to mention: I asked about ASW, or at least their version of it, since it had been reported that they would support it. The CEO said they were working on it, but it’s not ready yet. Same with Brainwarp; they are further along with that, but it’s not ready for a demo, he said.

1 Like