Do you guys think the 8K will run better once the eye tracking is released?
Be careful, a blueish tint might be due to a false white balance at the moment the picture was taken. Like @sweviver stated, measured with a "lux" app, there is only a marginal difference in terms of brightness. For me, the brightness here looks pretty much the same.
If foveated rendering becomes supported by more game devs, absolutely.
I think @SweViver's comments about the 8k being better for simmers and distance, and the conflict this has with some images, come down to the fact that image quality on the 8k only starts to reach its potential when you crank it the fuck up. If you have a machine that can do it justice, the 8k probably looks much better. I think simmers should hold out as long as possible for his 2080ti benchmarks before making their final call.
Yeah, after Sebastian's video I was all for the 5K+, but now that I've seen SweViver's FPS tests I'm going for the 8K and an RTX 2080 Ti. GTX 1080 with EK waterblock for sale! (eventually)
Edit: Or I don't know anymore. Pimax FFS!!
Sorry to interrupt on this, destraudo, but using LCD panels of the same technology and the same input signal resolution, the 8K image never could be much better than the 5K+ except for SDE. At best the same image with less SDE, that's all. No miracle here: you can't create information! You can trick your brain a little, but not that much; 8x supersampling won't make deleted details reappear. The assertion of a much better 8K with a 2080ti is simply incorrect, a little better maybe. The 5K+ with a 2080ti will also be better with higher supersampling (SS).
Maybe the 8K as it is could still be improved by Pimax, and some people very sensitive to SDE will prefer the 8K, and that's ok. One thing that was maybe overlooked or downplayed by Pimax is that you send a distorted (pre-warped) image to an upscaling process made for 2D upscaling. You skew your distortion and therefore lose a little resolution.
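To put rough numbers on the "can't create information" point, here's a quick back-of-the-envelope sketch in Python. It assumes the specs discussed in this thread (both headsets take a 2560×1440-per-eye input signal; the 8K's scaler chip stretches that to its 3840×2160-per-eye panels), so treat it as an illustration, not gospel:

```python
# Back-of-the-envelope pixel math. Assumed specs (from this thread, not
# official documentation): both headsets accept a 2560x1440-per-eye input
# signal; the 8K's scaler chip upscales that to 3840x2160-per-eye panels.
INPUT = (2560, 1440)     # input signal per eye, 5K+ and 8K alike
PANEL_8K = (3840, 2160)  # 8K native panel resolution per eye

input_px = INPUT[0] * INPUT[1]        # 3,686,400
panel_px = PANEL_8K[0] * PANEL_8K[1]  # 8,294,400

print(f"input pixels per eye:          {input_px:,}")
print(f"8K panel pixels per eye:       {panel_px:,}")
print(f"pixels invented by the scaler: {panel_px - input_px:,}")
# ~4.6 million of the 8K's panel pixels carry no new information: they are
# interpolated from the same 1440p signal the 5K+ displays directly.
```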
I think the Marseille mCable shows that not all upscaling is equal.
We’ve been told that the 8k really shines at high resolution, but also that both headsets will have the same/ very similar overhead for the same resolution.
What we really need is a comparison of the images at the absurdly high resolutions that the 8k supposedly shines at, since surely the 5k+ will also benefit from supersampling. These resolutions won’t be playable with current cards, but might indicate the “future proofing” potential.
Surely not. My LG OLED TV does an impeccable job when upscaling a 1080p HD signal, but it's still a 1080p signal. All the benefit comes from less SDE, and that's not nothing.
I’d have thought so too, but apparently slight improvements are possible (at <1ms latency):
I guess their predictive algorithm just needs to be significantly more hit than miss?
That said, I have doubts whether the supersampling → downsampling → upscaling on the 8k will ever be better than supersampling → downsampling on the 5k+ for the same GPU utilisation.
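To make that concrete, here's a minimal sketch of the two pipelines. It assumes both headsets take the same 2560×1440-per-eye input and the SS factor (hence GPU cost) is identical; the resolutions are the ones discussed in this thread, not official numbers:

```python
# Sketch of both pipelines at equal GPU cost. Assumption (not official):
# both HMDs accept a 2560x1440-per-eye signal and the SS factor is the same,
# so the rendered resolution -- and hence the GPU load -- is identical.
def pipeline_5k_plus(ss: float) -> str:
    render = (int(2560 * ss), int(1440 * ss))      # supersampled GPU render
    stages = [render, (2560, 1440)]                # downsample to native panel
    return " -> ".join(f"{w}x{h}" for w, h in stages)

def pipeline_8k(ss: float) -> str:
    render = (int(2560 * ss), int(1440 * ss))      # same GPU cost as above
    stages = [render, (2560, 1440), (3840, 2160)]  # downsample, then scaler upscale
    return " -> ".join(f"{w}x{h}" for w, h in stages)

print("5K+:", pipeline_5k_plus(1.5))  # 3840x2160 -> 2560x1440
print("8K: ", pipeline_8k(1.5))       # 3840x2160 -> 2560x1440 -> 3840x2160
# The 8K's extra resampling pass can smooth SDE, but it cannot restore
# detail already lost in the downsample to the 1440p input signal.
```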
Yes, but from 2K to 4K you have many pixels left to dither, and the intelligent processing is done between 2K and 4K, in the 4K space!!!
On the 8K, the SS effect is still reduced to the native input resolution afterwards. If the 8K had an intelligent upscaler working in the 4K space, that would be a world of difference, and we could see some major differences like the ones you presented with the Marseille mCable. The price and power consumption would then go up.
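Here's a conceptual sketch of that difference, using Pillow resizes as a stand-in for the scaler chip (the real chip's filter is proprietary, and the choice of Lanczos/bilinear here is purely my assumption for illustration):

```python
# Conceptual sketch only: Pillow resizes as a stand-in for the scaler chip,
# whose actual filter is proprietary. Resolutions are the per-eye numbers
# discussed in this thread, not official specs.
from PIL import Image

ss_render = Image.new("RGB", (5120, 2880))  # stand-in for a 2x supersampled render

# What the 8K does today: the SS render is squeezed down to the 1440p input
# signal, then the scaler stretches it back out. Detail above 1440p is gone
# before the upscaler ever sees the image.
today = ss_render.resize((2560, 1440), Image.LANCZOS).resize(
    (3840, 2160), Image.BILINEAR
)

# A hypothetical "intelligent upscaler working in the 4K space": the SS
# information would be resolved directly at panel resolution, never being
# collapsed to 1440p first.
hypothetical = ss_render.resize((3840, 2160), Image.LANCZOS)
```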
You mean 2K-ish. 1080p (1920×1080) is near 2K; 720p (1280×720) is 1K-ish.
thanks ;-)…