What was the point in Pimax Day 1 if you have Pimax Day 2 a few weeks later?
Don’t tell me Pimax Day #2 is being held in Japan this weekend…lol.
Okay. Three… Weeks… Later… Check.
The 8KX is going on general sale too.
The 8K-X is the new flagship product and no doubt the one they hope to get maximum sales from. Not limited. Day 2 / orders open ~3rd week of October.
Slide summary of Pimax Day options (unspecified 8K-X discounts for backers and pre-orders, presumably to be confirmed on Day 2):
https://community.openmr.ai/t/pimax-day-announcement-highlights/22174
The livestream in its entirety:
I am also very satisfied with my 8K, except for one thing: the aliasing is much stronger than on the 5K+ or the Lenovo Explorer. Antialiasing does happen, but not very well. It’s very obvious in IL-2 BoS if you take the chase view of your own plane. Compared to my Lenovo Explorer the difference is like day and night.
My guess is that it’s the upscaler that doesn’t “let through” antialiasing as it should. I was encouraged by SweViver’s comparison shot between the 8K and 8K+ of the female head (Skyrim?).
An exchange set of display and board for the MacGyvers out there would be great. At your own risk, of course…
My thought is that the aliasing in the 8K has more to do with the “rainbow” subpixel pattern than anything else. It’s harder to antialias when the subpixels aren’t full RGB stripes.
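To make the upscaler theory concrete, here’s a minimal toy sketch (my own illustration, nothing to do with Pimax’s actual scaler): a supersampled render that’s box-filtered down picks up the in-between grey values that smooth an edge, while a native render that’s merely upscaled keeps only hard 0/1 pixels, so the jaggies just get magnified.

```python
import numpy as np

def render_edge(n):
    """Binary diagonal edge rendered at n x n, one sample per pixel, no AA."""
    y, x = np.mgrid[0:n, 0:n]
    return (x > 0.7 * y).astype(float)

# Supersample: render at 32x32, box-filter down to 8x8. Edge pixels end up
# with fractional coverage (grey steps) -- that's the antialiasing.
ss = render_edge(32).reshape(8, 4, 8, 4).mean(axis=(1, 3))

# Upscale: render at 8x8, blow it up 4x nearest-neighbour, the way a panel
# scaler must. No new information -- still only pure 0s and 1s.
up = np.kron(render_edge(8), np.ones((4, 4)))

print(np.unique(ss))  # many intermediate values -> smoothed edge
print(np.unique(up))  # only [0. 1.] -> jaggies preserved, just bigger
```

That would also fit the observation below that pushing SS only sharpens the edges: if the scaler (or subpixel layout) throws the intermediate shades away, more samples don’t help.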
I myself don’t have much of an issue with aliasing on the 8K with the 1080 Ti since upgrading from the i5 6500 to the R7.
What are you running? I find it’s not bad on my setup, depending on settings.
R7 2700x with 1080ti
i5 3570K @ 4.7 GHz, DDR3 2666, 1080 Ti @ 2.1 GHz. OK, the CPU is outdated, but with Brainwarp I don’t run into a CPU limit :).
If I had known how bad the antialiasing is compared to the 5K+, my choice would have been the 5K+. Unfortunately nobody pointed this out, and even now the difference seems to be rather unknown.
neal_white’s explanation seems plausible to me, but it could also be the upscaler, or a combination of both factors. I don’t know.
The funny thing is that the edges only get sharper the more SS I push in-game or via Steam…
Would you be kind enough to check whether the antialiasing in Contractors or IL-2 BoS differs between the 5K+ and 8K? I only have a Lenovo Explorer for comparison. It’s unfortunately been too long since I tested a friend’s 5K+.
I don’t have either of those games to compare with, but I did experience terrible aliasing on the i5 6500 with a 1080 Ti. Do you by chance have a friend with a 1080 Ti and a newer processor? i5 8000 series or Ryzen 2000 series (R5 or R7)?
It didn’t seem to be spiking on the CPU with the i5 either. Granted, the higher DDR4 memory frequency could also be a contributing factor.
Someone, I recall, recommended 3000 MHz memory, I think.
With your OC you might be okay. According to Bottleneck Calculator, though, at base clocks that CPU has nearly a 60% bottleneck.
But these calculators give more of a basic idea. Testing with a newer CPU would give you a more solid answer.
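For what it’s worth, nobody outside that site knows its real formula, but a toy version of the idea might look like this (the scores and the formula below are entirely made up, picked only so the outputs echo the percentages quoted here):

```python
# Toy model only -- my guess at the idea, not Bottleneck Calculator's real
# math. Scores are invented relative throughput numbers, nothing measured.
def bottleneck_pct(cpu_score: float, gpu_score: float) -> float:
    """Share of the GPU's potential the CPU can't feed, as a percentage."""
    if cpu_score >= gpu_score:
        return 0.0
    return (1.0 - cpu_score / gpu_score) * 100.0

# Hypothetical relative scores, chosen to echo the thread's numbers:
print(f"i5 3570K + 1080 Ti: {bottleneck_pct(40, 100):.0f}%")  # ~60%
print(f"i5 6500  + 1080 Ti: {bottleneck_pct(65, 100):.0f}%")  # ~35%
```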
I just know my upgrade was a considerable one & removed the jaggies to a strong degree.
Let’s hope the weather holds out for Pimax Day 2!
Please be advised…
Did you ever find an explanation for how switching PC parts other than the GPU affects the 3D rendering quality, i.e. aliasing?
Could you specify?
E.g., for a typical abstract case: they all interconnect. If your RAM is slow, the CPU is slowed down because it has to wait for data and can’t feed the GPU fast enough. If your CPU is slow, it likewise can’t provide data to the GPU fast enough and can’t use the RAM’s potential. If the mobo is old, it can’t pass data through to the components fast enough,…
Yes, so how does the PC’s speed affect the quality? That’s what I don’t understand.
Maybe the GPU driver sniffs out that the rest of the PC is crap and decides it’s not worth pushing quality. IDK. @risa2000, help.
That is right: the quality is the same, the fps is different…
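A minimal sketch of why that is (the per-frame timings below are hypothetical, not measured): the CPU and GPU each need a certain amount of time per frame, and the slower of the two sets the frame rate, while the rendered pixels stay identical either way.

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """CPU and GPU work per frame overlap, so the slower stage sets the rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame times: the GPU workload (and hence the image
# quality) is identical in both cases; only the CPU changes.
print(fps(cpu_ms=18.0, gpu_ms=9.0))  # old CPU: ~56 fps, CPU-bound
print(fps(cpu_ms=7.0, gpu_ms=9.0))   # new CPU: ~111 fps, GPU-bound
```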
Some games detect low framerates and automatically reduce the graphics quality settings. I think it’s changed now, but Elite D used to swap in lower quality textures when the framerate dropped below 30 fps.
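A hedged sketch of that mechanism (the tier names and thresholds here are invented, not Elite’s actual code): a controller that steps texture quality down when fps falls under a floor and back up once there’s headroom.

```python
QUALITY_LEVELS = ["high", "medium", "low"]  # hypothetical texture tiers

def pick_quality(measured_fps: float, current: str) -> str:
    """Step quality down below 30 fps, back up above 45 (hysteresis)."""
    i = QUALITY_LEVELS.index(current)
    if measured_fps < 30 and i < len(QUALITY_LEVELS) - 1:
        return QUALITY_LEVELS[i + 1]  # degrade one tier
    if measured_fps > 45 and i > 0:
        return QUALITY_LEVELS[i - 1]  # recover one tier
    return current

quality = "high"
for sample in [60, 28, 26, 40, 50, 55]:
    quality = pick_quality(sample, quality)
    print(sample, "->", quality)
```

The gap between the two thresholds is what stops the quality from flickering up and down every other frame.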
Yeah, that’s the only(?) possibility.
We’re quite far from the original topic, my bad.
Tbh, not entirely sure. But the common thread, if you look at the suggested CPU upgrades without bottlenecks, is that they generally have 6 threads.
But at a guess: CPU-GPU sync, CPU memory controller limits vs. the memory the motherboard supports, CPU feature sets.
The i5 3570K at no OC is slightly faster, but has an almost 60% rated bottleneck, whereas my i5 6500 only had, if mem serves, a 35%(?) bottleneck. The 3570K is a 2012 processor & getting quite old.
An idea, as it might be cheap, would be to see if one could source a better i7 in the 3000 series that is stronger on clock frequency & such.
But if it were mainly just clocks, with games focused on single cores, we’d all have cheap top i3s & equivalents. I think it also comes down to OS elements in utilization. I.e., look at consoles: a laptop-class CPU matched up with a strong GPU (an APU) with 8 cores (XB1/PS4), but the OS is designed around it.