I could be wrong, but I believe that was more a memory pool management thing… Did it happen to you without your having increased supersampling (and/or planet/galaxy texture sizes) a lot (…which is what would invoke the issue for me)?
Using larger textures should have pretty much zero impact on performance, other than the extra time it takes to load/generate them…
Yes, but this was in the early days of ED. I don’t think I’ve seen it triggered in over 2 years. I had a graphics card with plenty of RAM, so that wasn’t the problem. I think it was just stupid code that monitored the framerate, instead of the amount of available RAM.
Actually, it can have a huge effect, if your graphics card doesn’t have enough RAM to keep everything in on-board memory.
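To illustrate why larger textures can blow past on-board memory, here is a rough back-of-the-envelope sketch. The format (uncompressed RGBA8) and the sizes are my assumptions for illustration, not anything Elite Dangerous actually uses:

```python
# Rough VRAM estimate for an uncompressed RGBA8 texture (illustrative
# assumption, not the game's actual texture format).
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return int(base * 4 / 3) if mipmaps else base

# Doubling each texture dimension quadruples the footprint:
mb = lambda b: b / (1024 * 1024)
print(f"4096x4096: {mb(texture_vram_bytes(4096, 4096)):.0f} MiB")  # ~85 MiB
print(f"8192x8192: {mb(texture_vram_bytes(8192, 8192)):.0f} MiB")  # ~341 MiB
```

Once enough of these no longer fit in on-board memory, the driver starts swapping textures over the PCIe bus, which is where the performance hit comes from.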
I’ve never heard of CPU speed affecting antialiasing or graphics quality at all. I only know that from graphics cards that lack certain features. At the moment I have no way to test my Pimax in a more modern PC.
You had less aliasing with the new CPU despite the same settings? Perhaps you increased the supersampling because of the extra performance headroom.
Can you start the ROV test? There is a black wall with a pattern of blue and red crosses. That wall flickers like hell in my 8K. I can set the Steam SS as high as I want; the flickering just gets sharper and more pronounced.
I will take a look. Can it be started without controllers? Tron-like graphics viewed at a distance can show aliasing.
I just know that the i5 6500 had the same horrible aliasing you described with your i5 3570K. The bottleneck check shows your CPU causes a larger bottleneck than mine did.
In the Alien Isolation VR mod, even the screen that showed the Xbox controller layout was horrendous.
The 1080 Ti was fixed when paired with a better system, and the i5 6500 looks as it should when paired with a lesser GPU, the R9 390.
Regular non-VR games looked good on the i5 6500 / 1080 Ti combo, but VR not so much.
Since you have your SteamVR SS setting on auto, maybe it bumped up the supersampling after the CPU change, giving a less jagged image. Or does it check only GPU speed? Idk.
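For anyone wondering why an SS percentage change shows up so strongly: here is a sketch of how the SteamVR resolution percentage maps to a render target, under the assumption that the percentage scales total pixel count (so each axis scales by its square root). The per-eye base resolution below is hypothetical:

```python
import math

# Assumed behavior: SteamVR's resolution percentage scales the total
# pixel count, so each axis is scaled by sqrt(percent / 100).
def render_target(base_w, base_h, ss_percent):
    scale = math.sqrt(ss_percent / 100.0)
    return round(base_w * scale), round(base_h * scale)

# Hypothetical per-eye base of 2560x1440, at 20% vs. 100%:
print(render_target(2560, 1440, 20))    # (1145, 644)
print(render_target(2560, 1440, 100))   # (2560, 1440)
```

Under that assumption, auto-SS dropping to ~20% would cut each axis to less than half the base resolution, which would easily explain "jaggy city" on an otherwise capable GPU.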
Anyway, your explanation doesn’t contradict all the experience I’ve gathered in 25 years of PC gaming. In my experience, a weaker CPU only reduces the minimum fps, not the image quality. The same applies to an outdated/slower graphics card, unless it is so old that it doesn’t support the instruction sets certain SS procedures require.
But it doesn’t matter. I expect better antialiasing from an 8K+ simply because of its RGB subpixel structure. Also, I really need a new CPU.
I never said it would reduce the image quality; that is something @Heliosurge thinks (edit: not putting words in someone’s mouth, seems to be thinking). I just gave a possible explanation.
With Auto, the SS on the i5 6500 with the 1080 Ti was under 20%. Major jaggies. For example, even in Alien Isolation the screen showing the Xbox controller layout was jaggy city.