I know you didn’t ask me, but I could not keep it to myself
I used to get EVGA since they had up to 10 years warranty. But this time I might get the FE 3090 since I like the heat/cooler design. Whirling the air around inside the case with those 3-fan designs sucks a bit, so besides the FE the only other option would be an EVGA with an AIO water cooler - but that gets pricey.
I think it’s no accident that there’s only a 10 GB 3080. Sure, the 3080 is fairly priced, but it puts enthusiasts in a tough position where many will opt to spend more for a 3090. No middle-ground choice - by design.
I am not sure if you speak German, and Igor’s not the easiest to keep up with I would guess, but he makes good points for Nvidia even while being skeptical. He’s got an article about it too, but went into a bit more detail in the video.
I fear that you are right about DCS, but its engine seems older than most of the planes it simulates. If they got their act together and truly reworked it to modern standards, I am sure it would be a different story.
Yeah, it’s like Nvidia is basically forcing us to overpay for the 3090 instead of offering something between the 3080 and 3090.
Actually I kind of like it like that. Only thing is they should have used the two empty memory pads and gone for 12 GB on the 3080 - 10 is a bit short.
Nonetheless, a Titan for a Ti price is nice - can’t resist that if the money comes around.
I guess the Super series might pimp those aspects in a year
I like it too. However, I’m not sure the 3090 is worth the premium price. I’m going to wait a while, to see if nVidia releases a 3080 20GB FE version. To me, that seems like the best balance of value and longevity (useful lifespan).
The price is beefy, but I guess besides paying the typical premium for the highest end, the memory is not that cheap and might be a cost factor. The better dies binned for the 3090 probably play a part as well.
Besides, somebody needs to pay for Jensen’s dough scraper assemblage
True. My understanding is that there is a very limited supply of both the 3080 and (especially) the 3090. Given the likely demand, that means nVidia can charge whatever the market will bear, at least until AMD releases its new video cards.
No, but Google Translate does.
I am not so sure. On the multi-CPU/GPU side, large improvements could be realized. But even CryEngine might not perform much better on a single GPU with the amount of geometry typical of flight sims.
Buildings, vehicles, aircraft, terrain, etc, impose a huge amount of polygons (‘geometry’) to process. A flight sim needs to show this accurately through at least 0 ft AGL to 60,000 ft AGL. Going through that entire litany, at 8K supersampled resolution per eye, is legitimately very expensive. Flight sims are also not a genre that gives developers a lot of time to work up clever algorithms to bump map all this - most of the work goes into things like researching/scripting accurate avionics/aerodynamics.
DCS World is about the worst case for all of this, emphasizing competitive scenarios that sometimes benefit significantly from single-frame latency.
Don’t forget those of us who might want to build a custom laptop in addition to the maxed-out desktop. The RTX 2080 Ti does not quite ‘idle’ well enough for some battery-powered use cases, and the 3090 might be the same in this regard.
Good point. But… DCS is not everything. Unfortunately it’s not Nvidia’s fault that the DCS devs don’t care enough to optimize for VR.
Optimization may not be the problem. DCS World (and most flight sims generally) has huge polygon counts, due to the detail needed for all the vehicles, buildings, aircraft, terrain, etc, to the point that even CryEngine might not be much more efficient on a single GPU.
VR generally has huge pixel counts as well.
Fill rate pixel/texture specifications for even the 3090 are not much better than the 2080 Ti.
And because of that, I seriously doubt more FP32 units or similar such things are going to solve the real processing problems of VR.
Where DCS World does need optimization is multi-CPU/GPU.
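To put rough numbers on the pixel-count point above, here is a quick back-of-the-envelope sketch. The headset resolution, supersampling factor, and refresh rate are illustrative assumptions (roughly a 1440×1600-per-eye headset at 90 Hz), not specs for any particular setup:

```python
# Rough sketch of why VR pixel throughput is brutal for flight sims.
# All figures below are assumptions for illustration, not vendor specs.

per_eye_w, per_eye_h = 1440, 1600  # assumed native panel resolution per eye
supersample = 2                    # assumed 2x per axis = 4x the pixels
refresh_hz = 90                    # assumed headset refresh rate

# Both eyes, supersampled in each dimension:
pixels_per_frame = 2 * (per_eye_w * supersample) * (per_eye_h * supersample)
pixels_per_second = pixels_per_frame * refresh_hz

print(f"{pixels_per_frame / 1e6:.1f} Mpixels per frame")
print(f"{pixels_per_second / 1e9:.2f} Gpixels per second of shaded output")
```

Even if a GPU’s theoretical fill rate is several times that figure, each pixel can be shaded and overdrawn multiple times for terrain, clouds, cockpit glass, and so on, which eats the headroom quickly - and that is before any of the geometry cost discussed above.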
EDIT: I get the meme now. Haha.
I think this time the Founders Edition cooling is the best of them all. They literally simulated their cooling solution on supercomputers.
There’s no way I am not watercooling the card this time around. The RTX 2080 Ti already baked the ambient temperature inside my case to above 60 °C, and that is with 5x 140 mm fans that each make as much noise as a hairdryer, plus a PCI-E slot blower, plus two more 120 mm × 38 mm fans!
So I’ll probably try the Founders this time (based on what others say about vendor overclockability), and rip the cooler off.
@RaiN274
And yes, I realize the Founders cooling was specifically designed to blow heat out of the case. That’s not good enough. These things are supposed to break ~325 W at stock clocks, never mind at >2100 MHz.
After reading a lot of technical material about the cards, I think OpenGL-based games like X-Plane also don’t benefit from them. OpenGL workloads apparently don’t lean on floating-point throughput the same way, so the extra FP32 units have nothing to calculate and can’t be counted as additional CUDA cores there.
Haven’t found enough about Vulkan yet for a definitive statement.
Makes perfect sense! OpenGL performance has been a Quadro feature, due to its relevance to CAD modeling, so NVIDIA is not going to be in a hurry to improve that for ‘consumer’ cards.
This is probably also the reason why the 3090 is marketed as a gaming card.
The Titan’s strength was OpenGL, as a cheap Quadro equivalent.
To be fair though, that might be more true of proprietary OpenGL CAD programs. FreeCAD doesn’t need that much GPU power to begin with, and open-source programs can be optimized.
