Sure, this qualifies as certifiable insanity, but we suffer for the art, dammit! Here she is…
The Titan X (working fine in the AGA, by the way; the 6700HQ's busiest core still only hovers around 80% while the GPU hits 99% at 4K) gained about 10fps in Crysis 3 over the (overclocked) 1080, which had itself given about a 10fps boost over the (factory overclocked) 980Ti at 4K/Very High/No AA.
The interesting takeaway is that 980Ti → 1080 → Titan X delivers tangible gains, but much less so if you compare overclocked to stock clocks, so you have to keep the OC going. The good news is Pascal overclocks well. I had pushed the 1080 to +225/+500 (base clock/RAM offsets), pushed the Titan X to +200/+500, and had the 980Ti SC+ at +125/+400.
At 4K/Very High/No AA on Welcome to the Jungle, the results were:
980Ti +125/+400 : 25-40fps
1080 +225/+500 : 35-52fps
Titan +200/+500 : 45-63fps
CPU Utilization and a Side Note about World of Warcraft
At peak framerates the busiest CPU core was at 78% and the GPU hit 99% in all scenarios. That's an interesting contrast to World of Warcraft: Legion, where (also fully maxed at 4K with no AA) I am seeing a big range of 35–100fps depending on location, with the CPU only occasionally pushing up to 90% and the GPU rarely going above 80%. Oddly, the three data points don't coincide as you'd expect. There are times the busiest core is only at about 67% with the GPU around 50% (more like 35% on the Titan X), and framerates are still somehow 38fps. So a good thing to keep in mind when chasing "bottlenecks" is that code optimization remains the one variable you can't control, and in many cases it has the biggest impact.
In terms of the Titan X and the (admittedly odd) pairing with a 6700HQ running stock clocks, at 4K and max details there is still some life left in the CPU. Crysis is a solid test for this, in my opinion (particularly compared to woeful examples like WoW or Metro): you see 99% GPU utilization with the CPU peaking at 80%. Unfortunately the nature of game code continues to make it hard to break the work into threads, so tons of actual computing power is being left on the table at that 80%. Real CPU utilization across all cores is more like 30%, but that's the nature of the beast.
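To make that 80%-busiest-core vs. ~30%-overall gap concrete, here's a minimal sketch of the arithmetic. The per-core numbers are purely illustrative (one hot main-thread core, the rest mostly idle on the 6700HQ's 8 logical cores), not measured values:

```python
# Hypothetical per-core utilization snapshot for a 4C/8T CPU like the 6700HQ.
# One core pinned by the game's main thread, the rest lightly loaded.
per_core = [80, 35, 30, 25, 20, 20, 15, 15]  # percent, illustrative only

busiest = max(per_core)                   # what "busiest core" monitoring shows
overall = sum(per_core) / len(per_core)   # real aggregate across all cores

print(f"busiest core: {busiest}%")   # 80% -- looks nearly maxed out
print(f"overall util: {overall}%")   # 30% -- most of the CPU is idle
```

The point being: a single core near its limit can bound your framerate even while the CPU as a whole sits mostly idle, which is exactly why "the CPU is only at 30%" and "the CPU is the bottleneck" can both be true.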
Is the Titan X worth it for an AGA-equipped, 6700HQ-based Alienware with a 4K panel? Definitely. We're still not seeing sustained minimums of 60fps at full fidelity, and in the majority of cases the CPU never maxes out. That said, there isn't much left on the table with the 6700HQ given the realities of multi-threading in gaming workloads. The next bump up from the Pascal Titan X would probably be about it for the 6700HQ in most cases. Of course, that might mean that Pascal+1 plus the 6700HQ could actually sustain 60fps at full-fidelity 4K, which would be a beautiful thing because, shockingly, we're still not there yet!