So a new generation of Crysis is here to make us all feel really bad about the big-money systems we have painstakingly assembled and obsessively tweaked and overclocked. Having never really recovered from the abomination that is Metro 2033, I reluctantly pulled down a copy of C3 from the Origin store and fired it up. For review, the system is currently set up as follows:
- Intel SB-EE 3960X on Corsair H70 @ 4.5GHz
- 3x EVGA GTX 680 4GB @ 1.1GHz Core/2.1GHz Mem
- Asus X79-WS PCI-E 3.0/USB 3.0/SATA 3
- 2x Crucial M4 512GB 6Gb/s SATA in RAID 0
- 16GB Corsair PC 15000 matched kit
- Creative X-Fi Fatal1ty PCI-E
- Logitech G510/G13, Razer Naga
- 3xViewsonic VX2268 120Hz 3D
- NVidia 3D Vision 1.0
- NVidia 314.07
The build of this system, and some of its predecessors, has been detailed here on the blog, and nothing has changed since the last build log. So with the system specifics out of the way, on to some results!
5040×1050 – FXAA, 16X AF, Very High textures, Very High advanced settings, Tri-SLI – Observations
Plenty of sites put up deeply objective measurements and tests, complete with detailed frame patterns and timings. Rather than attempt to recreate that experience, I’ll just give a subjective conclusion along with some anecdotal observations on the first 15 minutes of campaign gameplay (especially since there is not yet any formal “Crysis 3 Benchmark Tool”). I can sum it up in one word: unplayable. Framerates ranged from around 8fps indoors and in firefights up to around 35fps in open areas. Worse than the absolute numbers, the framerate was all over the place and very choppy. The “average” may have been something like 13-14fps. A smooth, consistent 20fps might not have been horrible, but as it stands I have to call 5040 with maxed-out settings unplayable.
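Why does a “13-14fps average” undersell the problem? The frame *times* behind those swings tell the story. The numbers below are just the rough ranges I observed; this is my own illustration, not output from any benchmark tool:

```python
# Convert observed framerates to per-frame times. Illustrative only;
# the fps values are the rough ranges observed in-game.

def frame_time_ms(fps):
    """Time spent rendering one frame at a given framerate."""
    return 1000.0 / fps

for label, fps in [("firefight low", 8), ("rough average", 13.5), ("open-area high", 35)]:
    print(f"{label}: {fps}fps -> {frame_time_ms(fps):.1f}ms per frame")
```

A 125ms frame sitting next to a 29ms frame is exactly what reads as “choppy” even when the average looks survivable on paper.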
5040×1050 – FXAA, 16X AF, High textures, Very High advanced settings, Tri-SLI – Observations
Dropping textures down to High did help a bit, but at the end of the day I would still call the experience unplayable. The average pushed up to around 18fps, but the choppiness was still there and the swings were just as dramatic, ranging from a low of around 10fps up to a high of 45 or so. So far, a very disappointing showing for GTX 680 tri-SLI at 5040 surround!
1680×1050 – FXAA, 16X AF, Very High textures, Very High advanced settings, 120Hz 3D VISION ENABLED, Tri-SLI – Observations
Expecting very little, I dropped down to a single monitor but enabled 3D mode. 3D is generally a widow-maker where you can expect about half of what you’d get at a given resolution and settings in 2D. I hadn’t tested single-monitor 2D, but the results in surround certainly weren’t encouraging. I’ve read that the 3D Crysis experience is compelling, however, and that the game was really designed for 3D, which sort of implies that someone has managed to actually play it! In any event, it never hurts to check it out, so a few config changes and another 15-minute run-through of the first level later, the results were in, and they were shocking! Performance with max detail at 1680×1050 in 3D was phenomenal. Gameplay was smooth as silk. Framerates ranged from 80-115fps but generally hovered around 100, even during intense firefights. In addition, Crysis 3 is truly a sight to behold in 3D! If there is ever going to be a killer app for 3D, this is it!
So where does this leave us? Really confused, honestly. I can’t say I’m surprised by the surround performance (or lack thereof), but I am very surprised by the excellent showing in 3D. It almost leaves me wondering if a patch and some optimization are needed, either in the game or in the drivers, and if perhaps some additional (significant) gains might be forthcoming on the 2D surround side. I am going to continue testing for a second update, and possibly capture some in-game video. Stay tuned!
UPDATE: 5:35 mins of footage of 1680×1050 Very High, 3D Vision mode. GPU temps, memory usage and FPS from EVGA Precision in OSD:
Interesting findings from the 3D run:
- VRAM usage never cracks 2GB
- framerate really never dips below 70fps
- fps average hovers around 100
- very little stutter, extremely smooth
- 3D effect is fantastic (best FPS for 3D to date by far IMO)
So what is going on with surround? 1680×1050 in 3D Vision is essentially like running two panels. So why does effectively adding a third panel for 2D surround cause a catastrophic framerate hit? I’m not sure at this point, but it really seems like an optimization issue to me. Hopefully additional testing will shed some light.
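To put numbers on that question, here is a quick back-of-the-envelope pixel-throughput comparison. It assumes 3D Vision shades the full frame once per eye and ignores all driver overhead, so treat it as rough arithmetic, not a measurement:

```python
# Rough pixel-throughput arithmetic (my own back-of-the-envelope estimate;
# assumes one full-frame render per eye in 3D, ignores driver overhead).

def pixels_per_second(width, height, views, fps):
    """Shaded pixels per second for a given resolution, view count, and framerate."""
    return width * height * views * fps

# 1680x1050 with 3D Vision: two views (left + right eye) at the ~100fps observed
single_3d = pixels_per_second(1680, 1050, views=2, fps=100)

# 5040x1050 2D surround: one view at the ~14fps observed
surround_2d = pixels_per_second(5040, 1050, views=1, fps=14)

print(f"3D single panel: {single_3d:,} px/s")   # 352,800,000 px/s
print(f"2D surround:     {surround_2d:,} px/s") # 74,088,000 px/s
print(f"ratio: {single_3d / surround_2d:.1f}x") # ~4.8x
```

The same GPUs are pushing nearly five times the pixel throughput in 3D mode, which is why “ran out of horsepower” is such a hard explanation to accept for the surround result.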
UPDATE 2: 7:21 mins of footage of 5040×1050 Very High, 3D Vision mode. GPU temps, memory usage and FPS from EVGA Precision in OSD:
The plot thickens! This run was really an “ah HA!” moment. Scaling up to 5040×1050 was actually a very good experience with 3D Vision enabled. So what gives here? Is this the first game in history where 3D Vision actually improves performance? I vote no. There are some serious optimization problems with Crysis 3 in 2D surround.
Consider the results. Switching to 5040, the game remained just over the border of playable, with framerates ranging from about 17fps up to about 55fps and a sustained average somewhere between 35 and 45. As with the single-panel testing, cinemas dropped framerates roughly in half, resulting in 14-17fps (but it’s a cinema, so who cares, really?).
After this round of testing, we are forced to ask how 3D surround can perform better than 2D surround, and in my opinion the answer cannot be a lack of GPU resources. In other words, I cannot yet conclude that Crysis 3 really requires Titan for surround. If the 3D Vision performance were simply scaled up by some percentage for 2D, 2D surround would be fully playable on tri-SLI GTX 680. Some highlights from this round:
- VRAM usage finally topped 2GB (just over at around 2050MB)
- The dropped-FPS-in-cinemas behavior stayed consistent
- Scaling from 1 panel to 3 was actually excellent, with performance dropping roughly in half rather than by two-thirds
- If 2D surround scaled up proportionally from 3D surround performance, all things being equal, it would be playable
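That last point can be made concrete. If 2D surround simply did half the per-frame work of 3D surround (one rendered view instead of two) and framerate scaled proportionally, the observed 3D-surround numbers would predict the following. This is a pure proportional-scaling assumption on my part; real pipelines rarely scale this cleanly:

```python
# Naive prediction: 2D surround renders one view per frame where 3D surround
# renders two, so under perfect proportional scaling fps should double.
# (My own assumption, not a measurement.)

def predicted_2d_fps(fps_3d, views_3d=2, views_2d=1):
    return fps_3d * views_3d / views_2d

for label, fps_3d in [("low", 17), ("average", 40), ("high", 55)]:
    print(f"3D surround {label}: {fps_3d}fps -> predicted 2D surround: {predicted_2d_fps(fps_3d):.0f}fps")
```

Even the predicted low of ~34fps clears the ~13-14fps actually measured in plain 2D surround, which is what makes the 2D result look like an optimization problem rather than a hardware limit.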
Video evidence follows:
UPDATE 3: 8:06 mins of footage of 5040×1050 Very High, 3D Vision mode enabled in the driver, but DISABLED IN GAME. GPU temps, memory usage and FPS from EVGA Precision in OSD:
As if things weren’t confusing enough already, now we have some really strange behavior. Turning 3D Vision on at the driver level, but leaving it off in the game settings, resulted in performance that scaled as expected from 3D Vision surround. The game proved fully playable at 5040/Very High, with framerates ranging from a low of 14fps (admittedly dismal) to a high of 70, with averages hovering around 50-55fps. To say this result is surprising is an understatement. I’m working on some theories as to why, summarized below. In the meantime, here is the video of the run:
Interesting notes from this run:
- same perf in cinema as the 3D run (very odd)
- VRAM usage stays the same (2.1GB or so, no surprise there)
- Not a huge increase over the 3D surround run, but it makes the game just a bit more playable, which makes it just about “enough”
- Seems that further driver optimization could make 3D playable and 2D enjoyable (meaning sustained 60fps and no dips below 20)
Before today I would have said with absolute confidence that turning on “3D Vision Enabled” at the driver level simply enabled the functionality, forcing 3D on for games that don’t recognize it, and enabling the potential for games that do. It was my understanding that, in games that did support a stereoscopic toggle, disabling that toggle would simply be the equivalent of turning 3D Vision off at the driver level. This latest round of testing throws all of those theories for a loop.
What I am now thinking is that 3D Vision is possibly a full driver modality, where enabling 3D Vision at the driver level toggles on a separate graphics pipeline regardless of whether stereoscopic rendering is actually on. I am going to try to research this more (shouldn’t be too hard) and report back. If this theory is true, then we will have to wait for NVidia to get its drivers to a better state of maturity before making the final call. This document would tend to suggest that my theory is at least somewhat correct (or at least on the right track):
This bit seems to be relevant:
“When running in 3D Vision Automatic mode, application issued draw calls are substituted for two separate draw calls—one for the left eye and one for the right eye.”
In other words, an app with native 3D support, with that support toggled off, talking to the driver while 3D Vision is ON at the driver level behaves differently than the same app, with the same settings, when 3D Vision is OFF at the driver level.
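My mental model of the quoted behavior, sketched as a toy simulation. Every name and code path here is my own illustration of the theory, not NVIDIA’s actual driver internals:

```python
# Toy model of the "3D Vision Automatic" draw-call substitution described in
# the quote, plus my guess at the in-between mode. Purely illustrative.

def submit_draws(draw_calls, driver_3d_enabled, game_stereo_on):
    """Return the draws the GPU would actually execute under each mode."""
    if not driver_3d_enabled:
        # Plain 2D pipeline: draws pass through untouched
        return [("mono", d) for d in draw_calls]
    if game_stereo_on:
        # Automatic mode per the quote: each draw becomes a left-eye draw
        # and a right-eye draw
        return [(eye, d) for d in draw_calls for eye in ("left", "right")]
    # Driver-level 3D pipeline active but in-game stereo off: draws are not
    # duplicated, yet they still travel the 3D code path -- my theory for
    # why this mode performs so differently from plain 2D
    return [("stereo-pipeline-mono", d) for d in draw_calls]

print(len(submit_draws(["scene"], True, True)))    # 2 draws issued
print(len(submit_draws(["scene"], True, False)))   # 1 draw issued
print(len(submit_draws(["scene"], False, False)))  # 1 draw issued
```

Under this model, the driver toggle selects which pipeline the draws travel through, and the in-game toggle only controls whether they get duplicated per eye, which would explain why the two switches are not equivalent.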
As it stands, though, I can say that for 5040×1050 you do not necessarily need Titan for Crysis 3. GTX 680 tri-SLI can handle the workload. Of course, anyone building a new system would be better off with Titan SLI for the extra $500, but for existing owners of GTX 680 tri-SLI, this is interesting news. Just make sure to enable 3D Vision surround even if you’re running in 2D. And if you don’t have the kit, buying one may be a cheaper “upgrade” than moving to Titan!
All of that said, I have to think that tri-SLI Titan would simply own Crysis 3 surround at 5040 in either 2D or 3D. Whether or not that is worth the cost of an upgrade from tri-SLI GTX 680 (realistically at least a $2000 proposition) is a very difficult question.