One of our favorite pieces from the past year, originally published on August 26, 2022.
You have to forgive PC gamers for banging on about Crysis all the time – it feels like an age since a game was so impossible to run on existing PC hardware that graphics cards effectively had to be redesigned to accommodate it. But we’ve had some close calls with impossibly demanding games over the past few years, and it’s got me wondering: which game will be the next to bring our PCs to a standstill with its extreme system requirements?
If you’re just after a list of games primed to show off your all-singing, all-dancing rig right now, we’ve got you covered with our list of the best games to show off your new graphics card. I’d rather look ahead to what’s coming, and whether we’ll ever reach that Crysis point again – I’m not yet convinced we will.
But some upcoming releases are emerging as contenders. Starfield, for one. It’s a new IP from Bethesda, but built very much in the mould of the company’s greatest successes, The Elder Scrolls and Fallout. A fresh open world (or rather, open universe) running on a brand new version of Bethesda’s well-worn and notoriously demanding Creation Engine – this is sure to be a lovely, if very taxing, game.
The Starfield trailer from earlier this summer only gave a glimpse of a rocky planet or moon, but even those chunky space rocks looked impressively detailed. The actual game at release could look a lot different from what we’ve seen so far, especially since its launch has been pushed back to 2023. Still, the shadows and ambient occlusion in that trailer alone seem enough to make a graphics card whine.
Maybe the huge void in space will be easy on the CUDA cores. In space, no one can hear your graphics card’s fans screeching.
Then there’s The Witcher 4. While that name hasn’t been confirmed, we do know that CD Projekt Red is working on the next instalment right now, and that it’s ditching its in-house REDengine for this one. Instead it’s siding with Epic’s Unreal Engine 5 (UE5), joining the legions of games in development on that engine. The game will undoubtedly be gorgeous, but I wonder whether the toll on our hardware will actually be softened by the move to a more mainstream game engine.
The game development business has learned to do much more with much less.
“Players can go in any direction they want, they can handle content in any order they want, theoretically,” said CD Projekt Red’s Slama earlier this year. “To really encapsulate that, you need a very stable environment where you can make changes with a high degree of confidence that it’s not going to break in 1,600 other places down the line.”
Already proven impressive in its breadth and detail, UE5 feels like a good fit for the much-anticipated Witcher game, and we can hope it will offer a much improved launch experience over CD Projekt’s last game, Cyberpunk 2077.
Cyberpunk 2077 is a recent example of a game that really pushed the graphics hardware of its time, but was that down to its impressive expanse or to a not-so-optimized engine? A mix of both, maybe more the former at times, but the lack of optimization really put a damper on the game’s performance. It’s important to distinguish between a game that is demanding for the right reasons and one that is demanding for the wrong ones.
Perhaps the closest we’ve come to a watershed moment for graphics hardware like Crysis is the use of ray tracing in modern games, so I’d assume that whatever game we’re waiting on to become the next benchmark for GPU performance will use it to impress to some degree.
Bouncing a ray for every pixel on screen is thirsty work even for graphics cards designed with that in mind. The RTX 30-series manages to lighten the load somewhat with more capable RT Cores than the RTX 20-series, and since then we’ve seen AMD join in with its own RDNA 2 Ray Accelerators, which are moderately decent at the job. But it’s still a pretty significant price to pay for beautiful reflections and shadows.
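To get a feel for why per-pixel ray tracing is such thirsty work, here’s a rough back-of-the-envelope sketch. The resolution is real 4K; the rays-per-pixel and frame rate figures are illustrative assumptions, not measurements from any actual game or renderer:

```python
# Back-of-the-envelope ray count at 4K (illustrative numbers, not real renderer stats).
width, height = 3840, 2160   # 4K output resolution
rays_per_pixel = 2           # assume 1 primary ray + 1 bounce per pixel (hypothetical)
fps = 60                     # target frame rate

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * fps

print(f"{rays_per_frame:,} rays per frame")    # 16,588,800
print(f"{rays_per_second:,} rays per second")  # 995,328,000
```

Even with these conservative assumptions, the GPU is tracing nearly a billion rays every second – before any shading, denoising, or rasterization work is counted.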
Just look at F1 22. It certainly looked the part, with ray-traced reflections, shadows, and ambient occlusion sparkling off the side of Sainz’s Ferrari, but even an RTX 3080 struggled to make ray tracing worthwhile. That’s with both Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution (FSR) on hand to claw back image quality and performance.
Ah yes, upscaling. Upscaling could change everything. Are extreme demands for more cores, more VRAM, and faster clock speeds being swept under the upscaling rug? I’d argue that upscaling is doing, and will continue to do, as much for PC performance at large as faster GPUs.
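The basic arithmetic behind upscaling’s appeal is simple: render internally at a lower resolution, then upscale to the output resolution, and the GPU shades far fewer pixels per frame. A minimal sketch – the render-scale figure is a rough assumption in the ballpark of typical “quality” modes, not any vendor’s official spec:

```python
def shaded_pixel_ratio(out_w: int, out_h: int, render_scale: float) -> float:
    """Fraction of output pixels actually shaded at a given per-axis render scale."""
    internal_w = int(out_w * render_scale)
    internal_h = int(out_h * render_scale)
    return (internal_w * internal_h) / (out_w * out_h)

# A quality-style mode often renders at roughly 67% per axis (illustrative).
ratio = shaded_pixel_ratio(3840, 2160, 0.67)
print(f"GPU shades ~{ratio:.0%} of the output pixels")  # ~45%
```

Shading well under half the pixels is a huge saving, which is exactly why upscalers can make otherwise punishing settings playable.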
Indeed, that brings us back to the next Witcher game: since CDPR is opting for UE5 rather than its own engine, it will likely ship with support for UE5’s built-in upscaler, Temporal Super Resolution (TSR). Not to mention whichever other upscalers CDPR decides are worth integrating into the game. There may be many of them by then, as Nvidia’s framework for easier integration of even competing upscaling techniques, known as Streamline, may be in heavy use by the time the game’s release rolls around.
Considering console development, and what feels like a shift away from PC-exclusive development, it also looks like the days of crazy schemes to push PC hardware over the edge may be waning. With games built to run across many PC-like consoles of varying power and capacity, any developer will be keen to at least maintain consistent performance across most platforms.
It doesn’t necessarily close the door to extreme presets on PC, but I expect it to at least reduce their regularity.
It comes down to what we classify as the next ‘Crysis’ – in the sense of a PC-breaking, demanding title. It’s not just a game that struggles to run at 4K and 120Hz on a high-performance graphics card; we already have plenty of those. It’s a game so frankly absurd in its adoption of advanced graphics technologies and techniques that your PC gets goosebumps just installing it.
Searching for a game that feels as absurd as Crysis once did, I look at the big games of the past two years and those coming in the next two, and I just don’t see anything that fits. Even Crysis Remastered wasn’t a match for its older self in this regard, although it didn’t perform particularly well at launch either. That was down to poor CPU utilization rather than some new-age graphics technology, though, and it was later patched into proper shape.
The next generation of games is going to be beautiful, I’ve never doubted that, but perhaps the reason we won’t see another Crysis moment is simply that the business of game development has learned to do a lot more with a whole lot less. Times have changed: Crysis arrived when PC performance wasn’t measured in hundreds of frames, just an expectation of a steady 30 frames per second. Developers also weren’t bringing console exclusives to PC the way they do now, and today we score big first-party games on PC like God of War. The goalposts have moved, and people expect a lot more from their games. I don’t know whether a game like Crysis – with demands so high they closed the door on the majority of players – would be met with the same awe today as it once was. A publisher certainly wouldn’t be too keen on the idea, either – someone spent a lot of time and money making that game.
It’s for the best, really – while it was a fascinating and exciting time in graphics development, it was also quite annoying for those of us without the newest graphics cards, unable to play Crysis at any semblance of a decent frame rate.