This week we learnt that several next-gen games need a lot of power.
Watch Dogs’ requirements appear to be at Crysis-era levels, while Call of Duty: Ghosts baffled gamers by recommending a GeForce GTX 780. Battlefield 4’s requirements also seem at odds with the kind of machines we expect most PC gamers to own.
But is this all just FUD, or is there more to the story? Keep in mind that for nearly a decade, our (and my own) enthusiasm for PC versions of games that also appear on consoles has been tempered by terrible ports and poor use of the hardware available, resulting in games that look good but don’t scale very well.
For the majority of multi-platform titles on the PC, it was not the lead platform, and the game only had rudimentary controls and options to make it look better. Dead Space 3 was a prime example of this – the baseline hardware required to play the game on low settings was barely higher than the PS3 and Xbox 360.
To illustrate a point, let’s take a look at those minimum requirements for a moment:
| Dead Space 3 | Minimum requirements |
|---|---|
| Operating system | 32-bit Windows XP, Vista, 7 |
| Processor | 2.8GHz Core 2 Duo or better |
| System memory | 1GB (XP), 2GB (Vista, 7) |
| Graphics card | GeForce 6800 or better, Radeon X1600 Pro or better |
| Free storage space | 10GB |
| Sound card | DirectX compatible |
The minimum specifications for a game tell you a lot about how it will look and scale. There are outliers, of course — games with engines that scale very well (Team Fortress 2, League of Legends, Diablo III, etc.) — but these are typically games made for the PC in the first place.
Right before the launch of Dead Space 3, Visceral Games said that the system requirements were so low because the studio wanted the port to have feature and visual parity with its console counterparts. It was one of hundreds of games over the last eight years with a dumbed-down PC version, whether because of development constraints or difficulties in porting and scaling the game.
Granted, some studios did take the time to tweak their games for PC fans. Bethesda fixed Skyrim for the most part; Crytek released some welcome texture and tessellation packs for Crysis 2; Codemasters exposes plenty of options to increase fidelity in its games; and the Metro titles from 4A Games are still system-killers.
But next-gen is here and it’s time for a change. Not only do we now have an industry full of skilled and technically adept game developers, we also have more studios and even indie teams pushing the limits of what is possible on today’s hardware.
I personally consider this somewhat of a renaissance for gaming as a whole. We’re seeing incredibly realistic racers, persistent worlds, huge sandboxes, destructive environments on a large scale, and games that try to change how we play and interact with them. The sky is the limit with the hardware available today and we’ve been asking for years to hit that limit.
Do you remember Crysis? That launch birthed a meme. It was developed primarily for the PC and almost single-handedly changed the hardware scene for enthusiasts, despite the requirements on the box being rather tame. No matter how many people said it was poorly optimised, it looked incredible, and it drove people to buy more RAM, new GPUs, better processors and 64-bit Windows Vista.
Like the S.T.A.L.K.E.R. series, it brought high-end systems to their knees and spurred on a spike in hardware upgrades. Battlefield 3 did a similar thing by cutting off Windows XP support, and its successor, along with the majority of games coming out towards the end of this year, demand a 64-bit OS.
We have to accept that the gaming PC market now runs on the terms of the consoles. The common denominator sets the tone for how games are made and how they look, and for the past eight years, that’s been the Xbox 360.
I’d welcome any push to break out of that. I’d welcome the cost of upgrading if it means I could play games like The Division, GTA V, The Witcher 3: Wild Hunt, or The Crew on full detail without being hamstrung by hardware baselines that date back to 2006. Would you feel the same way?