Next-gen console-to-PC ports need more power

Nvidia Titan SLI

This week we learnt that several next-gen games need a lot of power.

Watch Dogs’ requirements appear to be at Crysis-like heights, Call of Duty: Ghosts baffled gamers by recommending a Geforce GTX 780, and Battlefield 4’s requirements also seem at odds with the kind of machines we expect most PC gamers to own.

But is this all just FUD, or is there more to the story? Keep in mind that for nearly a decade, our enthusiasm (mine included) for PC versions of games that also appear on consoles has been tempered by terrible ports and poor use of the available hardware, resulting in games that look good but don’t scale very well.

For the majority of multi-platform titles, the PC was not the lead platform, and the games shipped with only rudimentary controls and options to make them look better. Dead Space 3 was a prime example of this: the baseline hardware required to play the game on low settings was barely more powerful than the PS3 and Xbox 360.

To illustrate the point, let’s take a look at those minimum requirements for a moment:

Dead Space 3 – minimum requirements

Operating system:    32-bit Windows XP, Vista, or 7
Processor:           2.8GHz Core 2 Duo or better
System memory:       1GB (XP), 2GB (Vista, 7)
Graphics card:       Geforce 6800 or better, Radeon X1600 Pro or better
Graphics memory:     256MB
Free storage space:  10GB
Sound card:          DirectX compatible

The minimum specifications for a game tell you a lot about how it will look and scale. There are outliers, of course: games with engines that scale very well, such as Team Fortress 2, League of Legends, and Diablo III, but these are typically games made for the PC in the first place.

Right before the launch of Dead Space 3, Visceral Games said that the system requirements were so low because they wanted the port to have feature and visual parity with its console counterparts. It was one of hundreds of games in the last eight years to ship a dumbed-down PC version because of development constraints or issues with porting and scaling.

Granted, some studios did take the time to tweak their games for PC fans. Bethesda fixed Skyrim for the most part; Crytek released some welcome texture and tessellation packs for Crysis 2; Codemasters exposes lots of options to increase fidelity in its games; and the Metro titles from 4A Games are still system-killers.

Crysis 3

But next-gen is here and it’s time for a change. Not only do we now have an industry full of skilled and technically adept game developers, we also have more studios and even indie teams pushing the limits of what is possible on today’s hardware.

I personally consider this something of a renaissance for gaming as a whole. We’re seeing incredibly realistic racers, persistent worlds, huge sandboxes, large-scale destructible environments, and games that try to change how we play and interact with them. The sky is the limit with the hardware available today, and we’ve been asking for years to hit that limit.

Do you remember Crysis? That launch birthed a meme (“But can it run Crysis?”). It was developed primarily for the PC and almost single-handedly changed the hardware scene for enthusiasts, despite the hardware requirements on the box being rather tame. No matter how many people said it was poorly optimised, it looked incredible and made people buy more RAM, new GPUs, better processors, and Windows Vista 64-bit.

Like the S.T.A.L.K.E.R. series, it brought high-end systems to their knees and spurred a spike in hardware upgrades. Battlefield 3 did something similar by cutting off Windows XP support, and its successor, along with the majority of games coming out towards the end of this year, demands a 64-bit OS.
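
Incidentally, a game that demands a 64-bit OS has to detect one at launch. Below is a minimal sketch in C of the kind of check a Windows launcher might perform; IsWow64Process is the genuine Win32 call for this, but the launcher framing around it is purely illustrative and not taken from any particular game:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
    #if defined(_WIN64)
        /* A 64-bit binary can only be running on a 64-bit OS. */
        puts("64-bit Windows confirmed.");
    #else
        /* A 32-bit binary asks Windows whether it is running under
           WOW64, the emulation layer a 64-bit OS uses for 32-bit apps. */
        BOOL isWow64 = FALSE;
        if (IsWow64Process(GetCurrentProcess(), &isWow64) && isWow64)
            puts("64-bit Windows detected; OK to launch.");
        else
            puts("32-bit Windows detected; this game needs a 64-bit OS.");
    #endif
        return 0;
    }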

We have to accept that the PC gaming market now runs on the consoles’ terms. The common denominator sets the tone for how games are made and how they look, and for the past eight years, that’s been the Xbox 360.

I’d welcome any push to get out of that rut. I’d welcome the cost of upgrading if it meant I could play games like The Division, GTA V, The Witcher 3: Wild Hunt, or The Crew on full details without being hamstrung by hardware requirements that date back to 2006. Would you feel the same way?

More Hardware news:

Watch Dogs: PC specs released, prepare to upgrade

Call of Duty: Ghosts – can your PC run it?

Metal Gear Rising: Revengeance on PC “looking good”

Batman: Arkham Origins better using Nvidia GPU?

Forum discussion

  • ChevronZa

    AMD’s Mantle should fix a lot of the problems with PC ports.

  • Alex Rowley

    Well, that does seem promising, but it’s limited to AMD cards and I’m not sure how many PC games will even use it. The only one I know of so far is BF4. I think the PC will always suffer from crappy console ports. It just depends on how dedicated and not lazy the developers making the games are.

  • Kromas

    Lies. Mantle is open source and therefore can be used by Nvidia.

  • Kromas

    Have you ever played Mass Effect 1 on Xbox and then played the PC version? I say version instead of port because it was a hell of a lot better (except for the weapon overheat bug).

  • Alex Rowley

    Yeah, sure. You can come back and tell me “I told you so” when there are Nvidia cards using Mantle.

  • Kromas

    It being open source and Nvidia swallowing their pride to use an AMD graphics lib are two different things.

  • Jaid Orfali

    At least ATI is not like Nvidia: “let’s make our Physx engine only work on our cards blah blah”

  • Kromas

    That is exactly what I am saying. Even TressFX is open source, but Nvidia refuses to use it because AMD made it.

  • Wesley Fick

    Mantle is only open in the sense that AMD will allow anyone to use it without paying a licence fee, and to add fixes and tweaks to the API. The source code and the inner workings of the GPU are proprietary technology, something AMD would prefer to keep secret. Nvidia could only get in there by adding code that lets developers program close to the metal for its architecture. Even then, that would undermine efforts Nvidia already has in place, like Physx and CUDA. And then there would be no incentive to code for Mantle, because DirectX would do the same thing and be easier to program for.

  • Wesley Fick

    Battlefield 4 and every other game on Frostbite 3.0 will eventually use it. Plus we now have support from Activision and Crytek.

  • Jaid Orfali

    OFC Nvidia are butthurt ATM cause they got no next-gen love :)

  • Kromas

    They don’t really care since they are winning tablet love.

  • iTile

    CUDA is failing… I’m sure you guys have realised that CUDA is not supported by the Kepler and new Maxwell architectures. Support for CUDA stopped at the Fermi architecture, aka up to the GTX 680.

  • Vorastra

    CUDA is supported on Kepler though. So what’s your point?

  • Xileer

    Up to?

    http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-680

    The site says “Including”, and the 690 also has it, as well as the Titan and the 780.

    Mind showing me which mainstream Nvidia GPU doesn’t have it, to support your argument that it is failing?

  • Xileer

    Gears of War on PC has an entire extra campaign :p

  • Jacks

    “Plus we now have support from …” We? Who the hell is we?

  • ArchieChoke

    Hi my name is GoogleIsForeignToMe and CUDA IS GONE! https://developer.nvidia.com/ultimate-cuda-development-gpu

  • iTile

    My mistake, it’s the CUDA encoder that is only supported up to the GTX 680 (the GTX 690 is two lower-end GTX 600 chips).

  • Jaid Orfali

    and they can have em :)
