I’m going to do something incredibly predictable and boring – compare computer hardware to cars.
Still here? Good, onto the cars then. Most high-end cars come equipped with the most ingenious features you could possibly imagine. Lane assist, to make sure you don’t stray out of your lane on the highway; automatic parking that parks the car for you; night vision cameras to increase visibility on a particularly dark stretch of road – it’s all brilliant. Brilliant, and a complete waste of money.
The important bits of a car have remained the same for decades. A decent, fuel-efficient engine (or if not fuel-efficient, immensely powerful); a safe cabin; perhaps an aircon; enough space for your needs; and four wheels (this last one is debatable). These are the basics that you’ll find in most entry-level to mid-range cars, and that’s how it should be.
So what are you going to focus on when buying a new car? The cute little features that you’ll probably only use once and talk about when you’re boring your mates in between Dota games – confirming that you are indeed a dreary bore? Or are you going to do the logical thing and focus on the things that matter?
The same thing applies to computer hardware.
Let’s talk about graphics cards, and the buzzword “overclocking”. Without getting all hipster about it, overclocking has become incredibly easy and mainstream – anyone can do it, and everyone knows about it.
However, for the most part, gamers just leave their graphics cards on stock settings and carry on until games start lagging, at which point they might dabble with the overclocking settings before upgrading their card. Be honest with yourself: are you going to overclock the card? If not, don’t worry too much about the marketing surrounding overclocking, or the great new cooler that will keep the graphics card cool while overclocking.
Take, for example, CUDA technology found on Nvidia graphics cards. CUDA is “a parallel computing platform and programming model invented by NVIDIA. It enables dramatic increases in computing performance by harnessing the power of the graphics processing unit.” A nifty feature to be sure; however, it remains largely irrelevant to most gamers out there.
Motherboards are not exempt from this, and they will often punt features that most people won’t need.
For example, short of winning the lottery or getting an unexpected bonus at the end of the year, you probably know what your initial budget for a PC is, as well as your budget for future upgrades. Knowing that you probably won’t be willing to shell out for a second or even third graphics card for your setup, why should the fact that a motherboard supports Crossfire/SLI matter to you?
Why should you spend more now “just in case”?
The same trend continues across the entire hardware industry. Mice with a built-in LCD screen that you won’t see 99 percent of the time; keyboards that punt a built-in hand-cooling fan as one of their main features; screens that ship with built-in USB hubs… I could go on for days listing hardware that ships with useless features far removed from the core tasks of the component.
With all of this in mind, it’s important to remember that the features above are useful and serve a purpose for a small part of the consumer market. However, if you aren’t one of those consumers, why should the features matter to you? I blame marketing for making gamers lose sight of the core features of a piece of hardware. Marketers will always try to punt some obscure, irrelevant feature if they think consumers will bite. Bastards.
What matters most to gamers is performance. The ability to support current gaming technologies such as the latest DirectX is secondary, but still important. Beyond that, do you really care if alternative options run a few degrees cooler? Do you really care if hardware supports Tri-SLI, CUDA or even Eyefinity, for that matter?
Keep your focus on the features that matter; you’ll probably save money, and bore your friends less.