AMD today (24 October 2013) finally revealed its new high-end GPU, the Radeon R9 290X, based on the GCN architecture and belonging to the new Hawaii family.
The price point revealed for the R9 290X is $549 in the US and Canada. That is nearly half the price of a GTX Titan and $100 cheaper than the GTX780. In many scenarios, the R9 290X is faster than both.
While the R9 290X releases today in the US, Canada, Europe, Australia and select Middle Eastern and Asian countries, its smaller sibling, the R9 290, will only launch on 31 October in the same regions. AMD is also expected to release a new driver in November with frame pacing improvements for all their cards.
Hawaii marks a shift in the way AMD designs graphics cards around its GCN architecture, which has also been slightly revised. The Bonaire-based Radeon R7 260X/HD7790, rather than the Tahiti chip that powered the HD7970 and now the R9 280X, is the basis for the Hawaii architecture. Take the R7 260X, quadruple everything, add the power improvements and AMD TrueAudio, and you have an R9 290X.
[Table: specifications compared for the Radeon R9 290X, Geforce GTX 780 Ti and Geforce GTX Titan]
Shader performance goes up thanks to a total of 2,816 cores arranged into 44 Compute Units (CUs). Just about everything that needed fixing on the Radeon HD7970 has been doubled – tessellation units, Raster Operations units (ROPs), stencil units, and primitive rates (how many geometry primitives the GPU can process in a single clock cycle) all see a boost.
The memory bus widens to 512 bits, standard VRAM capacity is now 4GB, and memory bandwidth increases, albeit only moderately. While the R9 290X is a big improvement for AMD today, what it holds for the future is far more interesting, especially now that 8GB of VRAM is possible on a consumer card. AMD could tack on more CUs to take on even larger, more powerful variants of Kepler in the near future.
The gains in performance, however, are mirrored in power consumption, heat output and noise, especially with the stock cooler. The R9 290X consumes on average 50W more power than a GTX Titan and, depending on the cooler, can run as hot as 94°C.
The 290X is pretty much built for high-end watercooling setups and for use with monitors with resolutions higher than 2560 x 1440. There is also a case for pairing it with a 1080p monitor at 120Hz, although one or two in-game settings will have to be lowered to hit that target reliably with a single card. For fluid gameplay at 120Hz or 144Hz, you’ll still need two cards.
As for who needs to upgrade, it’s simple – anyone still on a Radeon HD5870, HD6970, HD7870 and lower, or a Geforce GTX480, GTX580, GTX660 and lower will be well-served by the R9 290X. Performance almost triples in the case of the HD5870, which has only really begun to run out of steam this year.
Keep in mind that, at a minimum, this card needs to be driven by an AMD FX-8320 or a Core i5-4670K with 8GB of system memory. Anything less is a waste of resources and money. Two of these cards would cost only $100 more than a single GTX Titan and would be almost twice as fast.
Mantle could change everything
One brief note here – many reviewers who record GPU utilisation over time observe that, with the exception of UltraHD 4K benchmarks, the R9 290X rarely ramps up to full GPU load, staying at around 75% in a few cases at lower resolutions. Mantle, AMD’s low-level API, has yet to be detailed or demonstrated with the company’s graphics cards.
But AMD has said before that Mantle wouldn’t have been an end goal if they “were only chasing single-digit increases in performance.” Should Mantle increase performance by 10% or more, the R9 290X, with a bit of overclocking, could possibly take down a GTX690 all on its own. The HD7990 would be faster still.
Though Mantle is currently only supported in the Frostbite 3.0 engine, it may find its way into other studios’ hands once we have a firm idea of the potential performance on offer – and it could put AMD’s products in a decisive lead over Nvidia.
Reviews are overwhelmingly positive
PC Perspective tested the card across a range of scenarios and came away impressed with its performance, but noted that Nvidia had many options to make it less of a threat. It received their Gold award.
TechPowerUp gave it their Editor’s Choice award, but lamented the high power consumption and temperature figures. Despite that, they say, the price makes it all worth it in the end.
AnandTech pointed out that the 290X wasn’t running at full clock speed in many games, causing performance to vary erratically in a few titles thanks to a low-speed fan profile. It gets a thumbs-up from them, especially once aftermarket coolers and waterblocks become readily available.
The Tech Report noted that Nvidia and AMD were now back at performance parity, but that the former company needs to do something about its pricing, as features like PhysX and G-Sync do not speak louder than money. They gave the R9 290X a thumbs-up, adding that it could further improve over time.
Tom’s Hardware awarded the R9 290X its Elite award, a first for any GPU in recent years. They also note that, with the frame pacing improvements in AMD’s drivers, the R9 290X runs well in CrossFire with an Eyefinity setup, even under frame-rating analysis. That wasn’t possible before.
HardOCP noted that what reviewers were showing today was just the tip of the iceberg. Cards with aftermarket coolers will be able to reach much higher speeds with lower temperatures. They gave it their Gold award, the first given to a GPU in a long time.