I'll try to organize an FX-6300; as for SLI and CrossFire, I'm not going to bother with that for now.
Most normal gamers don't bother with multi-GPU setups.
There is quite a marked difference in frame time delivery between the platforms. If you go way, way back to Tech Report's original story on frame time latency, they ran a large test with 18 CPUs and every one of them showed frame time variance.
Inside the second: Gaming performance with today's CPUs
Joker, you're correct about the bottlenecking you see in the Anandtech graph, and even the A10-5600K seems to run pretty well alongside the other chips, but they're only reporting averages. Parts of games like Last Light, Civilisation V (actually, let's just say most of Civ V) and DiRT are GPU-dependent, whereas other parts, perhaps even an overwhelming majority, are CPU-dependent.
http://techreport.com/r.x/cpu-gaming...-beyond-50.gif
Because they only show average FPS instead of the minimum and maximum frame rates and the way they oscillate, those graphs mean bugger-all. I like what you're doing, but I wish Anandtech would just move to FCAT benchmarking already.
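For anyone who wants to see the difference in numbers, here's a minimal Python sketch. The input format is just an assumption (one per-frame render time in milliseconds per line, which is not exactly what FRAPS or FCAT dumps), but it shows why an average can look fine while the 99th-percentile frame time and the "time spent beyond 50 ms" figure tell a different story:

```python
# Minimal sketch: average FPS vs frame-time metrics.
# Assumes one per-frame render time (milliseconds) per line -- adapt the
# parsing to whatever your FRAPS/FCAT export actually looks like.

def frame_time_metrics(path):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]

    total_s = sum(times_ms) / 1000.0
    avg_fps = len(times_ms) / total_s            # what most review graphs report

    ordered = sorted(times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]            # 99th-percentile frame time
    beyond_50 = sum(t - 50.0 for t in times_ms if t > 50.0)  # ms spent past 50 ms per frame

    return avg_fps, p99, beyond_50

if __name__ == "__main__":
    avg, p99, spill = frame_time_metrics("frametimes.txt")
    print(f"avg {avg:.1f} fps | 99th pct {p99:.1f} ms | {spill:.0f} ms beyond 50 ms")
```

Two rigs can show nearly identical averages while one racks up far more time beyond 50 ms, which is exactly what the Tech Report graph above is plotting.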
Once I've got my benchmarking rig set up, I can switch to using FRAPS data for my reviews. I did some testing a while ago, and although my Athlon X3 and HD6870 keep up well, the graphs clearly show where I was experiencing stutter and for how long. Seeing those results for the first time gave me the data I needed to smooth out my experience by adjusting settings until I had less frame variance.
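In case it's useful to anyone doing the same thing, this is roughly how I'd pick the stutter out of the raw numbers. It's a rough Python sketch, and the "twice the local median" threshold is my own assumption, not anything FRAPS gives you:

```python
# Rough sketch: flag frames that take much longer than their neighbours.
# `times_ms` is assumed to be a list of per-frame render times in milliseconds.
from statistics import median

def stutter_frames(times_ms, window=30, spike_factor=2.0):
    """Return (index, frame_time) pairs for frames above spike_factor x the local median."""
    spikes = []
    for i, t in enumerate(times_ms):
        local = median(times_ms[max(0, i - window):i] or [t])
        if t > spike_factor * local:
            spikes.append((i, t))
    return spikes

# Example: a steady ~60 fps stream with two hitches.
sample = [16.7, 16.9, 16.5, 17.0, 16.8, 55.0, 16.6, 16.7, 48.0, 16.9]
for idx, t in stutter_frames(sample):
    print(f"frame {idx}: {t:.1f} ms")   # prints the two hitches at frames 5 and 8
```

Counting how many frames get flagged, and how long the spikes last, is what let me tell which settings change actually smoothed things out.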
Or just disable one module on the FX-8350 and clock it down ;-)
I don't see how it could make a difference. If you get 50fps on one machine and 50fps on the other, it's all the same. The only thing I can think of that makes a difference is that AMD rigs tend to be slightly louder.
10fps might not be that perceptible at high frame rates, but if you end up in a scenario where you're oscillating between 30 and 40fps from one CPU to the next, you will definitely notice it. And after some time your average fps on any rig is going to come down to that level eventually - Crysis 3 will strain any GPU on the market.
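To put rough numbers on that swing (just back-of-the-envelope arithmetic, not measured data):

```python
# Back-of-the-envelope: an fps swing is really a frame-time swing.
for fps in (30, 40, 60):
    print(f"{fps} fps -> {1000.0 / fps:.1f} ms per frame")
# 30 fps is ~33.3 ms and 40 fps is 25.0 ms, so bouncing between them
# means roughly 8 ms of jitter per frame -- the kind of variance an average hides.
```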
Intel is better for gaming in the long run if you're not going to upgrade often. Newer games will start to utilize more cores, though, so AMD sometimes catches up, especially when you're streaming a game.
That is an awesome idea. I also don't really see how you can notice gaming performance differences between an Intel and an AMD chip. I would say maybe when it comes to CPU-intensive apps like Adobe Premiere or After Effects, then maybe... with a big maybe.