The views expressed in this column are the author’s alone and do not reflect those of MyGaming.
Anyone who has been playing games for a fair amount of time may have noticed the slow but persistent rise in average review scores over the past decade. Or perhaps you haven’t; it’s one of those things that can sneak past you when you’re not looking, like spotting a head of grey hair in the mirror and wondering how you never saw it start.
I’ll take my typical jab at Activision to illustrate the point. When you think of Call of Duty: Black Ops, what kinds of thoughts resonate in your mind? For me it’s words like “disappointment”, “okay”, “buggy” and “infuriating”, along with recollections of an organisation purporting to defend the rights of gamers describing Black Ops as if it were some kind of infringement of our basic human rights.
Perhaps that particular vocabulary is too harsh, though; perhaps some of you think “overcriticised”, “underrated” and “quite fun”. Certainly there were some who enjoyed it, although even they will readily admit it was a bit less than they had expected or hoped for.
Imagine each person, from each camp, rating this game out of 100. What do you think the average score would be? Picture that number, and then consider this:
Call of Duty: Black Ops holds a Metacritic score of 88 on PS3, averaged across 58 expert reviews. A score of 88, to me, says outstanding. It says that if I in any way consider myself a serious gamer, this is a non-negotiable purchase.
The 402 customers who did make that purchase, however, disagree: their average rating is 58, which in sharp contrast to the former score says nothing more than mediocre.
“We told you what would happen if you gave a sub-80 score, comrade.”
I know it seems like every week I’m on Activision’s case about something, but Black Ops simply serves as an example of discrepancies that have been occurring for quite some time now. The fact is, game reviews are becoming almost worthless. I’ve become far more inclined to trust some good, honest user reviews of a game than those of the “experts”.
Games which are receiving pretty lacklustre write-ups are still walking away with scores in the 70s and even 80s. Our perceptions have shifted to match: a game scoring in the 70s on Metacritic is considered a bit of a failure, and anything less than 80 isn’t viewed as a good way to spend your money. That user score of 58 wouldn’t be read as mediocre – it would be a death sentence. Score below 60 and your game is only being bought by those who like the box. Would you consider an examination score in the 70s or 80s mediocre? Why should a video game review be any different?
So what the hell happened?
It wasn’t long ago when a score in the 80s was something to be damned proud of, and scores cracking into the hallowed 90s were reserved only for the best of the best. The most disheartening thing is that games which aren’t backed by massive publishing houses or arriving on a tsunami of hype are often reviewed a lot more honestly.
The weakened credibility of these reviews is a problem compounded by (and no doubt tied up in) their growing importance. Review scores have become critical to a game’s success – publishers discuss them openly, with some stating publicly that a Metacritic score short of the 80s can seriously harm a game’s fortunes.
With video gaming now a billion-dollar industry, this creates powerful motivation for dishonest practices – there’s a hell of a lot of money on the line. When good review scores can literally be worth millions upon millions of dollars, bribes are certainly not out of the question.
Okay, now remember you only get the other half when Big Rigs 2 hits 90 on Metacritic.
There is, however, more to this than just money. A publication consistently giving a publisher’s games glowing reviews is likely to be given access to exclusive content and pre-release scoops on that publisher’s next big release – a revenue-generating privilege an honest reviewer is highly unlikely to receive.
There’s been evidence of this kind of thing in the hardware world – Nvidia has been accused of cherry-picking superior cards and sending them to reviewers who have given them favourable reviews in the past. Give a bad review and you get blacklisted; the next time a round of cards is being sent out early for review, don’t hold your breath for a package.
Make it stop
So what can we do to curb this? Unfortunately, probably not very much. Seasoned gamers can rely on more than a couple of reviews, but many people read these bloated praise pieces as gospel. All we can really do is continue to support the honest reviewers out there, and call attention to the reviews that stray well beyond plausibility.
Unfortunately, it can be something of a challenge for an honest reviewer when his or her score of 75 is read as terrible, where the intention was “good, but not great”. If you review games, have the balls to stand by your ratings, even under assault from angry nerdrage and publisher scorn. Your true audience will appreciate and respect it.
Do you rely on game reviews to pick your games? Have you felt misled in the past? Do you think the majority of game scores are accurate? Share your experiences in the forums, or comment below!