Metacritic

sherbz
Not really sure whether this is the best forum for this, so if mods think it should be moved then feel free.

I have just answered a questionnaire on Metacritic about how they can improve their system. One of the annoyances I have about Metacritic is all the yuppies who score a game 0/10 just because it wasn't quite what they were expecting. I would go as far as to say that these people are just trolls and don't really add anything significant to the discussion. There is no objectivity to their review; they are simply trying to move the score closer to their own viewpoint, because they understand the final number is a plain average and a single extreme score drags it down. This makes the user score less useful in my view. It would be better to either replace it with a positive/mixed/negative rating system, or remove the top and bottom 10% of ratings. I was just wondering what other people's views on this were?
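
For what it's worth, the "remove the top and bottom 10%" idea is just a trimmed mean, and it's trivial to compute. A rough sketch of how it could work (the function and the sample scores below are made up for illustration):

```python
def trimmed_mean(scores, trim=0.10):
    """Average the scores after dropping the top and bottom `trim` fraction."""
    ordered = sorted(scores)
    k = int(len(ordered) * trim)  # how many scores to drop from each end
    kept = ordered[k:len(ordered) - k] if k else ordered
    return sum(kept) / len(kept)

# 20 user ratings: a mid-scoring game plus a 0/10 protest bloc and two 10/10s.
scores = [0, 0, 0, 0, 0, 10, 10, 6, 6, 7, 7, 7, 7, 7, 7, 8, 8, 8, 8, 9]
print(sum(scores) / len(scores))   # plain mean: 5.75
print(trimmed_mean(scores, 0.10))  # trimmed mean: 5.9375
```

Notice that a 10% trim only removes two scores from each end here, so a big enough protest bloc still survives it; that's the weakness of any fixed cut-off.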
 
You know what's even worse than people who have been disappointed and give 0/10 instead of a 3/10 or whatever? People who will give a game 0/10 as a matter of principle because they don't like the publisher or the DRM, and haven't even played the game.
"It might be a good game, but Ubisoft = 0/10." "I haven't played Deus Ex: Human Revolution, but it requires Steam. 0/10."

Sadly there's no practical solution to this problem. They could employ people to read all the reviews and decide which ones to ignore and which to take seriously, but I figure it would take a lot of manpower, and it would be easy for publishers who want to bury bad reviews to tamper with.
 
Yeah, but how valuable is an aggregate number anyway?
Mean reviewer scores for anything vaguely AAA are going to be monstrously inflated, to the point where being vaguely competent and stable pretty much guarantees a game will sit over 8 somewhere, even if it's utterly dull and uninspired dross. You're not allowed to give a game less than 7.3 unless it's niche/indie enough not to have to worry about backlash, or unless it's so terrible that everyone can agree to dogpile, like Aliens: Colonial Marines.

The melodramatic idiots actually serve a useful purpose here: if there's a massive discrepancy between the reviewer score and the user score, that's a good sign there's something to be wary of with the game, and that it maybe needs a little more investigation. Perhaps it's some ludicrous campaign about not liking the ending or not liking Steam, in which case you can ignore it and happily go buy the game; or perhaps it's something serious like always-on DRM, which could be an absolute dealbreaker for those of us with bad internet connections.
 
I think it's just as bad when fanboys give a game 10/10 just because it has flashy graphics and it's their favorite game series. I've seen both Civ5 and DA2 rated very highly, and I cannot fathom why.
 
It's a decent enough indication of the community's view on a game. It won't give you an accurate rating of how good the game actually is, but if the score is incredibly low, that should tell you to do further research before buying, because for whatever reason the community is outraged about *something*. Maybe you don't care about that outrage, maybe you do, but either way it's an indicator that a lot of people dislike one major aspect of the game or another. If users give a game an amazing score? Consistently? That should tell you something too.

The big issue is that these days you can't really trust many reviews or ratings. Companies often pay to get a "preferred" review out there, and a lot of games consistently get good marks even when they have real issues. Regular users chiming in with their own personal views act as a counterbalance to that. In the end there is BS coming from all directions, so you really have to know what you are looking for in a game and ignore most review scores, sticking to in-depth research and making sure the game delivers the gameplay options you personally care about.
 
They could work out the current average and standard deviation, then take the average of only the reviews within 2 or 3 standard deviations of the original average.

Another option would be to give reviews a weighting, so that if someone has only ever reviewed one game, their score has less influence. If they have submitted lots of ratings, you could count how many are less than 4 or greater than 8 and compare that against the total number of ratings they have submitted; if the percentage is too high, the weighting of their reviews could be reduced.

This assumes that ratings would form a bell curve, presumably with a peak around 7 or 8 and dropping off on either side. It's not ideal, though, as there is mostly a bias towards higher ratings.
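
As a rough sketch of both ideas (all the function names, thresholds, and numbers here are invented, not anything Metacritic actually does):

```python
from statistics import mean, stdev

def sigma_clipped_mean(scores, n_sigma=2):
    """Average only the scores within n_sigma standard deviations of the raw mean."""
    mu, sigma = mean(scores), stdev(scores)
    kept = [s for s in scores if abs(s - mu) <= n_sigma * sigma]
    return mean(kept)

def reviewer_weight(history, extreme_cutoff=0.5):
    """Down-weight accounts with one review or a history of mostly extreme scores."""
    if len(history) < 2:
        return 0.5                     # single-review accounts count for less
    extreme = sum(1 for s in history if s < 4 or s > 8)
    return 0.5 if extreme / len(history) > extreme_cutoff else 1.0

scores = [0, 0, 8, 8, 7, 8, 9, 8, 8, 7, 9, 8]
print(round(mean(scores), 2))     # raw mean: 6.67
print(sigma_clipped_mean(scores)) # clipped mean: 8 (the two 0s fall outside 2 sigma)
```

The weights would then feed a weighted average, e.g. sum(w*s)/sum(w) over the clipped scores.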
 
Just as bad are the people who rate games 10/10 to counterbalance the lower ratings, even when many of the lower ratings are there because the game actually sucks, and not because of one particular gripe (e.g. requires Steam or Origin).

Forcing always-online and MP into an SP game should drop a score by about 5 points anyway, though.

It's not like the "professional reviews" on Metacritic are much better, with some pretty dubious sites being accepted, plenty that go "I enjoyed this game, 10/10", and a few that go "I am a complete and utter twit who has no business reviewing an outhouse, 5 and/or 10/10" (e.g. machinima.com).
 

This is why I think they should do something along the lines of dropping the top and bottom 5% of scores. Anyone know IMDB's method? I consistently hear that their rating system is sophisticated and fair (or at least more sophisticated and fairer).
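
As far as I know, IMDB has only ever published the weighted-rating formula behind its Top 250, and says the exact site-wide calculation is deliberately kept secret to frustrate vote stuffing. The published formula is a Bayesian-style shrinkage, WR = (v/(v+m))*R + (m/(v+m))*C, where R is the title's mean vote, v its vote count, C the mean across all titles, and m a minimum-votes threshold. Something like:

```python
def weighted_rating(item_mean, votes, global_mean, m=25):
    """IMDB-style weighted rating: with few votes the result stays near the
    global mean; as votes pile up it converges to the item's own mean.
    The threshold m is a tunable knob (the Top 250 reportedly used m = 25000)."""
    w = votes / (votes + m)
    return w * item_mean + (1 - w) * global_mean

# Five 10/10 fanboy reviews barely move a game off a 6.5 site-wide average...
print(round(weighted_rating(10.0, 5, 6.5), 2))    # 7.08
# ...but the same mean backed by 500 reviews is taken largely at face value.
print(round(weighted_rating(10.0, 500, 6.5), 2))  # 9.83
```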
 
You just need to read some of the individual reviews. That will give you a pretty decent idea of what's going on; better yet, find a video review showing some actual gameplay.

Yeah, it's completely ridiculous when people say "requires Steam, 0/10". First off, not everyone hates Steam. I used to hate Steam when I had super cheap 1.5 Mbit internet: when I bought a boxed copy of Civ5 but had to wait 2 days to download it, I was pissed. But AT&T jacked the price, so I switched to cable at 20 Mbit, and now I love Steam. It's quite stable, downloads are fast for me, and there are tons of specials and games available. But to base your entire rating of a game on the delivery mechanism or the DRM or whatever is dumb; you should be evaluating the game itself at least somewhat. Maybe give it a 3/10 overall and write "game: 8/10, servers: 1/10" in your review.

Also, a lot of games get patched early, or the server issues get worked out. I see a lot of reviews (Fallen Enchantress and Heroes of MM 6 come to mind) where the reviewers say something to the effect of "tons of bugs, should not have been released yet", then a month later it's patched and mostly fine, but that review still affects the Metacritic score a lot.

And then there are games that suffer from the console-port issue. I have held off on buying Dark Souls because everyone says the PC version is really, really bad controls-wise. Conversely, you'll see a lot of complaints about Skyrim being too much of a port with bad controls, but I think it plays fine on PC. So platform is another thing to watch out for.

So, just like with anything you read on the net, you have to take it with a grain of salt.
 
Yes, you can try to do better. You could cut out the highest and lowest reviews, as sherbz suggests. You can even try to be more accurate than "5%" by using statistical methods to identify outlier data points. But then if you get a game that truly is polarizing (e.g., most people find it rather bland, but a small minority of players absolutely love it), you risk not capturing that with your average. Some of Paradox Interactive's earlier titles would have fallen into this category.
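
A toy illustration of that failure mode, with invented numbers: a 2-sigma clip quietly deletes the devoted minority.

```python
from statistics import mean, stdev

# A polarizing game: most players find it bland (4-6), a small minority adores it.
scores = [5, 5, 4, 5, 6, 5, 5, 4, 6, 5, 5, 5, 4, 6, 5, 5, 5, 6, 10, 10]
mu, sigma = mean(scores), stdev(scores)
kept = [s for s in scores if abs(s - mu) <= 2 * sigma]
print(round(mu, 2), round(mean(kept), 2))  # 5.55 vs 5.06
# The clip throws out exactly the 10/10 votes, so the "cleaned" average
# erases the one genuinely interesting thing about this rating distribution.
```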

Or you could only include scores from "trusted" user accounts in your average (make all the reviews visible, but only average ones from some accounts). There's all sorts of stuff you could do in trying to determine what counts as a "trusted" account, but they pretty much all require a fairly significant pool of games already reviewed. So by doing that, you cut out the casual gamers and first-time accounts - if a game is one that has exceptional appeal (or lack of appeal) to the casual crowd, this introduces a lot of bias.

You can try to mix and match: cut out outliers that aren't from trusted accounts, but allow those from accounts you trust. You're starting to patch some holes here, but it's still not that good a result. You can hire PhD statisticians to dissect your review distributions and try to find a distribution model that captures "good" reviews while pruning "noise". Unfortunately, it all rapidly becomes very complex and non-transparent, which makes people reading the "average" score less able to judge how much trust to put in it. If you tell me users rate it a "7.5/10" game, I have only a limited idea what that actually means.

So keep it simple. You take the user reviews. You average them. You present visitors with the average. They know exactly what they're looking at. They can glance at the actual reviews to see the big pile of "0/10: I hate EA" reviews to know the number is distorted; they can actually (*gasp*) read some reviews and see what people thought about the game in detail.
 
Anyone know a good alternative to Metacritic? One place I found recently is videogamegeek.com and I like it, but I'm just wondering if there are similar sites out there I haven't found. What I want is a site that has user ratings of video games and enough users so that the ratings aren't heavily influenced by outliers. Metacritic and VideoGameGeek are okay but it would be nice to know some other places to check.
 
There isn't one. Any similar rating system is subject to the same human flaws. The only good picture you can get comes from reading a variety of (preferably more in-depth) articles, plus word of mouth such as this forum, a few good sites like Rock Paper Shotgun, certain sections of Reddit, etc. Even then, always carry a grain of salt with you.
 
It's indeed quite questionable how representative that final number is. It's mainly made up of utter 10/10 fanboys or 0/10 bashers; the people who really think about and justify their grade are a small minority.

My guide for games is to get the factual information out of professional reviews and the rest out of in-depth player reviews (of course multiple, preferably video, though GMG for example has a good review system). Any review that starts using swearwords I quit reading immediately.
 
I think the "0/10 always-on DRM" reviews do contribute something. I can see why someone who'd been looking forward to Sim City only to learn about the always-on DRM, where that was a deal-breaker for them for whatever reason, would leave a 0/10 review. Essentially, they couldn't play it due to that, so 0 is the best score they can give. And it's also useful since if you see a bunch of 0 reviews, you know there's something you should be aware of.

I generally disagree with the "I would give it 8, but that review lowballed it, so I'm giving it a 10" reviews. Though I can kind of see it on, for example, products on Amazon where someone gave it a low review because of some shipping issue not related to the product. Still, I'd rather see 0/10 lowball reviews than 10/10 highball ones.

What can be done about it? I think it helps to show a distribution of scores; publishing the average and standard deviation could also be useful. But I wouldn't just throw the extreme scores out, and I'd definitely keep at least one view with them factored in.

It could also be helpful to allow users to divide up their score by area. So for example, you could rate it on DRM, graphics, storyline, replayability, and of course overall. And there might be an option to only rate certain aspects - so you could see that while the DRM was awful, and the storyline was mediocre at best, the graphics were glitzy.
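
A sketch of how per-aspect ratings could be stored and aggregated, averaging each aspect over only the reviews that rated it (all the names and structure here are invented):

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional

ASPECTS = ("drm", "graphics", "storyline", "replayability", "overall")

@dataclass
class Review:
    """A user review where any aspect may be left unrated (None)."""
    drm: Optional[int] = None
    graphics: Optional[int] = None
    storyline: Optional[int] = None
    replayability: Optional[int] = None
    overall: Optional[int] = None

def aspect_averages(reviews):
    """Average each aspect over only the reviews that actually rated it."""
    result = {}
    for aspect in ASPECTS:
        rated = [getattr(r, aspect) for r in reviews if getattr(r, aspect) is not None]
        result[aspect] = round(mean(rated), 2) if rated else None
    return result

reviews = [
    Review(drm=1, graphics=9, storyline=5, overall=4),
    Review(drm=2, storyline=6),            # rated only the aspects they cared about
    Review(graphics=8, storyline=5, overall=6),
]
print(aspect_averages(reviews))
# {'drm': 1.5, 'graphics': 8.5, 'storyline': 5.33, 'replayability': None, 'overall': 5}
```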

The ability to filter out early reviews would be really helpful, and not just on game sites. It'd be nice to do that on Newegg, for example, so you could more easily identify products that looked awesome at first but ended up having really poor longevity. And it would help with games that are super buggy on release but well polished thereafter.

Other sites I use for game reviews (and actually, I rarely use Metacritic in much detail) are Amazon and Gamersgate. Amazon's "Most Helpful" sorting usually brings a few reviews with good-to-know-before-buying information to the top, and Gamersgate's reviews are often in-depth. Gamersgate's demographics are skewed in a certain direction, but as long as that's the same direction you are skewed in, that's okay. Or, more generally, try to find a site where the reviewers' preferences tend to align with your own.
 
^^ I'm sorry, but that first part sounds completely wrong. If people give a 0/10 because a game has DRM, it means they are idiots who failed to read the system requirements before they bought it, only to whine about it afterwards. If the DRM didn't work, then they would have a case.

Basically, the only thing you know from a 0/10 review over DRM is that the reviewer doesn't like DRM. That doesn't help you in any way at all, and it is therefore pointless and a waste of internet space.
 
That's not true; sometimes the DRM is too intrusive, and sometimes it's a non-issue. SimCity's DRM was too intrusive. If someone gave, say, Darkspore or Assassin's Creed 2 a 0/10 for DRM when it first came out, I'd understand why. But when people say "OMG, this game requires Steam, DRM, 0/10", I find that kind of silly, since Steam is a one-time activation.

I think the bottom line is: read the reviews, because someone might score a game low, yet when you read the review it turns out the game actually appeals to you, or vice versa.
 
Yeah, and when you use Uplay or Origin, you know what to expect (especially when it's Origin). There the fault lies with the buyer, who didn't do his research. You are more likely to change their approach to DRM by not buying the game than by buying it, not playing it, and writing a two-line review.

Rating a game on one feature (which strictly speaking isn't even a game feature) is therefore pointless.
 
In some cases, like always-online DRM, they try to advertise it as a feature (especially in SimCity 2013's case), and it certainly can and does interfere with the ability to play the game. SC 2013 is like a 3 or 4/10 game anyway, though, and the online DRM is really bad and unnecessary, and came with a truckload of lies and false advertising. Otherwise, Origin really does nothing most of the time and doesn't have any effect on the games themselves.
 
I mean, if you want to see how silly Metacritic can be: To the Moon has an 81 critic score and an 8.9 user score, and it's not a good game at all. Civ5 at release had a critic rating of about 9, I think, and Civ5 at release was terrible; at least the user scores reflected that. It's tough to tell sometimes.
 
To the Moon isn't being rated for its gameplay; it's barely a game, and that's not a mark against it in any way. It's really an interactive story with well-done pixel art, beautiful music, and an emotional story that is way above that of most games (which are largely B-movie level).
 