My overall experience with game reviewers is that they tend to be aware of a game's problems, but they often weigh those problems differently against the game as a whole, which can lead to a massive disparity between a professional review and a user review, or even between two professionals. In general, if I'm going to watch or read reviews, I'll get a few people's opinions and maybe also check out some of the users on Metacritic. In some cases, I'll just wait for an opportunity to try the game myself, or talk to someone I know with similar tastes to mine, since some games are never going to get a fair review from anyone (this is the case with Call of Duty).
Also, keep in mind that professional reviewers tend to be a little more level-headed than a lot of gamers. Granted, this level-headedness sometimes leads them to defend things that shouldn't be defended (e.g. IGN defending the lack of diversity in ME3's endings), but it also keeps them from blowing every issue they have with a game out of proportion, a problem that plagues user reviews. They also don't get involved in the "RAWR! My opinion is right! Ignore anyone who disagrees with me!" ranting that many users fall into.
Overall, I respect professional reviewers, but I don't take everything they say as gospel. I normally follow reviews from a few critics, some "less professional" ones (ex. AngryJoe, ZeitGeist Reviews, etc.), and I might check out a few user reviews. Normally, though, simply playing a demo, watching a gameplay video, and/or reading up on a few features is enough for me to gauge whether or not I'll enjoy the game, and the reviews are just there to catch any extra strengths or weaknesses.
redmoretrout said:
shapaza said:
It's probably this. I don't really follow most "professional" game reviews, so my opinion probably isn't valid, but I do know that users have a tendency to overreact to the silliest bullshit. Remember that whole controversy about the new Dante design for DmC: Devil May Cry? The average Metacritic user score for it is 4.7 (looking at the PS3 version) even though the game is quite playable and basically alright.
A 4.7 is actually an appropriate score for a game that's "playable and pretty much alright." A game that is merely passable and mediocre should land around a five. That's the problem with pretty much every video game review outlet: they don't use the lower three-quarters of the scale, so all of their reviews become meaningless. When every single game falls between 7 and 10 out of 10, you know something has gone wrong.
The problem with this is that it requires a fundamental shift in the way we view grades. When someone receives a 50% on a test in school, they aren't thinking "Well, I did average." No, they are thinking "I am so screwed right now!" Pretty much everything below 70% is a sign that you need serious improvement, with the exact score only indicating how much improvement there needs to be. If you want people to accept 50% as average, then you need to change their thinking on what those grades mean, but that would be very hard considering most people grow up viewing a grading scale as only being acceptable if their grade falls within the 70-100% range. Even in college you won't be able to move on to the next class if your grade is below 70% at the end.
Personally, I think a 1-5 star scale is the most effective. It makes it easier to use the full spectrum without getting redundant (seriously, there's very little practical difference between a 1/10 and a 5/10), and the middle grade (3/5) doesn't look as horrible, even though it comes out to 60% as a percentage. It also distances us from treating video game scores like school grades. Of course, this runs into issues with Metacritic, which standardizes everything onto a 0-100 scale, but even trying to establish 5/10 as the score for an average game would go against Metacritic's norms.