Ars Technica: The spotty death of gaming review scores

The idea of adding a score or grade to the end of a review isn't unique to games, of course. Reviews of music, movies, TV shows, and even tech gadgets routinely sum up the text with a number, a set of stars, a letter grade, or some other shorthand for classifying the work in relation to its peers. But video games are somewhat unique in just how much focus is put on these scores. Gamers pore over relative scores to the point of obsession, arguing over whether one game really deserved a better score than another or what various scores really mean ad infinitum.

Much more at the source: http://arstechnica.com/gaming/2015/02/the-spotty-death-and-eternal-life-of-gaming-review-scores/
 
I don't think review scores are the problem.

Schreier's example of the comment section of RPS vs. IGN fell completely flat with me. My theory is that even if IGN dropped review scores, their comments would still be a festering dump. It's more about the audience than the scores themselves. RPS definitely serves a different crowd.
 
I don't think review scores are the problem.

Schreier's example of the comment section of RPS vs. IGN fell completely flat with me. My theory is that even if IGN dropped review scores, their comments would still be a festering dump. It's more about the audience than the scores themselves. RPS definitely serves a different crowd.

Review scores are a problem. They drive unhealthy relations between publisher/PR and the press, and they're bad for devs whose contract bonuses are negotiated around the review score.
 
Review scores are a problem. They drive unhealthy relations between publisher/PR and the press, and they're bad for devs whose contract bonuses are negotiated around the review score.

That's a problem alright, but it doesn't have to do with review scores. It has to do with the relations between PR and the press. And the other has to do with the relationship of the developer and the publisher and the stipulations of bonuses in the first place.

If review scores ceased to exist, you don't think publishers like Bethesda would come up with other (possibly unfair) bonus criteria? Maybe there would just be absolutely no bonus offer at all. Not sure that's better.
 
I don't think review scores are the problem.

Schreier's example of the comment section of RPS vs. IGN fell completely flat with me. My theory is that even if IGN dropped review scores, their comments would still be a festering dump. It's more about the audience than the scores themselves. RPS definitely serves a different crowd.

There is definitely a crowd that will argue over reviews regardless of a score at the end.
 
That's a problem alright, but it doesn't have to do with review scores. It has to do with the relations between PR and the press. And the other has to do with the relationship of the developer and the publisher and the stipulations of bonuses in the first place.

If review scores ceased to exist, you don't think publishers like Bethesda would come up with other (possibly unfair) bonus criteria? Maybe there would just be absolutely no bonus offer at all. Not sure that's better.

I agree.
Everyone uses scores, and nobody really gives a shit, so the problem has to come from somewhere else.

I don't care one way or the other, though, since I rarely read game reviews.
 
With more major sites ditching them, it's going to lead to a pretty funny situation where all those devs with Metacritic bonus clauses in their contracts are having their financial fates decided by the most random backwater sites.
 
With more major sites ditching them, it's going to lead to a pretty funny situation where all those devs with Metacritic bonus clauses in their contracts are having their financial fates decided by the most random backwater sites.

So long as BioGamerGirl is on the scene, their bonuses are safe.

Anyway, I welcome the dropping of review scores and the ushering in of "Yes/No/Not Yet" articles like Kotaku's, so the more the merrier.
 
With more major sites ditching them, it's going to lead to a pretty funny situation where all those devs with Metacritic bonus clauses in their contracts are having their financial fates decided by the most random backwater sites.

BiogamerGirl: "all according to plan"

edit: goddamnit rokker you motherfucker
 
I don't think there's inherently a problem with a review 'score'.

The problem is that they don't adhere to their own scale. When a game gets shit on from a great height and then gets a 7 out of 10 there's something funky with the score system.

They also don't account for subjectivity. How do you score a game in a genre that you dislike?
 
Review scores are so useful as a data point that people can argue about them.

If you remove them, people can't really interpret what's up, and they don't argue.
 
"7/10? Ha, this game sucks. Suck it fans of other consoles/genres/studios/publishers. What a failure of a game."

Never, ever read the review to find out what the reasons behind the score were or, in fact, whether it's just a genre or style the reviewer doesn't particularly like.

Sorry, I'm just ranting.
 
My problem with game scores is that not one site even has a rubric for how they calculate these scores. Some at least have categories, but how each category got its score is totally superficial and inconsistent. Why does "graphics" get an "8," for example? How is an 8 justified as opposed to a 9? A lot of these scores, game reviewers honestly just pull out of their asses.
 
The problem is that software isn't set-and-forget. Games now regularly have patches, DLC, and expansions. Putting a score on a finished piece of work and expecting it to mean much was dumb enough, but slapping a static score on a moving target is downright disingenuous. If you want your score so badly, then you need to revisit every single updateable game regularly to make sure that score accurately reflects the product in its current state. If that sounds like a lot of work, well, it is, and that's why you should at least drop the score portion and still append to and patch reviews as the software changes.
 
I think Craig on the latest Crate and Crowbar podcast made a good argument for keeping scores. The shift Eurogamer made is essentially useless because it's now just a less nuanced scoring system.
 
I say get rid of review scores. It wouldn't be so bad if people actually had to read reviews for once rather than scrolling to the bottom of the page and looking at the score then spazzing out. That said, Eurogamer's new system is silly.
 
I just wish people reading the reviews understood what the score represented.

IGN uses, for example
10.0 - Masterpiece

Simply put: this is our highest recommendation. There’s no such thing as a truly perfect game, but those that earn a Masterpiece label from IGN come as close as we could reasonably hope for. These are classics in the making that we hope and expect will influence game design for years to come, as other developers learn from their shining examples.

Examples: The Last of Us, The Legend of Zelda: Ocarina of Time, Grand Theft Auto IV
9.0-9.9 - Amazing

We enthusiastically recommend that you add these games to your to-play list. If we call a game Amazing, that means something about it seriously impressed us, whether it’s an inspired new idea or an exceptional take on an old one. We expect to look back at it as one of the highlights of its time and genre.

Example: BioShock Infinite, The Walking Dead, Sid Meier's Civilization V: Brave New World
8.0-8.9 - Great

These games leave us with something outstanding to remember them by, usually novel gameplay ideas for single-player or multiplayer, clever characters and writing, noteworthy graphics and sound, or some combination thereof. If we have major complaints, there are more than enough excellent qualities to cancel them out.

Example: Rock Band 3, State of Decay, NHL 13
7.0-7.9 - Good

Playing a Good game is time well spent. Could it be better? Absolutely. Maybe it lacks ambition, is too repetitive, or has a few technical bumps in the road, but we came away from it happy nonetheless. We think you will, too.

Example: Resident Evil 6, Call of Juarez: Gunslinger, God of War: Ascension
6.0-6.9 - Okay

These recommendations come with a boatload of “ifs.” There’s a good game in here somewhere, but in order to find it you’ll have to know where to look, and perhaps turn a blind eye to some significant drawbacks.

Example: Tom Clancy's HAWX 2, Wonderbook: Book of Spells, Disney Epic Mickey 2: The Power of Two
5.0-5.9 - Mediocre

This is the kind of bland, unremarkable game we’ve mostly forgotten about a day after we finish playing. A mediocre game isn’t something you should spend your time or money on if you consider either to be precious, but they’ll pass the time if you have nothing better to do.

Example: Dust 514, Time & Eternity, Game & Wario
4.0-4.9 - Bad

For one reason or another, these games made us wish we’d never played them. Even if there’s a good idea or two in there somewhere, they’re buried under so many bad ones and poor execution we simply can’t recommend you waste your time on it.

Example: Aliens: Colonial Marines, Medal of Honor: Warfighter, Dark
3.0-3.9 - Awful

You’re welcome. We just saved you from making a terrible mistake by buying this collection of poorly executed, bad, or unoriginal ideas – or even playing it for free. While even a Bad game generally has some bright spots, an Awful one is consistently unenjoyable.

Example: Samurai Warriors 3, Let’s Fish! Hooked On, Legends of Dawn
2.0-2.9 - Painful

Let’s face it: anything worse than Bad is a trainwreck. Worse than Awful? That’s kind of impressive. Not only are these games not fun, but they’re outright infuriating or insulting.

Example: Quantum Theory, Fast & Furious: Showdown
1.0-1.9 - Unbearable

The silver lining of these dark clouds is that they’re often so poorly made that they crap out after a certain point (if they ever worked at all), so we were spared from any permanent effects that playing a game this terrible might have on our brains.

Example: The Simpsons Wrestling, London Taxi Rush Hour, Elf Bowling 1 & 2
0-0.9 - Disaster

One of the worst games ever made. Games that score this low are rare, because it’s reserved for those that simply don’t work or are outright frauds – they’ve really got to work for it. This is also probably where we’d put a game about how awesome Nazis are.

Example: Extreme PaintBrawl

GameSpot's...
10 - Essential

9 - Superb

8 - Great

7 - Good

6 - Fair

5 - Mediocre

4 - Poor

3 - Bad

2 - Terrible

1 - Abysmal

Jim Sterling

10 (Sterling): A 10 represents the finest of the fine, an exemplar of its genre, and the current game of its type to beat. While nothing in life is perfect, these games come as close to the ideal as one can get. Such a score is not given lightly, and is reserved for true pinnacles of the medium. A pinnacle can be relative – another game may eventually come that bests it, but for now, this is the kind of stuff the industry ought to strive for.

9 (Superb): A 9 represents excellence in almost every area, or at least a consistently delightful experience from beginning to end. There may be problems with the game, but they’re of a negligible variety, and often include such criticisms as, “I wish there were more of the thing that was brilliant.” While not a genre leader, it’s truly a beautiful game in several significant ways.

8 (Great): An 8 represents something that could prove immensely enjoyable to a majority of people, if not everybody. There are one or two noteworthy blemishes on their records, something holding them back, but nothing so major as to not be worth a lot of people's time and energy.

7 (Good): A 7 represents a favorable slice of entertainment that ought to prove welcome in the right house. Not the most glamorous, polished, or jawdropping, but most definitely good for a chuckle or two.

6 (Alright): A 6 represents an acceptable game, the kind of experience unimaginative reviewers (like me) would call “solid.” These workaday games put the hours in, do their time, and manage not to offend the senses too much. They’re okay!

5 (Mediocre): A 5 represents “true neutral” on the scale. It’s not good, it’s not bad. It sits perfectly in between, doing nothing to stand out. It’s not going to ruin your day, but it’s not going to add anything positive, either. Truly the kind of videogame that exists solely to exist.

4 (Subpar): A 4 represents a below average, inferior experience. There may be some high points, a couple of hopeful moments, but they soon give way to the notably less favorable issues.

3 (Poor): A 3 represents a game with some significant damage. While it may have had some potential at one point, that’s been lost to lousy design, glitches, or some other unfortunate failure. Might be interesting… sometimes… but rarely.

2 (Bad): A 2 represents a straight-up bad game. A thorough disaster, there is no hope of a positive experience ever shining through all the broken features and atrocious ideas. Only the truly desperate will be able to dig through the mire and find something passable.

1 (Accursed): A 1 represents not just a bad game, but something offensively bad. Typically, but not always, something so truly vile that the reviewer can’t even manage to get to the end. The game doesn’t have to be broken beyond playability, but that’s common. It could also be so unintuitively designed, intellectually insulting, or even morally bankrupt as to render it beyond salvation. Either way, there is NO potential for a good time, even a meager one. There’s no talent, no skill, no depth, and no hope. This is… The Accursed.
 
I don't think review scores are the problem.

Yeah, review scores aren't really the biggest thing wrong with gaming reviews. In fact, I wouldn't be surprised if a big part of the reason for dumping them is so that they can't be held to account when they lose their sh*t over the latest super-hyped AAA marketing vehicle, only to turn around a few months later with an oh-we-knew-it-was-crap-all-along. I wish people would just pay them no mind.
 
People could "interpret what's up" by reading the text of the review.
You're missing the point of aggregate scoring and the convenience of quickly surveying an almost infinite number of opinions. There's really no argument to be made in favor of limiting yourself to however many full reviews you have time to read in the face of that. Scores also let you better choose which reviews you actually want to read, should you choose to read any.
 
Review scores should be hidden until a few minutes have passed; people will either leave the page open and go take a shit, or they may actually read the damn things.
 
I love reviews and Metacritic because they provide a "consensus" viewpoint on a game. Do I always agree with that consensus? Of course not, but it helps me determine if there's a game I should potentially be checking out if I wasn't already interested.
 
I just wish people reading the reviews understood what the score represented.

IGN uses, for example


GameSpot's...


Jim Sterling

These people are supposed to be communicators, and yet they've obfuscated those sentences into a number rather than just putting the sentences in there instead. This extra manual step of mapping the number back to the text makes no sense. The only reason to make it numeric is to combine it with other numeric data in aggregate, but as this shows, the scale is outlet-relative: a 5 from IGN is not the same as a 5 from GameSpot, and the data types are incompatible. So basically, it's flawed design.
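To make that concrete, here's a minimal sketch in Python (the label tables are abridged from the IGN and GameSpot scales quoted earlier in the thread; the helper is just for illustration, not anything Metacritic actually runs):

```python
# Abridged from the IGN and GameSpot descriptors quoted upthread.
IGN = {10: "Masterpiece", 9: "Amazing", 8: "Great", 7: "Good", 6: "Okay",
       5: "Mediocre", 4: "Bad", 3: "Awful", 2: "Painful", 1: "Unbearable",
       0: "Disaster"}
GAMESPOT = {10: "Essential", 9: "Superb", 8: "Great", 7: "Good", 6: "Fair",
            5: "Mediocre", 4: "Poor", 3: "Bad", 2: "Terrible", 1: "Abysmal"}

def label(outlet_scale, score):
    """Translate a numeric score back into the outlet's own descriptor."""
    return outlet_scale[score]

# The same number reads differently depending on who wrote it:
print(label(IGN, 3), "vs", label(GAMESPOT, 3))   # Awful vs Bad
print(label(IGN, 6), "vs", label(GAMESPOT, 6))   # Okay vs Fair

# A naive aggregate averages the raw digits anyway, treating two
# differently calibrated scales as if they were the same data type:
scores = {"IGN": 4, "GameSpot": 3}               # both roughly mean "Bad"
print(sum(scores.values()) / len(scores))        # 3.5, on nobody's scale
```

The words only line up loosely across outlets, which is exactly why averaging the digits throws information away.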
 
"7/10? Ha, this game sucks. Suck it fans of other consoles/genres/studios/publishers. What a failure of a game."

Never, ever read the review to find out what the reasons behind the score were or, in fact, whether it's just a genre or style the reviewer doesn't particularly like.

Sorry, I'm just ranting.

A 7/10 is a 70/100, which is a C; a C is an average grade, and average is bad.

Blame the education system and parents, not the scores.
 
I don't think review scores are the problem.

Schreier's example of the comment section of RPS vs. IGN fell completely flat with me. My theory is that even if IGN dropped review scores, their comments would still be a festering dump. It's more about the audience than the scores themselves. RPS definitely serves a different crowd.


Ooho, the Uncharted 3 review thread on GAF tells a different tale. One of the very worst threads ever on this forum and purely down to review scores.
 
I say get rid of review scores. It wouldn't be so bad if people actually had to read reviews for once rather than scrolling to the bottom of the page and looking at the score then spazzing out.

Why in the world would I want to have to read more game reviews? Most reviews are torturous to read. I can't remember the last time I got pleasure out of reading a game review.
 
Review scores should be hidden until a few minutes have passed; people will either leave the page open and go take a shit, or they may actually read the damn things.

Why? People who only care about numbers go to Metacritic or GAF threads and shitpost after looking at them.

It's actually very telling that video game websites get a lot of traffic from Metacritic. People see the score and still click the review...

Ooho, the Uncharted 3 review thread on GAF tells a different tale. One of the very worst threads ever on this forum and purely down to review scores.

No, that's a GAF problem, not a review scores problem. That problem needs to be fixed by a moderation policy here, not a review score policy there.
 
Why in the world would I want to have to read more game reviews? Most reviews are torturous to read. I can't remember the last time I got pleasure out of reading a game review.
And WAY too many of the reviews have blatant spoilers for the game they're discussing. Want a prime example? Check out GameSpot's review of Brothers: A Tale of Two Sons.
 
People could "interpret what's up" by reading the text of the review.

tbh I was just throwing shade at most reviewers' ability to articulate themselves.

Half the time I would have no idea what to think after reading a review.

I just wanna know if a game I'm already interested in is busted, with as few spoilers as possible, and scores do that just fine.
 
Review scores should be hidden until a few minutes have passed; people will either leave the page open and go take a shit, or they may actually read the damn things.

That's not a bad idea at all.

Ooho, the Uncharted 3 review thread on GAF tells a different tale. One of the very worst threads ever on this forum and purely down to review scores.

That's a problem with the people who were posting in that thread at the time, not with the review scores of Uncharted 3 themselves. Removing scores is a band-aid solution to a much larger problem. It's whacking the mole down, but only for a few seconds. The solution to whack-a-mole isn't to keep hitting the mole at each hole it pops up in, but rather to just seal every hole with cement.
 
I just wish people reading the reviews understood what the score represented.

IGN uses, for example


GameSpot's...


Jim Sterling

Yep, I feel that's also a big issue. The gaming community as a whole should embrace the full 10-point scale. Instead, pretty much anything below an 8 is treated as crap, which is absurd.
 
So for those who think the elimination of review scores would kill Metacritic and the practice of review-aggregate bonuses:

Even if gaming sites did start dropping reviews en masse, Doyle says a site like Metacritic would be OK. For one, he says Metacritic has discussed featuring unscored game reviews on the site in some fashion. For another, Metacritic can figure out its own scores, even if the outlets themselves don't provide them.

Marc Doyle, ladies and gentlemen.

They already do this for movies and everything else under the sun.
I give this thread an 8.9/10
 
I don't think there's inherently a problem with a review 'score'.

The problem is that they don't adhere to their own scale. When a game gets shit on from a great height and then gets a 7 out of 10 there's something funky with the score system.

They also don't account for subjectivity. How do you score a game in a genre that you dislike?

While that is a good question, it is also healthy to get a perspective from someone who isn't fond of the genre/franchise every now and then. Otherwise you end up on the other side of the spectrum, where the reviewer who's been jerking off to the game through betas and previews ends up giving it a 10/10, which leads to the opinion being dismissed.
 
I love reviews and Metacritic because they provide a "consensus" viewpoint on a game. Do I always agree with that consensus? Of course not, but it helps me determine if there's a game I should potentially be checking out if I wasn't already interested.

Except that Metacritic doesn't provide a consensus because there's no universal standard for what a score means.
 
These people are supposed to be communicators, and yet they've obfuscated those sentences into a number rather than just putting the sentences in there instead. This extra manual step of mapping the number back to the text makes no sense. The only reason to make it numeric is to combine it with other numeric data in aggregate, but as this shows, the scale is outlet-relative: a 5 from IGN is not the same as a 5 from GameSpot, and the data types are incompatible. So basically, it's flawed design.
EXACTLY!!
 
I've long felt that the problem with review scores is that they're too granular. I think a five-point system (no half points) is the sweet spot because it offers a broad-strokes representation of how a reviewer feels.

But I also think it addresses some of the issues that seem to be rampant around reviews and scores:

- A less precise score means that a little reading may be required.
- It addresses complaints about the whole scale not being used (is there a noteworthy difference on a 10-point scale from 1-4? Those are all probably ones on a 5-point scale; a rough mapping is sketched below).
- There's less queasiness about lower scores (I think people view a 3/5 differently than a 6/10).

Look at IGN's descriptions: the 6 range is "Okay," while the 5 range is "Mediocre"... those are so close to the same thing, is there any value to that difference? (Likewise everything from 0-3, "Disaster" to "Painful.")
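For what it's worth, here's a rough sketch of what that collapse could look like; the cut-offs are my own assumption for illustration, not a scale any outlet actually uses:

```python
def to_five_point(score_out_of_10):
    """Collapse a 10-point score onto a 5-point scale (illustrative cut-offs only)."""
    if score_out_of_10 < 5:
        return 1   # Disaster through Bad: all just "skip it"
    if score_out_of_10 < 6:
        return 2   # Mediocre
    if score_out_of_10 < 7:
        return 3   # Okay
    if score_out_of_10 < 9:
        return 4   # Good / Great
    return 5       # Amazing / Masterpiece

# Everything filed under 0-4 lands in the same bottom bucket:
print({n: to_five_point(n) for n in range(11)})
# {0: 1, 1: 1, 2: 1, 3: 1, 4: 1, 5: 2, 6: 3, 7: 4, 8: 4, 9: 5, 10: 5}
```

The exact cut-offs are debatable, which is sort of the point: the bottom half of a 10-point scale carries almost no information a reader actually uses.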

RE: Metacritic, I like it as a quick reference. I don't make purchasing decisions on it, but I think it's a helpful resource -- though it's not just about the scores; the review blurbs let me pick and choose which reviews to read and help me evaluate the outlying opinions and whether they're relevant to me. It can steer my exploration around a game.
 
Agreed. I'd rather have a Rotten Tomatoes-style review, where the reviewer just says "Yes" or "No" and RT aggregates that.

Eurogamer's solution is my favorite, however.

The problem with "Yes/No" and "Recommended/Not Recommended" is that it's taking a 1-10 scale (or 6 to 10) and reducing it down to 0-1. I don't think trying to simplify something that is already over-simplified is a solution to the issue.

If you want to encourage people to read the review, the grading system has to go away completely. No half measures.
 
I prefer scores. I want a Metacritic overview of what the games press thinks of a game. I don't want to read 10 different walls of text by 10 different mediocre-at-best writers. I mean, I guess I kind of understand the wall-of-text tendency, because reviewers have to dive into these things for sometimes dozens of hours, but it really doesn't take much text to say how much you liked a game and why. A review does not have to be an essay.

Kotaku's system is the worst kind of cop-out: you have to be terrible to get a 'No', you don't have to be particularly good to get a 'Yes', and, well, 'Not Yet' is pretty much the definition of a cop-out.
 