Just read this article on Crispy Gamer about an idea for a new kind of review aggregation system.
Press Pass: Building a Better Aggregator
Can a review-aggregation site actually shift the focus away from numerical scores and toward the critics themselves? CriticDNA aims to find out.
Review aggregators like Metacritic and GameRankings have a bit of a mixed reputation in and around the game industry. Millions of gamers love aggregators for distilling dozens of reviews into a single number that can aid in their purchasing decisions (and their message-board arguments). Publishers love aggregators for providing a concrete metric by which to rate the quality of their developers and PR units. Those same developers and PR units, of course, often hate aggregators for passing judgment on months of hard work with no real context. And the critics often hate aggregators for reducing hundreds of words full of personality into just another statistic to be lost in the crowd.
A new Web site is looking to solve that last problem, at least, by helping people get to know the critics behind the numbers a little better.
CriticDNA, set to launch later this month, sprang directly from developer complaints about existing aggregators, according to founder Jack Bogdan. "I've been networking with a lot of game developers to build relationships for years, and about a year back, I first got in touch with Q-Games' Dylan Cuthbert in Kyoto," said Bogdan, a 19-year-old student at the Gnomon School of Visual Effects in Los Angeles. "As I started to get to know him, he brought up how fed up he was with Metacritic and how he had an idea that might take them down. It started really small and simple, and has since completely evolved into something else. But that's where things started."
Cuthbert's original idea, Bogdan said, was to let users evaluate each review, promoting the good ones while limiting the effects of the hatchet jobs. "His initial concept was giving more power back to the users in the process, which is something we have built into CriticDNA," Bogdan said. Site users will be able to tag each review on CriticDNA with various positive or negative tags provided by the site, somewhat like the player-review system in LittleBigPlanet. These tags will appear on a critic's profile page, building a reputation system for the best critics right into the site.
"The key is to find things people want to critique fairly," Bogdan said, "Rather than getting the spammers who say, 'Hey, you hated Zelda; now I hate you,' and then they review the critic badly. We're trying to avoid that situation."
In the year since Cuthbert's initial idea, Bogdan said he's built CriticDNA into something more robust than a simple critic-ranking system. Yes, at its core, the site will still serve primarily as a way to get a snapshot of the critical landscape regarding a game. "We're doing a Rotten Tomatoes-style consensus [for each game], because we know some people just want an answer," Bogdan said. "So on the game page ... the consensus bar is right there -- Buy, Rent, Skip -- [along with] one line about why." CriticDNA will also offer a visual summary of the critical responses to a game, arranging reviews as dots on a color-gradient bar representing the range of scores, letter grades or general positivity/negativity of unscored reviews.
It's when you slide your mouse across those review dots that CriticDNA's most interesting and unique features will become apparent, Bogdan says. For each review dot, a small pop-up shows tags that reveal important information about the critic behind the review. These tags include basic demographic information, like age and gender, as well as the aforementioned feedback provided by site users.
More than that, though, there will also be automatically generated tags, like "RPG Hater" and "FPS Fan," that let users know about a critic's gaming preferences. CriticDNA will generate these tags, Bogdan says, based on an analysis of the critic's previous reviews (to seed the system, Bogdan says he and his partner are busy building an archival database of reviews from critics representing the 100 most popular review outlets, covering games from the PlayStation 2 era on). Clicking on a critic's name will bring up a more detailed breakdown of their review history, Bogdan said, along with a graph of their average review scores in each genre and optional, critic-addable RSS feeds from their blogs and Twitter feeds. Suddenly, these aggregated reviews aren't just numbers on a chart, but the products of real people, with real preferences, real reputations and real opinions.
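Bogdan doesn't describe the tagging analysis in detail, so here is a purely hypothetical sketch of how genre-preference tags like "RPG Hater" or "FPS Fan" could be generated from a critic's review history: average the critic's scores per genre and compare each genre average against their overall average. The function name, the 10-point margin, and the data shape are all assumptions for illustration, not CriticDNA's actual method.

```python
from collections import defaultdict

def preference_tags(reviews, margin=10):
    """Assumed sketch: derive preference tags from one critic's history.

    reviews: list of (genre, score-out-of-100) tuples.
    A genre whose average sits `margin` points above the critic's overall
    average earns a "Fan" tag; `margin` points below earns a "Hater" tag.
    """
    by_genre = defaultdict(list)
    for genre, score in reviews:
        by_genre[genre].append(score)
    overall = sum(score for _, score in reviews) / len(reviews)
    tags = []
    for genre, scores in by_genre.items():
        avg = sum(scores) / len(scores)
        if avg >= overall + margin:
            tags.append(f"{genre} Fan")
        elif avg <= overall - margin:
            tags.append(f"{genre} Hater")
    return tags
```

With a 10-point margin, a critic averaging 87.5 on shooters against a 70 overall would pick up an "FPS Fan" tag, while a 52.5 RPG average would earn "RPG Hater."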
With such detailed data on each critic, Bogdan hopes users will be able to better find and follow critics whose tastes and outlook match theirs. "It eases [readers] into the discovery process," Bogdan says. "You can get an idea of what [critics] like and dislike from the tags, see if they are your age from the demographic info, find out how they write based on user critiques, and better get to know the person through their presence on the rest of the Web, all in one place. It's the little pieces that make up the big picture."
And it's this big picture that Bogdan says he hopes will elevate users' appreciation of the individual personalities that make up those aggregate scores. "I don't think people are given a chance, with all the aggregate systems, to actually get to know anyone," Bogdan said. "Doing the research takes so much time, so we're doing it for you. If you're interested, you can go back for more. I would say there's a reason people go back to sites, and with the ones doing it right, it's personalities. If people really didn't care about the people behind the byline, we wouldn't have seen that huge community uproar surrounding the massive layoffs at 1UP earlier this year."
While Bogdan admits there is still a lot of work to be done before CriticDNA is up and running, he's optimistic that the site will provide an important alternative to the industry-skewing tyranny of the all-important Metacritic score. "That was one of the major reasons we set out to make the site," he said. "It's horrible people's jobs are at risk because of one site. I just hope we aren't used in the same way. At least now people will understand where others are coming from on a basic level."
Sounds pretty good to me. My instant thought is that it has some problems, but it's no doubt a better idea in principle than Metacritic.
Something like the Rotten Tomatoes system with the Top Critics
Long run, it won't be changing anything.
XBL Michael Spencer || Wii 6007 6812 1605 7315 || PSN MichaelSpencerJr || Steam Michael_Spencer || Ham NOØK
QRZ || My last known GPS coordinates: FindU or APRS.fi (Car antenna feed line busted -- no ham radio for me X__X )
critic critics?
The downside is that it could lead to people only reading things from people with similar opinions to themselves.
Though, honestly, I don't think it will be as big a problem as some people fear. When Wikipedia was still new, everyone thought it was going to be a disaster full of trolls, but the community prevailed, and now it's a fairly good resource despite relying on the internet to generate its content. I think gamers might see the benefit of this resource's success and be less inclined to troll the critic ratings, but perhaps I'm being overly optimistic.
Yeah, you need to encourage readers not reviewers. If the readership caught on to some of the bullshit that happens when the review written contrasts with the number given ( for example, "the music and setting were probably the redeeming factors; EIGHT" ) then the BS would stop. Unfortunately, people enjoy brisk and arbitrary systems.
It really is sad that it's considered heresy to give a good game 7/10, a great game 9/10, and to give an average game a 6/10.
The numbering is stupid. The middle of the road out of 10, which is 5, is apparently NOT middle of the road; readers are at fault there along with the writers. 7 is apparently middle of the road there. The mechanics are fucking stupid, and I'm not entirely interested in discussing it at too much length.
I think part of the problem is that (some/most/all?) reviewers equate the number system to a grading system for school. Getting a 50 isn't "middle of the road"; it's failing. A 7 or 70 is a C and still "passing." It is fucking stupid, though; that I can agree with.
If that's the mindset they're going with, they should USE THE FREAKING GRADING SYSTEM.
Some reviews do in fact do just that - award A+, A, A-, B+, B, B- and so on.
And then metacritic sites take those letter grades and convert them into numbers and aggregate and average them out with the rest of the numerical reviews.
Interestingly, the result is often that something like a B- can come out to a 50 or 60 when converted for some sites.
Steam ID: slashx000______Twitter: @bill_at_zeboyd______ Facebook: Zeboyd Games
Yeah, it screws with 1 to 5 scales too. Doubles the numbers. Which doesn't play out too well.
Edit - 'Specially since a 3 means a 7, score-bitching wise.
Steam ID : rwb36, Twitter : Werezompire,
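The conversions described in the last few posts (letter grades flattened onto a 100-point scale, and 1-to-5 star ratings stretched to fit alongside them) can be sketched like this. The specific values are illustrative assumptions; no aggregator publishes its exact tables:

```python
# Assumed letter-grade table: a roughly linear walk from A+ down to F.
# Note how a B- lands in the 60s once flattened onto a 100-point scale,
# which is about the effect described in the posts above.
GRADE_MAP = {
    "A+": 100, "A": 95, "A-": 91,
    "B+": 83, "B": 75, "B-": 67,
    "C+": 58, "C": 50, "C-": 42,
    "D+": 33, "D": 25, "D-": 16,
    "F": 0,
}

def normalize_stars(stars, scale_max=5):
    """Map a 1..scale_max rating linearly onto 0-100.

    A 3/5 comes out to 50 here, not the 70 that score-inflated
    ten-point scales would lead readers to expect.
    """
    return round((stars - 1) / (scale_max - 1) * 100)
```

Averaging a 67 (B-) in with numerical reviews is exactly how a middling letter grade can drag an aggregate score down harder than the grader intended.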
That's a downside? I'd love to have a reviewer who shares my tastes. Someone who has time to play all the games I can't, and will tell me which ones are worthwhile.
But if that person has similar opinions, you can be pretty sure that if they disliked a game, you'd dislike it too, correct? So why would I want to read the opinions of someone who hates everything I enjoy and enjoys everything I hate? That's just wasting time.
All right, people. It is not a gerbil. It is not a hamster. It is not a guinea pig. It is a death rabbit. Death. Rabbit. Say it with me, now.
I completely agree; throwing all of the factors of a game into a neat little number just doesn't work. But I don't see a major shift in the reviewing industry toward this ideal (people want to know if a game is good in less than five seconds if they can help it), so this is the next best thing for those of us who have a little insight into game "journalism."
1) Completion. Did they finish it? Was it not even worth finishing? In many years of reading game reviews, I'm not sure I've ever seen a reviewer just say, "You know? This was cool for the first few levels, but then I just didn't care and shelved it," or "The final level was just a pain in the ass, so I quit." Finishing a game is also the only way to catch the game-breaking bugs that a lot of reviews seem to miss.
2) Longevity. It is impossible for a reviewer to determine how they (and by extension "we") will feel about the game a few months from now. And that is crucial to determining whether something is worth buying or simply renting. An actual game length would also be appreciated. I've been using the "65% of what they say it is" rule, and that's been working out pretty well. And please note that's not a "quality vs. quantity" issue. It's simply that if something is short, I will rent it, so don't tell me something is 12 hours long if I'm gonna see the credits in 8.
3) Multiplayer. I love when reviews say "there was no lag" on a game that isn't even out to the public yet. Oh? There was no lag on the dedicated server for your preview copy? And there is no way to know if the public will really support a game and have a great community. In extreme cases, games really would have one score for single-player, and another for multi/co-op. Resident Evil 5 for example.
4) Difficulty. This is where the "review of reviewers" and having a little bit more inside information can be a big help. For example, any review of "Henry Hatsworth" that does not mention the difficulty is essentially worthless to me. Either the reviewer is a savant and I probably won't have the same opinion, or they were reviewing on concept alone/ the first few levels (also kinda worthless).
5) Personal Experience. Another issue that more information about the reviewer can help. Some games are literally different for everyone that plays them. Dragon Age is a good example. What kind of party did you use? How much was nostalgia a factor for you? Did you play as a goody two shoes or a medieval dbag?
Some of these aren't really fixable. One of the solutions would be to give a game a score when it first comes out (because they sort of "have to") and then give it a "second take/staying power" score a month or two later. I would like to see more sites do that. For some of the other issues, a report of the reviewer's game time/achievements would be interesting to see.
I do like Metacritic. I can look at a score, and blurbs from 20+ reviews, and get an idea of whether I will like it or not. If every score is above 90, or all right around 70, that tells me something. And if scores range from 10 to 90 that tells me something else. If CriticDNA is similar but improves on that system, I'm all for it.
Back to my original thought, though: how cool would a game site be where you add all the games you own, and then, based on other people who own the same games and how well you rated them (I'm thinking something like 1-4), it gives you suggestions for other games you might like? This could even tie into the whole critic system, linking you up to critics who have similar likes to you. Honestly, writing this, I expect someone to tell me that something like this already exists for games, because it feels like such a no-brainer.
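The ownership-based suggestion idea in the post above can be sketched as a very simple collaborative filter: score each game you don't own by how well it was rated by users whose libraries overlap yours. Everything here (the function name, the overlap weighting, the 1-4 ratings) is a hypothetical illustration, not a description of any existing service:

```python
def suggest_games(my_games, all_users, top_n=3):
    """Hypothetical sketch of ownership-based recommendations.

    my_games: set of titles you own.
    all_users: dict mapping user -> {title: rating on a 1-4 scale}.
    Candidate games are scored by summing (library overlap) * (rating)
    across every user who shares at least one game with you.
    """
    scores = {}
    for user, ratings in all_users.items():
        overlap = my_games & set(ratings)
        if not overlap:
            continue  # no shared taste signal from this user
        for title, rating in ratings.items():
            if title not in my_games:
                scores[title] = scores.get(title, 0) + len(overlap) * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

A user who shares two games with you and rated a third game 4/4 contributes more to that game's score than a user who shares one game and rated something 2/4, which captures the "people who own what you own" intuition.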
And I know the internet: unless it's a select few of actually intelligent people, you're going to have worse problems than Metacritic.
Also, fuck developers for bitching about Metacritic ("Oh waaah, I worked so hard and it's just a number! Waaah!"). I've never personally disagreed with the metaranking of any game I've played. Even VC, a game I absolutely fucking adore with all of my heart, is worth an 88 metaranking. It just is. It has problems; I'm not stupid. I know that I look past them, but I know others won't or don't. That makes it an 88.
I don't know that I see that as a downside. There are some reviewers who I have come to trust/respect, both for the quality of the review/writing, and because their recommendations rarely disappoint.
From the perspective of a review aggregator, that may be less applicable, though. If one of my "trusted" critics has reviewed a game, I'll probably read that and skip Metacritic altogether. If a game has four Optimus Prime thumbs up, I don't care how that compares to other sites' numerical scores.
I'm still not sure how Metacritic decides a certain review gets a certain number. I can get hard 0's, but when reviews are given a number like 76...what?
3DS Friend Code: 2165-6448-8348 www.Twitch.TV/cooljammer00
Battle.Net: JohnDarc#1203 Origin/UPlay: CoolJammer00
In addition:
Goddamn. Metacritic is even more nonsensical than I was originally led to believe.
I tried googling criticdna to see if there was a site up for it. This thread returned as the first result. I did find a site for it, but right now it's just a logo.
Or for, you know, getting an alternate perspective. Beware the echo chamber.
The chart showing what labels they attach to what scores, and how the values differ from game to game, makes me think that the scoring systems game reviewers use are fucked up in general.