Aggregation Aggravation, Pt. 1
Posted by: Neil Hollands
Reviewers don’t always get much love, and it’s not hard to figure out why. We invest a few hours in works that have taken thousands of difficult hours to create, and that disparity in investment makes it hard to take when, inevitably, we get a review wrong. Add in the subjectivity of evaluating anything for the incredibly diverse viewpoints of this crazy species we call the human being, and the lack of affection becomes easy to understand. I’ve been a reviewer myself for many years now, but I don’t hesitate to curse a fellow critic roundly when they entice me to read a book that stinks or warn me away from a film that I later discover I love.
That’s where review aggregators come in. These handy websites collect reviews from multiple sources in one place and provide an average score. The logic is simple: consult many reviews instead of just one or two, and you’re much less likely to fall victim to the whims of taste. Amazon was one of the first sites to aggregate amateur reviews, and while access to all of those consumer reviews is a great advance, especially for classes of materials like certain fiction genres that didn’t previously get many reviews, the inconsistent approach of amateur reviewers distorts many of the average scores: one-star reviews from consumers who thought Amazon delivered their item too slowly; five stars from readers excited about the forthcoming release of a book they haven’t yet read; arbitrary reviews from writers who can’t manage a sentence without a grammar or spelling mistake; great books creamed by legions of students forced to read them in school. Amazon reviews are full of pitfalls, and the company seems to be stepping back, de-emphasizing consumer review options in recent interface changes.
Sites like GoodReads, Shelfari, and LibraryThing do a better job. They focus on books alone and aren’t trying to sell the books under review. As a result, the average amateur reviewer on these sites seems to have a little more skill. Savvy consumers will still learn to read between the lines on reviews, but the average score here is more dependable. Still, wouldn’t it be better if the reviews came from professionals, or at least from critics who have passed some kind of vetting process?
For films, professional aggregator sites have long been in place. The best is probably Rotten Tomatoes, a regular online stop for the cinerati, where one can see not only the average approval rate for any given film, but also find links to the full reviews, an excerpted quote from each, and an average score for the film on a one-to-ten scale.
The next major online aggregator to come along was Metacritic, a site that does a fine job of aggregating reviews for music, video games, film, and television. Again, links to each collected review are provided, and an average score (this time on a one-to-one-hundred scale) is calculated. When the site first opened, books were aggregated as well, but it was obvious that there was never as much love for literature among the Metacritic staff as for the other media formats. To be fair, it’s much more difficult to aggregate book reviews: the number of books published is much larger than the number of films released or even albums produced.
NoveList and other online databases often provide links to multiple professional reviews for a particular book, but they don’t aggregate scores. Readers have to do that work for themselves, and in addition, these databases carry fees and are available only if you’re lucky enough to have access through a subscribing library.
There are, however, two significant competitors of which I’m aware in the professional book review aggregation business. Later this week, I’ll return to the subject and describe the strengths and weaknesses of each.