<p><strong>Short guide to Rotten Tomatoes and Metacritic</strong></p>
<p>Review aggregator websites Rotten Tomatoes and Metacritic have become staples of contemporary film culture and an integral part of the release and marketing of Hollywood, independent and world cinema. They rose to prominence in the late 1990s, as the internet provided wider access to reviews of new film releases by critics working for a broad range of publications. Though different in detail, both Rotten Tomatoes and Metacritic aim to synthesise the critical community’s reaction to a film into a score from zero to 100. This approach has been both criticised, for devaluing film criticism and confusing consensus with quality, and celebrated, by those who see aggregators as useful tools for audiences (Kohn, 2017).</p>
<p>Before the internet, film critics’ reviews of the latest releases mainly featured in local and national newspapers, sometimes accompanied by some form of numbered grade or star system. In the 1970s, US television embraced film criticism, most notably with the review show <em>Sneak Previews</em> (1975-1996) and its subsequent incarnation <em>At the Movies</em> (1982-1999). Both were initially presented by two Chicago newspaper critics, Roger Ebert and Gene Siskel, and reached a general audience through national syndication (Poniewozik, 2013). This was a highly concentrated and unified media environment in which the mainstream print press and television made up most of the general audience’s contact with non-marketing material on the latest films.
With the internet came wider access to reviews from all major publications, along with the decentralisation of film criticism away from newspapers and television stations and towards new media websites. The relationship between spectators, film releases and film criticism changed as audiences generated their own publicly available content, fostering online subcultures that engaged critically with established forms of film criticism.</p>
<p>Rotten Tomatoes and Metacritic, founded in 1998 and 1999 respectively, took advantage of this new media environment by collecting film reviews in one place. Each review is attached to a score, along with a short excerpt from the article encapsulating the critic’s opinion. Both websites aggregate these scores into a single score out of 100 for each film (with Metacritic also covering games and music). Despite this broad similarity, the methodology each employs, and the user interface in which the scores are shown, are markedly different, each spawning its own set of questions and criticisms.</p>
<p>The most significant methodological difference between the two websites is the community of film critics chosen to make up the pool of reviews that determines the “Tomatometer score” and the “Metascore”. Rotten Tomatoes casts a wide net across hundreds of outlets: established print media like <em>The New York Times</em>; new media film criticism and news websites such as <em>The Dissolve</em> and <em>Indiewire</em>; entertainment and Hollywood-centred publications such as <em>Empire</em>; and more obscure digital publications, often run by non-professional critics, such as <em>Popcorn Junkie</em> and <em>Cinemixtape</em>. The number of reviews varies widely, with a mainstream film such as <em>Argo</em> (2012) garnering around three hundred and an independent film such as <em>Paterson</em> (2016) closer to two hundred.
Metacritic opts for a much more restricted approach, with a community of 62 publications, mostly established newspapers (both national and local) and new media outlets of considerable scale, such as <em>The Wrap</em> and <em>The Verge</em> (“Frequently Asked Questions”). Given the resources available to such outlets, most films with sizeable distribution in the United States can be expected to pick up at least thirty reviews on Metacritic, with more commercial cinema and high-profile titles from film festivals reaching over forty. Matt Atchity, Rotten Tomatoes’ editor-in-chief, argues that the website’s approach is more “democratic” than Metacritic’s (Atchity, 2012). The former explores the full breadth of film criticism, while the latter retains a sense of exclusivity about what serious film criticism is, a method which lends itself to accusations of elitism.</p>
<p>The second most significant difference between the two websites is the way in which they score reviews. Rotten Tomatoes marks each review of a given film as either “fresh” or “rotten”, echoing the practice of throwing rotten food at a performer who fails to impress. The final Tomatometer score is the percentage of fresh (i.e. positive) reviews the film has received, displayed alongside a second score restricted to “top critics” from the most read and respected publications. Metacritic uses conversion scales, available on its website, which turn a critic’s star or letter grade into a number between zero and 100. The final aggregated score is established through a weighted average and an undisclosed process of statistical normalisation, in which certain publications’ grades count for more than others. Both the Tomatometer and the Metascore then attach a verbal description to the final score.
If a film receives above 60 on either scale, it is described as “fresh” or as having “generally favourable reviews” respectively. Below that, the Tomatometer describes a film as “rotten”, while Metacritic describes films scoring between 40 and 60 as “mixed or average” and those between 20 and 39 as “generally unfavourable”. Metacritic also labels both ends of the scale, below 20 and above 80, as “overwhelming dislike” and “universal acclaim”, while Rotten Tomatoes only recognises a film as “certified fresh” if it has achieved a consistent Tomatometer score of over 75 per cent, one that has never fallen below 70 per cent.</p>
<p>Rotten Tomatoes’ mostly dualistic rotten/fresh model traces its history back to the thumbs up/thumbs down scale used by Siskel and Ebert in their television show, at a time when reviews were more commonly scored across a scale of four or five stars. Atchity acknowledges this antecedent and describes a 90 per cent fresh score as signifying that “everybody said, at the very least, ‘Yeah, check this out’, [which] doesn’t necessarily mean everybody’s raved about it” (qtd. in Bibbiani, 2015). The logic behind this approach, according to Siskel, was that it was the closest approximation to a normal conversation about a film among general audiences. In this context, the main point of interest regarding a new release was not “a speech on the director’s career” but “should I see this movie?” (qtd. in Ebert, 2008), again echoing the theme of being democratic and making film more accessible.</p>
<p>Critics of the fresh/rotten and thumbs-up/thumbs-down model include Ebert himself, who bemoaned the absence of a “true middle position”.
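The divergence at the heart of this complaint can be sketched numerically. In the sketch below, the conversion from a four-star rating to a 0-100 score and the 2.5-star “fresh” cut-off are illustrative assumptions; neither site publishes its complete conversion tables or weighting.

```python
# Illustrative comparison of the two aggregation schemes described above.
# Assumptions: four-star ratings, a 2.5/4 "fresh" cut-off, and a simple
# linear conversion to a 0-100 scale (Metacritic's real weighting and
# normalisation are undisclosed).

def tomatometer(star_ratings):
    """Rotten Tomatoes-style score: the percentage of reviews counted
    as 'fresh' (here, any rating of at least 2.5 out of 4 stars)."""
    fresh = sum(1 for stars in star_ratings if stars >= 2.5)
    return round(100 * fresh / len(star_ratings))

def metascore(star_ratings):
    """Metacritic-style score: an unweighted average of each review
    converted to a 0-100 scale."""
    converted = [100 * stars / 4 for stars in star_ratings]
    return round(sum(converted) / len(converted))

# Four lukewarm 2.5/4 reviews: unanimously "fresh" on one scale,
# barely "generally favourable" on the other.
lukewarm = [2.5, 2.5, 2.5, 2.5]
print(tomatometer(lukewarm))  # 100
print(metascore(lukewarm))    # 62
```

Under these assumptions, a film nobody loved can post a perfect Tomatometer, which is exactly the binarisation Ebert objected to.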
He declared that although he thought “2.5 [out of four] was a perfectly acceptable rating for a film I rather liked in certain aspects”, he ultimately considered this a negative review. This runs counter to the Rotten Tomatoes system, which translates a 2.5 into a fresh review, and reportedly forced Ebert to change his ratings to accommodate the algorithm (Emerson, 2011).</p>
<p>The problem of conflicts over ratings and conversions does not, for the most part, affect Metacritic, which allows for “mixed” and “generally positive” terrain, converts critics’ own scores directly into its scale, and lets them appeal should they disagree with a given score (“Frequently Asked Questions”). In practice, this means that Rotten Tomatoes “lends itself to higher high scores and lower low scores” (Weinstein, 2012). However, some publications, such as <em>Sight &amp; Sound</em> and <em>The Hollywood Reporter</em>, publish unscored reviews, which presents issues of interpretation. To name but two: how much weight should a critique of the politics of <em>Zero Dark Thirty</em> (2012) as a “<em>pietà</em> for the war on terror” be given if the review also praises the film as “technically awe-inspiring” (Westwell, 2012)? And how can the combination of the “remarkably slick” mise-en-scène of <em>Ex Machina</em> (2015) and its script’s “muddled rush” of a third act be quantified (Dalton, 2015)?</p>
<p>Aside from these methodological concerns, perhaps the most important question regarding review aggregators is their impact on a film’s box office performance and post-release success, and how this relates to the websites’ corporate ownership structures. Metacritic’s parent company CNET is part of the CBS Corporation media conglomerate, which owns CBS Films and television channels such as Showtime.
Since 2011, Rotten Tomatoes has been owned in part by Warner Brothers, which as of 2019 owns 30% of its parent company Fandango, with Comcast’s Universal holding the majority stake. In 2017, Rotten Tomatoes was heavily criticised amid suspicions that it had delayed showing a low aggregated score for Warner Brothers’ $300m blockbuster <em>Justice League</em> (2017). Ordinarily Rotten Tomatoes would start showing an aggregated score at the end of the studio’s review embargo on the Tuesday evening, but it only began showing one on the Thursday morning, just before opening night, raising concerns that this was calculated to safeguard opening-weekend box office grosses (Zeitchik, 2017).</p>
<p>However, it is far from clear that aggregated scores have a meaningful impact on box office grosses. There is little available research on the topic, and the debate surrounding review aggregators tends to focus mostly on blockbuster Hollywood films. For low- and mid-budget films, word of mouth and critics’ reviews on the festival circuit can be decisive in securing distribution. Upon release, however, marketing campaigns for less high-profile films often amplify critical acclaim to attract audiences, making it hard to disentangle the effect of this advertising, or of word of mouth on social media and among friends, from the impact of review aggregators themselves. As for blockbusters, the average Metacritic scores of the highest-grossing films have been steadily falling (Crockett, 2016), yet low-scoring films such as <em>Suicide Squad</em> (2016) and <em>Batman v. Superman</em> (2016), both big-budget Warner Brothers movies, still did extremely well at the box office.
This is because, by relying on pre-existing source material, A-list casts and relentless marketing campaigns, major studios have attempted to make their tentpole releases, which constitute the bulk of their yearly revenue, as “critic-proof” as possible (Kiang, 2016).</p>
<p>Despite the shortcomings of both websites, for much of the general audience the two aggregators have become the <em>de facto</em> indicator of critical consensus on a given film, and sometimes the only contact they will have with film criticism before or after watching it. When the name of a film is typed into Google, its Metacritic and Rotten Tomatoes scores appear alongside its IMDb score (given by IMDb users), followed by short excerpts from reviews by key publications and ordinary filmgoers. A similar feature on YouTube displays the Tomatometer alone, and on IMDb the Metascore sits alongside the site’s own aggregator.</p>
<p>Comparing the two websites, the model used by Rotten Tomatoes is crude and dualistic next to Metacritic’s, which allows for more levels of critical reaction and at least makes some attempt to gauge nuance in a review beyond positive and negative. Despite Rotten Tomatoes’ democratic ethos, its use of a “top critics” score betrays an unspoken understanding that not all voices are equal in the world of film criticism. Any aggregator system that assigns scores to reviews, however representative of their opinions the authors might feel those scores to be, will never escape the criticism that it substitutes a numeric score for articulated thought. A snapshot of a film’s critical reception at the moment of release risks neglecting divisive but unique works of art and closing off more scholarly approaches to cinema (after all, <em>Citizen Kane</em> [1941] was initially poorly reviewed). Yet the usefulness of these tools must not be fully discounted.
In the context of an ever-rising number of yearly film releases and the breadth of film criticism available, aggregators do allow for an estimation of what may, contentiously, be referred to as the critical consensus on a given film.</p>
<p><strong>References</strong></p>
<p>Atchity, Matt, “I Am the Editor in Chief at Rotten Tomatoes. AMA”. <em>Reddit</em>, 5 Sep. 2012. Web. 16 Feb. 2019.</p>
<p>Bibbiani, William, “Your Opinion Sucks: Matt Atchity dishes on Rotten Tomatoes”. <em>Mandatory</em>, 28 Mar. 2015. Web. 16 Feb. 2019.</p>
<p>Crockett, Zachary, “Big-budget films are getting worse — and we can prove it”. <em>Vox</em>, 4 Apr. 2016. Web. 16 Feb. 2019.</p>
<p>Dalton, Stephen, “‘Ex Machina’: Film Review”. <em>The Hollywood Reporter</em>, 16 Jan. 2015. Web. 16 Feb. 2019.</p>
<p>Ebert, Roger, “You give out too many stars”. <em>Rogerebert.com</em>, 14 Sep. 2008. Web. 16 Feb. 2019.</p>
<p>Emerson, Jim, “Misinterpreting the Tomatometer”. <em>Rogerebert.com</em>, 16 Jun. 2011. Web. 16 Feb. 2019.</p>
<p>“Frequently Asked Questions”. <em>metacritic.com</em>. Web. 18 Feb. 2019.</p>
<p>Kiang, Jessica, “Too Big To Fail: What ‘Batman v Superman’ Tells Us About Blockbuster Culture”. <em>Indiewire</em>, 28 Mar. 2016. Web. 16 Feb. 2019.</p>
<p>Kohn, Eric, “Rotten Tomatoes Debate: Critics Discuss Whether the Service Hurts or Helps Their Craft”. <em>Indiewire</em>, 27 Jun. 2017. Web. 16 Feb. 2019.</p>
<p>Lattanzio, Ryan, “A.O. Scott Lampoons Studios’ Treatment of Critics via Twitter”. <em>Indiewire</em>, 8 Jul. 2013. Web. 16 Feb. 2019.</p>
<p>Poniewozik, James, “Why Roger Ebert’s Thumb Mattered”. <em>TIME</em>, 5 Apr. 2013. Web. 16 Feb. 2019.</p>
<p>Weinstein, Joshua, “Movie review aggregators popular, but do they matter?”. <em>Reuters</em>, 17 Feb. 2012. Web. 16 Feb.
2019.</p>
<p>Westwell, Guy, “Zero Dark Thirty”. <em>Sight and Sound</em>, 23.2 (2012): 86-87. Print.</p>
<p>Zeitchik, Steven, “Rotten Tomatoes under fire for timing of ‘Justice League’ review”. <em>Washington Post</em>, 16 Nov. 2017. Web. 18 Feb. 2019.</p>
<p><strong>Written by Ricardo Silva Pereira (2019); Queen Mary, University of London</strong></p>
<p>This article may be used free of charge. Please obtain permission before redistributing. Selling without prior written consent is prohibited. In all cases this notice must remain intact.</p>