Points for all? Recent evidence of wine score inflation

Is inflation crippling wine scores? In the near future, will it take a wheelbarrow of points to sell a Moscato? Let’s hope Ben Bernanke–or, egad, Paul Volcker!–doesn’t read this post or he might take the punch bowl away. To the recent evidence:

First, the Wine Advocate’s recent reviews of 1,061 “new releases from Napa Valley” came out. I haven’t crunched the numbers but Blake Gray did and found that new critic Antonio Galloni’s “midpoint” (not sure if this is the mean or median) was 92, up from 91 for Parker’s last set of reviews. Fully 123 of the wines received 95 or more points.
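
A quick aside on why the mean/median distinction matters: on a skewed distribution of scores, the two can pull apart, so a “midpoint” of 92 means different things depending on which one it is. A minimal sketch with made-up scores (not Blake Gray’s actual data):

```python
# Hypothetical score distribution: clustered in the low 90s with a
# tail of 95+ wines, roughly the shape critics' sheets tend to take.
from statistics import mean, median

scores = [88, 90, 91, 92, 92, 92, 93, 95, 97, 99]

print(median(scores))           # 92.0 -- half the wines score at or below this
print(round(mean(scores), 1))   # 92.9 -- pulled upward by the 95+ tail
```

On a distribution like this, the mean runs nearly a point above the median, which is why it matters which statistic a reported “midpoint” refers to.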

Second, an importer wrote this to me via email last month:

The dollars and points are obviously directly related, but so is the timing. 90 means nothing today unless it’s under $10, really. 91 and 92 are no man’s land. The difference between 93 and 94 at the $35 price point is also another important barrier causing significant sales swings (when it is 94). But the points are only valid for the issue in question. Once that issue disappears, subsequent vintages of a wine that got a higher rating won’t cause sales.

Third, this recently appeared in retailer Daniel Posner’s daily blast from Grapes the Wine Company:

Every time I turn around, another 2007 Barolo is getting 96 points or higher. Sales sheets have been coming my way with loads of offerings and the points are nearly always the same. 96 points…96 points…97 points, and then, perish the thought, they try to sell me a 94 pointer! I mean really. Who is buying 94 point wine these days? 94 points is for chumps…losers…people that don’t really love Barolo. Because if you love Barolo, you are buying 96 points and up…

Leave the 94 pointers for the people that like Napa Cabs…

And the 90 pointers…White Zin drinkers drink 90 point wine!


81 Responses to “Points for all? Recent evidence of wine score inflation”

  1. Kind of funny (or not!) how this is reminiscent of grade inflation in higher education, in particular at private schools à la the Ivy League. Student grades to enter grad school don’t mean anything nowadays without proper letters of recommendation.

  2. I’d come to the conclusion myself years ago [as I’m sure others did], but without evidence. It simply seemed obvious. Anyway, I paid as little attention to the inflation as I did and do to scores. Still, if continued inflation damages wine scoring as a concept, that’s all to the good. I look forward to the day when all wines are either 100s or 99.5s, and a real wine-lover would sooner jump off a bridge than soil his palate with a 99.4.

  3. If there’s “inflation” then who’s the Inflater? The presumption of your post is that someone [Galloni?] is deliberately pumping up scores. But I don’t think you really believe that, do you? I don’t. I think if scores are rising for certain categories of wine, it’s because quality is improving. Critics are simply perceiving that increased quality, and rewarding it with higher scores.

  4. Steve!

    Thanks for stopping by. The inflation comes from competitive pressure to be the high score. That score is then touted by retailers when selling the wine. As one retailer told me in explaining how he chose which review to use, “highest score wins!”

    As Bruno points out, there are parallels to grade inflation.

    Bill – Yes, future scores may have to use the (98 – 100) range.

  5. Tyler: I meant “median,” but didn’t want to confuse Steve Heimoff with technical mathematical terms. He already compared my post to Fermat’s Last Theorem (and blocked me from replying).

    We can’t say for sure what all the reasons or motives for score inflation are. The point — heh heh — is that it’s happening.

  6. I am getting sick of these %$#& numbers. If I were an art critic, would I have to give all Picassos a 100? What if I didn’t like Monet? Could I give all of his stuff a 50? Would it bankrupt Sotheby’s? I think the point the good Dr Vino might be making is that all of these numbers are crap. I am very happy drinking some wines that some wine critic decides are a mere 85. And I am able to save a lot of money in the process. By the way, why doesn’t WA ever give any wine anything less than about 80? How bad does a wine have to be?

  7. Tyler, as a wine reviewer myself, I can’t be held responsible for what retailers do with my scores! Nor do I feel any sort of pressure from anyone to score higher than in the past. Maybe Blake could do an analysis of my scores to see if they’re skewing higher; I don’t know. I do know that California Pinot Noir and Cabernet Sauvignon are really better than they’ve ever been, and I am giving quite a few of them high scores. On the other hand (and this never gets mentioned), there are many dreadful wines out there, and I’m also giving lower scores than ever to them.

  8. By the way, Blake Gray says I blocked him from replying on my blog. That’s not true. I have never blocked anyone. In fact, Blake has commented on my blog many times, and his comments are published automatically; I don’t even get to approve them.

  9. Steve: Glad to hear I’m not blocked; I’ll take your word that it was a software glitch. That happens, though it is particularly unfortunate when it happens on a post when you take a potshot at someone. But still, it happens.

    I’ll say this here instead of on your blog, though, so I can be sure it’s published:

    Steve, if your scores were important — if they moved the market — I would analyze them. I cover Galloni’s scores or the Wine Advocate’s scores as news, and have done the same for Wine Spectator. Unfortunately, whenever I ask industry people whose 100-point ratings actually sell wines, I’m always told only those two. Maybe PR people tell you something else. They tell me I’m really important too, at least to my face.

  10. As quality, technology, and calibration to critics’ palates have improved over the years, isn’t it possible that wines have also improved, leading to the ever-so-slightly increasing scores? If so, a recalibration is likely needed in order for scores to mean something again. How to do any such recalibration is the question. Bad vintages used to “help” with this, but given the global nature of the wine world and technology, this is becoming less and less likely.

  11. I believe that people will place less weight on the opinions and scores of the top critics and magazines and instead start looking at the scores and reviews of people with whom they share similar tastes and are more connected, either personally or through Twitter or blogs; a shift from old media to new media.

  12. I gather from all of this that people who have been tasting wine for 40 years as opposed to people who have been tasting wine for 30 years or 5 years will all have different reference points for their highest-scoring wines. I guess the oldest wine taster would be the most accurate scorer providing that his/her taste buds were still operating optimally. When is it time to retire?

  13. Wine tasting isn’t like pro sports, where you have to retire fairly young. It’s more like movie reviewing, or writing poetry. You can do it as long as you want to and are healthy.

  14. I work at a winery that received some decent scores from the Wine Advocate for our 2009 white wines (low 90s). Our 2010 wines – maybe not all, but definitely our pinot gris – were better than the 09s, and they all received scores in the mid-80s from the same critic.

    While we were flattered by the high scores, and unhappy with the lower scores, we know our own wines better than anyone, and we know that these scores were not an accurate representation of our wine quality.

    This is only the most recent reason I think the 100-point scale is bullsh*t. But the good scores have helped us sell wine, and will continue to do so. Like any artist, I turn my nose up at the critics, while still being at their mercy for my success. Such is life, I suppose!

  15. Dear Gabe, please don’t criticize the 100 point system simply because The Wine Advocate has shown insane inconsistency. Instead, criticize The Wine Advocate, and consider other publications to fairly review your wines.

  16. Blake – sorry to hear that your comment was not posted immediately on Steve’s blog but glad you two have sorted it out.

    Yossie – Perhaps, but then it comes back to the critic’s perception of what is good (yes, criticism is subjective, even with scores). In this example, have the wines of the Napa Valley headed in a good direction the past two decades?

    As Blake pointed out in his post, it does seem somewhat hard to give a 98+ to Kongsgaard The Judge Chardonnay when the same critic reviews grand cru white Burgundy. It will be interesting to see Galloni’s pinot noir reviews, particularly Marcassin, when his Sonoma reviews come out.

  17. Gabe – Thanks for sharing but sorry to hear of the inconsistency from the Wine Advocate. Perhaps Schildknecht will offer more consistency.

    Laura – I’ll drink to that!

    Robin – Taster age figures into wine tasting in interesting ways. I was recently speaking with a psychologist at a cocktail party and he told me that our sensory perception diminishes as we age and apparently the turning point, he said, is 42. Since I’m not quite at that milestone I breathed a sigh of relief.

    However, older tasters have had more of a chance to taste more wines and can, at least conceivably, draw on those as reference points. So age appears to cut both ways. At any rate, it’s a fascinating topic and one that I’d like to explore; if anyone has read research on this, please share the pertinent references.

  18. Who is Steve Heimoff?

  19. Dear Lars, if you ever are in the Bay Area from wherever you live, let me know. I will with pleasure let you know who I am.

  20. Steve and others

    The excuse offered for higher scores, that the wine is simply better, is just hogwash. The Wine Advocate, for one, claims it scores based upon peer groups. So, when Antonio sits down and tastes 600 Cali Cabs over a 3-day period, the scores relate only to that particular peer group.

    I have complained for years about how the same wine critic could score a $10 Spanish wine 92 points but give a $100 Napa Cab 91 points, and then try to tell us that the Cab is the better wine. It makes no sense. The root of this problem is the failure to use the full 100-point scale. There are enough numbers (100, to be exact) within the range that it should work. We were down to a 10-point scale. But that appears to have narrowed recently to 8 points. And we are at the point of no return. The critics have dug their graves with this inflated scale, in an effort to stay relevant.

    When Antonio Galloni first came to the Wine Advocate, he was scoring outstanding Tuscan and Piedmont wines 92 and 93 points, giving rave reviews. But he was up against the gravy train of scores, led by Robert Parker and Jay Miller. A couple of years later, people started noticing Antonio’s scores creeping up, in an effort to stay relevant.

    That is what happens. Highest score wins. That is why Spain was listening to Jay Miller. But it is also why Australia has learned to detest Jay Miller. And why Spain and Argentina will. He left no room for improvement, only failure. But when you become a wine critic for just 5 years, you do not really care about the future of those regions, or your future, as you are retiring in your 60s.

    Antonio should be smarter about this. He needs to scale back. The latest round of Napa scores is a bit absurd. When people can show that you are scoring Napa Cabs higher than Robert Parker, it is not about improved quality, it is about grade inflation.

  21. Nail on head, Tyler. The trade quotes are especially telling. I would add a couple things…

    — The prominence and dominance of ONLY 90-plus scores has been further solidified by the daily flow of email offers and flash sites that reflexively post only the highest ratings they can scrape up. The sameness of these offers has a numbing effect.

    — Gabe, here is a suggestion. Frustrated by critics’ scores? Ignore all the 100ptscale ratings. Less to stress over, and your customers will appreciate your daring to be different. It can be done; witness Hedges and http://www.scorevolution.com

    — Blake tells the truth: only two sources of ratings matter these days — WA and WS. And given the recent Jay Miller hoo-haw, the whole system is at risk of having its integrity fall off a 96-point cliff. At the very VERY least, recent events will shine light on the fact that the scores are being pumped out like sausages. And not by panels or editors, but rather by individual middle-aged men sitting down with 20 wines at a pop and not a crumb of food.

    — Steve, funny how you are so often quick to defend your fellow 100ptscale pushers on your own blog, but here you are just as quick to say “don’t lump us together.” Can’t have it both ways. And it’s especially disingenuous for you to say “there are many dreadful wines out there, and I’m also giving lower scores than ever to them.” Huh? Last time I checked, the magazine you write for stopped running any reviews under 85 and does not even mention wines apparently so “dreadful” that they were tasted and not scored. (By contrast, Wine & Spirits and Wine Spectator do list all wines tasted.)

    Bottom line: the 100pt scale has devolved beyond repair. What were once “buying guides” are now de facto marketing guides. The inflation Tyler is spotlighting here is a sign that the entire concept of applying numbers to wine is getting painted into a smaller and smaller corner. It used to be a 90-point corner. Now it’s about a 94-point corner.

  22. I wholeheartedly believe that 100-point rating system only serves to corral customers who lack the curiosity or desire to truly learn anything about wine. I’ve never met someone who was truly interested in the excitement, the poetry that is a bottle of wine who also cared about its point rating. If there was an objectively perfect ‘100 point’ wine, then it would seem unnecessary to drink any other wine, ever. We all know this isn’t realistic, it’s not even desirable.

    If you disagree, I hope you enjoy your Screaming Eagle with some Thai curry; I’ll take a nice Mosel Spätlese.

  23. Who really needs those points? What do they say about a wine? Nothing. I grew up with wine, in the so-called old world, helped in vineyards during summers, and knew Mosel Rieslings and some Priorat wines when hardly anybody in the U.S. appreciated or even knew about them. And though I have had many favorites and tasted some exclusive vintages in my life, it never came to my mind to rate them with a scale. I wouldn’t do this with coffee, nor with movies, music, women … Why wine?!

    I have also become a trained scientist during the past 20 years, and even in physics I find it hard and often pointless to rate anything on a 100-point scale (unless it’s an experiment in which one measures some level or magnitude in a unit that is pure convention). Now when it comes to wine, which is utterly subjective, this really makes me wonder – how do people employ this scale? People imbue themselves with an aura of “seriousness” by relying on numbers, but it strikes me as an unscientific approach to quantify something that is made of much more than quantity.

    And I have met enough winemakers who don’t care about these numbers. Unfortunately, the commercial drift to sell more with a “label” seems to spread … Well, just for the record: there are lots of people out there who care neither about wine criticism nor about lower 90s or upper 90s. They buy a bottle of wine and open it. Try that!

  24. And the best example I can offer of relevancy is the new scores on the Dunn wines.

    All of a sudden clients are emailing me, and asking if Randy Dunn changed his winemaking style.

    Yeah, after 30 years, Dunn changed it all to please a wine critic and get high scores!


  25. None of youse guys have mentioned the true yardstick of quality: CellarTracker. No, they can’t publish the range of WA, WS, and WE when it comes to new releases. But for finding out how people who BUY wine to have with their dinner and friends respond to a large selection of specific wines over time, it’s the only game in town. Down with the gatekeepers, up with the people.

  26. All of Galloni’s scores are ranging upward, except for 100s. There is nothing nefarious about this, but there are some things that are not helpful. The first is that Galloni first gained notice with his own e-publication, The Piedmont Report, which covered only a narrow range of Piemontese wines with which he has significant familiarity, plus some Brunellos. He arrives at the Wine Advocate (and I will stand up and say that I was a frequent and rabid agitator to have Daniel Thomases fired and Galloni installed as WA’s Italy critic), and over time, not only does he cover all of Italy (learning the rest of Italy would have been plenty of work for him), but also Champagne, Burgundy and California. His background in all but Piemontese wines is inadequate, and he is engaged in radical on-the-job training. Additionally, Parker needed to ensure that his fruit bomb legacy would be continued, and Galloni is doing that. Galloni does not subscribe to Parker’s foolish and wrong-headed theories of fine wine, so he can be counted upon to cheerlead Rhys Pinots, Ridge Monte Bello and other excellent wines that fall outside Parker’s traditional love embrace. He knows quality wines. However, his Tuscany/southern Italy coverage exhibited a real love for big-fruit, ripe, new-oaky wines, and over time, his Piemonte coverage is beginning to evidence the same. We now see that carrying over into his California coverage. The end result is the appearance that Antonio Galloni rarely meets wines of any stripe that he does not like. Add to that the fact that his philosophy is to emphasize the positive and leave most of the negative on the cutting-room floor, and you are looking at the results. The grades are inflated, but not in any dishonest way.

    I believe that Galloni simply lacks the tasting experience to be more discriminating, and in the early going, buoyed by living his dream and perhaps a desire, subconscious or not, not to upset the Parker points apple cart, he is succumbing most of the time to Matt Kramer’s “low-cut dress” theory…

  27. […] If 94 is the new 90, as W. Blake Gray suggested last week, will wine scores become crippled by inflation? It’s a good question. […]

  28. Steve Heimoff’s claim that “if scores are rising for certain categories of wine, it’s because quality is improving” and “[c]ritics are simply perceiving that increased quality” has also been Parker’s refrain for years, and it’s a complete fallacy. The problem with this reasoning is that the critics are the worst-situated of any of us to make the determination whether quality is in fact improving, because the wines they are tasting are deliberately made to elicit their approval. If a critic likes a particular wine more than he did five vintages ago, he is in no position to judge whether the quality is actually higher or whether the style of the wine has simply been tinkered to be a closer match to his personal, subjective preferences. And that’s the real story here. WA scores are going up not because the wines are getting better but because more wines are being made specifically to get higher WA scores.

  29. Mr. Keith Levenberg, it’s a which came first, the chicken or the egg? argument you pose. I taste California wine. I think the quality of California wine is getting better. Are a thousand wineries in California deliberately “tinkering” with their wines to match my “personal preferences”? I think not. That’s real conspiracy theory stuff. Instead, wineries are crafting their wines to what they perceive is a genuine shift in the consumers’ palate, of which I’m just one little part. You’re blaming the messenger [the critics] for proclaiming a style you apparently dislike. Besides, at the end of the day, I taste a lot more California wine than you do (probably; I don’t know what you do for a living), so I’m in a better position to judge whether or not quality is on the rise. It is, thankfully.

  30. Steve

    Is it so black and white for you?

    Is every wine that you have reviewed 95 points better than every wine that you have reviewed 94 points?

    My hope of any wine critic is that the answer is yes. Because it should be yes, every time.

  31. Great feedback, and great comments from everyone. Dr. V, you’ve got a great little blog going on

  32. Daniel: Yes.

  33. Steve,


    I should make mention, without inflating your ego, that I do pay attention to your reviews over the past year, while before never even noticing. I think many of your reviews (particularly with Domestic Pinot Noir) are spot on.

    So, a 97 point, $50 Calif PN, is a better wine than a $300 95 point Napa Cab?

  34. Daniel Posner: Yes. I need to add that “better” is a complicated concept. I may blog on this soon, to better understand it myself.

  35. Daniel, good job pushing the simple math that lurks as yet another “issue” with the 100ptscale. And Steve, good of you to clarify that, yes, a 97 is empirically better than a 95, which is empirically better than a 93, etc. (even if wines rated under 85 points are brushed away as if they no longer exist).

    Now, can someone explain how, on certain “Wines of the Year” lists, all of a sudden those carefully selected, set-in-stone, serially correct ratings are purposely ignored while editors/critics stretch to make yet one more marketing guide for the masses?

  36. […] Mike’s Steinberger’s Wine Diarist has quickly become one of my favorite wine pundits in the fray. His professional background as a former writer for Slate comes through time and again as he crafts wonderfully insightful and well-thought articles. One that I felt was worth sharing was from yesterday called Quiz Time wherein he talks about the recent points scoring in Napa by “Proxy-Parker”, Antonio Galloni. Everyone is in a bit of a flutter over the fact that the scores seem to have been inflated which Dr. Vino talks about. […]

  37. Steve

    I look forward to your blog where you better understand yourself. 😉

    In all seriousness, the bottom line problem that I have and you seem hesitant about is (and I single out Steve just for these purposes)…

    He rates 2007 Joseph Swan PN Trenton Vineyard 97 points (a great wine by the way) and by his own admission, that wine is better than the following wines

    2001 Mondavi Tokalon Cab
    2002 Phelps Backus Cab
    2003 Vineyard 29 Cab
    2006 David Arthur Elevation 1147
    2005 Staglin Cab
    2007 Chappellet Pritchard Hill Cab
    2001 Staglin Cab
    2004 Phelps Backus

    All of those scored just 94 or 95 points, but sell for 3 times as much as the Swan does.

    Now, Steve is not the best critical example to do this, as his focus is California.

    Robert Parker is the one to look at, but he insists that 98 from Spain is not necessarily better than 92 in Napa. It is all about peer groups…which is something no one can figure out.

    Do peer groups go by vintage, or is all of Napa one peer group?

    Or is it based upon when he used to sit down with Napa Valley Vintners at the Auberge and taste 300 wines…was that the peer group? undefined by vintage, but rather whatever free samples were on the table?

    Scratching your head?

    me too.

  38. Steve – I checked out the post that Blake said he commented on and still didn’t see it there. Since you say it would go through without being held for moderation, have you checked your spam queue?

    Blake – did you happen to run the numbers and find out what the median score was for Napa cabs (or reds)? As you point out in your post, the highest score for an SB was 92, so the median for reds may have been even higher than a 92.

  39. I found 3 comments from Blake that were in my spam box. I put them all up. I would never deliberately delete a comment unless it was libelous, insane or actual spam.

  40. “Wine tasting isn’t like pro sports, where you have to retire fairly young. It’s more like movie reviewing, or writing poetry. You can do it as long as you want to and are healthy.”

    Steve, sorry, but Parker is proof positive that the above just ain’t so. Maybe age is not the right yardstick; maybe it is abuse of one’s palate with 10,000 wines a year (plus the “prodigious” quantities that the man drinks). In any event, when you look at all of the wine critics of comparable age…Laube, Suckling, Parker, Miller, Tanzer…you see that increasingly ripe, fruit-forward, thick alcoholic wines garner the lion’s share of the highest scores of all of them these days. Perhaps that has been true of Suckling and Parker for a relatively longer time, but having followed the entire careers of all of the above, I can tell you that Laube, to some extent, and Tanzer, to a greater extent, have come to love the big wines late in life.

    “You can do it as long as you want to and are healthy”, but the question is whether or not anybody is going to give a rat’s ass about what you are writing beyond a certain point…

  41. Re: Bill Klapp’s comment, I think nothing I, or anyone, can say will change his attitude or convince him of things he doesn’t agree with. So I won’t try.

  42. Generally true, Steve, but I am an absolute pill when I am right! Others have a strong shot at changing my attitude, Steve. You? Not so much…

  43. Dr. V: You ask a very good question; I did not seek a median for just Napa reds.

  44. If there’s any question as to how an individual, and that individual’s perceived relevance to wine criticism, affects the sale of wine, I have Steve’s POV in this thread as all the evidence I need to cast a suspicious eye on all wine criticism. I don’t believe that was his intention, but frankly I can’t get beyond his general embrace of self-promotion as justification and benefit.

    I’ve been involved in the wine business for 30+ years. There is a behind-the-back handshake association between the shelftalker’s scoring citation from a publication (be it the WA, WS, etc., all paid for) and the wine. The expectation is that the consumer will be interested not only in the 98-point wine but in the publication that published the score as well.

    To present that the wine reviewer cannot control what anyone does with their reviews is disingenuous. A 95+ score will be used to sell that wine and Steve knows it. That is market power of influence pure and simple.

  45. Dear siaubingas whoever you are, you are full of shit. You insult me behind your anonymity. Identify yourself–take me on directly. “A behind the back handshake association”? Prove it. Come out publicly and present your evidence. If you can’t, think twice before throwing your stink bombs into the conversation.

  46. Steve. Please take yourself down a notch or two. Please.

    What I can prove is that the marque of a given wine rating publication will be 400% larger than the wine review text on a given shelftalker. Much larger than the winemaker’s name or even the given wine. A visit to any wine retail location should provide you with sufficient proof.

    Again, that shelftalker represents an opportunity for sales for the wine and for the publisher of the rating; otherwise, why the prominent reference?

  47. siaubingas: Marques? Shelftalkers? What the heck are you talking about? I asked you for proof of your “behind the back handshake association” accusation and you have none.

  48. I’m sorry if you’re confused Steve. In my 11:08 post I described exactly what that association is.

    Again, anyone who has visited a retail wine store that has printed tags (called shelftalkers in the wine trade if you’re not aware) advertising not only the wine but the publication that issued that wine a high score knows exactly what I’m talking about.

    The wine rating publications reference is prominent. They don’t allow that out of a sense of charity.

  49. Saying that only scores from WS and Parker matter shows that you are not up to date with information on wine. CellarTracker, among real wine drinkers, is a more powerful tool. I still look at Parker and WS for purchasing more expensive wines. I am not dropping $100 on a bottle that two pros gave 90 points to. I have seen many times a wine I just had at an offline, which two of us gave 95 and raved about, disappear in a hurry via Wine-Searcher. Locally in Minneapolis I sold out 15 cases of 2001 Chateau Montrose that had been sitting through previous sales. Once I emailed out how good the wine was, many people I know and don’t know bought up the wine in less than a week. Boards are also moving a ton of wine with the old “Great deal” or “Wine Hunter.”

  50. Steve Heimoff: You seem to have misunderstood what I was saying. The problem I mention is *not* a chicken-or-the-egg problem, it’s an epistemological problem: whether the claim you are making is knowable to you at all. No, I don’t think any California wineries are deliberately tinkering with their style to match your specific personal preferences, but it just so happens that you seem to share the preferences of the person(s) whom they *are* tinkering with their style to please, so you are just as poorly situated as they are to make the determination whether quality’s on the rise. The amount of California wine you taste is pretty much irrelevant. If suddenly every California winery started making their wine to match *my* subjective preferences, I would think quality was on the rise, too. You would probably think quality went down the toilet.

  51. So, in other words, Keith, according to your epistemology, nothing is knowable, because we’re all trapped in our own perceptions. That’s not epistemology, it’s solipsism. I would suggest a more reasonable way for humans to understand their world is that, those things upon which most of us agree are “real.” And since most consumers like the kinds of wines I (and other critics) like, then that makes those wines “really” good. Not because I say so, but because the overwhelming number of wine drinkers say so. If you do not enjoy these wines, then you are in the minority.

  52. Various recent threads on Dr Vino addressing point-based rating systems presented the issue well. There is clearly a crisis of confidence not only in the wine-score churn of the retail world but also in the “wine personalities” who generate the scores.

    Keith’s point is well presented. Wine reviewing publications don’t taste every wine made; that’s unreasonable. Wine publications aren’t independent arbiters; they’re for-profit publications, and they gravitate to those wines they have had positive experiences with historically. Steve can be honest and say a 90-point rating is more valuable to any given wine publication than an 80-point rating. Wine publications are part of the wine trade, sales being the goal and a requirement for survival.

    Wine score inflation being the thread topic: what tools do wine producers and distributors have available to them in a down economy? The wine trade was hit very hard. Generating buzz is critical, and wine scores are clearly associated with potential commercial buzz and wine sales. I can’t present statistical proof, but I feel wine scores have blossomed in an attempt to generate interest. Given the business realities of the wine trade, including publications, it would shock me more if those tools weren’t ramped up.

  53. What percent of wine buyers even know or care about scores? I am sure we are talking less than 20% and that might be stretching it. Of the 500 people I have educated this year on wine I doubt five have even known who Robert Parker is or care who he is. Almost zero get Wine Spectator magazine. I think a lot of us in the “wine geek” circle are arguing amongst ourselves on points. On that note I will still review wines using the 100 point scale.


  54. @John Glas: on the other side of the Atlantic, those publications generate interest only among producers who need to sell their wines overseas and therefore need to please “customers”. Most wine lovers, many of whom are probably more literate in wine than those who make up the scores, ignore what Parker and Co. have to say. The really interested ones go to regional wineries and stop by them on their vacations.

  55. John, it depends on the market you’re in, possibly? In the Chicago market, commercial wine sales are dominated by “big box” wine retailers whose staff is largely disconnected, thus leaving consumers to do their own research or rely on in-store advertising.

    Consumers will use many resources. I’ll agree that print is in decline and online/mobile resources have gained substantial ground. Another thread mentioned a consumer, buried in their iPhone screen, researching. Wine reviews and scores are a very common component of that online search.

    Consumers want to make an educated wine purchase decision. The retail challenge is presenting a compelling alternative to the wine publications’ generic-rating juggernaut. Wine education is healthier now than in any period I recall. Today’s wine retail professional is better educated and better positioned to capture that consumer – and consumers will appreciate personal service and direct advice that’s responsive to their tastes and preferences.

  56. The assumption that these inflated scores are a gimmick critics are using in order to stay relevant is hard to ignore. Contrary to popular consensus, wine is growing in popularity with the American people (again), and the growth began during the second Bush administration (wine sales soared!). Granted, it may have been fueled by individuals and families facing financial ruin, but sales have remained higher since then than they were before. The point being, scores have long been abandoned by the “average” American as price has become the number one factor in choosing a particular brand. The fact that a $10 bottle of wine was given a score of 95 is a bonus but not usually a deciding factor.

  57. Steve, talk about logic chasing its own tail! Damn that Keith Levenberg! He IS in that dreaded minority of people who have overcome their own uniquely American cultural insecurities and trust their own palates instead of what some self-proclaimed expert from the Maryland sticks dictates is the best. Parker created his audience, Steve: a herd of sheep who have rushed out and bought whatever Parker laid the big points upon, and learned to like Parkerized wines because that is virtually all that they drink. One need only spend a few minutes scanning the Squires board to see that phenomenon in action. Parker’s palate and California’s climate have created that “majority” style that you point to. I do think that you are correct that no California winemakers are tinkering with their wines to match your personal preferences, in that, despite years of beating your chest and tooting your own horn, you remain irrelevant in the shaping of the American wine culture. Lastly, your assertion of the “overwhelming” number of wine drinkers in your camp is not only not quantified (or, I suspect, quantifiable), but almost surely wrong, except when viewed through the myopic, California-only lenses in your glasses. Indeed, there seems to be a clear and growing trend away from the big, fruity, oaky wines these days, and there can be no question that the once-faithful are fleeing Parker in droves…

  58. I am not so naive as to claim that the 100-point scale will disappear entirely, but this comment thread is showcasing the many reasons that wine ratings are becoming more marginalized with each passing vintage.

    Scores are still most impactful at either low price points or when the scores are super-high. And they tend to get more attention in big-box stores where direct customer service is not a priority. Ratings are in fact disappearing at a rapidly rising number of small shops (see Dr. Vino’s recent post on this phenomenon: http://www.drvino.com/2011/11/03/wine-shops-no-scores/).

    As Tom Merle and John Glas point out, Cellar Tracker is the only resource that provides crowdsourced ratings. Naturally, Cellar Tracker is not the place to look for NEW releases, but for released wines with a critical mass of tasters, there can be no doubt that it provides a broad and unbiased composite of views.

    Steve H., your sharp reactions here suggest you are either unwilling or unable to accept the truth that all traditional critics’ ratings have become less about consumer guidance than about market positioning. The “behind the back handshake” that siaubingas referred to was not an accusation that you or anyone else who dishes out 90-point ratings is corrupt. Nobody questioned your personal ethical integrity. But if you don’t realize that the retailers who post such scores and the mags that generate them are working on an unspoken agreement that ratings are a mere promotional tool, you are way out of touch.

    Considerable research by Wine Market Council over the past decade has shown that the number one source that people use for wine-buying advice is DIRECT RECOMMENDATIONS — from friends, peers, mentors and retailers, NOT from magazines. I would argue that the only thing keeping the glossy mags and Parker relevant in the retail realm is a combination of (certain) retailers’ laziness and the sheer volume of available wines, which makes people more anxious to get some sort of specific guidance in the face of countless choices.

    I deal directly with hundreds of consumers annually in the context of leading tastings, and my impression is that evolved consumers naturally gravitate away from simple scores toward advice from those “in the know.”

  59. While I agree with Tish about this thread highlighting some of the flaws in the 100-point scale, it has also (possibly accidentally) highlighted the reason it isn’t going away. There are over fifty comments on this thread, and counting. Like a train-wreck, it’s impossible to look away.

    I spent about 8 years as a wine retailer, and am going on 4 years as a winemaker. Anyone you talk to in the industry can give you reasons they hate the 100-point scale. But every winemaker can also tell you their most recent scores. Every retailer looks at the Wine Spectator Top-100. In fact, the only people who don’t seem to care about the 100-point scale are the majority of customers!

    I can easily list a dozen reasons why the 100-point scale is flawed, but nothing in life is perfect. Anything that can drum up this much passion in an internet wine thread can’t be all bad.

  60. Thank you, I just wasted 20 minutes reading through the comments. Although, now I can score my stupidity–96 pts.

    (Original post by Dr Vino is superb.)

  61. Steve aggressively asked me to “prove” something – I’m not really sure what. He was clearly emotional. It’s not necessary to prove a beneficial relationship between wine publications and wine sales. Presenting circumstances that may suggest the presence of a mutually beneficial relationship is sufficient to bring that relationship into question. The recent drama associated with the Wine Advocate isn’t the first time this relationship has been questioned. Other publications have carried this burden much longer than the WA – a publication and publisher that I do respect for their body of work.

    The 100 point scale isn’t at fault or in question. It’s a fine scale that anyone can recognize. It’s the publications/entities that issue the 100 point scale scores that are in question.

    The scores aren’t based on a flat line – unless someone can point out a publication that has ever issued a score of less than 50, the 100-point scale is in reality a 50-point scale.

    The scores offered aren’t flat statistically; they operate on a sharply ascending curve. A 90-point wine is better regarded than an 85-point wine, and a 95-point wine even more so. Both examples present the same 5-point swing, yet they can’t be compared.

  62. […] somehow found myself once again in the crosshairs over at Dr. Vino’s blog the other day (and what a great job Tyler Colman is doing there). Tyler was writing about “wine […]

  63. […] invaluable Dr. Vino rounds-up evidence that score inflation is causing wine drinkers to become jaded, making 90-point […]

  64. “In fact, the only people who don’t seem to care about the 100-point scale are the majority of customers!”

    Gabe, the majority of customers don’t drink good wine. Scores are useful in the sense that they get people drinking better wine than Yellow Tail, Two Buck Chuck, Barefoot, etc. While many don’t agree with points, they at least offer some guidance for the consumer.

    Also, most people can’t even name their favorite wine among those they’ve sampled in the last five years. They might say Riesling or Cab but don’t remember the producer and certainly not the vintage.

    For all of you against points: how are we going to educate wine consumers so they drink better wine? I would hope no one above would encourage the crap wines I listed.

  65. It’s remarkably easy to recommend good wine without the 100-point scale, and it is actually much easier to educate someone about wine without it. Winemaking is not a competition, it’s an art, and nobody would describe a Picasso or Matisse as a “97-point painting.”

  66. Does anybody know if Blake Gray is as much of a prick in person as he is online?

  67. In general, if the true 100 point scale were used, I think that we would all have less of an issue. If the tasting notes set the wines apart, we would have less of an issue.

    If a good wine got 80 points and a GREAT wine got 90 points, the formula would work better.

    Unfortunately, 90 points is average and 99 points is great. That needs to change, or the point thing will continue to be a hot button topic.

    11 years in this business, and I can see many consumers getting sick of the points system. At the same time, new folks coming into wine collecting are all about it. And if you are buying Bordeaux at $100-1000 per btl, it is all about points.

  68. “On January 9th, 2012 at 11:35 pm ,Wino wrote:

    Does anybody know if Blake Gray is as much of a prick in person as he is online?”

    I do not know, but probably not. I know that I am not!

  69. Gabe,

    You assume that all wine shops can recommend good wine. Most people who own a wine shop know something (some more than others) and most people who own a liquor store/wine shop know very little. In Minneapolis there are less than ten wine shops I would feel comfortable sending a friend into to get a good bottle of wine for $10.

  70. Wino – Sorry to have to call foul, but if Blake wrote something here or elsewhere that irked you, please engage with that specifically. Ad hominem swipes have no place here; this is especially so from someone not divulging their own identity.

  71. I like this comparison, same critic…

    The 2009 Meursault is a gorgeous wine at this level. It shows marvelous harmony in a round, supple style well suited to near-term drinking. I find it amazing that Jadot produced a whopping 330 barrels of this wine, an enormity by Burgundian standards, at this level of quality and price. 89 points

    The 2009 Chardonnay Bench Break is a rich, lush wine laced with expressive tropical fruit. It shows excellent depth all the way through to the round, enveloping finish. The Bench Break was fermented in oak barrels (55% new) and aged on its lees with batonnage for ten months. 89 Points

  72. […] I type this, over on Twitter, Dr. Vino, Steve!, and lord knows where else, folks are rehashing—with considerable […]

  73. Having read all of the comments, I am awarding you 92 points.

  74. Parker has devalued himself. Steven Tanzer has not yet given anyone 100 points, but Parker and Wine Spectator have done so on a number of occasions. I much prefer the writings of Tanzer, Broadbent, Johnson, Robinson, Jeffords, Feiring, and others.

  75. […] less likely to result in the inflated wine ratings they saw all around them.” Ironically, score inflation is the most likely threat to the 100-point system […]

  76. […] points…” . The wide range of the scale was meant to ensure an accurate assessment of all of a wine’s elements. Unfortunately, score inflation is now one of the most common threats in the use of the 100-point system […]

  77. […] points…” . The wide range of the scale was meant to ensure an accurate assessment of all of a wine’s elements. Unfortunately, score inflation is now one of the most common threats in the use of the 100-point system […]

  78. […] all, 20 wines had received scores as possible 100 pointers based on barrel tastings. In January, I suggested that rampant score inflation posed the biggest threat to the use of scores and eighteen 100s do not reverse my view. What’s your […]

  79. […] aspect of being troubling also accompanies wine score inflation. Dr. Vino touched on it the other day, and while he didn’t exactly condemn it, he did kind of cast a mild aspersion on […]

  80. […] Points for all? Recent evidence of wine score inflation Permalink | Comments (0) | | winemaking This entry was posted on Thursday, August […]

  81. […] Mike Steinberger’s Wine Diarist has quickly become one of my favorite wine pundits in the fray. His professional background as a former writer for Slate comes through time and again as he crafts wonderfully insightful and well-thought-out articles. One that I felt was worth sharing was from yesterday, called Quiz Time, wherein he talks about the recent points scoring in Napa by “Proxy-Parker” Antonio Galloni. Everyone is in a bit of a flutter over the fact that the scores seem to have been inflated, which Dr. Vino talks about. […]

