
On retasting


Lately I’ve seen a spurt of requests for me to retaste wines. The way it works is, I do my reviewing and scoring, then send it all in to the magazine electronically. That’s theoretically the end of my involvement — I say “theoretically” because, as you’ll see, sometimes it isn’t.

At some point after I send in my reviews, Wine Enthusiast notifies the wineries what the scores are — not the text of my reviews, just the scores. That lets the wineries buy a label ad, if they wish; whether they do or not has no bearing on my reviews, nor do I know if they do or don’t buy an ad, nor do I wish to know. Over the years, a few wineries, upon learning their score (and being disappointed by it), have asked me to retaste the wine, which I’m always glad to do, provided it’s okay with my colleagues in New York, who have to adhere to a fairly tight printing schedule.

Almost always, when the winery asks me to retaste, they use the words “bad bottle” as their reason why, accompanied by the verb “afraid,” as in, “We’re afraid you may have had a bad bottle, and we’d like to resubmit the wine for a retasting.” Well, what else could they say? They’re not going to say, “We know our wine deserved that middling rating, but we’re hoping that, by some miracle, you’ll like it a whole lot better the second time around.” “Bad bottle” also seems to be the most delicate way of saying, “You idiot! You must be out of your mind if all you could give that wine was 84 points!”

In my experience of retasting, I’m pretty consistent. Sometimes I’m tempted to score the wine even lower the second time around. Sometimes I’m tempted to rate it higher. If the former, I usually let the first score stand. If the latter, I’m happy to give a higher score. Why wouldn’t I be?

I don’t know why lately I’ve had more requests for retasting than ever. I hope it’s temporary. I obviously couldn’t retaste everything — instead of 5,000 wines a year or whatever it is, it would be 10,000. As long as the requests are reasonably paced out, I don’t mind. But I do know that our staff in New York generally frowns on retasting, unless there’s some reason to think that the original bottle really was “bad.” Like I said, I’m liberal about retasting, but if the requests start piling up, I’m going to have to start saying No.

So what does a “bad bottle” mean? It could be anything. A bottle can suffer in the delivery process, usually from high temperature. A bottle can be corked, but I would recognize that and would not hold it against the wine. What about inherent flaws that mar wine? Does volatile acidity make for a bad bottle, which a taste of a second bottle will correct? Brettanomyces? A super-high pH? Those things make for bad wine, but I don’t see why a second or third taste would change anything, unless the winery hadn’t “equalized” the wine, i.e., put it all into a single blending tank before bottling, to make sure all wines with the same label are indeed the same wine. Doesn’t everybody do that?

My hunch is that when a winery tells me they’re afraid I had a “bad bottle” they’re hoping against hope that lightning will strike and an 84 will turn into a 90-plus. I guess they could always put something else in the second bottle — not the actual wine, but a reserve or special bottling — and try to get the score raised that way. But I don’t think anybody in California would do that. Would they? Maybe over in Europe, where they’re all corrupt ;> But not here. (Disclosure: that is my way of mocking xenophobic Republicans. We love Europe!)

Sometimes a winery will ask me to retaste, not because they think I had a “bad bottle”, but for some other reason. If the winemaker is very sure he made a terrific wine — if that wine has gotten good scores from the competition (and I don’t mean a bronze in Indianapolis, but from a really reputable critic) — then I don’t blame the winemaker one bit for thinking I made a mistake. I’d probably do the same thing if the shoe were on the other foot. Just today somebody asked me to retaste a wine I gave 85 points to. It was a good wine, but I thought it was too oaky. The person explained that the wine only had 40% new oak on it. Well, what can I say? Sometimes 100% new oak is fine. Sometimes 40% new oak is too oaky. Depends on the wine.

Anyway, lest I forget that every time I review a wine, I’m playing with people’s lives, these requests for retasting remind me. It’s a very humbling experience.

Off to the Napa Valley Wine Auction for the next 3 days. Will do my best to post every day.

P.S. Award of non-distinction for the worst bottle closure in history: Castello di Amorosa 2006 Late Harvest Semillon. I had to chop this stupid hard plastic coating off with the edge of my corkscrew, and my kitchen counter had about 100 tiny little pieces I had to clean up. What were they thinking? Fortunately, I don’t let this kind of nonsense interfere with my reviewing process. If I did, this wine would score minus-zero.

  1. I know that some wine programs with, say, over 2,000-case annual production may split that production over two or more bottling events. It is very possible, especially if concentrate is added and the wine is filtered, that the first batch bottled may differ from subsequent batches. Owners may feel the first bottling was not as good as a later bottling.

    What might be effecting an increase in retaste requests is score inflation. Seems to me that up to about a year ago, if a wine got that magic 90 points or more, retailers and distributors were open to selling it. Somehow it seems that 94 points is the new 90.

  2. These wineries are certainly naive. Isn’t there a built-in conflict of interest in a declared retasting, i.e., where the winery has been informed that there will be a retasting? Serious professional critics also happen to be human. So won’t critics consciously or unconsciously have a bias to substantiate their initial evaluation? What if the critic found a difference, positive or negative? Doesn’t that undermine the discrimination and consistency of the taster’s palate? We know that Robert Hodgson’s meticulous research found only one in 10 wine judges can replicate their scores. I assume that your retasting occurs, like the CA competition tasting, blind; that is, the wine being retasted is placed among other bagged wines. If it is, and you are “pretty consistent,” you certainly beat the other pros at this game.

    By the way, it’s worth noting that your counterpart at WS retastes, often twice, when a wine that should score well scores poorly.

  3. Well, at least I know where to find 50-60 people of no respect, honor, esteem; with no principles, worthiness or virtues. Just look at the judges for the Indy Wine Competition.

  4. Hi Steve, I’m wondering if you would be willing to share with readers what percentage of samples you receive, and then review, from each tier of boutique/mid/large scale wineries annually? This relates to a conversation some have been having about marketing resources and the potential for smaller wineries to maximize D2C, vineyard to glass, and educating consumers about where their wine comes from, ultimately influencing purchase decisions and market share.

    I believe there is a group of consumers who rely heavily on wine critics, bloggers and writers to assign a score and then select what wines to purchase according to that trusted source and score. I’m curious if the playing field is level among the varying tiers of wineries and if each tier is adequately represented when wine critics, bloggers and writers are reviewing over some span of time. E.g., you receive 100 samples a month: 33 are from boutique wineries, 33 from mid-size, 33 from what I refer to as “Goliaths,” and 1 obscure, hard-to-find wine a month you send to me! Cheers!

  5. Morton Leslie says:

    If I were a critic I wouldn’t retaste a wine that the maker suggested was a “bad bottle.” Unless the wine was corked, the “bad bottle” excuse is pretty lame if you consider the odds of the wine critic getting the one bad bottle. If the critic actually gets one for reasons other than cork, you can bet a lot of people are getting them too. What does getting a bad bottle say about the professionalism of the winemaker? He should be getting you the best tasting sample he can possibly scare up.

    If I were the winemaker, I’d keep my mouth shut and avoid suggesting I had lousy quality control. I’d just ask for a retaste. I’d cite my confidence in the wine and describe its positive characteristics. The “bad bottle” excuse is really a desire to get the wine retasted in more favorable circumstances. More likely than not, the “unfair score” was created by how the wine contrasted with the wine or wines tasted before it. In a perfect world a winemaker would get a group of really crappy wines immediately preceding theirs in the critic’s tasting lineup.

  6. Two Buck Al says:

    Steve – do you really not believe in bottle variation? You must own a cellar yourself. Have you really never noticed that bottles from the same case can taste vastly different?

  7. Two Buck: Sure I have. Doesn’t make my job any easier.

  8. Morton, I think it’s just something they say when they don’t know what else to say. They’re a little uptight and embarrassed and they sometimes don’t know me too well, so they just say that.

  9. Thomson Vyds, that’s an awfully open-ended question that I can’t go into in detail now. I taste about 5,000 wines a year and it seems to me they’re from all tiers — super expensive to under $5. I treat them all exactly the same in terms of my tasting methods.

  10. Tom, I don’t think I have a bias when retasting because I usually don’t know what the wine is. Sometimes I do, because there’s a deadline in NY to go to the printer and I’m unable to hold the wine long enough to sneak it into a blind tasting of its peers. But usually I don’t know what it is, so how could I be biased? As for whether finding a difference undermines my consistency, no, it doesn’t, when you consider bottle variation over space and time. Anyway, I’ve said a million times it’s not an exact science!!! It’s not like 2+2=4 forever, and any time you add 2+2 it’s going to be 4. I challenge any winetaster in the world to consistently score the same wines the same way over time. Never happened, never will.

  11. Jon Bjork, I’m tending to agree with you that 94 is the new 90. Someone told me Parker is now rating wines above 100. I don’t know if that’s true or not. Maybe somebody here can let me know.

  12. Now I know why so many professional judges vary scores, sometimes widely, when tasting the same wine in a different flight. Clearly, Hodgson misinterpreted his findings. The differences were due to …. bottle variation.

    Using the term “exact science” seems to trivialize the differences. If it isn’t an exact science (2+2=4), what exactly is it? An inexact science, an art, a craft? And if it is inexact, isn’t the 100-point (really 35-point) system unnaturally precise? I know you are stuck with it, but there should be some sort of disclaimer, like the discussion we had some months ago about standard deviation. With so much riding on over/under 90, isn’t there a moral issue here?

  13. A suggestion to all those wondering if you’ve sent a “bad bottle”…

    DO YOUR OWN MARKETING!!!!! Quit relying on others to market your wine. In such a tough sales market as we are experiencing, it’s even more vital that YOU the grower, winemaker and owner (the guy or gal who signs the checks) get out there and represent your brand.

    By sending samples to people, you are entering a lotto… a risk, a gamble.

    I do not think the bank you borrow from would appreciate knowing you are putting your wine eggs in such a basket.

    Market yourself and you’ll never be disappointed.

  14. I’m sure that bottle variation is part of it, but I think variation/evolution over time also makes people seek retastes. I suspect that the desire to have the wine reviewed in time for its release leads to submission before the wine is hitting its stride. When the resulting crappy score emerges three months later, when the wine is actually starting to show well, it’s natural to want the wine to be rejudged on its current merits. Of course the timing of submission is on the winemakers, but the lag between submission and publication probably causes many to roll the dice too early.

  15. CA Winediva says:

    Hi Steve, Interesting post. Before I retired I was the “lead” at a very busy Disney wine bar. We sold dozens of cases a day. We usually had about 18 wines for sale by the glass daily, plus wine tastings. We smelled every bottle opened, and tasted many of the wines we poured. We were all sommelier certified. I can’t begin to tell you the amount of bottle variation we had in every case. These were mainly high-production wines. In the early 2000s we had a tremendous amount of TCA taint. That has pretty much slowed the past 5 years or so. You’ve got a tough job, but someone has to do it.

  16. Wines really vary as they age right after bottling (1–6 months), going in and out of phases. I think most people who are into wine notice this.

    I’ve never asked anyone to retaste, but I understand that a winery is always guessing at the best time to submit for reviews. Sometimes they pull the trigger too early and regret it.

    I don’t believe the bad bottle thing. I have known wines submitted to WE and WS to be scored 10 points differently. It can make a winery want a second bite at the apple.

  17. Hi Steve,

    A very wise man, in reference to scoring wine, once told me: “Just be honest. And doubt yourself”. Very sage advice I’d say.

    Sadly, California wineries are falling off faster than a prom dress; these are desperate times, which evidently call for desperate measures. The thing is that you know that you got it right the first time and that if there were actual flaws, you would have recognized them and perhaps even requested a different sample (especially if it was Harlan or such). I do wish, however, that virtually everyone that submits a bottle for Laube to rate would request a retasting. Sorry, Jim, you’ve got it coming.

    David Boyer

  18. David, maybe if Laube started retasting those 65s would go all the way up to 66!

  19. HA! Steve, you certainly nailed THAT!

  20. Kudos for the comment on plastic/wax-dipped secondary closures… form definitely does not follow function!

  21. We’ll never know whether Laube’s retastes because of a low score ever led to an adjustment of a review for publication. We do know that he will always retaste, usually twice, wines that one would expect to score higher. This is commendable.

  22. Sorry, Steve, but this exchange calls out for a reference to a certain website that has built into its evaluation system reviews by many discriminating wine enthusiasts conducted over time. This format would seem to eliminate the problems of bottle variation and selecting the optimal time for drinking.

  23. To Steve and Tom Merle,

    I need to clarify an important point. All my work at the California State Fair commercial wine competition included ONLY replicate samples poured from the SAME bottle and served in the SAME flight. Therefore, bottle-to-bottle variation is not a factor. Each judge receives four triplicate samples. Under these conditions only a few judges can replicate their scores within a single medal range. Secondly, those judges who are superior judges appear to be superior by chance alone, as in subsequent years their superior status wanes to the ordinary.

    I have chatted with judges who possess the honorable MW, who deplore those other judges who really have not been schooled in the fine art of tasting. I must bite my tongue to not reveal how this judge, who judges internationally, fared with respect to other judges. But, I will say that the hubris was not justified.

    Finally, I have never intended to imply that judges are incompetent. The task is difficult, so difficult that the consumer must be aware of the level of error associated with wine competitions.

    As for your alleged ability to be consistent, based on my studies, I always view such a statement with caution. I have seen the best give the same wine (and I do mean the same wine except for glass to glass variation*) scores ranging from no award to gold!

    Finally, unless one submits oneself to a rigorous experimental plan designed to examine consistency, such a claim can only be viewed as self-serving. Sorry, Steve.

    * By glass to glass variation, I mean glasses washed by the same machine. Thousands of glasses are processed every day at the State Fair.

  24. Dear Steve,
    Should you wish to see how consistent you are, perhaps you might apply to become a judge at the California State Fair. You can then judge along with your contemporaries in just the kind of environment that will test your ability to consistently award the same wine a similar medal. Presently, we hold these results confidential, but you might be able to convince the chief judge to go public…?

    The results of the judges’ awards are now being looked at by a crew of expert statisticians. I do not include myself in that league. Dom Cicchetti of Yale and Jing Cao of SMU are looking at the data. They will be presenting papers at the upcoming Association of Wine Economists convention in Davis later this month. I look forward to their views, as should you.

  25. Bob, I’ll look for that paper when they post it on the AWE website.

  26. O.K. I’ll inform Karl Storchmann, managing editor, of your interest.

