Every day, I get blast email advertisements from wineries or wine stores touting the latest 90-plus point score from Suckling, Parker, Vinous or some other esteemed critic. Here’s an example that came in on Saturday: I’m reproducing everything except the actual winery/wine.
_____ Winery’s ____ Napa Red Wine 2013 Rated 92JS.
Notice how the “92JS” is printed in the same font type and size as the name of the winery and wine. That assigns them equal importance; the rating and critic are virtually part of the brand. Later in the ad, they have the full “James Suckling Review” followed by a full “Wine Spectator Review” [of 90 points]. This is followed by the winery’s own “Wine Tasting Notes,” which by and large echo Spectator’s and Suckling’s descriptions.
Built along similar lines was a recent email ad for a certain Brunello: The headline was “2011 ____ Brunello di Montalcino DOCG”; immediately beneath is (in slightly smaller point size), “94 Points Vinous / Antonio Galloni.”
We can see that, in these headlines and sub-heads, through physical proximity on the page or screen, the ads’ creators have linked the name of the winery and the wine to the name of the famous critic and his point score. One of the central tenets of advertising is to get the most important part of the message across immediately and strongly. (This is why so many T.V. commercials begin with the advertiser’s name—you hear and see it before you can change the channel or click the “mute” button.) In like fashion, most of us will quickly read a headline (even if we don’t want to) before skipping the rest of the ad. The headline thus stays in the brain: “Winery” “Wine Critic” “90-plus point score.” That’s really all the winery or wine store wants you to retain. They don’t expect you to read the entire ad, or to immediately buy the wine based on the headline. They do expect that the “Winery” “Wine Critic” “90-plus point score” information will stay embedded in your brain cells, which will make you more likely to buy the wine the next time you’re looking for something, or at least have a favorable view of it.
This reliance of wineries and wine stores on famous critics’ reviews and scores is as strong as ever. There has been a well-publicized revolt against it by sommeliers and bloggers, but their resistance has all the power of a wet noodle. You might as well thrash against the storm; it does no good. The dominance of the famous wine critic is so ensconced in this country (and throughout large parts of Asia) that it shows no signs of being undermined anytime soon. You can regret it; you can rant against it; you can list all the reasons why it’s unhealthy, but you can’t change the facts.
Wineries are complicit in this phenomenon; they are co-dependents in this 12-Step addiction to critics. Wineries, of course, live and die by the same sword: A bad review is not helpful, but wineries will never publish a bad review. They assume (rightly) that bad reviews will quickly be swept away by the never-ending tsunami of information swamping consumers.
Which brings us back to 90-point scores. They’re everywhere. You can call it score inflation, you can argue that winemaking quality is higher, or that vintages are better, but for whatever reason, 90-plus points is more common than ever. Ninety is the new 87. Wineries love a score of 90, but I’ve heard that sometimes they’re disappointed they didn’t get 93, 94 or higher. Even 95 points has been lessened by its ubiquity.
Hosemaster lampooned this, likening 100-point scores to Oprah Winfrey giving out cars to the studio audience on her T.V. show. (“You get a car! And you get a car! And you get a car! And YOU get a car! Everybody gets a car!”) Why does this sort of thing happen? Enquiring minds want to know. In legalese, one must ask, “Cui bono?”—Who benefits? In Oprah’s case, she’s not paying for the cars herself; they’re provided by the manufacturers, who presumably take a tax writeoff. It’s a win-win-win situation for Oprah, the automakers and the audience.
Cui bono when it comes to high scores? The wineries, of course, and the wine stores that sell their wines (and put together the email blast advertisements). And what of the critics?
Step into the tall weeds with me, reader. A wine critic who gives a wine a high score gets something no money can buy: exposure. His name goes out on all those email blast advertisements (and other forms of marketing). That name is seen by tens of thousands of people, thereby making the famous wine critic more famous than ever. Just as the wine is linked to the critic in the headline, the critic’s name is linked to the 90-plus wine; both are meta-branded. (It’s the same thing as when politicians running for public office vie for the endorsement of famous Hollywood stars, rock stars and sports figures: the halo effect of fame and glamor by association.) There is, therefore, a motive on the part of critics to amplify their point scores.
But motive alone neither proves a case nor makes anyone guilty. We cannot impute venality to this current rash of high scores; we can merely take note of it. Notice also that the high scores are coming from older critics. Palates do, in fact, change over the years. Perhaps there’s something about a mature palate that is easier to please than a beginner’s palate. Perhaps older critics aren’t as angry, fussy or nit-picky about wine as younger ones; or as ambitious. They’re more apt to look for sheer pleasure and less apt to look for the slightest perceived imperfection. With age comes mellowness; mellowness is more likely to smile upon the world than to criticize it.
Anyhow, it is passing strange to see how intertwined the worlds of wineries, wine stores and wine critics have become. Like triple stars caught in each other’s orbits, they gyre and gimble in the wabe, in a weird but strangely fascinating pas de trois that, for the moment at least, shows no signs of abating.
Years ago, during the heyday of Sex and the City, the San Francisco Chronicle ran a spoof piece on what “the girls” would be doing if they lived in the “cool gray city of love.” Samantha, you’ll recall, had her own high-end P.R. firm in Manhattan, where she represented restaurants, celebrities, clubs and so on.
In San Francisco, the Chronicle’s writer determined, Samantha would still be in P.R.—only it would be winery public relations. When I read that, I remember thinking that wine had finally and definitely come to dominate the zeitgeist. It was the cool-hot thing to do, the field everybody wanted to work in, whether in PR, writing or production.
(Sidebar: When I started out, nobody, but nobody, wanted to be a wine writer. I sometimes wonder, if I was beginning my career today instead of in 1989, if I’d even be able to get a writing job at a magazine, much less Wine Spectator. The field has become that competitive.)
Wine remains a highly coveted field for young people to work in, maybe hotter than ever, according to this article in the drinks business, which claims that winemaking and beer brewing are “among top dream jobs” for young people just starting their careers or thinking of changing. (The study was done in Britain, but there’s no reason to think attitudes here in America are any different.)
So desirable are these winemaking and beer-making jobs that “over a third (35%) of people said they would consider quitting their job to re-train in their chosen profession – regardless of money.” That’s good, because these types of jobs typically don’t make a ton of money. Funnily enough, “Security guards (95%), IT consultants (91%) and accountants (87%) were by far the most eager to pack in the typical 9-to-5 and take up a craft career” such as winemaking.
I know people in both the wine industry and craft brewing, and most of them seem to be very happy. It’s true that the pressures can be difficult, but the joy seems to outweigh any of the inconveniences (such as basically having your normal life put on hold during crush). When I look back over my years in the wine biz, despite all the bitching and stress I went through (or put myself through), I consider myself incredibly lucky to have been able to do what I have. Coming up through the Golden Age of wine in America—the boutique era, the rise of the wine print media, the enormous popularity of wine (and beer), and the emergence of social media—has been a privilege, and also a great opportunity to see history being made, close-up, and perhaps to have been a tiny part of it. No wonder people want to work in this industry.
Have a wonderful weekend.
Did you know that I prefer organic wines to non-organic wines? I didn’t, either. But then I read this new paper from the American Association of Wine Economists, entitled “Does Organic Wine taste better? An Analysis of Experts’ Ratings,” and I found out that, yup, I do.
Well, kinda sorta. See, the paper’s authors decided to study “data from the three influential wine expert publications: Wine Advocate, Wine Enthusiast, and Wine Spectator,” and as it turned out, “During our period of study [74,148 wines produced in California between 1998 and 2009], the main tasters for California wines for Wine Advocate, Wine Enthusiast and Wine Spectator were Robert Parker, Steve Heimoff, and James Laube, respectively.”
The big P-H-L! They took our scores, crunched them in that esoteric way only economists can, and lo and behold, “Our results indicate that the adoption of wine eco-certification has a statistically significant and positive effect on wine ratings.”
How much? Not a lot: “Being eco-certified,” the authors found, “increases the score of the wine by 0.46 point on average.”
Well, one hardly knows where to begin. Right off the bat, I have a problem when the lesson that people will take away is that P-H-L (and by extension major critics) prefer organic wines to non-organic ones. Less than half a point difference? I suppose if they fed 74,148 scores into a computer and found a 0.46 point difference, then who am I to argue with HAL? But a 0.46 point difference doesn’t seem like very much to me. It’s not even round-uppable to the higher score (87.46 rounds down to 87).
But wait, there’s more. The following factors also had an impact on the scores of organically-certified wines, according to the paper:
- “A 1% increase in the number of cases will decrease score by 0.003 point.”
- “An increase in the number of years of certification experience by one [winery] decreases score by 0.09 point.”
Confused? I am. So the more cases of wine the winery produces, the lower the score is; but the longer the winery has been certified organic, the lower the score also is!
How about the winemaker’s hair color? Did they include that?
The authors also counted the number of words in each review and found this: “Next, we examine the impact that eco-certification has on the number of words used in wine notes. As shown in regression (1) of Table 6, wine notes of eco-certified wines are not significantly longer than those of conventional wines. However, as shown in regressions (2) and (3), eco-certification increases the average number of positive words by 0.4 but has no statistically significant impact on the number of negative words.”
My interpretation of this is that it’s gibberish. The authors compiled a list of words [Table 7] but I don’t understand how they infer whether their use is positive or negative. Is “jammy” positive or negative? Do Parker, Laube and I even use it in the same way? How about “offbeat”? Is that good or bad? And “peat”: if I tasted that in an Islay Scotch it would be good, but in a Chardonnay?
The authors also state something that I don’t think is objectively true, or, even if it is, is irrelevant. “Second, as a related point, wine experts have a better knowledge about wine eco-certification and are able to differentiate between different types of eco-labels, namely organic wine and wine made with organically grown grapes, which represent different wine production processes with different impacts on quality.”
I’m not going to sit here and tell you I know the difference between different types of eco-labels. There are so damn many (different certifying agencies, “natural,” biodynamic, etc.), I get confused—and, while I’ll let Parker and Laube speak for themselves, I bet they get confused, too. Besides, if “All the publications claim blind review,” as the paper’s authors write, then we critics don’t even see the labels when we’re tasting and reviewing (much less would we have a tech sheet in front of us).
But finally, this statistic seems to me to be the last nail in the coffin of the study: “On average, 1.1% of the wines in the sample are eco-certified.” By my calculations, that’s a little over 800 wines—out of 74,148. I fail to see how you can extrapolate any useful information from such a small sample, compared to the huge number of wines in the study. Apples and oranges.
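For readers who want to check the arithmetic behind those two complaints, here’s a quick back-of-the-envelope sketch (the figures are the ones quoted from the paper above; the baseline score of 87 is just an illustrative example, not a number from the study):

```python
# Sanity-check the two figures quoted from the AAWE paper.

# 1. "1.1% of the wines in the sample are eco-certified"
total_wines = 74_148       # California wines in the study, 1998-2009
eco_share = 0.011          # 1.1%
eco_certified = total_wines * eco_share
print(round(eco_certified))  # ~816, i.e. "a little over 800 wines"

# 2. "increases the score of the wine by 0.46 point on average"
# A bump that small can't even push an 87 up to an 88 once you round.
baseline_score = 87.0      # hypothetical score, for illustration only
bumped = baseline_score + 0.46
print(round(bumped))       # 87 -- still rounds down, as noted above
```

In other words, the certified wines are a sliver of the sample, and the average effect is smaller than the rounding granularity of a 100-point score.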
I’m no economist, it goes without saying. If I were, I guess I’d spend my days crunching numbers and coming up with interesting factoids. But I have to say, I don’t see the point of this particular study—not if it’s going to be used to make a claim that I don’t regard as true. For the record, let me say that I do not think organic wine is better. And you know what? I don’t care what the numbers say.
Ian Burrows is a great sommelier whom I first met at a Jackson Family Wines event I was speaking at. He was then working at one of San Francisco’s hottest restaurants, Atelier Crenn, in the Marina District. I was never fortunate enough to dine there, because the Marina is really a schlep from Oakland. I liked Ian a lot when we met, and he turned out to be a good correspondent, on both Facebook and my blog. So when he wrote me a fairly long comment, I took it seriously, and want to respond in kind.
Ian had read my post from a few days ago, in which I described how, in choosing wines for my tastings, I rely on—among other factors—the reviews of certain top critics. Ian wrote:
I read your article on choosing sparkling for a comparative tasting, and I have to ask, why on earth would you ever base your choices on other critics scores?
I have never understood the fascination of taking such an incredibly narrow focus on deciding which wines (or automobiles or eye-liner for that matter) are the best value, most accessible, most delicious or whatever from a handful of very influential reviewers.
Why not just send out a bunch of random e-mails to your wine buddies? Ask “what wines in XYZ category should I represent in this tasting?”…. Surely, if you spread it across continents and demographics you’d get a more accurate picture.
I have the utmost respect for what you did at WE (although I still do not completely understand it) and I have even greater respect for what you do at JFE but you gotta let go of what is, quite frankly, a waste of time….. “Wine reviews”.
Reviews – I am pretty sure they will be gone in five years.
You have a better deal being the PR front man at JFE than a reviewer because at least you can focus squarely on industry trends/changes, comment and review issues that directly and indirectly affect the quality and style of wine, not simply assign points and hope that readers respond by supporting your tastes and/or reviews.
It’s perhaps a face to face conversation for another time, but one that I know will be vibrant and respectful
I replied personally to Ian, but I want to expand on that here (and I wouldn’t be doing this if I didn’t have the utmost respect for him). My main points were, (1) I am emphatically not “the PR front man” at Jackson Family Wines! I don’t know how that rumor got started. In fact, my job has nothing to do with PR (although I suppose you could say that everything ultimately touches on public relations).
More to the point, I defend my use of other critics’ scores this way: When you’re assembling a lineup of wines for a comparative tasting, you have to use some kind of parameter. Since you can’t taste everything that theoretically falls within the scope of your tasting, you necessarily must limit the number of entries. Let me ask, Readers, how you would do it?
Let’s say, for instance, that I want to do a tasting of the Cabernet Sauvignons of Rutherford. There are at least 39 wineries in Rutherford, according to the web page of the Rutherford Dust Society. Many of them, maybe the majority, produce more than one SKU of Cabernet Sauvignon or a Bordeaux blend. Let’s say there are 100 different SKUs. That’s too many to include in a tasting, so you have to whittle down the number.
You could do this in any number of ways: Wines from west of Highway 29 on the Rutherford Bench, wines from the Mayacamas Mountains, wines from east of Highway 29 but west of the Silverado Trail, wines from east of the Silverado Trail, wines from way up in the Vacas, wines from south Rutherford, from north Rutherford, 100% Cabs, blends, wines above $75, wines below $30, and so on and so forth. Any of those would make sense, I suppose. But so does the kind of crowd-sourcing I do when I choose wines based on my own experiences, compounded by their critical scores. When Wine Advocate, Wine Spectator, Vinous, Wine Enthusiast, Wines & Vines, Food & Wine, and so on are all giving a wine high scores, that’s a pretty good indication it’s a very good wine. And those are the kinds of wines I want to include in my tastings, especially when we’re including Jackson Family Wines in the lineup. I want to see how JFW wines stand up to the most critically acclaimed wines. (And I hope I won’t be accused of wearing a PR hat when I tell you, they do very well.)
Surely Ian isn’t entirely serious when he suggests sending random emails to my “wine buddies” soliciting their views. I have about 4,000 Facebook friends and 6,500 Twitter followers. Not all of them claim to be wine experts, and frankly, I don’t know most of them, so their opinion is not of the greatest help to me. If I was doing something on popular drinking habits or trends or wine and food pairing, I might, and frequently do, ask my friends and followers, but not for assembling a blind tasting of ultrapremium wines.
Now, Ian (and a generation of young somms) may not care about the major critics—I understand that—but I do. Maybe it’s a generational thing. I respect what James Laube, Robert Parker and the others do. I know how hard the work is…what the pressures are…I know also that when you’ve tasted wine seriously for a good many years you really do develop a master palate. I don’t think there’s anything crooked or unseemly about what they do (and what I used to do). These are men and women of the highest integrity and their opinions should matter.
Nor do I think wine reviewing is “a waste of time” that will be gone in five years. I’ve frequently said on my blog that wine reviewing will always be with us, because as long as there are a zillion wines on the market, consumers are going to seek guidance. I’ve said that this guidance can come from many different sources, including a local and trusted merchant, but merchants—let’s face it—may have a motive to recommend a wine they carry, which makes them less than completely objective. A wine critic of the caliber of a Parker, Laube, Galloni, etc. has no ulterior motive. He or she doesn’t care about the advertising his publication may or may not solicit from wineries—that’s the famous “firewall” between editorial and advertising, and it’s real. Nor does the critic care whether or not someone buys something. So, unless you’re prepared to charge the critics with something untoward—and prove it—you really have no leg to stand on when it comes to criticizing them or questioning their sincerity or ability.
I will concede that every critic has his subjective preferences. Wine Spectator, in my opinion, gives too much attention to Marcassin. The San Francisco Chronicle seems to have a thing for Morgan Twain-Peterson and Bedrock. When I was at Wine Enthusiast I certainly gave a lot of love to Bob Cabral and Williams Selyem. But there’s nothing nefarious about any of this: critics are only human, and we do form attachments, to winemakers, wines and particular styles of wine.
So, my friend Ian, this is my respectful reply. I’d love to get together, anytime you’re free, to chat about this; and maybe I can explain what I did at Wine Enthusiast.
Have a great weekend!
As you may know if you read me regularly, I’ve been having some wonderful wine tastings with my friends at Jackson Family Wines. Over the last year and a half we’ve done multiple sessions of mainly California wines: Pinot Noir, Chardonnay, Cabernet Sauvignon, Rhone blends and so on. Our next theme is sparkling wine. It’s been, like, forever since I went to a bubbly tasting, so I’m particularly excited.
When I set up these tastings, I first develop the theme. But then it’s time to choose the wines. There are so many choices that you have to have some kind of system, and I do. I realize it may not be perfect, but what system is?
My initial criterion is to pick wines I, myself, have given high scores to. It’s been a while since I was an everyday critic, but not that long. Of course, you can learn a lot from tasting average, or even mediocre, wines, and I’ve included some of those in my tastings. But for the most part, I want to try wines that are high-end, and the best way to do that, IMHO, is to look at critical scores.
Here are the critics I routinely check out: Robert Parker/Wine Advocate; Wine Spectator; Antonio Galloni’s Vinous; and my former employer, Wine Enthusiast. I have subscriptions to three of them; Enthusiast doesn’t charge (I think they should, but that’s not my call). I also try to look at Food & Wine and a few other publications, but those four are my must-sees.
If all of the major critics give a specific wine a high score, it’s a go for my tastings. Usually, the critics are pretty close. Someone may give something 96 points, someone else may give it 92 points, but that’s okay, it’s ballpark. Every once in a while, I come across a wine somebody gave mid-90s and somebody else scored mid- or even low 80s. The lesson is that sometimes the critics can’t agree amongst themselves. In that case, it’s fun to see how my score, under blind conditions, matches up to the other critics’. My impression, which is simply that—an impression, not the result of a database crunch—is that Galloni and Parker tend to give higher scores to California wines than Wine Spectator. Wine Enthusiast is less predictable. But then, they’ve had some turnover in their California coverage.
I wonder how people who don’t like the critics or the 100-point system go about choosing wines for tasting. In Europe you can always do hierarchical tastings since they have formal tiers, but here in California, we don’t. You can’t do a First Growths of Napa Valley the way you can in Bordeaux. Some writers try to get around this absence of rankings by producing their own: I Googled “first growths of napa valley” and got 4,180 results. These can be interesting to read, but they have problems: They’re only the writer’s opinion, the writers may not have had access to everything (who does?), and even worse, the rankings change over time. One year Chateau Montelena is in; the next, it’s not, and Futo is, or Kenzo, or Yao Ming, or some other newcomer. So if I was doing a Napa Cabernet tasting (and I haven’t yet, but I will), I’d make things simple for myself by looking up what the major critics say. Of course, that doesn’t mean I’ll be able to get the wines I want! I have some pretty good connections, but even for me, some of these wines are totally impossible to buy.
At any rate, comparative tasting, done blind, is one of the most thrilling and instructive things a wine writer can do. In fact, it’s a prerequisite for the job. I’m very fortunate that Jackson Family Wines gives me the budget for it. I sure couldn’t afford to do it on my own!
Since by now it is obvious that anyone can write and publish a wine review via social media, we need to seriously address the issue of whether “Anyone can become a wine taster with a little practice.”
That, at least, is the contention of Anna Harris-Noble, a Brit who runs a company called Taste Exchange. She rejects the notion that any special palate is required, arguing instead that “Wine tasters are no different to [sic] anyone else, they’ve just had more training in identifying tastes and smells, so the good news is that anyone can become a wine taster with a little practice.”
Is this true, or does a real taster need special talent?
We’re all familiar with the concept of the “supertaster.” Developed by Linda Bartoshuk, it holds that some people perceive tastes more intensely, probably due to genetic factors; some famous critics, including Robert Parker and Ron Washam, might conceivably be supertasters.
But what is tasting ability, anyhow?
Whenever somebody reviews anything—movie, car, wine—and writes about it, the public inherently trusts that the person knows what he’s talking about. It’s human nature. “So-and-so wouldn’t be reviewing the thing, if he weren’t qualified.” This is particularly true if the review appears in a respected source, such as a well-known magazine or website, which almost guarantees credibility.
But the Internet and social media have begun eroding the trustworthiness of magazines in recent years; the public seems almost as likely to believe a self-published blog as a magazine with a circulation of hundreds of thousands.
Setting aside for the moment the question of “What is tasting ability?” we first encounter the reality of many people reviewing wine online. That is a fundamental truth: there may be upwards of 1,000 wine blogs in the U.S. alone. They’re tasting wine, they’re writing about it, they are presumably thinking seriously about it, they are presumably being taken seriously by others. Therefore, from one point of view we have to assume that they have tasting ability, because their behavior exhibits all the external trappings of a tasting professional.
But we think of tasting ability as more than the ability to publish a tasting note, right? So what is it? Is Harris-Noble right—wine tasters are no different than anyone else? Or do professional wine tasters have some sort of special gift that the rest of us don’t?
Harris-Noble suggests that it’s training and practice, not inherent ability, that makes for a professional taster. I think that begins to address the issue, but it’s only a beginning. Because, let’s face it, you don’t become a wine taster—a good one—solely because you get your hands on the occasional bottle of wine and write up some notes.
What else does it take?
I don’t think there are any absolutes, but if I were in charge, I’d want credible wine tasters to
- Taste as widely as possible. You can’t taste everything, of course, but you can cover as much ground as you’re able.
- Determine whether you will be a specialist or a generalist. A specialist focuses on a single country or region. I was a specialist. A generalist focuses on the world. Jancis Robinson is a generalist. One is not better than the other. You also should visit the places you’re writing about as often as you can.
- Develop a certain craftsmanship in writing. The best tasters/writers consciously seek a personal style. Think of it as the terroir of your writing.
- Read, study, learn. The knowledge of wine—its history, methodology, geography and so on—is a lifetime pursuit. Understanding, for example, the history of oak influence in Chablis wines will make you a better taster and writer.
- Evaluate yourself continuously, which depends on self-knowledge. If you’re not getting better as a wine taster all the time, then you’re getting worse. And you have to be honest with yourself about it.
By the way, I saw a news report the other day about a man born without arms who became a world-champion archer. He trained himself to use his legs and feet, and even invented a new type of bow. So can anyone at all be a good taster? Yes. But some have to work harder at it than others.