Wine Criticism: More Than a Matter of Taste
by John Szabo MW (August 13, 2013)
Republished from winealign.com
The National Post's drinks columnist Adam McDowell recently published a controversial article questioning the motives, reliability and general usefulness of wine critics with the provocative title of "Vintage snobbery: Are wine critics fooling us into buying pricier bottles?"
It was yet another in a recent string of similarly themed articles with plenty of sizzle but little meat, recycling a collection of studies to demonstrate "the limits of the wine snob's art," and to make the point "that [wine] judges and critics simply invent their often florid descriptions of wine and pull scores out of thin air."
I can see the attraction of the topic, and McDowell is hardly the first nor surely the last to cover it. Writing a piece citing a few choice "gotcha" studies that show up the experts serves to assuage the angst felt by wine consumers when faced with troublesome questions like, "how much should I spend on a bottle?" or "is this wine good, is it worth the price?" It's a titillating topic – if the experts can't get it right then there's no shame in the layman struggling to figure it out.
McDowell's article relieves you of the pressure of buying expensive wine since you won't like it more anyway, and reminds you that you couldn't tell the difference between cheap and expensive wine with the price concealed. He also warns you that wine critics are little better than circus sideshow illusionists who are out to make you spend more money.
You can almost hear the sighs of relief of those who can now feel content in their ignorance, or the smug satisfaction of others who agree there's no need for wine experts since they don't exist. "Perhaps wine drinkers should take comfort in studies that suggest the critics are at least as stupid as everyone else," says McDowell.
Predictably, the article was met with harsh criticism both from folks within the wine trade and from consumers who actually enjoy drinking wine and want to learn more about it, though there was also support for McDowell's views, to be sure. It's not that there wasn't any truth to his words – there was. But it was the generalization of the industry and the misleading implications drawn from it that inspired these words from me.
Wine, much like any other non-essential consumer good, is subject to speculation driven by things like price and scarcity that have little to do with intrinsic quality. And the notion of quality itself is a matter of debate, since it often gets confused with enjoyment, as though one necessarily leads to the other. A wine that meets the set of objective factors that a large sample of experts agrees upon to define quality won't necessarily be the wine you like best. And yes, plenty of wines fall short of quality while still commanding a high price.
There is no shortage of inconsistency and confusion in the wine world, which is in turn reported on by a huge number of commentators with widely varying degrees of competency and experience, further adding to the confusion. I can easily see how this could lead you to believe that the whole business is a sham. Not all wine critics are equal. Many are in it for the perks of the business. And many wine producers are in it (however misguidedly) for quick money, or worse, pure ego.
But implying, as McDowell does, that since there are a few bad cobblers, going barefoot is your best bet, is a disservice to consumers. Especially from someone who writes about drinks, presumably as a useful service.
Wine critics save you money – That's their only purpose
Reliable and credible wine critics save you money. They are your Virgil through the hellishly complex world of wine, helping you learn about and gain confidence in your own tastes and derive more pleasure, advocating on your behalf all the while. They can embolden you to stray from your comfortable routine to discover new flavours and different grapes or regions to further explore. They are a starting point, not an end point. This is their sole purpose.
The reality is that wine is complicated. If your purpose is anything more than a quick buzz or to quench your thirst, then there's a lot to learn, and it can be as complicated as you want it to be, and even much more than you'd ever want. I don't see anything wrong with that.
Efforts to democratize and demystify wine will always fall short for anyone who cares about what they drink. It's the job of wine critics, at least the best of them, to know far more than you, and thus be able to help you. And since some wines are better than others, but price is not always an accurate guideline, expert opinion becomes even more valuable.
The studies quoted by McDowell to make his points are essentially the same ones drawn upon for an article published earlier, on June 25th, under the even more sensational title "Is Wine Bullshit," which in turn drew inspiration from a June 2012 New Yorker article by pop-neuroscience writer Jonah Lehrer entitled "Does All Wine Taste The Same?", which in its turn drew on essentially the same handful of studies cited in all these subsequent pieces, and probably many more besides.
But aside from unoriginal reporting, the studies enlisted by McDowell and the others before him to make their arguments fail to recognize some critical mitigating factors, aren't nearly as conclusive as they are made to appear and muddle issues of consumer preference, value, wine pricing, true expertise, tasting conditions, snobbery, and more.
I could have written 10,000 words in response to these articles, but will focus instead on some of the more useful points to discuss that seem to pop up time and again in the world of wine.
Are you wasting your money?
The latest sales stats show that Canadians are opting for pricier labels, and McDowell questions whether consumers are wasting their money. The implications are that A) wine critics are driving the trend to spend more, even though they can't tell the difference between expensive and cheap without the aid of a price tag ("the litany of embarrassing moments for the wine criticism industry ought to at least give the average drinker pause"), or B) wine critics tend to like wines, generally more expensive, that you won't like because you don't have the same refined palate.
Implication "A" is illogical, since bona fide wine critics – despite what some cynics believe – don't profit from wine sales and have no motivation for recommending more expensive wines over cheap wines. Indeed, the battle cry of the critic in recent years has been value, value, value. It also goes without saying (or maybe it doesn't) that reputable wine critics have no commercial connections with wineries or distributors. Their publishers often do, but the most honest and trustworthy publications respect the "church and state" separation of editorial and advertising.
"B" on the other hand, is likely closer to the truth. The aspects that make some wines more expensive than others can take some getting used to.
McDowell, like many others before him, reports on the results of a 2008 study by American author and food and wine writer Robin Goldstein published by the American Association of Wine Economists, in which he used "a large set of blind tastings to demonstrate that only people who have wine training prefer expensive wines to cheap ones. Mr. Goldstein has long argued that in blind taste tests, ordinary wine drinkers like cheap wine as much as — or even more than — expensive wine." In a similar study in 2011, a psychologist at the University of Hertfordshire [Richard Wiseman] demonstrated, via blind tastings with 578 subjects, "that most people simply cannot tell the difference between a cheap wine and an expensive one. The experts do little better." So are you wasting your money buying expensive wines?
A matter of price, context and experience
Price and context do heavily modify perception; that much is clear. As Matt Kramer recently wrote in his Wine Spectator column on a similar subject, "ask any psychologist and you'll fall asleep before he or she finishes reciting the number of studies that confirm the relationship between preconceived expectations, high prices and customer satisfaction."
In the case of wine, the average drinker will almost invariably prefer a wine known to cost more than a cheap bottle. Why? "Because it makes them feel more secure about the quality of what they're drinking. Hell, almost no one knows much about wine. But everybody's an expert on money. So, if this here Cabernet costs $200, then surely it is good," says Kramer.
But when price is removed from the equation, as in these studies, then suddenly the wine is left to speak for itself, and speculative or inconsistent pricing policies and innate preferences bubble to the surface, as does the taster's experience or lack of it.
Context, too, is critical for enjoyment, and not just for wine. Consider the stunt staged by the Washington Post in 2007, when acclaimed violinist Joshua Bell played a 45-minute incognito concert of some of the world's most intricate solos in a D.C. metro station on a violin worth $3.5 million. Only six people stopped to listen in earnest, and he collected a paltry $32. Two days prior he had played to a sold-out concert hall in Boston with an average ticket price of $100. Context can make or break a moment. I do suspect, however, that if a fellow professional violinist had happened by the metro while Bell was playing, he or she very likely would have recognized greatness. Expertise and knowledge lessen the interference of context.
But as an aside, I also wonder, at the most basic level, if there really is any difference between thinking a wine is good (because it has a fat price tag) and it actually being good. I'm willing to bet, using an fMRI scan, you'd find that the pleasure you feel in both cases is just as real and authentic – the same "pleasure centers" of the brain would light up. So perhaps the average consumer is not being fooled into enjoying more expensive wine, they actually are enjoying it. The implication is that all a winery needs to do is raise its prices and consumers will be happy.
The trouble is, critics are not so easily swayed into enjoyment by high prices. In fact, it's quite the opposite. In what can sometimes resemble a witch-hunt, wine critics can be found looking to expose over-inflated labels and tie them to the stake to be burned. As you become more educated, it becomes easier to spot the ruse. But until you're there, a reliable critic can come in handy. They can make those pleasure centers light up for far less money.
But back to the studies. First of all, Richard Wiseman's study was far from scientifically sound, and the methodology was questionable. McDowell, like most other reporters on the study, fails to point out one critical detail. As Jamie Goode notes in his excellent blog The Wine Anorak, "It was not a comparison between two wines, one cheap and one expensive. Instead, subjects were given just a single wine to taste, and then asked to say whether it was cheap or expensive. 'To keep it as realistic as possible, we presented them with a single glass of wine and they had to say whether inexpensive or expensive,' revealed Wiseman when I asked him about this. This makes the results, which showed that people had a more-or-less random chance of getting it right, unsurprising. … Being asked whether a wine in front of you is expensive
or inexpensive is a difficult task indeed. It would still be difficult, but considerably less so, if the subjects had been offered a comparison of two wines to choose between." In the end, it's hardly conclusive evidence that people prefer inexpensive wines.
Similarly, in Goldstein's study, participants were asked to identify the wines they liked best, not the wines they thought were more expensive: "The rating was the response to the question 'Overall, how do you find the wine?' and the available answers were 'Bad,' 'Okay,' 'Good,' and 'Great.'"
So what the results prove here is that expensive wines aren't always the wines people enjoy most – price and pleasure are not strictly correlated – and not as Lehrer or McDowell would have it, that all wines taste more or less the same and that you can't tell the difference. There are many possible explanations why sometimes people prefer less expensive wines, aside from there being no intrinsic difference between cheap and expensive wine.
It's also clear expensive wines aren't always judged better than cheap wines even by the experts, with price known or concealed. There have been countless instances in which I've preferred the less expensive wines in a winery's lineup, for example. What's evident is that neither study proves that it's never worthwhile to spend more.
The more you learn, the more you find, the more fun you have
Consumers with little wine tasting experience are less likely to be able to identify factors such as concentration, depth, length, complexity and longevity – the widely agreed-upon markers of "quality" that make a wine worth more money. Without any training or prior knowledge I'd be hard pressed to make any useful technical quality judgment on a musical performance – I might well have walked past Bell with little more than the smile generated by pleasing music. But I could easily have given the performance a thumbs up or down. So what?
But to conclude, again, that there's no point in buying expensive wines because they're hard to identify without seeing the price tag assumes that consumers are already satisfied with their wine drinking habits and have no interest in learning more. It ignores the fact that they might come to recognize these attributes and derive a little more pleasure from drinking wine, rather than just knocking back a few glasses of whatever red or white is going.
So because you don't know any better, McDowell and others would have you drinking plonk for the rest of your days in your comfortable ignorance, blissfully unaware that it could get any better. That's surely fine for some, but considering the massive increase in interest in wine in recent years, I'd have to say that the stats point to the fact that consumers are searching for something more in greater and greater numbers. And wine critics can help them find it.
As James Harbeck (@sesquiotic) nicely summed it up in a couple of tweets: "people who have X training prefer expensive X to cheap X. Applies for X=wine, art, music etc. Education! The more you learn, the more nuances you find, the more value you discover. Aficionados pay more for quality." By learning more about wine you will very likely end up spending more, but you won't be wasting your money.
Sometimes the wine is wrong, not the expert
The above studies also ignore the fact that wines are priced by wineries, importers, distributors and retailers – not by consumers or wine experts – which might also explain why "the experts fared little better" in trying to figure out what cost what. That is to say, sometimes the wine is wrong, not the expert.
The matrix used to arrive at a final shelf price involves dozens of factors. Sometimes you're paying for tasty and original fermented grape juice; sometimes you're picking up the tab for equipment amortization, or paying for the privilege of scarcity, or lining the pockets of a greedy importer-distributor-retailer. There is no clear equation linking price, quality and pleasure. Many wines are over- or under-priced according to expert consensus, beyond the simple laws of supply and demand. And the more expensive the wine, the more speculative its pricing becomes.
This makes the job of the critic even more critical. If it were a simple linear equation of higher price equals better quality equals more pleasure, there'd be no need for critics or scoring; the price tag would be the score. For consumers who do care about the nuances that make some wines better than others but don't have the access to pre-taste hundreds of wines before settling on a purchase, knowing from a trusted source which wines over and under-deliver, or where to start looking for the better values and what those values look like, is about the most valuable service imaginable.
Trustworthy wine critics, contrary to the title of McDowell's article, can save consumers thousands of dollars by pre-sorting everything out, and increase their wine drinking enjoyment.
Less romantically, the critic's "thumbs up" for value can replace the big price tag as the driver of those cerebral pleasure centers.
But such critics need an awful lot of expertise – that is, context – in order to do that for you reliably. As well-respected beer and spirits writer Stephen Beaumont puts it, "The critic's role is to offer informed opinion to help the consumer navigate an often vast array of choices. … years or even decades of knowledge and experience … are … what cause readers to place their trust in certain critics."
Wine tasting is a tough discipline, so get to know your expert
Jonah Lehrer in his article cites the results of a blind tasting of expensive Bordeaux crus classés and less expensive wines from New Jersey (in which Bordeaux narrowly comes out on top) to question the ability of experts to reliably describe wine and assess quality. He implies that the differences observed in wine scoring and general qualitative appreciation are, quite literally, imaginary, and concludes that "tasting wine is really hard, even for experts." Furthermore, he says, "Because the sensory differences between different bottles of rotten grape juice are so slight—and the differences get even more muddled after a few sips—there is often wide disagreement about which wines are best."
There is some truth in these observations. It's true, assessing wine is a highly challenging discipline. And tasting wines blind, without any knowledge of what's in the glass, and trying to guess grapes or origin, not to mention price, is even tougher. Notions of absolute quality are downright slippery.
But how expert were the experts for this Lehrer tasting?
The high degree of difficulty of tasting with objectivity, accuracy and consistency doesn't mean that it can't be done, but rather that most people can't do it. It is in fact done all the time; I've seen qualified tasters describe, assess, and identify, even guess the price, for wines with uncanny accuracy countless times.
And even "quality" is more often agreed upon than not by those properly trained to recognize it. "I see wine-tasting as a discipline… not a talent, not an art, and not even a skill but a discipline," says Canadian wine writer, judge and broadcaster Konrad Ejbich. "It takes years of study and serious application to consistently approach each wine with honest objectivity."
Such are the pre-requisites to pass the rigorous Master Sommelier or Master of Wine examinations, among many other reputable examining bodies in the world of wine. Have a look at any of the episodes of WineAlign's So You Think You Know Wine to see some qualified tasters work through the vast range of possibilities and arrive at pretty accurate conclusions.
But most people have neither the time nor the inclination to do the pre-requisites. That's what makes qualified wine critics useful: they do.
It's also true that the wider the range of experience within a sample group of tasters, the wider "the disagreement about which are best." So much is obvious. While no critic of any kind is infallible, there are wide degrees of competence within the wine trade, as within any field. So the first question for any such study is: how expert were the experts? Their names are never listed.
It's fun to make fun of snobs
Nobody likes a snob, and they're fun to mock, as in the old Roald Dahl story with which McDowell opens his article. But there's a distinction to be made: expertise does not equal snobbery, any more than snobbery equals expertise. It's a popular sport to brand anyone who knows a little more than average about wine as a snob. But technically, a snob is "a person who believes that their tastes in a particular area are superior to those of other people" (from Google definitions, which, incidentally, uses the phrase "a wine snob" to support the meaning).
A useful wine critic doesn't believe his or her tastes to be superior to anybody else's; it's just that critics have acquired the expertise and language to describe and contextualize sensations that most people struggle to articulate. It's not a question of taste; it's a question of knowledge and experience – what's generally called expertise. A good wine critic doesn't dictate taste; a good critic describes and guides others to discover their own preferences. And most of the time, contrary to McDowell's sensationalist headline, critics aim to help people spend less doing that, or at the very least to help them appreciate the reasons they'd want to pay more.
"A 2001 University of Bordeaux experiment tricked 54 undergraduates in a university wine program into thinking that white wine was red simply by adding food colouring," cites McDowell. This, in my view, proves little, other than that the students weren't colour blind. There are several explanations for the results other than that wine tasting is bulls—t. For starters, the fact that it was a class of undergraduate students necessarily means that they were not experts. And the less experience you have, the more likely you are to be influenced in your description and assessment by factors outside of what you can smell and taste, like colour, price or a famous label.
But even beyond this study, it's worth examining what exactly the difference is between red, white and rosé wine in the first place. It's all fermented grape juice. I've had countless red wines that smell like grapefruit and as many white wines that smell like strawberries. Reds, whites and rosés can all taste like green peppers. White wines can have tannins; red wines can have none. White wines can be made from red-skinned grapes (champagne, white zinfandel, etc.), while rosé can be made from white grapes with a dash of red wine to finish. These students were not fooled into thinking that a white wine was a red wine. The wine was red, because it had a red colour. That's the only criterion for being a red wine. Colour alone has no intrinsic flavour.
And that different descriptors were used for the same wine is not surprising either. Wine tasting is a multi-sensory activity. It's not just about your olfactory bulb or taste buds. Visual cues can condition the experience of aromas, flavours and tastes of everything we eat and drink. That's all this study shows. And the less experience you have, the more interference you'll have between the senses.
Furthermore, anyone describing wine will describe the elements that are most relevant at the time of tasting – it's not an absolute DNA profile of the wine. So a wine can be both slightly jammy and slightly fruity, yet only the most striking aspect will be described. But since we can measure the aromatic compounds in a wine and correlate them to sensory experience, it's clear that trained tasters can accurately describe wines. It's just not that easy.
Inconsistent wine scores
McDowell brings up the fallibility of the 100-point system as yet another warning against trusting wine critics: "Or take the contentious issue of wine scores, which many consumers use to make quick decisions on what to buy."
He makes the case that wine scores are inconsistent based on the widely used example of a 2005 experiment set up by Robert Hodgson at the California State Fair wine competition. Hodgson compared judges' scores on the same wine served at different times during the competition and found wide variations. "Their scores were wildly inconsistent, with an average plus-or-minus variance of four points."
On the surface this would seem to prove that scores are fickle and shouldn't be used as a basis for purchasing decisions. But in reality, all it shows is that a particular group of wine judges in a specific situation scored inconsistently. And there are even plenty of reasonable explanations why this might have been so.
Yet again, one has to question the degree of discipline and experience these judges had. Get a group of amateurs in a room to taste wines blind and see the scores on the same wine vary like the putting accuracy of a group of beginning golfers trying to make the same putt. But even assuming that all of the subjects were highly trained tasters – unlikely in a competition that pays a pittance and is thus avoided by the top tier – there are other factors that can explain variance. Bottle variation, cleanliness of glassware, temperature, order of wines, or how long the bottle had been open, not to mention taster fatigue or health, for example, can all have an impact on how a wine is rated on a given day. The better the taster, the less impact these variations should have on the score given.
And then, in this specific case of the California State Fair, the judges were virtually set up to fail.
Konrad Ejbich, who was a judge at the fair when this experiment took place, recalls the brutal demands placed upon the tasters. His panel was assigned a flight of 77 high-alcohol zinfandels to sort through, a practically inhumane task. Ejbich and his panel set about selecting the best wines to move forward to the next stage of the competition, taking less than a minute per wine. Despite his forty years of experience and confidence in his ability to quickly and accurately assess wine, it's unlikely that an identical score could be repeated on the same wine served in such a massive flight, and then later on in a very different flight.
With so many samples "the organoleptic equipment gets tired," says Ejbich. "The mental strain of adjudicating so many wines is very draining and despite the fact that we spit every drop of wine, we smell and inhale plenty of alcohol and eventually it has its effect."
Michael Vaughan, another experienced Canadian wine critic and judge who has also judged at the California Fair, told me that some panels were tasting in excess of two hundred wines per day – and these are not 7% alcohol Mosel Riesling but generously alcoholic California wines. There is no taster in the world who can accurately and consistently score two hundred wines in a single day and repeat it the next. Period.
So the more logical conclusion you should draw from Hodgson's experiment is that not all wine competitions are created equal, and some are more consistent than others. Compare the conditions at the California Fair, or, say, VinItaly with its 100+ randomly chosen "experts" (I've been; they're not all experts), to other competitions: the Decanter World Wine Awards, which employs principally masters of wine on its judging panels and limits the number of samples evaluated in a day to 80 or 90; the International Wine Challenge, again presided over by top critics who are respectably paid for their expertise; Australia's wine judging circuit, where it takes many rounds of testing and training before you're even allowed to sit on a panel; or WineAlign's own National Wine Awards of Canada, which gathers some of the country's acknowledged experts and puts out a maximum of ten samples per flight and no more than 70–80 samples per day. It's easy to see which are more likely to yield accurate and consistent results.
Everyone concedes that scoring wine is imperfect – or rather problematic, because it implies perfection and precision in judgment. A score is not meant to be an absolute measure of quality, much less of pleasure, but the fact remains that ranking wines is useful, providing another piece of data to arrive at a purchasing decision. It surely beats a price tag alone in trying to sort through the overwhelming number of labels available today. Just be sure you know who is doing the scoring – you should demand consistency.
Bottom line: Everyone's a critic, so find the right one
Getting paid to taste wine looks like a pretty glam job, and as such naturally attracts a large number of people wanting to get in on the action. Since there are no longer any of the traditional barriers to entry – you don't need to convince an editor that you're qualified, you just publish your own stuff – there are countless people of hugely varying degrees of experience and training, starting at zero, writing about wine. It's no wonder that anybody scratching the surface can find a large cache of hucksters.
An article that would be of real service would caution readers that not all wine criticism or scoring is equally reliable, rather than suggest that it's all nonsense. It would point to the wine competitions that apply more rigorous tasting methodologies and hire better qualified judges, yielding more reliable results. And rather than insinuating that the entire profession is a sham, it would offer a list of wine writers with professional training, plenty of experience, and a consistent track record of reviewing and scoring wines.
In any case, I hope that my WineAlign colleagues and I will save you a little money in the long run and increase your wine drinking pleasure. That's both our goal and our job.