“While everyone may not agree with everyone’s score for any given wine, there is no question that scores for wine in general have helped quantify the quality in the world of wine.” – John Kapon
Much has been written about the need – or lack thereof – for scoring wines these days. Yes, liquor stores love stocking wines with 90-plus point scores because they move bottles off shelves much more quickly, but the question we should be asking ourselves is one that many either haven’t fully answered or simply don’t want to answer: what’s the real motivation behind the score?
There are so many factors to consider, so many numbers out there and too many critics with (questionable?) motives handing them out. Is that critic being completely objective (if there is such a thing) or was he or she ‘wined and dined’ to post a favourable score? Is the critic posting a high score out of self-promotion? What kind of relationship does the critic have with that particular winemaker or wine proprietor?
Knowing that the benchmark for consumer attention is 90-plus points, are these posted scores meant to help consumers wade through a crowded field, or are they merely a way to strengthen relationships with proprietors and selling agents – favours that push specific products to the front of the line in a crowded market?
Motives and conspiracy theories aside, each critic has their own way of scoring – which is perhaps stating the obvious. After all, wine tasting is not a science. It is best described as a feeling within the moment. And some moments throughout our day or week are more suitable for wine tasting than others. For example: are we tasting wines at 11am on a Monday, or at 5pm on a Saturday? (Doesn’t wine just seem to taste better on the weekend after a long week?)
Objective critics will say that it makes no difference when and where they are tasting, but I say that no matter how great your palate is, there’s always a certain amount of fatigue that sets in – especially by your 25th tasting of the day (or perhaps your 100th of the week). As objective as we would like our intentions to be, it always tends to circle back to the subjective.
Let’s cast that aside for now. We’ll come back to that.
Right now, let’s look at how some people evaluate wines – at the mindset one enters a tasting with. A couple of years ago I attended a vertical tasting of Sassicaia – one of the world’s most collected wines and the original Super Tuscan. I was seated across from Steve Thurlow, who writes for the Canadian WineAlign team. We had a chance to discuss wine scoring, and I wanted to pick his brain on how he evaluates a wine.
Thurlow generally evaluates wine on a six-point scale. He says that 95 percent of the wines he publishes evaluations for land somewhere between 85 and 90 points (85 being acceptably good, 90 being very good). “If a wine is outstanding or exceptional, it gets above 90 and the score is pretty arbitrary – it’s based on gut feeling and years and years of practice and knowing what tends to be a 92 vs. what tastes definitively like a 95,” says Thurlow. “But it’s very rare that I ever score that high.”
Conversely, he tends to avoid publishing reviews that reflect scores of anything less than 85. I can only imagine that he likely drinks a LOT of bad wine and we just don’t know about it. (He mentioned that he evaluates as many as 7,000 wines in a year!)
Value has no bearing on Thurlow’s final score. He goes on to say: “If the wine is cheap but excellent, I don’t throw it any more points (like many critics do).” Likewise, he doesn’t take any points away from a wine perceived as too expensive for what you get. This is known as factoring QPR (quality-to-price ratio) into scores – something I will get into further in just a bit.
“A rating/number from a critic can be helpful if you can establish the trust.” – David Lawrason
Thurlow reminded me of something important, citing a conversation he once had with Steven Spurrier. Spurrier told him that if you give a wine a score of 93/100, you need to be prepared to answer what kept the wine from scoring 100 – what would need to be different about the wine to get it to 95, 97 and so on. I threw that question back at Thurlow, and he told me that in many cases the difference is the length of the finish, but it can also be the aromatics or the perceived balance of the wine in its current state. Answers I would have expected, and they make sense.
Carolyn Evans Hammond, wine writer for the Toronto Star, takes a different approach – one that has drawn its share of controversy. She is very upfront about how she reviews wines and always factors QPR into her reviews, focusing her attention on those hard-to-find diamonds in the rough.
Quality-to-price ratio refers to wines that are excellent relative to their price point. It’s something that high-end wine drinkers and budget deal seekers alike talk about. Perhaps it’s a $20 shiraz that drinks like it should cost $50, or a Bordeaux that shocks your guests when you tell them you only spent $30 on the bottle. Either way, searching for “good QPR” wines can help ensure you’re drinking the best of what’s out there at a given price point.
Evans Hammond has always maintained that if an inexpensive wine over-delivers in her estimation, it will be factored into the numerical score. Case in point: she awarded 96 points to a sub-$8 bottle of Spanish wine in 2019. Predictably, people went crazy and store shelves were stripped bare of it.
She explains her methods by saying that “the short answer is, price matters. For example, a $10 Californian cabernet isn’t the same thing as a $100 one, so they shouldn’t be judged by the same yardstick. Comparing apples to apples is the fairest approach, which my scoring reflects. Otherwise, it’s too easy to overlook inexpensive wines entirely, which is ridiculously unfair and a huge disservice to wine drinkers.”
So, does 96 points for a $7.95 bottle of wine look too good to resist? How about 94 points for a $15 bottle of pinot? Many would argue that with scores like those, it would be too hard to pass up the opportunity to try – especially when consumers are looking to spend less.
“It’s subjective, and if you disagree on the assessment, move on and find a critic who shares your tastes.” – Rick VanSickle
To a certain degree, I think QPR should play a role in scoring wines, but it can’t be the dominant influence on a score. There needs to be a baseline for scoring built on merit, regardless of price (what happens when one is tasting blind, with no price reference?).
David Lawrason, another writer for WineAlign, recently went on a Twitter rant about a certain Italian critic famous for awarding super-high scores to wines that leave many scratching their heads. He writes: “The LCBO must cease promoting deception of (wine critic’s name) ratings.” He then suggests that one of the wines promoted in Vintages Magazine should actually be scored an 89 rather than a 98.
Lawrason goes on to tweet: “There are not many $20 wines being sold in tax laden Ontario that qualify for 90 point ratings. The LCBO is mining the web for the best ratings they can find.” He points out that the average price of an LCBO Vintages release is $19.95.
“This is going to sound self-serving,” tweets Lawrason, “but a rating/number from a critic can be helpful if you can establish the trust. A lot to ask. Many retailers will always publish the highest rating they can find. Get beyond the numbers and read the reviews. My job is not to move units. It is to connect consumers to wines they will enjoy. The media is not a marketing arm.”
As Rick VanSickle, chief writer for winesinniagara.com, points out on his website: “Consumers always have to look beyond the number. The number is derived from the note critics write and the trust one has with a reviewer who lines up with their own palate. When the number is separated from the words, the context gets lost in the shuffle because there is no guidebook on how to review wines; everyone does it differently, everyone assesses wines based on their own palate. It’s subjective, and if you disagree on the assessment, move on and find a critic who shares your tastes.”
So, on that note about subjective scoring: isn’t it possible that the difference between earning a 93 and a 97, or even a perfect 100, is pure emotion? Determining or settling on that final score can be challenging, not to mention a bit intimidating, so writers will reflect on the whole experience: how they came across the wine, and how it was consumed, enjoyed and ultimately immortalized in afterthought or discussion. These ‘experiential’ emotions can help settle the score.
Was the wine consumed alone or with close friends, or a loved one during great conversation? Was it consumed with or without a meal or appetizers? These are weighted factors that can contribute to how a wine is enjoyed.
Monica Larner writes for Robert Parker’s The Wine Advocate, and she is a firm believer in the emotional aspect of consuming wine. Larner reviews an average of 11 wines a day, 365 days a year. Over her many years as a critic, she’s given many wines a perfect 100-point score. When asked what the perfect wine tastes like, she said: “The answer is always the same. Robert Parker (the inventor of the 100-point scoring scale) once said that a 100-point wine is based 90 percent on the tasting quality of the wine and 10 percent on pure emotion. Believe me, that emotion is overwhelming. It’s the proverbial ‘Wow!’ moment that literally stops you in your tracks.”
“If the wine is cheap but excellent, I don’t throw it any more points (like many critics do).” – Steve Thurlow
Though admittedly stricken with self-doubt at times, Larner insists it’s also a liberating experience, because a strong wave of confident affirmation sets in. “The clouds begin to part, a single beam of sunshine falls to earth with golden luminosity and the angels start to sing. You just know – you are armed with the deep inner security that you could look any person straight in the eye, at any time during your life, and say: ‘Yes, I gave that wine 100 points.’”
At the end of the day, people need to understand that awarding a score to a wine is not scientific, and that there need to be certain levels of stipulation or transparency so that consumers know exactly what the score means.
Are wine scores necessary? I’m not sure we’re any closer to figuring out the answer, but perhaps we’ve moved the dial a bit on understanding what those scores actually mean. As long as critics are transparent about how they score a wine and show proven consistency with those scores, consumers should be left to decide whether they want to be influenced by that writer. It may take some trial and error.
I will boldly confess that I not only believe in assigning wine scores, but that I believe in perfect wines, and that critics should not shy away from admitting that. Many operate under the theory that no such wine exists, but could that simply stem from a fear of not being taken seriously? If a wine connects with you on all levels and there’s really nothing particularly negative about it, then why not just acknowledge the fact that it’s a perfect wine – ‘your’ perfect wine.
Let the debate continue…
To learn about my method for scoring wines, and to receive reports on wines that I have assigned scores to, please sign up for the VineRoutes Newsletter and receive exclusive content.