In 1985, Coca-Cola thought it had cracked the code. Its marketing and research teams were armed with hard data: 200,000 blind taste tests showed that a new, sweeter formula was favoured over the original 55% to 45%, even edging out Pepsi. Convinced this was consumer truth, the company replaced its iconic drink amid great hype, introducing the now-infamous New Coke.
But within weeks, unprecedented outrage broke out. 8,000 furious letters, thousands of angry phone calls and plunging sales exposed the miscalculation. It was mayhem, with overzealous loyalists forming groups like the Society for the Preservation of the Real Thing and the Old Cola Drinkers of America to demand the return of the original Coke. Imagine grown adults losing their heads over soda. Such brand love.
But what really went wrong? The data did not lie. The study had an enormous sample size, and the numbers checked out. The problem was the interpretation. Those tests captured isolated sips while ignoring the deep emotional ties to the original's taste and heritage. Coca-Cola did not test New Coke with a burger, at a barbecue, on a hot day, after a bad meeting with your boss, or on a first date. The test did not have a story.
After 79 shameful days, Coca-Cola backtracked and brought back the original taste as Coca-Cola Classic to reclaim loyalty.
You see, marketers easily get intoxicated with empirical evidence, treating research numbers like a doctor's prescription: take twice daily, no questions asked. The real failure here is faith in evidence without context.
Look, data often feels objective. Numbers look solid. Percentages seem foretelling, like the handwriting on the wall. They feel final. But evidence only answers the question you asked. In Coke's case, that question was "which one of these tastes better?" Coca-Cola asked about taste. The market responded with meaning. When you ask the wrong question, you get the wrong truth.
This happens everywhere. Surveys, polls, dashboards, reports… you name it. People confuse measurement with understanding. They skip the assumptions and ignore what sits outside the spreadsheet.
Evidence can be accurate and still mislead you. Not because the data is wrong, but because the frame is incomplete.
So, not every truth is the whole truth. Read between the lines and check the details. As they say, the devil likes to pitch his tent in there.
