Last week, Bill Gates recommended that over the summer everyone should read the book How To Lie With Statistics. Since that's a book that I've also repeatedly recommended, I thought I'd give an example of how to use the book effectively.
Earlier today, I ran across an opinion piece using a statistic to suggest that manmade climate change is scientifically controversial. This is a business issue because it will (at the very least) determine government regulation and spending for the decades ahead.
For example, let's suppose you're writing a business plan for a solar energy startup. If manmade climate change is real, you should forecast sharp sales growth and government subsidies. If manmade climate change is bullsh*t, not so much.
In the opinion piece, syndicated columnist Star Parker criticizes President Obama for publicly stating the oft-quoted statistic that "97 percent of scientists believe that climate change is real, manmade and dangerous."
To counter Obama's statistic, Parker cites a different statistic that she feels is "more representative": a 2012 survey showing that only 52 percent of meteorologists believe that "global warming... has happened and is manmade."
Two very different statistics; two very different scenarios. If the 97 percent figure is correct, there's no controversy, just consensus. If the 52 percent figure is correct, then there's no consensus, just controversy.
In these situations, most people pick the statistic that matches their politics. That's not a good idea for a business plan, though, because to make an accurate sales forecast and market projection, you need to know which statistic actually reflects the real world.
The main takeaway of How To Lie With Statistics is that you shouldn't take statistics at face value but instead look at how the data was gathered. To do this you need to 1) do some research and 2) apply some common sense.
Fortunately, research is far easier than it was when How to Lie With Statistics was originally published. Back then, checking data sources might take months; with Google it takes less than a minute. So here are the facts behind both statistics:
Obama's 97 percent figure comes from a 2013 article in Environmental Research Letters, a peer-reviewed journal. To arrive at that figure, the authors analyzed 11,944 abstracts of climate articles published in peer-reviewed scientific journals.
This methodology is called meta-analysis and, while it does not guarantee the absence of bias, it is a generally accepted statistical way to measure consensus on scientific issues.
Parker's 52 percent figure comes from a poll of the membership of the American Meteorological Society. To arrive at that figure, the authors emailed a questionnaire to 7,197 members, about a quarter of whom responded.
This methodology is called a self-selected sample. Because the people who respond are those already interested in the subject, self-selected samples generate biased results. Such polls are statistically meaningless. To be valid, a poll must use a random sample.
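You can see why self-selection matters with a quick back-of-the-envelope simulation. The numbers below are purely illustrative assumptions (they are not from the AMS survey): suppose 70 percent of a 7,197-member population privately agrees with a position, but those who disagree feel more strongly and are twice as likely to answer a voluntary questionnaire.

```python
import random

random.seed(42)

# Hypothetical population the size of the AMS mailing (7,197 members).
# Assumption for illustration only: 70% agree with the consensus view.
N = 7197
population = [random.random() < 0.70 for _ in range(N)]  # True = agrees

def responds(agrees):
    # Assumed response rates: dissenters reply at 40%, agreers at 20%.
    return random.random() < (0.20 if agrees else 0.40)

# Self-selected sample: only those who choose to answer are counted.
respondents = [p for p in population if responds(p)]

# Random sample of the same size, for comparison.
random_sample = random.sample(population, len(respondents))

true_share = sum(population) / N
self_selected_share = sum(respondents) / len(respondents)
random_share = sum(random_sample) / len(random_sample)

print(f"True share agreeing:     {true_share:.0%}")
print(f"Self-selected poll says: {self_selected_share:.0%}")
print(f"Random sample says:      {random_share:.0%}")
```

With these made-up response rates, the self-selected poll reports roughly mid-50s percent agreement even though the true figure is 70 percent, while the random sample lands close to the truth. Nothing about the underlying opinion changed; only who bothered to answer did.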
So, based on some quick research into the data behind the statistics, it's clear that Obama's 97 percent figure is far more likely to match reality than Parker's 52 percent. With that out of the way, let's apply some of that good ol' common sense.
The people who wrote the peer-reviewed journal articles that were analyzed to arrive at the 97 percent statistic were climatologists. The people polled to arrive at the 52 percent statistic were meteorologists. There's a difference.
Climatologists study long-term climate trends, and most have earned a doctorate. Meteorologists study short-term weather patterns, and most have a bachelor's degree.
Of course, it's not outside the realm of possibility that the climatologists are wrong and the meteorologists (at least the ones who answered the questionnaire) are right.
But would you bet your business on it?