Data is important to making rational business decisions. But it's not easy to use well. Trying to understand customer behavior through big data can be tricky, and even experts like Nate Silver of FiveThirtyEight.com can make big mistakes, as he did in the 2014 World Cup.

Or in the 2016 election.

Just about everyone in the media blew it when reading the public and interpreting the polls. There are a variety of reasons for that, and these are easy mistakes to make, particularly when you're not an expert with data, but even when you are. Here are some of the things to watch for.

Look for insight, not confirmation

This is such a big problem. Too many in the media had decided on a story, which was that Clinton was a shoo-in, and fell for confirmation bias. They fit all the data they saw into that framework and then didn't question it, because they liked what they heard. Instead, you need to want insight into problems, strategies, and market observations, even, and especially, when they run counter to your beliefs.

Check the sampling

The basis of statistical analysis is the ability to get a truly random sample of a phenomenon. If you have control over the samples, like testing every hundredth or thousandth product coming off an assembly line, or even developing a way to pull samples at random times, you've got a good chance of getting a real statistical view. But if there's bias in your selection, your results aren't going to be trustworthy. When it comes to people, you need their cooperation. That is much harder and causes problems even in the most experienced polling operations. You might have to rely on polling panels, which can have their own problems, at least in theory. Ask how the people were chosen.
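
To see why selection bias matters, here is a minimal sketch in Python. The population size, the 52% support level, and the "reachable" rates are all made-up numbers for illustration; the point is that sampling only from the people who are easy to reach skews the estimate, while a truly random sample does not.

```python
import random

random.seed(42)

# Hypothetical population: 100,000 people, 52% of whom hold opinion A.
# "reachable" models a real-world selection problem: in this made-up
# scenario, people who hold opinion A are harder to reach.
population = []
for _ in range(100_000):
    holds_a = random.random() < 0.52
    reachable = random.random() < (0.40 if holds_a else 0.60)
    population.append((holds_a, reachable))

def share_holding_a(sample):
    return sum(1 for holds_a, _ in sample if holds_a) / len(sample)

# A truly random sample: every member has an equal chance of selection.
random_sample = random.sample(population, 1000)

# A biased sample: we only ever talk to people who are easy to reach.
reachable_people = [person for person in population if person[1]]
biased_sample = random.sample(reachable_people, 1000)

print("True share:     52.0%")
print(f"Random sample:  {share_holding_a(random_sample):.1%}")  # close to 52%
print(f"Biased sample:  {share_holding_a(biased_sample):.1%}")  # well below 52%
```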

Check the response rate

This is one area where very few journalists or businesspeople I know are on the ball. The more people you contact who refuse to answer questions, the less reliable your answers become, because the sample gets closer to a self-selecting group, which means it is less random. When you see a study mentioned, check whether the response rate is given in the story. (It almost never is.) If it's not there, go check the source of the study. If they can't or won't tell you the response rate, consider writing off the results.
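
One way to quantify the damage is with worst-case bounds: if only a fraction of the people contacted responded, the nonrespondents could in principle have all answered one way or the other. Here is a short sketch; the 55% result and the two response rates are hypothetical numbers chosen to show the contrast.

```python
def nonresponse_bounds(observed_share, response_rate):
    """Worst-case bounds on the true population share.

    observed_share: fraction answering 'yes' among those who responded.
    response_rate: fraction of contacted people who responded at all.
    Nonrespondents could in principle all be 'no' (lower bound)
    or all be 'yes' (upper bound).
    """
    lower = observed_share * response_rate
    upper = observed_share * response_rate + (1 - response_rate)
    return lower, upper

# 55% of respondents said yes, but only 9% of contacts responded:
print(nonresponse_bounds(0.55, 0.09))  # (0.0495, 0.9595): nearly uninformative

# The same 55% with a 60% response rate is far more constrained:
print(nonresponse_bounds(0.55, 0.60))  # (0.33, 0.73)
```

The wider the gap between those bounds, the more the headline number depends on assuming nonrespondents look just like respondents, which is exactly the assumption a low response rate undermines.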

Don't misuse margin of error

Many people misunderstand what the margin of error, often reported with a study, means. It is emphatically not a measure of how close to the truth the information is. Rather, it measures consistency. As usually given, it means that if you repeated the same survey, sampling from the same population in the same way, then 95 times out of 100 the answers would fall within the margin of error's percentage-point spread. When the difference between answers is smaller than the margin of error, you can't statistically differentiate between them. And 5 times out of 100 the answers could be wildly different; the time the survey was run could have been one of those unusual times.
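
For a simple random sample, the standard 95% margin of error for a reported proportion is 1.96 times the standard error, sqrt(p(1-p)/n). A quick sketch, with the caveat that real polls often have larger effective margins because of weighting and panel designs:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person poll with a 50/50 split:
print(f"+/- {margin_of_error(0.50, 1000):.1%}")  # roughly +/- 3.1 points

# So candidates polling at 51% and 48% in that survey are within the
# margin of error: the 3-point gap is statistically a tie.
```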

Look at the questions and their order

The biggest source of bias in a study often comes from exactly what questions people are asked and the order in which the questions are arranged. Respondents may be subtly guided toward specific answers, either unintentionally or because someone, somewhere wants to promote a specific result. Check the question list, and if it's not available, dismiss the poll. Below is one of the best examples I know of how this can be a problem: a clip from the old BBC comedy, "Yes, Prime Minister."