If you've worked on a marketing, business, or sales plan, you know that getting information on target markets and trends is critical ... and difficult. Market research is often wrong, and some questions are outright invitations to bad answers. But even when you rely on market research done by others, there can be problems.

In the news

Studies and polls quoted in the regular news media can have huge flaws. Even the polling experts at media companies can go badly wrong.

The recent NBC/Esquire piece on American rage was interesting, and its methodology section laid out the decisions behind the survey, including recruiting participants from pre-existing research panels.

That isn't inherently bad. The cost of reaching people has become outlandishly high, and doing so can take significant time and effort. Panels of people who volunteer to take surveys can be a useful tool, so long as you keep in mind the limitations of not having respondents randomly selected from the general population. Or, as Esquire and NBC explicitly put it:

Because the sample is based on those who initially self-selected for participation rather than a probability sample, no estimates of sampling error can be calculated. All surveys may be subject to multiple sources of error, including, but not limited to sampling error, coverage error, and measurement error.

At least they went into some detail. Many of the polls you see cited in the media don't have anywhere near this level of disclosure. Sometimes reporters know enough to ask for a methodology section and consider the potential problems. Often they don't. Such shortcomings could mean that the data you see doesn't apply to your market, leaving your plans based on something completely irrelevant.
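To see concretely what that caveat about sampling error means: for a genuine probability sample, the margin of error can be computed directly from the sample size, while for a self-selected panel there is simply nothing to calculate. A minimal sketch, using hypothetical numbers:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p observed in a
    simple random sample of n people. Only valid for probability samples."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical probability sample: 1,000 respondents, 52% give a particular answer.
print(f"+/- {margin_of_error(0.52, 1000):.1%}")  # roughly +/- 3.1%

# For an opt-in panel no such formula applies: the error depends on who chose
# to participate, which the sample size alone can't tell you. Hence the
# pollsters' statement that "no estimates of sampling error can be calculated."
```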

And then there are the problems that go beyond statistical errors. For example, the results of the anger poll, which measured self-reported responses to hypothetical headlines, were framed as differences in reaction between Democrats and Republicans. But 43 percent of Americans identify as independents. As FiveThirtyEight pointed out, many of these people may effectively be two-party partisans, but forcing everyone into a two-party model builds bias into the eventual interpretation.

Third parties

There are many ways polls can be unrepresentative or simply not useful, even with the best of intentions. When those responsible for the polls have an axe to grind, the dangers mount. Offering polls and infographics has become a widespread PR tactic: companies want to be mentioned and interviewed, so they provide what they hope will become the basis of a media story.

As you might expect, these materials are frequently anything but trustworthy. Problems I've seen include polls limited to a company's own customer base with the results touted as broadly representative, questions that lead people toward the desired answers, samples too small or too narrow to project onto a larger market, and even the complete absence of a methodology section explaining how the study was done.

Not all vendor studies are terrible. I've seen some that are interesting. And there are also non-profits, think tanks, and others that release studies and data. Some of them are sound. Some aren't.

Market analysts

"Professional" market research can be an ugly business. I've done some in the past, working for companies that specialized in it. Some were good and careful to ensure that their methods, data, and analysis were solid and mathematically defensible. Others were too eager to deliver a report and collect a check.

I remember one major firm, which will remain nameless, that asked me to work on a project. The supervising VP took what I had written and started adding conclusions and observations of his own. But response rates for some of the questions were so low -- in the teens of answers -- that you couldn't draw any conclusions. When I objected, his response was, "I've got enough experience so I can tell there's a trend here." Again, this was a big, reputable name used by technology companies in the days before the dot com bomb. And people wonder why so many companies imploded.
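The arithmetic behind that objection is straightforward: with response counts in the teens, the uncertainty around any percentage dwarfs whatever "trend" someone thinks they see. A rough sketch with made-up numbers:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical question with 15 answers, 9 of which (60%) point the same way.
n = 15
p = 9 / n
print(f"{p:.0%} +/- {margin_of_error(p, n):.0%}")  # 60% +/- 25%

# A result that could plausibly sit anywhere between 35% and 85% is consistent
# with a majority leaning either way, which is why "experience" alone isn't
# enough to call it a trend.
```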

When you're making decisions that will affect your business or someone else's, you have to push to learn the basis of the information you use. Don't be satisfied with finding something that supports what you'd like to believe; that's confirmation bias. Learn how research is supposed to be done and how analyses should be conducted. Become an educated consumer. You'll have a lot riding on the trustworthiness of the percentages and graphs that would look completely at home in the report you're preparing.