For over two decades, I've been in and around market research.

When I was a marketing exec, I controlled a $1 million market research budget and hired many of the top high-tech research firms. After leaving that job, I worked as an associate analyst at Technology Business Research, covering Microsoft and the Japanese PC business. Finally, as an author and journalist, I've written extensively about market research firms as well as peer-reviewed scientific studies.

Based on that experience, here's how to identify market research that's biased, invalid, or generally not up to snuff:

1. There's correlation but not causation.

Example: "90% of successful companies have open plan offices, therefore open plan offices make companies successful." (90% of failing companies have them, too.)
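
A quick way to see why that statistic proves nothing: check the same figure in the comparison group. Here's a minimal Python sketch (the counts are invented for illustration) showing that when a trait is equally common among winners and losers, it can't explain who wins:

```python
# Hypothetical counts: open-plan offices are simply common everywhere.
successful = {"open_plan": 90, "private": 10}
failing    = {"open_plan": 90, "private": 10}

def open_plan_share(group):
    """Fraction of companies in the group with open-plan offices."""
    return group["open_plan"] / sum(group.values())

print(open_plan_share(successful))  # 0.9
print(open_plan_share(failing))     # 0.9 -- identical, so office layout
                                    # tells us nothing about success
```

The "90% of successful companies" figure only sounds meaningful until you ask what the base rate is for everyone else.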

2. The numbers are implausible.

Example: "Single women over 40 are more likely to be killed by terrorism than to get married." (Death by terrorism is microscopically likely.)

3. The research supports a consultancy.

Example: "Our survey of 300 companies shows that 85% plan to increase employee engagement." (Says the company with an "employee engagement" service.)

4. Forecasts that assume major breakthroughs.

Example: "Our research says that artificial intelligence will replace millions of jobs in the next ten years." (A forecast that quietly assumes breakthroughs that haven't happened yet.)

5. Averages that hide the distribution.

Example: "Our research shows that the average taxpayer will get a $1,200 tax break." (50 billionaires get $10 million each and everyone else gets 10¢.)
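
The arithmetic behind that parenthetical is easy to check. In this Python sketch (the counts and dollar amounts are invented for illustration), a handful of enormous payouts drags the mean far above what the typical taxpayer ever sees, while the median tells the real story:

```python
# Hypothetical payouts: 50 huge winners, 10 cents for everyone else.
n_rich, rich_break = 50, 10_000_000          # 50 billionaires, $10M each
n_rest, rest_break = 99_999_950, 0.10        # everyone else gets 10 cents

total = n_rich * rich_break + n_rest * rest_break
mean = total / (n_rich + n_rest)

# The median taxpayer -- the one in the middle of the sorted list -- is
# one of the 99,999,950 people getting 10 cents.
median = rest_break

print(f"mean tax break:   ${mean:,.2f}")    # $5.10
print(f"median tax break: ${median:.2f}")   # $0.10
```

Whenever a study quotes only an average, ask for the median, or better, the whole distribution.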

6. No sign of peer review.

Example: "Send for our white paper describing our exclusive market research." (If it hasn't been vetted, it's probably bogus.)

7. An "apocalyptic" tone to the analysis.

Example: "If we don't do absolutely everything possible to prevent climate change, it will mean the end of the world as we know it." (Over-react much?)

8. It's contradicted by multiple, better studies.

Example: "There are literally hundreds of independent studies that prove vaccines cause autism." (The science is quite clear that they don't.)

9. Invalid research methodologies.

Example: "To find out how common it is, we asked our readers if they'd ever been sexually harassed." (Only those who have been harassed are likely to respond.)
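
This is classic self-selection bias, and it's easy to simulate. The sketch below (the population size and response rates are invented for illustration) shows how a voluntary survey inflates the measured rate when affected readers are more likely to respond:

```python
import random

random.seed(0)

# Hypothetical population: 20% have experienced harassment.
population = [True] * 200 + [False] * 800

def responds(harassed):
    """Assume affected readers are five times as likely to answer."""
    return random.random() < (0.50 if harassed else 0.10)

respondents = [p for p in population if responds(p)]

rate_in_population = sum(population) / len(population)
rate_in_survey = sum(respondents) / len(respondents)

print(f"true rate:   {rate_in_population:.0%}")
print(f"survey rate: {rate_in_survey:.0%}")  # far above the true rate
```

A valid study would draw a random sample and follow up with non-responders, rather than counting whoever volunteers.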