Wednesday, January 18, 2012

Reading polls—Critical questions to ask


This political season you will be reading and hearing about lots of polls and surveys reporting on who is ahead and who is behind in this race or that one, who is gaining ground and who is falling out of favor, and so on.  Many polls, particularly those conducted by well-known polling organizations, are reasonably accurate snapshots of public opinion at the time they were conducted.  Other polls are just nonsense.  A few are deliberate distortions of public opinion designed to serve one interest or another.

So, how do you tell a good poll from a bad one?  Here are four questions you should ask about any poll you come across.

1.  Is the poll biased?
The first and most obvious things you need to know when presented with the results of a poll or survey are “Who did the poll?” and, even more importantly, “Who financed it?”  Did the people conducting the poll and/or financing it have an agenda?  Was their purpose to inform or persuade? 
2.  Is the poll/survey based upon a random sample of the population under consideration?
Look for a statement such as the following:
 “The results of this poll are based upon a scientifically designed random sample of xxxxxxxx people.  It has a margin of error (or sampling error) of xx%.”
Valid national polls will usually sample somewhere between 1,000 and 2,000 people. If there is no reference to a “scientific design,” “random sample,” “sampling error,” or “margin of error,” then take the poll as largely fantasy and little fact.  Also, be suspicious of any telephone poll that did not include calls to both land lines and cell phones.  Many people today have opted not to have a land line at all.
Pay attention to the “sampling error” or “margin of error.”  Typically, that will be something like three to five percent.  In other words, if the sampling error is three percent and the results say a candidate is favored by 52% of the voters, the poll is really saying that the candidate’s support lies somewhere between 49% (−3%) and 55% (+3%).
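To make that arithmetic concrete, here is a minimal Python sketch of the usual 95% confidence calculation for a simple random sample.  The sample size of 1,067 and the 52% figure are only illustrations, and real pollsters apply weighting adjustments that this ignores.

import math

def margin_of_error(sample_size, z=1.96):
    # Worst-case 95% margin of error for a simple random sample.
    # Uses p = 0.5, which maximizes the standard error.
    return z * math.sqrt(0.25 / sample_size)

def support_interval(reported_pct, sample_size):
    # Range within which true support probably lies.
    moe = margin_of_error(sample_size) * 100
    return reported_pct - moe, reported_pct + moe

# A sample of 1,067 people gives roughly a 3% margin of error...
print(round(margin_of_error(1067) * 100, 1))  # -> 3.0

# ...so a candidate reported at 52% is really somewhere
# between about 49% and 55%.
low, high = support_interval(52, 1067)
print(round(low, 1), round(high, 1))          # -> 49.0 55.0

This also shows why national polls sample 1,000 to 2,000 people: below roughly 1,000 respondents the margin of error grows quickly, while doubling the sample beyond that shrinks it only modestly.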
Be particularly suspicious of the results of any poll if there is any indication that people who participated in the poll/survey volunteered to participate by, for example, calling an 800 number or going to a web site and voting online.  Such polls are usually very unreliable since the people who participated probably had a reason for doing so and aren’t reflective of the general population.
3.  Do the poll results represent true opinions or solicited no-opinion opinions?
Even in the most carefully designed survey or poll, some opinions expressed are really no-opinion opinions.  Sometimes people don’t want to appear ignorant about social or political issues, so they will offer an opinion even when they don’t really have one, a no-opinion opinion.
How do you know if what you are reading about are genuine opinions or no-opinion opinions?  Here are some things to look for.  Did the pollster include a “no opinion” or other “don’t know” face-saving option for people who genuinely didn’t have an opinion about the issue?  If not, then some of the opinions expressed may be no-opinion opinions.
Also, consult more than one survey on an issue if you can.  Did the surveys get different responses during roughly the same time period, or reflect an abrupt change in opinion over a relatively short period?  People with strong opinions about an issue rarely change them quickly.  If several organizations poll on the same subject during the same period and the results of one poll are significantly different from the others, distrust that outlier.  There may be something badly wrong with the way that poll was conducted.
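One rough way to apply that outlier test is sketched in Python below; the poll names and numbers are invented for illustration.  The median of the other polls is used as the benchmark so that a single bad poll cannot drag the comparison toward itself.

from statistics import median

def flag_outliers(polls):
    # polls: list of (pollster, reported_pct, margin_pct) tuples.
    # Flag any poll whose result differs from the median of the
    # other polls by more than its own stated margin of error.
    flagged = []
    for i, (name, pct, moe) in enumerate(polls):
        others = [p for j, (_, p, _) in enumerate(polls) if j != i]
        if abs(pct - median(others)) > moe:
            flagged.append(name)
    return flagged

# Hypothetical polls taken the same week on the same question:
polls = [
    ("Poll A", 48, 3.0),
    ("Poll B", 47, 3.5),
    ("Poll C", 49, 3.0),
    ("Poll D", 41, 3.0),  # far outside the range of the others
]
print(flag_outliers(polls))  # -> ['Poll D']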
4.  Are the questions biased?
The wording of the questions asked in a poll, along with the sequence in which they are asked, has a major impact on the results.  Be suspicious of any pollster that doesn’t reveal these two critical pieces of information.  Ask whether the order in which questions were asked could have affected the final result.  In other words, did earlier questions lead the person being polled or suggest the “correct” response?
Additionally, never trust the results of a single poll.  If multiple polls conducted by different pollsters using different sets and sequences of questions tend to agree, then you can consider the composite results as having some validity.  Otherwise, treat the results as interesting but preliminary.

Here are some sources of more help on reading polls:

The British Polling Council has prepared an excellent guide for journalists on sampling and other polling issues entitled “A Journalist’s Guide to Opinion Polls.”  See http://www.britishpollingcouncil.org/questions.html.

For a summary of the ways polling has changed over the last 30 years or so, see Michael Kagay, “Looking Back on 25 Years of Changes in Polling,” The New York Times, April 20, 2000.  See http://www.nytimes.com/library/national/042000poll-watch.html.

Also see Michael W. Link and Robert W. Oldendick, “‘Good’ Polls/‘Bad’ Polls—How Can You Tell? Ten Tips for Consumers of Survey Research,” at http://www.ipspr.sc.edu/publication/Link.htm.

