How do we assess the wide variety of poll results we've seen during this election? We asked our resident polling expert, Peter Butler, to give us some tips.
Dr. Butler ran the Atlantic office of Decima (now Decima Harris) under Alan Gregg for more than a dozen years. He also worked for Western Opinion Research, a research firm out of Winnipeg, and RDI, a local market research firm (now defunct). He has been a consultant for provincial and federal governments, and has done issues research and political polls for the Conservative Party of Canada. He is still associated with The Strategic Counsel, a public-issues research firm in Toronto, and is currently professor emeritus of Sociology and Public Administration.
He also wrote one of only two textbooks about public opinion polling in Canada, entitled Polling and Public Opinion: A Canadian Perspective.
1) Question the randomness and design.
The trustworthiness of a poll comes down to three things: the sample design, the quality of the measurements employed, and confidence in the data-collection process.
The first thing Dr. Butler considers is what kind of sample was taken. The irony, he admits, is that commercial pollsters (himself included) are notoriously secretive about their methods.
"I even say to students, I'm not telling you how I do it, because that's how I make my money! It's very much a closely guarded secret," he says.
That said, without knowing exactly how they do it, he trusts Nanos, Decima Harris, Angus Reid and other big-name pollsters because they can afford expensive polling equipment such as predictive dialers, which generate phone numbers and manage the randomness of samples.
"A CATI (Computer Assisted Telephone Interview) system and a predictive dialer sorts through the supervision of the data collection very effectively," he says.
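As a rough illustration of what generating and randomizing phone numbers can involve, here is a minimal random-digit-dialling sketch; the area codes and exchanges are invented, and a real CATI system layers call scheduling, quotas and supervision on top of something like this.

```python
import random

# Hypothetical illustration of random-digit dialling (RDD): pick a known
# area code and exchange, then randomize the last four digits so unlisted
# numbers have the same chance of being drawn as listed ones.
AREA_CODES = ["902", "506", "709"]   # invented example values
EXCHANGES = ["555", "234", "867"]    # invented example values

def draw_number(rng):
    """Return one randomly generated phone number as a string."""
    area = rng.choice(AREA_CODES)
    exchange = rng.choice(EXCHANGES)
    suffix = f"{rng.randrange(10000):04d}"   # uniform over 0000-9999
    return f"{area}-{exchange}-{suffix}"

rng = random.Random(42)   # fixed seed so the draw can be reproduced
sample_frame = [draw_number(rng) for _ in range(10)]
print(sample_frame)
```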
2) Ask how the questions were asked.
"What you want to know is not just the opinion," says Dr. Butler, "it's how intensely held the opinion is."
Find out whether a poll result came from a single question or whether it summarizes several. For instance, instead of asking "Who will you vote for?", Dr. Butler normally uses five questions to determine a respondent's vote choice.
"Asking one question is not enough for me. I group my questions. And there are a number of statistical techniques that are done so that a batch of questions is asked to reflect an issue."
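Dr. Butler does not reveal his own battery of questions, but one generic way to "group questions" is to combine several related items into a single index score. The question names and 1-to-5 scoring below are invented purely for illustration.

```python
# Hypothetical example of grouping questions: several related items, each
# scored 1-5, are combined into a single support index for one respondent.
responses = {
    "likely_to_vote_for_party": 4,
    "approve_of_leader": 3,
    "agree_with_platform": 5,
    "voted_for_party_last_time": 4,
    "would_consider_other_party": 2,   # reverse-keyed item
}

# Flip reverse-keyed items so a high score always points the same way.
reverse_keyed = {"would_consider_other_party"}
scores = [(6 - v) if q in reverse_keyed else v for q, v in responses.items()]

index = sum(scores) / len(scores)
print(f"support index: {index:.2f} on a 1-5 scale")
```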
He also likes to know when a question was asked. Did it come at the beginning or the end of the questionnaire? Placement influences responses, since respondents usually try to answer consistently over the course of the interview.
3) Check if the poll was a 'one off' or part of a series of polls.
Polls can be conducted over a period of days or they can be a 'one off'. A poll should explain this part of its methodology from the outset; if pollsters don't reveal it, they can sin by omission.
"Are they telling you the nightly results, of a good night, or do they roll up results at the end of the week?" he asks. "We just collected 500 cases last night and this is what it shows. But the night before it didn't show up."
Dr. Butler believes polls become more accurate over time. He likes seeing how polls track responses night after night.
"Nic Nanos says, and I think he's right, that the repeated numbers of polls that are done give you a closer and closer perspective on the accuracy of the opinions being reported."
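Tracking polls of this kind are commonly reported as rolling averages, with each new night of interviews added and the oldest night dropped. The sketch below uses invented nightly figures just to show how a three-night roll-up smooths out a single "good night."

```python
# Hypothetical nightly shares for one party (percent), rolled up three nights
# at a time; night 3 is the "good night" that a one-off release might trumpet.
nightly = [31.0, 33.5, 38.0, 32.5, 34.0]   # invented values
window = 3

for i in range(window - 1, len(nightly)):
    rolled = sum(nightly[i - window + 1 : i + 1]) / window
    print(f"nights {i - window + 2}-{i + 1}: rolling average {rolled:.1f}%")
```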
4) See error beyond the margin of error.
The margin of error tells you only about sampling error, the uncertainty that comes from the size of the sample, but there are many other sources of error in a poll.
"People are not asking this question enough: what is the refusal rate on your survey? You made how many calls so you could get 500 people? Oh, 3,000. Wow. Polling companies don't want to talk about it," says Dr. Butler.
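To put rough numbers on both points: under the textbook assumption of a simple random sample and a 50/50 split, a sample of 500 carries a margin of error of about ±4.4 points at 95 per cent confidence (1.96 × √(0.5 × 0.5 / 500) ≈ 0.044), while the response rate in Butler's example is only about 17 per cent (500 completes out of 3,000 calls). The first number appears in the poll's fine print; the second usually doesn't.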
Dr. Butler says growing numbers of people are missed in polls, and that is a problem. What if all the university professors decided to go on a 'no call' list or refused to participate? You would be left with demographic holes that need filling.
"Alan Gregg was saying in a recent Globe and Mail article that the telephone poll is going out the window if we have a 'no call' list. If people have their cell phones off, or just become weary of market researchers, it is going to be difficult to deliver accurate polls by telephone surveys."
5) Pick out the bias in national polls.
National polls give a big picture, but the numbers are often misleading. National data is built from regional data: pollsters segment Canada into polling regions, and small regions often get overlooked, which creates bias.
"The big picture is going to be stuff that emanates from the centre," he explains. The problem is, "samples get so much smaller in regions that their numbers always have a higher margin of error."
If the sample numbers for a region are low, what did the pollster do? "Did you collect more people than you needed to, or did you do a mathematical calculation, called weighting the sample?"
"But as I tell students, mathematics is not opinion," he says. Many pollsters, being expert statisticians, do the math.
"Everybody does it. How would we (pollsters) get PEI opinions if we didn't weight samples in a national poll, since there is a low likelihood that many cases would be collected from PEI? Well, I have an issue with that! The issue is that it means that a PEI opinion [where] we only got 85 [responses] are coming in as having the opinion of 250. It's not 250 opinions! It's a weight."
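The standard calculation behind weighting, not specific to any one firm, is straightforward: each region's respondents are scaled so that the region's share of the weighted sample matches its share of the population. The regional shares and respondent counts below are hypothetical, rounded figures used only to show the arithmetic.

```python
# Hypothetical illustration of weighting a national sample by region.
# A region's weight = (population share) / (share of the collected sample),
# so weighted regional totals match the population even if collection didn't.
population_share = {"Atlantic": 0.07, "Quebec": 0.23, "Ontario": 0.38, "West": 0.32}
sample_counts = {"Atlantic": 50, "Quebec": 230, "Ontario": 400, "West": 320}   # invented

total_n = sum(sample_counts.values())
weights = {
    region: population_share[region] / (sample_counts[region] / total_n)
    for region in sample_counts
}

for region, w in weights.items():
    print(f"{region}: {sample_counts[region]} respondents, each counted as {w:.2f}")
```

With these invented figures, each Atlantic respondent counts as 1.40 people while each Ontario respondent counts as 0.95, which is exactly the kind of "it's a weight, not 250 opinions" trade-off Butler objects to.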
6) Understand the undecided vote.
In the waning days of the election campaign, pollsters try to figure out which camp to put the last percentage of uncommitted voters into. As more people commit, polls get more accurate, but until then pollsters rely on a battery of questions to better read an undecided electorate.
"Is it really undecided or is it they won't tell you, and if they were going to vote tomorrow, how did they vote in the past?" he says. "Everybody in the business has to deal with the notion of how do we get at the undecided vote."
"I'd hate to tell you the tricks we have used in the past," says Dr. Butler, smiling.
He really would, because that would reveal another mystery of his profession.
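Whatever those tricks are, one baseline that is widely described in the polling literature is proportional allocation: undecided respondents are split among the parties in the same ratio as the decided vote, sometimes adjusted using a "leaner" question or past vote. The sketch below, with invented figures, shows only that public-knowledge baseline, not Dr. Butler's method.

```python
# Hypothetical sketch of proportional allocation of the undecided vote:
# undecided respondents are split among parties in the same ratio as the
# decided vote. All figures are invented.
decided = {"Party A": 34.0, "Party B": 28.0, "Party C": 18.0, "Party D": 8.0}
undecided = 12.0   # percent of the whole sample

decided_total = sum(decided.values())
allocated = {
    party: share + undecided * (share / decided_total)
    for party, share in decided.items()
}

for party, share in allocated.items():
    print(f"{party}: {share:.1f}%")
```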
How pollsters decode polls
While we're being deluged with polls now, the only 'poll' that matters is the one on May 2.
Andy Murdoch - April 20, 2011