Angus Reid: What went wrong with the polls in British Columbia?

The veteran pollster reviews the lessons of the BC election

Every decade or so, an election somewhere in the world confounds pollsters so utterly that their predictions prove completely wrong. This, in turn, sets off alarm bells about the accuracy and legitimacy of polling. And lately both the number of election polls and the number of misses have been increasing, leading some to suggest the polling industry is on the verge of collapse.

The biggest miss in the early years of the polling industry was the American presidential election in 1948, when polls projected victory for Thomas Dewey over then-incumbent Harry Truman. Several newspapers, believing the polling numbers, produced early editions proclaiming a Dewey victory.

Fast forward more than three decades to the California election for governor in 1982. Popular Los Angeles mayor Tom Bradley was seen as the putative victor in a slew of polls that proved inaccurate. A decade later in Britain, the polls pointed to an almost certain Labour win, but the Conservatives won by a handy margin.

The last ten years have seen the record of public pollsters take something of a beating. Last year, most pollsters wrongly predicted victory for Danielle Smith’s Wildrose Party in the Alberta general election. In 2004, several major polling companies seriously over-estimated Conservative party support in the election that returned Paul Martin’s Liberals to power. And then there was BC.

The British Columbia election in 2013 was historic because never have so many public opinion research firms missed the mark so badly with their final election projections. The average “miss” was in the vicinity of twelve points. The industry that normally asks the questions is now faced with some tough ones it needs to answer, namely: Why did this happen? What does it say about the state of polling in Canada? What steps can be taken to prevent a recurrence of this type of episode? What is the future of public opinion research in Canada?

I can’t speak for all pollsters who covered this contest because I haven’t had access to their detailed methods, but I will give an account of what we, at Angus Reid Public Opinion, did to arrive at our findings.

In the months leading up to the campaign period, Christy Clark’s Liberals trailed Adrian Dix and the NDP by as many as twenty points. Our national tracking poll of provincial premier approval ratings placed Clark at the back of the pack.

Two days after the official start of the campaign, we published our first election survey, showing a 17-point lead for the NDP over the Liberals. Ten days later, with the campaign underway in earnest, our next poll showed Clark and the BC Liberals still well behind the NDP, but the gap had narrowed to 14 points.

This narrowing was expected. Liberal attack ads aimed at the BC Conservatives, coupled with a disastrous Conservative party meeting the previous September that revealed open fissures between leader John Cummins and parts of the membership, meant British Columbians who had parked their voting intentions with the Conservatives were now thinking twice. Three days after that poll, a TV debate did nothing to restart momentum for the sputtering party.

Indeed, our third poll, conducted after the debate, contained more good news for the Liberals. The gap had narrowed to seven points, changing the dynamic for the BC Liberals—as I told the Liberal campaign at the time—from “mission impossible” to “mission attainable.” The Dix campaign seemed stuck, the leader himself confronted with intense criticism over a shocking reversal on the controversial Kinder Morgan pipeline expansion.

Our final election poll was completed on May 8 and sent to our news client a day later. When I first saw the results, I was somewhat incredulous: the positive Clark momentum had stalled. She had slipped back to trail Dix and the NDP by a nearly insurmountable nine points.

Accurate polling predictions conducted in the final week of a campaign count on three factors: good representative sampling, solid estimates of party and leader support levels, and weighting factors that control for turnout. For sampling, we have pioneered the use of online polling in Canada, using our 150,000-person panel as the source of respondents. This approach, though not without its detractors, has stood the test of time in almost 30 elections in the United States, Canada and the United Kingdom.

To estimate party support levels we pioneered the use of a technique called the “real ballot,” which places a facsimile of an actual ballot, complete with the names of actual local candidates, in the final election survey. (This same approach produced the most accurate result for the 2009 BC election.) Finally, on the matter of turnout, we weight our sample based on how respondents self-report their likelihood of going to the polls.
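To make the turnout adjustment concrete, here is a minimal sketch, in Python, of one way a self-reported-likelihood weight can be applied. The respondents, parties and likelihood scores are invented for illustration, and the scheme is a simplified stand-in rather than ARPO’s actual model: each vote intention is simply counted in proportion to how likely the respondent says they are to cast a ballot.

    from collections import defaultdict

    # Each tuple is (stated vote intention, self-reported likelihood of voting on a 0-1 scale).
    # These respondents are invented for illustration only.
    respondents = [("NDP", 0.9), ("Liberal", 1.0), ("NDP", 0.4), ("Liberal", 0.8)]

    totals = defaultdict(float)
    for party, likelihood in respondents:
        totals[party] += likelihood  # count each intention in proportion to stated likelihood

    total_weight = sum(totals.values())
    shares = {party: round(100 * weight / total_weight, 1) for party, weight in totals.items()}
    print(shares)  # weighted vote shares, in per cent

A respondent who says they are unlikely to vote still counts, but for less; the headline numbers therefore lean toward the views of those most likely to show up.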

Despite the use of these techniques, ARPO and most other published election polls missed the final election results. The Liberals, rather than losing the popular vote by nine points, actually won it by four. A 13-point miss is the biggest I have produced in almost five decades of polling. I can take some comfort that I was not alone, but getting to the bottom of what happened has been a priority at the polling firm that bears my name.

Why did this happen?

Over the last two months a team of our specialists conducted a thorough investigation into what happened in BC. Contrary to the instant post-analyses of some firms, which blamed record-low turnout (this proved false; turnout was just under 60 per cent) or last-minute vote switching (our post-campaign follow-up found no evidence to support this), we found that almost all of the discrepancy in our poll boiled down to one issue: low turnout among young voters.

The principal flaw in our methodology was that we represented voters under 35 (among whom the NDP held a commanding lead) in relation to their proportionate share of the BC population (roughly 30 per cent) rather than in relation to their actual share of voters (closer to 15 per cent, according to research conducted by Elections BC after the 2009 contest). Had we made this one change in our turnout projection model, the final Angus Reid poll published on May 9 would have shown the NDP lead diminishing to only three points.
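A back-of-the-envelope illustration shows how much that single weighting choice can move the headline number. The sub-group margins below are invented, chosen only so the blended figures match the ones in this account (a nine-point NDP lead when under-35s are weighted to 30 per cent, shrinking to roughly three points at 15 per cent); they are not ARPO’s actual tabulations.

    # Illustrative (assumed) NDP leads within each age group, in percentage points.
    ndp_lead_under_35 = 37.0   # assumed commanding NDP lead among voters under 35
    ndp_lead_35_plus = -3.0    # assumed narrow Liberal lead among voters 35 and over

    def blended_lead(under_35_weight):
        """Blend the two sub-group margins using the weight assigned to under-35s."""
        return under_35_weight * ndp_lead_under_35 + (1 - under_35_weight) * ndp_lead_35_plus

    print(blended_lead(0.30))  # 9.0 -> weighting under-35s to their population share
    print(blended_lead(0.15))  # 3.0 -> weighting them to their estimated share of actual voters

Nothing about the raw interviews changes in this exercise; only the assumption about who actually shows up to vote does.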

Another factor was the endgame of the BC election campaign. We saw the BC Liberal vote surge in traditional strongholds—ridings in the BC Interior where the NDP needed to pick up wins. The Liberals were similarly successful in ridings rich with ethnic voters in and around Metro Vancouver. It is likely that the prospect of an NDP win motivated Liberal supporters in these areas and produced a high turnout for the party.

Our polling in the late stages of the campaign showed a palpable fear factor among Liberal leaners about the prospect of an NDP victory, one that offset any misgivings these voters might have had about the Liberals’ flawed governance in recent years. Indeed, some would say the biggest losers in the BC election campaign weren’t the pollsters but Adrian Dix and the NDP, who ran a feckless campaign. Clark’s revival was a testament to her high spirits, winning smile, and strong, high-priced private polling that surveyed specific swing ridings and sampled at a granular level.

Ironically, public polls, including ours, which had pummeled her throughout the campaign, may have been an equally potent asset by energizing her base to get out and vote in order to beat back, as W.A.C. Bennett famously called them in 1972, the “socialist hordes”.

What does it say about the state of polling in Canada?

The polling industry, like most other parts of the information sector, has experienced a triple tsunami over the last decade. First, it has been buffeted by major technological changes. The rise of the mobile phone has produced a situation where there are now many more phone numbers than voters in Canada. In some segments—especially young voters—landlines are as archaic as the rotary dial was to an earlier generation. This means pollsters have a harder time reaching younger voters, who either don’t have a landline at all or are loath to answer calls from pollsters on their mobiles when they are being charged by the minute.

Second, the financial framework of the polling industry has changed. Media and government, once major sponsors of public opinion polls, have dramatically reduced their spending: media because they are broke, and governments because freedom-of-information laws put polls that could contradict policy into the public domain. Finally, Canadians themselves have changed in terms of their willingness to participate in polls. Telephone polling refusal rates now top 90 per cent.

These changes have not all been negative for the industry. The economics of collecting poll data have also changed drastically. Robo-polling technology allows brief snippets of opinion to be collected for pennies per interview, and the Internet has opened an entirely new arena for involving voters and understanding their intentions, perceptions and attitudes.

Despite less funding from paying clients than ever, there are now more election polls than ever in Canadian federal and major provincial campaigns. The industry is largely unregulated, the cost of entry is minimal, and the rewards associated with being accurate are seen as a better bet than the risks of getting it wrong.

At Angus Reid Public Opinion we’ve been carrying out “public” public opinion research for the better part of half a century. We helped pioneer the transition to then-controversial telephone polling in the 1970s and 80s, and for the past decade we have invested heavily in online polling technology and methods.

Our experiments with online panel-based polling started in 2007 with the Quebec election, which saw the ADQ leap into second place (we were the only pollster to catch that trend), and have continued through more than 30 contests in Canada, the US (where ARPO was ranked among the top three pollsters for accuracy in a field of more than 30 competitors) and the UK. Our overall accuracy rate for this largely self-funded exercise has been 96 per cent.

What steps can be taken to prevent a recurrence of this type of episode?

We will be making several changes to our methods as a result of the most recent experience in British Columbia. Chief among these will be a change to the way we weight results to adjust for the different turnout levels of younger and older voters. This could mean producing different sets of prediction numbers based on different levels of turnout. It will mean adjusting the weight of younger voters not to their proportion of the general population, but to their proportion of actual voters. Looking back at the last five elections we have covered in Canada, employing this approach would not have substantially altered our estimates elsewhere, but it would have produced a far more accurate prediction in BC.
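As a rough sketch of what such turnout-conditional reporting might look like, the same illustrative sub-group margins used earlier can be re-blended under several assumed under-35 shares of the actual electorate; the shares and margins are hypothetical, not a description of ARPO’s published model.

    # Hypothetical NDP leads by age group (percentage points), reused from the earlier sketch.
    ndp_lead_under_35 = 37.0
    ndp_lead_35_plus = -3.0

    # Report one projection per assumed youth share of the people who actually vote.
    for under_35_share in (0.30, 0.22, 0.15):
        lead = under_35_share * ndp_lead_under_35 + (1 - under_35_share) * ndp_lead_35_plus
        print(f"under-35 share {under_35_share:.0%}: projected NDP lead {lead:+.1f} points")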

It is new territory for us, something we might once have regarded as unthinkable, and indeed unnecessary, with our electoral record serving as the guide. As pioneers of online polling methodology, however, we are always open to change where required.

Further, key demographic changes in BC’s regional populations require sampling and analysis at a more granular level. Such an exercise on its own, without the age weighting, would not have resulted in a vastly different outcome in our BC election call. Had we done both together, it is probable that I would not be writing this summary today.

What is the future of public opinion research in Canada?

During the days and weeks following the BC election, a question has bounced around my mind, and the boardrooms of Vision Critical, our parent company, which I chair: “Why bother?” The economics of “public” opinion research just plain suck. There are more players than ever who use technology that can be bought for the price of a laptop. There are fewer dollars to fund this work. And Canadians appear to be less willing than ever to share their views with strangers.

On the other hand, “private” opinion research conducted for large corporations, special interest groups and political parties is booming. In the last US election, over $100 million was spent on private polling for the various parties. In Ottawa, Prime Minister Stephen Harper has reduced the public opinion research carried out by the federal government while expanding the reach and scope of polling carried out by his party.

A strong democracy needs accurate and independent public opinion research to help hold power to account and add context to public debates. Pollsters may be derided in Canada or the US, but in Iran, Russia and Venezuela they have been jailed. Election polling is important because it lends legitimacy and credibility to the polls conducted between elections on important public issues.

At Angus Reid Public Opinion, we’re carrying on. And we’re taking steps to ensure we get it right.

Angus Reid is the Executive Chairman of Vision Critical and Angus Reid Public Opinion. He’s been in the research and polling business for more than 40 years, and founded Angus Reid Group in 1979. The market research firm grew into Canada’s largest, and was sold to Ipsos SA in 2000. Four years later, he founded Angus Reid Strategies. He has a Ph.D. in Sociology from Carleton University.
