Statistics Canada responds

Statistics Canada is still refusing to discuss exactly what went wrong with its jobs numbers last month, or why nobody noticed until several days after the survey results had been made public. The agency will say only that the problem stemmed from a computer program that wasn’t updated, and that more details will be released as part of an internal review sometime within the next two weeks.

It did, however, contact Maclean’s to respond to a few of my earlier conclusions about what went wrong, which you can find in posts here and here.

Among them was the idea that summer is an inherently riskier time to collect data on the working population because people — both Stats Can workers and Canadians in general — are on vacation.

I also noted Stats Can is required to tweak its employment numbers to account for predictable changes in seasonal employment and that many of those changes are in the summer, since schools and post-secondary institutions go on break and lots of jobs in tourism and construction are tied to the summer months.

In an interview, Jason Gilmore, Stats Can’s acting chief of current labour analysis, says summer is no more risky for statisticians than any other season. “Summer doesn’t make it any more complicated for us than other times of the year,” he says. “Just like construction and accommodation and food services tend to have less employment for winter and teaching employment goes down in the summer, that doesn’t affect our capacity for seasonal adjustment. Those kind of factors are already factored in.” Despite summer vacations, the agency’s response rate to the survey isn’t much different in July and August than it is at other times of the year, he says.

The picture is a bit more complicated, however, when it comes to how to count teachers and other education workers. The big change in education employment numbers in the summer is among workers — both teachers and support staff — whose contracts end in June and who may or may not sign a new one come September. Contract workers might end up being coded differently in the Stats Can databases depending on whether they have another contract lined up, or when their next contract is set to begin. It can take the agency a few years to gather enough data to detect a shifting trend in seasonal changes in education contracts.

In his analysis of the updated job figures, CIBC chief economist Avery Shenfeld argues that “shifts in the timing of when teacher contracts come to an end for the year have made it difficult for Statistics Canada to adjust for seasonality, and many of these jobs could therefore be reversed in subsequent months.” He added that the sudden jump in jobs in Ontario in the corrected survey was because “that was the province where the statistical quirk boosting teacher jobs was featured.”

Statistics Canada also took issue with my suggestion that its survey works off data that is, in some cases, many years out of date. I pointed out that the agency uses population data taken from the previous census (right now that’s the 2006 census) and then estimates the changes over time. The farther it gets from the census year, the larger the error. Gilmore says the survey’s population estimates also use data from the Stats Can demographics department, which regularly tracks things such as people immigrating to and emigrating from Canada, and moving between provinces, and then creates models that calculate how the population has changed.

Survey staff also regularly go out to neighbourhoods, particularly new developments, and count the number of dwellings in order to update the sample of 56,000 households interviewed for the Labour Force Survey (representing 110,000 people out of 29 million working-age Canadians). The data, he says, is “continuously refreshed with new information about where people live and neighbourhoods that are shrinking and growing.”

Part of the 10-year survey redesign that caused the error involved updating the sample population used in the Labour Force Survey to reflect the most current population data. For instance, if more people have moved to Alberta, the new sample might include more interviews with people from Alberta and fewer from, say, New Brunswick. However, Stats Can says this isn’t the computer glitch that caused the problem with the July data.

So why did such a huge error occur? And why wasn’t it caught by Stats Can’s rigorous data verification process — which includes approvals by management and a high-level policy committee — until an independent analyst noticed some problems? For that, it seems we’ll have to wait for Statistics Canada to release its internal review.
