
Statistics Canada rewrote our story on Statistics Canada

The federal agency submitted its own version of our story on its botched July job numbers


Last week, I wrote a post detailing the findings of Statistics Canada’s investigation into its botched job numbers in July. The investigation report, authored by Claude Julien, director general of methodology, and Craig Kuntz, director general of economy-wide statistics, was essentially a description of a bungled computer system upgrade that was poorly understood both by the technicians who maintain the computers at the federal agency and by the statisticians who crunch the numbers.

Statistics Canada took issue with some of my interpretations of its report and emailed a rewritten version of my post, which it asked Maclean’s to publish. I’ve compared the rewrite to the original and highlighted the agency’s changes. Text from the original post that StatsCan struck out is marked ~~like this~~, while the StatsCan changes appear [in square brackets]:

Statistics Canada released the details of its investigation into its troubled July jobs numbers (which were changed from 200 new jobs to 42,000 earlier this month). It’s written in pretty dry and technical language, describing what is essentially ~~an IT department error~~ [a failure to fully understand the impact of a change to systems]. But it also offers a fascinating look into internal problems plaguing the federal agency.

First, a bit of background. The monthly Labour Force Survey tracks about 110,000 people over six months. It asks them a variety of questions about their jobs. In order to calculate whether the economy is creating new employment or losing jobs, analysts look for workers who said they were employed in one month, but unemployed the next (or vice versa).
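To picture how that comparison works, here is a toy sketch in Python. The household IDs and status labels are my own invention, not StatsCan’s actual Labour Force Survey schema:

```python
# A toy sketch of the month-over-month comparison described above.
# IDs, field names and statuses are invented for illustration.

june = {"HH001": "employed", "HH002": "unemployed", "HH003": "employed"}
july = {"HH001": "employed", "HH002": "employed", "HH003": "unemployed"}

# People who went from not-employed to employed, and vice versa
gains = [p for p, s in july.items() if s == "employed" and june.get(p) != "employed"]
losses = [p for p, s in june.items() if s == "employed" and july.get(p) != "employed"]

print(f"gains: {gains}, losses: {losses}, net: {len(gains) - len(losses):+d}")
```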

Sometimes people don’t answer all the questions, or don’t answer the survey at all. When that happens, StatsCan’s computer systems are programmed to make an educated guess based on the person’s answers from the month before. If they said they were employed in June, but didn’t answer in July, the system assumes ~~they~~ [that most of them] still have a job and marks them as employed~~. (Some are~~ [, while the others are] labelled as unemployed [or not in the labour force] to reflect the fact that some people do lose their jobs.~~)~~
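Here is a minimal sketch of that “educated guess” step, again with invented names. (The real system is more sophisticated: as noted above, it also labels a share of non-respondents as unemployed or not in the labour force rather than carrying everyone forward.)

```python
# A toy version of the carry-forward imputation described above.
# Names and labels are hypothetical, not StatsCan's actual schema.

def impute(current, previous):
    """Fill missing answers in `current` using last month's responses."""
    return {
        pid: (previous.get(pid) if status is None else status)
        for pid, status in current.items()
    }

june = {"HH001": "employed", "HH002": "employed"}
july = {"HH001": None, "HH002": "unemployed"}  # HH001 didn't answer in July

print(impute(july, june))  # {'HH001': 'employed', 'HH002': 'unemployed'}
```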

Fast forward to July. StatsCan technicians were updating the Labour Force Survey computer systems. They were changing a field in the survey’s vast collection of databases called the “dwelling identification number.” (The report doesn’t explain what this is, but it’s likely a unique code assigned to each of the 56,000 households in the survey so that analysts can easily track their answers over time.) They assumed they only needed to make this change to some of the computer programs that crunch the employment data, but not all of them.

The changes themselves were happening piecemeal, rather than all at once, because the system that collects and analyzes the labour force survey is big, [complicated and] old (it was first developed in 1997) ~~and complicated~~. Despite being ~~a pretty major overhaul of the computer system~~ [a pretty rare update to a complex computer system], the report makes it clear that the agency considered the changes to be ~~nothing but minor~~ [routine] maintenance. ~~After updating the system, no one bothered to test the changes to see if they had worked properly before the agency decided to release the data to the public, in large part because they considered it too minor to need testing.~~ [The system was tested and appeared to have worked properly. The updated system was put into production.] (Author’s Note: According to the report, only some parts of the system were tested, but not all of them—including the part of the system that fills in the blanks when people don’t answer questions. A technician later tried to replicate the results in a testing environment and couldn’t, which is how StatsCan realized there was a problem with the numbers.)

One of the programs that was supposed to be updated—but wasn’t—was the program that fills in the blanks when people don’t answer all the survey questions. But since technicians had changed the identification code for households in some parts of the system, but not others, ~~the program couldn’t match all the people in the July survey to all the people in the June survey. The result was that instead of using the June survey results to update the July answers, all those households who didn’t answer the questions about being employed in July were essentially labelled as not in the labour force~~ [the program couldn’t match all the people in the July survey to all the people in the June survey. The result was that instead of using the June survey results to update the July answers, information from June was not used for the programs that process some households who didn’t answer the questions in July]. With the push of a button, nearly 42,000 ~~jobs disappeared~~ [people were wrongly classified as not being in the labour force instead of employed].
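To see how a half-changed identification code can quietly break this kind of matching, here is a toy sketch with invented IDs. The lookup that should carry June’s answer forward comes up empty, and the record falls through to a wrong default:

```python
# A sketch of the failure mode: part of the system switched to a new
# dwelling identification number while the matching program still keyed
# records by the old one, so the lookup silently found nothing.
# IDs and labels are invented for illustration.

june = {"OLD-001": "employed"}   # June records keyed by the old ID scheme
july = {"NEW-001": None}         # same household, new ID, no July answer

for pid, status in july.items():
    if status is None:
        previous = june.get(pid)  # None: the June record sits under "OLD-001"
        status = previous if previous is not None else "not in labour force"
    print(pid, "->", status)      # NEW-001 -> not in labour force
```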

Still, while Statistics Canada describes the problem as mainly a technical failure, it also highlights problems with what is otherwise a pretty rigorous data verification process at StatsCan, one with multiple layers of reviews and lots of fact-checking and discussion when results seem problematic.

First, the computer programs didn’t have enough systems diagnostics in place to alert technicians when they weren’t able to match up some people in the July survey with their answers from the June survey. There was no red flag or error code that popped up to let technicians know there was a problem, so everything seemed fine.
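This is the sort of diagnostic the report says was missing. A minimal sketch, with a match-rate threshold I picked purely for illustration:

```python
# An illustrative operational diagnostic: measure how many records matched
# last month's survey and fail loudly when the rate drops. The threshold
# and function are assumptions, not anything specified in the report.

def check_match_rate(current, previous, minimum=0.90):
    matched = sum(1 for pid in current if pid in previous)
    rate = matched / len(current)
    if rate < minimum:
        raise RuntimeError(
            f"only {rate:.0%} of records matched last month; "
            "a key or identifier may have changed upstream"
        )
    return rate

june = {"OLD-001": "employed"}
july = {"NEW-001": None}
check_match_rate(july, june)  # raises: only 0% of records matched last month
```

A check like this would have turned the silent mismatch into an immediate, visible failure.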

Next, few analysts and senior managers were even aware that the computer systems were being updated at all. Since the update was considered minor maintenance, the agency’s IT department apparently didn’t bother to warn other departments about the change.

~~Meanwhile, analysts~~ [The review was conducted by the Director Generals in charge of the review, not analysts] who work for the Labour Force Survey ~~set about justifying the shocking results~~. They researched monthly jobs data going back to 1985 and determined that it wasn’t that unusual for the country to create just 200 new jobs in a single month, even when the economy is generally growing. It’s also not inconceivable for the country to have suddenly lost 60,000 full-time jobs in a month. (The economy has grown, but created few jobs—or even lost jobs—20 per cent of the time, according to StatsCan. We’ve lost more than 60,000 full-time jobs about 16 per cent of the time. Those numbers themselves are pretty scary.)
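A historical check like that amounts to counting how often past months fell at or below a given level of net job creation. A toy version, with made-up numbers standing in for the real series back to 1985:

```python
# A sketch of the historical sanity check described above. The series
# below is fake; the real comparison used monthly data back to 1985.

monthly_net_jobs = [42_000, 200, -61_000, 15_000, -5_000, 8_000]

def share_at_or_below(series, threshold):
    return sum(1 for x in series if x <= threshold) / len(series)

print(f"months with <= 200 net new jobs: {share_at_or_below(monthly_net_jobs, 200):.0%}")
print(f"months losing 60,000+ full-time jobs: {share_at_or_below(monthly_net_jobs, -60_000):.0%}")
```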

There is a particularly illuminating passage in the report that reveals problems of miscommunication and misunderstanding at the agency:

Based on the facts that we have gathered, we conclude that several factors contributed to the error in the July 2014 LFS results. There was an incomplete understanding of the LFS processing system on the part of the team implementing and testing the change to the TABS file. This change was perceived as systems maintenance and the oversight and governance were not commensurate with the potential risk. The systems documentation was out of date, inaccurate and erroneously supported the team’s assumptions about the system. The testing conducted was not sufficiently comprehensive and operations diagnostics to catch this type of error were not present. As well, roles and responsibilities within the team were not as clearly defined as they should have been. Communications among the team, labour analysts and senior management around this particular issue were inadequate.

Problems like these are pretty common in big organizations, where it’s not unheard of ~~for IT departments to start updating computer systems without telling anyone~~ [for updates to systems to be made without a full understanding of their impact] and for senior managers to have no idea how their computer systems work, causing mass panic and confusion. But the Labour Force Survey is, arguably, the country’s most important economic indicator. Much is riding on the ability of the federal government to accurately track the job market. There is already plenty of criticism about the quality of the government’s jobs data, so Statistics Canada justifiably demands close scrutiny and must be held to very high standards. In this case, a computer update that seemed so minor that no one bothered to pay attention to it became a national embarrassment for Statistics Canada that reverberated across the economy.
