Economic analysis

How StatsCan lost 42,000 jobs with the stroke of a key

A minor computer update became a national embarrassment


Statistics Canada has released the details of its investigation into its troubled July jobs numbers, which were revised earlier this month from 200 new jobs to 42,000. The report is written in dry, technical language, describing what is essentially an IT department error. But it also offers a fascinating look into internal problems plaguing the federal agency.

First, a bit of background. The monthly Labour Force Survey tracks about 110,000 people, each followed over six months, and asks them a variety of questions about their jobs. To calculate whether the economy is creating or losing jobs, analysts look for workers who said they were employed in one month but unemployed the next (or vice versa).

Sometimes people don’t answer all the questions, or don’t answer the survey at all. When that happens, StatsCan’s computer systems make an educated guess based on the person’s answers from the month before. If they said they were employed in June but didn’t answer in July, the system assumes they still have a job and marks them as employed. (A share are labelled as unemployed or not in the labour force, to reflect the fact that some people do lose their jobs.)
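To make the mechanics concrete, the fill-in step described above can be sketched as a lookup keyed on a household identifier. This is purely illustrative, not StatsCan's actual system; the identifiers, statuses and function names here are all hypothetical:

```python
# Hypothetical sketch of last-month imputation for survey non-response.
# June answers, keyed by a made-up dwelling identification number.
june = {101: "employed", 102: "unemployed", 103: "employed"}

# July answers: household 103 didn't respond this month.
july = {101: "employed", 102: "employed", 103: None}

def impute(current, previous):
    """Fill missing answers with the respondent's previous-month status."""
    filled = {}
    for dwelling_id, status in current.items():
        if status is None:
            # Carry forward last month's answer. (The real system also
            # reassigns a share to unemployed / not in the labour force.)
            status = previous.get(dwelling_id, "not in labour force")
        filled[dwelling_id] = status
    return filled

print(impute(july, june))  # household 103 stays "employed"
```

The lookup only works if the same household carries the same identifier from one month to the next, which is exactly the assumption the July update broke.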

Fast forward to July. StatsCan technicians were updating the Labour Force Survey computer systems. They were changing a field in the survey’s vast collection of databases called the “dwelling identification number.” (The report doesn’t explain what this is, but it’s likely a unique code assigned to each of the 56,000 households in the survey so that analysts can easily track their answers over time.) They assumed they only needed to make this change to some of the computer programs that crunch the employment data, but not all of them.

The changes themselves were happening piecemeal, rather than all at once, because the system that collects and analyzes the Labour Force Survey is big, complicated and old (it was first developed in 1997). Though the work amounted to a fairly major overhaul of the computer system, the report makes clear that the agency considered it nothing but minor routine maintenance. After updating the system, no one tested the changes to see whether they had worked properly before the agency released the data to the public, in large part because the update was considered too minor to need testing.

One of the programs that was supposed to be updated, but wasn’t, was the one that fills in the blanks when people don’t answer every survey question. Because technicians had changed the household identification code in some parts of the system but not others, the program couldn’t match the people in the July survey to the people in the June survey. Instead of using the June results to fill in the July answers, the households that didn’t answer the employment questions in July were essentially labelled as not in the labour force. With the push of a button, nearly 42,000 jobs disappeared.
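A toy version of that failure, again with hypothetical identifiers: if the July file is keyed on new dwelling IDs while the imputation program still holds the old ones for June, every non-respondent's lookup silently misses and falls through to the default:

```python
# Hypothetical sketch of the ID-mismatch bug. June is keyed on the old
# dwelling IDs, but the update re-keyed the July file with new IDs.
june_old_ids = {101: "employed", 103: "employed"}
july_new_ids = {5101: "employed", 5103: None}  # 5103 didn't respond

def impute(current, previous):
    filled = {}
    for dwelling_id, status in current.items():
        if status is None:
            # The lookup silently fails: new ID 5103 never matches old
            # ID 103, so the household gets the default label instead
            # of last month's "employed".
            status = previous.get(dwelling_id, "not in labour force")
        filled[dwelling_id] = status
    return filled

result = impute(july_new_ids, june_old_ids)
print(result[5103])  # "not in labour force" rather than "employed"
```

Note that nothing crashes: the `.get()` default makes the failure invisible, which is why no error ever surfaced before publication.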

Still, while Statistics Canada describes the problem as mainly a technical failure, the report also highlights cracks in what is otherwise a pretty rigorous data-verification process at StatsCan, one with multiple layers of review and plenty of fact-checking and discussion when results seem problematic.

First, the computer programs had no safeguards in place to alert technicians when some people in the July survey couldn’t be matched with their answers from the June survey. No red flag or error code popped up to let technicians know there was a problem, so everything seemed fine.
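The kind of diagnostic the report implies was missing is not elaborate. As a hypothetical sketch, a single check on the month-to-month match rate would have halted the run the moment the re-keyed IDs stopped matching:

```python
# Hypothetical safeguard: abort if too few of this month's households
# can be matched to last month's file.
def check_match_rate(current_ids, previous_ids, threshold=0.9):
    matched = sum(1 for i in current_ids if i in previous_ids)
    rate = matched / len(current_ids)
    if rate < threshold:
        raise RuntimeError(
            f"only {rate:.0%} of households matched last month's file"
        )
    return rate

# With the re-keyed July IDs, nothing matches the June file and the
# check raises a red flag instead of letting the run finish quietly.
try:
    check_match_rate({5101, 5103}, {101, 103})
except RuntimeError as err:
    print("red flag:", err)
```

The threshold here is invented for illustration; the point is only that a match-rate diagnostic turns a silent data loss into a loud failure.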

Next, few analysts and senior managers were even aware that the computer systems were being updated at all. Since the update was considered minor maintenance, the agency’s IT department apparently didn’t bother to warn other departments about the change.

Meanwhile, analysts set about justifying the shocking results. They researched monthly jobs data going back to 1985 and determined that it isn’t that unusual for the country to create just 200 new jobs in a single month, even when the economy is generally growing. Nor is it inconceivable for the country to suddenly lose 60,000 full-time jobs in a month. (The economy has grown while creating few jobs, or even losing jobs, 20 per cent of the time, according to StatsCan. The country has lost more than 60,000 full-time jobs about 16 per cent of the time. Those numbers are pretty sobering in their own right.)

There is a particularly illuminating passage in the report that speaks to problems of miscommunication and misunderstanding at the agency:

Based on the facts that we have gathered, we conclude that several factors contributed to the error in the July 2014 LFS results. There was an incomplete understanding of the LFS processing system on the part of the team implementing and testing the change to the TABS file. This change was perceived as systems maintenance and the oversight and governance were not commensurate with the potential risk. The systems documentation was out of date, inaccurate and erroneously supported the team’s assumptions about the system. The testing conducted was not sufficiently comprehensive and operations diagnostics to catch this type of error were not present. As well, roles and responsibilities within the team were not as clearly defined as they should have been. Communications among the team, labour analysts and senior management around this particular issue were inadequate.

Problems like these are pretty common in big organizations, where it’s not unheard of for IT departments to start updating computer systems without telling anyone, and for senior managers to have no idea how their computer systems work, causing mass panic and confusion. But the Labour Force Survey is arguably the country’s most important economic indicator, and much is riding on the federal government’s ability to accurately track the job market. There is already plenty of criticism of the quality of the government’s jobs data, so Statistics Canada justifiably faces close scrutiny and must be held to very high standards. In this case, a computer update that seemed so minor no one paid it any attention became a national embarrassment for Statistics Canada that reverberated across the economy.
