
Our best (and worst) run cities

EXCLUSIVE REPORT: Which cities provide the best services per taxpayer’s buck? Canada’s first ever study of municipal effectiveness finds some surprises.

Everyone agrees that cities matter. No, they’re crucial. The Federation of Canadian Municipalities notes on its website, “urban economies are where people live, where jobs are created and where most goods and services are produced and consumed.” The Conference Board of Canada calls them “drivers of national prosperity.” Economists such as Richard Florida have celebrated their vital role in fostering creativity, innovation and trade.

At the same time, there is widespread agreement that city governments lack the funding they need to fulfill their responsibilities. Federal political parties have sought to outbid each other in their commitment to Canada’s cities. Billions of dollars in federal infrastructure funding have been promised, with billions more on the way in the form of a share of the federal gas tax.

RELATED: How Burnaby earned the top spot · How the poll numbers break down · What’s wrong in Charlottetown

Yet for an institution that is, by common consent, so critical, cities face remarkably little scrutiny. Voter turnout in municipal elections is commonly under 40 per cent. Candidates are often elected by acclamation. In the absence of party labels, voters in most cities have little information about what the various candidates stand for, if they even know their names.

And for governments that so aggressively advertise how short of funds they are, Canada’s cities are notably averse to providing data on just how well they are using existing funds. At other levels of government, and in other public sector institutions, there has been some progress of late in pursuit of what is often called, as if to emphasize its radicalism, “evidence-based decision-making.” Cities stand apart, defiant outposts of opacity.

This survey, the first of its kind in Canada, is an attempt to change all that. It provides citizens in 31 cities across the country with comparative data on how well—or poorly—their city is run, measured by the cost and quality of the public services it delivers. (Why 31? We took the 30 largest cities in Canada, added whatever provincial capitals were not on the list, then subtracted a few cities from the Greater Toronto Area for better regional balance. Somehow that left 31.)

Though the overall results—Burnaby, Saskatoon and Surrey, B.C., lead the pack; Charlottetown, Kingston, Ont., and Fredericton trail—will be of particular interest, they are less important than the process this is intended to kick off. We aim not merely to start some good barroom arguments, but to help voters hold their representatives to better account, and indeed to help city governments themselves. For without some sort of yardstick to measure their performance, either against other cities or against their own past record, how can they hope to know whether they are succeeding?

To compile the survey, Maclean’s commissioned the Halifax-based Atlantic Institute for Market Studies, expanding on the institute’s earlier work measuring the performance of municipalities in Nova Scotia and New Brunswick. Unlike other studies, this does not try to measure quality of life, or which city is the “best place to live.” Rather, it focuses on the contribution of local governments to this end.

And while other studies, for example the Frontier Centre’s Local Government Performance Index, have attempted to measure how efficient Canada’s cities are, this survey attempts for the first time to measure how effective they are, that is, in terms of the results obtained. It measures not just how much they have spent, but how well they have spent it—the bang for the buck.

There’s a reason this has not been done before. It turns out it’s damned difficult. In many cases, AIMS researchers found, the data didn’t exist: it simply hadn’t occurred to anyone to collect it. In other cases cities refused to release it to the public. Even where the data was both available and public, it often was not comparable. In some provinces, for example, cities are responsible for delivering services that in others are delivered by the provincial government. Or they use different accounting standards. Or. Or. Or.

The result? In all too many cases, according to AIMS executive vice-president Charles Cirtwill, researchers found that “no one had any clue what was going on. They were trying to manage without the most basic benchmarks. It was all kind of artisanal.” At worst, he says, “you don’t have evidence-based decision-making, but decision-based evidence-making.”

In all, AIMS staff, including director of research Bobby O’Keefe and researcher Holly Chisholm, collected data for 71 different performance indicators, gathered into seven broad categories: government and finance; taxation; safety and protection; transportation; environmental health; economic development; and recreation and culture. Cities were scored in relative terms, report-card style, their grade depending on how far they deviated above or below the average for the survey as a whole. To smooth out any unusual spikes that might occur in a given year, data was tabulated as three-year averages.
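For readers curious about the mechanics, the sketch below shows, in Python, the kind of relative, report-card grading the survey describes: three-year averages for each indicator, graded by how far a city sits above or below the survey-wide average. The letter-grade cut-offs and the example figures are our own illustrative assumptions; AIMS’s exact scale is not published here.

```python
# A minimal sketch of relative grading as described above. The z-score
# cut-offs and the sample numbers are illustrative assumptions only.
from statistics import mean, stdev

def three_year_average(values):
    """Smooth out a one-year spike by averaging the three reported years."""
    return mean(values)

def letter_grade(city_value, all_values, higher_is_better=True):
    """Grade one city on one indicator relative to the survey average."""
    avg = mean(all_values)
    spread = stdev(all_values) or 1.0      # avoid dividing by zero
    z = (city_value - avg) / spread
    if not higher_is_better:               # e.g. costs per capita: lower is better
        z = -z
    if z >= 1.0:
        return "A"
    if z >= 0.25:
        return "B"
    if z >= -0.25:
        return "C"
    if z >= -1.0:
        return "D"
    return "F"

# Example: hypothetical police service costs per capita, as three-year averages
costs = {
    "City W": three_year_average([301.0, 297.0, 305.0]),
    "City X": three_year_average([252.0, 249.0, 255.0]),
    "City Y": three_year_average([270.0, 268.0, 274.0]),
    "City Z": three_year_average([330.0, 325.0, 335.0]),
}
grades = {city: letter_grade(value, list(costs.values()), higher_is_better=False)
          for city, value in costs.items()}
```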

Roughly half of the indicators were chosen as measures of efficiency: for example, police service costs per capita, or transportation costs per lane-kilometre of city road. The rest measure effectiveness, or the level and quality of service provided: the percentage of the labour force that uses public transit, or the number of square metres of outdoor space per square kilometre. Adding up each city’s grades yielded a total efficiency score and a total effectiveness score; the two were then combined to produce the overall rankings.
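Assembling those grades into an overall ranking could then look something like the short sketch below; the grade points and the equal weighting of the two halves are assumptions for illustration, since the exact formula is not spelled out here.

```python
# Hypothetical sketch: sum each city's efficiency grades and effectiveness
# grades separately, then combine the two totals. Grade points and the
# equal weighting are illustrative assumptions.
GRADE_POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}

def category_total(grades):
    """Turn a list of letter grades into one numeric total."""
    return sum(GRADE_POINTS[g] for g in grades)

def overall_score(efficiency_grades, effectiveness_grades):
    """Combine the efficiency and effectiveness totals into one score."""
    return category_total(efficiency_grades) + category_total(effectiveness_grades)

# Example: a city graded on three efficiency and three effectiveness indicators
print(overall_score(["A", "B", "C"], ["B", "B", "D"]))   # 9 + 7 = 16
```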

It’s the combination of the two that is key: that a city has higher snow-removal costs, for example, does not prove that it is spending too much on snow removal. What if it also provides a higher level of service—plowing more often than other cities do, or on more side streets? Paying more for the same level of service is one thing; paying more to get more may well be what that city’s residents prefer.

Not every city reported on every indicator. Thus important gaps remain in the data we present here, and in the more complete tables on the Macleans.ca website: under the rules of the survey, if fewer than half the cities reported for a given indicator, it was left blank. So we are unable to report results for what would seem particularly useful measures, such as fire department response times or the percentage of roads in good condition. Likewise, if a city omitted data for half or more of the indicators in a given category, it was not ranked in that category. In two cases, Laval and Victoria, the data provided was generally so incomplete that we cannot give them an overall score.

That’s not necessarily a sign of lack of co-operation: indeed, AIMS researchers found city staff generally helpful. But in too many cases, Cirtwill says, the attitude seemed to be that it was not fair to compare City X to the others, because of the particular details of its situation. “You keep running into the same response: ‘we’re unique, we’re special, we do things differently here.’ ”

In some respects, this is a legitimate objection. To take our snowplowing example, it would obviously be unfair to compare Montreal, with an average annual snowfall of 212 cm, with Vancouver, which gets just 37 cm. Nor do differences in snowfall enter only into spending on snow removal: they also affect the cost of road repairs, and perhaps even the quality of public transit, or police and fire response times.

To take account of these and other legitimate differences in city situations, AIMS adjusted the data for more than 20 different “input” variables, everything from crime rates to road coverage to the percentage of families headed by a single parent. For each indicator, cities were assessed both an “absolute” and an “in-context” score. The differences between the two can be quite striking: Sudbury, for example, does quite poorly in absolute terms, but much better when the data is adjusted for context. Conversely, Vancouver scores much less well once its abundant natural advantages are taken into account.

Which is the more “accurate” measure? We’ll leave that to you to decide. On the one hand, one wants to be fair to cities facing special challenges. On the other hand, it can be too easy for a city to use these as an excuse. For those interested, both sets of measures are available online; here, for simplicity’s sake, we use either the in-context number, or for the overall rankings, a blended score.
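As a rough illustration of what “adjusting for context” and “blending” could mean in practice, the sketch below strips out the part of an indicator explained by input variables such as snowfall, then averages the absolute and in-context scores. The linear adjustment and the 50/50 blend are our assumptions, not AIMS’s published method.

```python
# Illustrative only: a simple way to produce an "in-context" number by
# removing what the input variables predict, then blending it with the
# absolute number. AIMS's actual adjustment is not specified in this article.
import numpy as np

def in_context_values(raw, inputs):
    """Residual of an indicator after a least-squares fit on context variables.

    raw    : (n_cities,) raw indicator values, e.g. snow-removal cost per capita
    inputs : (n_cities, k) context variables, e.g. annual snowfall in cm
    """
    raw = np.asarray(raw, dtype=float)
    X = np.column_stack([np.ones(len(raw)), inputs])   # add an intercept column
    coefs, *_ = np.linalg.lstsq(X, raw, rcond=None)    # ordinary least squares
    return raw - X @ coefs                             # what context can't explain

def blended_score(absolute_score, in_context_score, weight=0.5):
    """Blend the two measures; the equal weighting is an assumption."""
    return weight * absolute_score + (1 - weight) * in_context_score

# Example: snow-removal cost per capita, adjusted for annual snowfall (cm)
raw_cost = [120.0, 85.0, 60.0, 140.0]
snowfall = np.array([[212.0], [150.0], [37.0], [230.0]])
adjusted = in_context_values(raw_cost, snowfall)
```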

However you crunch the numbers, certain unmistakable trends emerge. The chart above compares how cities did on the efficiency and effectiveness scales. You can see they fall into four distinct groups. To the upper right, the three front-running cities—Burnaby, Surrey and Saskatoon—do particularly well on both tests; to the lower left, also-rans Charlottetown and Kingston score just as poorly on both.

Yet it’s the groups to the lower right and upper left that are perhaps the most intriguing. Longueuil and Gatineau rank first and second among all cities in efficiency terms, yet are dead last in effectiveness. At the opposite pole, you have five of the biggest cities in Canada—Toronto, Montreal, Vancouver, Edmonton and Calgary—which, while providing relatively good services, do so at comparatively high cost.

The other notable trend is geographic. Three of the top four performers overall come from the Lower Mainland of British Columbia: Vancouver and its satellites. Quebec also shows well, with three cities—Longueuil, Sherbrooke and Quebec City—in the top 10. Conversely, four of Atlantic Canada’s five entries finished in the bottom third of the rankings. The exception? Saint John, N.B., whose eighth-place showing puts it two spots ahead of Toronto.

What should we read into all this? Hard to say just yet—it will be interesting to see if these trends persist over time. But ultimately, it is the citizens of these cities who will make the call. As Cirtwill puts it, “we make no claims that this is the only way to measure how well city governments perform. It’s just another tool.”

Some cities did well in particular categories. Last-place Charlottetown can console itself that it was first in the government and finance category. Saskatoon, notwithstanding its recent designation by Maclean’s as the crime capital of Canada, shows up seventh in safety and protection, on the way to an outstanding second-place showing overall. That said, the safety category was the most spottily reported of the seven: fully one-third of the cities failed to provide enough data to be counted. This sort of collective reticence, says Cirtwill, is telling. “It’s an attitude that says ‘I work for you, but you’re all too stupid to understand what I do.’ ”

That points to both the strength and the weakness of this survey. On the one hand, it was carried out by an organization with an eight-year track record in measuring performance in the public sector. It was peer reviewed by leading experts in performance reporting and municipal governance. It drew upon previous work in the field, for example Ontario’s groundbreaking Municipal Performance Measurement Program, begun in 2000. And with more than 70 indicators, the chances of the results being skewed by one wonky data point are greatly reduced.

On the other hand, there are those gaps. Some of the data we do have may be open to challenge: 89 per cent of Toronto’s roads are in good condition? And a number of other standard indicators remain on our wish list, especially in the effectiveness department: for example, the percentage of council meetings with all councillors present. Police response times to emergencies. Number of days it takes to fix a pothole. We hope to include some of these in future instalments.

But in a way, it shouldn’t be up to us. For that matter, it shouldn’t be up to the cities, each releasing different bits of data as it sees fit. These are the sorts of basic indicators that all cities should be providing their citizens, as a matter of course. Though Canada is hardly alone in the sparseness of the data it collects and publishes on city governments, we are far behind countries like New Zealand or the United Kingdom, where reporting is mandatory across a broad range of categories.

Should other provinces follow Ontario’s lead? Is there a role for the federal government in setting compatible standards of reporting? Let the debate—and the bragging, complaining, and explaining away—begin.
