

LORINC: Welcome to the City’s policy-by-surveillance


I enjoy the revelations of a muck-raking auditor-general’s report as much as the next red-blooded taxpayer, but I must confess a sense of unease about the way Beverly Romeo-Beehler presented her findings about the lingering malingering among Toronto’s tree crews – a report cleverly entitled, “Getting to the root of the issues.”

Released this week, the report is a follow-up to an earlier assessment, and shows that tree crews – both City staff and contractors – seem to be “productive” for only 3.5 hours in an eight-hour shift, with much of the rest of the day spent on long-ish breaks or waiting for parked vehicles to be moved, checking phones, and whatnot.

Romeo-Beehler’s audit team also spent over 500 hours last summer doing “physical observation” of said tree crews, evidence of which is offered, salaciously, with a video that shows hard-hatted guys not working. The segment, complete with telephoto shots and blurred faces, feels like something you’d find on an investigative TV newsmagazine show.

There’s also a bit of cognitive dissonance at play. The narrator and accompanying document note that the key (and unaddressed) failing has to do with improper management oversight. But we aren’t shown surveillance-type videos of managers not managing; after all, it’s not so easy to tell if someone sitting at their computer is shopping or crunching numbers, especially when working from home. The AG has opted, instead, to reinforce hoary old tropes about lazy civic employees and contractors – you know, the men and women who have no choice but to do their work in public, in full view of anyone with a cell phone.

When I saw the video, my mind went to a couple of places – first, that shameful and widely circulated photo taken of an apparently sleeping TTC ticket booth attendant, who, it turned out, was in poor health and subsequently died.

Second, I found myself wondering how far the AG’s office went to ensure that the individuals in these images could not be identified. Yes, their faces are digitally obscured. But de-anonymization techniques have become extremely sophisticated, highly effective, and readily available. In fact, one need only look at how various media and law-enforcement organizations used artificial intelligence and search tools to identify members of the Capitol Hill mob to recognize that a blurry or fragmented image doesn’t necessarily do the job.

“Reasonable steps were taken to ensure people in the photos would not be identifiable,” a spokesperson told me yesterday. “The Auditor General cannot comment further because, consistent with her standard practice, she does not comment about her reports before she has presented them to the Audit Committee.”

This admittedly minor episode suggests, to my eye, yet another step taken in the troubling direction of policy-by-surveillance. Everyone knows that law enforcement agencies have long used surveillance to do their work, but should panoptic observation techniques become a standard tool for policy-making? I’d hope we’re not going there.

As with so many things technological, this is a good-news-bad-news story, with an abundance of both nuance and red flags.

Earlier this month, for example, the Privacy Commissioner of Canada (PCC), together with counterparts in B.C., Alberta and Quebec, came down hard on the use of the Clearview AI app, which scrapes billions of freely accessible images from sites like Facebook and uses them to train smart-phone-based facial recognition software, making it possible to identify almost anyone at whom you can point your phone.

“Clearview asserted that the tool is intended for use by law enforcement for legitimate law enforcement and investigative purposes,” the report said. “A variety of organizations, including private sector entities, used this service via a free-trial service.”

The technology violated Canada’s privacy laws, the PCC concluded. “We found Clearview’s purposes to be inappropriate where they: (i) are unrelated to the purposes for which those images were originally posted; (ii) will often be to the detriment of the individual whose images are captured; and (iii) create the risk of significant harm to those individuals, the vast majority of whom have never been and will never be implicated in a crime.”

It’s easy to hate Clearview. Who, besides cops, wouldn’t?

Now consider the more subtle example of mobility tracking during the pandemic.

For several months, Apple and Google have teamed up with Environics Analytics to provide information on movement patterns, revealing, for example, specific neighbourhoods whose residents appeared to have travelled overnight beyond their postal code over the holidays (the data comes from cell phone signals). The Globe and Mail broke the story, and it brought an immediate response from policy-makers, beginning with the prime minister.

The software that aggregates and processes all that data, MobileScapes, is normally sold as a commercial analytics service to marketers, but it has been put to work assembling data for public health officials (via the media).

I understand that my phone generates a veritable geyser of location-specific information, most of which is consumed by companies trying to sell me stuff, or selling my data to their customers. For the record, I’m not worried that my whereabouts will be reverse engineered so I can be found by COVID cops, should I disobey public health edicts.

My point, rather, is that this type of mapping, undertaken in the public interest, is done mainly because it’s now possible, but without much consideration beyond utility. Last year, the Toronto Star published a series of similar findings showing that park use was way up, but the data wasn’t location specific; rather, it categorized the types of places people seemed to be going (retail, parks, etc.). Those insights turned out to be very useful in encouraging the City to do more to keep park spaces accessible in the cold months.

This latter use of aggregated mobility data stays on the sunny side of surveillance. But the movement analytics in the former example, which identifies what’s happening at the postal code level, looks, to my eye, like something a bit different, and potentially less benign. Yes, in this case, it revealed that the residents of wealthier neighbourhoods travel more at a time when they should be staying home. But it’s not difficult to imagine that these policy-surveillance techniques could be pressed into service for less agreeable ends.

The general point is that we aren’t really talking in a focused way about whether data surveillance should become a tool for policy-making, just as few people batted an eyelid when the city’s auditor-general hired PIs to stake out tree crews and then threw the results up on YouTube as if they’d just busted a trafficking ring.

It’s all just being normalized, one report at a time, and there will come a day, likely sooner rather than later, when the surveillance genie will be impossible to stuff back in the bottle.

photo by Carl Lender (cc)
