Mugshots to megabytes: facial recognition has made privacy protection more urgent than ever

The science fiction of the past is now the science reality of today.

Once a fanciful idea, facial recognition technologies have arrived, and their use, particularly by law enforcement agencies, raises serious concerns about personal privacy, not to mention other fundamental human rights and civil liberties. These technologies create a face-print from one or more existing photographs of an individual — basically a digital representation of the unique traits and characteristics of a human face.  Law enforcement can then compare this face-print against another image, say a photograph or video of a suspect at a crime scene, and if a match is found, assign an identity to the suspect — bingo, crime solved!

Well, not exactly … it turns out to be a lot more complicated than that.
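For readers curious about what the matching step actually involves, here is a minimal, purely illustrative sketch in Python using the open-source face_recognition library. The file names are hypothetical, and real systems used by police are far more elaborate; the point is simply that a "match" is a similarity score against a numerical face-print, not a definitive identification.

```python
# Illustrative sketch only: comparing a face-print from an existing photo
# against a face found in another image. File names are hypothetical.
import face_recognition

# Build a face-print (a numerical encoding) from an existing photograph.
known_image = face_recognition.load_image_file("existing_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Encode any faces found in a second image, e.g. a still from video footage.
candidate_image = face_recognition.load_image_file("video_still.jpg")
candidate_encodings = face_recognition.face_encodings(candidate_image)

for encoding in candidate_encodings:
    # A "match" means the distance between encodings falls below a tolerance
    # threshold; it is a probabilistic indication, not proof of identity.
    match = face_recognition.compare_faces([known_encoding], encoding)[0]
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"match: {match}, distance: {distance:.2f}")
```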

The practice of using photographs to identify criminals began in the 1840s. In 1888, Alphonse Bertillon, a French police officer and biometrics researcher, began taking the first mugshots using a uniform system of full face and profile views with standardized lighting and angles. I imagine, at the time, this was quite a time-consuming and cumbersome process.

Fast-forward to a little over a century later, and we now have computer software that can scrape and scan thousands upon thousands of images from the internet at incredible speeds. While facial recognition technology could offer law enforcement agencies efficiency and cost savings, it also has the potential to be highly intrusive unless clear legal controls and effective privacy protections are in place.

That is why today, my office joined the federal privacy commissioner and other privacy authorities from across Canada in issuing draft privacy guidance on the use of facial recognition technologies by police. This draft guidance is being made available for comment by stakeholders and the public. I’m particularly interested in hearing perspectives from Ontarians on the suitability of our current legal framework for regulating police use of facial recognition technologies.

For instance, there are long-standing rules strictly regulating how police can use DNA lawfully collected from convicted offenders to look for matches with crime scene samples that can help solve serious crimes. DNA samples can also be voluntarily provided by consenting individuals, victims, or their family members for humanitarian purposes, such as helping to find missing persons or identify human remains. Should similar rules be established to limit the collection and use of face-prints, which some could argue are an equally unique and sensitive form of biometric data?

The right to live free from state surveillance is a fundamental human right in our society. The Canadian Charter of Rights and Freedoms protects our ability as individuals to go about our daily lives without the risk of being routinely tracked and monitored. While surveillance technologies like facial recognition may contribute to public safety, much more than public safety alone must be considered.

Given the significant risks to privacy posed by facial recognition technologies, the draft guidance urges law enforcement agencies to obtain proper legal advice to confirm whether they have a lawful basis for using such technologies in their specific jurisdiction, whether by statute, common law, or judicial authorization. This requires not only identifying the source of lawful authority under existing law, but also confirming that their intended use of that authority is properly circumscribed, necessary, and proportionate in the particular context and circumstances of a given case.

Assuming lawful authority exists, the draft guidance also calls on law enforcement agencies to take a privacy by design approach from the very outset. Among other things, this requires them to conduct a thorough privacy impact assessment so they can properly anticipate, assess, and effectively address the risks by establishing appropriate limits and building strong privacy protections into the very design of proposed initiatives prior to deployment.

In addition, the guidance includes a number of other recommendations for police services, such as ensuring the accuracy of the information they gather and guarding against known risks of error. There are also potential risks of systemic discrimination and over-policing of marginalized communities arising from inherent forms of bias in these technologies.

The draft guidance recalls basic privacy obligations of law enforcement agencies to collect only the data they need and use it only for its intended purpose. Moreover, police services should consult with their data protection and human rights regulators, and strive to be transparent and open with the public if they are even considering using these technologies.

That last point should not go unheeded. The public needs to be able to trust that public servants, especially those sworn to protect them, will also respect their right to privacy and the right to live free from suspicion without just cause.

Transparency is key to ensuring public trust when it comes to facial recognition technologies and artificial intelligence (AI) systems more generally. Trust and transparency are at the core of the IPC’s recently adopted strategic priority on Next-Generation Law Enforcement. Under this priority, our goal is to help build public trust by working with relevant partners to develop the necessary guardrails for adopting new technologies in ways that protect both public safety and access and privacy rights. This draft guidance on the use of facial recognition by police, issued in collaboration with our federal, provincial, and territorial (FPT) counterparts and open for public consultation, is part of the contribution we hope to make.

In the end, engaging the public on questions about the proper use and limits of surveillance technologies, the nature and extent of the information they collect, the purposes for which they are used, and the safeguards in place to protect people’s privacy can help earn public confidence and go a long way toward fostering respect and trust between police and the communities they serve.

It will be interesting to hear what stakeholders think of these draft guidelines, including their views on the broader policy questions of whether facial recognition technologies should be used by law enforcement at all and, if so, in what circumstances and how. Hopefully, we will receive as broad a range of perspectives as possible, including from Ontarians!

Patricia
