Facial recognition technology gains popularity with police, intensifying calls for regulation
CBC
Some police services in Canada are using facial recognition technology to help solve crimes, while other police forces say human rights and privacy concerns are holding them back from employing the powerful digital tools.
It's this uneven application of the technology — and the loose rules governing its use — that has legal and AI experts calling on the federal government to set national standards.
"Until there's a better handle on the risks involved with the use of this technology, there ought to be a moratorium or a range of prohibitions on how and where it can be used," says Kristen Thomasen, law professor at the University of British Columbia.
As well, the patchwork of regulations on emerging biometric technologies has created situations in which some citizens' privacy rights are more protected than others.
"I think the fact that we have different police forces taking different steps raises concerns [about] inequities and how people are treated across the country, but [it] also highlights the continuing importance of some kind of federal action to be taken," she said.
Facial recognition systems are a form of biometric technology that use AI to identify people by comparing images or video of their faces — often captured by security cameras — with existing images of them in databases. The technology has been a controversial tool in police hands.
In 2021, the Office of the Privacy Commissioner of Canada found that the RCMP violated privacy laws when it used the technology without the public's knowledge. That same year, Toronto police admitted some of their officers had used facial recognition software without informing their chief. In both cases the technology was supplied by U.S. company Clearview AI, whose database was composed of billions of images scraped from the internet without the consent of the people pictured.
Last month, York and Peel police in Ontario said they had begun implementing facial recognition technology provided by multinational French company Idemia. In an interview, York police Const. Kevin Nebrija said the tools "help speed up investigations and to identify suspects sooner," adding that in terms of privacy, "nothing has changed because security cameras are all around."
Yet in neighbouring Quebec, Montreal Police Chief Fady Dagher says the force will not adopt such biometric identification tools without a debate on issues ranging from human rights to privacy.
"It's going to be something that is going to take a lot of discussion before we think about putting in place," Dagher said in a recent interview.
Nebrija stressed that the department consulted the Privacy Commissioner of Ontario for best practices, adding that the images police will acquire will be "obtained lawfully," either with the co-operation of security camera owners or by obtaining court orders for the images.
And although York police insist officers will seek judicial authorization, Kate Robertson, a senior researcher at the University of Toronto's Citizen Lab, said Canadian police forces have a history of doing just the opposite.
Since the revelations about Toronto police using Clearview AI between 2019 and 2020, Robertson said she is "still not aware of any police service in Canada that is obtaining prior approval from a judge to use facial recognition technology in their investigations."
According to Robertson, getting the go-ahead from a court, usually in the form of a warrant, represents the "gold standard of privacy protection in criminal investigations." It ensures that a facial recognition tool, when used, is appropriately balanced against the right to free expression, freedom of assembly and other rights enshrined in the Charter.