The Edmonton Police Service’s (EPS) plan to deploy artificial intelligence (AI) facial recognition on officers’ body-worn cameras should alarm Edmontonians. EPS calls this a world-first pilot with Axon, a U.S. body-camera manufacturer. But turning cameras that were sold as tools for police transparency and accountability into instruments of mass surveillance and identification is a dangerous shift in policing.
Body-worn cameras were introduced for greater police accountability. The bargain was simple: Cameras would record police–citizen interactions to create an objective record, deter misconduct and improve transparency.
That social contract is now being rewritten by EPS. Instead of primarily watching the police on the public’s behalf, the cameras are being retrofitted to watch the public on the police’s behalf. Flipping an accountability tool into a surveillance tool is a serious breach of public trust.
According to EPS, this “proof of concept” will run through December, testing whether the system can flag “individuals with safety flags” or serious warrants. The software will run in “silent mode”: officers will not see live alerts; footage will be reviewed later for algorithmic matches.
But “silent mode” does not mean benign. Anyone who walks past a participating officer may be scanned and compared to police databases, without knowledge or consent. The public is treated as a pool of potential suspects, their faces run against criminal databases simply for going about their day.
Edmonton is also being used as a global testing ground for Axon’s newest surveillance product. No other police service in the world has deployed facial recognition on body-worn cameras. Edmontonians are, in effect, test subjects for a technology Axon may wish to market elsewhere.
This move contradicts Axon’s own prior commitments. In 2019, after reviewing the evidence, Axon’s independent AI Ethics Board concluded that facial recognition was too flawed and too biased to be ethically deployed on body-worn cameras and recommended that it not be used in that way.
Axon publicly pledged to keep facial recognition off its cameras and to shelve such plans. For Edmonton now to trial the very thing Axon’s experts warned against is baffling and risky, and suggests an eagerness within EPS to embrace controversial surveillance technology without fully weighing the consequences.
The plan should also alarm anyone who values Charter-protected freedoms. If police can identify and track people in real time or near-real time, there is a chilling effect on fundamental rights. People may think twice about attending a lawful protest or being in certain public spaces if they know their face could quietly end up in a police database.
Section 8 of the Charter guarantees the right to be secure against unreasonable search and seizure. Running everyone’s face through police databases — without suspicion, warrant or consent — is hard to reconcile with that protection. Section 7’s guarantee of life, liberty and security of the person, and Section 15’s guarantee of equality, are also engaged when constant biometric monitoring falls hardest on racialized and marginalized communities.
Surveillance technologies are not neutral. Facial-recognition systems have repeatedly shown higher error rates for women and people of colour. If those biases translate into more false accusations, investigative stops or “matches” on racialized Edmontonians, we are importing inequality into the heart of policing. Mass biometric surveillance in public spaces pushes the limits of our constitutional rights and expectations of privacy and may cross into a Charter grey zone that courts have not definitively examined in Canada.
Given Axon’s own 2019 report, which was highly critical of deploying facial recognition on body-worn cameras, has the technology suddenly become accurate and fair enough to justify this rollout? No. Even in 2025, facial recognition remains deeply flawed and biased — impressive in controlled conditions with clear, front-facing images of well-represented groups, but far less reliable in the real world of moving officers, crowded streets and poor lighting.
Edmonton should not be first in line to absorb the legal and social costs of this experiment. Before EPS turns its accountability cameras into roaming ID scanners, Edmontonians deserve something they have not yet had: choice, and an open debate.
Gideon Christian, PhD, is Associate Professor and University Research Chair (AI and Law) at the University of Calgary.