Evasive Accountability: A New Norm for Police and Security Services in Canada
Published on September 10, 2021

 

Since the founding of this country, a totalitarian, closed form of government has been considered unacceptable and un-American. The public assumes they have the freedom to be left alone and to live a life in privacy, while the government is believed to be open and transparent, humbly serving the people.

In reality, this patriotic belief is no longer true: it now appears that we are to serve the political class as our personal lives are forced to become more transparent—while the decision makers operate in complete secrecy.

– President John F. Kennedy, New York City, April 27, 1961

 

The end of carding has been perceived as a significant loss of police powers. Many officers now feel they no longer have the authority to approach and question suspicious individuals, that they are more likely to face public complaints if they exercise initiative, and that they are particularly vulnerable when dealing with persons of colour.

The collection and submission of information about persons of interest is, however, not about to cease with the elimination of carding. That information will now simply be gathered by other means, such as facial recognition technology (FRT). More concerning is the fact that this information is now being gathered and incorporated into police intelligence databases without public knowledge.

Canadians should be alarmed by the fact that the RCMP has already been caught in a lie about its use of facial recognition technology.  

In an official response to a written query from the CBC earlier this year (January 17, 2021), the RCMP stated: “The RCMP does not currently use facial recognition software.” The response went on to state: “However, we are aware that some municipal police services in Canada are using it.”1

Such technologies are neither cheap nor easy to incorporate; they require substantial investment in equipment, resources, and technical expertise. It is simply unfathomable that the RCMP’s senior management, including those who control the purse strings, did not know of the acquisition, incorporation, and application of this technology at the time of that statement.

Now an investigation by the Office of the Privacy Commissioner of Canada (OPC) has found that the RCMP’s use of facial recognition technology to conduct hundreds of searches of a database compiled illegally by a commercial enterprise violated the Privacy Act.2

The OPC shared its findings regarding the RCMP’s use of Clearview AI, a technology company that was itself the subject of a previous OPC investigation. Clearview AI was found to have violated Canada’s federal private sector privacy law by creating a database of more than three billion images scraped from internet websites without users’ consent.3 

OPC Commissioner Daniel Therrien says: “The use of FRT by the RCMP to search through massive repositories of Canadians who are innocent of any suspicion of crime presents a serious violation of privacy.” The commissioner added: “A government institution cannot collect personal information from a third party agent if that third party agent collected the information unlawfully.”4

While the OPC maintains the onus was on the RCMP to ensure the database it was using was compiled legally, the RCMP argued that doing so would create an unreasonable obligation and that the law does not expressly impose a duty to confirm the legal basis for the collection of personal information by its private sector partners. Therrien says this is just another example of how public-private partnerships and contracting relationships involving digital technologies are creating new complexities and risks for privacy.5

This in effect amounts to an attempt by a security service to launder its illegal activities through a third party. 

Therrien says: “Activities of federal institutions must be limited to those that fall within their legal authority and respect the general rule of law. We encourage Parliament to amend the Privacy Act to clarify that federal institutions have an obligation to ensure that third party agents it collects personal information from have acted lawfully.”6

The substantive report, “Joint Investigation of Clearview AI, Inc. by the Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information Privacy Commissioner of Alberta,” made another alarming statement—that Clearview expressly disagreed with the privacy commissioners’ findings:

In disagreeing with our findings, Clearview alleged an absence of harms to individuals flowing from its activities. In our view, Clearview’s position fails to acknowledge: (i) the myriad of instances where false, or misapplied matches could result in reputational damage, and (ii) more fundamentally, the affront to individuals’ privacy rights and broad-based harm inflicted on all members of society, who find themselves under continual mass surveillance by Clearview based on its indiscriminate scraping and processing of their facial images.7

It should be equally disturbing that the RCMP, like Clearview, did not accept the OPC’s conclusion that it had contravened the Privacy Act:

The RCMP is no longer using Clearview AI as the company ceased to offer its services in Canada last summer in the wake of our then ongoing investigation. However, the OPC remains concerned that the RCMP did not agree with its conclusion that it contravened the Privacy Act.8

The gap between oversight and police practices has widened dangerously. Police associations, and police organizations such as the Canadian Association of Chiefs of Police, have become influential lobbyists for pro-police policies and support. Armed with increasingly complex and integrated reports, these organizations are able to influence boards and municipal councils, which lack the depth of understanding or the integrated critical view needed to counter the lobbying.

The current state of oversight is contributing to policing that is becoming remarkably unaccountable.

The RCMP is not alone in misrepresenting or evading scrutiny of new technologies. In January 2020, we learned that the Toronto Police Service (TPS) was also using Clearview AI and other technology to surreptitiously record and identify potential suspects. The TPS was additionally using a device known as a StingRay, which allows police to track people’s whereabouts and intercept cellphone communications through their phones without a warrant. In that instance, even Chief Mark Saunders suggested that he was not aware that his officers were using Clearview AI for surveillance of Toronto’s citizens.9

How the TPS board, its chief, the RCMP commissioner, and Public Safety Minister Bill Blair (himself a retired TPS chief) could all have been unaware of these significant activities is perplexing and verges on incompetence.10

As I noted in an earlier article (Municipal Police Services Boards: The Widening Gap in Oversight), and it is worth repeating: we should be able to have faith in our security services and be assured of their commitment to fairness, transparency, and accountability. However, if history and past performance are any indication of the potential for unchecked power to lead to abuse, we must remain vigilant and demand verifiable transparency and accountability as a measure of the legitimacy of our criminal justice system.

Retired TPS board chair Alok Mukherjee, now a professor at Ryerson University, has raised alarms about the TPS using technologies that fundamentally alter surveillance and privacy without the prior knowledge of the service’s board.11

Until recently, police services boards have had to deal with fairly routine policing policy-making. Even issues related to the management of information—data retention policies, recording of race-based data, or carding—although highly contentious, have been issues that most board members, as representatives of the community, could understand and govern on the basis of normative and empirical evidence.

However, there have been significant developments over the past two decades that challenge the board’s ability to understand and oversee twenty-first century policing. The emergence of widespread surveillance systems, facial and voice recognition technologies, and sources of widely available data on publicly held platforms like Google, Apple, and Amazon pose new challenges for police boards. 

According to a report by the New York Times, more than six hundred law enforcement agencies started using Clearview in the past year without any public scrutiny.12 Most police services already rely on accessing data from technologies ranging from doorbell cameras to in-home voice assistants like Alexa and Google Home Mini, and yet it is unlikely that most boards have the expertise or information to grasp the scope and implications of such policing practices. The introduction and adoption of such technologies and analytics by police services have far outpaced the capacity of boards to constrain the unintended consequences of collecting massive amounts of data on mostly unaware citizens who were never subject to a formal investigation.

It is worth repeating a portion of the privacy commissioner’s statement following the RCMP’s use of Clearview AI:

Notably, we found there were serious and systemic gaps in the RCMP’s policies and systems to track, identify, assess and control novel collections of personal information through new technologies.13 

Police use of facial recognition technologies, with its power to disrupt anonymity in public spaces, and enable mass surveillance, raises the potential for serious privacy harms unless appropriate privacy protections are in place.14

Canadians must be free to participate in the increasingly digital, day-to-day activities of a modern society without the risk of their activities being routinely identified, tracked and monitored.15

While certain intrusions on this right can be justified in specific circumstances, individuals do not forgo their right to privacy merely by living and moving in the world in ways that may reveal their face to others, or that may enable their image to be captured on camera.16

Technological advances like those of Clearview AI and StingRay, artificial intelligence, automation, and other emerging systems for surveillance and records management have relegated police oversight and boards to rubber-stamping budgets and complaints. In effect, boards are threatened with becoming relics of the twentieth century, without meaningful capacity for twenty-first-century oversight.

The most striking example of the next approach to profiling is the application of artificial intelligence to predictive policing, exemplified by tools such as PredPol. Data analysis is used to predict the locations, times, and types of crimes expected to be committed, and even to predict which individual is expected to commit a crime at a designated time and place. Facial recognition technology, traffic cameras, closed-circuit surveillance cameras, police-worn body cameras, in-car cameras, and drone surveillance are just a few of the new tools for collecting intelligence.

A lack of technological expertise and understanding of surveillance technology on police services boards has contributed to a widening gap between oversight and policing. Chiefs have incorporated training and technologies in surveillance and records management as routine analytical investments, yet these investments are changing the very nature and course of future surveillance and policing—changes that affect human rights, privacy laws, and constitutional freedoms. Worse, police agencies are now openly defying the findings of oversight bodies.

The improper application of carding did much harm to police-community relations and significantly eroded the legitimacy of community policing programs, especially within marginalized communities. Without oversight and accountability, we are certain to face the same, if not worse, misuses of new technologies; the fact that two of Canada’s largest police services have been less than forthright is highly alarming.

Most of us will never know how surveillance data are used, where they are retained, for how long, if and how they are shared with other agencies or jurisdictions, or how they impact our security assessment; and most importantly, who determines the relevance of the data collected. We must rely on the ability and veracity of boards providing oversight on our behalf.

Understanding and safeguarding personal freedom and civil rights is today more critical than ever. The pace at which artificial intelligence is being developed and incorporated is far outpacing the regulatory and ethical frameworks required to prevent its intrusion, unintended or deliberate, into hard-won civil liberties. We are now in the era of predictive policing, geo-profiling, and crime prevention—carding 2.0—and need to ask the tough questions about what that means.

Police oversight, accountability, and transparency are at the crux of the police reform movement. Policies that define and guide the collection, interpretation, and application of police information and intelligence should be among the issues at the forefront of any police reform agenda, and now is the time for those interested in substantive reform to step up and take part. 

The OPC is currently seeking feedback from interested parties on the use of facial recognition technology by police and security services, inviting written submissions from stakeholders both on its draft guidance and on the broader legal and policy framework for police use of facial recognition.

The website notes:

Canada’s federal, provincial and territorial privacy protection authorities have jointly developed guidance for police agencies to clarify their privacy obligations with respect to their use of facial recognition (FR) technology, with a view to ensuring any use of FR complies with the law, minimizes privacy risks, and respects privacy rights. This guidance is meant for federal, provincial, regional and municipal police agencies. It is not intended for other public organizations outside of the police that are involved in law enforcement activities (for example, border control), nor for private sector organizations that carry out similar activities (for example, private security).17

This is an important and rare opportunity for those seeking police reform to have direct and meaningful engagement in determining the terms under which our police and security services will operate for decades to come. Facial recognition technology is the new carding, and anyone who had concerns about who was carded, where that information ended up, with whom it was shared, how it was used, and how long it was retained should be at the front of the line to participate in this process.

Demanding reform is one thing; understanding how to effect reform is another. Recognizing, seizing, and capitalizing on such opportunities is far more important than protesting after the fact.

 

Anil Anand is a research associate at the Frontier Centre for Public Policy.

 

Endnotes

  1. Catharine Tunney, “RCMP Denied Using Facial Recognition Technology—Then Said It Had Been Using It for Months,” CBC, March 4, 2021, https://www.cbc.ca/news/politics/clearview-ai-rcmp-facial-recognition-1.5482266.
  2. Office of the Privacy Commissioner of Canada, “RCMP’s Use of Clearview AI’s Facial Recognition Technology Violated Privacy Act, Investigation Concludes,” June 10, 2021, https://www.priv.gc.ca/en/opc-news/news-and-announcements/2021/nr-c_210610/.
  3. Ibid.
  4. Ibid.
  5. Ibid.
  6. Ibid.
  7. OPC, “Joint Investigation of Clearview AI, Inc. by the Office of the Privacy Commissioner of Canada, the Commission d’accès à l’information du Québec, the Information and Privacy Commissioner for British Columbia, and the Information Privacy Commissioner of Alberta,” PIPEDA Report of Findings #2021-001, February 2, 2021, https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2021/pipeda-2021-001/.
  8. OPC, “News Release: RCMP’s Use of Clearview AI’s Facial Recognition.”  
  9. Staff, “Toronto Police Admit Using Secretive Facial Recognition Technology Clearview AI,” CBC, February 13, 2020. Accessed February 20, 2020, https://www.cbc.ca/news/canada/toronto/toronto-police-clearview-ai-1.5462785.   
  10. Ibid.
  11. Alok Mukherjee, “Public Being Kept Out of the Loop on Facial Recognition Technology,”
  12. Kashmir Hill, “The Secretive Company that Might End Privacy as We Know It,” New York Times, January 18, 2020. Accessed February 21, 2020, https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.
  13. OPC, “Statement by the Privacy Commissioner of Canada Following an Investigation into the RCMP’s Use of Clearview AI,” June 10, 2021, https://www.priv.gc.ca/en/opc-news/speeches/2021/s-d_20210610/.
  14. Ibid.
  15. Ibid.
  16. Ibid.
  17. OPC, “Notice of Consultation and Call for Comments—Privacy Guidance on Facial Recognition for Police Agencies.” Accessed June 16, 2021, https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/notice_frt/.

 


Photo by Daniel Tafjord on Unsplash.
