According to a report released today, law enforcement agencies across Canada are using controversial algorithms to predict where crimes could occur, who might go missing, and where officers should patrol, despite significant human rights concerns.
While ‘Minority Report’ relied on psychics, this is predictive data modeling. The technology is new to Canada, but it has already drawn concern in the US: in 2013, New Orleans secretly procured software from Palantir Technologies without notifying the City Council. Palantir was founded with seed money from the CIA’s venture capital fund and, by 2018, was valued at over $20 billion.
The report, To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada, breaks down how police are using, or contemplating using, algorithms for several purposes, including predictive policing, which draws on historical police data to forecast where crime will occur.
In Canada, for example, two agencies, the Vancouver Police Department and the Saskatoon Police Service, use algorithms to review data and predict which individuals might go missing, and could extend the technology throughout the criminal justice system in the future.
To make these predictions, law enforcement combs through troves of public data, including social media posts, and then applies facial recognition against existing mugshot databases for investigative purposes. The data also includes location-focused information, a fancy and subtle term for historical GPS data.
The report also reveals information suggesting that the Ontario Provincial Police and the Waterloo Regional Police Service may be unlawfully intercepting private communications in online private chat rooms by relying on an algorithmic social media surveillance technology known as the ICAC Child On-line Protection System (ICACCOPS).
The report warns of dire consequences for civil liberties and privacy. The use of recent technologies such as drones, cameras that detect whether people are wearing masks, and creepy smart dust has already raised alarms among civil liberty groups as technology takes a forefront in the surveillance of humans.