Thousands Of Mathematicians Call For Boycotting Predictive Crime A.I. From Police
by Aaron Kesel for Activist Post
After a flurry of police brutality cases this year and protests swarming U.S. streets, thousands of mathematicians have joined scientists and engineers in calling for a boycott of artificial intelligence used by law enforcement.
Over 2,000 mathematicians have signed a letter calling for a boycott of all collaboration with police and urging their colleagues to do the same in a forthcoming publication of the American Mathematical Society, Shadowproof reported.
The call to action for the mathematicians came after the police killings of George Floyd, Tony McDade, Breonna Taylor, and many others this year.
“At some point, we all reach a breaking point, where what is right in front of our eyes becomes more obvious,” says Jayadev Athreya, a participant in the boycott and Associate Professor of Mathematics at the University of Washington. “Fundamentally, it’s a matter of justice.”
The mathematicians wrote an open letter, collecting thousands of signatures for a widespread boycott of algorithmic policing. Every mathematician within the group’s network pledges to refuse any and all collaboration with law enforcement.
The group is organizing a wide base of mathematicians in the hopes of cutting off police from using such technologies. The letter’s authors cite “deep concerns over the use of machine learning, AI, and facial recognition technologies to justify and perpetuate oppression.”
Predictive policing is one key area where some mathematicians and scientists have helped build racist algorithms, which tell cops to treat specific areas as “hotspots” for potential crime. Activists and organizations have long criticized the bias in these practices. Algorithms trained on data produced by racist policing will reproduce that prejudice to “predict” where crime will be committed and who is potentially a criminal.
“The data does not speak for itself, it’s not neutral,” explains Brendan McQuade, author of Pacifying the Homeland: Intelligence Fusion and Mass Supervision. Police data is “dirty data,” because it does not represent crime, but policing and arrests.
“So what are its predictions going to find? That police should deploy their resources in the same place police have traditionally deployed their resources.”
Many, if not all, U.S. states and major cities are thought to use some type of predictive policing or pre-crime software, with known users including Chicago, Atlanta, Tacoma, New York, and Los Angeles, though not without public protest. As Activist Post previously reported, many of these jurisdictions are using Palantir software for their predictive crime algorithms and have been exposed for doing so, like Florida, where police terrorized and monitored residents of Pasco County.
Police organizations across the U.S. have been using what are known as “heat lists,” or pre-crime databases, for years. What is a “heat list,” you may ask?
Well, “heat lists” are essentially algorithmically compiled databases of people whom police suspect may commit a crime. Yes, you read that right: a person who might commit a crime. How these lists are generated, and what factors determine that an individual “may commit a crime,” is unknown.
Activists and journalists sued the Chicago Police Department in 2017 for failing to disclose how these programs operate, as Activist Post reported.
Chicago wasn’t the only major police department exposed for using predictive crime algorithms. The Los Angeles Police Department was also caught one year later, in 2018, by activists from the Stop LAPD Spying Coalition, as Activist Post reported.
The heat list idea in local law enforcement actually originated in Miami and was then rolled out in Chicago in 2013. However, Activist Post may have missed other cities that gained less media attention; and as this writer will discuss shortly, the idea traces back to a federal database.
A paper released last year by MIT, entitled “Technical Flaws of Pretrial Risk Assessments Raise Grave Concerns,” was signed by some of the highest-level university experts in the fields of A.I. and law, who warn about the “technical flaws” of these pre-crime systems, Activist Post reported.
Fortunately for us, as Nicholas West noted, the pushback has already started in several cities, and a few police departments have dropped their programs after becoming aware of the inaccuracies. In 2018, for example, New Orleans suspended its pre-crime program, which had run for six years, after its secret predictive policing software was exposed.
The scariest part of all this is that the New Orleans and LA police departments were both linked to Palantir Technologies, which works directly with the CIA and is suspected of being the current fork of the PROMIS Main Core software. PROMIS pre-dates all of these local police heat lists; its algorithms put suspected “domestic terrorists” onto their own round-up lists and tracked highly scrutinized purchases. It was created by Oliver North for President Ronald Reagan and Vice President George H.W. Bush under FEMA’s Readiness Exercise 1984 (REX-1984).
The use of Palantir’s pre-crime algorithm software suggests that other police departments may be using the same software for their own pre-crime programs. Palantir is also the company working with the U.S. Immigration and Customs Enforcement agency on its own lists to catch illegal immigrants, as Activist Post and investigative journalist Barrett Brown originally reported.