AI is joining the police force

India is looking to put smart cops on the beat using artificial intelligence. The state of Assam says it is building AI-powered analysis capabilities for use in predictive policing, aka identifying potential criminal activity à la Tom Cruise in the hit film Minority Report.

“A Big Data analysis facility will be raised under the project, which will be based on artificial intelligence technology,” Assam finance minister Himanta Biswa Sarma says. “This will be a futuristic step for predictive policing.”

The aim of the AI-based system is to comb local and national police records to profile likely repeat offenders, helping local forces monitor their activities and prevent them from committing further crimes.

US-based predictive policing outfit PredPol is believed to have worked with police forces in both the US and UK, and to hold sensitive crime data indefinitely on its servers. PredPol uses a machine-learning algorithm to calculate predictions across three data points – crime type, crime location and crime date/time.

The outfit says its model is built on three aspects of offender behaviour (a rough sketch in code follows the list):

Repeat victimisation – if a house is broken into today, the risk that it is broken into again tomorrow actually goes up.

Near-repeat victimisation – not only is that one house at greater risk of being broken into again, the neighbours' houses are also at greater risk.

Local search – offenders rarely travel far from their key activity points, such as home, work and play locations, so crimes tend to cluster.
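PredPol has not published its code, so the sketch below is not its algorithm. It is a minimal toy in Python that encodes the three behaviours above as a decaying risk score over map grid cells, using only the three data points the article mentions (crime type, location and date). The Crime class, the cell_risk function and all the weights are hypothetical.

```python
from dataclasses import dataclass
from math import exp

# Toy illustration only -- not PredPol's actual model. It scores a grid cell
# using repeat victimisation, near-repeat victimisation and local clustering.

@dataclass
class Crime:
    crime_type: str
    cell: tuple[int, int]   # grid cell (x, y) standing in for a location
    day: int                # day the crime was reported


def cell_risk(history: list[Crime], cell: tuple[int, int], today: int,
              crime_type: str = "burglary") -> float:
    """Score one grid cell from past crimes of the given type."""
    risk = 0.0
    for c in history:
        if c.crime_type != crime_type or c.day > today:
            continue
        # Older crimes matter less: exponential decay with time.
        decay = exp(-0.1 * (today - c.day))
        distance = abs(c.cell[0] - cell[0]) + abs(c.cell[1] - cell[1])
        if distance == 0:
            # Repeat victimisation: the same cell gets the full boost.
            risk += 1.0 * decay
        elif distance == 1:
            # Near-repeat victimisation: neighbouring cells get a smaller boost.
            risk += 0.5 * decay
        # Local search: distant cells contribute nothing in this toy model,
        # so predicted risk clusters around recent crime locations.
    return risk


# A burglary two days ago makes that block and its neighbours the
# highest-risk cells today.
history = [Crime("burglary", (3, 3), day=8)]
print(cell_risk(history, (3, 3), today=10))  # same cell -> highest score
print(cell_risk(history, (3, 4), today=10))  # neighbour -> smaller score
print(cell_risk(history, (9, 9), today=10))  # far away -> 0.0
```

In a real deployment the scores would be recomputed over a whole city grid and the top-scoring cells flagged for patrols; the weights and decay rate here are placeholders chosen only to make the three behaviours visible.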

Unsurprisingly, this is controversial stuff. Human rights pressure group Liberty is asking UK police forces to abandon trials of systems that predict where crimes are likely to happen and whether individuals are likely to reoffend. It believes at least 14 UK forces are testing or developing predictive policing systems.

In its recent Policing by Machine report, Liberty says these systems can entrench bias, while their algorithms are impossible to scrutinise.

Late last year, police deployed live facial recognition technology in central London using an AI-based system able to scan 300 faces per second.

Addressing the use of such solutions, civil liberties and privacy campaign outfit Big Brother Watch said that monitoring innocent people in public is a breach of fundamental rights to privacy and freedom of speech and assembly.

And recently, an Amazon shareholder group made a move to keep the online giant from marketing its Rekognition facial recognition technology to law enforcement. “Tests of the technology have raised concerns that it is biased, inaccurate and dangerous,” the shareholders said.

Brexit slams German jobs

Adidas accelerates in Paris