Computerised crime fighting needs support

Geoffrey Barnes, visiting authority on artificial intelligence and crime, at the University of Otago yesterday. Photo: Linda Robertson
Computerised crime-fighting programmes are useless without the proper support systems in place, a world authority on artificial intelligence and crime says.

Police forces worldwide are increasingly experimenting with algorithmic computer programmes that try to predict likely future offenders.

Geoffrey Barnes, director of criminology for Western Australia Police and a lecturer at the University of Cambridge, is in New Zealand to advise academics and politicians about the capabilities and limitations of this emerging technology.

"The algorithm is designed to identify a group of people who can be helped," Dr Barnes,  in Dunedin for a meeting organised by the University of Otago  artificial intelligence project, said.

"The responses we put in place for what the algorithm has forecast are crucial."

Dr Barnes has worked closely with Durham police in northern England on developing the harm assessment risk tool (Hart), a project which has attracted considerable media attention in Britain.

Hart sorts offenders into high-, moderate- and low-risk categories for reoffending: moderate-risk offenders are enrolled in Durham's Checkpoint programme, an intensive treatment regime designed to address the drivers of criminal behaviour.

The programme is specifically designed to err on the side of caution, which means high-risk offenders should not be treated lightly, but it also runs the risk that some low-risk offenders may receive intensive treatment they do not need.

"It also means that if we say they are low-risk, we really mean it," Dr Barnes said.

"Low-risk people are actually a problem, because the criminal justice system is a gigantic vacuum cleaner.

"It pulls in people who had one bad day and they’re not going to have another bad day any time soon, but we bring them in and treat them like criminals ... which can make things worse.

"If you focus on people who are going to reoffend, that is a needle you can move, you can see if you have had an effect."

Early results suggested Hart had a 98% success rate in picking low-risk offenders and an 88% strike rate in picking high-risk offenders.

However, just like a human, the algorithm could make a mistake, something policy makers had to accept if they were to pursue AI policing, Dr Barnes said.

Hart was developed using five years of offending history data, but he acknowledged it had weaknesses.

It had no data on offending recorded in other databases, and it also ran the risk of mirroring any biases in the original data.

However, programmers believed they could address the issue of bias, a regular criticism of AI crime-fighting programmes, through how the data was used.

Similar tools could be developed for all kinds of offending, Dr Barnes said.

In his new role in Western Australia he is looking at developing AI tools to predict offending patterns in driving and domestic violence crimes.

mike.houlahan@odt.co.nz
