Police confirm links with security software firm

Police have been working for more than three years with the company behind security software used to identify potential shoplifters in a Dunedin supermarket.

Yesterday, the Otago Daily Times revealed some Foodstuffs NZ supermarkets in the North Island used facial recognition CCTV systems, while other stores in the South Island, including Centre City New World in Dunedin, used the Auror system.

Auror captured in-store images and licence plate numbers to identify potential offenders more easily, Foodstuffs head of external relations Antoinette Laird said.

Founded in New Zealand, Auror offers software platforms designed to "help police and retail businesses collaborate and fight crime", its website says.

A police spokesman confirmed police worked with Auror to reduce retail crime, and said information retailers provided to the company's software system was also shared with police.

"This information would include the time and nature of the crime, victim/witness statements and any CCTV images held by the business.

"This assists police in investigating offences and gathering evidence, especially when it comes to repeat offenders."

Police and Auror had worked together for three and a half years, the spokesman said.

Asked whether police were involved with the automatic facial recognition systems now known to be used in some North Island Foodstuffs stores, he said police did not receive any information from such systems.

"... and we understand that a facial recognition function is not a feature of Auror," he said.

Privacy Commissioner John Edwards said his office had not examined the facial recognition CCTV systems used by Foodstuffs in the North Island, but any such technology "runs the risk of misidentifying people".

Mr Edwards cited a study on bias in facial recognition software by US computer scientist Joy Buolamwini, a researcher at the MIT Media Lab.

The study showed the software was much more likely to misidentify darker-skinned people than lighter-skinned people.

"Gender was misidentified in less than 1% of lighter-skinned males; in up to 7% of lighter-skinned females; up to 12% of darker-skinned males; and up to 35% of darker-skinned females," the study found.

Mr Edwards said he expected companies or agencies using facial recognition systems "to have a high level of scrutiny over how accurate it is and how thoroughly it has been tested for use in New Zealand".

"Don't leave it up to automated systems alone. When it comes to identifying people accused of a crime, getting it wrong can have a severe impact on the person affected."

george.block@odt.co.nz
