Disquiet still around ACC’s use of tool

Warren Forster
Questions are still being asked about the Accident Compensation Corporation’s use of a predictive software tool, amid continuing claims it could implicitly discriminate against clients on "problematic grounds such as age, ethnicity or gender".

Last September, Dunedin lawyer Warren Forster and several University of Otago academics linked to the "Artificial Intelligence and Law in New Zealand" project queried several aspects of the ACC’s use of the predictive tool.

Recent overseas media reports, including in the Guardian newspaper last August, have highlighted concerns that some predictive software systems used to predict reoffending risk in United States courts, and in the US insurance industry, are subject to racial bias. The Otago academics, including Associate Prof James Maclaurin of the Otago philosophy department, also issued a statement asking a series of questions about the software tool and its use.

The ODT later sought answers from the ACC, under the Official Information Act, and Mr Forster commented briefly on the answers.

Mr Forster said the tool could discriminate against ACC clients.

The ACC had done "the wrong thing, the wrong way", and it should be "completely transparent" about the tool’s use, Mr Forster added.

The Otago academics asked how accurate the tool was; if the ACC could explain how it worked so clients could appeal individual decisions; and if the tool’s use distorted the way the ACC pursued its "stated policy objectives".

James Maclaurin
The academics also asked if the ACC was "ducking its responsibility to make fair and humane decisions" about the treatment of New Zealanders in need by "passing the buck" to the machine, and if the tool implicitly discriminated against individuals on "problematic grounds such as age, ethnicity or gender".

Another query was whether ACC employees had been "effectively trained" in the use of the system, given the risks of falling into "autopilot" mode when guided by it.

ACC spokesman James Funnell said he was "quite concerned at the erroneous assumptions that seem to be underpinning the questions".

It was "important to understand that our predictive tools are not used to make cover or entitlement decisions", Mr Funnell said.

"We use the tools to help our staff focus their efforts on providing rapid support to those clients who need our help, ahead of those clients whose recovery is progressing as expected," he said.

"Gender and ethnicity are not factors, and testing has found negligible gender/ethnicity bias in the results of the tool."

The tools helped the ACC "make fairer and more humane decisions by allowing us to focus our resources more rapidly on the clients who most need our help".

The tools were not distorting the ACC’s pursuit of its policy objectives, Mr Funnell said. They provided a "prediction of the likelihood that a client will require weekly compensation within the first 28 days" after an injury, and a predicted range for the expected length of time a client would need to receive weekly compensation.

But no decisions on the "length of time a client actually receives weekly compensation" were made using this tool.

To predict the support needs of ACC clients, the ACC had previously used a forecasting tool called the Medical Disability Adviser (MDA), based on American insurance and health industry data.

But about "80% of predictions" were inaccurate.

The ACC subsequently recognised the need to "better serve our clients"  by developing "more sophisticated predictive tools" based on the data the ACC held.

In developing those tools in 2013, data was taken from 364,000 claims — all the claims that involved weekly compensation payments — from the previous seven years. Staff training had a "strong focus on doing the right thing, in the right way, at the right time".

This included how to use the various tools and information resources available, including "our predictive tools", he said.
