Govt rolls out new facial recognition tech

The government is pushing ahead with new facial recognition technology despite concerns about flawed metrics for assessing individuals' identities.

The government begins rolling out new facial recognition technology next week despite an "untested risk" around racial bias.

No tests for bias have been done on New Zealand's specific population mix, even though the Department of Internal Affairs has been developing the new tech for four years.

US testing raises more questions.

Identity Check uses facial recognition tech from Irish company Daon to match a live image you take on a phone with your driver's licence or passport photo in government databases.

The Ministry of Social Development will be the first to go live with it on November 20. Beneficiaries may choose to use it, or stick with existing verification systems.

But a recent MSD report calls the level of racial bias in the technology "unknown", "unconfirmed" and "untested".

"Before MSD can include within scope the use of Identity Check information from DIA for fraud investigation purposes, the current unknown level of bias in the facial recognition algorithm DIA use will first need to be better understood," according to a report in September, released under the Official Information Act.

"This is because if there is any bias in their technology, any clients who are affected, which may include Māori and Pasifika clients who make up a large proportion of MSD clients, may also go on to become the subject of a fraud investigation. This will likely have a significant impact on those Māori and Pasifika clients."

But DIA told RNZ that racial bias is not an issue because in recent tests the tool was 90 per cent accurate. The tests covered 250 people.

"While we have not specifically tested the algorithms against different ethnic groups, the numbers ... suggest that the technology we are using is very successful in the New Zealand context," it says.

"It works for the vast majority of people, in the vast majority of instances it's used."

The MSD report says US government tests in July suggested the Daon AI algorithm worked less well "on people of darker skin tones than on people of lighter skin tones".

This has been a long-standing problem with many facial recognition systems.

That might result in "unjustified discrimination" and "emotional harm" for Māori and Pasifika unable to use Identity Check as easily as white people do, the report said.

But MSD's assessment concluded that the risk is acceptable, because ethnic groups are being consulted, and because efforts are under way to improve the tech.

"DIA will use identity check samples (training/performance data) to re-train the algorithm so its performance among the NZ population is improved. This is an ad hoc, bespoke process where DIA contracts an external consultant to carry out."

MSD also cited the 90 per cent test result. That figure was achieved when people had up to three goes to get a facial match, and the testing did not explicitly check for differences between ethnic groups.

US tests on the algorithm, and Internal Affairs' own limited testing of it, showed a false non-match rate of 10 per cent, MSD says.

No testing for racial bias would be done here because Internal Affairs planned to switch in 2026 to a different Daon algorithm that has been used for years with New Zealand passports, it adds. DIA did not confirm this, saying only that it is "always looking at options" that are the most secure, effective and cost-efficient.

Joy Liddicoat, an AI researcher at the Centre for Law and Policy in Emerging Technologies at Otago University, says the development of a system the government has big plans for is flawed.

"When the dangers of the use of these kinds of metrics are so well known, it's simply not good enough, over the life of a four-year project, not to have nailed this down before launching."

She says the overall aim is good - to make accessing benefits easier - and it might sometimes be OK to launch first and learn as you go - but not when a system might red-flag a person without them even knowing.

Users - beneficiaries to start with - get five goes before the system locks them out for three days.

Fraud investigation

Identity Check can also be used to detect and investigate fraud.

"MSD will not use DIA data for fraud investigation purposes in phase 1," according to the MSD report on security, privacy and ethics.

"This reduces several risks for this initiative, including an unconfirmed human rights and ethics risk of racial bias in DIA's facial recognition technology."

Officials plan to communicate "an acknowledgement of the untested risk relating to racial bias" after the system goes live, the report says.

MSD has talked about this with its Pasifika reference group - and was due to do the same with its Māori group - but as for Internal Affairs, while it has promised to engage with Māori and incorporate their views, "it is yet to occur due to resourcing constraints. DIA hopes to begin that process in the next quarter", the MSD report says.

That engagement is coming four years after DIA embarked on Identity Check.

MSD Pasifika reference group member Danny Mareko confirms officials told them about the tech, though he can't recall any discussion about racial bias. The ministry is good at engaging with them, he says.

The chair of the Māori reference group did not provide comment.

In a one-year trial run by Internal Affairs up to September 2023, the facial recognition failed 45 per cent of the time. This was not a racial bias problem, but a problem with the AI used to ascertain that someone's image is a live one of the person, and not, say, a still photo of someone else.

An upgraded version has since scored 90 per cent.

The government wants Identity Check to become the go-to tech for proving everyone's identity online, making access to hundreds of services, and doing e-business, easier and more secure.

Daon has been approached for comment.

-Phil Pennington/RNZ.

1 comment

It's not about race

Posted on 16-11-2023 09:15 | By an_alias

That's just a smoke screen to hide behind; it's how this technology is going to be used which has not been debated.
This is not about fraud, it's about setting up the technology to track everything you do online.
Watch this technology get abused down the track, with a complete digital ID needed to even look at the internet and make transactions.
Just watch what happens when we have a financial failure, network failure or power outage and you can only transact digitally. What could go wrong?

