
Privacy Commissioner considers code of practice for facial recognition technologies

31 Mar 2023

| Author: Reweti Kohere

Privacy Commissioner Michael Webster is exploring a code of practice to regulate the use of biometric information as the technologies and concerns continue to grow.

The privacy watchdog will look into whether the code might further guide the way individuals’ most sensitive personal information is collected, used and stored by agencies.

Facial recognition technology (FRT) is ubiquitous overseas, used in smartphones, at airports and supermarkets, and by Police. But privacy concerns are mounting as FRT gains a foothold in Aotearoa New Zealand. Accompanying its increased presence are concerns about past misuses of established surveillance technologies, scope creep and algorithmic bias.

Most submitters to the commissioner's consultation believed some further regulatory intervention would give regulated agencies greater certainty and better protect privacy.

“It is fair to say there were mixed views on the most appropriate type of intervention, but what was clear is that something more needs to be done,” Webster says.

“The use of biometrics is growing and diversifying. We want to ensure New Zealanders and New Zealand businesses can harness the benefits of this technology, but also be protected from potential harm.”

A round of “targeted engagement” will take place with agencies and people interested in biometric information and technologies. From there, the commissioner in 2024 will decide whether to press ahead with a code.

The Privacy Act 2020 empowers the commissioner to issue codes of practice that become part of New Zealand’s privacy law. Such codes modify how the Act works for specific industries, organisations or kinds of personal information.

Existing codes include the Civil Defence National Emergencies (Information Sharing) Code 2020, which gives agencies broader discretion to deal with personal information during a state of national emergency, and the Telecommunications Information Privacy Code 2020, which applies specific rules to internet service providers, network operators and other telecommunications agencies to better ensure people’s privacy is protected.

Webster wants the public involved as much as possible if a code of practice proves necessary “because the use of biometric information will affect us all.

“Advances in technology can offer great benefits, but it’s important the benefits are enabled for all and the public is safe-guarded against risk,” he says.

‘Broadly comfortable’
The regulator accepts concern is mounting about the use of technologies that recognise individuals based on their face, fingerprints, voice, eyes and other biological or behavioural characteristics. For the purposes of the Act, biometric information is "personal information" because it helps identify and verify individuals.

Most of the 100 submitters thought some further regulatory action was needed, the Office of the Privacy Commissioner (OPC) said in a summary of public engagement. However, there was disagreement on the preferred form of intervention and how far the watchdog should go.

Many submitters were “broadly comfortable” the Privacy Act was fit for purpose, although a significant number thought further clarity on what the regulations required of agencies would help. Several submitters thought new regulation was unnecessary, but did support additional guidance, while another group said stronger regulation was needed as the existing framework was inadequate in considering the specific risks associated with biometrics.

Submitters generally agreed that not all uses of biometrics were equally risky and that any additional regulation should align with the different levels of risk.

Factors to consider included how trustworthy the user was and the extent of their privacy and technological maturity; whether individuals could provide meaningful consent to the collection and use of their information; and the impacts on children and young persons and other vulnerable groups. A few submitters thought the most appropriate response to the highest risk might be a complete ban.

Perpetuate bias
Māori noted that biometric information is sacred to individuals because it is taken from the mauri they carry. As a taonga, it should be protected according to tikanga and mātauranga, including by taking care when disposing of biometric samples and ensuring data of the living is not stored with that of the dead.

Other concerns from Māori included the collection, storage and use of images of moko and that using biometric technology can perpetuate entrenched biases against tangata whenua by misidentifying them in alleged criminal activity or exacerbating their oversurveillance.

Already there are worries FRT could further entrench existing biases in New Zealand’s criminal justice system and that some demographics could be falsely targeted due to inaccurate algorithms.

FRT identifies or verifies individuals through algorithms that analyse their facial features and compare them with digital "face prints" to find probable matches.
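The matching step described above can be illustrated in miniature: a probe image is reduced to a numeric embedding, and the system reports a gallery identity only when the similarity clears a set threshold. This is a minimal sketch only; the three-number vectors, names and threshold are invented for illustration, and real systems use high-dimensional embeddings produced by trained models.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(probe, gallery, threshold=0.8):
    """Return the gallery identity most similar to the probe,
    or None if no score clears the threshold (no probable match)."""
    best_id, best_score = None, threshold
    for face_id, face_print in gallery.items():
        score = cosine_similarity(probe, face_print)
        if score >= best_score:
            best_id, best_score = face_id, score
    return best_id

# Illustrative, made-up "face prints" and probe embedding
gallery = {"alice": [0.9, 0.1, 0.3], "bob": [0.1, 0.8, 0.5]}
probe = [0.85, 0.15, 0.35]
print(best_match(probe, gallery))  # prints "alice"
```

The threshold is the policy lever: lowering it produces more matches but more false positives, which is precisely where the demographic error-rate concerns raised by submitters arise.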

However, overseas research during the past two decades has exposed divergent error rates across demographic groups in the US. A federal government study found facial recognition algorithms worked poorly for people of colour, the elderly, women and children, while middle-aged white men generally benefited from the highest accuracy rates.

Submissions from other minorities highlighted additional perspectives, including from religious groups whose members wear head or face coverings or whose beliefs forbid the taking of images of individuals, and questions about how biometrics would function for people with disabilities and those in the process of gender transition.

Opposition
Just under half of the submissions came from private individuals rather than organisations or experts, and most of those individuals were opposed to FRT.

Numerous individuals urged the banning of such technologies, saying they did not consent to their use and were concerned about potential government surveillance.

The OPC sought feedback in August 2022 on how it could better regulate biometric information, stating a “strong case” existed for tightening the regulatory framework.

This was a marked shift from an initial position paper dated October 2021, where the commissioner considered the Privacy Act could adequately protect individuals’ privacy rights as biometric data is collected and used. The regulator did reserve the option to consider additional action if necessary.
