The Office of the Privacy Commissioner (OPC) has released a long-awaited report following its inquiry into Foodstuffs North Island’s (FSNI) trial of facial recognition technology (FRT) across 25 supermarkets.
The report provides a comprehensive assessment of the privacy implications of FRT and offers valuable guidance for organisations considering similar technologies. You can read the report in full here.
Key findings from the inquiry
The OPC’s inquiry into FSNI’s trial of FRT provides a detailed assessment of both the privacy risks and the safeguards implemented. The trial, which ran from February to September 2024, involved scanning over 225 million facial images. The OPC concluded that the trial complied with the Privacy Act 2020, but only because of the privacy protections in place, including:
- Careful data management: The OPC acknowledged that scanning the face of every person entering a store is inherently intrusive. However, the system (designed in consultation with the OPC) was configured to delete 99.999% of facial images within one minute unless a match was detected. This meant that for most customers, no biometric data was retained.
- Strict watchlist criteria: The watchlist was limited to individuals who had previously engaged in serious and harmful behaviour, particularly violent incidents. Importantly, FSNI excluded children, young people under 18, and vulnerable individuals from the watchlist. This helped reduce the risk of unfair targeting or overreach.
- No centralised surveillance: Each store operated independently, with no sharing of watchlist data between locations. This decentralised approach helped prevent the creation of a broader surveillance network and limited the scope of data collection.
- Accuracy controls: Alerts generated by the system were verified by two trained staff members before any action was taken. During the trial, after two individuals were harmed as a result of misidentification, FSNI also raised the match threshold from 90% to 92.5% to reduce the risk of false positives.
- Effectiveness: The trial demonstrated that FRT could be effective in deterring and responding to serious retail crime. FSNI reported a reduction in violent incidents, suggesting that the technology may have a role to play in enhancing safety for staff and customers. However, the Commissioner stressed that effectiveness alone does not justify the use of facial recognition technology. Consistent with the new Biometrics Privacy Code, its deployment must also be necessary and proportionate.
Further improvements needed
While the OPC found FSNI’s trial to be compliant, the report also identified several areas where improvements are needed before FRT could be considered for permanent or expanded use.
- Bias: The OPC raised concerns about the potential for bias in facial recognition systems, particularly those developed overseas and not trained on New Zealand’s population. There is a risk that such systems may perform less accurately for Māori, Pacific peoples, and other ethnic groups. Organisations must ensure that any system used in New Zealand is tested and validated for local conditions.
- Transparency: Although FSNI provided signage and information about the trial, the OPC noted that more could be done to ensure customers fully understand how their data is being used. Future deployments should include clearer, more prominent notices and fuller explanations of the technology and its purpose.
- Independent evaluation: The OPC recommended that any future use of facial recognition technology be subject to independent evaluation, including assessments of accuracy, effectiveness, and compliance.
- Alternatives: The report emphasised that facial recognition should only be used where less privacy-intrusive alternatives are not sufficient. Organisations must be able to demonstrate that other measures such as improved staffing, store design, or traditional surveillance have been considered and found inadequate.
- Governance: The OPC expects organisations to have clear governance structures in place for the use of facial recognition technology, including documented policies, staff training, and regular audits. These structures should ensure accountability and enable swift responses to any issues that arise.
The Biometrics Privacy Code
The draft Biometrics Processing Privacy Code, introduced in 2024, proposes a tailored regulatory framework for the use of biometric technologies, including facial recognition. It will supplement the Privacy Act 2020 by imposing additional obligations on organisations that collect and process biometric information.
You can read our alert on the draft code here, but the key considerations are:
- Purpose: Organisations must clearly define and justify the purpose of collecting biometric data. The use of facial recognition must be demonstrably necessary for achieving a legitimate aim, such as preventing serious harm or crime.
- Transparency: Individuals must be informed about the use of facial recognition through clear and accessible notices. This includes explaining what data is collected, how it is used, and the rights individuals have in relation to their data.
- Minimisation: Only the minimum amount of biometric data necessary for the stated purpose should be collected. Retention periods must be limited, and unnecessary data must be promptly deleted.
- Security: Strong technical and organisational measures must be in place to protect biometric data from unauthorised access, misuse, or breaches. This includes encryption, access logs, and staff training.
Consultation on the draft Biometrics Code closed in March 2025. The Office of the Privacy Commissioner expects to issue a final version of the Code in the next few months.
Next steps
With biometrics a current focus for the OPC, and the Biometrics Code due to be finalised shortly, here’s how to stay ahead:
- Keep an eye on developments in the FRT space and the final Biometrics Code from the OPC.
- Make sure your privacy and data governance frameworks are prepared for the added complexities of biometric data, even if it’s not in use yet.
- Include biometric data risk in your strategic planning and risk assessments.
- Brief your legal, compliance, and IT teams on the privacy implications of emerging tech and the use of biometric data.
If you would like assistance in understanding and managing your legal obligations in relation to the collection and use of biometric information, please get in touch with one of our privacy experts.
This article was co-authored by Thomas Anderson, a Solicitor in our Corporate team.