
Bunnings is within its rights to use facial recognition software on customers to curb theft and violence against staff, with a tribunal laying the foundation for a broader rollout.
The Administrative Review Tribunal has overturned a ruling from the Office of the Australian Information Commissioner that the hardware giant breached privacy laws by trialling facial recognition in stores.
The retailer used CCTV to scan the faces of customers entering 63 stores in NSW and Victoria in the three years to November 2021.
In 2024, the commissioner’s office found Bunnings had taken customers’ private information without consent, failed to take steps to notify them and left gaps in its privacy policy.

The hardware chain appealed the ruling to the independent tribunal, which on Wednesday found there were “serious” threats to safety and the risk of sensitive information being misused was “negligible”.
“Bunnings was entitled to use (the technology) for the limited purpose of combating very significant retail crime and protecting their staff and customers from violence, abuse and intimidation within its stores,” the tribunal said.
The extent of retail crime faced by Bunnings staff and customers was one of the “important factors” in its ruling.
A store manager at Box Hill in Melbourne’s east estimated his team was confronted with threatening or abusive behaviour every two to three days, leaving staff “visibly shaken and upset”.
Threatening situations were “the worst it has ever been”, Bunnings’ national investigations and security manager Alexander MacDonald told the tribunal.

Bunnings calculated at least 66 per cent of theft losses on average each financial year were attributable to the top 10 per cent of offenders.
The tribunal recognised the need for “practical, common-sense steps” to keep people safe and identified areas where the business didn’t get everything right, Bunnings managing director Mike Schneider said.
“The safety of our team, customers and suppliers has always been our highest priority,” he said.
“Our intent in trialling this technology was to help protect people from violence, abuse, serious criminal conduct and organised retail crime.”
The commissioner’s office said it was carefully considering the ruling and noted it was open to appeal.
The decision confirmed that the Privacy Act contained strong protections for individual privacy in relation to emerging technologies, it said.
“The Australian community continues to care deeply about their privacy, and is increasingly worried about the challenges in protecting their personal information,” an office spokesperson said.

Facial recognition software could play an important role in protecting workers and customers from serious harm when used properly and transparently, the Australian Retail Council said.
The peak retail body was keen to work with governments to deliver a “balanced framework”.
“This is a positive step forward in combating retail crime,” council chief executive Chris Rodwell said.
Digital Rights Watch was less enthused at the prospect of wider use of facial surveillance technologies, calling for an outright ban in retail and other settings.
“(The finding) clearly goes against community expectations,” the group’s head of policy Tom Sulston said.
“Australians want and deserve clear and enforceable laws that meet our expectations. It is time for lawmakers to act.”