We can regulate facial recognition technology to address harms and promote innovation: a new Australian report shows the way

A new report from the University of Technology Sydney (UTS) Human Technology Institute sets out a roadmap towards regulating facial recognition technology (FRT) in Australia in ways that protect against harmful uses of this technology and enable innovation for public benefit.

The use of facial recognition by public and private organisations has grown significantly in recent years, raising concerns about privacy, mass surveillance and unfairness. These technologies can identify people from photos, videos or live camera feeds, and have a range of applications, including identity verification, crime prevention and service delivery. However, a growing body of evidence indicates they also replicate, and even magnify, existing social vulnerabilities and inequalities, putting people of colour, women, activists and marginalised communities at higher risk of harm when mistakes happen.

Frontier technologies such as FRT can help us meet humanitarian needs. For example, in 2020, the National Facial Recognition Database was used by Services Australia in the aftermath of the Black Summer bushfires to verify the identities of people whose documents had been destroyed so they could access relief payments. However, the lack of a legislative framework for the use of the database highlights the risks of testing unregulated tools on people: when things go wrong, our current laws simply do not provide adequate protections.

And while the potential for mistakes or misuse is heightened in crises, these complex technologies pose challenges to people's rights in most circumstances. As a CHOICE investigation revealed in July 2022, several large Australian retailers were using facial recognition to identify customers entering their stores, leading to widespread community backlash.

Responding to growing recognition of the need to reform the laws that govern FRT, the new report, Facial Recognition Technology: towards a model law, co-authored by Prof Nicholas Davis, Prof Edward Santow and Lauren Perry, outlines a blueprint for updating Australian law and addressing threats to Australians' human rights.

Humanitech strongly supports the practical steps set out in this report. Ivana Jurko and Amanda Robinson from Humanitech participated in the Model Law Expert Reference Group, bringing a humanitarian perspective to the table and delivering on Humanitech’s purpose of ensuring technology benefits humanity.

Strengthening laws and policies on the uses of facial recognition will help safeguard the rights and dignity of all Australians, while enabling responsible innovation that can help improve lives and tackle complex social challenges.

To read the report and background material, please visit this page.

Image credit: Tony Liao
