Half of Australians believe that their privacy is being invaded by the presence of facial recognition technology in public spaces, according to a survey released today by Monash University.
The study, which examined the varied attitudes towards the biometric technology, also found only moderate awareness among Australians about the potential applications of the technology.
Where there was awareness, there was also a tension between the desire for personal privacy versus support for a technology which could help bolster security and efficiency.
“Proposed uses for facial recognition technology include targeted marketing, shopping, ATM access, mass transit, employee and student tracking, and much more,” says researcher and lecturer in Communications and Media Studies Robbie Fordyce.
“Many of these applications are being developed and rolled out without robust public discussion and debate – and with minimal regulatory attention or public engagement.”
Respondents reacted with discomfort when asked about the technology’s application in workplaces or schools to track movements, attitudes or moods.
They also rejected its use in tracking shoppers’ activities for targeted advertising, or in verifying the ages of people purchasing alcohol or cigarettes.
Meanwhile, respondents mostly supported usage of facial recognition to gain access to secure locations – already happening in some workplaces and private schools.
“One of the key themes in people’s responses was trust,” says Fordyce.
“People who trust institutions do so because they expect their own data not to be misused, and to be kept private and secure.
“However, there were many who were distrusting of the development of the technology in schools, the private sector and by the government.”
With the Australian Government already developing a national facial recognition database, almost two-thirds of respondents expressed concern about the security of such a project.
The same number said they should have the right to opt out of a national database.
There was also concern about inaccuracies in the technology, with racial bias a standout fear for some. Others believed the technology was too ‘inaccurate to be practical’.
“As is often the case, when the technology is framed in terms of security and safety, there is substantial support, but people clearly have strong concerns about the privacy and security of their data,” says Fordyce.
“Despite the promise of the technology to make transactions more efficient, people are clearly wary of the level of tracking the technology enables.
“There is opposition to many of the proposed commercial uses of the technology for tracking and identifying shoppers.
“The technology offers to apply the model of online tracking to physical space: everywhere we go and everything we do can be tracked and linked to our identity.”