Academics and digital privacy groups have called for a ban on the use of facial recognition technology until proper regulations are put in place to protect the public, following claims that retailers like Bunnings and Kmart may be breaching the law with their use of the technology.
Yesterday, consumer advocacy group CHOICE released a report investigating how Australia’s top 25 retailers are using facial recognition technology. Its major finding was that three of them — Kmart, Bunnings and The Good Guys — admitted to using the technology.
When CHOICE staff visited their stores, they found that Kmart and Bunnings stores had signs that were “small, inconspicuous and would have been missed by most shoppers”. This may contravene the Privacy Act, according to CHOICE.
Bunnings was the only store of the three that explained why it was using the technology (“to prevent theft and anti-social behaviour”). The company did not immediately respond to a request for comment about whether the use of facial recognition had flagged any individuals.
The consumer advocacy group also surveyed 1,000 Australians and found that three-quarters of them were unaware that retailers were using the technology. Two-thirds said they were concerned that profiles created using facial recognition could harm them.
On the back of these findings, there are renewed calls for a moratorium on the use of the technology to surveil the public until further safeguards are in place.
The Australia Institute’s Centre for Responsible Technology director Peter Lewis said that CHOICE’s findings showed that the technology “is being deployed without necessary safeguards and redlines to protect the public.”
Lewis pointed towards the 2021 Human Rights and Technology report released by the Australian Human Rights Commission calling for a temporary ban. The author of that report, then-outgoing Human Rights Commissioner and now University of Technology Sydney professor Ed Santow, reiterated his concerns about the unchecked use of the technology.
“Even if that technology was perfectly accurate, and it’s not, but even if it were, it also takes us into the realm of mass surveillance,” Santow told CHOICE.
“And I think there will be great concern in the Australian community about walking down that path.”
No more shopping at Bunnings or Kmart. Oh well..
Be honest, when was the last time you shopped at either. You sound like an Aldi weirdo.
Retailers will get round this by offering triple bonus loyalty points to frequent faces.
Maybe if regulators had listened when people like me complained on behalf of clients about photos being posted in shop windows there might have been suitable legislation brought in to deal with the privacy issue before it got to this stage.
Ever since it became easy to run off photos on a printer small (and some not so small) retail outlets have posted photos of people they claim have shoplifted or committed other crimes in their shops. I have made numerous complaints to various authorities on behalf of clients who have been clearly shown in such photos, but to no avail. Absolutely no agency has ever shown any interest in the issue, despite the clear and obvious problems and dangers of such a practice. So it’s no surprise that these retailers think they can get away with what they are doing with impunity. I don’t know why government and statutory agencies refuse to learn the lesson that little things become big things if you let them.
“Facial recognition” is something of a misnomer. “Facial matching” might be a more accurate description. A software application matches certain data points on a photograph of one face to those of another facial photo.
In cases where the photos are clearly lit, static, complete and unimpeded (such as a typical ‘mug shot’ police photo) the probability of a correct match is quite high: better than 99% according to most system vendors. However, this level of probability can fall off very quickly in a real-life situation due to a range of factors: deliberate use of things like face masks, or simply accidental environmental factors like bad lighting.
Companies using these systems therefore need to make a judgement call. If they set the matching threshold too high (say, requiring a near-perfect match) they run the risk of failing to identify a valid target because of a small deviation between the source photo and its comparison. Too low, and there is too great a risk of false positives — that is, someone who looks vaguely like a suspect is flagged when they are completely innocent.
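The threshold trade-off the commenter describes can be sketched in a few lines. This is a toy illustration only — real systems compare learned face embeddings with hundreds of dimensions, and the vectors, names and thresholds below are invented for demonstration:

```python
import numpy as np

def match_score(a, b):
    """Cosine similarity between two face-embedding vectors (1.0 = identical)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_matches(probe, gallery, threshold):
    """Return indices of gallery faces whose similarity to the probe meets the threshold."""
    return [i for i, g in enumerate(gallery) if match_score(probe, g) >= threshold]

# Toy 3-dimensional "embeddings" standing in for real face vectors (hypothetical values).
watchlist_photo = np.array([0.9, 0.1, 0.3])          # the person the store wants to flag
same_person_masked = np.array([0.8, 0.2, 0.35])      # same face, partly occluded (sim ≈ 0.99)
innocent_lookalike = np.array([0.7, 0.4, 0.3])       # unrelated shopper (sim ≈ 0.93)

gallery = [same_person_masked, innocent_lookalike]

print(flag_matches(watchlist_photo, gallery, threshold=0.999))  # too strict: misses the target (false negative)
print(flag_matches(watchlist_photo, gallery, threshold=0.90))   # too lax: flags the innocent shopper too (false positive)
print(flag_matches(watchlist_photo, gallery, threshold=0.95))   # tuned: flags only the target
```

Moving a single number changes who gets stopped at the door, which is exactly why the commenter argues a human must make the final call.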
FR systems can be a useful adjunct to human security teams when used properly, because they never lose concentration and can cope with high numbers of subjects. They can never be safely used by themselves, however. A human being needs to make the final call on whether or not a particular person should be investigated further. I doubt very much that the average Bunnings or Kmart employee has the skills necessary to do this.
If these companies choose to use such systems, it should be made plain to all customers that it is so. Furthermore, customers should be able to seek restitution if they are falsely identified.
If the second requirement is implemented, it will knock the business case for most deployments fair and square on the head.
Does it work on our darker skinned brothers and sisters yet?
Yes. Most algorithms work on constants such as facial structure rather than skin tone. This is why partial coverage such as sunglasses isn’t enough to avoid matching, which can still be done based on factors such as the size and shape of the mouth, nose, etc.
Poorly – see my comment above, and not just those of the tinted persuasion.
GIGO ensures that dodgy parameters, based on dubious ‘norms’ about physiognomy, produce dodgy results.
Who knew that ear & eye shapes/positions meant something?
“Investigating the people”? They are trying to spot shoplifters and people who abuse staff for wearing masks, not terrorists.
Shoppers can simply wear a mask (Covid-warranted) & sunglasses to maintain privacy.
Existing facial recognition software can correctly identify individuals even when wearing masks and sunglasses.
Thanks for that info. Fortunately I don’t patronise any of these stores.
really? my phone can’t, and it KNOWS me
But, inexplicably…coff coff…, not so hot if of an Other ethnicity.
Cannot imagine why.
Esp, pace the LexBot, given the donkey work on the onrushing Gibsonian naturalisation is mostly the excess male demographic not getting any action.
Nothing worse than an incel.
Off your Meds again eh Whackjob?
How does that work?
Facial matching works on cranial topography – it would work just as well (not very) if the face were blacked out in silhouette.
The ears are what gives them away.
An interesting recent report in the Atlantic or New York Times (can’t recall which) showed, in thorough tests with staff members and journalists, that faces could be positively identified going back many years, even when the people were wearing masks, sunglasses, or different hairstyles or makeup. Truly breathtaking in its ramifications for now and into the future.
Scary stuff. Makes you wonder what they’ve got that we are unaware of.