An Australian company claims it has developed artificial intelligence that can detect suicide attempts in prisons after training on real footage of incidents — and prisons around the world are already using the technology.
iOmniscient was one of dozens of companies exhibiting at the Security Exhibition & Conference in Sydney this week. Companies showcased new uses of artificial intelligence, facial recognition, licence plate scanners and robotics as surveillance methods that promise to keep people, businesses and their property safe.
iQ-Prison is iOmniscient’s surveillance product for prisons. According to the company, a combination of CCTV and artificial intelligence lets it do everything from “detecting aggressive behaviours” like fighting to “track[ing] inmates’ movements with facial analytics”.
A promise to detect persons “attempting to commit suicide” stands out as its most unusual claim. A salesperson told Crikey the technology had been trained on footage “of the 30 seconds leading up to attempts” so it could recognise patterns. A 2020 brochure promoting the product says Hong Kong prisons have implemented the system through the company’s subsidiary Wildfaces.
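iOmniscient publishes no technical detail, but what the salesperson described — training on short clips from the window before an incident — matches standard video action recognition: a classifier trained to separate pre-incident clips from ordinary footage. Below is a minimal sketch of that generic technique in Python with PyTorch; the architecture, labels and clip dimensions are illustrative assumptions, not iOmniscient’s actual system.

```python
# Illustrative sketch of generic video action recognition, NOT iOmniscient's
# system: a binary classifier over short CCTV clips, trained to separate
# "pre-incident" windows from ordinary footage.
import torch
import torch.nn as nn
from torchvision.models.video import r3d_18

class ClipClassifier(nn.Module):
    """Binary classifier over short video clips (e.g. a ~30-second window)."""
    def __init__(self):
        super().__init__()
        self.backbone = r3d_18(weights=None)  # 3D ResNet over (C, T, H, W) clips
        # Replace the 400-class head with 2 classes: [normal, pre-incident].
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 2)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, channels=3, frames, height, width)
        return self.backbone(clips)

model = ClipClassifier()
dummy_clip = torch.randn(1, 3, 16, 112, 112)  # one 16-frame clip at 112x112
print(model(dummy_clip).softmax(dim=-1))      # probabilities for the 2 classes
```

Even granting that framing, predicting rare events from CCTV is dominated by false alarms, which is one reason to treat marketing claims of this kind with the scepticism experts urge later in this piece.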
iQ-Prison is one of iOmniscient’s many offerings across industries including policing, schools, healthcare and transport; the company’s promotional material says its technology is even used in Sydney’s driverless trains project.
The company has courted controversy over the uses of its technology in the past. In 2017, Toowoomba Council faced a backlash for a trial using its products. A 2016 report found the company was selling surveillance technology to the Bahrain government as it battled mass demonstrations. In 2019, Bloomberg reported that there were suspicions the technology was being used by Hong Kong police — who had already used the company’s products for years — to crack down on the pro-democracy protesters.
An iOmniscient salesperson brought up the Bloomberg reporting while discussing the company’s past media coverage.
“[The media] blamed us for the technology being used to track protesters. The police had purchased it to track lost children,” he said. “What can we do? It’s just a tool.”
New applications of artificial intelligence for surveillance were a trend at the show, with a number of companies offering products. The Artificial Intelligence Group spruiked its ability to create a “custom algorithm” to fit customers’ needs. Examples offered by the company include monitoring staff productivity, detecting theft and even flagging “abnormal human behaviour”.
“Think of it as using your existing CCTV system as eyes and including our systems as a superhuman brain, recognising patterns and reporting abnormal events,” its promotional material says.
Another company, Network Optix, showed off how its software can be used as a backend supporting AI that recognises objects such as mobile phones or guns in real-time video footage.
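Network Optix didn’t detail its stack, but the pattern it demonstrates, a video backend feeding frames to a detection model, is a common pipeline. Below is a hedged sketch using OpenCV and an off-the-shelf COCO-pretrained detector; the camera URL, model file and threshold are assumptions for illustration, and a “gun” class would require custom training, since standard COCO models only know everyday objects such as “cell phone”.

```python
# Illustrative sketch of a generic CCTV object-detection pipeline, not
# Network Optix's product. Assumes: a hypothetical RTSP camera URL, the
# ultralytics package, and a COCO-pretrained YOLO model ("cell phone" is a
# COCO class; weapons are not, and would need a custom-trained model).
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small COCO-pretrained detector
cap = cv2.VideoCapture("rtsp://camera.example/stream")  # hypothetical feed

WATCHED = {"cell phone"}  # object classes to alert on

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)  # detect objects in one frame
    for box in results[0].boxes:
        label = model.names[int(box.cls)]
        if label in WATCHED and float(box.conf) > 0.5:
            print(f"ALERT: {label} detected (confidence {float(box.conf):.2f})")

cap.release()
```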
Former human rights commissioner and UTS Professor Ed Santow recommends taking claims from companies about artificial intelligence and facial recognition with a grain of salt.
When dealing with claims about more obscure uses of the technology, like determining someone’s emotion through facial recognition, Santow disputes their effectiveness: “It’s largely junk science; it literally doesn’t work.”
Licence plate scanners are another popular technology promoted by security companies. They can turn a camera into a device that records a car’s type, model and colour, its licence plate and state of registration, the time it was seen, and even the direction it was heading. While heavily advertised to police, who have used the technology in Australia for nearly a decade, individuals and businesses are now also target markets for licence plate scanners.
“Engineered for roadway speeds, the system can be deployed in neighbourhoods, campuses, business districts,” one company’s promotional material said. “Help keep your community safer, easily and affordably,” read another.
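None of the vendors publish their pipelines, but the core of any plate reader is the same two steps: locate the plate in a frame, then read its characters. Here is a toy sketch, assuming OpenCV’s bundled plate cascade and the Tesseract OCR engine as stand-ins; real products layer vehicle make, model, colour and direction-of-travel classifiers on top.

```python
# Toy licence-plate reader: locate plate-like regions, then OCR them.
# Assumes OpenCV's bundled plate cascade and Tesseract as stand-ins for
# whatever the commercial products actually use.
import cv2
import pytesseract
from datetime import datetime

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_russian_plate_number.xml"
)

def read_plates(frame):
    """Return (timestamp, plate_text) for each plate-like region in a frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    hits = []
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4):
        roi = gray[y:y + h, x:x + w]
        text = pytesseract.image_to_string(roi, config="--psm 7").strip()
        if text:
            hits.append((datetime.now().isoformat(), text))
    return hits

frame = cv2.imread("passing_car.jpg")  # hypothetical still from a roadside camera
if frame is not None:
    for when, plate in read_plates(frame):
        print(when, plate)
```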
Other technologies promoted at the expo include high-tech safes, RFID scanners, body cameras, thermal fire detection cameras, biometric employee management systems and even robotics.
KABAM Robotics’ Co-Lab Indoor Security Robot patrolled the expo floor throughout the day. Roughly five feet tall, the robot used 360-degree camera vision and Lidar (similar in principle to radar, but using pulses of light rather than radio waves) to navigate its way around the company’s booth at a slow pace, not unlike a Roomba.
The company sells it as a way to constantly monitor spaces, seemingly in lieu of security staff: “Its concierge capabilities and PA System and Siren Alert functions make Co-Lab your perfect surveillance partner and an essential member of your security team.”
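KABAM doesn’t publish Co-Lab’s navigation stack, but the sensing described above (a 360-degree Lidar sweep) lends itself to a simple illustration of obstacle avoidance. Everything here, from the bearing convention to the safety distance, is an assumption made for the sketch.

```python
# Toy obstacle-avoidance step from one 360-degree Lidar sweep; purely an
# illustration of the sensing described above, not KABAM's algorithm.
# Assumed convention: ranges[i] is the distance in metres at bearing i
# degrees, with 0 = straight ahead and bearings increasing to the left.

def steer(ranges, safe_dist=0.5):
    """Pick a motion command from one full Lidar sweep."""
    # Check a 30-degree cone directly ahead of the robot.
    ahead = [ranges[i % 360] for i in range(-15, 16)]
    if min(ahead) > safe_dist:
        return "forward"
    # Otherwise turn toward the bearing with the most clearance.
    best = max(range(360), key=lambda b: ranges[b])
    return "turn_left" if best <= 180 else "turn_right"

print(steer([2.0] * 360))               # clear floor -> "forward"
print(steer([0.3] * 15 + [2.0] * 345))  # blocked ahead -> turns away
```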
With AI keeping an eye on you, do you feel safer… or scared? Let us know your thoughts by writing to letters@crikey.com.au. Please include your full name to be considered for publication. We reserve the right to edit for length and clarity.
Radar does not use sound to determine range and direction; it uses radio waves (RAdio Detection And Ranging = RADAR). If you do it with sound, it’s SONAR (SOund NAvigation and Ranging).
Oh no. Are they going to put people in safe cells at the AI’s judgement? Safe cells involve having their clothes removed, being given a flimsy gown, and being put into a seclusion room with a foam mattress on the floor and no blankets. With a camera. I think the safe cells are cruel and unusual punishment.
Inmates won’t admit when they are suicidal because they are so afraid of safe cells. That probably contributes to Indigenous deaths by suicide, because it further undermines trust, so they won’t seek help.
It’s absolutely disgusting.
It is so reassuring to know that machines can detect ‘abnormal behaviour’.
Hopefully that would include buying such systems.
Can’t wait for the Black Mirror episode.
Supposedly series 6 “soon”.
Like Tom Lehrer when he gave up satire (“no point in a world where Kissinger gets a Nobel Peace Prize”), Charlie Brooker said that he feels like Alice, running as fast as possible just to stay in the same place, as many of the themes that were so off-the-chart in 2011 are now the norm.
We are creating for ourselves the means of our own destruction, where no one can disengage, dispute or distance themselves from another. Total surveillance can and will end human individuality and unique behaviours.
Saudi Arabia fully supports the tracking... of lost children??
Including in foreign countries where they seek sanctuary, or even on the high seas – see details on the Alsehli sisters thread.