London Underground's Surveillance System Is Testing AI Tools to Detect Crimes

The London Underground, or "the Tube," will be getting a more eagle-eyed surveillance system as AI software is added to its functions. With the tool, crimes committed in stations will be detected immediately.

Commuters using the London Underground (Getty Images)

London Underground's AI Feature

The computer surveillance system in the London Underground helps authorities spot people who commit crimes, dodge fares, or carry weapons, and even those who have suffered accidents. However, some people still manage to evade this system.

To help with the job, AI has been integrated into the surveillance system to analyze the movements, behavior, and body language of people using the public transportation system, as reported by Ars Technica.

The machine-learning software, combined with CCTV footage, is able to detect actions such as brandishing firearms or knives, along with other forms of aggressive behavior. Transport for London has already tested 11 algorithms for the system currently in use.
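Transport for London has not published how its pipeline works, so the following is only a minimal sketch of the general idea: a detection model is run over each CCTV frame, and an alert is raised when a flagged class is found with enough confidence. The model, label names, and threshold here are hypothetical stand-ins, not TfL's actual system.

```python
# Illustrative sketch only: the detector and label set are hypothetical stand-ins.
import cv2

ALERT_LABELS = {"knife", "firearm", "aggressive_behaviour"}  # example classes

def detect_objects(frame):
    """Hypothetical stand-in for a trained detection model.
    A real system would return (label, confidence) pairs for each frame."""
    return []  # placeholder: no detections

def monitor(stream_url, threshold=0.8):
    capture = cv2.VideoCapture(stream_url)  # open the CCTV feed
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        for label, confidence in detect_objects(frame):
            if label in ALERT_LABELS and confidence >= threshold:
                # in practice this notification would go to station staff
                print(f"ALERT: {label} detected ({confidence:.0%})")
    capture.release()
```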

During the trial to assess the AI system's ability to detect crimes and accidents, over 44,000 alerts were issued based on the live surveillance video analyzed by the AI, 19,000 of which were sent to station staff in real time.

This was based on roughly 25,000 daily visitors during the pandemic, during which the AI system managed to detect specific actions such as vaping and people entering unauthorized areas, as well as objects like wheelchairs and vapes.

The test was first conducted with people passing through the Willesden Green Tube station. By December, Transport for London plans to expand the use of AI detection technology across more stations in the British capital.

Transport for London says: "By providing station staff with insights and notifications on customer movement and behaviour they will hopefully be able to respond to any situations more quickly," as it uses "numerous detection models."

Why It Might Raise Concerns

It's well known that AI technology is still far from perfect. Fortunately, the AI system used in the London Underground does not use facial recognition, which continues to face problems with racial bias in identifying criminals.

Still, it relies on a similar kind of detection, analyzing behavior and body language, which some consider just as invasive as facial recognition. It could raise a number of ethical concerns for the people watched by the surveillance system.

New York is already using similar technology in its subway stations to detect fare dodgers. According to Gizmodo, two dozen more stations were set to use the same AI feature before the end of 2023.

The system managed to detect many passengers who got through the subway turnstiles without paying, with 12% ducking under them and 20% jumping over them. When this happens, ticket inspectors are alerted through an app, which also provides CCTV footage of the offender.

To avoid issues with facial recognition and recording, the faces of fare evaders are blurred out. Still, civil liberties groups have argued that the technology diverts resources from efforts that would make fares more affordable.
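The article does not say how the blurring is implemented; a common approach is to detect face regions and apply a blur to each one. The sketch below shows that general technique using OpenCV's bundled Haar cascade face detector, purely as an illustration rather than a description of the MTA's system.

```python
# Illustrative sketch only: shows a generic face-blurring step, not the deployed system.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Return a copy of the frame with detected faces Gaussian-blurred."""
    output = frame.copy()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        region = output[y:y + h, x:x + w]
        output[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)
    return output
```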

