From Apple’s newly announced iPhone X to biometric banking systems in China, facial recognition is becoming a hot topic for a variety of sectors looking to streamline their services and products.
However, the question of whether the technology should be installed at railway hubs has been mired in controversy. Tensions bubbled up recently when Germany’s Interior Ministry began trialling facial recognition software at Berlin’s Südkreuz station, much to the chagrin of local privacy and data protection agencies.
German federal police believe that, on a wider scale, facial recognition could be used to fight terrorism and crime in public transport hubs. The police say the technology will help security forces detect threats and intervene before crimes are committed.
Nevertheless, the trial has prompted many to revisit the question of whether facial recognition software is needed in railway stations at all, and whether it will prove helpful or harmful in the long run.
Recognising rail passengers
Accuracy will be a crucial factor in the Südkreuz trial. In a busy railway station, the identification system must be able to pick out the right faces in the crowd, particularly if its purpose is to enhance surveillance.
Previous trials in Germany have not yielded positive results. In 2006, a test at a railway station in Mainz found that the technology of the day could only recognise passengers from the front, and could be easily ‘spoofed’ (i.e. tricked using a photo or a cleverly devised face mask).
However, as facial recognition is being evaluated in more and more practical scenarios, experts believe they can create systems that capture far more detailed records of faces. One example is the Bristol Robotics Laboratory, which has been working on a biometric system that could be used in UK rail stations by 2020. The technology is reportedly able to capture the shape, texture and orientation of faces, before comparing these with a pre-established database.
“Our method recovers the face in both 3D and 2D. To do this we need to take at least three images when the face is illuminated from three different known locations,” says Lyndon Smith, professor at the Bristol Robotics Laboratory, who explains that “the system recovers the surface ‘normal’ (i.e. orientation) at each point over the face, which means it can be very high resolution”.
Today’s technology is both more accurate and considerably harder to abuse. “Regarding face recognition, our approach can provide higher reliability than 2D systems, because it is not spoofed by a photo, or changes in background light, and can recover the 3D face in high-resolution,” he adds.
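The method Smith describes — recovering the surface orientation at every point on the face from at least three images lit from different known directions — matches the classical photometric-stereo technique. The Bristol Robotics Laboratory has not published its exact algorithm, so the sketch below is only a minimal illustration of the standard Lambertian version of the idea; the function name and array shapes are assumptions, not the lab’s code.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover per-pixel surface normals from >= 3 images of a static
    face lit from different known directions (Lambertian assumption).

    images:     (k, h, w) array of grayscale intensities, k >= 3
    light_dirs: (k, 3) array of unit light-direction vectors
    returns:    (h, w, 3) unit normals and an (h, w) albedo map
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                      # (k, h*w) intensities
    # Least-squares solve light_dirs @ G = I, where each column of G
    # is albedo * normal for one pixel
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)
    albedo = np.linalg.norm(G, axis=0)             # reflectance per pixel
    normals = np.divide(G, albedo, out=np.zeros_like(G),
                        where=albedo > 0)          # normalise to unit length
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)
```

Because the recovered normals describe genuine 3D surface orientation rather than a flat image, a photograph held up to the camera yields no consistent solution — which is the property Smith cites as making the system hard to spoof.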
Smith claims that, from a technological standpoint, the software could work well in a surveillance context. “Regarding security,” he says, “it could be trained to recognise suspicious behaviours over a period of time; for example, at airport customs declarations.
“We have developed a system that recovers 3D faces continuously and in real-time. This can work in low light levels or even complete darkness. This could be employed to help improve security at stations.”
The German Interior Ministry was contacted for comment, but was unable to provide an update on the Südkreuz trial. However, security critics believe that problems will arise, even if the software is not exceedingly accurate.
According to a report by German broadcaster Deutsche Welle, security experts estimate that the technology will have a fail rate of one in one million. That may not sound egregious, but across a public transport system carrying more than three million passengers a day it amounts to roughly three misidentifications every day, and an error at the wrong time could prove disastrous.
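The scale of that estimate is simple arithmetic; both input figures are the ones quoted above, not independently verified numbers:

```python
# Back-of-envelope estimate of daily misidentifications implied by the
# figures quoted in the Deutsche Welle report.
fail_rate = 1 / 1_000_000        # estimated errors per face scanned
passengers_per_day = 3_000_000   # quoted daily ridership

errors_per_day = fail_rate * passengers_per_day
print(errors_per_day)  # → 3.0, i.e. roughly three errors every day
```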
Worries about potential failures have been echoed by privacy advocates and law organisations in Germany, which claim that monitoring technologies have done little to stop criminals in the past.
Rena Tangens, chairwoman of data protection agency Digitalcourage, argues that video surveillance has not proven successful in the UK, which, according to numerous sources, is the most highly monitored nation in Europe.
“Video surveillance in the UK or data retention in France did not prevent the assaults happening there,” she says. “Very often the perpetrators were already known to the police, but this did not stop them committing the crimes.”
The issue of privacy
Regardless of how effective facial recognition software could be, the issue of privacy still hangs over its potential use. Digitalcourage was one of the organisations to protest the trials at Südkreuz on the basis that facial recognition tools violate fundamental human rights.
“Video surveillance with facial recognition affects everybody’s freedom of movement,” says Tangens, adding that, “It does not just see the villains. It captures the face of everybody passing by and even if the picture itself was not being stored, which we cannot know, the metadata – who was where and at what time – can easily be stored, giving a complete record of movement of fair and spotless citizens.”
Tangens says that the privacy issues make facial recognition software at rail hubs not only unnecessary, but actively problematic. She argues that the police should find alternatives that focus on rooting out offenders, such as training staff to deal with emerging threats more effectively.
“The police should not ask for technology that claims to be the easy solution, but invest in more and more competent staff,” she says. “Cameras do not keep anyone from committing a crime and they do not help the victims. They just give pictures afterwards that are used to scare the citizens even more to make them give up their fundamental rights.”
Ulrich Schellenberg, president of the German Bar Association, opposed facial recognition for similar reasons. If last year’s deadly Berlin attack was perpetrated by a person who was already under surveillance by security agents, he argued, why would this new technology make any difference?
“Improving security is not about uncovering something new, but rather to go after what we know more forcefully,” Reuters quoted him as saying.
Beating the queues
While the invasiveness of facial recognition has plagued its progress in western countries, China’s lax privacy laws have allowed the country to fully embrace the technology. Back in 2012, facial recognition software was installed at three stations on the high-speed Beijing-Shanghai rail line in a bid to catch fugitives.
More recently, China has invested in facial recognition software for a different application. At stations in Wuhan and Beijing, ticket gates have been replaced with biometric face scanners, which allow passengers to access train platforms by flashing an ID card and peering into a camera.
Smith and the Bristol Robotics Laboratory hope the biometric technology could one day be used in a similar way at UK railway stations. Instead of faffing around with tickets, passengers could simply use their face to walk straight through, easing the long queues that build up in ticketing halls.
Enhancing throughput at stations is at the heart of this research, and Smith states that “this is not intended to be any kind of Big Brother thing. We just want to develop technologies for those who want to use them to make their lives easier”.
Nevertheless, the advent of biometric systems, and their potential success, could prompt rail operators to do away with conventional ticketing, creating problems for those who don’t want to be ‘on the system’. Unlike in China, where app companies can freely trawl through user photos to pick out viable faces for research, facial recognition still faces tough opposition in the West.
When compared with China, the Western world’s tentative approach to facial technology appears to demonstrate the major issue facing its installation in transport hubs. The success of facial recognition systems is not solely based on effectiveness, but on how much freedom citizens are willing to sacrifice for improved security.