Beating biometric bias
Facial recognition is improving — but the bigger issue is how it's used. By Davide Castelvecchi

[Photo: The Metropolitan Police in London used facial-recognition cameras to scan for wanted people in February. Credit: Kelvin Chan/AP/Shutterstock]

When London's Metropolitan Police tested real-time facial-recognition technology between 2016 and 2019, they invited Daragh Murray to monitor some of the trials from a control room inside a police van.

"It's like you see in the movies," says Murray, a legal scholar at the University of Essex in Colchester, UK. As cameras scanned passers-by in shopping centres or public squares, they fed images to a computer inside the van. Murray and police officers saw the software draw rectangles around faces as it identified them in the live feed. It then extracted key features and compared them to those of suspects from a watch list. "If there is a match, it pulls an image from the live feed, together with the image from the watch list." Officers then reviewed the match and decided whether to rush out to stop the 'suspect' and, occasionally, arrest them.

Scotland Yard, as the headquarters of the London police force is sometimes known, had commissioned Murray and his University of Essex colleague Pete Fussey, a sociologist, to conduct an independent study of its dragnet. But their results¹, published in July 2019, might not have been quite what the law-enforcement agency had hoped for.

Fussey and Murray listed a number of ethical and privacy concerns with the dragnet, and questioned whether it was legal at all. And they queried the accuracy of the system, which is sold by Tokyo-based technology giant NEC. The software flagged 42 people over the 6 trials that the researchers analysed; officers dismissed 16 matches as 'non-credible' but rushed out to stop the others. They lost 4 people in the crowd, but still stopped 22: only 8 turned out to be correct matches.

The police saw the issue differently. They said the system's number of false positives was tiny, considering the many thousands of faces that had been scanned. (They didn't reply to Nature's requests for comment for this article.)
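Both framings fit the same numbers. The short Python sketch below redoes the arithmetic; the trials' total scan count was never published, so the 10,000-faces-per-trial figure is a made-up placeholder, used only to show how a per-scan error rate can look tiny even when most stops are wrong.

# Figures reported from the six Met trials analysed by Fussey and Murray.
flagged = 42        # matches raised by the software
non_credible = 16   # flags officers dismissed on sight
lost = 4            # people pursued but lost in the crowd (never verified)
stopped = 22        # people actually stopped
correct = 8         # stops that were genuine matches

# The researchers' framing: most people stopped were the wrong person.
print(f"Share of stops that were correct: {correct / stopped:.0%}")   # 36%

# The police's framing: false alarms per face scanned. The real scan count
# was not published; 10,000 faces per trial is an assumed placeholder.
assumed_scans = 6 * 10_000
known_false = non_credible + (stopped - correct)   # 30 confirmed wrong flags
print(f"Confirmed false flags per face scanned: {known_false / assumed_scans:.3%}")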
The accuracy of facial recognition has improved drastically since 'deep learning' techniques were introduced into the field about a decade ago. But whether that means it's good enough to be used on lower-quality, 'in the wild' images is a hugely controversial issue. And questions remain about how to transparently evaluate facial-recognition systems.

In 2018, a seminal paper by computer scientists Timnit Gebru, then at Microsoft Research in New York City and now at Google in Mountain View, California, and Joy Buolamwini at the Massachusetts Institute of Technology in Cambridge found that leading facial-recognition software packages performed much worse at identifying the gender of women and people of colour than at classifying male, white faces².

Concerns over demographic bias have since been quoted frequently in calls for moratoriums or bans of facial-recognition software. In June, the world's largest scientific computing society, the Association for Computing Machinery in New York City, urged a suspension of private and government use of facial-recognition technology because of "clear bias based on ethnic, racial, gender, and other human characteristics", which it said injured the rights of individuals in specific demographic groups. Axon, a maker of body cameras worn by police officers across the United States, has said that facial recognition isn't accurate enough to be deployed in its products. Some US cities have banned the use of the technology in policing, and US lawmakers have proposed a federal moratorium.

Companies say they're working to fix the biases in their facial-recognition systems, and some are claiming success. But many researchers and activists are deeply sceptical. They argue that even if the technology surpasses some benchmark in accuracy, that won't assuage deeper concerns that facial-recognition tools are used in discriminatory ways.

More accurate but still biased

Facial-recognition systems are often proprietary and swathed in secrecy, but specialists say that most involve a multi-stage process (see 'How facial recognition works') using deep learning to train massive neural networks on large sets of data to recognize patterns. "Everybody who does face recognition now uses deep learning," says Anil Jain, a computer scientist at Michigan State University in East Lansing.

The first stage in a typical system locates one or more faces in an image. Faces in the feed from a surveillance camera might be viewed in a range of lighting conditions and from different angles, making them harder to recognize than in a standard passport photo, for instance. The algorithm will have been trained on millions of photos to locate 'landmarks' on a face, such as the eyes, nose and mouth, and it distils the information into a compact file, ranging from less than 100 bytes to a few kilobytes in size.

The next task is to 'normalize' the face, artificially rotating it into a frontal, well-illuminated view. This produces a set of facial 'features' that can be compared with those extracted from an existing database of faces. The comparison runs in one of two modes: one-to-one verification, which checks a face against a single claimed identity, as at passport control at national borders, or a one-to-many search against a watch list.
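Because commercial systems are secret, that flow can only be sketched. Below is a minimal, schematic Python version of the stages just described; the function bodies, the 128-value feature vector and the 0.6 threshold are illustrative stand-ins, not any vendor's actual design.

import numpy as np

def detect_faces(frame: np.ndarray) -> list[np.ndarray]:
    """Stage 1: find faces and their landmarks (eyes, nose, mouth).
    Stand-in: pretend the whole frame is a single face crop."""
    return [frame]

def normalize(face: np.ndarray) -> np.ndarray:
    """Stage 2: warp the crop into a frontal, well-illuminated view.
    Stand-in: identity transform."""
    return face

def embed(face: np.ndarray) -> np.ndarray:
    """Stage 3: distil the face into a compact feature vector (the article's
    'less than 100 bytes to a few kilobytes'). Stand-in: a deterministic
    pseudo-random projection; 128 float32 values is 512 bytes."""
    rng = np.random.default_rng(abs(hash(face.tobytes())) % 2**32)
    v = rng.standard_normal(128).astype(np.float32)
    return v / np.linalg.norm(v)

def search_watchlist(frame: np.ndarray, watchlist: dict[str, np.ndarray],
                     threshold: float = 0.6) -> list[tuple[str, float]]:
    """Stage 4: one-to-many search. Compare each detected face with every
    enrolled vector; anything scoring above the threshold becomes a flag
    for officers to review."""
    flags = []
    for face in detect_faces(frame):
        probe = embed(normalize(face))
        for name, enrolled in watchlist.items():
            score = float(probe @ enrolled)  # cosine similarity of unit vectors
            if score > threshold:
                flags.append((name, score))
    return flags

# Enrolment stores the same kind of vector per suspect.
suspect_img = np.zeros((8, 8), dtype=np.uint8)
watchlist = {"suspect_042": embed(normalize(suspect_img))}
print(search_watchlist(suspect_img, watchlist))  # perfect-match demo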
One measure of progress is the Face Recognition Vendor Test, an independent benchmarking assessment that the US National Institute of Standards and Technology (NIST) in Gaithersburg, Maryland, has been conducting for two decades. Dozens of laboratories, both commercial and academic, have voluntarily taken part in the latest round of testing, which began in 2018 and is ongoing. NIST measures the performance of each lab's software package on its own image data sets, which include frontal and profile police mugshots, and pictures scraped from the Internet. (The US technology giants Amazon, Apple, Google and Facebook have not taken part in the test.)

In reports released late last year, the NIST team described massive steps forward in the technology's performance during 2018, both for one-to-many searches³ and for one-to-one verification⁴ (see also go.nature.com/35pku9q). "We have seen a significant improvement in face-recognition accuracy," says Craig Watson, an electrical engineer who leads NIST's image group. "We know that's largely because of convolutional neural networks," he adds, a type of deep neural network that is especially efficient at recognizing images.

"Systems are being brought to the wild without a proper evaluation of their performance."

The best algorithms can now identify people from a profile image taken in the wild — matching it with a frontal view from the database — about as accurately as the best facial-recognition software from a decade ago could recognize frontal images, NIST found. Recognizing a face in profile "has been a long-sought milestone in face recognition research", the NIST researchers wrote.

But NIST also confirmed what Buolamwini and Gebru's gender-classification work suggested: most packages tended to be more accurate for white, male faces than for people of colour or for women⁵. In particular, faces classified in NIST's database as African American or Asian were 10–100 times more likely to be falsely matched than white faces. The reports noted a small group of vendors where "false positives based on demographic differentials were undetectable", but that match rates could be compromised by outdoor, poorly lit or grainy images.

False faces

One-to-one verification, such as recognizing the rightful owner of a passport or smartphone, has become extremely accurate; here, artificial intelligence is as skilful as the sharpest-eyed humans. In this field, cutting-edge research focuses on detecting malevolent attacks. The first facial-recognition systems for unlocking phones, for example, were easily fooled by showing the phone a photo of the owner, Jain says; 3D face recognition does better. "Now the biggest challenge is very-high-quality face masks." In one project, Jain and his collaborators are working on detecting such impersonators by looking for skin texture.

But one-to-many searching, as Murray found, isn't so simple. With a large enough watch list, the number of false positives flagged up can easily outweigh the true hits (the sketch at the end of this article shows why). This is a problem when police must make quick decisions about stopping someone. But mistakes also occur in slower investigations.

In January, Robert Williams was arrested at his house in Farmington Hills, Michigan, after a police facial-recognition system misidentified him as a watch thief on the basis of blurry surveillance footage of a Black man, which it matched to his driving licence. The American Civil Liberties Union (ACLU), a non-profit organization in New York City, filed a complaint about the incident to Detroit police in June, and produced a video in which Williams recounts what happened when a detective showed him the surveillance photos on paper. "I picked that paper up, held it next to my face and said, 'This is not me. I hope y'all don't think all Black people look alike.' And then he said: 'The computer says it's you,'" Williams said. He was released after being detained for 30 hours.

ACLU attorney Phil Mayor says the technology should be banned. "It doesn't work, and even when it does work, it remains too dangerous a tool for governments to use to surveil their own citizens for no compelling return," he says.

Shortly after the ACLU complaint, Detroit police chief James Craig acknowledged that the software, if used by itself, would misidentify suspects about 96% of the time.
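The watch-list effect behind both the Met trials and cases such as the Williams arrest comes down to base rates: every scanned face is compared with every enrolled face, so even a tiny per-comparison error rate multiplies out. A back-of-the-envelope sketch; every number below is an assumption chosen for illustration, not a figure from the article or from NIST.

# Why a big watch list breeds false positives. All inputs are assumptions.
false_match_rate = 1e-5   # assumed false matches per single comparison
watchlist_size = 5_000    # assumed enrolled faces
faces_scanned = 100_000   # assumed passers-by scanned during a deployment
suspects_present = 10     # assumed wanted people actually in the crowd
hit_rate = 0.7            # assumed chance of spotting a real suspect

expected_false = faces_scanned * watchlist_size * false_match_rate
expected_true = suspects_present * hit_rate

print(f"Expected false flags: {expected_false:,.0f}")  # 5,000
print(f"Expected true hits:   {expected_true:.0f}")    # 7
# Under these assumptions, false flags outnumber true hits by roughly
# 700 to 1, even though the per-comparison error rate is only 0.001%.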