Facial Recognition Technology Promises Enormous Benefits and Poses New Risks
By Ted T. Martin and William H. Hawkins III

Facial recognition software is now part of our common experience. Powered by machine learning algorithms, this technology allows consumers to unlock smartphones, organize photographs, and tag friends on social media. Businesses use it to admit ticket holders, verify employees' arrivals and departures, personalize customers' in-store shopping, and evaluate the candor of job applicants participating in video interviews.

In addition, many law-enforcement agencies use facial recognition software to aid in the identification of suspects. Photographs of most Americans are included in databases available to law enforcement for searches using this technology. Anecdotal reports highlight the key role played by this technology in criminal investigations, such as the FBI's use of facial recognition software to identify a sex offender who had been a fugitive for almost 20 years.[1] Businesses and practicing lawyers should be aware of a range of issues raised by this technology, which is being rapidly deployed.

Accuracy and Potential Bias

Like any other technology, facial recognition requires an appreciation of both its value and its limitations. Facial recognition software maps details of a person's face, creates a "faceprint," compares that faceprint to others stored in a database, and predicts the likelihood of a match; a simplified sketch of this matching step appears at the end of this section. Machine learning algorithms power facial recognition software. These algorithms are "taught" with human-curated data and change their predictions based on that input.

The overall accuracy of facial recognition algorithms is improving quickly; by one measure, error rates fall by about 50 percent every two years. At the same time, recent research indicates these algorithms are much less accurate when analyzing images of women and people of color. For example, a recent study tested three commercial facial recognition programs and found an error rate for images of women with dark skin as high as 34.7 percent, in contrast to an error rate of 0.8 percent for images of men with light skin.[2] Last year, the American Civil Liberties Union reported that one company's facial recognition software incorrectly identified 28 members of Congress as people who had been arrested for a crime.[3] Eleven of those 28 members of Congress are people of color. In 2018, a civil liberties organization reported that the London Metropolitan Police's facial recognition software had a 98.1 percent false positive rate.[4] In other words, less than two percent of the matches found by the software were accurate.

One possible explanation for these discrepancies is developers' use of insufficiently diverse training data, which encodes human-introduced bias into the algorithms themselves. Algorithms trained, intentionally or unintentionally, on data without enough images of women and people of color may be more prone to error, and training on a more diverse set of images captured under a variety of lighting conditions may reduce or eliminate that bias. In the meantime, users of facial recognition software should take the potential for such errors into account, particularly when the software informs decisions that affect someone's employment or even their liberty.
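
To make the matching step described above concrete, the following is a minimal sketch, in Python, of the database-comparison stage. It is illustrative only: compute_faceprint is a hypothetical stand-in for a vendor's proprietary model, and the 0.6 distance threshold is an assumed value, not an industry standard.

    import numpy as np

    def compute_faceprint(image):
        # Hypothetical stand-in for a vendor's proprietary model that
        # maps a face image to a fixed-length numeric vector.
        raise NotImplementedError("vendor-specific model goes here")

    def find_matches(probe, database, threshold=0.6):
        # Compare one faceprint against every faceprint stored in the
        # database and keep those within the (assumed) distance threshold.
        matches = []
        for name, stored in database.items():
            distance = float(np.linalg.norm(probe - stored))
            if distance < threshold:
                matches.append((name, distance))
        # Rank candidates by distance: the closer two faceprints are,
        # the more likely the software deems them a match.
        return sorted(matches, key=lambda pair: pair[1])

The threshold illustrates the accuracy trade-off discussed in this section: loosen it and the software returns more candidate matches, including more false positives; tighten it and genuine matches may be missed. Commercial systems are far more sophisticated, but they confront the same trade-off.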

Privacy Laws Governing Use by Businesses

Companies using facial recognition software should be aware of new laws protecting the privacy of biometric information. No federal law comprehensively governs use of facial recognition technology. A handful of states have enacted laws regulating companies' collection, use, safeguarding, and/or storage of biometric information. Some of the statutes specifically mention
