Amanda Levendowski

G+JI Faculty Fellow 2021

Amanda Levendowski, J.D., Associate Professor of Law, Georgetown University Law Center

Amanda Levendowski, J.D. (she/her) is an Associate Professor at Georgetown Law, where her scholarship examines how intellectual property law can be used to creatively address challenging social issues that cut across privacy and technology, such as nonconsensual pornography, biased artificial intelligence, secret surveillance technology, and invasive face recognition. Learn more about Amanda Levendowski.

Research Project: Resisting Face Recognition with Copyright Law

What is your research topic and why?

Face recognition technology used by law enforcement poses a unique danger to marginalized communities because of deep-rooted historic, demographic, and deployment biases. Existing approaches to tackling face surveillance – criminal prosecution, corporate moratoria, and local legislation – have had limited success and require vast shifts in prosecutorial, practical, or political will, each with its own drawbacks. Face recognition technology is fueled by copying and reproducing copyrightable profile pictures, and my research explores whether copyright law offers a new pathway to resisting face surveillance.

Main Research Question(s)

Can we use copyright law to resist biased, invasive face surveillance?

Research Methodology

I drafted a law review article that is both descriptive and normative in its approach. Descriptively, the piece details the historic, demographic, and deployment biases of face surveillance and outlines the law and policy challenges posed by enforcing the Computer Fraud and Abuse Act, encouraging corporate moratoria, and enacting local legislative bans. Normatively, the piece argues and concludes that we ought to use copyright law to resist face surveillance without waiting for companies or Congress to catch up.

Significant or Surprising Findings

Yes! I was shocked by the historic biases that informed the development of face surveillance – the problems of nonconsensual data collection and use, as well as carceral applications, are as old as the technology itself. I was also surprised to learn how well-situated copyright law is to resist face surveillance – the case law on search engines and private subscription services provided a powerful analytical framework for assessing whether face surveillance companies could claim their business models qualified as fair use. (They cannot.)

Summary of Findings or Progress

Face surveillance poses a unique danger to marginalized communities and public privacy, and developing a way to resist its use by law enforcement is urgent. Copyright law is generally used to enforce economic rights, but it is surprisingly well-positioned to resist face surveillance. Copyrightable profile pictures are used to train face recognition algorithms and respond to search queries. While some uses of copyrightable works to train AI systems are not infringing because of fair use, face surveillance is not one of those uses for three key reasons. First, face surveillance companies use scraped profile pictures for the same purpose as web users: particularized identification. Second, face surveillance companies claim to be transformative search engines, but their operations are more like private subscription services that are consistently found not to be fair use. And finally, copying and reproducing profile pictures harms the unique licensing market for these photographs, which only grows as companies and researchers increasingly reject scraped photographs as sources of training data. Because face surveillance does not qualify as fair use, each copy is intrusive, unauthorized, and ultimately infringing.

The key support is rooted in the relevant statutes and case law: 17 U.S.C. §§ 106, 107; Google LLC v. Oracle America, Inc., 141 S. Ct. 1183 (2021); Campbell v. Acuff-Rose Music, Inc., 510 U.S. 569, 579 (1994); Kelly v. Arriba Soft Corp., 336 F.3d 811 (9th Cir. 2003); Perfect 10, Inc. v. Amazon.com, Inc., 508 F.3d 1146 (9th Cir. 2007); Associated Press v. Meltwater U.S. Holdings, Inc., 931 F. Supp. 2d 527, 543 (S.D.N.Y. 2013); Fox News Network, LLC v. TVEyes, Inc., 883 F.3d 169 (2d Cir. 2018), cert. denied, 139 S. Ct. 595 (2018); and Authors Guild v. Google, Inc., 804 F.3d 202 (2d Cir. 2015).

Together, these cases reveal that face surveillance companies are less like search engines, which are consistently found to be fair use, and more like private subscription services, which have never been found to be fair use. The conclusion is further supported by a range of scholarly work suggesting that copyright law can be invoked to protect privacy interests, as cited throughout the article.