Amanda Levendowski, J.D. (she/her) is an Associate Professor at Georgetown Law, where her scholarship examines how intellectual property law can be used to creatively address challenging social issues that crosscut privacy and technology, such as nonconsensual pornography, biased artificial intelligence, secret surveillance technology, and invasive face recognition.
Research Project: Resisting Face Recognition with Copyright Law
Face recognition technology used by law enforcement poses a unique danger to marginalized communities because of deep-rooted historic, demographic, and deployment biases. Existing approaches to tackling face surveillance – criminal prosecution, corporate moratoria, and local legislation – have had limited success, and each requires vast shifts in prosecutorial, practical, or political will while carrying its own drawbacks. Because face recognition technology is fueled by copying and reproducing copyrightable profile pictures, my research explores whether copyright law offers a new pathway to resisting face surveillance.
Main Research Question(s)
Can we use copyright law to resist biased, invasive face surveillance?
Research Methodology
I drafted a law review article that is both descriptive and normative in its approach. Descriptively, the piece details the historic, demographic, and deployment biases of face surveillance and outlines the law and policy challenges posed by enforcing the Computer Fraud and Abuse Act, encouraging corporate moratoria, and enacting local legislative bans. Normatively, the piece argues that we ought to use copyright law to resist face surveillance without waiting for companies or Congress to catch up.
Significant or Surprising Findings
Yes! I was shocked by the historic biases that informed the development of face surveillance – the problems of nonconsensual data collection and use, as well as carceral applications, are as old as the technology itself. I was also surprised to learn how well-situated copyright law is to resist face surveillance – the case law on search engines and private subscription services provided a powerful analytical framework for assessing whether face surveillance companies could claim their business models qualified as fair use. (They cannot.)
Summary of Findings or Progress
Face surveillance poses a unique danger to marginalized communities and public privacy, and developing a way to resist its use by law enforcement is urgent. Copyright law is generally used to enforce economic rights, but it is surprisingly well-positioned to resist face surveillance. Copyrightable profile pictures are copied to train face recognition algorithms and to respond to search queries. While some uses of copyrightable works to train AI systems are not infringing because of fair use, face surveillance is not one of those uses, for three key reasons. First, face surveillance companies use scraped profile pictures for the same purpose as web users: particularized identification. Second, face surveillance companies claim to be transformative search engines, but their operations are more like private subscription services, which are consistently found not to be fair use. And finally, copying and reproducing profile pictures harms the unique licensing market for these photographs, a market that only grows as companies and researchers increasingly reject scraped photographs as sources of training data. Because face surveillance does not qualify as fair use, each copy is intrusive, unauthorized, and ultimately infringing.