
Should law enforcement agencies have access to facial recognition technology?

As facial recognition software becomes more advanced, many law enforcement agencies are using these technologies to identify criminals. At the same time, many people are concerned that facial recognition may be used to invade their privacy, and evidence shows the algorithms in these tools reflect and perpetuate harmful racial and gender biases. As a result, many non-profit organizations, local governments, and the federal government have begun to ask whether law enforcement agencies should have access to facial recognition software.


The Debate Over Facial Recognition Technology's Role In Law Enforcement

Amazon Halts Police Use Of Its Facial Recognition Technology

Communities come face-to-face with the growing power of facial recognition technology

Additional resources to think about

How police are using facial recognition on civilians
This story from a Canadian Broadcasting Corporation news program investigates which police departments across Canada use facial recognition software and what that means for civil liberties.

Is Facial Recognition Invading Your Privacy?
In this episode of Above the Noise, Myles explores both sides of the debate over the use of facial recognition technology.

How does facial recognition work?
This informational article from NortonLifeLock details how facial recognition software works, who uses it, and how it can become a privacy issue.

How China Is Using Facial Recognition Technology
A quick look from NPR at how the Chinese government is employing facial recognition, especially to surveil its Uighur Muslim ethnic minority population.

London police to use facial recognition cameras, stoking privacy fears
This article from the PBS NewsHour delves into how British law enforcement is using facial recognition throughout London.

San Francisco bans facial recognition technology
This story from CBS News discusses the California city's ban of police use of facial recognition technology, including the racial biases that come with the system.

Facial Recognition Bias | Greater Boston
This video from GBH News covers an MIT study that explored the racial and gender biases associated with AI and facial recognition technology.

Face It, You're Being Watched
This video from Bloomberg Quicktake explains why facial recognition's advance is so alarming to regulators, the public, and even the people developing it.


Who created this message?

  • What kind of “text” is it?
  • How similar or different is it to others of the same genre?
  • What are the various elements (building blocks) that make up the whole?


What creative techniques are used to attract my attention?

  • What do you notice (about the way the message is constructed)? 
  • What’s the emotional appeal?
  • What makes it seem “real”?
  • What persuasive devices are used?

How might different people understand this message differently from me?

  • How many other interpretations could there be?
  • How could we hear about them?
  • How can you explain the different responses?

What lifestyles, values, and points of view are represented in, or omitted from, this message?

  • What type of person is the reader/watcher/listener invited to identify with?
  • What ideas or perspectives are left out?
  • How would you find what’s missing?
  • What judgments or statements are made about how we treat other people?


Why is this message being sent?

  • What's being sold in this message? What's being told?
  • Who is served by or benefits from the message
    – the public?
    – private interests?
    – individuals?
    – institutions?

5 Key Questions of Media Literacy used with permission from the Center for Media Literacy.
Copyright 2002-2021, Center for Media Literacy.


