The Rise of Citizen Surveillance
Facial recognition technology (FRT) is changing how governments keep people safe, but it's also stirring up big debates about privacy and freedom. This tool, which uses cameras and computers to identify people by their faces, is popping up in police stations, airports, and even public spaces.
⚖️ LAW AND GOVERNMENT
6/10/2025 · 5 min read
Understanding Facial Recognition Technology
Facial recognition technology scans a person's face and compares it to a database of images to figure out who they are. It uses artificial intelligence (AI) to look at things like the shape of your nose, the distance between your eyes, or the curve of your mouth. There are two main types: one-to-one matching, which checks if you are who you say you are (like unlocking your phone), and one-to-many matching, which searches a huge database to find a match (like spotting a suspect in a crowd). Governments use both types, often with cameras in public places or to review video footage.
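To make the two modes concrete, here's a minimal sketch in Python. It assumes faces have already been reduced to numeric "embedding" vectors by an AI model; the function names, the 0.6 threshold, and the toy data are illustrative assumptions, not any real vendor's API.

```python
import numpy as np

# Hypothetical sketch: both matching modes boil down to comparing face
# "embeddings" (numeric vectors an AI model extracts from a photo).
# The 0.6 threshold and the toy vectors below are assumptions.

def cosine_similarity(a, b):
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, claimed, threshold=0.6):
    """One-to-one: does this face match the identity it claims to be?"""
    return cosine_similarity(probe, claimed) >= threshold

def identify(probe, database, threshold=0.6):
    """One-to-many: search a whole database for the best match above threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy data standing in for real model output.
rng = np.random.default_rng(0)
alice = rng.normal(size=128)
db = {"alice": alice, "bob": rng.normal(size=128)}
probe = alice + rng.normal(scale=0.1, size=128)  # noisy new photo of Alice

print(verify(probe, db["alice"]))  # one-to-one check, e.g. phone unlock
print(identify(probe, db))         # one-to-many search, e.g. crowd scan
```

Notice that one-to-many search compares a face against every entry, so as databases grow into the millions, even a tiny per-comparison error rate can produce many false matches.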
You might see FRT at airports to check IDs, in police work to find criminals, or even in stores to watch for shoplifters. But as it spreads, people are asking: Is this technology helping us, or watching us too closely?
Why Governments Love Facial Recognition
FRT has some big upsides for keeping people safe and making government work easier. Here are the main benefits:
Catching Criminals Faster
Police can use FRT to match faces from crime scenes to databases of known suspects. For example, after the January 6, 2021, attack on the U.S. Capitol, law enforcement used FRT to identify some of the rioters. This saves time compared to old-school methods like asking witnesses or flipping through mugshots.
Smoother Airport Security
The Transportation Security Administration (TSA) is testing FRT at places like Los Angeles and Dallas airports. Travelers scan their ID and face, and the system checks if they match. This cuts down wait times and reduces mistakes by TSA agents, making travel less stressful.
Finding Missing People
FRT can help locate lost kids or elderly people with conditions like Alzheimer's. Police have used it to scan public cameras or social media to find missing persons, giving families hope and speeding up searches.
Stopping Fraud
Governments use FRT to make sure people aren't using fake IDs for things like driver's licenses or passports. This helps keep systems secure and ensures services go to the right people.
Keeping Crowds Safe
At big events like concerts or sports games, FRT can spot known troublemakers in real time. After the September 11, 2001 attacks, similar technology helped identify threats with few errors, showing it can work well when used carefully.
The Risks to Our Freedom
Even though FRT can help with safety, it comes with serious downsides that worry many people. Here's why critics say it's risky:
Invading Your Privacy
FRT can watch you without you knowing or agreeing to it. Companies like Clearview AI have taken billions of photos from websites like Facebook to build databases for police, often without permission. This feels like someone sneaking into your life and collecting your personal info.
Unfair to Some Groups
FRT isn't perfect: it makes more mistakes with women and people of color, especially Black and Asian people. A 2019 study by the National Institute of Standards and Technology found Black women were misidentified up to five times more often than white men. This has led to real harm, like when Robert Williams, a Black man in Michigan, was wrongly arrested in 2020 because of a bad FRT match. He was jailed for 30 hours before police realized the mistake.
Scaring People from Speaking Out
If you know cameras with FRT are watching, you might think twice about going to a protest or a place of worship. The American Civil Liberties Union (ACLU) says this "chills" free speech, meaning people stay quiet to avoid being targeted. For example, FRT was used during the 2015 protests in Baltimore after Freddie Gray's death, making activists feel watched.
No Clear Rules
Many agencies use FRT without telling the public or having strict guidelines. The ACLU asked the FBI and other agencies for details about their FRT use but got no answers. This lack of openness makes it hard to trust that the tech is being used fairly.
Governments Misusing It
In places like China, FRT tracks groups like the Uyghur minority to control them. Even in democracies, there's a risk of abuse: in the UK, a man was fined for covering his face to avoid an FRT van, which felt to many like the government being too controlling.
Breaking Legal Protections
Using FRT without a warrant might go against the U.S. Constitution's Fourth Amendment, which protects against unreasonable searches. The Supreme Court's 2018 Carpenter ruling held that police generally need a warrant to track a person's location through cell phone records, and critics argue FRT surveillance is even more invasive.
What’s Happening in Policy Debates
People aren't just sitting back; they're pushing for change. Here's what's happening in the U.S. and beyond:
City Bans
Places like San Francisco, Oakland, and Minneapolis have banned police from using FRT because of its risks. In 2023, New York became the first state to stop schools from using it, saying it wasn't useful and could harm students' privacy.
State Laws
States like California and Virginia have passed rules to limit how FRT is used. Washington state, for example, bans using FRT based on a person's race, gender, or religion and stops it from tracking protests or other free-speech activities. These laws try to keep the tech in check while allowing some uses.
Federal Efforts
The U.S. doesn't have a nationwide law on FRT yet, but there's talk about it. In 2020, lawmakers proposed the "Ethical Use of Face Recognition Act" to set up a group to write rules and pause FRT use until they're ready. Another bill, the "Fourth Amendment Is Not for Sale Act," would stop agencies from buying data from companies like Clearview AI without warrants.
White House Actions
In 2021, President Biden signed an order asking agencies to check whether AI, including FRT, is fair to everyone. The White House also proposed an AI Bill of Rights to protect people from tech abuses. But critics say these steps are too weak to fix the problem.
Global Examples
The European Union has tough rules on FRT through its General Data Protection Regulation (GDPR), which requires consent for collecting face data. The EU's AI Act also limits FRT in risky situations like mass surveillance. These ideas are inspiring some U.S. leaders to push for similar protections.
Lawsuits Shining a Light
Court cases are also exposing FRT’s problems and forcing change. Here are some big ones:
ACLU v. Clearview AI (2020)
The ACLU sued Clearview AI for grabbing billions of social media photos without permission, saying it violated privacy laws. This case showed how private companies can fuel government surveillance without anyone knowing.
Robert Williams v. Detroit Police (2020)
Robert Williams, the man wrongly arrested because of FRT, sued the Detroit Police with the ACLU's help. The case highlighted how errors hit Black people harder and how police didn't tell him the match came from a computer.
Macy's Lawsuit (2020)
Shoppers sued Macy's for using Clearview AI's FRT without their consent, allegedly breaking Illinois' Biometric Information Privacy Act (BIPA). That law says companies need permission to use face data, and the case showed how stores can misuse FRT.
Everalbum Case (2021)
The Federal Trade Commission settled with Everalbum, a photo app that used customer pictures to train FRT without consent. The company had to delete its FRT models, setting a precedent for holding companies accountable.
How to Balance Safety and Rights
FRT isn't all good or all bad; the challenge is finding a middle ground. Here's how we could make it work better:
Make Clear Rules
Governments need laws that say exactly when and how FRT can be used, like requiring consent or banning it for tracking protests. Washington's laws could be a starting point.
Watch the Watchers
Agencies should have strict policies and training to avoid misuse. The U.S. Government Accountability Office found some agencies had no rules at all, which led to problems. Regular audits could catch issues early.
Listen to the Public
Decisions about FRT shouldn't be made in secret. Groups that bring together police, tech experts, and everyday people, like the Constitution Project's task force, can make sure everyone's voice is heard.
Fix the Tech
FRT needs to be more accurate, especially for women and people of color. This means using better data to train it and testing it often to catch biases (see the sketch after this list).
Use Other Tools
Instead of relying on FRT, governments could use less invasive options, like better police training or community programs, to keep people safe.
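What would "testing it often to catch biases" look like in practice? Here's a minimal sketch in Python, assuming an agency logs each match the system makes along with the subject's demographic group and whether the match later proved correct. The field names and toy records are hypothetical, not from any real audit system.

```python
from collections import defaultdict

# Hypothetical bias audit: compute the false-match rate per demographic
# group from logged match results. A system that is fair in this sense
# should show roughly similar rates across groups; a large gap (like the
# NIST finding cited above) is a red flag worth investigating.

def false_match_rates(match_log):
    """match_log: records with 'group' and 'correct' (was the match right?)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for record in match_log:
        totals[record["group"]] += 1
        if not record["correct"]:
            errors[record["group"]] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Toy log standing in for real audit data.
log = [
    {"group": "A", "correct": True},
    {"group": "A", "correct": True},
    {"group": "B", "correct": False},
    {"group": "B", "correct": True},
]
print(false_match_rates(log))  # {'A': 0.0, 'B': 0.5}
```

The point of a routine like this isn't the arithmetic, which is simple, but the requirement behind it: agencies would have to record outcomes and check them regularly, rather than trusting the vendor's accuracy claims.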