
Your Friday night stroll through the French Quarter just became data for a facial recognition algorithm. For two years, New Orleans quietly became America's most surveilled city without telling you.
Project NOLA, a nonprofit you've probably never heard of, quietly installed more than 200 AI-powered cameras that scan your face every time you walk past. The system compares your features against a database of 30,000 individuals, sending instant alerts to police when it thinks it recognizes someone. Unlike your iPhone's Face ID, which protects your privacy, these cameras treat your biometric data like public property.
The Tech That Turns Every Street Into a Police Station
The cameras aren’t your typical security setup. These sophisticated systems can identify you from 700 feet away—even in poor lighting conditions that would stump your phone’s camera.
When the AI spots a match, officers receive a mobile alert faster than your DoorDash notification arrives. Your location, identity, and a confidence score appear on their screens within minutes.
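The mechanics behind an alert like that are simpler than they sound. Here is a minimal sketch of how an embedding-based watchlist matcher works in general; the function names, threshold, and toy vectors are illustrative assumptions, not details of Project NOLA's actual system:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_against_watchlist(probe, watchlist, threshold=0.85):
    """Return (identity, confidence) for the best watchlist match above
    the threshold, or None. `watchlist` maps identity -> stored embedding."""
    best_id, best_score = None, threshold
    for identity, stored in watchlist.items():
        score = cosine_similarity(probe, stored)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_id else None

# Toy 3-dimensional embeddings; real systems use hundreds of dimensions.
watchlist = {
    "person_001": [0.9, 0.1, 0.4],
    "person_002": [0.1, 0.8, 0.6],
}
alert = match_against_watchlist([0.88, 0.12, 0.42], watchlist)
if alert:
    print(f"ALERT: {alert[0]} (confidence {alert[1]:.2f})")
```

The threshold is the whole ballgame: set it low and officers drown in false alerts, set it high and the system misses real matches. That trade-off is exactly why the confidence score shown to police matters so much.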
Project NOLA’s founder Bryan Lagarde, a former cop turned tech entrepreneur, assembled the watchlist from police mugshots and arrest records. The system essentially turns every camera-equipped business into an extension of the police surveillance network.
The technology goes beyond simple identification. Upload any photo, and Project NOLA can trace that person's movements across the entire city for the past 30 days. Your coffee shop visits, restaurant stops, and evening walks become a digital breadcrumb trail that makes Google's location tracking look amateur, almost as if the next step is China's autonomous spherical police robot rolling down Bourbon Street.
When Private Surveillance Meets Public Policing
Your rights got lost in a loophole designed by lawyers who watch too much CSI. New Orleans banned facial recognition in 2020, then relaxed the restrictions in 2022 for violent crime investigations only.
The catch? Police were supposed to get human oversight and report every use to the city council.
Instead, they received automated alerts from Project NOLA’s private network. No oversight. No reporting. No accountability.
Police reports from the 34 arrests frequently omitted any mention of facial recognition technology. Defendants never knew their arrests stemmed from AI surveillance, denying them the chance to challenge the evidence in court. "By adopting this system – in secret, without safeguards, and at tremendous threat to our privacy and security – the City of New Orleans has crossed a thick red line," said ACLU attorney Nathan Freed Wessler.
The Reality Check That Privacy Advocates Feared
Facial recognition errors aren't theoretical; they're already documented disasters. Randall Reid spent six days in jail after Louisiana deputies used Clearview AI to wrongly identify him in surveillance footage from a state he'd never visited.
Robert Williams was falsely arrested in Detroit for shoplifting he didn’t commit.
This unchecked expansion of surveillance technology represents a growing threat to privacy and democracy. Your digital anonymity in public spaces just evaporated without your consent. New Orleans Police Chief Anne Kirkpatrick finally suspended the system in April after media scrutiny intensified, but Project NOLA staff still receive alerts and can relay information to officers by phone or text.
The question isn’t whether this technology works—it’s whether you want to live in a city where every face becomes a potential suspect.