Cities Start Going Flockless, NYPD Cameras Lawsuit, CSAM & the Private Search Doctrine, OSINT Intro for Witness Social Media & More
Vol. 6, Issue 11

November 3, 2025
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Jon Campbell explains the growing pushback on Flock ALPR cameras. Joel Schmidt discusses a recent lawsuit brought by the Surveillance Technology Oversight Project against the NYPD, regarding the NYPD’s cameras looking into private homes. Jerome Greco reviews two recent federal court decisions about how CSAM investigations are conducted. Finally, Allison Young answers a question about how to find a witness’ social media accounts and posts.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal Defense, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News

The Flocklash
Jon Campbell, DFU Paralegal II
The use of automated license plate readers (ALPR) has been steadily expanding for the better part of two decades, and as we’ve detailed before, privacy advocates have been warning about potential abuses for just as long.
Even as the technology proliferates, courts have refused to restrict the government’s use of ALPR in any meaningful way, leaving it as unregulated as it is ubiquitous. The upshot is that networks once used to flag stolen cars are now a routine part of all kinds of police work, along with distressingly popular off-label uses like stalking and harassing ex-wives and girlfriends.
No single company has turbo-charged ALPR’s spread in recent years more than Flock Safety. With consumer-friendly branding and hardware, Flock markets itself aggressively outside of traditional channels, targeting police departments of all sizes, but also school districts, apartment complexes, and even homeowners’ associations. Founded in 2017, Flock has quickly become the largest ALPR provider in the U.S., with more than 6,000 organizations and 80,000 cameras feeding into its network [PDF]. (By contrast, the NYPD’s ALPR provider, Motorola Solutions, formerly Vigilant Solutions, was founded in 2005 and boasts [PDF] just 3,700 organizations on its roster.)
But that kind of growth brings scrutiny. Flock’s recent entanglement in two contentious policies — the Trump administration’s increasingly brutal deportation frenzy, and the crackdown on reproductive rights by some state and local authorities — has prompted a surge in advocacy against the most recognizable name in ALPR.
Call it the Flocklash.
Over the past nine months, communities big and small — including Los Angeles County; Boulder, CO; Eugene, OR; Austin, TX; Berkeley, CA; Sedona, AZ; Scarsdale, NY; and Evanston, IL, among others — have taken steps to rein in Flock. Some have changed their data-sharing policies to prevent immigration authorities from tapping their systems, while others have cancelled their contracts outright.
In some cases, the pushback is being led by government, as in Illinois, where the Secretary of State in August determined the company’s policies violate state law. That probe began after a Texas sheriff used Flock data to criminally investigate a woman for obtaining an abortion. The sharing of Flock data with immigration authorities prompted a similar finding by California’s Secretary of State.
Elsewhere, the backlash is bubbling up from below. In Austin, TX, local elected officials nodded to resident pressure and cancelled their Flock contract this summer. Hillsborough, NC, did the same just a few days ago, by unanimous vote, no less.
Things haven’t always gone smoothly, even after the votes are cast. When leaders in Evanston, IL, voted to dump Flock, the company initially removed their cameras from public property at the city’s direction. It then inexplicably reinstalled them two weeks later. The city finally resorted to wrapping the cameras in black plastic bags and duct tape.
Flock seemingly recognizes its PR problem. The company has been sending representatives to community meetings to defend itself. Flock’s CEO Garrett Langley has also tried to get in front of the story, doing a series of media hits talking up Flock’s crime-fighting ability.
Langley, for his part, isn’t burdened with worries about privacy. He attributes the recent outcry to “misinformation,” even as he acknowledges local data was improperly shared with the feds. (Langley also isn’t burdened by modesty; his declared vision for Flock is to end all crime in the United States, full stop, and to do so in the near future. As he recently told a podcast host: “I do generally think that, in 10 years from now, Flock will have eliminated crime in America.”)
Surveillance technology tends to advance in only one direction, so the pushback against Flock is notable. In tiny Sedona, Arizona, the campaign against Flock was spearheaded by a single local resident, Sandy Boyce, who organized her neighbors and petitioned the city council. Boyce told AZCentral.com that she hoped residents in other communities would do what hers did, rather than just accepting the status quo.
“We don’t have to just look at them and be annoyed,” Boyce said of the unwanted devices, and the increasingly pervasive surveillance state they represent. “We can actually do something.”

New York: Where It’s Always 1984
Joel Schmidt, Digital Forensics Staff Attorney
What would you do if the NYPD pointed a surveillance camera directly into your home? What if it was aimed at your bedroom or living room? If that sounds far-fetched, think again. Pamela Wridt and Robert Sauve of Brooklyn recently filed a federal lawsuit against the City of New York alleging NYPD surveillance cameras can do, and did, exactly that.
According to the lawsuit, the NYPD affixed a pair of constantly recording surveillance cameras directly outside their home, aimed at their bedroom and living room windows. These cameras “transformed what should be their place of safety into a space of anxiety”, causing them to cover their windows with foil, “depriving themselves of sunlight and the simple enjoyment of looking outside.”
As alleged, the NYPD’s CCTV cameras feed into a central surveillance repository known as the Domain Awareness System (DAS) [PDF], which also ingests data from automated license plate readers, ShotSpotter, arrest and complaint reports, and 911 calls, among numerous other sources of information. “From the day it was launched, the DAS has subjected New Yorkers to suspicionless, city-wide surveillance that undermines their rights,” the lawsuit states. “It is an unprecedented violation of American life and now stands as one of the largest surveillance networks operated anywhere in the world.”
“No other police department in the country has ever amassed this much data. Tens of thousands of NYPD cameras, tens of thousands of private cameras, automated license plate readers, transit data, drone footage, all social media monitoring, predictive policing software,” said Albert Cahn, the founder of the Surveillance Technology Oversight Project. “It is basically a living Orwellian nightmare.” He warned that “The NYPD may be the worst offender, but they’re also a model for the nearly 18,000 state and local police departments across the country that are increasingly acting like mini-NSA and CIA operations. American policing runs on data.”
The lawsuit argues this pervasive surveillance infringes on the Fourth Amendment to the United States Constitution because it is a severe invasion of one’s privacy, and on the First Amendment because it infringes on the right to free association and free expression.
As reported by Bloomberg Law, the plaintiffs “want the city to cease using DAS until remedial measures are in place to protect privacy rights, prevent the city from accessing data without a warrant, and require the city delete all data stored in the system after 90 days.”
Welcome to New York, where it’s always 1984.
In the Courts

CSAM, Hash Values, and Search Warrants
Jerome D. Greco, Digital Forensics Director
Modern cases involving child sexual abuse material (CSAM), still referred to as “child pornography” under federal law, almost always involve some form of technology: P2P sharing, onion services, hash values, cloud storage, etc. This adds a wrinkle: the cases demand a technical competency that many courts seem to lack, increasing the likelihood of a bad ruling or of factually inaccurate testimony being admitted.
However, courts appear to be getting a better grasp on the intersection of technology and CSAM investigations, and the effects it has on the privacy rights of all people who use the internet. In two recent decisions, United States v. Guard, 152 F.4th 375 (2d Cir. 2025) and United States v. Braun, No. 24-CR-164 (E.D. Wis. 2025), the courts discussed the issues with electronic service providers (ESPs) reporting to the National Center for Missing & Exploited Children (NCMEC), file hashing, and the private search doctrine.
In Guard, the mobile messaging app Kik reported to NCMEC, through the latter’s CyberTipline, that Microsoft PhotoDNA had detected that one of Kik’s users shared CSAM through its service. After the files were flagged, a Kik employee reviewed them and confirmed that they were CSAM. NCMEC identified the user’s IP address as located in New York and forwarded the report to the New York State Police, who reviewed the files and, after some additional investigation, obtained a search warrant for the defendant’s home and arrested him. The Second Circuit found that although NCMEC is a private nonprofit, it is a government entity for Fourth Amendment purposes. The defense, however, failed to meet its burden to show that “there was a sufficiently close nexus” between Kik’s searches and the government, though the court left open the possibility for defendants in future cases to show that an ESP acted as a government entity. Additionally, in a footnote, the court seemed to lay a path for a potentially more successful challenge, one not raised in the instant case. The defense argued only the “close nexus” test to try to establish that Kik was a state actor, but there is also the “compulsion test”: a future defendant could argue that an ESP was a state actor because it was “coerced or compelled by NCMEC to conduct the searches.” Considering the legal mandate that an ESP report to NCMEC when it has actual knowledge of CSAM on its platform, compulsion would appear to be a strong argument. The defense would still likely have to counter the prosecution’s foreseeable argument that ESPs are not required to “affirmatively search, screen, or scan” for CSAM and therefore are not compelled by the government.
In Braun, NCMEC received CyberTipline reports from Microsoft and Google. However, similar to United States v. Maher, 120 F.4th 297 (2d Cir. 2024), and unlike in Guard, neither ESP had a human review the files to confirm they were CSAM before sending the report and files to NCMEC. The reports and files were viewed by law enforcement, and the evidence was used to obtain a search warrant, which led to the defendant’s arrest and conviction. The Braun court recognized that the Seventh Circuit had yet to rule on
“whether the private search doctrine authorizes law enforcement to conduct a warrantless examination of the contents of a digital file where the ESP has not visually inspected the contents of the file but instead relied on hash matching in making a CyberTipline report.”
Additionally, the four circuits that had decided this issue were split. The Fifth and Sixth Circuits found that the private search doctrine did apply to these circumstances, whereas the Second and Ninth Circuits determined that it did not. The Braun court aligned itself with the latter, finding that the private search doctrine “does not authorize government authorities to conduct a warrantless human visual examination of the contents” of a file reported to be CSAM, solely based on a hash value match. Similar to Maher, the court determined that law enforcement should get a warrant first, and it warned that they should provide facts in a search warrant application establishing the reliability of the hash matching.
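For readers unfamiliar with the mechanics, the hash matching at issue can be illustrated with a minimal Python sketch using made-up values. Note the caveat: real ESP pipelines use Microsoft PhotoDNA, a perceptual hash designed to survive resizing and re-encoding, whereas the cryptographic SHA-256 below matches only byte-for-byte identical files. The function and hash list here are hypothetical, purely for illustration.

```python
import hashlib

# Hypothetical database of known hashes (illustrative value only).
# Real systems use perceptual hashes like PhotoDNA; SHA-256 is
# cryptographic and matches only exact byte-for-byte copies.
KNOWN_HASHES = {
    hashlib.sha256(b"known-file-contents").hexdigest(),
}

def flag_file(data: bytes) -> bool:
    """Return True if the file's SHA-256 matches a known hash."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# An exact copy matches; a file altered by even one byte does not.
print(flag_file(b"known-file-contents"))   # True
print(flag_file(b"known-file-contents!"))  # False
```

The gap between a hash match and a human viewing the file is exactly where the courts' reliability concern lives: the match asserts only that two fingerprints are identical, not that anyone has confirmed what the file depicts.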
Ask an Analyst
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. I need to research the social media presence of a party in a case. We suspect they may be posting about the client or the alleged incident. However, I only have the party’s name and general location. What are some online tools you would recommend using to gather data in this situation?
A. This is a great question and one that’s hard to answer with a particular recipe or list of links. OSINT (open source intelligence) is the name for investigations that entail gathering and analyzing “open source” information, like public social media profiles, dark web forums, blockchain records, and search engine results. You probably already use OSINT without realizing it when you look up photos of a restaurant to guess the dress code, or when you Google people you’ve met at parties.
When it comes to free tools, the OSINT Framework website is a common starting point for finding resources (although many of its links are dead). I recently found my personal information on services like That’s Them and ZabaSearch,* despite maintaining an active subscription to a service that tries to remove this information from the internet.
You can get very far in researching someone with “dorking,” which I use every day. Google dorking is a method of running advanced searches on engines like Google to refine the results by date, negative keyword, filetype, website, and more. For example, if I wanted to get an idea of 2010s fashion trends but didn’t want to see anything with the word “floral,” I might type “new fashion -floral site:reddit.com before:2020 after:2010” into Google and check the image results. Google search results for individuals generally include a large number of yellow-pages-style listings that you can cross-reference to gather information, as well as obituaries referencing your target that include a detailed family tree.
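To make the operators concrete, here is a small hypothetical Python helper (the function name and parameters are my own, not a standard tool) that assembles a dork string from a name, location, and the operators mentioned above; the output is something you would paste into the search box.

```python
def build_dork(name: str, location: str = "", site: str = "",
               exclude: str = "", before: str = "", after: str = "") -> str:
    """Assemble a Google 'dork' query string from common operators."""
    parts = [f'"{name}"']            # exact-phrase match on the name
    if location:
        parts.append(f'"{location}"')
    if site:
        parts.append(f"site:{site}")  # restrict results to one site
    if exclude:
        parts.append(f"-{exclude}")   # negative keyword
    if before:
        parts.append(f"before:{before}")
    if after:
        parts.append(f"after:{after}")
    return " ".join(parts)

print(build_dork("Jane Doe", location="Albany NY",
                 site="facebook.com", exclude="obituary"))
# "Jane Doe" "Albany NY" site:facebook.com -obituary
```

Running variations of the same query against different `site:` values (Facebook, LinkedIn, Reddit, local news) is often faster than each platform’s own search.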
If you are comfortable with running tools from the terminal, there are several, like Sherlock, that are useful for further enumerating data about a target. However, search results will contain false positive information, like incorrect addresses or the phone numbers of relatives (instead of the target). Like AI, public information is known to “hallucinate,” so be sure to validate any information gathered.

Once you have a generic profile, you can use that to enrich your social media searches. A person’s home address and employer can help you identify businesses they might follow on Facebook. Lists of potential family members can lead you to more easily identifiable public accounts that you can then include in your review. While your target may use an alias for social media, their parents or siblings may use real names and may tag the alias in posts about the family.
There are also paid OSINT tools you can use. Services like OSINT Platform and Irbis PRO are used by some law enforcement agencies to run social media searches. If you have someone’s photo, there is a growing market for facial recognition tools like PimEyes that are not only available to law enforcement but also offered to consumers for a price.
These suggestions may feel creepy, especially to the readers of a newsletter that discusses privacy issues. Your gut feeling isn’t wrong. Some OSINT tactics operate in the disturbing gray market for personal information traded by data brokers. A common practice, creating fake personas or “sock puppet” accounts, violates the terms of service of social media sites and may violate rules of professional responsibility for law firms, but it is still practiced by law enforcement investigators. When investigating someone, proceed mindfully and proceed with caution.
* Both of these services are paywalled and blur personal information, but the blur is a style applied to the text. It is possible (observed November 1st, 2025) to remove the blur and view text content using the developer tools that are built into popular browsers like Chrome and Edge.
Upcoming Events
October 27 - November 6, 2025
NYC Public Interest Technology Pop Up (CUNY PIT Lab, BetaNYC, and others) (New York, NY)
November 13, 2025
EFFecting Change: This Title Was Written by a Human (EFF) (Virtual)
November 16-22, 2025
SANS DFIRCON Miami 2025 (Coral Gables, FL and Virtual)
November 18, 2025
Digital Dilemmas Ethics Rules and Responsibilities in an Electronic World (NYSBA) (Virtual)
From the Workplace to ICE: Stop the AI Surveillance Pipeline Now! (PowerSwitch Action, Data & Society, Just Futures Law, & others) (Virtual)
November 19, 2025
18th Annual BCLT Privacy Lecture: Blanket Opt-Outs (UC Berkeley Law) (Berkeley, California)
November 20, 2025
Understanding AI: Standing Up for Human Value in the AI Economy (Data & Society and The NY Public Library) (New York, NY and Virtual)
December 17, 2025
Beyond the Basics: Advanced Cybersecurity for Lawyers (NYC Bar) (Virtual)
March 9-12, 2026
LegalWeek (ALM Law.com) (New York, NY)
March 18-22, 2026
SANS OSINT Summit & Training 2026 (Arlington, VA and Virtual)
April 23-25, 2026
19th Annual Making Sense of Science: Forensic Science, Technology & the Law (NACDL) (Las Vegas, NV)
Small Bytes
ChatGPT image snares suspect in deadly Pacific Palisades fire (BBC)
New York court system sets rules for AI use by judges, staff (Reuters)
The Surveillance Empire That Tracked World Leaders, a Vatican Enemy, and Maybe You (Mother Jones)
When Face Recognition Doesn’t Know Your Face Is a Face (Wired)
Teen Sues Maker of Fake-Nude Software (Wall Street Journal)
The Age of AI Anxiety — and the Hope of Democratic Resistance (Tech Policy Press)
Amazon’s AWS outage caused internet-enabled mattresses to malfunction (Washington Post)
Baltimore County student shocked by school’s AI Detection mishap (WMAR 2 News)
DHS Wants a Fleet of AI-Powered Surveillance Trucks (Wired)
Someone Snuck Into a Cellebrite Microsoft Teams Call and Leaked Phone Unlocking Details (404 Media)
You Can’t Refuse To Be Scanned by ICE’s Facial Recognition App, DHS Document Says (404 Media)





