Real-Time Facial Recognition, White House AI Executive Order, Failure to Preserve Video, State of the Surveillance State & More
Vol. 7, Issue 1
January 5, 2026
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Jon Campbell discusses the expansion of real-time facial recognition to body-worn cameras. Max Behrman covers the effects of the White House’s Executive Order on AI. Joel Schmidt examines the Exclusionary Rule as it applies to unpreserved police surveillance video. Finally, Jerome Greco gives the second annual State of the Surveillance State Address.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal Defense, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News

Real-Time Facial Recognition Crosses a Threshold
Jon Campbell, Digital Forensics Paralegal II
Facial recognition technology (FRT) crossed a worrying threshold in December with news that Axon would begin what it calls “field research” of the tech on its ubiquitous body-cams. The one-month pilot will equip fifty police officers in Edmonton, a city of just over 1 million people in Canada’s western prairies, with body-cams that can scan and ID passersby in real time.
In a statement signed by CEO Rick Smith, Axon explicitly describes the program as a prelude to expanded use of FRT on body-cams in the U.S. and elsewhere. The real-world test, he wrote, will allow the company to apply its “learnings” to a future release here at home.
Facial recognition is already in routine use by police in the U.S., but typically only as a retrospective tool. The NYPD uses FRT to identify suspects in crimes of all kinds — with many errors made along the way — by comparing stills from surveillance cameras, for example, with a proprietary database of booking photos. Many other departments use the technology in similar ways.
But as far as is publicly known, no U.S. police agency currently uses the kind of constant, real-time FRT that Axon is unleashing in Canada. (The one exception is arguably the city of New Orleans, which uses a real-time FRT system built and maintained by a nonprofit organization, indirectly leveraging the technology while technically not maintaining a system of its own.)
According to local reporting, Axon’s cameras will scan the face of any unlucky Edmontonian who passes through their field of vision, comparing each to a database of mugshots. For this one-month trial run, the cameras are on the lookout only for individuals wanted for serious crimes like “murder, aggravated assault and robbery,” the Edmonton Journal reports. The city says it will delete the stills used for FRT after the trial period, a measure meant to mitigate privacy worries.
If the emphasis on such targeted use — sometimes called “hot list” functionality — sounds familiar, it’s because it echoes the framing used by law enforcement when automatic license plate reader (ALPR) technology was adopted widely by police agencies more than a decade ago. Back then, police downplayed ALPR’s potential for mass surveillance. Rather, the “principle objective [was] to identify vehicles that are wanted,” as a representative for the International Association of Chiefs of Police explained in 2013.
Civil libertarians warned then that as both ALPR cameras and data storage became ever more affordable, it would be trivially easy to assemble massive databases of routine movements, accessible to police for any reason at all.
In the years since, of course, that scenario has played out exactly as feared. If the past is any guide, the experiment in Canada may mark the start of a new era for another dangerous and invasive technology.

White House Executive Order Threatens State Regulations of AI
Max Behrman, Digital Forensics Staff Attorney
Just before ringing in the new year, on December 11, 2025, the White House issued an executive order prohibiting states from enacting their own laws to regulate artificial intelligence (AI).1 The administration’s purported rationale for preventing individual states from regulating this novel, oftentimes dangerous technology was to both ensure a competitive edge against China and help tech companies avoid navigating a 50-state patchwork of AI regulation. This means that at least for now, with no federal guardrails in place, tech companies can continue over-promising AI’s capabilities to any industry that will listen. And one industry champing at the bit is law enforcement.
This is a huge problem because we know today’s AI platforms make mistakes all the time. So before law enforcement is allowed to use such unreliable tech — putting aside for now whether police should ever be allowed to use it — states must be permitted to explore ways to prevent (or at least mitigate) the potentially severe consequences from police using AI.
There are plenty of examples of government officers employing AI. Whether it’s to argue someone should remain in jail, help them draft use-of-force reports, or try to identify people based on their face or behavior, law enforcement has demonstrated an eagerness to leverage AI’s ability to sift through vast troves of data to help them solve crimes and surveil us. But the fact that these AI platforms can get things so wrong means we must: have more insight into how these platforms function; ensure they continually undergo thorough validation testing; and better understand how their relied-on training data is collected and used, before putting these tools in the hands of those who can deprive us of our liberty.
The good news is that some attempts are already being made: a recently enacted California law requires that, when an officer uses AI to prepare a police report, the report’s first draft, along with an audit trail, be kept “for as long as the official report is retained.” In Montana, the governor signed a bill that, barring some concerning exceptions,2 prohibits law enforcement and other government entities from using AI for classification or surveillance purposes. And closer to home (albeit not specifically tied to law enforcement), Gov. Kathy Hochul recently signed the Responsible AI Safety and Education (RAISE) Act, requiring that AI companies prepare and annually review “written safety and security protocol[s],” conduct regular independent audits, and disclose critical safety incidents within 72 hours of learning about them.
No one can know for sure how technology will develop over time, and the same is true for AI. Perhaps someday we’ll be able to sufficiently scrutinize it so that we can, with certainty, understand how it comes up with responses to user queries. Perhaps someday we’ll ensure the Big Tech companies that peddle AI are being fully transparent about its functionalities and underlying training data. But right now, AI is not ready for prime time.3 It is a beta product, and we are the beta testers. And until it is ready — if it ever is — those with the power to use force cannot be allowed to, without any oversight, rely on this fallible technology in conjunction with their already-powerful arsenal of surveillance tools. With no federal statutes or regulations in place, it’ll be on state legislators and governors to hold law enforcement agencies accountable as they try to push the bounds of AI-assisted policing.
In the Courts

Digital Evidence and the Fourth Amendment
Joel Schmidt, Digital Forensics Staff Attorney
As we begin 2026, we need no reminder that our world has become increasingly digitized, and unsurprisingly, criminal defense attorneys routinely encounter digital evidence in criminal cases. The prevalence of digital evidence requires vigilance to ensure the government doesn’t benefit when it destroys digital evidence that may have been helpful to a person’s criminal case, especially because such evidence can be easily lost if not properly and promptly preserved.
Attorneys are familiar with the legal maxim that a right without a remedy is not a right at all. Many an authoritarian regime exists in a country that technically has a constitution affording its citizens significant fundamental rights, but an unenforceable constitution is nothing but a piece of paper. What good are rights if they can be violated at will?
In the United States the Exclusionary Rule serves as an important mechanism for ensuring our government complies with the Fourth Amendment right to be free from “unreasonable searches and seizures.” It does so by disincentivizing constitutional violations. Under the Exclusionary Rule, if the government violates the Fourth Amendment any evidence obtained as a result of that violation is considered “fruit of the poisonous tree” and may be deemed inadmissible for use against you at trial (subject to certain exceptions). The Exclusionary Rule is a recognition that the government should not be allowed to benefit from violating the Constitution.
Suppose you’re driving a car in Rochester, New York and a police officer pulls you over, ostensibly for driving without a seat belt. As you reach into your pocket to retrieve your driver’s license the officer observes an unlicensed handgun holstered to your leg. Suddenly you’re in hot water. You’re charged with Criminal Possession of a Weapon in the Second Degree and facing serious prison time.
If you were in fact wearing your seat belt that car stop would have been an unconstitutional violation of the Fourth Amendment. Under the Exclusionary Rule the recovered firearm may be deemed inadmissible at trial, quite possibly resulting in the dismissal of the weapons charges.
Let’s further suppose the Rochester Police Department failed to preserve footage from surveillance cameras it operates in the area — footage that very likely would have shed light on whether you were wearing your seat belt. At the pre-trial suppression hearing, held by the judge to determine whether your rights were violated, an attorney should argue that the failure to preserve the footage should result in the judge applying an “adverse inference” against the government. With an adverse inference the court is permitted to assume that the missing evidence would have been favorable to the accused.
Let’s suppose even further that your lawyer fails to make any such argument to the judge. The judge finds no constitutional violation and you end up pleading guilty in a plea bargain. The opportunity to use the destruction of evidence to help your case is squandered.
The above hypothetical is drawn from a real case. In People v. Evans, 2025 NY Slip Op 06477 (2025), the New York Appellate Division, Fourth Department, held that in the scenario described above “the single omission of failing to request that the court consider an adverse inference charge at the suppression hearing” resulted in a separate violation of the defendant’s constitutional rights, the Sixth Amendment right to the effective assistance of counsel.
“Although defense counsel otherwise competently represented defendant, we conclude that the single omission of failing to request that the court consider an adverse inference charge at the suppression hearing deprived defendant of meaningful representation. Defense counsel’s error in failing to make that argument was sufficiently egregious and prejudicial as to deprive defendant of his constitutional right to effective legal representation because the only evidence presented by the People at the hearing was testimony from one of the arresting officers, whose testimony was inconsistent at times, and an adverse inference charge could have successfully undermined the officer’s testimony on the issue of probable cause to stop defendant, i.e., whether defendant was, in fact, not wearing a seatbelt.”
To rectify the defendant’s Sixth Amendment right-to-counsel violation the appellate court sent the case back to the lower court to give the defendant the opportunity to reopen the suppression hearing and request an adverse inference based on the government’s failure to ensure the preservation of potentially relevant surveillance footage. If the adverse inference moves the needle enough for a finding that there was a Fourth Amendment violation, the appellate court held “the plea is vacated and the indictment is dismissed.” A dismissal forced by the Exclusionary Rule.
This case serves as a useful reminder that as digital evidence becomes more routine, it is also potentially more likely to be destroyed than traditional evidence. The government should not benefit when it is the cause of that destruction.
State of the Surveillance State
The 2nd Annual State of the Surveillance State Address: The Targeting of Immigrants was Foreseeable
Jerome D. Greco, Digital Forensics Director
Despite our melting pot moniker, the United States has a history of surveilling and targeting our immigrant population, including the use of internment camps. Realistically, it does not take much for people looking to cast blame to focus on immigrants, even though it requires most of them to ignore their own familial histories.
As law enforcement continued to grow its surveillance arsenal, it was inevitable that it would turn those tools on immigrant communities. While far from a new issue, this year saw an explosion of anti-immigrant rhetoric used to justify the expansion of the surveillance state. The list of tools being used for immigration enforcement reads like the table of contents of a guide to creating a dystopian nightmare. The litany of technology being deployed includes social media monitoring, databases and data collection, AI, DNA collection from children, automated license plate readers, facial recognition, spyware, drones, and mobile device data extraction. Throughout the year our newsletter covered many of these invasions of privacy, but the reality is that there were so many happening so quickly, we would have needed to start a separate weekly edition dedicated to immigration surveillance to even try to keep up. If you would like to read more, 404 Media and Wired have done an excellent job this year of regularly reporting on the intersection of technology and immigration enforcement.
For those of you reading who have told yourselves that high tech surveillance was justified to “get the bad guys” or to “prevent terrorism,” I suspect most of you were not including all of the people who have been swept up in recent raids or targeted arrests. Beyond some of the examples of high-profile unlawful detainments like those of Rümeysa Öztürk and Kilmar Abrego Garcia, data obtained by the Deportation Data Project and analyzed by The New York Times shows that “[l]ess than 30 percent of the people arrested in [the ICE operations in Chicago, Los Angeles, Massachusetts, and Washington, D.C.] had been convicted of a crime…and a very small share had been convicted of a violent crime. The most common non-violent convictions were for driving under the influence and other traffic offenses.” Furthermore, I also assume most of you did not support the torture of immigrants (or hopefully anyone), like the conditions reported on in the spiked 60 Minutes segment on the CECOT prison.
Even if you do not care about immigrants, presumably you still have an interest in your own rights, and therefore you should selfishly still care about the expansion of the surveillance state because you are not immune to its effects. The invasions of privacy start with easier-to-accept justifications, such as anti-terrorism purposes or the need to control “undesirables,” but it is not long before these powers are used for routine criminal investigations and abuses of civil rights. If your response is that you have nothing to hide because you are a law-abiding citizen, then I challenge you to publish your medical bills and insurance claims, tax data, travel history, and the full contents of your phone. Not every secret a person wants to keep from the prying eyes of law enforcement is criminal; we all have information and data we want to keep private for completely lawful and legitimate reasons.
Last year, I asked all of you “to make at least one New Year’s resolution addressing privacy in your life.” This year, my request is that you do not blindly accept invasions of privacy, regardless of the alleged justifications. These tools, their adoption, and the laws that permit them (or refuse to properly address them) were built over many years. They did not magically appear in 2025, and the surveillance of immigrants is not new, but once the surveillance state set up this infrastructure it was easy to expand it and target whomever it desired. Additionally, the more entrenched a system becomes, the more difficult it becomes to remove or oppose. So, focus not on what we are told a surveillance tool’s purpose is, or on its proposed guardrails, but on how it could be used and what mechanisms exist to enforce any claimed limitations. Resist any tool that can be used for purposes you do not agree with, or whose limits you have no practical way of enforcing. You may find yourself opposing quite a lot more than you previously imagined.
Not like the brazen giant of Greek fame,
With conquering limbs astride from land to land;
Here at our sea-washed, sunset gates shall stand
A mighty woman with a torch, whose flame
Is the imprisoned lightning, and her name
Mother of Exiles. From her beacon-hand
Glows world-wide welcome; her mild eyes command
The air-bridged harbor that twin cities frame.
“Keep, ancient lands, your storied pomp!” cries she
With silent lips. “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”
Upcoming Events
January 15, 2026
EFFecting Change: The Human Cost of Online Age Verification (EFF) (Virtual)
January 28-30, 2026
LSC’s 26th Innovations in Technology Conference (Legal Services Corporation) (San Antonio, TX)
January 29, 2026
Artificial Intelligence and Federal Courts: What Lawyers Need to Know (NYC Bar) (Virtual)
February 6-7, 2026
CactusCon 14 (CactusCon) (Mesa, AZ)
March 9-12, 2026
LegalWeek (ALM Law.com) (New York, NY)
March 10-12, 2026
MSAB Mobile Forensics Digital Summit 2026 (MSAB) (Virtual)
March 12-13, 2026
2026 Privacy and Emerging Technology National Institute (ABA) (Washington, D.C.)
March 18-22, 2026
SANS OSINT Summit & Training 2026 (SANS) (Arlington, VA and Virtual)
April 13-17, 2026
C2C User Summit 2026 (Cellebrite) (Washington, D.C.)
April 20-22, 2026
Magnet User Summit 2026 (Magnet Forensics) (Nashville, TN)
April 23-25, 2026
19th Annual Making Sense of Science: Forensic Science, Technology & the Law (NACDL) (Las Vegas, NV)
Small Bytes
Advertising is Coming to AI. It’s Going to Be a Disaster. (Tech Policy Press)
An AI model trained on prison phone calls now looks for planned crimes in those calls (MIT Technology Review)
‘End-to-end encrypted’ smart toilet camera is not actually end-to-end encrypted (TechCrunch)
A New Anonymous Phone Carrier Lets You Sign Up With Nothing but a Zip Code (Wired)
Deepfakes Are Entering U.S. Courtrooms—Judges Say They’re ‘Not Ready’ (Forbes)
Man Charged for Wiping Phone Before CBP Could Search It (404 Media)
Doxers Posing as Cops Are Tricking Big Tech Firms Into Sharing People’s Private Data (Wired)
AI Bathroom Monitors? Welcome To America’s New Surveillance High Schools (Forbes)
Texas sues TV makers for taking screenshots of what people watch (Bleeping Computer)
Flock Exposed Its AI-Powered Cameras to the Internet. We Tracked Ourselves (404 Media)
The New Surveillance State Is You (Wired)
The Deepfake Courtroom Problem: A Colorado Blue Ribbon Study Sheds Some Light And Offers A Start To Solutions (Above the Law)
1. Generally speaking, calling this relatively new tech “intelligent” feels improper: though it can leverage data centers full of computers to provide intelligent-sounding responses, the current iteration of this technology is not “smart.” But the naming battle seems lost, at least for now, to AI’s marketing and branding blitz; so this post will continue referring to it as artificial intelligence.
2. A carve-out in the law allows police to use AI in conformity with Montana’s “continuous facial surveillance” law. But at least it (generally) requires the issuance of a warrant prior to the cops’ use of facial recognition technology...
3. This is not to say there is currently no positive use of AI. Its application in health research, for example, has the potential to be literally life-altering. But in those instances, it is being used for a narrow purpose by (ideally, trained) professionals in the field. How it’s being — and may be — used by law enforcement and the general public is another story.