Facial Recognition Results are Not Probable Cause, Vehicle Tracking, Preserving iCloud Data, The Stored Communications Act & More
Vol. 6, Issue 2

February 03, 2025
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Shane Ferro explains that a facial recognition possible match is not enough to establish probable cause. Brandon Reim examines vehicle tracking. Allison Young discusses considerations when seeking evidence in Apple iCloud. Finally, our guest columnist, Rebecca Wexler, analyzes the Stored Communications Act and its effect on criminal defense subpoenas.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal Defense, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
A Facial Recognition Result Is Not Probable Cause
Shane Ferro, Digital Forensics Staff Attorney
Last month, a Cleveland court invalidated a search warrant for the suspect’s home in an Ohio murder case after it was revealed that police used commercial facial recognition technology to identify the suspect and failed to mention that in the warrant affidavit.
According to Cleveland.com, Cleveland police investigating a homicide with no known suspect took a surveillance still and asked a fusion center to feed it through the controversial commercial facial recognition program Clearview AI. Clearview AI, like most facial recognition programs, works by analyzing a “probe” image (here, the surveillance still) and returning several potential matches.
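The matching step described above can be illustrated with a minimal sketch. This is not Clearview AI's actual algorithm; it is a generic, hypothetical example of how face-matching systems typically work: each face is reduced to a numeric "embedding" vector, and the probe is compared to a gallery by similarity score, returning several ranked candidates rather than a single identification.

```python
import numpy as np

def top_matches(probe: np.ndarray, gallery: dict, k: int = 5):
    """Rank gallery faces by cosine similarity to the probe embedding."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {name: cos(probe, emb) for name, emb in gallery.items()}
    # Return the k highest-scoring candidates -- these are similarity
    # scores, not positive identifications.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

# Hypothetical 128-dimensional embeddings for 100 gallery faces.
rng = np.random.default_rng(0)
gallery = {f"candidate_{i}": rng.normal(size=128) for i in range(100)}
# A probe image yields a noisy embedding of one gallery face.
probe = gallery["candidate_7"] + rng.normal(scale=0.1, size=128)

print(top_matches(probe, gallery, k=3))
```

Note that the output is a ranked list of candidates with scores; which score threshold counts as a "possible match" is a policy choice by the vendor or agency, which is precisely why such results are leads, not probable cause.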
However, the Cleveland police zeroed in on a single person from those results, then asked the court for a warrant for that person’s house without telling the judge that facial recognition had been used or that the software had returned other possible matches.
“Other photos included Instagram and YouTube pages now listed as private, a social media post of a man riding a bicycle behind a statue in Minneapolis and another YouTube video of several people tasting Buffalo Wild Wings,” says Cleveland.com. The police also left out that the Clearview AI documentation clearly states that its results aren’t admissible in court.
While the police presented their search warrant application as though they had a clear suspect, the defense argued, and the judge agreed, that the police’s evidence that their “suspect” was actually the shooter was more like an anonymous tip than a positive identification. The prosecution has appealed, claiming in its certification for appeal that without the fruits of the warrant (which included a gun), it has no case.
This is not the only high-profile case to be stalled by police ignoring the limits of facial recognition technology. In the fall of 2024, a Minnesota judge suppressed evidence after holding a Frye/Mack hearing on the way that police in Minnesota use facial recognition technology to identify suspects. In State v. Archambault [PDF], the judge ruled that facial recognition technology, as used by the police, “is not a process designed to consistently and reliably produce accurate results,” and as such is not admissible in court. (Minn. 2nd Jud. Dist., Ramsey County, 62-CR-20-5866, Sept. 13, 2024). Additionally, the Washington Post recently examined multiple false arrests due to facial recognition technology and insufficient or non-existent policies governing its use.
The bottom line here is that a facial recognition “match” is not a slam dunk identification for the prosecution, courts are becoming more attuned to the realities of facial recognition’s flaws, and it's a fight that can be fruitful for the defense at both the hearing and trial stages of a case. (And if you work at Legal Aid, please come talk to us at DFU about it!)
Help, everything is tracking me!
Brandon Reim, Senior Digital Forensics Analyst
When people get behind the wheel of a car, most open a maps app, enter a destination, and start driving, thinking mainly about getting from A to B as fast as possible. They may expect their phone to have access to their location, and they may even expect a service like Google to store their searches and their location along the way. However, most don’t expect their car manufacturer to be keeping tabs on where that car has been with unsettling accuracy. This revelation comes on the back of not only car manufacturers spying on you, but other, unrelated apps recording your movements, too. A report by Wired shows that the likes of Candy Crush, Tinder, MyFitnessPal, and more were all sharing your location history with an outside company. Privacy concerns continue to grow as people rely more on necessary or desired apps and services. Something as simple as trying to eat healthier or taking a drive can now mean sharing your location.
Wired also reported that modern Subarus could be hacked, allowing attackers to honk the horn, start the car, retrieve its location, and even unlock the vehicle remotely. The exploit stemmed from access to a certain level of Subaru employee admin panels. Subaru has since patched the flaw and explained that it provides first responders with location information in the event of an accident. The incident nonetheless reveals an alarming amount of location data collected from users’ cars. The researchers who found the exploit, Shubham Shah and Sam Curry, were able to access a full year of Curry’s mother’s location history; they only had her data to go on, but presumably the records could go back years. The article notes that this is not unique to Subaru, as Volkswagen was also caught exposing users’ location data. In fact, many major car manufacturers have been caught collecting, analyzing, sharing, and even selling your data.
Wired specifically cites the “Vehicle Privacy Report” as a tool to help figure out what data your specific car may collect. It also notes that some brands, like Toyota, will even track your acceleration and speed. Since most people do not immediately suspect their car of spying on them, reports like this can help people realize the privacy invasion that may come with getting behind the wheel. As for what you can do about it, researching what data your current vehicle gathers, and what data any new vehicle you are considering would collect, are good starts. Or perhaps going back to a pre-computer vehicle is the best chance of avoiding location spying altogether.

Ask an Analyst
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q: My client no longer has access to their phone. Can I get important evidence like messages and photos from iCloud?
A: We’ve answered similar questions in previous newsletters, but for iPhones specifically there are many, many variables that can affect our ability to gather data when it’s stored or synced to Apple iCloud.
Are the messages in the cloud? Maybe! Can we get them down? Maybe! If you’re getting the data through a legal request (hello, law enforcement), Apple has a guide for you [PDF]. This is also a great resource for anyone wanting to understand where and how Apple data is stored (for instance, privacy-minded folks*).
Without an iPhone, you may need to use the “iforgot.apple.com” workflow to recover the account. Often, we start here because clients don’t remember their iCloud passwords. The recovery process can take days and may not succeed.

You will have a much better chance of success if you approach an iCloud preservation with some combination of the following:
iCloud username and password
Access to an iPad, MacBook, or older iPhone that is signed into the account already (trusted device)
Access to the trusted phone number(s) for the account
If we have everything we need to get into the account, are we good to go? Things are still complicated by:
Software limitations: updates to iCloud security have been known to “break” forensic tools in the past when updating 2-factor authentication requirements. Tools may not download every source or category of data. A creative solution, like restoring an account to a new iPhone, may be required, but adds complexity to the chain of custody and authenticity of evidence. The new iPhone should then be properly preserved by an expert who can testify to the process and its evidentiary impact.
iCloud setting complexity: iCloud content can be encrypted on Apple servers with a key they maintain, or it may be end-to-end encrypted (E2EE) depending on the type of data (e.g., health data [PDF] and passwords) or user settings (such as Advanced Data Protection). Users may pick and choose which apps they sync to iCloud and which they keep only a “local” copy of. On the flip side, if the user has enabled the Optimize iPhone Storage setting, the best quality photos and videos may only be saved to the cloud, rendering evidence from the device itself less valuable in certain circumstances.
Type of data: So your client says they have relevant messages. What kind of messages? We’re more likely to find iMessages and photos captured with the iPhone Camera app in iCloud... but if they chatted on Signal and shared photos through Instagram stories, downloading iCloud may be a fruitless endeavor. Some “third-party” app data may be saved to an iPhone backup, but not all. And if you’re looking for records of deleted chat conversations, a backup will not contain “ephemeral” artifacts, such as the contents of received-then-deleted messages that might still be recoverable from cached push notifications on the device itself.
The screenshot shows a sample of cached activity on iPhone (1) that was not available from restored cloud data pushed to iPhone (2). This iCloud user may have listened to Phil Collins and browsed the dark web. However, we might not be able to determine that without access to their physical phone. There were about 10 times the number of image files responsive to the word “cache” in this iPhone (1) extraction compared to iPhone (2).
iCloud storage limitations: Ultimately, the most common consideration we encounter in copying data from iCloud is storage space. If a user has had an iPhone for an extended period of time, takes photos and videos, and uses iMessage with regularity, they may no longer have recent backups. Apple provides 5 GB of free storage, which is only slightly more than the capacity of a single-layer DVD.
Surprises: I once worked on a case where there were so many attachments in iMessages that iCloud servers were returning error messages when I tried to restore the data to a phone. Software changes, glitches, and features greatly affect what we receive when copying data from a cloud source.
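The encryption distinction in the settings-complexity point above boils down to who holds the decryption key: Apple (standard server-side encryption, which legal process can reach) or only the user's trusted devices (E2EE, which Apple cannot decrypt). A toy sketch of that idea, deliberately using an insecure XOR cipher rather than Apple's actual cryptography, with all names hypothetical:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "encryption" for illustration only -- not real cryptography.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"health data synced to iCloud"
user_key = os.urandom(16)  # with E2EE, this key lives only on trusted devices

ciphertext = xor_cipher(message, user_key)

# The server (or anyone served with legal process) sees only ciphertext:
assert ciphertext != message
# Only a holder of the user's key can recover the plaintext:
assert xor_cipher(ciphertext, user_key) == message
```

The practical consequence for preservation is the same either way: for E2EE categories, access to a trusted device or the account credentials is not optional, because no one else, Apple included, can produce readable data.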
In conclusion, iCloud may be an appropriate solution to copy user data without access to a physical device. Attorneys and analysts should have an idea of what they are looking for when going this route, as well as awareness of potential technical complications and data limitations.

*You can download an archive of your own data from Apple. However, this does not include iPhone backups or iMessages.
Allison Young, Digital Forensics Analyst
Expert Opinions
We’ve invited specialists in digital forensics, surveillance, and technology to share their thoughts on current trends and legal issues. Our guest columnist this month is Rebecca Wexler, the Hoessel-Armstrong Professor of Law at the University of California, Berkeley, School of Law.
Challenging the Stored Communications Act bar on Criminal Defense Subpoenas
In a profound and growing injustice, major technology companies are distorting the Stored Communications Act (SCA), a federal data privacy law, to block lawful criminal defense subpoenas for relevant evidence, even when those subpoenas are so-ordered by the trial court and upheld on appellate review. This issue is currently before the California Supreme Court in a case called Snap v. S.C. (Pina), in which the defense and the district attorney are arguing on the same side against social media giants Snap and Meta. The Court has a chance to right a longstanding wrong and safeguard the fairness of criminal proceedings and the truth-seeking process of the courts; it should construe ambiguous silence in the SCA’s text to be consistent with California law regarding compulsory process and yield to lawful criminal defense subpoenas.
Here's some background on the issue: The SCA was enacted in 1986 to protect privacy in electronic communications that are transmitted and stored by third-party service providers. Back in the day, that meant service providers like your CompuServe and America Online that were far more limited in their functions than the data-mining social media giants of today. Nonetheless, even back then policymakers were worried about invasions of privacy from both the government and private entities. On the one hand, the Fourth Amendment might not apply to information that users voluntarily handed to third-party service providers. On the other hand, the service providers themselves could share the information willy nilly.
The SCA addresses these privacy concerns by limiting how the government can demand information from service providers and when those service providers can disclose information voluntarily. Section 2702(a) states that service providers “shall not knowingly divulge to any person or entity” the contents of electronic communications, and Section 2702(b) enumerates nine exceptions for permissible disclosures including to government entities pursuant to certain forms of compulsory process. Section 2703 specifies procedures for governmental entities to compel the disclosure of communications contents via warrants, subpoenas, and court orders.
Importantly, both the SCA’s text and its legislative history are silent as to the statute’s effect on subpoenas requested by criminal defendants and other nongovernmental litigants. This silence has led to some very poor practices. Technology companies, including Facebook/Meta, GitHub, Google, Instagram, Microsoft, Snap, and Twitter/X, have all argued that the SCA entitles them to not comply with so-ordered criminal defense subpoenas seeking a nonparty’s stored communications contents because none of the enumerated exceptions in Section 2702(b) expressly authorizes disclosures pursuant to such subpoenas. Many trial courts and a handful of appellate courts across the country have agreed.
In Snap v. S.C. (Pina), a San Diego Superior Court and the California Fourth Appellate District Court of Appeal bucked the trend. This is a homicide case in which the defendant is arguing self-defense and seeking evidence of the deceased’s violent character from the deceased’s social media accounts. The superior court judge found that the defendant had satisfied the “good cause” standard and that a balancing of the interests according to California’s Alhambra factors favored enforcing the defense subpoena. The appellate court agreed and ruled that the SCA does not block the defense subpoena because Snap and Meta’s data-mining business models mean that the SCA does not apply to these companies at all.
If the California Supreme Court upholds this “business-model theory” excluding data-mining tech companies from coverage under the SCA, it will have a broad impact on data privacy. None of the SCA’s protections – its limitations on government access to stored communications, its limitations on technology companies’ voluntary disclosures of users’ data – would apply to most major service providers today. On the one hand, the shock to the system might encourage Congress to enact a new federal data privacy law that’s updated for today’s technologies. On the other hand, if Congress fails to do so, then one of the few significant federal data privacy laws that we have on the books will be dead.
There is also an alternative, far narrower basis for upholding Mr. Pina’s subpoena that the California Supreme Court could – and I argue should – adopt. The SCA’s text and legislative history are ambiguously silent as to the statute’s effect on criminal defense subpoenas, so the Court should construe the SCA to be consistent with lawful compulsory process and safeguard the fairness and truth-seeking interests of the courts. The United States Supreme Court held in St. Regis Paper Co. v. United States that courts have a “duty to avoid a construction [of federal statutes] that would suppress otherwise competent evidence unless the statute, strictly construed, requires such a result.” The SCA does not require such a result. Many federal statutes contain broad confidentiality provisions that, like SCA Section 2702, are silent as to their effect on compulsory process, and courts across the country have repeatedly construed those statutes as yielding to compulsory process.
The tech companies’ arguments to the contrary depend on the nonbinding canon of statutory construction known as expressio unius, which presumes that when Congress enumerates a list of exceptions such as those in SCA Section 2702(b), it must intend to omit additional exceptions not listed. But when it comes to creating privilege-like bars on compulsory process, expressio unius logic also points the other way. Congress knows how to create express privilege bars on subpoenas and other forms of compulsory process. It has done so in many other statutes in the past. So, the fact that SCA Section 2702 does not expressly prohibit criminal defense subpoenas means that Congress must not have intended the statute to create a privilege that can block compulsory process.
If you are facing this SCA issue and wish to make this narrower argument in support of a criminal defense subpoena in another case, please feel free to use the model brief here.
Rebecca Wexler is the Hoessel-Armstrong Professor of Law at the University of California, Berkeley, School of Law and a former public interest fellow of the Legal Aid Society of New York City.
Upcoming Events
February 5, 2025
The Threat of AI and Technology to Immigrant Justice (Grantmakers Concerned with Immigrants and Refugees) (Virtual)
IoT Forensics Webinar: Investigating Crime Caught on Camera (Cyber5W) (Virtual)
February 6, 2025
Deep Dive into Facebook (Cellebrite) (Virtual)
Social Media as Evidence (ABA) (Virtual)
February 11, 2025
The AI Regulatory Landscape in the U.S. (Future of Privacy Forum) (Virtual)
February 17-22, 2025
AAFS Annual Conference - Technology: A Tool for Transformation or Tyranny? (American Academy of Forensic Sciences) (Baltimore, MD)
February 24-March 3, 2025
SANS OSINT Summit & Training 2025 (SANS) (Arlington, VA or Virtual)
March 17-19, 2025
Magnet User Summit (Magnet Forensics) (Nashville, TN)
March 20, 2025
AI Rising: Integrating and Fighting the Use of Artificial Intelligence (NACDL) (Virtual)
March 20-21, 2025
Privacy and Emerging Technology National Institute (ABA) (Washington, DC)
March 22-30, 2025
NYC Open Data Week (New York, NY)
March 24-27, 2025
Legalweek New York (ALM) (New York, NY)
March 31-April 3, 2025
Cellebrite Case-to-Closure (C2C) User Summit (Cellebrite) (Washington, D.C.)
April 24-26, 2025
2025 Forensic Science & Technology Seminar (NACDL) (Las Vegas, NV)
April 28-May 2, 2025
IACIS Collecting and Admitting Digital Evidence at Trial (IACIS) (Orlando, FL)
June 2, 2025
Amped Connect US 2025 (Amped Software) (Wilmington, NC)
June 3-5, 2025
Techno Security & Digital Forensics Conference (Wilmington, NC)
August 7-10, 2025
DEF CON 33 (Las Vegas, NV)
August 15-17, 2025
HOPE 16 (Queens, NY)
Small Bytes
It’s not just Tesla. Vehicles amass huge troves of possibly sensitive data. (Washington Post)
A New Model for State Privacy Legislation (Tech Policy Press)
Online Behavioral Ads Fuel the Surveillance Industry—Here’s How (Electronic Frontier Foundation)
Researcher Turns Insecure License Plate Cameras Into Open Source Surveillance Tool (404 Media)
White House unveils Cyber Trust Mark program for consumer devices (Nextgov/FCW)
Your Next AI Wearable Will Listen to Everything All the Time (Wired)
Secret Phone Surveillance Tech Was Likely Deployed at 2024 DNC (Wired)
Even modest makeup can thwart facial recognition (The Register)
The Powerful AI Tool That Cops (or Stalkers) Can Use to Geolocate Photos in Seconds (404 Media)
AI weapon detection system at Antioch High School failed to detect gun in Nashville shooting (NBC News)
Lawsuit accuses Amazon of secretly tracking consumers through cellphones (Reuters)