Subway Cameras, AI Legal Research, Pole Camera Decision, Erosion of Equal Protection Guarantees & More
Vol. 5, Issue 4
April 1, 2024
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Chris Pelletier questions the usefulness of adding more surveillance cameras to the subway. Shane Ferro examines two more cases in which AI-based legal research went wrong. Joel Schmidt reviews the 10th Circuit’s recent decision on pole cameras. Finally, our expert columnist this month, Mitha Nandagopalan from the Innocence Project, discusses the further erosion of equal protection guarantees by digital surveillance.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
Crime In The New York City Subway System: Are More Cameras The Answer?
Chris Pelletier, Digital Forensics Analyst
The installation of cameras to monitor customer areas of trains has been in the works for years, but has recently received a push from New York Governor Kathy Hochul. In her Five-Point Plan, released early last month, she announced an increased armed police presence, National Guard bag checks, a bill to allow judges to ban individuals convicted of assault within the transit system from using public transportation, and a call on the Metropolitan Transportation Authority (MTA) to add cameras to conductor cabins, among other proposals.
In just the last few years, the MTA has rapidly grown its fleet of surveillance cameras within its stations, announcing in 2022 that every subway car would have cameras monitoring passengers by 2025, in the hope of deterring crime and making people feel safer riding the trains.
“You think Big Brother’s watching you on the subways? You’re absolutely right. That is our intent, to get the message out that we are going to be having surveillance of activities on the subway trains and that is going to give people great peace of mind.”
Are these cameras worth the financial investment, and worth the trade-off between privacy and security? In the case of a Brooklyn subway shooting two years ago, several cameras that could have recorded the incident apparently "malfunctioned" and were not working at the time. So, in addition to the initial cost of installing these cameras, there is also the cost of maintaining them and storing the video. With so many cameras on trains, in stations, and outside of stations, that cost can be substantial.
From the perspective of a digital forensics analyst, video may be an invaluable source of evidence when investigating the facts of an event. If you include video recorded by NYPD body-worn cameras, along with all the cameras I've mentioned previously, there is a vast amount of available footage to go through when investigating an incident or building a defense. I feel that the more video is available, the greater the potential to corroborate our clients' narratives. However, even if many of us have grown used to the privacy trade-off of cameras watching us everywhere, the truth is that cameras do not capture everything and can be a significant invasion of one's privacy. Video that is captured may or may not be useful depending on what we are looking for. Cameras may have frame rates too low to catch events that occur in milliseconds, or may not be of high enough quality to accurately capture, or even pick up, details used for identification like tattoos, birthmarks, or clothing. A camera captures an event from one angle, but that is never the whole story, and video may sometimes even mislead triers of fact.
As a regular subway rider, I have mixed thoughts about how safe I feel on the subway. My goal is always to avoid trouble and get to where I’m going safely. When I hear the announcements on the train that “NYPD are on the platform if you need their assistance” (that’s when I can actually hear the announcements), I question whether these measures deter any crime. Is more daily surveillance of New Yorkers worth the financial investment and the loss of privacy?
The NYPD also recently announced Operation Fare Pay (which will allegedly focus on stopping fare evaders by flooding the subways with more officers), and Mayor Eric Adams is playing with flaky AI gun detection tech ... so who knows whether cameras will find themselves within the budget or at the bottom of a growing list of proposed policing tech. The MTA has a historic (and growing) problem with its finances, and the NYPD just went $100 million over budget in overtime for the subway alone.
It’s up to New Yorkers to decide for themselves whether adding more cameras to their commutes makes them feel safer.
AI-Based Legal Research: Still Inadvisable
Shane Ferro, Digital Forensics Staff Attorney
Will AI-fabricated case law get you disbarred? The answer, obviously, is it depends.
Earlier this month, a Federal judge in Manhattan ruled that neither former Trump lawyer Michael Cohen nor his lawyer, David Schwartz, would face sanctions for submitting bogus legal citations made up by Google Bard’s AI in a motion to end Cohen’s post-release supervision early.
However, just a month earlier, a federal judge in Florida suspended an attorney’s law license for a year for similar reliance on bogus AI citations, according to LawSites.
The Florida lawyer, Thomas Neusom, told a disciplinary committee that in addition to using Westlaw and Fastcase, he “may have used artificial intelligence to draft the filing(s) but was not able to check the excerpts and citations.” (A classic use of “may have.”)
The disciplinary committee found that using fake AI citations without checking them was “beyond a lack of due diligence,” a violation of the Florida Rules of Professional Conduct, and recommended Neusom be suspended.
Meanwhile, here in New York, neither Michael Cohen nor his lawyer will face additional sanctions beyond the long list of Google results showing they are among the many fools who have been caught trying to ChatGPT their way to a legal brief without checking the citations (at least 15, according to the Times). Even though Wikipedia styles him "Michael Cohen (lawyer)," the Rules of Professional Conduct don't actually apply to Cohen. He was disbarred in New York State in 2019 following his federal conviction for lying to Congress.
According to the Times, Judge Jesse Furman accepted Cohen and Schwartz’s representations that Cohen misunderstood how Google Bard worked and Schwartz assumed that Cohen was sending him real legal research. (“We are gratified that the court viewed this mistake as one that was not made in bad faith by Mr. Schwartz,” said Cohen’s lawyer’s lawyer, Barry Kamins.)
Judge Furman denied Cohen’s motion to end his post-release supervision early.
In the Courts
The Tenth Circuit Finds Pole Cameras Do Not Require a Search Warrant
Joel Schmidt, Digital Forensics Staff Attorney
There was a time when the Fourth Amendment was thought to apply only to the inside of physical locations belonging to you, such as your home or office, places in which other people could be charged with trespass. But over the past half century or so, the Supreme Court's application of the Fourth Amendment has evolved to also cover any location in which you have an expectation of privacy.
The classic example taught in law schools is the public phone booth. The Supreme Court held that the government may not place a listening device in a phone booth in the absence of a search warrant, since one's expectation of privacy within the phone booth brings that location under the protections of the Fourth Amendment even though the user does not actually own the location. As the Court put it, "the Fourth Amendment protects people, not places."
Which brings us to Mr. Hay. Bruce Hay was suspected of collecting veteran disability benefits while not actually disabled. To aid in their investigation, federal agents mounted a motion-activated pole camera outside Mr. Hay's home in a small eastern Kansas town. That camera recorded fifteen hours a day for sixty-eight days.
The United States Court of Appeals for the Tenth Circuit, whose jurisdiction includes Kansas, had previously held that one has no Fourth Amendment protection against pole cameras because the Fourth Amendment does not apply to activities in public view. In other words, one cannot have an expectation of privacy in activities that are open for all to see.
But that did not deter Mr. Hay from arguing that the pole camera recording his home violated the Fourth Amendment. And for good reason. The Supreme Court has recognized that certain law enforcement practices can, in the aggregate, bring otherwise unprotected activity within the protections of the Fourth Amendment. For example, the government did not need a warrant to affix a tracking device to the inside of a drum of chloroform that was later purchased, loaded into a vehicle, and tracked to a location, but the government does need a warrant to compel your cellphone carrier to reveal your complete historical cell site location information (HCSLI) over a period of time.
Mr. Hay argued that the Supreme Court’s decision in the HCSLI case effectively overruled the Tenth Circuit’s pole camera case when the government’s long-term pole camera observations were so pervasive that they “painted an intimate portrait of Mr. Hay's personal life” and sometimes “even provided officers with a view inside Mr. Hay's house through the front door.”
The Tenth Circuit, however, was unconvinced [PDF]. Last month, the Court held that its prior pole camera decision remained good law and was not overruled by the Supreme Court's subsequent HCSLI decision: a pole camera records only footage visible to a member of the public walking by, and unlike HCSLI, which can capture every aspect of a person's movements, it records only a single, discrete location.
Unfortunately for Mr. Hay, unless the Supreme Court chooses to take up the issue, the Tenth Circuit opinion remains the law of the case.
Expert Opinions
We’ve invited specialists in digital forensics, surveillance, and technology to share their thoughts on current trends and legal issues. Our guest columnist this month is Mitha Nandagopalan.
Digital Surveillance: A Further Erosion of Equal Protection Guarantees
As digital tools for surveillance and investigation have proliferated, defense attorneys have had to grow increasingly practiced at litigating Fourth Amendment and reliability challenges in response. But digital surveillance also poses serious problems in another arena that can be harder to uncover, let alone litigate: equal protection.
Racialized surveillance, especially of Black and immigrant communities, is hardly new. But extending it to the digital realm can move discriminatory practices further upstream in the investigatory process, helping hide them. Digital surveillance expands police avenues for evidence laundering and parallel construction of investigations. Often it is outsourced, in whole or part, to private companies that claim trade secrets to evade scrutiny. And because technology overwhelmingly outpaces law, it tends to elude even the patchwork data reporting requirements of in-person encounters.
How often do prosecutors argue that a piece of evidence is not discoverable because it is a mere “investigative lead” that will not be presented at trial? Yet such “investigative leads,” and the discretionary decisions behind them about whom to target, sometimes present the clearest picture of discrimination when exposed. This makes discovering and raising equal protection challenges against digital surveillance especially urgent, even as doing so grows especially difficult.
Take facial recognition technology, for instance. The few agencies that track demographic data for facial recognition searches paint a damning picture. Following three high-profile misidentifications – all of Black people – the Detroit Police Department implemented policy changes and now publishes weekly data on facial recognition searches. Yet these efforts have done little to stem stark racial disparities. In 2023, 95 out of 100 total search requests targeted Black people [PDF]; only one was for a white person. Similarly, Politico’s recent review of roughly a year’s worth of facial recognition search requests from the New Orleans Police Department found that, of the requests that were fulfilled, all but one were for Black people. Research suggests that such patterns may be widespread, finding an association between the implementation of facial recognition technology and increased racial disparities in arrests. Yet police can mask that they conducted a search at all by attributing the identification to an anonymous “credible source” or following it up with a confirmatory witness identification. This makes detecting patterns across cases that much more difficult and makes it far more likely that discriminatory practices will go undetected and unchallenged.
Or, consider ShotSpotter. The recent leak to Wired of locations for thousands of sensors confirmed what reporters and advocates have long surmised based on where it tends to generate alerts: ShotSpotter disproportionately covers Black and Latine neighborhoods. Although mounting evidence [PDF] shows that the system frequently fails to yield actionable evidence and has little impact on rates of gun violence, it nonetheless offers a pretext for police to target the neighborhoods where it is deployed. Yet prior to the leak, SoundThinking refused to disclose its sensor locations even to the law enforcement agencies that contracted with it, making it virtually impossible to concretely demonstrate disparate impact, let alone discriminatory intent.
Social media surveillance may present the most insidious example. Research by the Brennan Center found it is plagued by misinterpretation, presumptions of guilt by association, and unreliability [PDF]. It is also especially prone to perpetuating discrimination. For instance, the Minnesota Department of Human Rights, following the murder of George Floyd, investigated [PDF] the Minneapolis Police Department; it found, among other violations, that police improperly used covert social media accounts to surveil and infiltrate Black communities and organizations, “unrelated to criminal activity” and “without a public safety objective.” Yet few law enforcement agencies impose meaningful limits, oversight, or transparency requirements on social media monitoring.
Where does this leave defense counsel? Explicitly demanding discovery on digital surveillance practices, including aggregate demographic data, is one avenue to push back. In Massachusetts, for instance, the Supreme Judicial Court is currently considering whether to uphold enforcement of that state’s well-established equal protection discovery framework, originally developed in the context of traffic and pedestrian stops, in a case involving Snapchat monitoring. Even if such demands are almost guaranteed to be denied in an individual case, and even if equal protection challenges based on such data are difficult to win, repeated and concerted efforts to raise them nonetheless can, over time, help shift courts’ attitudes.
On a broader level, even in cases where bans, moratoria, or divestment from digital surveillance tools are out of reach, a minimum starting point should be mandating data collection, reporting, and ongoing monitoring of potential racial disparities. We can adapt existing frameworks for in-person encounters: 23 states and the District of Columbia currently mandate at least some form of demographic data collection and reporting for traffic stops. New York City, with the recent passage of the How Many Stops Act over Mayor Eric Adams’ veto, extends reporting requirements to virtually all police-civilian encounters, including those that do not require reasonable suspicion. Such data collection and reporting requirements should be expanded to cover digital surveillance.
A final note: The Innocence Project is building a new initiative to combat this kind of surveillance and the discrimination it perpetuates, using a combination of individual case litigation (including discovery, reliability, and constitutional issues), public records requests, participatory action research, and coordination on policy advocacy with local organizations. If you have a case raising these issues and are interested in collaborating, feel free to reach out to me directly: mnandagopalan@innocenceproject.org.
Mitha Nandagopalan is a Staff Attorney in the Innocence Project’s Strategic Litigation Department, working on the Neighborhood Project and focusing on the community impacts of racialized police surveillance and mass misdemeanor arrests. Before joining the Innocence Project, Mitha represented individuals facing criminal charges as an assistant public defender at the Law Offices of the Public Defender of New Mexico in Albuquerque.
Upcoming Events
April 11-12, 2024
2024 Forensic Science and Information Technology Institute (ABA) (San Diego, CA)
April 15-17, 2024
Magnet User Summit 2024 (Magnet Forensics) (Nashville, TN)
April 18-20, 2024
Making Sense of Science XVII: Forensic Science & the Law (NACDL) (Las Vegas, NV)
May 21, 2024
3rd Annual Decrypting a Defense Conference (Legal Aid Society’s Digital Forensics Unit) (Queens, NY) (Registration link coming soon!)
June 4-6, 2024
Techno Security East 2024 (Wilmington, NC)
July 12-14, 2024
HOPE XV (Queens, NY)
July 22-25, 2024
2024 HTCIA Global Training Event (Las Vegas, NV)
August 8-11, 2024
DEF CON 32 (Las Vegas, NV)
October 19, 2024
BSidesNYC (New York, NY)
Small Bytes
Law enforcement doesn’t want to be “customer service” reps for Meta any more (Ars Technica)
Automakers Are Sharing Consumers’ Driving Behavior With Insurance Companies (NY Times)
The NYPD Sent a Warrantless Subpoena for a Copwatcher’s Social Media Account, but Won’t Defend It in Court (Hell Gate)
Four things we learned when US intelligence chiefs testified to Congress (TechCrunch)
US Lawmaker Cited NYC Protests in a Defense of Warrantless Spying (Wired)
Experian Is Trying To Force WhatsApp To Hand Over User Data In An ‘Odd’ Court Battle (Forbes)
Unpatchable vulnerability in Apple chip leaks secret encryption keys (Ars Technica)
US sues Apple for illegal monopoly over smartphones (The Verge)
MyCity, INC: A Case Against ‘CompStat Urbanism’ (Surveillance Resistance Lab)