Surveillance Budget, Fraudulent Emergency Data Requests, Geofence Warrants, TARU & More
Vol. 3, Issue 4
April 4, 2022
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Shane Ferro highlights the inclusion of surveillance technology in the New York State budget. Diane Akerman discusses the recent revelations of fraudulent Emergency Disclosure Requests. We discuss the recent geofence warrant decision in United States v. Chatrie and Benjamin Burger answers a question about NYPD’s Technical Assistance Response Unit.
The Digital Forensics Unit of the Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts and examiners, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of the Legal Aid Society.
In the News
New York Governor Attempts to Sneak Millions for Digital Surveillance into State Budget
Shane Ferro, Digital Forensics Staff Attorney
In addition to rolling back bail reform, denying the defense discovery, and sending more children to adult court, Governor Hochul’s proposed New York State budget would also pour money into disastrous surveillance technologies, a proposal that overlaps with many concerning elements of Mayor Eric Adams’ plans to combat gun violence in New York City.
Joint reporting by New York Focus and The Intercept uncovered budget proposals that many lawmakers told reporters they did not know about:
Hochul’s administration has proposed tens of millions of dollars and several new initiatives to expand state policing and investigative power, including agencies’ ability to surveil New Yorkers and gather intelligence on people not yet suspected of breaking the law.
The plan includes more money for secretive surveillance hubs called “Fusion Centers,” money to pay “social media analysts” to carry out dragnet surveillance of the online lives of New Yorkers, millions for new digital forensic tools for law enforcement, and tens of millions toward supporting local law enforcement partnerships with the ATF, to focus on unreliable and untested “gunshot forensics” such as ShotSpotter.
Meanwhile, yesterday, our fearless mayor was filmed playing with the FDNY’s new robot dog, which is the same as the old NYPD robot dog, a true surveillance horror spawned out of MIT’s uncanny valley. Back then, the NYPD had their counterterrorism guy John Miller standing up for their use of the creepy surveillance robot. Miller is currently busy defending himself after he went before the City Council and denied that the NYPD conducted surveillance of Muslims after 9/11.
"Data Security Breach" by Visual Content is marked with CC BY 2.0.
Hacking the Emergency Data Request System
Diane Akerman, Digital Forensics Staff Attorney
Recently, hackers were able to take advantage of the Emergency Data Request (EDR) systems of numerous internet service providers (ISPs) and social media companies. Unlike other requests for data, which require court orders or warrants, EDRs allow law enforcement to request information in urgent life-or-death matters simply through portals or via email. In the most recent cases, hackers used compromised but legitimate email addresses from various law enforcement agencies to make the requests, then used the disclosed information to harass individuals or commit financial fraud.
Numerous companies have admitted to falling prey to these fraudulent requests, including Apple, Meta, and Discord. ISPs accept these requests from law enforcement agencies across the world, and verifying the requests is complicated by a maze of differing jurisdictional privacy and disclosure laws. Most of the requests have come from legitimate – but compromised – law enforcement email addresses, whose login information is available for sale on the dark web. Some lawmakers have attempted to address this flaw by introducing a bill to require the use of digital signatures that will allow recipients to verify the request’s authenticity.
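To illustrate the verify-before-trust idea behind that proposed bill: the sketch below uses an HMAC over a shared secret as a simplified stand-in for the public-key digital signatures the bill contemplates (which work analogously, but let any recipient verify without holding the signing key). All names and values here are hypothetical, not any real agency's or provider's system.

```python
import hmac
import hashlib

# Hypothetical per-agency key, provisioned out of band (illustrative only).
AGENCY_KEY = b"per-agency secret provisioned out of band"

def sign_request(request_body: bytes, key: bytes = AGENCY_KEY) -> str:
    """Agency side: attach an authentication tag to the EDR."""
    return hmac.new(key, request_body, hashlib.sha256).hexdigest()

def verify_request(request_body: bytes, tag: str, key: bytes = AGENCY_KEY) -> bool:
    """Provider side: reject requests whose tag does not verify,
    even if they arrive from a legitimate-looking email address."""
    expected = hmac.new(key, request_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

edr = b"EDR: subscriber records for [redacted], exigent circumstances"
tag = sign_request(edr)
assert verify_request(edr, tag)               # authentic request passes
assert not verify_request(edr + b"x", tag)    # altered or forged request fails
```

The point of the scheme is that a stolen email password alone would no longer be enough: a forged request would fail verification unless the attacker also stole the agency's signing key.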
This isn’t the first time the vulnerability of law enforcement systems has been exploited. In November 2021, hackers sent a fake email alert to thousands of state and local law enforcement entities through the FBI’s Law Enforcement Enterprise Portal (LEEP). In that attack, the intruders abused a fairly basic and dangerous coding error on the website, and the fake emails all came from a real fbi.gov address. The hacker, Pompompurin, simply pointed out the vulnerability to the FBI, but was quoted at the time saying, “I could’ve 1000% used this to send more legit looking emails, trick companies into handing over data etc.”
While we more frequently discuss the threat to privacy posed by law enforcement itself, these incidents bring to light a secondary problem: the data security and privacy practices of law enforcement agencies themselves. Law enforcement agencies are no less vulnerable to hackers (in this case, reportedly a group of teenagers), and the data they possess is often far more compromising to individuals’ privacy.
In the Courts
Federal District Court Finds Geofence Warrants Violate Fourth Amendment
Benjamin S. Burger, Digital Forensics Staff Attorney
The first federal district court judge to consider the constitutionality of a geofence warrant after it had already been issued and executed determined that such warrants can violate the Fourth Amendment for lack of particularized probable cause. See United States v. Chatrie, 2022 WL 628905 (E.D. Va. 2022) [PDF]. Chatrie is a bank robbery case. The suspect took $195,000 from a credit union in Midlothian, Virginia. Id. at *1. In an attempt to identify the robber, law enforcement obtained a geofence warrant, which required Google to disclose location and identification information about users located in proximity to the robbery. Id. In a detailed decision, District Court Judge M. Hannah Lauck determined that the warrant violated the Fourth Amendment. Id.
As part of the litigation, Google filed an amicus brief which detailed how Google collects, stores, and discloses location information. Id. at *3-10. Google explained that “Location History” is drawn from GPS information, Bluetooth beacons, cell site towers, and other sources. Id. at *3. The information is logged approximately every two minutes. Id. The data is stored in Google’s “Sensorvault” and assigned a unique device ID as opposed to a personally identifiable Google ID. Id. at *4. Location History is off by default and must be enabled by the user. Id. at *6. Location History can also be “paused” or permanently deleted. Id. at *7.
Google also detailed the policies it formulated, in consultation with the Department of Justice, for processing geofence warrants. First, law enforcement acquires a warrant requiring Google to disclose a “de-identified” list of all users whose Location History appears in a specific area (the geofence) at a specific time. Id. at *9. Google will provide responsive user records from the Sensorvault. Id. Second, after law enforcement reviews the de-identified data, it can then compel Google to provide additional location information beyond the time and location of the original geofence. Id. at *10. Typically, Google will require law enforcement to narrow the number of users for whom information is requested in this step. Id. Third, using the data from steps one and two, law enforcement can request account-identifying information for the users that are relevant to the investigation. Id.
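The three-step process described in the opinion can be sketched schematically. Every name, field, and data point below is an illustrative assumption for exposition; this is not Google's actual interface or data model.

```python
# Illustrative sketch of the three-step geofence process described in
# Chatrie. All identifiers and coordinates are hypothetical.
from dataclasses import dataclass

@dataclass
class LocationPoint:
    device_id: str      # anonymized device ID, not a Google account ID
    lat: float
    lon: float
    timestamp: int      # epoch seconds

# Hypothetical Sensorvault-style store of Location History points.
SENSORVAULT = [
    LocationPoint("dev-001", 37.50, -77.65, 1000),
    LocationPoint("dev-002", 37.51, -77.64, 1010),
    LocationPoint("dev-003", 40.71, -74.00, 1005),  # nowhere near the geofence
]

def step1_deidentified_list(lat_min, lat_max, lon_min, lon_max, t_start, t_end):
    """Step 1: de-identified device IDs inside the geofence and time window."""
    return sorted({p.device_id for p in SENSORVAULT
                   if lat_min <= p.lat <= lat_max
                   and lon_min <= p.lon <= lon_max
                   and t_start <= p.timestamp <= t_end})

def step2_expanded_history(device_ids):
    """Step 2: additional location points, beyond the original geofence,
    for a narrowed subset of devices."""
    return [p for p in SENSORVAULT if p.device_id in device_ids]

# Hypothetical mapping from device IDs to account-identifying information.
ACCOUNTS = {"dev-001": "user1@example.com", "dev-002": "user2@example.com"}

def step3_identify(device_ids):
    """Step 3: account-identifying information for the devices law
    enforcement deems relevant to the investigation."""
    return {d: ACCOUNTS.get(d) for d in device_ids}

hits = step1_deidentified_list(37.4, 37.6, -77.7, -77.6, 900, 1100)
# hits == ["dev-001", "dev-002"] — everyone in the fence, suspect or not
```

The sketch makes the court's concern concrete: step one sweeps in every device in the area, which is why the opinion focused on whether probable cause existed as to all of those users rather than just the suspect.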
After explaining the steps of a geofence warrant, the court analyzed whether the warrant that was used to identify the defendant, Okello Chatrie, violated the Fourth Amendment. At the outset, the court sidestepped whether Chatrie had standing to contest the warrant because, ultimately, it denied his motion to suppress based on the federal good faith exception. However, the court did observe that the location data retained by Google could allow law enforcement to intrude upon the privacy of numerous innocent individuals. Id. at *17-18.
The court did determine that the geofence warrant used to identify Chatrie was invalid. Id. at *18. The geofence warrant did not contain sufficient probable cause to search the particular individuals in the specific geographic location. Id. at *22. The court observed that the limited probable cause in the warrant (that the suspect was seen using a cell phone) could not justify the seizure of all location information within a 150-meter radius of the bank. Id. at *20-22. This included individuals who may have never actually been within the area of the geofence based on Google’s estimates of their location. Id. at *22. The decision also held that Google’s three-step process could not cure the probable cause defects inherent in the warrant. Id. at *24-25. Finally, the court concluded that the third-party doctrine could not extinguish Chatrie’s expectation of privacy in his location information. Id. at *26-27.
As Judge Lauck observed, the use of geofence warrants has grown exponentially. While some prominent commentators are critical of the decision, the federal courts have begun the process of applying a 231-year-old amendment to the novel technologies of today.
Ask an Attorney
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. I saw a reference to “TARU” in my discovery. What is TARU and what does it have to do with the NYPD?
A. The Technical Assistance Response Unit (TARU) is a specialized unit within the NYPD that uses specialized investigative equipment and provides technical support to other police bureaus. If you see a notation in your discovery that TARU was involved in your case, it could mean that surveillance technology was deployed during the investigation.
In the initial stages of an investigation, we often see TARU assisting in the retrieval of surveillance video. However, their role is not limited to video acquisition. TARU is the sole unit responsible for deploying the many surveillance tools used by the NYPD. After the New York City Council passed the Public Oversight of Surveillance Technology (POST) Act in 2020, the NYPD was required to publish impact and use policies for their surveillance technology. A review of these policies shows the many areas in which TARU is intimately involved with surveillance technology. For example, TARU oversees the Communications Assistance for Law Enforcement Act (CALEA) collection system and determines who can access the system. They are the only NYPD unit allowed to access and operate cell site simulators, which impersonate cell towers to force mobile devices to connect to them for the purpose of determining the location of the device. They are also the only unit permitted to operate unmanned aircraft systems (otherwise known as drones). Additionally, they operate Wi-Fi tracking devices and the NYPD’s short-lived robot dogs.
TARU is involved in the arrests of many clients due to its access to technology that can locate and track individuals. TARU is also the clearinghouse for GPS pinging, which utilizes the E911 system. Pursuant to a court order, a wireless carrier initiates a signal to a user’s cell phone. The cell phone responds to this signal with its approximate location. The wireless carrier then sends this location information to TARU for distribution to the investigating officer. Oftentimes, two different tracking methods may be used in combination, such as GPS pinging to narrow down the approximate location of the client, followed by a cell site simulator to discern a more precise location.
When you see that TARU has been involved with your case, it is a signal that some sort of technology (e.g., video surveillance, location tracking, or eavesdropping) was employed in the investigation. There may be court orders or search warrants authorizing these intrusions. When appropriate, attorneys should challenge the use of surveillance technology through motions to controvert.
- Benjamin S. Burger, Digital Forensics Staff Attorney
Upcoming Events
April 7, 2022
SANS Open-Source Intelligence Summit 2022 (Virtual)
April 7-9, 2022
Making Sense of Science: Forensic Science & the Law Seminar (NACDL) (Las Vegas, NV)
April 11-13, 2022
Magnet User Summit (Nashville, TN)
April 12, 2022
The Weaponization of Technology (NYCLA) (Virtual)
April 19, 2022
Facial Recognition in the Five Boroughs (S.T.O.P. x RadTech with Amnesty International) (Virtual)
May 3, 2022
Intro To Artificial Intelligence (AI) Part 2: AI As A Litigation Tool (NYSBA) (Virtual)
May 9-12, 2022
Techno Security & Digital Forensics Conference (Myrtle Beach, SC)
May 16-17, 2022
Unlocking the Black Box (NACDL & Samuelson Clinic Seminar) (Chicago, IL)
June 6-10, 2022
RightsCon (Virtual)
July 22-24, 2022
A New HOPE (Hackers on Planet Earth) (Queens, NY)
August 11-14, 2022
DEF CON 30 (Las Vegas, NV)
October 10-12, 2022
Techno Security & Digital Forensics Conference (San Diego, CA)
Small Bytes
Equalizing Access to Evidence: Criminal Defendants and the Stored Communications Act (Yale Law Journal)
The secret police: Cops built a shadowy surveillance machine in Minnesota after George Floyd’s murder (MIT Technology Review)
Analysis of Mobile Phone Geolocation Methods Used in US Courts (IEEE)
My Wife Tracked Me, for Journalism (The New York Times)
Police Abolitionists Are Building a Dispatch App To Replace 911 (Motherboard/Vice)
Poor tech, opaque rules, exhausted staff: inside the private company surveilling US immigrants (The Guardian)
How Police Abuse Phone Data to Persecute LGBTQ People (Wired)
Drones, robots, license plate readers: Police grapple with community concerns as they turn to tech for their jobs (Washington Post)
As City Pilots New Weapons Scanner in Bronx Hospital, Adams Eyes Expansion Into Schools (Gotham Gazette)
COMIC: How a computer scientist fights bias in algorithms (NPR)
DOJ Doesn’t Even Know How Many Predictive Policing Tools It Funds (Gizmodo)
This Database Stores the DNA of 31,000 New Yorkers. Is It Illegal? (NY Times)