Medical Privacy, Armed Drones, ShotSpotter Decision, Acquiring Cell Phone Evidence & More
Vol. 3, Issue 7
July 11, 2022
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. In this issue, Shane Ferro looks at Facebook’s acquisition of medical data. Jerome Greco discusses how surveillance and technology companies exploit catastrophe for profit. Diane Akerman explains the importance of educating courts on new technologies. Finally, Benjamin Burger answers a question about acquiring cell phone data from a witness.
The Digital Forensics Unit of the Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts and examiners, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of the Legal Aid Society.
In the News
Hospitals Simply Give Away Patient Information To Facebook
Shane Ferro, Digital Forensics Staff Attorney
It turns out that allowing marketing to be the backbone of the free internet has cost us. In a post-Dobbs world, it’s not just period trackers that are leaking sensitive health data, but seemingly every medical website with an online scheduler that has ever placed a Facebook ad.
Various investigative journalists, most notably those at The Markup, broke stories this month about the amount of detailed, easily identifiable health information that medical providers are openly providing to companies like Facebook (I suppose it’s technically Meta) by using ad tracking tools on parts of their sites where patients input what should be confidential information.
The Markup’s first story on this revealed that hospitals around the country were using ad marketing tools that tracked users on their websites and sent that information to third parties like Facebook, even on pages where users entered personal information.
For example: “When The Markup clicked the ‘Finish Booking’ button on a Scripps Memorial Hospital doctor’s page, the pixel sent Facebook not just the name of the doctor and her field of medicine but also the first name, last name, email address, phone number, zip code, and city of residence we entered into the booking form.”
Although that information was hashed, The Markup was able to use a free online tool to reverse most of the hashed information it input—and Facebook is explicit about using the information to connect the data received through its ad trackers to specific Facebook profiles. Several of the nation’s largest and most prestigious hospital systems, including Johns Hopkins, New York Presbyterian, and Northwestern Memorial, did not make any effort to remove the tracking tools The Markup identified, even after being contacted by reporters.
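To see why hashing offers so little protection for low-entropy identifiers like emails and phone numbers, consider a minimal sketch. The hashing scheme and the email addresses below are illustrative assumptions, not The Markup’s actual methodology or Facebook’s actual pipeline:

```python
import hashlib

def hash_pii(value: str) -> str:
    """Normalize and hash a PII field the way ad pixels commonly do
    (lowercase, then SHA-256). Illustrative, not any vendor's exact scheme."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# An ad tracker transmits only the hash of what the patient typed...
leaked_hash = hash_pii("Jane.Doe@example.com")

# ...but anyone holding a list of candidate emails (e.g., an existing
# user database) can recover the original simply by hashing each
# candidate and comparing -- a classic dictionary attack.
candidates = ["john.smith@example.com", "jane.doe@example.com"]
recovered = next((c for c in candidates if hash_pii(c) == leaked_hash), None)
print(recovered)  # jane.doe@example.com
```

Because the space of real-world emails and phone numbers is small and guessable, hashing here functions less like encryption and more like a thin veneer over the raw identifier.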
In a separate investigation, The Markup also found that the abortion pill provider Hey Jane used trackers that revealed information about the site’s visitors to Facebook, Google, Stripe, and other data analytics companies. It was also using a third-party review company that revealed personal information like the Instagram handles of people leaving reviews on the site (Hey Jane removed all user reviews after The Markup contacted them).
After the first Markup story ran, the Washington Post reported that the privacy app Lockdown Privacy had discovered that Planned Parenthood was also using marketing-related trackers on its scheduling page that had access to information such as IP addresses, approximate zip codes, and what services the user was requesting. Planned Parenthood apparently took the trackers off the scheduling page after the Post’s reporting.
This sort of information sharing with social media sites has not received much attention from politicians, at least not yet, in the Dobbs fallout. However, Elizabeth Warren did propose a bill [PDF] last month that would prohibit data brokers from sharing and transferring health and location data.
Send in the Drones
Jerome D. Greco, Digital Forensics Supervising Attorney
Surveillance companies never fail to try to exploit tragedy for profit. On May 24, 2022, an 18-year-old opened fire at Robb Elementary School in Uvalde, Texas, resulting in the deaths of nineteen students, two teachers, and the shooter. It was the third-deadliest school shooting in U.S. history. Nine days later, in response to the shooting, Axon publicly announced the development of the TASER drone system. Axon, best known for producing body-worn cameras and Tasers, stated that it was “actively developing a miniaturized, lightweight TASER payload capable of being deployed on a small drone or robot” and that it had “begun collaborating with [its] partner DroneSense on a remote piloting capability.”
Axon CEO and founder Rick Smith said, “In the aftermath of these events, we get stuck in fruitless debates. We need new and better solutions. For this reason, we have elected to publicly engage communities and stakeholders, and develop a remotely operated, non-lethal drone system that we believe will be a more effective, immediate, humane, and ethical option to protect innocent people.” The value of Axon’s shares increased by almost six percent that day.
The backlash was immediate. Despite Smith’s claim of “engag[ing] communities and stakeholders,” Axon had already disregarded the recommendations of its own AI Ethics Board. In response to Axon’s announcement, nine of the twelve members of its AI Ethics Board resigned in a harshly worded public letter. The former members stated that, only a few weeks before, the board had voted 8-4 to recommend “that Axon not proceed with a narrow pilot study aimed at vetting the company’s concept of Taser-equipped drones.” A public report on their recommendation was being prepared at the time of Axon’s announcement. The resigning board members cited several issues, including the company embracing an idea that had “no realistic chance of solving the mass shooting problem.” They concluded that they had “lost faith in Axon’s ability to be a responsible partner.”
Despite Axon’s PR attempts, including a Reddit AMA with Smith and a graphic novel, the company quickly acceded to the public pressure and paused its work on developing a drone equipped with non-lethal weapons. However, this does not prevent it from restarting the project in the future, nor does it stop any other company from picking up where Axon left off. In New York, Senator Jessica Ramos and Assemblymember Ron Kim introduced a bill known as the Protect Our Privacy (POP) Act (S675/A3311). The POP Act would prohibit equipping drones with weapons and would significantly limit law enforcement’s use of drones, including banning police from using drones to monitor or record First Amendment protected activities.
Technology is not the solution to all problems, and surveillance technology is rarely a solution to any problem.
In The Courts
US v. Hawkins: Why It’s So Important to Educate the Court
Diane Akerman, Digital Forensics Staff Attorney
Courts continue to contend with the evidentiary issues raised by new technologies, including ShotSpotter. This month, the Second Circuit released a decision in United States v. Hawkins, which touched on whether ShotSpotter can provide reasonable suspicion or probable cause.
The facts in Hawkins are relatively run-of-the-mill. NYPD officers responded to a location two minutes after receiving a ShotSpotter alert from the roof of a building. They observed two men exit a building in the vicinity “acting tense” and “turning sideways.” Officers approached and asked them to remove their hands from their pockets, and observed a bulge in one man’s waistband. A witness informed the officers that he had seen the two men coming down from the rooftop. Officers recovered a firearm on the scene after conducting a frisk.
The court’s finding that officers had reasonable suspicion to stop the men in the first place was based on a totality of the circumstances, which included the ShotSpotter alert. The Court’s decision speaks to ShotSpotter’s reliability only in a footnote:
“The Defendants argue that the technology which relayed the shot-fired report to the NYPD, called ShotSpotter, is unreliable. But the responding officers testified that the technology works with a reasonably high degree of accuracy. For example, Officer Lopez testified that ShotSpotter reports, in her experience, were ‘usually’ accurate to the ‘block.’ Lopez testified that the reports were even more accurate ‘with respect to elevation.’ The district court did not err in crediting the officers’ reasonable reliance on the ShotSpotter report in supporting the reasonable-suspicion or probable-cause determinations.”
The real takeaway from many of these cases is how much work needs to be done by defense attorneys to educate the courts. There is a tendency to simply confer an undeserved reverence onto things deemed “technology,” or to carelessly throw around the term “scientific.” It is precisely that misguided assumption that law enforcement relies on to use these unreliable new technologies without oversight.
It’s difficult not to be overwhelmed when faced with new technologies. But while we may be dealing with a new “technology,” the basic legal landscape remains unchanged. The crux of any probable cause determination is whether the underlying information is credible and reliable. In this instance: is ShotSpotter credible and reliable? (Psst: It’s not).
Unable to find out more about the record below, I’m left to believe the worst: the only testimony about ShotSpotter’s reliability came from this one officer. An NYPD officer is not a ShotSpotter “expert” qualified to make such sweeping assertions about its reliability. An NYPD officer knows only what has been fed to them in promotional material.
Yes, an officer can testify about their own experience; yes, an officer can explain what they relied on to make a determination. But the question becomes: was what they relied on reasonable? That question remains the same whether the underlying information was a ShotSpotter alert, an Instagram post, or a confidential informant.
ShotSpotter can be one of many facts that gives rise to reasonable suspicion or probable cause. However, ShotSpotter is neither science, nor reliable, and it should be used only for very limited purposes. It is going to be up to you to educate the court.
Ask an Attorney
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. The complainant in my case may have valuable evidence on their cell phone. Assuming the complainant refuses to give us the evidence, what can we do to access the phone?
A. New York State enacted much-needed criminal discovery reform in 2020. One interesting part of the reform is codified in CPL § 245.30. This section of the law allows a court to issue orders preserving evidence, allowing access to a location, or granting “discretionary” discovery from an agency or person. See § 245.30(1)-(3). The discretionary discovery provision is only available to the defendant. The defense must show that the request is “reasonable” and the evidence can be obtained “without undue hardship.” Upon that showing, the court may grant a motion compelling discovery from the prosecution, an individual or agency, or any other “entity” subject to the court’s jurisdiction. The evidence must relate to the subject matter of the case and be “reasonably likely” to be material.
The discretionary discovery provision can be used to compel production of a complainant’s, or any other witness’s, cellular phone, provided the requirements of the statute are met. In People v. Dominicci, 2022 NY Slip Op 22173 (Bronx Co. Sup. Ct., May 5, 2022), a Bronx trial court upheld the use of CPL § 245.30(3) for that purpose. The Court rejected a witness’s argument that they were not “subject to the jurisdiction of the court” and that Supreme Court precedent prevented the seizure and search of their phone. The Court compared § 245.30(3) to CPL § 690.05(1), which allows the prosecution to seek a search warrant based on probable cause. Applying the same standards, the Court reasoned that the defendant’s burden to acquire evidence from a third party is a similar “probable cause” standard.
This section of the discovery law will allow defense attorneys to access evidence through a court-issued order. However, this process will not be available in every case. For example, if the prosecution has already extracted the relevant evidence from a witness’s cellular phone, there would be little reason for the defendant to seek a discovery order. Instead, the prosecution could transmit the relevant evidence to the defense for analysis. Additionally, other courts may interpret the defense’s burden for acquiring the discovery order more strictly. It may take some time for the case law surrounding § 245.30(3) to fully develop. There is also the possibility of subpoenaing the complainant’s phone, but there is little case law on that issue, and it may be the subject of a future column. Legal Aid Society attorneys should consult with the Digital Forensics Unit when pursuing a court order for a witness’s cell phone or other digital device.
Benjamin S. Burger, Digital Forensics Staff Attorney
Upcoming Events
July 22-24, 2022
A New HOPE (Hackers on Planet Earth) (Queens, NY)
August 11-14, 2022
DEF CON 30 (Las Vegas, NV)
August 15-16, 2022
DFIR Summit 2022 (SANS) (Austin, TX and Virtual)
September 7, 2022
Intro To Artificial Intelligence (AI) Part 2: AI As A Litigation Tool (NYSBA) (Virtual)
September 23-25, 2022
D4BL III (Data for Black Lives) (New York, NY)
October 10-12, 2022
Techno Security & Digital Forensics Conference (San Diego, CA)
Small Bytes
How a New Generation Is Combatting Digital Surveillance (Boston Review)
Why Expensive Social Media Monitoring Has Failed to Protect Schools (Slate)
The State Police Sent You a Friend Request (New York Focus)
How the Federal Government Buys Our Cell Phone Location Data (EFF)
US Marshal Charged for Using Cop Phone Location Tool to Track People He Knew (vice.com)
Your Deleted TikTok Content Can Still Be Used Against You By The FBI (Forbes)
Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.’ (The New York Times)
MTA rolling out ‘hidden’ surveillance cameras on subway trains to help solve transit crimes (New York Post)
Are You Ready to Be Surveilled Like a Sex Worker? (Wired)
Tech Companies Won’t Say If They’ll Give Cops Abortion Data (vice.com)
Apple introduces 'Lockdown Mode' iPhone feature to block elite spyware (NBC News)