Apple AirTags, Bodily Privacy, ShotSpotter Evidence, Google Location History & More
Vol. 3, Issue 3
March 7, 2022
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, in recognition of Women’s History Month, Shane Ferro discusses location tracking devices and their effects on women. In recognition of both Women’s History Month and Transgender Day of Visibility on March 31, Diane Akerman discusses attacks on bodily autonomy and digital privacy. Benjamin Burger discusses a recent Massachusetts case concerning ShotSpotter. Brandon Reim answers a question about Google location information.
The Digital Forensics Unit of the Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts and examiners, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of the Legal Aid Society.
In the News
Location Tracking Devices Used to Stalk Women
Shane Ferro, Digital Forensics Staff Attorney
For the New York Times, Kashmir Hill took on Apple AirTags and other similar tracking devices, surreptitiously tracking her husband across New York City. Although these tiny devices are marketed as an easy way to keep track of belongings, every week seems to bring more stories of people, usually women, discovering that their movements are being tracked without their consent.
Many of us are also unwitting participants in this surveillance network. In the case of Tile and Apple, anyone who has Bluetooth turned on and either has the Tile app or simply carries an iPhone is having their device constantly pinged to build out the network.
“Yes, the internet of things — our things — is coming alive around us, digitally frisking us as we walk by to see if we’re carrying anything of interest,” writes Hill.
Apple does notify a person (if they have an iPhone, or download a special app on Android) when an AirTag separated from its owner is detected near them over time, excluding tags whose owners are simply in proximity, such as another person on the same bus or train. But in practice, that safety feature only helps if you can find the tag. An article published this week by ABC’s Omaha affiliate tells of a woman whose phone notified her that an AirTag had been following her while she was in her car. Although she contacted Apple, the company could not remotely disconnect the tracking device without its serial number. AirTags also have a removable battery that can simply be pulled out, but again, only if you can find it.
If your iPhone has identified an AirTag as following you, the iPhone is supposed to be able to force that AirTag to make a sound. But when Hill’s husband tried, he had so much trouble finding the device that he gave up. (Apple announced improvements to the notification system the week after Hill’s story.) A separate Guardian article also notes that in a domestic violence situation, a person who is being surveilled may often be close enough to the phone of the tag’s owner that they are never notified.
At least Apple seems to be attempting to notify those who are being tracked without their consent and to make clear that this is not the intended use. The top three Amazon reviews for the LandAirSea device that Hill also tested (which are mentioned in Hill’s article and are still up several weeks later) all give five stars, with customers claiming they planted the devices on cheating spouses to catch them in the act, and many of the reviews on the company’s own website mention parents tracking teenagers (seemingly all of them girls) without their consent.
Law enforcement, of course, has not caught up, and there are multiple examples of people who found previously unknown AirTags in their bags or cars, contacted the police, and were told that there was nothing law enforcement could do.
The Attack on Bodily Privacy
Diane Akerman, Digital Forensics Unit Staff Attorney
The ongoing attack on reproductive freedom and access to abortion has taken especially alarming turns in the past few months. Since Roe became the law of the land, State governments have found ways to undermine it by regulating where and how safe abortions can be obtained, all under the guise of “protecting women’s health.” These laws and regulations, in fact, do nothing to protect pregnant people and instead cause irreparable physical and emotional harm.
A more recent, alarming attack on abortion access has taken the form of criminalizing those who seek and obtain abortions, medical professionals who provide safe access, and anyone who may have played a role in an abortion. This has come both from overzealous prosecutors and from state laws that attempt to subvert legal challenges by deputizing citizens to report (or, more aptly, snitch) on others and collect bounties for their “good deed.”
With the threat of criminal prosecutions always comes the threat of enhanced surveillance. In 2017, a grand jury in Mississippi indicted a woman for the murder of her stillborn child. Mississippi is considered one of the states most hostile to abortion rights, with legal access restricted to a single abortion facility for the whole state. The government’s case was based largely on unreliable forensic science and an extraction of the accused’s phone, which showed numerous Google searches for at-home and self-managed abortion. This was not the first, or only, case of its kind.
As states and localities place more and more restrictions on access, individuals will turn to at-home and self-managed abortion far more frequently. Currently, patients in at least 19 states who find in-clinic options inaccessible will continue to rely on the internet to find abortion medication. Numerous organizations have taken on the fight against digital threats to pregnant people, from providing information on strategies to protect pregnancy privacy and legal defense, to advocating for laws limiting the use of forensic technologies and banning the use of facial recognition, reverse keyword searches, and other forms of digital surveillance.
Criminalizing bodily autonomy, and accompanying government surveillance, is not limited to abortion access. Laws banning gender-affirming medical care and banning transgender youth from sports have multiplied. In Texas, guardians and parents of transgender minors are now under threat of prosecution for child abuse for providing gender-affirming medical care. These laws are also touted as efforts to “protect children,” but instead put LGBTQ youth at more risk.
Minors are especially susceptible to surveillance [PDF], and increasingly so as classrooms modernize and more students use remote learning. Some widely used software already flags the terms “lesbian,” “gay,” and “transgender,” presumably to protect students from bullying, but these flags have led to the outing and further marginalization of at-risk LGBTQ students, who already have the highest rate of youth homelessness and are at the greatest risk of suicide.
The right to reproductive choice, and bodily autonomy generally, is enshrined in the constitutional right to privacy. It should not come as a surprise that any attempt to undermine those rights would include an attack on both bodily and digital privacy.
In the Courts
Massachusetts Court Permits Stop and Seizure Based on ShotSpotter Activation
Benjamin S. Burger, Digital Forensics Staff Attorney
Last month, a Massachusetts appeals court reversed a lower-court ruling and found that an investigatory stop and “patfrisk” based on ShotSpotter activations was permissible under the Fourth Amendment and the state constitution. See Commonwealth v. Ford, 2022 WL 497325 (Mass. App. Ct. 2022) [PDF].
In May 2019, a police officer in Chelsea, Massachusetts received a series of ShotSpotter alerts. Arriving quickly at the location of the alerts, the officer identified the defendant as the only person in the vicinity. He ordered the defendant to the ground at gunpoint, handcuffed him, and recovered a firearm from the defendant’s right pocket. The trial court granted the defendant’s motion to suppress the gun, holding that the police officer did not have reasonable suspicion to stop the defendant and order him to the ground. Id. at *2. Specifically, the lower court held that the ShotSpotter alerts, without more, did not provide reasonable suspicion to believe the defendant was connected to the alerts. Id.
The appeals court disagreed, determining that there was an apparent difference between a single ShotSpotter alert and multiple alerts in the same area. Id. at *3. According to the court, “[e]ach successive report of a ShotSpotter alert, combined with the officer’s own hearing of apparent gunshots, made it increasingly reasonable for the officer to infer that the ShotSpotter devices were activating in response to consecutive gunshots.” Id. These successive ShotSpotter alerts created “an acoustic trail of breadcrumbs, from which it was reasonable to infer that the person responsible for the potential gunshots would be at or near the location where the ShotSpotter had last activated.” Id. Therefore, because the officer responded to the scene only a minute after the last activation, and the defendant was the only one present, the officer had reasonable suspicion to believe the defendant fired the shots. Id. at *4.
The decision in Ford suffers from a number of weaknesses in its arguments and analysis. First, the Court assumed the reliability of ShotSpotter’s gunshot detection and location system. It does not appear that this issue was extensively litigated in the trial court, as the defendant did not move to preclude the ShotSpotter evidence as unreliable. Id. at *5, FN7, FN8. Even the officer testified that he was not “familiar” with ShotSpotter and that it picked up sounds like fireworks and car backfires. Id. at *5, FN8. Second, the Court’s analysis elides the fact that there was no “individualized” suspicion as to this defendant. Although Ford may have been the only person at the scene, the officer had no factual basis to believe that Ford, as opposed to anyone else, was the person who discharged the shots. Notably, the officer did not know whether Ford was a perpetrator, victim, or witness prior to seizing him. Third, the Court’s logic contains no natural limiting principle. How many people at the scene of a shooting can the police search based on a ShotSpotter alert? Considering there were four ShotSpotter alerts in Ford, would the police have been permitted to seize and search up to four people? Conversely, if there is only one ShotSpotter alert and the police identify two people at the scene, can they search both?
In New York, the issue of whether a ShotSpotter alert can provide reasonable suspicion for a search and seizure has percolated through the trial courts and the Appellate Division. The Second Department held that a ShotSpotter alert, combined with a vague clothing description, did not give the police reasonable suspicion to pursue an individual. See People v. Ravenell, 175 A.D.3d 1437 (2d Dep’t 2019). The First Department has issued decisions finding reasonable suspicion when a ShotSpotter alert was combined with additional factors. See People v. Pope, 194 A.D.3d 449 (1st Dep’t 2021) (ShotSpotter alert and witness “who, in a face-to-face encounter, described a person involved in the shooting and pointed to the direction where the suspect had fled,” created reasonable suspicion to stop and frisk the defendant).
Ask an Examiner
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. My client states that he was in a different location from that of the incident for which he was arrested. He also has a Google account. What kind of location information can we get from his account?
A. Google location information can be a valuable resource in determining a phone’s location. However, there are a couple of factors required to access and download the location information stored in the account. First, the client must be using a phone with an active Google account. This can either be an Android-based phone or an Apple iPhone. Second, the account must have had location history enabled. The default for a Google account is to turn on this feature, but it can be disabled by the user.
Assuming these two factors are present, a digital forensics analyst can use the Google account username and password to log in and access the account through a web browser. On Android-based phones (for example, a Samsung Galaxy or Google Pixel), this typically will be the Google account used when setting up the phone. On iPhones, this will likely be a Google account that was used to set up Gmail or another Google application. In either case, the analyst will need to use a two-factor authentication code. This will require the client, or someone who has access to the Google account, to provide the code when it is sent to the device. The analyst can then request from Google a download of the entirety of the account, including location history information.
Depending on the amount of data in the account, it will take Google anywhere from a few minutes to a few days to email a link to download the data. The data in the account can span more than a decade and it may take a digital forensics analyst some time to review the data. Once the data has been downloaded and reviewed, the analyst can map the location data points that correspond to the dates and locations of interest. The analyst can also give an opinion as to the accuracy of the location information. This is sometimes necessary, as Google does allow a user to manually edit or add locations in the location history timeline.
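For readers curious what the analyst is working with, the location history in a Google Takeout download arrives as JSON. The sketch below shows one plausible way to pull out the data points for a date range before mapping them. It is a minimal illustration, not a forensic tool: the file path and the field names (`locations`, `latitudeE7`, `timestampMs`, `accuracy`) are assumptions based on commonly described Takeout exports, and Google has changed this format over time.

```python
import json
from datetime import datetime, timezone

def load_points(path, start, end):
    """Parse an assumed Takeout-style location history file and return
    (timestamp, latitude, longitude, accuracy_m) tuples within [start, end]."""
    with open(path, encoding="utf-8") as f:
        records = json.load(f)["locations"]  # assumed top-level key
    points = []
    for rec in records:
        # Newer exports reportedly use an ISO-8601 "timestamp";
        # older ones used "timestampMs" (milliseconds since the epoch).
        if "timestamp" in rec:
            ts = datetime.fromisoformat(rec["timestamp"].replace("Z", "+00:00"))
        else:
            ts = datetime.fromtimestamp(int(rec["timestampMs"]) / 1000,
                                        tz=timezone.utc)
        if start <= ts <= end:
            points.append((
                ts,
                rec["latitudeE7"] / 1e7,   # coordinates stored as degrees * 10^7
                rec["longitudeE7"] / 1e7,
                rec.get("accuracy"),       # reported accuracy radius in meters
            ))
    return points
```

Filtering to the dates of interest first, as above, matters in practice: an account spanning a decade can hold millions of points, and only a narrow window is usually relevant to the incident.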
Google location information is a relatively new type of evidence and there are numerous evidentiary issues involving the data. The information is drawn from a number of sources, including GPS, nearby Wi-Fi, mobile networks, and device sensors. Although a digital forensics analyst can understand how the system works, they may lack the personal knowledge to lay a proper foundation to admit the evidence in court. Depending on the circumstances, it may require a witness from Google to testify. Attorneys are encouraged to research the evidentiary issues and consult with available experts.
- Brandon Reim, Digital Forensics Analyst
Upcoming Events
March 5-13, 2022
NYC Open Data Week (NYC Mayor’s Office of Data Analytics, BetaNYC, and Data Through Design) (Virtual & in-person)
March 7-10, 2022
Mozilla Festival (MozFest 2022) (Virtual)
March 29, 2022
Intro To Artificial Intelligence (AI) Part 1: AI As Evidence In Litigation (NYSBA) (Virtual)
April 5, 2022
Intro To Artificial Intelligence (AI) Part 2: AI As A Litigation Tool (NYSBA) (Virtual)
April 7, 2022
SANS Open-Source Intelligence Summit 2022 (Virtual)
April 7-9, 2022
NACDL Making Sense of Science: Forensic Science & the Law Seminar (Las Vegas, NV)
April 11-13, 2022
Magnet User Summit (Nashville, TN)
May 9-12, 2022
Techno Security & Digital Forensics Conference (Myrtle Beach, SC)
June 6-10, 2022
RightsCon (Virtual)
July 22-24, 2022
A New HOPE (Hackers on Planet Earth) (Queens, NY)
August 11-14, 2022
DEF CON 30 (Las Vegas, NV)
October 10-12, 2022
Techno Security & Digital Forensics Conference (San Diego, CA)
Small Bytes
FBI used geofence warrant in Seattle after BLM protest attack, new documents show (The Verge)
Adams eyes expansion of highly controversial police surveillance technology (Politico)
Why Have 14 of 15 U.S. Cabinet Departments Bought Phone Unlocking Technology? Few Will Say. (The Intercept)
NSO Group Gave Pegasus Spyware Demo to the NYPD (vice.com)
Tech Firm Offers Cops Facial Recognition to ID Homeless People (vice.com)
A bill aiming to protect children online reignites a battle over privacy and free speech (Washington Post)
The digital divide in the US criminal justice system (New Media & Society/SAGE Journals)
New York Rolling Out Noise Law, Listening Tech for Souped-Up Speedsters (The City)
The secret police: Cops built a shadowy surveillance machine in Minnesota after George Floyd’s murder (MIT Technology Review)