ShotSpotter's Bad Month, Messaging Discovery, Significant Locations, Emerging Surveillance Technologies & More
Vol. 5, Issue 3
March 4, 2024
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Jerome Greco discusses developments surrounding the ShotSpotter gunshot detection system, including the decision by Chicago officials to end its contract. Allison Young provides an overview of changes to messaging and how they could affect legal cases. Joel Schmidt reviews a recent Massachusetts court decision precluding the use of an iPhone’s “Significant Locations.” Finally, our expert columnist this month is Legal Aid Society alumna Cynthia Conti-Cook from the Surveillance Resistance Lab, who reveals the newest surveillance technology trends.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
Chicago Reportedly Ending Expensive Relationship with ShotSpotter
Jerome D. Greco, Digital Forensics Supervising Attorney
For SoundThinking, the rebranded company behind the ShotSpotter product, the past few weeks have been rough. First, Chicago Mayor Brandon Johnson announced that the city will not be renewing its contract for ShotSpotter. While ending the contract was part of the Mayor’s election campaign plan, The Plan for a Safer Chicago [PDF], pre-election promises from politicians often go unfulfilled once they take office, and, here, there are still a few wrinkles. Despite declaring that the city would not renew its contract, Mayor Johnson has extended it until November 22, 2024, allegedly to allow the police time to “revamp operations within the Strategic Decision Support Centers, implement new training and further develop response models to gun violence that ultimately reduce shootings and increase accountability.” The nine-month extension costs a total of $8.6 million. The extension also ensures that the ShotSpotter system will still be operating in the Windy City when it hosts the Democratic National Convention in August. The contract allegedly will not be renewed or extended again.
On February 22, 2024, Wired reported it had obtained the previously secret precise locations and uptimes of 25,580 ShotSpotter microphones from around the world. Wired found that:
“An analysis of sensor distribution in US cities in the leaked data set found that, in aggregate, nearly 70 percent of people who live in a neighborhood with at least one SoundThinking sensor identified in the ACS data as either Black or Latine. Nearly three-quarters of these neighborhoods are majority nonwhite, and the average household earns a little more than $50,000 a year.”
The locations of the sensors include hospitals, schools, and public housing properties. Wired reported that by 2019 SoundThinking had adopted a formal policy that “its clients cannot access the precise locations of their gunshot detection equipment.” However, emails obtained by The Legal Aid Society’s Digital Forensics Unit showed high-ranking NYPD officers coordinated with the company in attempting to get specific buildings to install ShotSpotter sensors in 2021.
The most recent hit to ShotSpotter’s already damaged reputation was a revelation from Chicago’s Civilian Office of Police Accountability (COPA). COPA found that a ShotSpotter gunshot alert led police to a boy lighting fireworks. After hearing a loud noise, an officer fired his gun and yelled “Shots fired! Shots fired!” After the officer discharged his weapon, the boy shouted “No, it’s just fireworks! It’s just fireworks!” Thankfully, the child was not hit or injured. As the People’s Fabric notes, the COPA report is consistent with dispatch audio previously published by Jinx Press, once again showing the importance of public access to police radio runs. After the incident, the Chicago Police Department attempted to cover it up. They initially claimed that a man fired upon officers, but later changed their story to “officers saw flashes of light and one of the officers fired his weapon, police said. It wasn’t immediately known if the male was armed.”
For cities and states, surveillance is easy to adopt but harder to remove, and SoundThinking is not leaving cities without a fight. As reported by Crain’s Chicago Business, during a recent earnings call, SoundThinking CEO Ralph Clark outlined a plan to convince the public, media, and the Mayor that ShotSpotter is a necessary policing tool for Chicago, even arrogantly declaring, “We deserve to be in Chicago.” There is a lot of time between now and November 22nd for the company’s public relations machine and the invasive-surveillance-tech-is-necessary-for-policing crowd to exert pressure or mount legal challenges in an attempt to reverse the decision not to renew the ShotSpotter contract. Furthermore, Chicago has already started to test another SoundThinking product, CrimeTracer.
Hopefully, Mayor Johnson will not renege on his promise. In the meantime, I thank our friends in the Cook County Public Defender’s Office, the advocates and activists in the Stop ShotSpotter coalition, and others for their inspiring work in countering SoundThinking’s narratives and pressuring the City of Chicago to end its ShotSpotter contract.
Chat Bubble Skirmishes and Surprise Features That May Complicate Your Message Discovery
Allison Young, Digital Forensics Analyst
February was a big month for messaging news, from advances in encryption to a recent AT&T network outage (which may give you a hard time if you need records for an incident on February 22, 2024). Major apps, including Signal, iMessage, and WhatsApp, are bringing big changes that will impact the way that we chat – and how those chats end up in evidence.
Most people have moved from using standard SMS/MMS text messages, which are less secure from surveillance, to using messaging apps that send messages over a data connection (i.e., over the internet). Messaging apps may include additional features like improved group chatting, video calling, encryption, or the ability to chat without a phone plan. Default text messaging apps are now adopting RCS (Rich Communication Services), which typically comes “baked in” on new Android phones from leading carriers.
Data-based chat communications result in fewer messages logged by a carrier in call detail records (CDRs) and even less text message content available to law enforcement investigators (content was already retained for only a few days, depending on the carrier, and sometimes not at all). It may also frustrate the next generation of geofence warrants being sent to carriers now that Google has attempted to end (or at least greatly reduce) its participation in them.
iMessage, the default app for texting on iPhone, is a data-based chat app. Messages go through Apple’s servers and are not tracked by the phone carrier when two iPhone users message each other with this service (blue text). When an iPhone user and an Android user text, the chat falls back to standard SMS/MMS (green text), which usually shows up on carrier records. Apple has historically pushed back on attempts to bypass the seemingly “VIP” access to iMessage afforded by owning an iPhone, blocking competitor workarounds to the point of attracting FCC attention. Apple had also claimed to be uninterested in supporting RCS messaging for texting with Android users… but had a sudden change of heart. This comes despite a win this month in which the EU exempted Apple from requirements to support “chat interoperability” under the Digital Markets Act (DMA) (it’s now theorized that the deciding legal pressure to add RCS support may have come from China).
WhatsApp, however, has become a recent victim of the EU’s DMA. It now faces demands to implement the ability to exchange messages with other chat apps. While this could be compared to how email works, in that Gmail users can contact Outlook users and vice versa, privacy advocates have been concerned that it could leave chat users more vulnerable as messages are carried between various services and potentially de-anonymized. In the future, it may also complicate identifying which apps contain relevant messages: the fact that one person has evidence in their Facebook Messenger app does not rule out the other party having those same chats in another app, like Telegram.
Despite this, the Meta-owned company seems to have cheerily embraced the idea of adding chat app interoperability. Meta has also continued to support E2EE (end-to-end encryption) in Facebook Messenger, despite attempts to make messages more susceptible to surveillance in the name of child protection. Turning off encryption in one app would no doubt complicate any plans to open lines of communication between the apps.
Finally, Signal is rolling out a new feature (originally teased by WhatsApp) to protect the contact information of its users by allowing them to chat with usernames instead of phone numbers. You may have a lower chance of identifying the source of a problematic chat if they’re using this app, as Signal does not provide subscriber information in response to a legal request.
2024 may be a tumultuous year in digital communications. With that in mind, please be patient when a digital forensics analyst answers your tech questions with “it depends.”
In the Courts
Massachusetts Court Precludes iPhone Location Data
Joel Schmidt, Digital Forensics Staff Attorney
There are many ways to potentially extract historical location information from a person’s cellphone usage. For example, a cellphone carrier may be able to reveal the location of cellphone towers that connected with a particular phone at a particular time. Other possibilities include Google location history and the location information that may be embedded in a video or photo at the time it is taken (aka “metadata”). These kinds of location information are occasionally admitted into evidence in court, whether used by the prosecution in an effort to place a defendant at the scene of a crime or by the defense to prove up an alibi showing the defendant was not at the scene.
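As a brief illustration of the photo metadata mentioned above: cameras that geotag images record GPS coordinates in EXIF as degrees, minutes, and seconds plus a hemisphere reference, rather than the signed decimal degrees most mapping tools expect. The sketch below shows that conversion; the function name and sample coordinates are purely illustrative and not drawn from any case.

```python
def dms_to_decimal(dms, ref):
    """Convert EXIF-style (degrees, minutes, seconds) GPS values to
    signed decimal degrees. Southern and western hemispheres are negative."""
    degrees, minutes, seconds = dms
    decimal = degrees + minutes / 60 + seconds / 3600
    return -decimal if ref in ("S", "W") else decimal

# Hypothetical EXIF values: GPSLatitude (40, 44, 54.36), GPSLatitudeRef "N"
lat = dms_to_decimal((40, 44, 54.36), "N")  # ≈ 40.7484
lon = dms_to_decimal((73, 59, 8.36), "W")   # ≈ -73.9857
```

Tools like ExifTool or a forensic suite perform this conversion automatically; the point is simply that the raw metadata in the file is not in a directly map-ready form.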
A less talked about potential source of location information is the iPhone Frequent Location History (FLH), or Significant Locations as it is called in more recent versions of iOS (the operating system for iPhones). According to Apple, “Your iPhone and iCloud-connected devices will keep track of places you have recently been, as well as how often and when you visited them, in order to learn places that are significant to you” so that they can “provide you with personalized services, such as predictive traffic routing, and to build better Memories in Photos.” That “learning” process occurs through a proprietary algorithm that analyzes where you’ve been in order to arrive at a list of locations it believes are significant to you.
Last month the Massachusetts high court – the Supreme Judicial Court – affirmed [PDF] Commonwealth v. Arrington, a trial court decision finding FLH to be too unreliable to be admissible at trial. The high court found that the lower court did not abuse its discretion in excluding the evidence because the government failed to meet its burden of showing that FLH data is deemed reliable by the digital forensics community. It didn’t matter if reliable location data was fed into the algorithm since the government was trying to use the unverified data it spat out. “In other words, even if the inputs used by the FLH algorithm are generally deemed reliable, the FLH data outputs are not ipso facto reliable, especially where there is not scientific literature or adequate testing to support reliability.”
The government also failed to sufficiently test the reliability of FLH data. A government analyst conducted some experiments in which he repeatedly visited a number of locations to have them recognized as significant locations, but the sample size was small, unscientific, inadequate, and “[d]espite his testing, the analyst did not know the algorithm used in creating FLH data and did not know how various factors were weighed to create FLH data outputs.” Further, there are no peer reviewed or published studies on the reliability of FLH data and it has an unknown error rate.
In a friend of the court brief [PDF] filed with the court, three digital forensics examiners emphasized that Apple created FLH only for advertising purposes, which can tolerate an error rate that would be unacceptably high in other contexts. “If, for example, one does a search on their iPhone for ‘breakfast,’ an advertiser such as Waffle House would likely not be disappointed if their restaurant’s information was presented to a consumer in error because the location of the device was not exact.” However, the data was not intended for use in, and should not be admissible in, a criminal trial where human lives hang in the balance and the standard of proof is beyond a reasonable doubt.
The Massachusetts high court decision was unanimous, but two concurring judges wrote separately to caution that although the government was unable to meet its burden in this case, that could change in the future should our understanding of the technology evolve.
Expert Opinions
We’ve invited specialists in digital forensics, surveillance, and technology to share their thoughts on current trends and legal issues. Our guest columnist this month is Cynthia Conti-Cook.
An Emerging Technology Triad: Mobile Driver’s Licenses, Digital Wallets, and Centralized City Databases
The Surveillance Resistance Lab is focused on an emerging “triad” of technology infrastructures whose futures are interdependent: each relies on the parallel development of the others. While individually they create surveillance and control concerns, together, digital identification, wallets, and centralized city databases present a significant threat of becoming a core, durable infrastructure for expanding surveillance, state control, social exclusion, and corporate influence on public policy and governance.
Such projects have been under way for more than a decade—and often represent the interests of local and federal police and corporations. The New York Times reported back in 2011 that “New York City has spent the past 18 months developing a database on four million residents, most of them the city’s neediest, which officials say will enhance social services but which advocates for the poor say could put their privacy at risk. Using data-sharing concepts developed by the Department of Homeland Security and other law enforcement agencies, the database links together vast amounts of information gathered by city agencies that previously maintained their files separately.”
Defenders and civil rights attorneys should track these technologies carefully and pay attention to how information-sharing infrastructure is impacting policing. Back in 2011, then head of the Legal Aid Society Steven Banks warned that “with all of the agencies now connected, an error made by one in recording information will cascade through every aspect of your life.”
The triad of mobile driver’s licenses, digital wallets, and centralized database technologies will facilitate this. Mobile driver’s licenses (mDLs) are emerging across the country, including a contract between IDEMIA and the DMV in New York State. The Surveillance Resistance Lab, along with the National Immigration Law Center, produced an mDL guide explaining how this new technology works and why it raises concerns for privacy as well as for the influx of corporate power it invites into the public sector. Mobile driver’s licenses assign unique identifiers to people and can be tracked based on where you present them, both in person and online. As people are routinely crossing state lines and purchasing medications online, it is not hypothetical to imagine all of the ways law enforcement would rely on unique identifiers to link where people go online and in real life. In addition, holding digital identification information in smartphones, which people use to record police interactions, problematically introduces a power dynamic that invites police attempts to obstruct a recording by demanding digital identification.
Digital wallets, like credit cards, collect data on purchases. Matthew Fraser, Chief Technology Officer for NYC, former corporate consultant, and previous Deputy Commissioner and Chief Information Officer at the NYPD, has already suggested using digital wallet technology to surveil and incentivize people relying on city benefits to eat more vegetables. The City of Detroit suspended its Detroit ID program in 2022 because of residents’ concerns that MoCaFi was selling data to commercial databases accessed by federal immigration officials. The NYC Municipal ID Coalition thwarted a similar attempt to add a radio frequency ID chip (RFID chip) to New York City’s IDNYC in 2018 based on similar concerns about immigration surveillance.
While digital identification and wallets collect data, city databases that centralize social services data store it. For example, Mayor Adams has flagged that MyCity, “a one-stop shop for city services,” will also “[combine] all agency metrics onto a single platform similar to CompStat and [use] analytics to track performance in real time, we can go from a reactive management approach to being proactive and, eventually, predictive.”
Indeed, a MyCity data sharing agreement signed by the Office of Technology and Innovation (OTI), Administration for Children’s Services (ACS), Department of Homeless Services (DHS), Department of Education (DOE), and Human Resources Administration (HRA) in 2023 may change how police and courts access New Yorkers’ data–and how New Yorkers find out about it. For example, this agreement directs OTI to respond to a legal demand for someone’s information from another agency instead of redirecting the request to the agencies New Yorkers shared their information with. This is significant because some agencies may be required to notify people about a legal demand for their information so they may challenge the demand. The five days’ notice OTI is required to give agencies does not give someone enough time to bring a challenge.
In addition to the data sharing, this assembly of information lays foundations for predictive policing tools. New Yorkers must pay attention to similar digital experiments in other cities. MyCity could be turned into a tool for the executive administration to generate investigatory leads and introduce automatic denials of social services through corporate vendors.
Together, this tech infrastructure triad of digital identification, wallets, and centralized city databases increases state surveillance capacities across all agencies and problematically entwines corporate control over the public sector at the same time.
Be on the lookout for evidence of data sharing in your cases and let us know if we can support you. Save the date - on March 27 at 7pm we will be discussing the findings of our upcoming MyCity report at NYU law school. To learn more about the Surveillance Resistance Lab and our campaigns, please visit surveillanceresistancelab.org.
Cynthia Conti-Cook is the director of research and policy at the Surveillance Resistance Lab, a think and act tank focused on challenging technology deployed at the nexus of state and corporate power that erodes fundamental rights. We are committed to movement building to fight for accountability and government divestment from technologies that expand systems of control and punishment (as well as suppress dissent and difference) in public spaces, schools, workplaces, and at and across borders.
Upcoming Events
March 6, 2024
AI Risk Mitigation and Legal Strategies (NYCLA) (Virtual)
March 11, 2024
Digital Part III: Litigating Against Algorithms, Hidden Technology, and the Machine Witness (NYSDA) (Virtual)
April 11-12, 2024
2024 Forensic Science and Information Technology Institute (ABA) (San Diego, CA)
April 15-17, 2024
Magnet User Summit 2024 (Magnet Forensics) (Nashville, TN)
April 18-20, 2024
Making Sense of Science XVII: Forensic Science & the Law (NACDL) (Las Vegas, NV)
May 21, 2024
3rd Annual Decrypting a Defense Conference (Legal Aid Society’s Digital Forensics Unit) (Queens, NY) (Registration link coming soon!)
June 4-6, 2024
Techno Security East 2024 (Wilmington, NC)
July 12-14, 2024
HOPE XV (Queens, NY)
July 22-25, 2024
2024 HTCIA Global Training Event (Las Vegas, NV)
October 19, 2024
BSidesNYC (New York, NY)
Small Bytes
21 Bodycam Videos Caught the NYPD Wrongly Arresting Black Kid on Halloween. Why Can’t the Public See the Footage? (ProPublica)
AI-generated voices in robocalls can deceive voters. The FCC just made them illegal (AP)
The AI Lawyer is Here (The Marshall Project)
Backdoors that let cops decrypt messages violate human rights, EU court says (Ars Technica)
Artificial Intelligence Is Putting Innocent People at Risk of Being Incarcerated (Innocence Project)
New bill would let defendants inspect algorithms used against them in court (The Verge)
NYPD to Deploy Drones to Drop Flotation Devices on Drowning Swimmers at City Beaches (The City)
A Vending Machine Error Revealed Secret Face Recognition Tech (Wired)
This $4 Billion Car Surveillance Startup Says It Cuts Crime. But It Likely Broke State Laws. (Forbes)