Strava and Data Brokers, Tech Eulogies, Social Media and the Fourth Amendment, the Future of Legal AI-d & More
Vol. 5, Issue 12
December 2, 2024
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. This month, Joel Schmidt reviews how Strava and data brokers are revealing location data to others. Diane Akerman eulogizes tech we’ve loved and lost this year. Shane Ferro analyzes a recent court decision on social media and the Fourth Amendment. Finally, our guest columnists, Cynthia Conti-Cook and Megan Graham, discuss the threat AI poses to the public sector.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal Defense, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
How Strava and Data Brokers May be Revealing Your Location for all the World to See (No Matter Who You Are)
Joel Schmidt, Digital Forensics Staff Attorney
In 2021, French President Emmanuel Macron took what was intended to be a private weekend trip not listed on the president’s official agenda. However, by reviewing publicly available Strava data from Macron’s bodyguards, the French newspaper Le Monde was able to determine that Macron spent the weekend in Honfleur, a French seaside resort.
Similarly, in the United States, agents with the Secret Service were publicly posting their locations on Strava, allowing Le Monde, and anyone else, to determine the highly confidential locations of Secret Service protectees, such as President Joe Biden, First Lady Jill Biden, Vice President Kamala Harris, and President-elect Donald Trump.
In all, Le Monde was able to identify the public Strava profiles of 26 US Secret Service agents. It also identified a number of Strava profiles belonging to bodyguards with the French and Russian agencies in charge of presidential security, including those protecting Russian President Vladimir Putin. Reviewing these profiles may not only disclose a protectee’s current location but also predict future locations the protectee may visit, since bodyguards and Secret Service agents typically travel to and scout locations ahead of time to secure them before the protectee arrives.
According to the Secret Service, there is no prohibition on its agents’ use of Strava while off duty, but the agency says it will review the Le Monde report “to determine if any additional training or guidance is required.” Members of the United States military have been restricted in how they can use Strava and other fitness tracking apps since 2018, after Strava published its users’ location data in a global heat map, revealing secret US military bases around the world.
However, the military’s restrictions on the use of Strava have not made the Pentagon and its facilities immune from the harmful effects of another source of data: the cellphone location information collected and sold by commercial data brokers. Hundreds of millions of people have apps on their phones that, often without their knowledge, track their location wherever they go. That information is sold to commercial data brokers, which in turn make it available for purchase by anyone for a fee. Government agencies have been known to buy this location information on the open market, acquiring data they otherwise could have obtained only via a search warrant supported by probable cause. (In April, the US House of Representatives passed the Fourth Amendment Is Not For Sale Act, which would prohibit the government from acquiring information that would normally require a warrant, but the Senate has not yet acted on it.)
Now the government itself is feeling the repercussions of allowing commercial data brokers to collect and sell cellphone location data as they please. Recent reporting has shown how anyone with a credit card can purchase phone coordinates from a US data broker and track the movement of American military and intelligence personnel at sensitive US military locations in Germany, including tracking them to nuclear storage facilities. They can also be tracked to prohibited locations like area brothels, making them vulnerable to blackmail and exploitation, jeopardizing US national security.
Our cellphones are essential to everyday life but all too often the data they generate falls into the wrong hands.
The Year in Review
In Memoriam: Our Dearly Departed Surveillance Tech
Diane Akerman, Digital Forensics Staff Attorney
With Thanksgiving falling so late in November this year, it felt fitting, for once, not to focus on the ever-expanding surveillance state encroaching on our every move and slowly depriving us of even the smallest modicum of privacy, but instead to take a moment to find something to be thankful for. So this year, the Digital Forensics Unit (DFU) celebrates the demise of bad surveillance tech.
Evolv Weapons Scanners (July 26, 2024 – October 2024)
Despite marketing that sounded about as believable as a QVC infomercial, Mayor Adams and the NYPD deployed Evolv weapons scanners in NYC subways. After a full month of terrorizing straphangers, the NYPD quietly released the results of the failed project: 118 false positives and not a single recovered firearm. Not only is the NYC pilot dead, but Evolv is facing an ever-growing onslaught of troubles: an internal audit found evidence of “misconduct,” and Evolv’s contracts are under scrutiny by the DOI, the feds, and the FTC. Add this to the pile of Mayor Adams’s legal woes.
Knightscope (September 22, 2023 – February 2, 2024)
The little 420-pound robot, flanked at all times by two NYPD bodyguards, was never long for this world. Though the Mayor continues to promise K5 will have a new assignment, the robot remains in storage. It seems unlikely the robot will roam the floors of the subway again, as even those who deployed it admitted that it did little more than act “as a reminder that technology is watching folks.” Perhaps it would be better if the NYPD officers deployed in overwhelming numbers throughout the subway watched anything other than their phones.
Google Timeline Geofence Warrants (~ 2015 – 2024)
Geofence warrants [PDF] remain one of the most intrusive surveillance methods used by law enforcement across the country. By allowing law enforcement to request the data of any person whose phone may have been in an area, it’s not just likely that innocent civilians are being swept up in a surveillance dragnet, it’s guaranteed. While the issue takes its time winding through the courts, which remain split on the warrants’ constitutionality, Google made the decision to simply stop keeping the data in the cloud. No data, nothing to provide in response to a warrant. Is it tech altruism, a good business decision, or simply less work overall? Who knows, but it’s one rare example of how tech moving faster than the law can sometimes benefit the public. Now we just need all the other tech companies to follow suit.
Ring Neighbors Warrantless Access (2018 – 2024)
Ring cameras have been one of the most popular home security cameras for years, providing relatively simple and affordable monitoring for one’s home. In 2018, Ring introduced the Neighbors app, turning simple home security surveillance into social networking. The app allowed users to post and share videos with, you guessed it, their neighbors, but unfortunately, those neighbors also included local law enforcement, who were given full warrantless access to videos without owners’ consent. Early this year, Ring ended the Request For Assistance program. It won’t end police access to Ring content entirely, but it will require (for the most part) a warrant. Now if only it weren’t so easy for law enforcement to get one of those.
Happy Holidays and Happy New Year from DFU, and good riddance to bad rubbish: silly robots, unreliable scanners, unconstitutional dragnets, and misguided, nonconsensual partnerships with law enforcement. May the new year bring a ban on facial recognition, a prohibition on circumventing the Fourth Amendment, and transparency.
But since that’s unlikely, maybe you should consider spending a few hours doing this and this, or just throwing your phone directly into the sun.
In the Courts
Social Media is Not a Categorical Exception to the Fourth Amendment
Shane Ferro, Digital Forensics Staff Attorney
In early November, the Eastern District of Virginia took up one of my favorite, most frustrating legal topics, social media and the Fourth Amendment, in U.S. v. Chhipa, 2024 WL 4682296 (E.D. Va. 2024).
This holding recognizes that some parts of a Facebook account (private messages) are covered by the Fourth Amendment. However, it still belongs in the flaming trash pile of cases, written by judges who hold a clear disdain for any young person who dares post to the internet, that refuse to acknowledge how social media sites actually work. Engaging with life in the 21st century should not extinguish the right to challenge the government rifling through years of a person’s digital papers and effects.
Chhipa leans heavily on the test from Katz v. United States, 389 U.S. 347 (1967), which hinges on whether a person had a “reasonable expectation of privacy” in the thing the government went searching through. Chhipa, and the line of cases it cites, treats the issue as if there are two very obvious buckets of information in a Facebook page, one public and one private. From there, it’s supposedly simple: there’s a reasonable expectation of privacy in private messages, but not in public posts. Yet I’m sitting here thinking, “Okay, sure, but what is the expectation for the metadata of the photo that was previously a profile photo but then got moved to ‘friends only’ and was later deleted?” The public/private dichotomy in social media has been dead for over a decade.
I can see that maybe, for the simpler-to-use Instagram, grid photos on a public profile are purely “public” information. What about Stories that auto-delete after 24 hours? What about Stories posted only to Close Friends? Does it matter if a person has a 5-person Close Friends list or a 25,000-person one? On Facebook, where each individual post can have different privacy settings that change over time, good luck trying to distinguish the “public” character of one post from another.
The complex gray area of social media privacy exists precisely because users demand the ability to share information with just a certain subset of people, or just for a short amount of time. At this point, most people have a reasonable expectation that not every photo they put on the internet should exist forever for everyone. However, social media companies need scale, so despite a myriad of privacy offerings, they push people to make content public and make it intentionally confusing for a user to establish clear privacy settings. (If you haven’t posted something on main you didn’t mean to, you aren’t using the internet enough.)
As someone who has grown up as an early adopter of most of the major social networks still in use, I think it’s actually impossible to create a clear delineation of what is and is not reasonably expected to be “private” on the internet. The Katz test is simply a vehicle for out-of-touch judges to moralize about kids these days behaving in ways that the judges don’t like or understand.
Yet, somehow, nearly forty years ago, Congress saw this freight train coming and tried to stop it. The Stored Communications Act, for all its outdated frameworks and other faults,* requires the government to get a warrant to obtain content from tech companies. The expectation of privacy is right there! Has been since 1986! It’s nonsense for courts to recognize that the law sees enough of an expectation of privacy to require the government to get a warrant, but not enough to allow a defendant to challenge that warrant.
If that wasn’t backwards enough, this line of cases that harps on public vs. private information using the Katz test (and ignoring the SCA’s intent) also requires the defense to first make a showing about privacy settings. Not today’s privacy settings, but the privacy settings on the date the warrant was signed. Unless the client specifically remembers, the only way to do that is to get that information from the social media company. But wait! We can’t! Because the SCA requires a warrant, and the defense is not the government.**
So, having ranted about the problem, what’s the solution? The first is obvious: the SCA exists. It’s a federal law that recognizes an expectation of privacy in internet content. If the government seeks a warrant for a defendant’s content pursuant to the SCA, the defense can contest that warrant pursuant to the legal framework that created the warrant requirement.
There is also an alternative: dive deeper into Carpenter’s recognition of the property test.*** The Fourth Amendment applies not only because of a reasonable expectation of privacy, but because the content belongs to the owner of the account. In 2024, a social media account is a person’s digital “papers and effects,” and thus the Fourth Amendment allows a challenge to its search and seizure.
It’s right there in the Bill of Rights.
* See: the next paragraph.
** As I mentioned, there are faults. There are also ongoing attempts to challenge this limitation. See Privacy As Privilege: The Stored Communications Act and Internet Evidence, Privacy Asymmetries: Access to Data in Criminal Defense Investigations, and Equalizing Access to Evidence: Criminal Defendants and the Stored Communications Act.
*** For the uninitiated, see Carpenter v. United States, 585 U.S. 296, 387 (2018) (Gorsuch, J., dissenting).
Expert Opinions
We’ve invited specialists in digital forensics, surveillance, and technology to share their thoughts on current trends and legal issues. This month’s article is a collaboration between guest columnists Cynthia Conti-Cook and Megan Graham.
Working Together to Prevent a Future Legal AI-d
In the last couple of years, we have read many headlines about automated and artificial intelligence systems. While entertainment, coding, and call center jobs have already been targeted for “efficiency” reforms by corporate management and the AI industry, legal professionals have generally felt unthreatened. Discussions about the impact of AI on lawyers have focused more on the advantages of GenAI (large language model technologies that can generate writing from specific prompts) and the related ethical landmines. Yet recent trends in tech procurement across public sector services and public education require legal workers to pay attention too.
Government officials have already reduced or replaced public sector workers with AI and other automated corporate services. A new report [PDF] estimates that “essentially all 92 million low-income people in the U.S. states—everyone whose income is less than 200 percent of the federal poverty line—have some basic aspect of their lives decided by AI.” These decisions are often based on historical data, so they can replicate and amplify existing layers of harmful human bias [PDF]. When governments gut organized public sector staff with decades of deep expertise and connections to the communities they serve, no one is left to flag when new technology veers off the rails, when rights are systematically violated, or when the government overreaches. Low-income people are not only being deprived of benefits, and sometimes even wrongly accused of fraud; they are also losing even more power within local governance.
This is happening not just in social service agencies but now also to teachers who, like lawyers, have been considered irreplaceable because of the significant impact they have on our families’ and communities’ lives. An emerging EdTech industry promises to reshape the role of teachers “further away from being providers of knowledge and towards becoming learning facilitators.” Teachers, like defenders, know that the lives they impact and the work they do every day can never be replaced by a robot. Yet government austerity, efficiency, and streamlining of services are commonly invoked as justifications for drastic budget cuts and cruel compromises.
Defenders know this dance; they have long fought for caseload caps, salary parity, and resources that level the playing field in the courtroom for the accused. Most clearly at stake is the humanity of our clients and their ability to access and develop a relationship with legal counsel who can advocate on their behalf. But also at stake is the ability of legal service workers to leverage their collective power, organizing both for their working conditions and for the accused, so they can continue the daily work of defending democracy from policing and state power.
Workers across industries have begun to demand more agency over not just how technology is deployed on the job but whether it is deployed at all. Just as dockworkers and actors recently have, defenders should demand binding protections over whether and how GenAI and other AI systems are incorporated into their practices.
We know teachers are so much more than knowledge providers; likewise, what defenders do every day requires human connection, fierce empathy, sustained trusting relationships, understanding of factual context, creative human reasoning, and the courage to fight against abusive state force. In this new era of AI everything, organize to center the humanity required to defend not only the accused but also the public and democracy.
Cynthia Conti-Cook is the Director of Research and Policy at Surveillance Resistance Lab.
Megan Graham is an Associate Clinical Professor and the Director of the Technology Law Clinic at the University of Iowa College of Law.
Upcoming Events
December 4, 2024
Legal and Ethical Issues Around Vehicle Surveillance Systems (NYU School of Law Policing Project) (Virtual)
December 5, 2024
Policing Pregnancy: The Impact of Data and Surveillance on Reproductive Healthcare (NACDL) (Washington, DC)
December 5-6, 2024
Questioning Forensics: To Err is Human, To Do Science is… Still Human (Legal Aid Society) (New York, NY)
December 11, 2024
Cyber Security, Ethics and the Current State of Data Privacy (NYS Academy of Trial Lawyers) (Virtual)
December 17, 2024
Electronic Information and Crimes: Protecting Defendants and Protecting Rights (NYSBA) (Virtual)
December 18, 2024
AI and Legal Services: The Present and Future (PLI) (Virtual)
January 16, 2025
This is Not Your Father's AI: Newest Uses of AI in Law Practice (NYS Academy of Trial Lawyers) (Virtual)
January 30-31, 2025
NAPD Virtual Tech Expo (National Association for Public Defense) (Virtual)
February 17-22, 2025
AAFS Annual Conference - Technology: A Tool for Transformation or Tyranny? (American Academy of Forensic Sciences) (Baltimore, MD)
February 24-March 3, 2025
SANS OSINT Summit & Training 2025 (SANS) (Arlington, VA or Virtual)
March 17-19, 2025
Magnet User Summit (Magnet Forensics) (Nashville, TN)
March 24-27, 2025
Legalweek New York (ALM) (New York, NY)
March 31-April 3, 2025
Cellebrite Case-to-Closure (C2C) User Summit (Cellebrite) (Washington, D.C.)
April 24-26, 2025
2025 Forensic Science & Technology Seminar (NACDL) (Las Vegas, NV)
April 28-May 2, 2025
IACIS Collecting and Admitting Digital Evidence at Trial (IACIS) (Virtual)
Small Bytes
Meta AI is ready for war (The Verge)
Police Freak Out at iPhones Mysteriously Rebooting Themselves, Locking Cops Out (404 Media)
Police Use of Geofence Warrants Splits Courts Over 4th Amendment (Bloomberg Law)
Ben Horowitz’s cozy relationship with the Las Vegas Police Department aided a16z portfolio company Skydio (TechCrunch)
The WIRED Guide to Protecting Yourself From Government Surveillance (Wired)
More eyes in the sky: NYPD expanding use of 'drones as first-responders' (Gothamist)
Leaked Documents Show What Phones Secretive Tech ‘Graykey’ Can Unlock (404 Media)
Inside Clear’s ambitions to manage your identity beyond the airport (MIT Technology Review)
ICE Spent Millions on Phone Hacking Tech, Just In Time For Trump’s Mass Deportation Plans (Forbes)
Someone Made a Dataset of One Million Bluesky Posts for ‘Machine Learning Research’ (404 Media)