Consumer AI Spying, NYPD Transparency (Failures), Facial Recognition Bans, Jail Surveillance & More
Vol. 5, Issue 6
![Radio telescope pointed at the sky](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fb6e439b1-9e27-4bb0-be72-a996548713b3_4928x3280.jpeg)
June 3, 2024
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. Allison Young checks in on what fresh evidence may come from the rollout of AI to consumer devices. Shane Ferro examines the NYPD’s transparency failures. Joel Schmidt explains how municipalities are leading the way on facial recognition bans. Finally, our guest expert, Elizabeth Daniel Vasquez, spotlights the massive surveillance operation at New York City jails.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
AI Keeping Tabs on Everything You Do: Business as Usual.
Allison Young, Digital Forensics Analyst
This month, major tech players rushed to pull AI out of the cloud and into consumer devices, with Microsoft announcing a new tool that takes screenshots of what you do on the computer. And nearly everyone responded with “yikes.”
Microsoft Recall won’t be available (yet) on all Windows 11 computers. Where it is supported, it will save images of users’ screens, push that information into a searchable index, and chat with you about it. The data is supposed to stay on the computer for about three months.
Surveillance-as-a-feature isn’t a new concept and it’s getting bigger (or “worse” depending on your perspective). Let’s recap a few of our favorites:
Alexa - do I need to bring her up? You probably know that your voice recordings can be used for machine learning (aka AI, which she’s getting more of). This data is requested by law enforcement AND got Amazon into hot water for recording children. There’s a little something here for everyone to be afraid of.
Apple’s Siri, released in 2011, has long kept a watchful eye on your iPhone behavior, stating on its Legal page at least as far back as 2021: “Siri uses local, on-device processing to learn . . . your Safari browsing history, emails, messages, images, notifications, and contacts . . . and provide suggestions.” Phone extraction evidence contains plenty of signs that your phone knows you pretty intimately. If that doesn’t seem intrusive enough, the future may hold more data to pick through – Apple is rumored to be rebuilding Siri bigger, better, with fresh AI features.
Wait, Microsoft already did something similar? While much simpler than the proposed Recall tool, Windows 10 Timeline stores 30 days of activity history, sometimes including screenshots of that activity, and it is already a feature that forensic analysts examine in their investigations.
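To make the Timeline artifact concrete: the activity history lives in a SQLite database (commonly reported in forensic literature as `ActivitiesCache.db` under `%LocalAppData%\ConnectedDevicesPlatform\`). The sketch below uses an in-memory stand-in with a simplified, assumed schema — not the full real table layout — just to show the kind of triage query an examiner might run.

```python
import sqlite3
from datetime import datetime, timezone

# In-memory stand-in for ActivitiesCache.db. The table name and columns are a
# simplified assumption for illustration; the real schema has many more fields.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE Activity (
        AppId     TEXT,     -- JSON naming the application
        StartTime INTEGER,  -- Unix epoch seconds
        EndTime   INTEGER
    )
""")
con.execute(
    "INSERT INTO Activity VALUES (?, ?, ?)",
    ('[{"application":"notepad.exe"}]', 1717400000, 1717400300),
)
con.commit()

# Triage the activity history the way an examiner might: app, start, duration.
rows = list(con.execute(
    "SELECT AppId, StartTime, EndTime FROM Activity ORDER BY StartTime"
))
for app_id, start, end in rows:
    started = datetime.fromtimestamp(start, tz=timezone.utc)
    print(f"{started:%Y-%m-%d %H:%M}Z  {app_id}  ({end - start}s)")
con.close()
```

Even this toy version shows why the artifact matters: a per-app log of what ran and when, recoverable long after the user has moved on.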
Bossware is a more obvious example of “worse” surveillance tech: legal spyware that your employer can use to collect screen recordings and biometrics, all while hopefully not misusing or abusing that information.
I will absolutely be turning off these features as I catch them, but they can be a friend to digital forensics analysts when they’re enabled.
Devices store a large amount of sensitive data to provide you with even the simplest daily features, like autofill (how a form knows your name, birthdate, credit card info, address, and probably your SSN). We can often find screenshots of recently used apps on iPhones. Sometimes we can find the contents of deleted messages in logs copied from Android phones. Regardless of whether digital features like AI assistants share data with third parties (bad for privacy) or keep information encrypted on the device as Microsoft is proposing they will do with Recall (generally better for privacy), they generate digital evidence that can be used by the government, civilian investigators, and hackers for our respective purposes.
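As one illustration of how deleted content can linger in logs: the snippet below scans a made-up Android log excerpt for notification text. The log lines, package names, and message contents here are entirely hypothetical — real log formats vary by device, OS version, and app — but the pattern-matching approach mirrors how an analyst might hunt for message remnants.

```python
import re

# Hypothetical excerpt from an Android log dump. The format and package names
# are invented for illustration; real logs differ by device and OS version.
log_dump = """\
05-28 14:02:11.532  1234  1234 I NotificationListener: posted pkg=com.example.sms text="meet me at 5"
05-28 14:03:40.101  1234  1234 I NotificationListener: posted pkg=com.example.chat text="on my way"
05-28 14:05:02.887  2345  2345 D ActivityManager: unrelated entry
"""

# Pull out notification text, which can survive in logs even after the
# message itself is deleted from the app's own database.
pattern = re.compile(r'posted pkg=(\S+) text="([^"]*)"')
hits = pattern.findall(log_dump)
for pkg, text in hits:
    print(f"{pkg}: {text}")
```

The takeaway is the same as in the paragraph above: convenience features scatter copies of your data across the device, and each copy is a potential exhibit.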
There have been many excellent articles addressing the legal challenges of dealing with AI-generated data, like deep-fake videos and half-baked legal filings. Don’t forget that real, useful, marketable, smart AI is going to want to know everything about you. If you end up in court, we may get a copy of that information, too.
The NYPD Continues to Hoover Up New Tech While Ducking Transparency Requirements
Shane Ferro, Digital Forensics Staff Attorney
Last week, the NYPD Office of the Inspector General (OIG-NYPD) released its annual(ish) review of the NYPD’s compliance with the POST Act—the city law that requires the Department to both announce and hold a public comment period before using any new surveillance technology (full PDF Report). The report focuses heavily on the five new policing technologies the NYPD acquired last year, sprung on the city by Mayor Adams last April. The OIG’s report responds directly to the letter [PDF] Legal Aid sent detailing how the new surveillance technologies announced by Mayor Adams violated the POST Act.
The report concludes that the NYPD is still improperly “grouping” its surveillance technologies instead of issuing specific impact and use policies (IUPs) for each new technology it acquires. The report states that the grouping allows the NYPD to avoid the kind of transparency about its technological capabilities that the POST Act sought to require, to the detriment of public accountability:
NYPD’s grouping approach creates a risk that individual technologies may be shielded from public scrutiny and oversight, limiting the transparency about these technologies that the POST Act sought to create. To the extent that grouped technologies are unique, this approach deprives members of the public of an opportunity for notice and comment with respect to the applicable IUP, and makes it more difficult for the public to discern the capabilities and use of the technologies and the policies applicable to them.
The report concludes that to be in compliance with the POST Act, the NYPD needs to issue a completely new IUP for the Digidog, and substantially update its existing IUPs to actually provide the required information for other technologies it has added, including the Times Square Robot (RIP, already), the StarChase GPS tracker guns, digital fingerprint scanners, and the Department’s internal augmented reality app.
Meanwhile, in more recent months, Mayor Adams has announced yet more cop technologies the city is adding to its arsenal—regardless of whether those technologies are actually helpful or made for the use case that the mayor wants to hold a press conference about.
The NYPD did actually release a draft IUP [PDF] for the subway “weapons scanners” (er, fancy metal detectors) that Mayor Adams announced in March. Never mind that the scanner company is publicly against this “use case”—possibly in connection with the lawsuits the company currently faces over misleading investors about the tech’s capabilities—Adams has a vibe that this is what the subway needs. Beyond the technology’s ineffectiveness, LAS’s public comments [PDF] on the draft IUP show that the NYPD’s hand wave at POST Act compliance fails to live up to the transparency required under the law.
Before the weapons scanners have even been rolled out, Mayor Adams is on to a new shiny cop tech toy: FUSUS video systems that allow the police to tap into private video feeds and access city-wide surveillance from almost anywhere. Is it effective? No one is quite sure. But it is shiny, new, and a reason for the Tech Bro Mayor to throw himself another pat-on-the-back press conference.
Policy Corner
Municipalities Lead the Way on Facial Recognition Bans
Joel Schmidt, Digital Forensics Staff Attorney
In 1964, Teaneck, New Jersey – located eight miles from midtown Manhattan – became the first majority white municipality in the nation to voluntarily desegregate its schools through busing. In 2021, Teaneck made history again when it became the first municipality in New Jersey to outright ban the use of facial recognition technology by passing an ordinance making it “unlawful for any municipal entity to obtain, retain, access or use facial recognition surveillance technologies.”
“Now is not the time to let this technology in our municipality,” said then Teaneck Councilman Keith Kaplan. “Some of the people being targeted have absolutely no connection to a crime and are incarcerated based on an algorithm. We need to protect the civil rights of our residents.”
Teaneck is in good company. Nationwide over twenty municipalities, including Boston, Pittsburgh, San Francisco, and King County, Washington (home to Seattle), have banned the use of facial recognition by law enforcement, and Vermont, Maine, and Montana all have significant statewide restrictions on law enforcement use of facial recognition. For good reason. The technology is too unreliable, especially for women and people of color, and the risk of a false match is unacceptably high.
The Washington Post, however, recently reported on the disturbing practice of some police departments flouting facial recognition bans by asking other law enforcement entities not subject to the ban to run their searches for them. For example, when Austin’s city council unanimously passed its ban on law enforcement use of facial recognition technology, it also prohibited the use of “information obtained” by such technology, specifically barring the use of facial recognition results produced by other law enforcement agencies not subject to the ban.
Yet that did not stop the Austin Police Department from requesting, and making use of, facial recognition matches performed for it by a police department in a neighboring community about thirty miles north of Austin. “Hello sir, I was referred to you by our Robbery Unit who advised me you are able to do facial recognition,” wrote one Austin detective to the neighboring police department. “I am working a case where I am trying to identify a suspect and was curious if you might be able to help me out with it.”
Determining the scope of these facial recognition ban violations is difficult because, nationwide, the use of facial recognition is rarely mentioned in police reports or introduced as evidence at trial. Sadly, the very agencies in charge of enforcing the law can break the law and frequently keep it secret.
Notwithstanding these police violations, having in place a facial recognition ban is an important first step in curtailing the societal harms caused by facial recognition. A bill currently before the New York State Legislature would ban the law enforcement use of facial recognition technology in New York State. The Legal Aid Society supports [PDF] this bill and encourages the New York State Legislature to pass it.
Expert Opinions
We’ve invited specialists in digital forensics, surveillance, and technology to share their thoughts on current trends and legal issues. Our guest columnist this month is Elizabeth Daniel Vasquez of Brooklyn Defender Services.
The Community Spying Network No One is Talking About
We have all experienced the heartbreak: a fully formed case theory and hours of trial prep shattered against the ping of the DA’s email: I’ve uploaded your client’s jail call recordings.
While we are all familiar with the impact that our clients’ calls have on our cases, most of us have not spent much time thinking about the technical system that generates those recordings. We should.
Here’s what you need to know:
Since 2014, the New York City Department of Correction (DOC) has contracted with Securus Technologies, Inc., to administer its communications surveillance system. Founded in 2004, Securus’s first major product offering a few years later—the Secure Call Platform (SCP)—transitioned call mediation from analog to digital. The switch to digital opened up extensive data possibilities. Securus began to not only record every call on its system, but also database those call recordings.
Digital databasing. Instead of a local database, digital transmission paved the way for Securus’s web- and cloud-based system. “NextGen SCP” houses multiple streams of data, including: (1) call recordings; (2) machine transcriptions of those recordings; (3) client data provided by DOC; and (4) billing name and address records for call recipients’ landline phones.
Voiceprinting. Securus’s other innovation was to incorporate continuous voice verification into its systems. A form of biometric identification, voice verification digitally captures and maps unique patterns in voice and speech, generating “voiceprints.” Those voiceprints can then be applied to a call recording to identify call participants. Securus has emphasized that it intended voiceprinting to collect data not just on those incarcerated, but on every called party.1
AI Transcription. Finally, Securus also provides a product that transcribes call recordings to text. This feature makes the entire call recording database word or term searchable.
Consequently, the identity of all call participants, the audio recording of all calls, and the searchable-text content of those calls are aggregated and searchable for every call made by anyone who is unlucky enough to be unable to afford bail.
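Securus’s actual system is proprietary, but the reason transcription changes everything is easy to demonstrate: once speech becomes text, an archive of recordings becomes a term-searchable database. The toy inverted index below (all records invented) sketches that shift.

```python
from collections import defaultdict

# Toy call-transcript records — entirely made up for illustration; this is
# not Securus's data model or software.
transcripts = {
    "call_001": "can you put money on my commissary account",
    "call_002": "the lawyer said the hearing moved to tuesday",
    "call_003": "ask her to add money to the account before friday",
}

# Build an inverted index: term -> set of call IDs whose transcript contains it.
index = defaultdict(set)
for call_id, text in transcripts.items():
    for term in text.split():
        index[term].add(call_id)

def search(term):
    """Return the IDs of every call whose transcript contains `term`."""
    return sorted(index.get(term.lower(), set()))

print(search("money"))    # every call that mentions money
print(search("tuesday"))
```

With an index like this over millions of recordings, a single keyword query surfaces every conversation that ever touched a topic — which is precisely what makes the aggregation described above so powerful, and so dangerous.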
But it does not stop with communications. In the background of its databases, Securus also collects other data through its other data streams. For example, Securus not only provides call services to DOC, but it also owns JPay, a system used by NYC to allow family members to put money on their loved ones’ jail commissary accounts. That financial transaction data, along with video visitation data, educational tablet data, and more all get databased by Securus.
This surveillance reality has not always existed in New York City. In fact, universal jail call recording only began here in 2008. And Securus was only brought to New York City in 2014. For decades before that, law enforcement was able to record jail calls, but only with an eavesdropping warrant.
Today, all of this surveillance occurs without any requirement of individualized suspicion, court oversight, or a warrant. And it appears that the NYPD has direct access.
THREADS. Securus has another product offering, built specifically for law enforcement: THREADS. THREADS combines the data in Securus’s SCP with the complex analytics tools necessary for mining that data. Securus sells subscription-based access to THREADS directly to law enforcement and intelligence agencies. THREADS gives law enforcement access to data aggregated from across Securus’s “nationwide community.”2
In 2014, when Securus was originally pitching DOC, the company touted its relationship to NYPD, flagging that the police department already had a THREADS subscription even prior to DOC’s adoption of Securus’s system.3
Fusion Center and Fusion Team. In fact, as early as 2007, the NYPD began trying to obtain intelligence from Rikers Island, executing a Memorandum of Understanding (“MOU”) between DOC and the NYPD, as well as the Department of Probation. The 2007 MOU’s stated objective was to: “share certain information electronically . . . to aid in the detection, investigation, and prevention of criminal activity . . . and to support law enforcement activities.”4 This information specifically included DOC’s phone and visitor data.
By 2009, the NYPD had taken the intelligence relationship further, establishing a presence on Rikers Island at a new information-sharing hub:5 the Rikers Island Fusion Center.6
Based on references in DOC’s disclosures and in discovery received in criminal cases, the NYPD formed a dedicated team from its Real Time Crime Center as the “Rikers Fusion Team.” This Team appears to have direct access to Securus’s databases.
It is past time for this community spying program to be exposed and ended.
To that end, Brooklyn Defenders, Bronx Defenders, New York County Defenders, and Cleary Gottlieb filed a class action lawsuit against the Department of Correction in April. If you want to read more about this issue, check out the petition here [PDF] or my affidavit explaining the system here [PDF].
If you want to talk about how this project impacts your cases directly, please reach out.
Elizabeth Daniel Vasquez is the Director of the Science & Surveillance Project and the Forensic Practice at Brooklyn Defender Services.
Upcoming Events
June 4-6, 2024
Techno Security East 2024 (Wilmington, NC)
June 24-25, 2024
Law Enforcement Use of Predictive Policing Approaches: A Workshop (National Academies of Sciences, Engineering, and Medicine) (Washington, D.C. and Virtual)
June 25, 2024
Intelligent Application of AI: Risks and Rewards (New York State Academy of Trial Lawyers) (Virtual)
July 12-14, 2024
HOPE XV (Queens, NY)
July 22-25, 2024
2024 HTCIA Global Training Event (Las Vegas, NV)
August 8-11, 2024
DEF CON 32 (Las Vegas, NV)
September 16-18, 2024
Techno Security West 2024 (Pasadena, CA)
October 14-15, 2024
Artificial Intelligence & Robotics National Institute (ABA) (Santa Clara, CA)
October 19, 2024
BSidesNYC (New York, NY)
Small Bytes
A Face Recognition Firm That Scans Faces for Bars Got Hacked - and That’s Just the Start (Wired)
An AI tool used in thousands of criminal cases is facing legal challenges (NBC News)
America’s Biggest Mall Owner Is Sharing AI Surveillance Feeds Directly With Cops (Forbes)
New Police Tech Can Detect Phones, Pet Trackers And Library Books In A Moving Car (Forbes)
NYPD will deploy drones to respond to 911 calls in 5 NYC precincts, officials say (Gothamist)
Here Is What Axon’s Bodycam Report Writing AI Looks Like (404 Media)
Librarians Are Waging a Quiet War Against International “Data Cartels” (The Markup)
The Chicago City Council is pushing to keep ShotSpotter technology despite Mayor Brandon Johnson’s effort to get rid of it (WBEZ Chicago)
A Leak of Biometric Police Data Is a Sign of Things to Come (Wired)
Mayor Whitmire to scrap $3.5M ShotSpotter program, calling it a ‘gimmick’ conceived by contractors (Houston Chronicle)
Securus, Securus’ JLG Technologies Releases Investigator Pro 4.0 (Sept. 7, 2016), https://securustechnologies.tech/securus-jlg-technologies-releases-investigator-pro-4-0/ (“[T]he new software gives investigators the ability to select a voice sample from either the Incarcerated Individual or called party side of an Incarcerated Individual’s telephone call and then use that sample to search for all other calls where that voice occurs. . . . The searchable voice feature makes it possible to follow the individual voice, not just the PIN/ID or telephone numbers. An investigator can now answer questions like these: What other Incarcerated Individuals are talking to this particular called party? Was this called party ever an . . . Incarcerated Individual?”).
FCC Ex Parte Submission, WC Docket No. 17-126, ITC-T/C-20170511-00094, ITC-T/C-20170511-00095, at 21, 24, 30 (Aug. 2, 2017). To create this “community,” Securus formally asks its correction facility customers to opt in to sharing their facility’s data. This data includes such varied streams as the content of billions of personal calls, video visitation data, financial information, biometric data, and location data. More than half of the unique individuals whose information has been collected in this system were not incarcerated when the data was collected. Id. at 24.
DOC Response to FOIL Request submitted by Elizabeth Daniel Vasquez, dated October 10, 2023. (Attachment: Securus - BAFO, at 3).
DOC Response to FOIL Request submitted by Elizabeth Daniel Vasquez, dated October 10, 2023. (Attachment: Memorandum of Understanding Between New York City Police Department and New York City Department of Correction (Oct. 24, 2007)).
Following September 11th, the United States Department of Homeland Security created Fusion Centers to serve as network access points for the different law enforcement agencies operating in a jurisdiction: local, state, and national. Various law enforcement entities sit in common space within a fusion center and share database access credentials with each other. For New York City jails that is—at least—DOC, the NYPD, and federal law enforcement agencies.
See @CorrectionsNYC, Twitter (Oct. 30, 2018, 3:25pm), https://twitter.com/CorrectionNYC/status/1057352758087598081; New York City Council, Hearing of the Committee on Criminal Justice, Testimony of Commissioner Cynthia Brann (March 14, 2019), https://www.nyc.gov/site/doc/media/march-14-testimony.page. The only other traces of the Rikers Island Fusion Center’s existence come from NYPD police reports received in criminal discovery, and periodic social media mentions by staff. Its exact origin, purpose, or founding date are not publicly disclosed.