October 2, 2023
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. In this issue, Benjamin Burger highlights a new book on facial recognition technology. Shane Ferro looks into the NYPD’s purchase of a “crime-fighting” robot. Diane Akerman discusses the newly enacted prohibition on the use of facial recognition in schools in New York State. Finally, Chris Pelletier explains how AirTags work and their forensic significance.
The Digital Forensics Unit of the Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of the Legal Aid Society.
In the News
Limitless Facial Recognition
Benjamin S. Burger, Digital Forensics Staff Attorney
In 2020, reporter Kashmir Hill (full disclosure: Hill has previously written about the Legal Aid Society’s Digital Forensics Unit) wrote a New York Times article revealing law enforcement’s extensive use of a private facial recognition program, Clearview AI. Clearview AI differed from “traditional” facial recognition systems, which usually search databases filled with mugshots, driver’s license photos, and other government-compiled images. Instead, Clearview AI’s database consisted of images scraped directly from the Internet, particularly social media sites like Facebook. Police agencies across the country - including the NYPD - purchased the service.
Hill has now authored a book about Clearview AI, its founders, and the implications of the unfettered use of facial recognition. In Your Face Belongs To Us: A Secretive Startup’s Quest To End Privacy As We Know It, Hill expands on her previous article, explaining how Clearview AI was created and marketed to law enforcement agencies. The book also highlights the lack of regulation surrounding the use of facial recognition, and the accompanying loss of privacy in public spaces.
This is particularly true in New York City, where Mayor Adams has embraced surveillance technology and deployed it without any public discussion or regulation. We know that facial recognition technology is imperfect and has falsely identified people - primarily Black men - as perpetrators of crime. Doxxing random people on social media is not only condoned by platforms like TikTok but celebrated by some users. Stepping outside the home and into a public area now subjects a person to being photographed, recorded, and identified without any consent - all without regulation or constraints on law enforcement and private actors. It is past time for our laws to catch up with the ubiquity of public surveillance. As Supreme Court Justice Louis Brandeis explained: “The makers of the Constitution conferred the most comprehensive of rights and the right most valued by all civilized men—the right to be let alone.”
NYPD Acquires Crime Roomba for Times Square Subway Station
Shane Ferro, Digital Forensics Staff Attorney
This month, for the second time this year, Mayor Adams held a press conference to announce that a new robot cop would be patrolling Times Square. This announcement (once again) violated the POST Act [PDF], which requires the NYPD to announce any new surveillance technologies 90 days before implementing them and to open a 45-day comment period for the public before any new surveillance technology is put into action.
The Mayor’s press conference was both baffling and cringe. He did heart hands with the robot (it has neither arms nor hands). He repeatedly stated that the robot costs the city $9 an hour and he hopes that it will replace police officer jobs in the future. (The robot operates “…below minimum wage – no bathroom breaks, no meal breaks,” Adams said according to the Knightscope press release.)
Adams described how the NYPD’s crime roomba would, for the moment, need to be followed around everywhere it goes by a real, live, more-than-$9-an-hour Technical Assistance Response Unit (TARU) officer. He failed to justify the value-add of a 420-pound, camera-covered robot with a top speed of 3 mph in a subway station already blanketed with surveillance cameras. It also seems that the robot cannot navigate stairs - so to the extent it is patrolling the “station,” that means the main, well-lit level of the station and not the actual platforms.
At the press conference, Adams and the NYPD said that for the next two months the robot will patrol the Times Square “station” with a TARU officer from 12am to 6am every night.
TikTok user brennalip took a late-night trip to the station to see the cop R2D2 in action (hat tip to Jennvine Wong). She found that the robot was nearly an hour late to its shift, and neither the robot nor the human officers were able or willing to help a trans woman who said she was being followed and harassed. The officers did seem to have plenty of time to be on their phones (the robot doesn’t have a phone, but technically contains a phone).
A man-in-the-station interview revealed that the average New Yorker thinks that the robot “is not going to stop crime at all … it looks like an egg.”
Spokespeople for the company that makes the robot were unwilling to answer questions for the camera. The five-minute video captures only one instance of the police policing that night: near the end of the video, one officer writes a ticket for fare evasion.
Policy Corner
New York State Bans Facial Recognition in Schools
Diane Akerman, Digital Forensics Staff Attorney
In a long overdue decision, New York State Education Department Commissioner Betty A. Rosa issued a determination [PDF] on September 27 banning the use of facial recognition technology (FRT) in schools. The ban applies to all schools statewide, and leaves room only for counties to implement the use of certain other biometric identifying technologies.
In December 2020, then-Governor Cuomo signed S5140B/A06787 into law, banning the use of facial recognition and other biometric tracking until July 1, 2022. The law directed the commissioner to complete a study, and if the study determined the benefits of use outweighed the risks, to provide specific guidelines to mitigate the possible dangers.
The final report [PDF] was issued in August 2023 and included guidance and conclusions that would be unsurprising to anyone who reads this newsletter (or Kashmir Hill’s new book). The report found that there are “discernable risks to the use of FRT in a school setting” and that those risks outweigh any documented benefits. It also drew a significant distinction between FRT and the use of biometrics for “one-to-one device management,” finding the latter kind of use to be of less concern. Finally, the report noted, sensibly, that technology is constantly changing, so its conclusions should be regularly reevaluated.
In the determination, the commissioner adopted the report’s findings. The Department noted there were “serious concerns regarding the use of FRT in schools, including…the higher rate of false positives for people of color, nonbinary and transgender people, women, the elderly and children” that were not “outweighed by the claimed benefits,” particularly as “little information is available about real life situations where such technology detected and helped prevent violent incidents.”
This commonsense decision is a long time coming and reiterates the now well-known flaws of facial recognition technology. Yet, in spite of the copious literature and anecdotal evidence of the harms of FRT, the city and state have been slow to address its use at private sporting events or in criminal prosecutions. Biometric ban bills have languished at the state and city level for years, while the technology continues to cause real-life harm to individuals. This ban is a small step in the right direction.
Ask an Analyst
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. What is an Apple AirTag and how does it work?
A. An Apple AirTag is a round, battery-powered device used to track the location of any object it’s attached to. It is about the size of a quarter and can be attached to just about anything you would like to keep track of, such as key chains, briefcases, animal collars, luggage, bicycles, and TV remotes. It can also simply be placed in a pocket, in a wallet or purse, or in the center console or glove compartment of a vehicle, allowing that object to be tracked. AirTags are not the only trackers on the market; similar devices include Tile trackers, which can be used with both Apple and Android phones.
AirTags use Bluetooth Low Energy wireless signals (which have a range of roughly 30 feet) to communicate with other devices, and this communication is how they track location. When it’s moved away from its owner’s phone, an AirTag anonymously communicates with any iPhone in range, and every iPhone it communicates with relays the tag’s location to Apple’s “Find My” network. The Find My network consists of all iPhones in use worldwide, except for those with this feature disabled. Given the sheer number of iPhones in use, this creates a vast network with very dense coverage.
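Conceptually, the relay works like a crowdsourced lost-and-found: any passing iPhone within Bluetooth range quietly reports the tag's position upward. The toy simulation below illustrates that idea only - the names, classes, and plain-text tag ID are hypothetical, and Apple's real protocol uses rotating, encrypted identifiers and end-to-end encryption rather than anything this simple.

```python
from dataclasses import dataclass
from math import dist

BLE_RANGE_FT = 30  # approximate Bluetooth Low Energy range

@dataclass
class AirTag:
    tag_id: str            # real AirTags broadcast rotating encrypted IDs, not names
    position: tuple        # (x, y) in feet, for this simulation only

@dataclass
class IPhone:
    position: tuple
    find_my_enabled: bool = True  # owners can opt their phone out of the network

def relay_sightings(tag, phones, network):
    """Each participating iPhone in BLE range anonymously reports the tag."""
    for phone in phones:
        if phone.find_my_enabled and dist(phone.position, tag.position) <= BLE_RANGE_FT:
            # The relaying phone reports its own location as the tag's location
            network[tag.tag_id] = phone.position

network = {}                                         # stands in for Apple's Find My service
tag = AirTag("keys", (100, 100))
passersby = [IPhone((90, 95)), IPhone((500, 500))]   # one in range, one far away
relay_sightings(tag, passersby, network)
print(network["keys"])                               # owner sees the last relayed location
```

The key point for practitioners is that the tag itself has no GPS or internet connection; every location report comes from a stranger's phone that happened to pass nearby.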
Setting up an AirTag is easy. A plastic tab is removed to enable the battery, and the AirTag is activated by placing it next to the owner’s iPhone. A prompt on the phone will ask the user if they would like to pair with the AirTag. During this process, the AirTag is associated with the Apple ID of the setup device, and the user can name the AirTag and associate an emoji with it.

Once the AirTag is set up, its location may be monitored using the Find My app on the user’s iPhone, which will show its location on a map. The Find My app also allows a user to make the AirTag play an alert sound so it can be located if it’s nearby; if it’s farther away, the app can provide directions to the tag. A user can also set notifications to flag when an AirTag is no longer with them, or enable “Lost Mode,” which will have the tag play an alert and send a message to any iPhone it connects with. Also, if an AirTag has been separated from its owner for an extended period (between 8 and 24 hours), it will play an alert to let someone know that an AirTag might have been placed with them, since these devices have been used to stalk people. Android phones can also detect unwanted AirTags using an unknown tracker alert feature.
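The anti-stalking alert described above boils down to a timing rule: a tag that has been away from its owner long enough warns whoever it is traveling with. A minimal sketch, assuming only the publicly documented 8-to-24-hour window (the exact trigger time is randomized by Apple and not public; the function name is hypothetical):

```python
import random

SEPARATION_ALERT_MIN_HOURS = 8   # Apple documents an 8-24 hour window;
SEPARATION_ALERT_MAX_HOURS = 24  # the precise trigger time is randomized

def should_play_alert(hours_away_from_owner, threshold_hours):
    """A tag separated from its owner longer than the (randomized)
    threshold plays a sound to warn anyone it may be traveling with."""
    assert SEPARATION_ALERT_MIN_HOURS <= threshold_hours <= SEPARATION_ALERT_MAX_HOURS
    return hours_away_from_owner >= threshold_hours

# Each tag draws its own threshold somewhere inside the window
threshold = random.uniform(SEPARATION_ALERT_MIN_HOURS, SEPARATION_ALERT_MAX_HOURS)
print(should_play_alert(6, threshold_hours=8))    # a 6-hour commute never triggers it
print(should_play_alert(26, threshold_hours=24))  # more than a full day always does
```

The randomized threshold matters forensically: the absence of an alert within the first several hours of tracking tells you nothing about whether a tag was present.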
AirTag evidence may be relevant in cases where someone is accused of stealing property. In New York City, Mayor Eric Adams and the NYPD provided free AirTags to help generate leads on stolen vehicles. Under certain circumstances, depending on iPhone model and iOS version, it may be possible to obtain information about an AirTag’s past locations through forensic analysis of the “Find My” app on the AirTag owner’s iPhone. AirTags associated with a device may also provide insight into other sources of relevant data, such as iPads or computers to which a tracker has been paired.
Chris Pelletier, Digital Forensics Analyst
Upcoming Events
October 10, 2023
AI Admissibility and Use at Court Hearings and/or Trials (NYSBA) (Virtual)
October 11, 2023
AI 101 For Lawyers: How To (And Not To) Use ChatGPT (NYS Academy of Trial Lawyers) (Virtual)
October 19, 2023
The Ethics of Social Media Use by Attorneys (NYSBA) (Virtual)
Precautions for Proper Protection: Client Communications, Data, and Cybersecurity (NYS Academy of Trial Lawyers) (Virtual)
Artificial Intelligence and the Law (Queens County Women’s Bar Association) (Virtual)
October 26, 2023
Facing A Discovery Dump: Organizing Your Criminal Defense from the Start (NYSDA) (Virtual)
February 8, 2024
Digital Part I: Geofence Warrants (NYSDA) (Virtual)
February 14-17, 2024
ABA TECHSHOW 2024 (ABA) (Chicago, IL)
February 29, 2024
Digital Part II: Cell Site Location Information (“CSLI”) and Call Detail Records (“CDR”) (NYSDA) (Virtual)
April 18-20, 2024
Making Sense of Science XVII: Forensic Science & the Law (NACDL) (Las Vegas, NV)
Small Bytes
Police real-time crime centers are becoming data powerhouses (StateScoop)
Streetlights as Spyware (Tech Policy Press)
As car theft spikes, NYPD deploys a vehicle with a license plate reader in every precinct (Gothamist)
NYPD spent millions to contract with firm banned by Meta for fake profiles (the Guardian)
Axon’s Ethics Board Resigned Over Taser-Armed Drones. Then the Company Bought a Military Drone Maker (The Markup)
New Orleans DA Fights ‘Terrorism’ on Streets with AI Spycraft (Wall Street Journal)
How a ‘Digital Peeping Tom’ Unmasked Porn Actors (Wired)
iOS 17: iPhone Users Report Worrying Privacy Settings Change After Update (Forbes)
The Maker of ShotSpotter Is Buying the World’s Most Infamous Predictive Policing Tech (Wired)