Phone Hacking, Facial Recognition Regulation, BWC Audit Trails Decision, Surveillance Video & More
Vol. 5, Issue 1
January 8, 2024
Welcome to Decrypting a Defense, the monthly newsletter of the Legal Aid Society’s Digital Forensics Unit. Allison Young discusses a cell phone hacking dispute. Joel Schmidt explains how the Federal Trade Commission regulates facial recognition. Shane Ferro reviews a recent body-worn camera audit trails decision. Finally, Lisa Brown answers a question about video surveillance systems.
The Digital Forensics Unit of The Legal Aid Society was created in 2013 in recognition of the growing use of digital evidence in the criminal legal system. Consisting of attorneys and forensic analysts, the Unit provides support and analysis to the Criminal, Juvenile Rights, and Civil Practices of The Legal Aid Society.
In the News
Forensic Software Vendor in “Slozhnaya Situatsiya” for Code Allegedly Lifted from Competitor
Allison Young, Digital Forensics Analyst
Digital forensics tools are at the center of a dispute filed last month in Russia that involves Elcomsoft and MKO-Systems / Oxygen Forensics.
The forensic companies offer assorted cell phone forensic hacking products, similar to those provided by Grayshift / Magnet and Cellebrite, that allow law enforcement and investigators to access data on mobile devices. According to Forbes, Elcomsoft alleges that MKO-Systems stole proprietary code to hack iPhones running iOS 16. Elcomsoft also alleges that Oxygen Forensics (linked to MKO through the two Russian entrepreneurs who helped set up both companies) uses this same code in its iPhone capabilities.
Cellphone hacking can be conducted for legitimate (and legal) reasons, one of which is preserving digital evidence for civil, criminal, and internal investigations. Many tool vendors sell only to customers they’ve vetted first, because hacking tools often exploit dangerous “zero-day” vulnerabilities – that is, weaknesses in technology that have been discovered but not yet reported to the vendor. Despite this vetting, cellphone hacking tool vendors often test ethical boundaries: some provide their tools to countries known to violate human rights, and some have potentially broken US computer laws while pitching their products to US agencies.
In this instance, there is a weakness in iOS 16 that allows the forensic software to copy data that is normally retained only on the phone, which can include encrypted chats and logs of phone use.
Elcomsoft was itself the target of a significant intellectual property dispute over 20 years ago. When one of its software developers was arrested at DEF CON in Las Vegas (we’ve been there! DEF CON, that is...) for bypassing copyright protections on Adobe files, it resulted in an early Digital Millennium Copyright Act (DMCA) case (DMCA is also known as that law that John Deere and Apple use to try to keep people from fixing their own property).
Will the outcome of this case affect defenders who have had Elcomsoft or Oxygen tools used in their cases? It’s my opinion that it will not, although it may have ramifications for how law enforcement extractions are performed.
Oxygen and Elcomsoft both provide more tools than just the iOS 16 exploit, covering other areas such as cloud collection and computer analysis. U.S. software customers have grown increasingly wary of working with businesses headquartered under Russian jurisdiction. Oxygen’s software (like Elcomsoft’s) is historically Russian-made, although its offices are in Virginia. By resurfacing these geopolitical concerns, the lawsuit may make investigators wary of purchasing from either company.
The situation also highlights a baked-in problem with the digital forensic field which likely will never be resolved: transparency into what makes these tools tick. While the general idea of phone extraction can be explained by an expert, rarely does one have access to the step-by-step method by which the extraction is executed. This was an issue with GrayKey’s literal “black box” technology and is the insurmountable compromise when seeking out digital evidence to support guilt or innocence.
Rite Aid Agrees to Settle Federal Trade Commission Complaint that it Engaged in Impermissible Facial Recognition Practices
Joel Schmidt, Digital Forensics Staff Attorney
Last May, the Federal Trade Commission issued a warning to companies engaged in the practice of capturing customers’ biometric information that doing so “raises significant consumer privacy and data security concerns and the potential for bias and discrimination.” Among the concerns highlighted in the warning was the concern that “some technologies using biometric information, such as facial recognition technology, may have higher rates of error for certain populations than for others.”
Companies would be well advised to heed that warning. Last month, the Federal Trade Commission announced that Rite Aid had agreed to settle FTC allegations it violated a 2010 FTC data privacy order. According to the FTC, from 2012 through 2020, Rite Aid “recklessly” employed a facial recognition system in its stores that routinely falsely accused its customers of shoplifting by erroneously matching them to individuals on a company-maintained list of people Rite Aid believed were likely to be shoplifters or commit other crimes.
According to the complaint [PDF] filed in federal court, “In whole or in part due to facial recognition match alerts, Rite Aid employees took action against the individuals who had triggered the supposed matches, including subjecting them to increased surveillance; banning them from entering or making purchases at the Rite Aid stores; publicly and audibly accusing them of past criminal activity in front of friends, family, acquaintances, and strangers; detaining them or subjecting them to searches; and calling the police to report that they had engaged in criminal activity.” Yet, according to the complaint, in thousands of those cases the facial recognition matches were false positives.
The FTC claims Rite Aid’s facial recognition system was so bad that in over two thousand cases, matches to one person occurred so close in time to each other in stores so far away from each other that “it was impossible or implausible” that the same person could have been accurately matched to both stores.
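The complaint does not describe how the FTC computed this, but the arithmetic behind such a plausibility check is simple: measure the distance between the two stores, then calculate how fast a single person would have had to travel to appear at both. A minimal sketch (the store coordinates and timestamps below are made up for illustration):

```python
from datetime import datetime
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def implied_speed_mph(t1, t2, miles):
    """Speed one person would need to trigger matches at both locations."""
    hours = abs((t2 - t1).total_seconds()) / 3600
    return float("inf") if hours == 0 else miles / hours

# Hypothetical example: "matches" in New York City and Philadelphia,
# thirty minutes apart -- roughly 80 miles, implying well over 150 mph.
distance = haversine_miles(40.7128, -74.0060, 39.9526, -75.1652)
speed = implied_speed_mph(
    datetime(2020, 1, 1, 12, 0), datetime(2020, 1, 1, 12, 30), distance
)
```

If the implied speed exceeds anything a person could plausibly travel, the two matches cannot both be the same individual, so at least one is a false positive.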
These wrong matches were alleged [PDF] to disproportionately affect people of color and were so inaccurate that in one instance a Black woman was falsely matched with a person in the Rite Aid database described as “a white lady with blonde hair.” The police were called and the woman was asked to leave before anyone realized it was a false match.
Children were also not immune. The complaint describes an 11-year-old girl who was so distraught after being falsely accused of being a match to someone in the Rite Aid database that the child’s mother told Rite Aid she had to miss work as a result.
Rite Aid has denied the allegations, but per the terms [PDF] of the proposed settlement Rite Aid will, inter alia, 1) be banned from using facial recognition technology for a period of five years, 2) delete all images and data associated with the facial recognition system in use from 2012 to 2020, 3) notify customers if their biometric information is ever entered into a database, 4) investigate and respond to biometric-related complaints, 5) provide clear and conspicuous notice to customers if in the future they ever choose to use facial recognition after the five year ban, 6) delete any biometric information it obtains pursuant to such a program within five years of its collection, and 7) implement a data security program to safeguard customers’ information.
Hopefully, other companies will take notice of this settlement and completely eliminate improper and unlawful biometric collection practices.
In the Courts
Yet Another Court Holds BWC Audit Trails Discoverable — With Extensive Factual Conclusions
Shane Ferro, Digital Forensics Staff Attorney
Last month, the Digital Forensics Unit scored a big win in our quest to get DA’s offices to turn over body-worn camera audit trails (also known as audit logs) in discovery. In People v. Ballard, Judge Gershuny of Queens Criminal Court ruled that BWC audit logs are discoverable under four separate subsections of CPL 245.20. 2023 NY Slip Op 23392 (Dec. 14, 2023).
The ruling came after an extensive evidentiary hearing in which the prosecution called Allison Arenson, executive agency counsel and Director of the NYPD’s BWC Unit Legal Bureau, to testify about what BWC audit trails are, how they are created, and all of the information that they may contain.
As a result of the hearing testimony, Judge Gershuny ruled that the audit trails not only relate to the subject matter of the case, but are discoverable as police writings [245.20(1)(e)], as potential impeachment material [245.20(1)(k)(iv)], as electronic information created by or on behalf of law enforcement [245.20(1)(u)(i)(B)], and under the more catchall “presumption of openness” when it comes to items that relate to the subject matter of the case [245.20(7)].
Important factual conclusions made by the court after the hearing include:
NYPD policy requires officers to add arrest numbers and appropriate categories to every BWC video they record, which show up in the audit trails. Officers can also independently write notes and add other information to audit trails.
There are three types of relevant NYPD audit trails: evidence, device, and user. The evidence audit trail includes everything that happened with a specific video (piece of evidence). The device audit trail includes everything that happened with a specific camera (device). The user audit trail includes every action taken by a specific user.
These audit trails contain more information, including dates and times that certain actions were taken and who took those actions, than the “metadata sheets” typically turned over in discovery by the various NYC DA’s offices.
Judge Gershuny concluded that audit trails include notes of officers, “[e]ven if these notations are short, selected from a pre-determined list, or in digital format, the subject matter communicated is directly related to the case.”
He also concluded that audit trails contain impeachment material, because they keep a record of whether or not an officer followed the proper police procedure during an investigation and/or arrest, which may eventually contradict the officer’s testimony in a case. The decision stated that, “[j]ust as NYPD uses audit trails to make a record of police conduct, a defendant has a right under the discovery statute to review audit trails for possible impeachment.”
Finally, in addressing one of the more convoluted arguments that ADAs often make with regard to audit trails—that they are created by a third party company and therefore not “created or stored” by the NYPD—the decision stated that “Ms. Arenson clearly testified that it is an NYPD officer, not a computer, creating the underlying information in an audit trail.” (Emphasis original.) That the NYPD contracts with a private company to store this data created by its officers is not relevant to the question of discoverability.
The decision also stated that Ms. Arenson’s testimony clearly contradicted the prosecution’s typical claim that the information in their “metadata sheets” is the same as the information in the audit trails and therefore duplicative. The decision notes that the Manhattan court in People v. Champion reached the same conclusion after ordering the ADA to turn over the audit trails in that case and comparing them to the metadata sheets. 81 Misc 3d 292, 297 (Crim Ct, NY County, Oct. 11, 2023).
After years of fighting with various DA’s offices and the NYPD over the vast amount of underlying information that we know they keep about their body-worn camera videos and devices, it seems as if the dam is finally beginning to break and courts are finally recognizing the validity of our argument: the BWC audit logs contain important information for the defense that we are entitled to in discovery.
Ask an Analyst
Do you have a question about digital forensics or electronic surveillance? Please send it to AskDFU@legal-aid.org and we may feature it in an upcoming issue of our newsletter. No identifying information will be used without your permission.
Q. I sent a subpoena to a bodega for surveillance video footage, and the shop mailed me an internal hard drive from the DVR. Can you recover video from this drive?
A. Most surveillance video systems use a proprietary file system. If we try to access the DVR hard drive directly, the contents will appear as gibberish. We have two options to recover the video:
Option 1- The hard drive can be returned to the bodega, reinstalled into the surveillance video system and the video can be exported from the DVR.
Once the hard drive is back in the DVR system, all the video files will be accessible through the on-screen interface. Each DVR has a different visual layout, but most will have an option to “backup” or “export” video files from the specified camera and time period and save them to an external USB drive.
Option 2- We can purchase special software which attempts to recover video files from a DVR hard drive with a proprietary file system.
Magnet WITNESS (formerly called DVR Examiner) is a tool for recovering video files from a surveillance system’s hard drive. The software can decode over fifty native DVR file formats. It scans the DVR hard drive and produces a list of recoverable video clips. This tool also gives us the potential to recover some deleted video files which are no longer accessible through the DVR’s on-screen interface. However, most surveillance systems are configured to overwrite old video once the hard drive is full, so typically very little “deleted” video is recoverable.
While the first choice for video recovery is to use the system’s on-screen backup or export feature, we also have the option of purchasing a single-case license of WITNESS, which may allow us to recover video directly from the hard drive.
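As a rough illustration of why a DVR drive reads as “gibberish” on a normal computer: a standard file system announces itself with known signature bytes at fixed offsets, while a proprietary DVR format typically matches none of them. The sketch below (not how WITNESS works internally, just a simplified check on a raw disk image with a hypothetical file name) looks for a few common file-system signatures:

```python
import struct

def identify_filesystem(image_path):
    """Check the start of a raw disk image for common file-system signatures."""
    with open(image_path, "rb") as f:
        head = f.read(4096)
    # NTFS boot sector: OEM ID "NTFS" at byte offset 3
    if head[3:7] == b"NTFS":
        return "NTFS"
    # FAT32 boot sector: file-system type string at byte offset 82
    if head[82:87] == b"FAT32":
        return "FAT32"
    # ext2/3/4: superblock begins at offset 1024; magic 0xEF53 at offset 56 within it
    if len(head) >= 1024 + 58 and struct.unpack_from("<H", head, 1024 + 56)[0] == 0xEF53:
        return "ext"
    return "unknown (possibly proprietary DVR format)"
```

A drive that comes back “unknown” is a good candidate for either Option 1 (export through the DVR itself) or Option 2 (specialized recovery software).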
Lisa Brown, Senior Digital Forensics Analyst
Upcoming Events
January 18, 2024
Open-Source Software Security: Areas of Long-Term Focus and Prioritization (ABA) (Virtual)
January 24, 2024
From Order to Action: The OMB's Guidance on AI Regulation (ABA) (Virtual)
February 8, 2024
Digital Part I: Geofence Warrants (NYSDA) (Virtual)
February 14-17, 2024
ABA TECHSHOW 2024 (ABA) (Chicago, IL)
February 27 - March 7, 2024
Magnet Virtual Summit 2024 (Virtual)
February 29, 2024
Digital Part II: Cell Site Location Information (“CSLI”) and Call Detail Records (“CDR”) (NYSDA) (Virtual)
February 29 - March 7, 2024
SANS OSINT Summit & Training (SANS) (Arlington, VA & Virtual)
April 15-17, 2024
Magnet User Summit 2024 (Nashville, TN)
April 18-20, 2024
Making Sense of Science XVII: Forensic Science & the Law (NACDL) (Las Vegas, NV)
June 4-6, 2024
Techno Security East 2024 (Wilmington, NC)
Small Bytes
AI Is Detaining Sex Workers at the Border – and You’re Next (The Daily Beast)
Do Video Doorbells Really Help to Deter Crime? (Undark Magazine)
These Noise Cameras Put a Price on Peace: $2,500 for Loud Drivers (NY Times)
Governments spying on Apple, Google users through push notifications - US senator (Reuters)
Why It Took Meta 7 Years to Turn on End-to-End Encryption for All Chats (Wired)
Verizon Gave Phone Data to Armed Stalker Who Posed as Cop Over Email (404 Media)
How Police Have Undermined the Promise of Body Cameras (ProPublica)
Google Just Killed Warrants That Give Police Access To Location Data (Forbes)
Largest Dataset Powering AI Images Removed After Discovery of Child Sexual Abuse Material (404 Media)