CHICAGO — A lawsuit filed under Illinois’ unusual state law regulating biometric data collection has taken aim at an AI program that X has used since 2015 to identify nudity and other not-safe-for-work (NSFW) images.
The proposed class action suit, filed by lawyers on behalf of Chicago resident Mark Martell in 2023, alleged that X — the platform formerly known as Twitter — was in violation of the Illinois Biometric Information Privacy Act, commonly known as BIPA.
As XBIZ reported, plaintiff lawyers filing BIPA suits in Illinois have been known to target big tech companies such as Google, OnlyFans, Shutterfly and others. In one such action, Facebook agreed to pay a $650 million settlement over BIPA claims.
Martell’s complaint states that “since approximately 2015, Twitter has implemented software to police pornographic and other not-safe-for-work (‘NSFW’) images uploaded to the site. NSFW images are then ‘tagged’ by Twitter as such, preventing them from being viewed by people who do not wish to view them.”
The complaint then alleges that, in analyzing each uploaded image to determine whether “it contains nudity (or any other qualities that Twitter deems objectionable),” the platform “actively collects, captures and/or otherwise obtains; stores; and/or makes use of the biometric identifiers and biometric information of any individual included in each photo.”
On Thursday, U.S. District Court Judge Sunil R. Harjani threw out Martell’s lawsuit, but left the door open for the plaintiff to file an amended complaint by June 27.
At the core of Martell’s lawsuit is X’s usage of the third-party software PhotoDNA, developed by Microsoft. The lawsuit also mentions a 2015 article by Wired magazine that refers to an in-house AI technology developed by Twitter in 2014 after it acquired Madbits, a pioneering AI startup founded by NYU researcher Clément Farabet.
“When Farabet and his MadBits crew joined Twitter last summer, Alex Roetter — the company’s head of engineering — told them to build a system that could automatically identify NSFW images on its popular social network,” Wired reported at the time. “A year later, that AI is in place.”
Martell alleges that X’s AI scans images “without first making certain disclosures and securing written, informed permission from Illinois users posting photos,” as required under BIPA, Law360 reported.
Harjani ruled, however, that PhotoDNA’s creation of a “hash,” or unique digital signature, of one of Martell’s images does not amount to “a scan of his facial geometry in violation of BIPA,” the report added.
The judge noted that BIPA defines a “biometric identifier” as “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” a definition that excludes photographs themselves.
“While plaintiff alleged that PhotoDNA scanned the photo to create a unique hash, plaintiff did not allege facts indicating that the hash is a scan of face geometry, as opposed to merely a record of the photo,” Harjani wrote. “Plaintiff’s allegations leave open the question of whether the hash is a unique representation of the entire photo or specific to the faces of the people in the picture.”
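The distinction the judge drew can be illustrated with a toy example. PhotoDNA’s actual algorithm is proprietary and is not reproduced here; the sketch below implements a simple “average hash,” a generic perceptual-hashing technique, to show how a hash computed over an entire image is “a record of the photo” rather than a measurement of anyone’s facial geometry. All names and values are illustrative assumptions, not PhotoDNA’s API.

```python
# Illustrative sketch only: a generic "average hash" over a whole image.
# This is NOT PhotoDNA (whose algorithm is proprietary); it simply shows
# that a whole-image hash is a function of every pixel region, with no
# detection or measurement of faces or facial landmarks.

def average_hash(pixels, grid=8):
    """Hash an entire grayscale image (2D list of 0-255 values).

    The image is partitioned into a grid x grid set of cells; each bit
    of the hash records whether that cell is brighter than the image's
    overall mean. Every part of the picture contributes to the result.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            ys = range(gy * h // grid, (gy + 1) * h // grid)
            xs = range(gx * w // grid, (gx + 1) * w // grid)
            vals = [pixels[y][x] for y in ys for x in xs]
            cells.append(sum(vals) / len(vals))
    mean = sum(cells) / len(cells)
    return "".join("1" if c > mean else "0" for c in cells)

# A synthetic 16x16 "photo": left half dark, right half bright.
img = [[0] * 8 + [255] * 8 for _ in range(16)]
print(average_hash(img))  # 64 bits derived from the whole frame
```

A face-geometry scan, by contrast, would first locate a face and then measure features such as the distances between eyes, nose and jawline — which is the kind of allegation the court said the complaint failed to plead.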
Industry attorney Corey Silverstein of Silverstein Legal told XBIZ, “I absolutely love this decision. Judge Harjani did an incredible job analyzing a very tricky issue. BIPA has turned into a goldmine for class action plaintiffs and their attorneys, and in my opinion the abuse of these types of claims has gotten completely out of hand. I hope that this will lead prospective plaintiffs and their counsel to pause before filing baseless lawsuits.
“The plaintiff’s main contention is that making hash values of photos violates BIPA,” Silverstein explained. “The plaintiff’s argument seems to be that services using PhotoDNA must hash all photos, including images showing people’s faces in their database, to check if those photos match a hash value in the PhotoDNA database. The plaintiff seemingly argues that hashing the photo necessarily calculates the photo subjects’ face geometries. The court wasn’t buying what the plaintiff was selling, although the judge did grant leave to amend, meaning the plaintiff could amend its allegations and take another crack at its claim.”