This post is part of a series of articles we are doing on 2023 data protection litigation trends. To stay up to date with our writings, please subscribe to the WilmerHale Privacy and Cybersecurity Blog.
Since its enactment in 2008, Illinois’s Biometric Information Privacy Act (BIPA) has produced a wave of privacy-related litigation across the United States, and 2023 was no exception. Last year featured hundreds of new BIPA cases in both federal and state court. There were also four notable BIPA decisions by the Illinois Supreme Court, all of which will shape how the law is interpreted in future cases.
As we explain below, many 2023 BIPA cases expand the scope of liability for entities that process biometric data pursuant to the law. These cases also continue to leave important questions unanswered, such as how the law applies to certain healthcare entities and to claims against third-party vendors. It is likely that many of these issues, as well as a host of new ones, will come up in 2024 cases.
For companies that process biometric information, compliance obligations continue to grow with the introduction of new state privacy laws. For example, new state comprehensive privacy laws often regulate biometric data as “sensitive” data, and many require companies to obtain consent before processing biometric data (similar to BIPA). Additionally, Washington’s My Health My Data Act (MHMDA) regulates biometric data and—like BIPA—also includes a private right of action. These new compliance obligations are, of course, on top of companies’ obligations to process biometric information in Illinois in accordance with BIPA.
In this blog post, we highlight key BIPA rulings from 2023, identify the notable trends companies should focus on, and analyze the cases that showcase those trends.
BIPA is a privacy law that regulates the use of biometric information like fingerprints, eye scans, voiceprints, and facial geometry scans. Section 15 of BIPA imposes various obligations on entities that interact with or process biometric data. For example, the law requires that entities obtain an individual’s consent before collecting, obtaining, or disclosing that individual’s biometric information. It also requires entities to develop, publicly disclose, and comply with a written data retention and deletion policy.
BIPA includes a private right of action for parties that have been aggrieved by a BIPA violation. It also includes a statutory-damages provision which states that a prevailing party may recover $1,000 for each negligent BIPA violation and $5,000 for each intentional or reckless BIPA violation (or actual damages if they exceed these amounts). The combination of a private right of action with a statutory-damages provision has led to widespread class action litigation under the law.
A BIPA violation accrues with each unauthorized collection or disclosure of biometric information.
In arguably the most consequential BIPA decision of 2023, Cothron v. White Castle System, Inc., the Illinois Supreme Court held that a BIPA violation accrues each and every time an entity collects or discloses biometric information without consent, not just the first time it does so.
In Cothron, a class of employees sued their employer—White Castle—alleging violations of BIPA. The suit focused on White Castle’s fingerprint-based system that employees used to access their pay stubs and computers: The complaint alleged that White Castle collected and disclosed the employees’ fingerprints without consent each time they accessed the fingerprint-based system, a practice that continued over many years.
In a 4-3 decision, the Illinois Supreme Court held that “[a] party violates … [BIPA] when it collects, captures, or otherwise obtains a person’s biometric information without prior informed consent. This is true the first time an entity scans a fingerprint or otherwise collects biometric information, but it is no less true with each subsequent scan or collection.” The court applied the same logic to disclosure, concluding that each unauthorized disclosure of biometric information is a distinct BIPA violation.
The majority’s reasoning in Cothron suggests that an entity processing an individual’s biometric information repeatedly or on a regular basis could be liable for hundreds or thousands of discrete BIPA violations. This dramatically expands the scope of liability for companies that interact with biometric data. In Cothron itself, for example, White Castle estimated that the majority’s interpretation of the statute could subject it to devastating and astronomical damages awards exceeding $17 billion (the majority opinion drew a stinging dissent, joined by three justices of the court). This might explain why lawsuits asserting BIPA claims jumped by 65% in Illinois state courts following the Cothron decision. But as explained below, the decision’s brief discussion of damages under BIPA may provide a small silver lining for entities that process biometric information.
For more information about the Cothron decision, see our prior blog post.
BIPA’s statutory-damages provision may be discretionary, not mandatory.
Until 2023, courts and commentators often assumed that BIPA’s statutory-damages provision functions like a liquidated-damages clause in a contract, automatically granting plaintiffs $1,000 per negligent violation of BIPA and $5,000 per intentional or reckless violation. But several 2023 cases cast serious doubt on that assumption.
At the end of its decision in Cothron, the Illinois Supreme Court said that it “appears that the General Assembly chose to make damages discretionary rather than mandatory under the Act.” The Court provided very little additional explanation for this conclusion, and its comment was certainly dicta. But it could nevertheless profoundly change damages in BIPA cases. Lower courts have already begun to seize on this language. For example, a court recently vacated a $228 million jury award that was calculated by multiplying the number of violations by the per-violation figure provided in the statutory-damages provision.1 The court concluded that this calculation method was not appropriate because damages under BIPA are discretionary. It therefore ordered a new trial on damages to allow a jury to determine the appropriate damages amount.
If damages under BIPA are discretionary, not mandatory, it remains unclear how a court or jury is expected to determine the proper damages award.2 This area of BIPA case law is likely to develop quickly and could look radically different in the coming months and years.
A five-year statute of limitations applies to BIPA claims.
In February 2023, the Illinois Supreme Court clarified that individuals have five years after an alleged BIPA violation to bring their claims. In Tims v. Black Horse Carriers, Inc., the Illinois Supreme Court was asked to decide which of two statutes of limitations applies to BIPA claims: (1) Illinois’s one-year statute of limitations for privacy and defamation claims, or (2) Illinois’s “catchall” five-year statute of limitations. The Court unanimously agreed that the general five-year statute of limitations applies. This gives plaintiffs more time to bring claims, and—particularly when considered alongside the Illinois Supreme Court’s decision in Cothron—represents another case expanding the scope of liability for entities that handle biometric data.
For more information on the Tims decision, see our prior blog post.
Healthcare entities received a BIPA win in 2023.
While many of the year’s BIPA decisions ruled in favor of plaintiffs and against entities that handle biometric information, healthcare entities received a favorable ruling from the Illinois Supreme Court regarding the scope of BIPA’s “healthcare exemption.”
Mosby v. Ingalls Memorial Hospital arose from two class-action suits brought on behalf of registered nurses against the hospitals for which they worked. In both cases, the nurses alleged that the hospitals collected their fingerprints to verify their identities before they could access a medication-dispensing system, and that the hospitals did so without first obtaining consent.
The nurses argued that BIPA’s healthcare exemption excludes only patient biometric data. The court rejected that interpretation of the healthcare exemption. It unanimously concluded that the healthcare exemption excludes from BIPA’s reach any information “used for a particular purpose—health care treatment, payment, or operations—regardless of the information’s source.” Because the nurses’ biometric data was collected and used in the course of providing healthcare treatment, BIPA did not cover the hospitals’ collection of that data.
Uncertainty about third-party liability in BIPA cases continues.
A recurring issue in BIPA cases concerns which entity or entities may be subject to BIPA liability. Often, more than one entity touches an individual’s biometric data: For example, an employer could request the biometric data of an employee but rely on a third-party vendor to collect and process biometric data. Questions then arise as to which entity or entities are properly subject to suit.
In several 2023 cases, courts held that third-party processors and vendors can be liable under BIPA, even when those entities do not directly interface with the individual who provided biometric data. For example, one court held that a third-party provider of biometric systems could be held liable, observing that “BIPA’s text does not suggest a carveout for third-party vendors,” and that an individual can suffer “many individual injuries at the hands of many individual defendants who violated BIPA.”3 In another case, the court held that Amazon could be held liable even though it was a back-end service provider. Because Amazon stored face images on its cloud-based storage and used algorithms to extract and analyze the facial geometry of the images, the Court concluded that Amazon collected and possessed biometric information as defined by BIPA.4
But other cases in 2023 suggest a limit to the scope of third-party liability. In one case, for example, the court granted Microsoft’s motion to dismiss because the company was merely “a vendor to the third-party that provided the biometric timekeeping technology and services to [the plaintiff’s] employer.” The court stressed that while “several courts have extended BIPA to apply to third-party providers that supply biometric collection technology and services, no case has extended BIPA to vendors for such third-party providers.”5 And in another case, the court dismissed a BIPA claim because the complaint failed to allege that Microsoft “actively obtained” the plaintiff’s biometric data when Microsoft merely provided back-end cloud services for the entity that collected the plaintiff’s biometric information.6
The extent to which different entities can be held liable often turns on fine-grained factual distinctions in the complaint. Courts carefully analyze the actions of defendants to assess whether a given entity collects, possesses, or disseminates data within the meaning of BIPA.
Courts continue to carefully parse the factual allegations made in a complaint.
In case after case in 2023, courts drew very fine distinctions between cases based on the allegations in the complaint. For standing purposes, for example, one court drew a distinction between a complaint that alleges only that the defendant profited from the use of the plaintiff’s biometric data (insufficient to confer standing) and a complaint that alleges that the defendant profited and deprived the plaintiff of the opportunity to profit from their biometric data (sufficient to confer standing).7 And in other cases, courts carefully scrutinized the allegations concerning a defendant entity’s actions to determine whether the entity “collect[ed]” data within the meaning of BIPA, drawing on small differences between the complaint and existing case law to determine whether a given defendant could be held liable.8
Parties in BIPA litigation should recognize that even small differences in phrasing and factual allegations within a complaint can affect whether a case survives a motion to dismiss.
1 Rogers v. BNSF Ry. Co., No. 19 C 3083, 2023 WL 4297654 (N.D. Ill. June 30, 2023).
2 See, e.g., Tapia-Rendon v. United Tape & Finishing Co., No. 21 C 3400, 2023 WL 5228178 (N.D. Ill. Aug. 15, 2023) (recognizing that damages may be below the statutory amount of $1,000 per violation but offering no explanation of how to calculate per-violation damages).
3 Johnson v. NCR Corp., 2023 WL 1779774 (N.D. Ill. Feb. 6, 2023).
4 Rivera v. Amazon Web Servs., Inc., No. 2:22-CV-00269, 2023 WL 4761481 (W.D. Wash. July 26, 2023); see also Kyles v. Hoosier Papa LLC, No. 1:20-CV-07146, 2023 WL 2711608 (N.D. Ill. Mar. 30, 2023) (holding that a franchisor could be held liable even though the franchisee was principally responsible for the collection and processing of biometric data).
5 Jones v. Microsoft Corp., 649 F. Supp. 3d 679 (N.D. Ill. 2023).
6 Clark v. Microsoft Corp., No. 23 C 695, 2023 WL 5348760 (N.D. Ill. Aug. 21, 2023).
8 See, e.g., Clark v. Microsoft Corp., No. 23 C 695, 2023 WL 5348760 (N.D. Ill. Aug. 21, 2023); Rivera v. Amazon Web Servs., Inc., No. 2:22-CV-00269, 2023 WL 4761481 (W.D. Wash. July 26, 2023).