Texas Biometrics Case Highlights Need for Consent: Meta Settles for $1.4 Billion

On July 30, 2024, Meta Platforms, Inc. (formerly known as Facebook, Inc.) agreed to pay $1.4 billion to the State of Texas to settle a lawsuit alleging that Meta unlawfully captured and used biometric identifiers of millions of Texans without their consent. The settlement is one of the largest of its kind. Companies operating in Texas would do well to consider whether they capture or use biometric identifiers and whether they comply with the law. Companies should also understand related data privacy concerns, especially if such biometric identifiers are used in connection with artificial intelligence.

The Lawsuit

The lawsuit, filed on February 14, 2022, claimed that Meta violated the Texas Capture or Use of Biometric Identifier Act (“CUBI”) and the Texas Deceptive Trade Practices-Consumer Protection Act (“DTPA”) by building an artificial intelligence empire using Texans’ biometric data without their knowledge or permission. CUBI, enacted in 2009, prohibits the capture, use, or disclosure of biometric identifiers, such as fingerprints, iris scans, or facial geometry, without informed consent and requires the destruction of such identifiers within a reasonable time. The DTPA, enacted in 1973, prohibits false, misleading, or deceptive acts or practices in trade or commerce.

According to the lawsuit, Meta captured biometric identifiers from photos and videos uploaded by users and non-users to its social media platforms, including Facebook and Instagram, and used them to train its facial-recognition program, DeepFace, without informing or obtaining consent from the individuals whose biometric data was collected. The lawsuit alleged that Meta’s practices put Texans’ well-being, safety, and security at risk, as biometric data is unique, permanent, and susceptible to misuse. The lawsuit sought civil penalties of up to $25,000 for each violation of CUBI and up to $10,000 for each violation of DTPA, as well as injunctive relief, disgorgement of Meta’s assets, and attorneys’ fees and costs.

The lawsuit followed a class action lawsuit in Illinois, where Meta paid $650 million to settle claims that it violated the Illinois Biometric Information Privacy Act (BIPA) by using facial-recognition technology without consent. The Illinois settlement, approved in February 2021, was the largest consumer privacy settlement in U.S. history at the time. Meta claimed to have ceased its facial-recognition practices in late 2021, but the Texas lawsuit challenged the adequacy of Meta’s actions.

Terms of the Settlement

After more than two years of litigation, Meta and the State of Texas reached a settlement agreement and filed a proposed agreed final judgment with the District Court of Harrison County, Texas, on July 30, 2024. The settlement agreement and the agreed final judgment resolved all claims related to the collection and use of biometric data by Meta up to the effective date of the agreement.

The settlement also established detailed procedures for Meta to notify the Office of Attorney General of Texas (OAG) of anticipated or ongoing conduct related to biometric data. As is typical, the settlement was not an admission of any violation or liability by Meta, but rather a compromise of disputed claims. The settlement was subject to the approval of the District Court, which issued the agreed final judgment on the same day as the filing of the settlement agreement. The agreed final judgment incorporated the terms of the settlement agreement and enjoined Meta from violating CUBI or DTPA in relation to biometric data. 

What This Means for You

The Meta settlement is a significant development in the field of biometric data privacy and security, as it demonstrates the high stakes and potential liabilities involved in the capture and use of biometric identifiers without consent—and the risk in allowing that data to be used in artificial intelligence applications. The settlement also reflects the growing enforcement and litigation activity in this area, as more states enact or amend laws to protect biometric data, such as the Texas Data Privacy and Security Act (TDPSA), which came into effect on July 1, 2024, and includes sensitive data, such as biometric data, in its scope.

Companies that collect, process, or sell biometric data—including for purposes such as security, theft prevention, authentication, and system access—should be aware of the legal and regulatory requirements and risks associated with such data and take steps to ensure compliance and mitigate exposure.

Some practical action items for companies include:

  • Conduct a data inventory to understand what, if any, biometric data is collected, processed, and stored, and for what purposes.
  • Remain vigilant about biometric data being used to train or otherwise going into (and coming out of) artificial intelligence models and their later applications.
  • Obtain informed and explicit consent from individuals before capturing or using their biometric identifiers, and provide clear and accurate notices of how such data is used, shared, and protected.
  • Implement and update policies and procedures for the retention and destruction of biometric data within a reasonable time or as required by law.
  • Segregate and secure biometric data from other types of personal data and limit access to authorized personnel only.
  • Review and revise contracts with third-party processors or vendors that handle biometric data to include terms that ensure compliance and accountability.
  • Monitor and respond to changes in laws, regulations, and best practices regarding biometric data privacy and security.

V&E assists clients in identifying, managing, and mitigating cybersecurity and data privacy risks, from early planning and assessment to drafting and revising policies and consents to comply with applicable laws, including CUBI.

This information is provided by Vinson & Elkins LLP for educational and informational purposes only and is not intended, nor should it be construed, as legal advice.