2025 marked another year of significant legislative and regulatory advances at the federal and state levels for data privacy and security, in ways that were both expected and unexpected. As was anticipated, a new administration meant a shift in enforcement priorities for some federal agencies, including for the Federal Trade Commission (FTC). Compared to the last administration’s flurry of enforcement actions based on aggressive interpretations of “unfair” trade practices, the current FTC appears to be concerned with a much narrower range of issues, focusing on the privacy and safety of children and teens online, as well as data security more generally. Other agencies that focused on privacy during the last administration (such as the Consumer Financial Protection Bureau) were also less active on privacy issues in the past year. As has been the norm over the past few years, Congress kicked the tires on a few privacy proposals last year, but none gained meaningful traction.
While no federal privacy legislation emerged, there was a critical new federal development with the finalization of the Department of Justice’s Data Security Program (DSP). This regulatory framework, which targets cross-border transfers of bulk US sensitive personal data and US government-related data to certain “countries of concern,” interconnects data protection and national security concerns and has broad applicability based on how its relevant terms are defined (and does not have the same exemptions that are typically included in other data protection laws). Enforcement of the DSP will be an area of focus for companies in 2026 and beyond, especially given that the DSP comes with steep civil penalties and even potential criminal liability.
States did fill most of the enforcement gap left at the federal level, with California and Texas leading as the prominent regulators in the privacy and cybersecurity landscape. While it was somewhat surprising that no new state passed a comprehensive privacy law (marking the first time this has happened since 2020), states remained busy enacting AI and children’s privacy laws, proposing and amending laws to protect consumer health information, and engaging in privacy-related rulemaking. In addition to these developments at the federal and state levels, both pixel tracking litigation and security incidents continued to be an issue for companies in 2025.
We describe below our top ten US data privacy and cybersecurity developments (in no particular order) from the past year. Companies should understand the key shifts and trends from 2025 in relation to their existing compliance obligations and anticipate potential legislative and regulatory changes for the coming year. We will continue tracking all these developments in the new year and providing analysis on the compliance changes and policy updates in our Privacy and Cybersecurity Law blog, which you can subscribe to here.
1. DOJ Finalizes Rule Regarding Sensitive Data Transfers
On January 8, 2025, the Department of Justice issued its final Rule under Executive Order 14117, “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern” (the “Rule”). The Rule, which took effect on April 8, 2025, targets cross-border transfers of bulk US sensitive personal data and US government-related data to certain “countries of concern,” specifically China (including Hong Kong and Macau), Russia, Iran, North Korea, Cuba, and Venezuela, as well as to “covered persons” (essentially, any person or entity with certain affiliations with a country of concern). The Rule also applies certain requirements and restrictions to data transactions with any “foreign person” (that is, any person that is not a US person).
The Rule prohibits US persons from engaging in any “data brokerage” transaction involving identified categories of sensitive US personal data with “covered persons” or “countries of concern.” Put another way, the Rule bans US data brokers from licensing or otherwise transferring a wide variety of sensitive US personal data to covered countries or covered persons. Similarly, the Rule prohibits US persons from engaging in any “data brokerage” transaction involving identified categories of sensitive US personal data with “foreign persons” absent the imposition of contractual safeguards to prevent the subsequent transfer of the data to a country of concern or covered person. Additionally, the Rule prohibits all US persons from knowingly engaging in any “covered data transaction” with “countries of concern” or “covered persons” involving access to bulk human genomic, epigenomic, proteomic, or transcriptomic data, or with human biospecimens from which such data can be derived.
Furthermore, the Rule establishes a class of “restricted” transactions, referring to transactions involving vendor agreements, employment agreements, or investment agreements that provide countries of concern or covered persons with access to bulk US sensitive personal data or US government-related data. US persons are permitted to engage in such restricted transactions but must comply with cybersecurity standards promulgated by the Cybersecurity and Infrastructure Security Agency (CISA) and satisfy additional requirements, such as implementing a data compliance program, complying with audit requirements, and maintaining records related to applicable transactions.
The Rule defines sensitive personal data broadly with relatively low thresholds, meaning a wide range of personal data collected in the course of fairly standard online transactions can trigger the Rule. Furthermore, compliance obligations under the Rule are significant, as the diligence, auditing, recordkeeping, and reporting requirements for restricted transactions may require that entities build out or establish comprehensive compliance programs. Companies should also review cross-border data flows and update contracts to meet the new requirements. Penalties for violations are steep: civil fines can reach $368,136 or twice the transaction amount, and criminal penalties for willful violations include fines up to $1 million or imprisonment for up to 20 years. The Rule underscores the Department of Justice’s growing focus on national security and potentially signals that additional national security-motivated data regulations may follow.
2. Federal Privacy Proposals Focus on Health and Children’s Data, while Comprehensive Frameworks Stall
On February 12, 2025, the US House of Representatives Committee on Energy and Commerce announced the creation of a comprehensive data privacy working group, with Committee Chairman Brett Guthrie and Vice Chairman John Joyce stating that “a national data privacy standard is necessary.” On February 21, Chairman Guthrie and Vice Chairman Joyce issued a Request for Information (RFI), inviting stakeholders to share questions and suggestions with the newly formed working group. While a federal comprehensive privacy law remained elusive in 2025, Chairman Guthrie held a December 2 hearing to “examine ways to protect children and teens online.” The subcommittee hearing, titled “Legislative Solutions to Protect Children and Teens Online,” focused on 19 children’s online safety bills, 18 of which were advanced to a full committee vote following a December 11 subcommittee markup session (the remaining bill, “Reducing Exploitative Social Media Exposure for Teens,” was not considered in the markup session). Notably, this focus on child safety was thought to be one of the reasons that the “AI moratorium,” which aimed to pause the enforcement of state AI regulations, was removed from the budget reconciliation bill.
In addition to the data privacy working group and legislation focused on children and teens, there were multiple federal consumer health privacy bills introduced in 2025. Proposed health bills included the Health Information Privacy Reform Act (“to provide additional protections with respect to health information”), the Reproductive Data Privacy and Protection Act (“to ensure requests for data on individuals do not pertain to reproductive services”), the American Genetic Privacy Act of 2025 (“to prohibit the disclosure of certain genetic information to the People’s Republic of China”), and the My Body, My Data Act of 2025 (“to protect the privacy of personal reproductive or sexual health information”). Notably, none of these bills is bipartisan, and none has advanced out of committee.
3. Shift in Enforcement Priorities for New Administration
2025 was a relatively quiet year for the FTC (particularly compared to the previous two years), as the change in administration meant a shift in enforcement priorities. While the FTC under the previous administration used a broad interpretation of its Section 5 authority under the FTC Act to crack down on unfair and deceptive practices, the FTC under the current administration has yet to try to expand its Section 5 authority through the use of novel unfairness theories. This is consistent with various public statements by the new FTC Chair. Instead, the FTC’s recent enforcement actions—the first privacy enforcement actions under the new administration—indicate that the Commission is focused on the privacy and safety of children and teens online.
Three of the four FTC enforcement actions brought in September alleged violations of the Children’s Online Privacy Protection Act (COPPA), while the fourth action alleged unfair and deceptive practices relating to the failure to remove child sexual abuse material (CSAM) and nonconsensual material (NCM) from an adult content website. Notably, the Commission has three vacancies and is currently composed of two Republican commissioners, including Chairman Andrew Ferguson. It remains to be seen how the FTC’s limited number of commissioners will impact its enforcement activities in 2026.
4. States Fill the Enforcement Gap, with California and Texas Leading the Charge
In the absence of a strong federal regulatory presence, states have continued to fill the enforcement gap. In April 2025, a group of bipartisan state regulators formed the Consortium of Privacy Regulators to “share expertise and resources” and to “coordinate efforts to investigate potential violations of applicable laws.” The current Consortium consists of the California Privacy Protection Agency (CPPA) and state Attorneys General (AGs) from California, Colorado, Connecticut, Delaware, Indiana, Minnesota, New Hampshire, New Jersey, and Oregon.
California remained a prominent regulator in 2025, with both the California AG and the CPPA announcing their largest privacy-related monetary penalties. On July 1, 2025, the California AG announced a $1.55 million settlement—the largest penalty issued under the California Consumer Privacy Act (CCPA) to date—with Healthline, an online health and wellness knowledge platform. According to the California AG, Healthline violated the CCPA through its use of online tracking tools for targeted advertising purposes and its disclosure of sensitive health-related information to advertisers without complying with the CCPA’s requirements. Furthermore, the California AG alleged that Healthline violated the CCPA’s purpose limitation principle by using consumers’ personal information in a manner that was inconsistent with the purposes for which it was collected and processed initially. On September 30, 2025, the CPPA announced a $1.35 million settlement—the largest in the agency’s brief history—with Tractor Supply Company (Tractor Supply), a rural lifestyle retailer. According to the CPPA, Tractor Supply failed to provide consumers with an effective mechanism to opt out of the selling or sharing of their personal information and failed to notify California consumers—including job applicants—of their privacy rights in its privacy policy and disclosures. Notably, this is the first enforcement action involving employment-related data, which is protected under the CCPA (unlike most other state privacy laws).
While California has historically been an active regulator in the privacy and cybersecurity landscape, Texas continued to establish itself as an active enforcer in 2025. In January 2025, Texas filed the first-ever privacy lawsuit under a state comprehensive privacy law. The lawsuit alleged that a car insurance company developed a software development kit (SDK) which could collect a user’s geolocation and movement data. Texas alleged that, by collecting consumers’ geolocation data without their knowledge or consent, the car insurance company violated the state’s Data Privacy and Security Act. Furthermore, Texas continued to aggressively enforce child privacy laws, with the state alleging in a January 2025 enforcement action against a social media company that it violated the state’s Deceptive Trade Practices Act by misrepresenting the quantity of explicit material depicting drugs, nudity, alcohol, and profanity exposed to children on the platform. Finally, Texas secured a $1.4 billion settlement—one of the largest data privacy related settlements reached by a single state—in connection with multiple data privacy claims, including a company’s alleged violations of the Capture or Use of Biometric Identifier (“CUBI”) Act.
5. No New State Comprehensive Privacy Laws
Despite the introduction of hundreds of consumer privacy bills, 2025 was the first year since 2020 in which no new state comprehensive privacy law was enacted. Massachusetts’ and Pennsylvania’s state legislatures made strides toward comprehensive privacy acts by passing bills through one chamber. However, both states failed to pass the bills through the second chamber before the end of the year. This leaves the total number of states with comprehensive privacy laws at 19.
Still, nine states passed amendments to their existing comprehensive privacy laws in 2025: Connecticut, Montana, Oregon, Colorado, Kentucky, Texas, Utah, Virginia, and California. Among the most significant amendments was Connecticut’s SB 1295, which marked the second time Connecticut’s Data Privacy Act (CTDPA) has been amended since its passage in 2022. This latest amendment, which expands the CTDPA’s definition of sensitive data and modifies consumers’ right to access, will go into effect on July 1, 2026. One of the most significant changes SB 1295 makes to Connecticut state comprehensive privacy law is its expansion of CTDPA’s applicability. Currently, the CTDPA applies to entities that conduct business in Connecticut and, in the preceding year, (1) control or process personal information of at least 25,000 Connecticut residents and derive more than 25% of gross revenue from the sale of personal information; or (2) control or process personal information of at least 100,000 Connecticut residents. However, starting on July 1, 2026, the CTDPA will apply to entities that (1) control or process the personal data of at least 35,000 consumers; (2) control or process consumers’ sensitive data; or (3) offer consumers’ personal data for sale in trade or commerce. By expanding the applicability to entities that sell consumers’ personal information or process consumers’ sensitive information, SB 1295 will likely bring many more businesses within the CTDPA’s scope.
6. States Focus on AI Legislation
States remained focused on artificial intelligence (AI) legislation in 2025. On June 22, 2025, the Texas Governor signed the Texas Responsible Artificial Intelligence Governance Act (TRAIGA) into law, making Texas the second state to pass comprehensive AI regulation (with Colorado being the first). The Act, which places categorical limitations on the deployment and development of AI systems, went into effect on January 1, 2026, exactly one month before the Colorado AI Act takes effect. On December 19, 2025, the New York Governor signed the Responsible AI Safety and Education Act (RAISE Act) into law, amending the version that was originally passed by the state legislature in June. The RAISE Act creates requirements for the training and use of AI frontier models, including the creation of a safety plan. Given the civil penalties available under both statutory schemes, companies should evaluate their uses of AI to ensure compliance.
Notably, the new administration issued an executive order attempting to curb the impact of state AI laws. The order, issued on December 11, 2025, targeted the “patchwork” of “50 discordant State” regulatory regimes “thwart[ing]” the innovation required for the US to “win[] the AI race.” The order cites to Colorado’s AI Act as an example of harms posed by state AI laws, arguing that its ban on “algorithmic discrimination” requires “entities to embed ideological bias within models”—potentially “forc[ing] AI models to produce false results in order to avoid a ‘differential treatment or impact’ on protected groups.” While the order could chill future AI legislation at the state level, it is worth noting that the New York Governor signed the RAISE Act into law after the administration issued the order.
7. Consumer Health Data Remains a Priority
Health data privacy remained a legislative priority for states in 2025. In March, Virginia enacted SB 754, amending its Consumer Protection Act to restrict the collection, disclosure, and sale of reproductive and sexual health information without opt-in consent. The law defines covered data broadly, including diagnoses, procedures, purchases, location data, and inferred information, while excluding HIPAA-regulated records. It also provides a private right of action directly to consumers, and gives the Virginia AG authority to bring actions as well. In June, the New York state legislature passed the Health Information Privacy Act (NY HIPA), which proposed strict limits on processing “regulated health information,” defined to include any data reasonably linkable to an individual and related to physical or mental health. Ultimately, however, the New York Governor vetoed the Act on December 19, 2025 (although a modified version is expected to be introduced in 2026).
2025 also marked the first time a class action complaint was filed under Washington’s My Health, My Data Act (MHMDA). The class action alleged that Amazon.com, Inc. and Amazon Advertising, LLC’s SDK embedded in third-party mobile applications violated federal wiretap laws and state privacy laws, including the MHMDA. This lawsuit represents a significant test case for Washington’s MHMDA, which has been in effect since March 2024. Given the continued focus on consumer health data, companies should evaluate any data collection and consent processes related to their business.
8. States Propose and Finalize Significant Rules under the Comprehensive Privacy Laws
Even though no new comprehensive laws passed last year, companies should still be aware of new requirements at the state level that have been implemented through the rulemaking process. On June 2, 2025, the New Jersey Division of Consumer Affairs (the “Division”), alongside the Office of the Attorney General, announced proposed rules (the “Proposed Rules”) to implement the New Jersey Data Privacy Act (NJDPA). The 60-day comment period, which closed on Friday, August 1, 2025, provided the public with an opportunity to weigh in on how the NJDPA is enforced. After the Division reviews and considers the submitted comments, it is expected to publish a Notice of Adoption this year.
Additionally, on September 23, 2025, California finalized its regulations on cybersecurity audits, risk assessments, and automated decisionmaking technology (ADMT). While the regulations went into effect on January 1, 2026, companies have additional time to comply with cybersecurity audits, risk assessments, and requirements for automated decisionmaking technologies.
9. Pixel Tracking Litigation Remains Active
Litigation over website tracking technologies continued into 2025 as plaintiffs challenged these practices under various legal theories. Plaintiffs relied on state privacy statutes such as the California Invasion of Privacy Act (CIPA), wiretapping laws, and even the Video Privacy Protection Act (VPPA), a federal statute originally intended to protect video rental histories. Significant circuit splits have emerged over how to interpret the VPPA and whether pixel tracking falls within its scope.
The VPPA prohibits a “video tape service provider” from knowingly disclosing personally identifiable information (PII) about a consumer, except in limited circumstances such as consent. Courts have split on what qualifies as PII and who counts as a consumer. On the PII question, the Second Circuit in Solomon v. Flipps Media held that complex identifiers embedded in pixels do not qualify as PII because an ordinary person cannot easily link them to video-viewing behavior. This approach aligns with the Third and Ninth Circuits’ “ordinary person” standard but conflicts with the First Circuit’s “reasonable foreseeability” test. On the consumer issue, the D.C. Circuit in Pileggi v. Washington Newspaper Publishing narrowed the interpretation of “consumer,” going a step beyond the Sixth Circuit to further limit VPPA protections to individuals who rent, purchase, or subscribe to specific audiovisual services. The D.C. Circuit ruled that merely consuming audiovisual materials is insufficient; the videos for which viewing history is disclosed must be the same materials or services the individual purchased, rented, or subscribed to. In contrast, on this issue, the Second and Seventh Circuits have held that the VPPA covers individuals who rent, purchase, or subscribe to any good or service provided by a videotape service provider, even if the good or service is not itself audiovisual.
These significant circuit splits on interpreting the VPPA and pixel litigation may eventually prompt Supreme Court review. For now, companies should monitor developments and reassess their use of tracking technologies to reduce litigation risk.
10. Data Security and Breach Enforcement Remains a Priority
Regulators continued to prioritize data security enforcement in 2025, imposing significant penalties for inadequate safeguards and poor incident response. At the federal level, the FTC relied on Section 5 of the FTC Act to frame security failures as unfair or deceptive practices, while signaling a shift toward more targeted enforcement focused on deception, fraud, and data security. On December 1, 2025, the FTC announced a settlement with Illuminate Education Inc. (Illuminate) following the education technology provider’s security incident. According to the FTC, Illuminate misrepresented “that it implemented reasonable measures to protect personal information against unauthorized access.” As part of the settlement, Illuminate is required to implement a data security program and delete unnecessary data.
State AGs also remained active, using unfair and deceptive acts and practices (UDAP) statutes and breach notification laws to pursue enforcement. These enforcement actions included Massachusetts’ August 2025 settlement with a property management company following multiple security incidents. According to the Massachusetts AG, Peabody Properties, Inc. (Peabody) failed to adequately protect Massachusetts residents’ personal information and violated the state’s data breach notification statute by unlawfully delaying required notifications to the AG and consumers. At both the state and federal levels, organizations should expect increased oversight and prioritize risk assessments, employee training, and breach readiness. Repeat offenders face escalating penalties, making proactive compliance essential.
***
2025 was a year of substantial change in the regulation of data privacy, in ways that were both expected and unexpected. While no federal privacy legislation or state comprehensive privacy laws emerged, highlights from the past year included critical federal regulations and significant state enforcement actions. Companies affected by these developments—likely most companies of any meaningful size in the United States—will need to understand their current compliance obligations as well as anticipate the potential legislative and regulatory changes ahead in 2026.