February 8, 2023

What’s Coming in 2023? Top 2022 Privacy Events, Trends, and 2023 Forecast


We want to know if you are planning to attend this year’s IAPP Global Privacy Summit from April 4-5, just steps away from Arnold & Porter’s office in Washington, DC; we would love to see you at our reception on Tuesday, April 5. Please let us know if you or your colleagues will be in town by adding yourself to our IAPP Global Privacy Summit mailing list.

Privacy arguably had its biggest year in 2022 in terms of the number of developments and the speed at which they cropped up. Many of the developments that occurred last year will guide the course of privacy law this year. Here’s a quick look at some of the top developments that we monitored throughout 2022 and will continue to watch in 2023.

European Data Transfers: Better?

EU-US Data Transfer Landscape Shows Improvement . . .

Following the Schrems II decision in July 2020, companies have been scrambling to update standard contractual clauses with vendors and other third parties (often thousands of contracts) to meet the new requirements set forth by the European Commission (EC), in addition to completing complex transfer impact assessments. The end of the year saw some improvement in the data transfer landscape when, on December 13, 2022, the European Commission released a Draft Adequacy Decision for the EU-US Data Privacy Framework (Framework), which will allow companies certified under the program to once again transfer data from the EU to the United States without an additional transfer mechanism (such as standard contractual clauses). While this Framework will be subject to attack (and in fact, that attack has been foreshadowed), the adequacy decision seeks to address the concerns raised by Schrems II,1 including by relying on the safeguards implemented by the Biden Administration’s Executive Order 14086. Most importantly, those safeguards narrow the personal data that may be sought as part of the United States’ surveillance and national security activities and create redress mechanisms for EU data subjects.

The Framework is an important step toward establishing a successor to the former EU-US data transfer framework (i.e., the Privacy Shield). Under the Framework, the European Commission concluded that the United States’ data protection system is essentially equivalent to the EU’s. US companies can become certified under the Framework by committing to the EU-US Data Privacy Framework Principles. Once certified, companies will be subject to the investigatory and enforcement powers of the Federal Trade Commission (FTC) or the Department of Transportation. Once the adequacy decision becomes final (approvals from the European Data Protection Board, the European Parliament, and a committee of EU Member State representatives must be obtained), data can flow freely from the EU to certified US companies.

. . . But the UK’s Proposed Data Protection Law Reform May Impede Data Transfers Between the UK and the EU

In May 2022, the British government announced its intention to reform UK data protection laws. The government looked to take advantage of Brexit and to reconcile the conflicting aims of creating a more business-friendly data regime that promotes growth and innovation while protecting individuals’ privacy rights. The House of Commons introduced the Data Protection and Digital Information Bill (Bill) in July 2022 to realize these post-Brexit intentions. Although the Bill amounted to little more than an evolution of the existing UK General Data Protection Regulation (GDPR) rather than a radical overhaul, the changes it introduces to international data transfers potentially threaten the EC’s UK adequacy finding made in 2021. The UK adequacy decision enables the free flow of personal data between the EU and the UK following Brexit; the EC may withdraw it if the UK materially diverges from European data protection standards. Without the adequacy decision, EU businesses would be prevented from freely sharing personal data with the UK and would need to rely on an alternate data transfer mechanism.

The Bill was postponed indefinitely in September 2022. One month later, the UK’s Secretary of State for Digital, Culture, Media and Sport (DCMS) restated the government’s plan to reform data protection law in the UK, criticizing the GDPR as red tape inherited from the EU, and the DCMS’s Deputy Director announced in November 2022 that the latest consultation on the Bill would commence shortly. Reforming UK data protection law seems a surprising priority at the current time, and the government’s claims that the UK GDPR is overly restrictive of business uses of data seem difficult to substantiate. If the government proceeds with its proposed reform and triggers revocation of the UK adequacy finding, multinational companies will inevitably face increased costs in developing appropriate data transfer solutions.

The United States: Heading Towards a Centralized Approach?

A US Federal Privacy Law Is Proposed . . .

For decades, the US Congress has repeatedly tried to pass a law to protect citizens’ privacy while states have introduced and passed their own privacy laws. 2022 finally saw some progress on the federal front: in July, the House Energy and Commerce Committee advanced the American Data Privacy and Protection Act (ADPPA). The ADPPA is a comprehensive privacy bill that would establish national standards and safeguards for personal data collected by businesses. It also represents the government’s latest effort to address the current patchwork of industry-specific privacy laws.

Similar to already established state consumer privacy laws, the ADPPA grants consumers certain rights, including the rights to access, correct, delete, and transfer their personal data. It also contains notice and purpose-limitation provisions related to the collection and use of covered data, as well as opt-out and consent requirements.

The ADPPA differs from previous attempts by Congress to pass a comprehensive privacy bill by including a preemption provision and creating a private right of action. The ADPPA would preempt current state privacy laws that regulate the same issues, while listing state laws that must be preserved, including the California Consumer Privacy Act’s private right of action, the Illinois Biometric Information Privacy Act, and state data breach notification laws. Additionally, subject to certain limitations, the ADPPA includes a delayed private right of action: beginning two years after the enactment of the ADPPA, individuals may sue covered entities for damages, injunctions, litigation costs, and attorneys’ fees.

Although the ADPPA received bipartisan and bicameral support, concerns have been raised over its preemption provisions, purported enforcement gaps, and failure to impose a “duty of loyalty” requirement on covered entities. ADPPA supporters hope that the public’s increasing concerns about how businesses collect, store, protect, and monetize personal data will push the bill to become a priority in 2023.

. . . But US States Continue To Introduce and Pass Privacy Laws

While the ADPPA sits on the House Union Calendar awaiting further action, states have continued to pass privacy legislation. In 2022, Connecticut and Utah joined California, Colorado, and Virginia in passing comprehensive privacy laws. Several other states, including Indiana (SB 358), Iowa (House File 2506), Kentucky (HB 586), Louisiana (House Bill 987), Tennessee (HB 1467 / SB 1554), and Wisconsin (Assembly Bill 957), have considered similar bills. These laws, whether enacted or not, follow a similar approach to those existing in other states, but each contains nuances that will create additional compliance obligations for companies. We expect to see continued activity in this space in 2023.

All of this activity occurred while regulators in states with existing privacy laws issued draft implementing regulations. For instance, the California Privacy Rights Act’s draft regulations impose additional notice requirements, including a requirement to provide both a notice at collection and a privacy notice, and impose potentially onerous requirements related to recognizing opt-out preference signals.

Litigation Clamps Down on Use of Tracking Technologies, While Data Uses and Access Grow

Tracking Technologies Face Trouble . . .

2022 saw significant focus on website and application trackers such as cookies and pixels. A rash of lawsuits was filed under the antiquated Video Privacy Protection Act (VPPA) against companies that play video on their websites while simultaneously using certain pixels that transmit data about that video; wiretapping statutes were revived to attack the use of chatbots and session replay software; and disclosures made to third parties through cookies and pixels on various websites were the subject of data breach and similar class action litigation.

In addition, the Department of Health and Human Services (HHS) Office for Civil Rights (OCR) indicated that it plans to crack down on covered entities’ (hospitals’ and other healthcare providers’) use of tracking technologies on their websites. Last December, OCR issued a Bulletin on online tracking technologies stating that even data of some unauthenticated users, i.e., casual visitors to a covered entity’s website, may be considered protected health information under the Health Insurance Portability and Accountability Act (HIPAA) (and thus subject to HIPAA disclosure limitations and protections, among other things). And under US state privacy laws, behavioral or targeted advertising continues to require changes to ensure that consumers in those states can opt out.

Many healthcare companies have come under attack as a result of using tracking tools, including the pixel. Indeed, one health system that notified individuals about its use of a tracking tool is now the subject of class action litigation.

In addition, recent rulings in the Third and Ninth Circuits have ushered in additional privacy class actions alleging that the use of two specific online tracking technologies violates state wiretapping laws: (1) “session replay” tools that collect information about a user’s interactions with a webpage (e.g., cursor movements, clicks, pageviews, details entered into a webform) and (2) “chatbots” that streamline customer service inquiries by employing artificial intelligence and recording those conversations. Both the Third Circuit (in Popa v. Harriet Carter Gifts, Inc.2) and the Ninth Circuit (in Javier v. Assurance IQ, LLC3) have held that the wiretapping statutes at issue (Pennsylvania’s and California’s, respectively) apply to the use of chatbots and session replays. These rulings are certain to continue to spur litigation over these types of tracking technologies.

Finally, a dramatic surge of class action litigation in 2022 alleged violations of the VPPA,4 raising compliance concerns for virtually any company serving video content in the US. The VPPA was passed in 1988 after a journalist published the video cassette rental history of US Supreme Court nominee Robert Bork, which the journalist obtained from Mr. Bork’s video rental store. The VPPA essentially prohibits “video tape service providers”5 from knowingly disclosing a consumer’s personally identifiable information to a third party without the consumer’s consent. Although the VPPA contemplated only video tapes at the time it was passed, “video tape service providers” under the statute broadly encompasses any business engaged in the “rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”6 For this reason, occasional VPPA claims against video streaming services have survived motions to dismiss in the recent past.

In 2022, however, more than 80 VPPA claims were brought at the state and federal levels. The lawsuits targeted “traditional” video streaming providers in addition to newspapers, radio networks, health-focused websites, and even manufacturers of retail products whose businesses are not generally associated with video content at all. The common denominators among this tsunami of litigation are that the defendants are all alleged (1) to have hosted video content on their websites or mobile apps and (2) to have used web beacons, pixels, and/or website cookies that disclose a consumer’s identity and some information about the video content watched to a third party, such as a social media site or online advertising network.

. . . But Health Data Use in the Post-Roe World Leaves Health App Users Vulnerable to Prosecution

In June 2022, the Supreme Court of the United States made waves when it released its decision in Dobbs v. Jackson Women's Health Organization7 and overturned the constitutional right to abortion. The decision reverberates far beyond individual fundamental rights: it impacts healthcare, research, and privacy and data protection.

The Court’s decision leaves individuals at increased risk of civil and criminal penalties for performing abortions after a certain gestational age, while legal protections, rights, and individual control over the personal information that could be used in determining such penalties remain limited.

Multiple pieces of personal information may, when combined, lead to the inference that an individual sought abortion-related services. For example, someone may use an app or website to track hormonal trends and predict fertility, conduct web searches on pregnancy or pregnancy-related healthcare, communicate with healthcare providers or other individuals about the same, or visit certain pregnancy-related healthcare offices while using a GPS-enabled device. Law enforcement may have the ability to subpoena this information to support a suspicion that an abortion was performed. Such personal information has also become far more valuable in the past year, and data brokers have caught on. As a result, there is increased incentive for data brokers to find and compile all of this information and sell it.

HIPAA falls short of protecting protected health information (PHI) from such subpoenas and other mechanisms to obtain information: (1) while HIPAA prohibits the disclosure of PHI unless an individual has provided authorization, that protection is unavailable when law enforcement is looking for evidence of a crime, and (2) HIPAA only applies to covered entities (and their business associates), which include healthcare providers, health plans, and healthcare clearinghouses. Data brokers and apps that store personal information input by the user or track geolocation do not have to answer to HHS, although they will be subject to FTC enforcement (and potentially enforcement by other regulators).

Individuals who reside in states without any comprehensive privacy laws are left with little to no power to limit these activities. Currently, no state that prohibits abortion also has a comprehensive privacy law in place. Dobbs has increased interest in and motivation to pass comprehensive privacy laws, although we do not expect to see passage of such a federal law any time soon.

Until a national privacy law is enacted that codifies the principles of data minimization and purpose limitation and requires an individual’s express affirmative consent before their sensitive personal information is shared with third parties, much of this information may be available to law enforcement or others without individuals’ knowledge or consent.

Enforcement Is on the Rise

Regulators’ Increasingly Aggressive Use of Enforcement Powers for Privacy and Data Security

The FTC delivered on Chair Lina Khan’s publicly stated intention to use the FTC’s enforcement power more aggressively for data privacy and security purposes in 2022. Following discovery of a widespread vulnerability in the Log4j software library, the FTC indicated that it would “use its full legal authority to pursue companies that fail to take reasonable steps to protect consumer data from exposure as a result of Log4j, or similar vulnerabilities in the future.” The FTC also brought a suit against Kochava, Inc. for its sale of precise geolocation data that revealed customers’ visits to “sensitive locations” such as abortion clinics, places of worship, and homeless shelters. The FTC’s complaint alleged that Kochava violated Section 5 of the FTC Act by selling these customized data feeds to assist purchasers with “advertising and analyzing foot traffic at stores or other locations.”

The FTC also secured two record-breaking settlement agreements with Epic Games, Inc., the creator of the popular video game Fortnite: a $275 million monetary penalty for illegally collecting personal information from children under the age of 13 without parental consent and a $245 million refund to consumers for Epic Games’ alleged use of “dark patterns” to manipulate users into making unintentional purchases. The first penalty is the largest ever obtained for violating an FTC rule, while the second is the FTC’s largest refund amount in a gaming case and its largest administrative order in history.

The FTC also recently entered into a consent order with online alcohol marketplace Drizly, LLC and its CEO for inadequate cybersecurity practices, which the FTC alleged resulted in a data breach exposing the personal information of about 2.5 million customers. Among other things, the FTC’s order obligates Drizly to destroy any personal data it collected that is not necessary for it to provide products or services to consumers and to refrain from maintaining any personal data not necessary for the specific purposes set forth in a retention schedule.

The Irish Data Protection Commission fined Meta Ireland €390 million; the fines issued alongside the European Data Protection Board’s (EDPB’s) binding decisions announcing that Facebook’s and Instagram’s lawful bases for processing personal data for personalized advertising were not valid. The EDPB decisions rejected the contract basis, concluding that contract could not be relied upon because behavioral advertising is not a core element of the services.

And More Is Yet to Come

On top of these enforcement actions, regulators have indicated broad enforcement interest. The FTC flexed its rulemaking muscles in the summer of 2022, publishing an Advance Notice of Proposed Rulemaking (ANPR) to address what the FTC termed “harmful commercial surveillance and lax data security.” According to Chair Khan, adopting regulations would not only enable the agency to fine first-time violators, which it generally lacks authority to do under the FTC Act, but also serve to prevent punishable actions and the injuries they cause, many of which are difficult to remediate through case-by-case enforcement. The FTC held a public forum and invited comments on the ANPR, which it is now reviewing and presumably will take into account in issuing proposed regulations addressing the topics discussed in the ANPR.

If the FTC does move forward with such regulations, it apparently will address a full range of “commercial surveillance” activities, which the ANPR broadly defines to include “the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information,” i.e., virtually any type of processing of consumer data and not just individually identifiable information. Among the key areas of focus for the rulemaking are:

  • data minimization, purpose-based limitations on data use, and other restrictions on the collection, use, retention, and transfer of consumer data;
  • limits on personalized or targeted advertising and the use of biometric technologies;
  • standards for the use of artificial intelligence and other automated decision-making systems, such as requirements for algorithmic accuracy, validity, and reliability;
  • transparent and easily executed methods for consumers to exercise control over their personal data; and
  • mandates for data security.

The FTC has embarked on what will likely be a lengthy rulemaking process. Throughout this process, businesses subject to the agency’s authority would do well to consider how they might engage with and educate the FTC on the benefits and drawbacks of particular regulatory approaches. Once it publishes proposed rules, the agency will be seeking feedback, and being prepared to take well-reasoned positions on the issues raised in the ANPR will give businesses an important leg up in having a meaningful influence on any final regulations.

And the French data protection authority has indicated in its action plan that it will focus on and prioritize understanding mobile app data flows, moving towards requiring increased transparency to better protect users’ privacy.

Meanwhile, on October 12, 2022, the EU published the Digital Markets Act (DMA), which targets “gatekeepers” (companies providing core platform services such as search engines, social media, video sharing platforms, browsers, online advertisers, and the like) that meet certain volume and revenue requirements. Later that month, on October 27, 2022, the EU published the Digital Services Act (DSA), which imposes many obligations on similar providers of digital services (including online social media and similar platforms, search engines, hosting services, caching services, and others). While many of the obligations of the DMA and DSA relate to content moderation, antitrust matters, and other non-privacy issues, the DMA and DSA interact with other privacy considerations companies are facing, including requirements to refrain from using dark patterns, to process personal data collected from third-party services for online advertising only with prior consent, and to collect consent for certain secondary data uses, as well as prohibitions on targeted advertising based on profiling of sensitive or children’s data.

At the same time, the European Commission has recognized the need for more data sharing. The European Data Governance Act (DGA) was published on June 3, 2022. The DGA allows for certain re-use of personal data and sets up a framework to be certified as a trustworthy data collector and sharer. The EC also proposed a regulation regarding a European Health Data Space, intending to ensure access, sharing, and use of health data across the EU (not dissimilar to some of the obligations contained in the Cures Act in the United States).

As 2023 plays out, Arnold & Porter’s Privacy, Cybersecurity & Data Strategy lawyers will continue to track and advise on these and other privacy and cybersecurity issues that affect our clients’ businesses. To keep up with us and the latest privacy and cybersecurity legal developments, sign up to receive our advisories and updates.

© Arnold & Porter Kaye Scholer LLP 2023 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.

  1. The reasons for invalidating the previous adequacy decision were mainly focused on two points: (1) disproportionate and unnecessary access by US governmental bodies to the personal data of EU data subjects and (2) no effective redress mechanism available for EU data subjects.

  2. Popa v. Harriet Carter Gifts, Inc., 52 F.4th 121 (3d Cir. Oct. 18, 2022).

  3. Javier v. Assurance IQ, LLC, No. 21-16351, 2022 U.S. App. LEXIS 14951 (9th Cir. May 31, 2022).

  4. Video Privacy Protection Act, 18 U.S.C. § 2710.

  5. 18 U.S.C. § 2710(a)(4).

  6. Id.

  7. 142 S. Ct. 2228 (2022).