FTC enforcement trends: From straightforward actions to technical allegations

Amy Olivero

Andrew Folks

Cases analyzed from 2018 to 2024.

Published: May 2024

Until recently, the U.S. Federal Trade Commission offered little prescriptive guidance on data privacy and cybersecurity compliance. Instead, its enforcement actions have built a common law of privacy, providing examples of actions and behaviors it considers unreasonable practices. In 2014, the IAPP Westin Research Center analyzed 47 FTC actions spanning 12 years in detail, working backward from the descriptions of privacy and security practices alleged to violate the FTC Act in order to develop best practices and guidelines. In 2018, the IAPP revisited the case study to trace the evolution of enforcement trends, examining 50 cases from 2014-2018.

These studies illustrate how FTC enforcement has evolved from straightforward actions over misrepresentations in privacy policies and data transfer agreements to more technical allegations over issues like facial recognition technology, software development kit usage and expanded definitions of "unfair practices" and "sale of data." Accordingly, the FTC's privacy taxonomy continues to grow in complexity.

The IAPP has now analyzed 67 FTC enforcement actions between October 2018 and April 2024 in seven primary areas: children's privacy, health privacy, general privacy, data breaches and data security practices, vendor management and third-party access, employee data and management, and artificial intelligence governance. The analysis offers actions companies can take toward compliance, summarizes major trends in each area and dives deep into case studies of interest.

As noted, the IAPP constructed this case study series on the premise that the FTC has not historically provided much explicit compliance guidance to companies. However, this approach to understanding FTC enforcement actions may change in response to the agency's growing use of its business guidance and technology blogs and the forthcoming rulemaking signaled by its 2022 Advance Notice of Proposed Rulemaking on Commercial Surveillance and Data Security. Through this public consultation, the FTC is working to promulgate regulations that reduce ambiguity for organizations seeking to establish strong privacy and security protocols.

Children's privacy

As recent FTC leadership has prioritized protecting the data of more vulnerable populations, children's online privacy has come into focus through 14 enforcement actions since October 2018. With robust — and currently evolving — rules already in place under the Children's Online Privacy Protection Act, children's privacy has ripened within the enforcement agenda as regulators endeavor to protect children online.

Alleged COPPA violations range from simply failing to observe notice-and-consent requirements to knottier issues involving machine learning algorithms and dark patterns. The FTC's COPPA enforcement provides inferred guidance that businesses offering online services, apps or products directed to child users should:

On notice and choice

On data management and governance

Case study: US v. Epic Games

Subject: Children's privacy

In December 2022, the FTC filed two complaints against video game developer Epic Games for practices relating to the popular online multiplayer game Fortnite. A federal complaint alleged Epic violated the COPPA Rule and FTC Act, while a separate administrative complaint alleged the company illegally used dark patterns.

The federal complaint alleged Epic failed to notify and obtain consent from parents when collecting personal information from their children. It pointed to Epic's marketing and user surveys to show the company knew children comprised much of its user base. Epic also created a confusing and unreasonable process for parents to request deletion of their children's personal information — at times simply failing to honor requests. In Fortnite, it enabled live text and voice communications by default and allowed children and teens to play against and communicate with strangers, creating elevated risk of harassment and bullying.

Later, the administrative complaint spotlighted Epic's use of dark patterns, or deceptive or manipulative design features, to encourage in-game purchases. Epic allegedly maintained a confusing button configuration that led Fortnite players to incur unwanted charges. As a result, children frequently purchased in-game content without parental or card-holder consent. Epic further ignored more than one million user complaints and several employee concerns about these billing practices.

Businesses processing children's data and hosting accounts used by children should heed the FTC's Epic warnings: Children's privacy settings should not be public by default. Likewise, the FTC's increasing concern over dark patterns indicates businesses must avoid resorting to trickery to boost sales or accumulate personal information.

Health privacy

In the era of wearable devices and personal health mobile apps, the FTC has stepped up enforcement of health data privacy laws and regulations amid questions around the future of the Health Insurance Portability and Accountability Act and the lack of digital health data protections. The application of new enforcement tools, such as the Health Breach Notification Rule, has led businesses to reassess their digital health privacy practices.

The scope of what the FTC considers sensitive health information expanded under actions like GoodRx, where it clarified elevated privacy protections apply to any data that could lead to inferences about sensitive health information, such as life expectancy, sexual and reproductive health, and disability. Businesses collecting or processing personal health information must accordingly heed guidance inferred from relevant actions, including recommendations to:

On notice and choice

On data disclosures

On data management and governance

Case study: FTC v. Easy Healthcare

Subject: Health privacy

Easy Healthcare developed Premom, a mobile application that collected personal information on menstrual cycles, fertility and pregnancy and allowed users to import data from other apps. Easy Healthcare settled with the FTC over allegations that it deceptively shared users' sensitive personal information with third parties and shared health information for advertising purposes without obtaining consumers' affirmative express consent. The FTC further alleged that Easy Healthcare failed to notify consumers of these unauthorized disclosures, in violation of the HBNR, and failed to implement access controls to address privacy and security risks created by use of third-party automated tracking tools, like software development kits.

Premom illustrates several hallmarks of the modern FTC. It features a previously underutilized enforcement tool, the HBNR, to fill gaps in HIPAA privacy protections, establishing new protection for digital health data. Further, the Premom complaint exhibits a technical sophistication absent in past actions, including an exposition of SDKs and the responsibilities inherent in using them. Regardless of whether a business falls within the scope of existing health privacy laws, if it traffics in health data, it should study Premom to discern the responsibilities the FTC assigns to companies handling sensitive and/or personal health data.

Individual privacy, financial privacy and consumer reporting

Finance, consumer reporting, advertising technology and other personal data-centric businesses have been frequent enforcement targets as well. Where mature legislation like the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act detail requirements for financial institutions and consumer-reporting agencies, respectively, their lessons can often retain wider applicability.

Analysis of relevant FTC enforcement trends instructs businesses to:

On notice and choice

On data disclosures

On data management and governance

Case study: Everalbum

Subject: Individual privacy, financial privacy and consumer reporting

Everalbum enabled consumers to upload photos and videos to its cloud servers and used facial recognition technology to group photos by the individuals appearing in each photo. Most users were unaware of the facial recognition feature and unable to disable it. Everalbum allegedly trained its own facial recognition technology using photos from users who did not disable the feature and failed to delete photos and videos of users who requested account deactivation, contradicting its privacy policy. The FTC consent order required Everalbum to disclose its biometric information policy, obtain affirmative express consent from users who provided biometric information, and delete information and models trained on data from users who deactivated their accounts.

Everalbum marks the first FTC case focused on policing facial recognition, an increasingly popular technology that causes extensive concern for privacy advocates. It also spotlights what has quickly become the remedy du jour in the face of expanding algorithmic capacity: algorithmic disgorgement. It has thus far been ordered in six cases: Cambridge Analytica in 2019, Everalbum in 2022, WeightWatchers in 2023, Ring in 2023, Rite Aid in 2023 and Avast in 2024.

Data security practices and breach incident response

FTC data security guidance has grown significantly more granular in recent years. In the drawn-out litigation of LabMD, the 11th U.S. Circuit Court of Appeals found the FTC's order unenforceable due to insufficient specificity, citing an "indeterminate standard of reasonableness" conveyed by the agency. Since that 2018 decision, consent orders have enumerated considerably more specific prescriptions for businesses, as first demonstrated by the Lightyear Dealer Technologies case.

FTC enforcement in the wake of data breaches sticks to a fairly consistent playbook that features explicit and, at this point, routine security mandates. The agency does not tolerate businesses cutting corners or handling security incidents in bad faith. Practices discussed in the two earlier meta-analyses carry through to current guidance, with continued emphasis on strong passwords, strict access controls, encryption and other reasonable safeguards designed to mitigate identified risks.

With respect to data security, businesses should:

Case study: Uber

Subject: Data security practices and breach incident response

In May 2014, attackers gained unauthorized access to Uber drivers' personal information. In 2016, Uber suffered a second breach of drivers' and riders' data after attackers used a key posted by an Uber engineer on GitHub, a code-sharing website, to access consumer data stored on third-party cloud providers' servers. The breach exposed unencrypted files containing the names, email addresses, phone numbers and driver's license numbers of tens of millions of Uber riders and drivers.

After the attackers notified Uber that they compromised its databases and demanded ransom, Uber paid them USD100,000 through its "bug bounty" program, which is intended to financially reward individuals for responsible disclosure of security vulnerabilities.

Uber demonstrates the federal commitment to holding those who fail to protect privacy and data security accountable. The FTC complaint leaned on deception, alleging the company failed to live up to its claims that it closely monitored employee access to rider and driver data and deployed reasonable measures to secure personal information stored on third-party cloud providers' servers. The agency admonished the company's evasiveness in failing to notify the FTC or consumers for a year following the second breach, a course of action that rarely ends well for FTC defendants.

Editor's note: A previous version of this article contained a sentence that misstated the convictions against Uber's Chief Security Officer, which are undergoing appeal. That sentence has been removed.

Service providers and third-party access

Recent FTC actions also reflect an appreciation for the complexity of the data ecosystem. Companies often share employees' or customers' personal data with third parties to receive services or to generate revenue. Downstream management of personal information shared with service providers or third parties has been a point of emphasis under the FTC. Accordingly, businesses are advised to:

Case study: BLU

Subject: Service providers and third-party access

Mobile phone maker BLU contracted with China-based third-party service provider ADUPS Technology to "issue security and operating system updates" to its devices. The FTC alleged BLU and its president "failed to implement security procedures and to oversee the practices" of ADUPS, resulting in the collection of sensitive personal information without consumer knowledge and the creation of security vulnerabilities on consumer devices.

The settlement with BLU includes a mandated data security program that provides guidance applicable beyond the mobile device manufacturing industry, listing practices against which businesses may compare their existing data security programs.

BLU also continues a trend of naming individual defendants, typically high-ranking executives, as parties to an action and saddling them with extensive and ongoing obligations. Significantly, the portions of the consent order mandating action by BLU's CEO follow him to any companies he owns or controls.

Employee data and management

Employees and contractors can create liabilities for businesses, both through the personal information collected about them and through the information they can access. Companies targeted by FTC enforcement in this area have often failed to implement or follow a privacy or data security program.

Based on actions involving employee data and employee management, businesses should:

Case study: FTC v. Ring

Subject: Employee data and management

Ring, an Amazon subsidiary selling home security camera systems, allegedly permitted employees to access customers' videos, used customer videos to train algorithms without consent and maintained insufficient security safeguards. Every employee and hundreds of third-party contractors had access to every video from the cameras, regardless of whether their position required access. One employee viewed thousands of recordings of female users in intimate spaces, including bathrooms and bedrooms. In its response, the FTC emphasized businesses should not bury the purposes for data collection in privacy policies and mandated more controls and monitoring of employee access to consumer data.

The lessons of Ring instruct companies to not only protect personal information from outside actors, but also from those professionally responsible for internal data processing. The consent order provides insight into the FTC's requirements for employee access, largely mirroring the previously mentioned data security prescriptions. The FTC does not necessarily distinguish across worker classifications, either. Whether an employee or contractor has access to personal information is often immaterial — proper controls must be in place regardless.

AI governance

Of late, the FTC has brought AI under its spotlight to the extent that use of AI technology may result in harm to consumers. Early enforcement and guidance involving the emerging technology have applied existing frameworks for unfair and deceptive practices in other areas, including data privacy and security, to automated decision-making systems and machine-learning technologies. In turn, AI governance professionals must heed many of the lessons discussed herein. Failures to adhere to tenets of privacy program management, such as providing notice, obtaining consent, systematically assessing risk and honoring opt-outs, have been the basis of AI-related actions and provide insight into the FTC's priorities for regulating novel technologies.

With only five AI-related actions as of March 2024, the FTC has merely dabbled in regulating AI use. Nonetheless, its actions thus far offer valuable insight for businesses in the AI industry and for those looking to incorporate AI and other novel technologies into their products and services. Based on the FTC's nascent AI common law, businesses are recommended to:

Case study: Rite Aid

Subject: AI governance

In perhaps its deepest incursion into AI governance enforcement to date, the FTC alleged drug store chain Rite Aid used facial recognition technology based on AI and automated biometric detection systems to surveil and identify patrons it deemed likely to engage in shoplifting and other criminal behavior. Rite Aid implemented an app that alerted employees to take action against individuals who triggered matches with pictures of known criminals.

Despite countless false positives and other deficiencies, the business relied on the technology for eight years, resulting in emotional, reputational and other harms to its customers. As an example, a Black woman was asked to leave the store following an alert that indicated a match to "a white lady with blonde hair." Elsewhere, an 11-year-old girl was stopped and searched based on a false positive match, causing her mother to miss work due to the child's distress.

The Rite Aid case cautions businesses to develop and maintain AI governance programs tailored to their respective industries with consumers in mind. The chain failed to take reasonable measures to anticipate and prevent foreseeable harm that might, and later did, befall its customers. Many of these harms could likely have been prevented, mitigated or remedied by assessing risks associated with the technology's use, testing or assessing its accuracy before deployment and regularly thereafter, monitoring input and output quality, or training and overseeing the employees tasked with its operation. Had Rite Aid engaged in deliberate and regular data and technology governance, it might have avoided many of the pitfalls that invited regulatory scrutiny.

Wrap-up

Entering its 110th year, the FTC has continued to evolve in how it approaches consumer privacy protection and sets its enforcement priorities. Its development of technical expertise has led the agency to a deeper understanding of emerging industries, demonstrated by the growing frequency of references to dark patterns, adtech tools and, of course, AI. It has recognized the vulnerable contours of privacy, concentrating on safeguarding more sensitive categories of data, namely, information related to children, health and personal finances. Its jurisdiction has permeated throughout business relationships and supply chains, extending its influence into third-party access and service-provider management. It has expanded the enforcement tools at its disposal by creatively using the HBNR and GLBA, naming individual defendants, and demanding algorithmic deletion or disgorgement.

These trends will doubtlessly continue as the FTC continues to usher businesses toward its conception of reasonable data privacy and security compliance through enforcement, regulation and guidance.

Additional resources

Previous editions of this report

What FTC Enforcement Actions Teach Us About the Features of Reasonable Privacy and Data Security Practices
In the first edition of this report, former IAPP Westin Fellow Patricia Bailin analyzed 47 FTC cases from 2002 to 2014.

What FTC Enforcement Actions Teach Us About the Makings of Reasonable Privacy and Data Security Practices: A Follow-Up Study
In the second edition of this report, former IAPP Westin Fellow and current IAPP Principal Researcher of Privacy Law and Policy Müge Fazlioglu examined 50 FTC cases from 2014 to 2018.

Guide to FTC Privacy Enforcement
This guide from 2017 describes the various paths the Federal Trade Commission may pursue when it brings privacy cases under its primary consumer protection authority, Section 5(a) of the FTC Act.

FTC resources

The Rise of Prescriptive Technical Safeguards in FTC Settlements
This white paper reviews U.S. Federal Trade Commission settlements that have required increasingly specific remedies.

FTC Privacy Rulemaking – The Steps to Get There
This infographic outlines the key rulemaking steps from the FTC on prohibiting unfair and deceptive acts or practices.

IAPP Guide to FTC Privacy Enforcement
This guide from the IAPP Westin Research Center describes the various paths the FTC may pursue when it brings privacy cases under its primary consumer protection authority, Section 5(a) of the FTC Act.

FTC Cases and Proceedings
This webpage lists all U.S. Federal Trade Commission cases and proceedings and allows filtering by name, date, enforcement type and more.

Privacy enforcement resources

Enforcement topic page
On this topic page you can find the IAPP's collection of coverage, analysis and resources related to privacy enforcement.

Global Privacy and Data Protection Enforcement Database
This tool contains a collection of enforcement actions from all over the world.

US federal privacy resources

US Federal Privacy topic page
The IAPP has been keeping track of the goings-on regarding a federal privacy law, and we've collected our news and resources here.

US Federal Privacy Legislation Tracker
This tracker organizes privacy-related bills proposed in the U.S. Congress to keep our members informed of developments within the federal privacy landscape.