On April 7th, the American Privacy Rights Act of 2024 (APRA) ‘discussion draft’ was introduced. The APRA aims to establish the first-ever federal standard for comprehensive data privacy and security regulation. Whether or not the Act ultimately passes, some of its obligations and requirements set the tone and signal how privacy regulation may evolve globally. In this article we review the proposed Act and compare it with the General Data Protection Regulation (GDPR).
The American Privacy Rights Act – Definitions
- Covered entity – any entity that determines the purpose and means of collecting, processing, retaining, or transferring covered data and is subject to the FTC Act, including common carriers and certain nonprofits. Small businesses, governments, entities working on behalf of governments, the National Center for Missing and Exploited Children (NCMEC), and, except for data security obligations, fraud-fighting nonprofits are excluded. The definition is somewhat similar to the definition of a Data Controller under the GDPR: “determines the purposes and means of the processing of personal data”. However, unlike the GDPR, the APRA carves out governments and excludes small businesses entirely.
- Small Businesses – Unlike the GDPR, the APRA excludes small businesses, defined as businesses that, for the 3 preceding calendar years (or for the period during which the covered entity has existed, if shorter), had average annual gross revenue of not more than $40 million; did not, on average, annually collect, process, retain, or transfer the covered data of more than 200,000 individuals for any purpose other than a requested service or product (provided all covered data for such purpose was deleted or de-identified within 90 days); and did not transfer covered data to a third party in exchange for revenue or anything of value.
- Large data holder – The APRA imposes additional requirements on covered entities that have $250,000,000 or more in annual revenue and that collect, process, retain, or transfer the covered data of more than 5,000,000 individuals (or 15,000,000 portable devices or 35,000,000 connected devices linkable to an individual) or the sensitive covered data of more than 200,000 individuals (or 300,000 portable devices or 700,000 connected devices). The revenue and data-volume thresholds in this definition and in the small business definition are illustrated in the sketch after this list.
- Biometric information – any covered data that is specific to an individual, is generated from the measurement or processing of the individual’s unique biological, physical, or physiological characteristics, and is linked or reasonably linkable to the individual. It does not include digital or physical photographs or audio or video recordings.
- Collection – buying, renting, gathering, obtaining, receiving, accessing or otherwise acquiring covered data by any means. This definition is slightly narrower than the definition of “processing” under the GDPR: “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction;”
- Covered Data – information that identifies or is linked or reasonably linkable to an individual or device. It does not include de-identified data, employee data, publicly available information, inferences made from multiple sources of publicly available information that do not meet the definition of sensitive covered data and are not combined with covered data, and information in a library, archive, or museum collection subject to specific limitations.
- Excluding de-identified data (rather than anonymized data) might cause trouble in the future, as much de-identified (pseudonymized) data can still be linked to an individual. Furthermore, excluding employee data and publicly available information (which US companies LOVE to use) might not serve the goal of this Act. So let’s see how publicly available information is defined:
- Publicly available information – information that has lawfully been made available to the general public. It does not include derived data that reveals sensitive covered data, biometric or genetic information, covered data combined with publicly available information, or obscene or nonconsensual intimate images.
- Sensitive covered data—a subset of covered data that includes government identifiers; health information; biometric information; genetic information; financial account and payment data; precise geolocation information; log-in credentials; private communications; information revealing sexual behavior; calendar or address book data, phone logs, photos and recordings for private use; any medium showing a naked or private area of an individual; video programming viewing information; an individual’s race, ethnicity, national origin, religion, or sex, in a manner inconsistent with a reasonable expectation of disclosure; online activities over time and across third party websites, or over time on a high-impact social media site; information about a covered minor; and other data the FTC defines as sensitive covered data by rule.
- Targeted advertising—displaying an online advertisement based on known or predicted preferences or interests associated with an individual or device identified by a unique identifier. It does not include advertisements in response to an individual’s specific request for information; first-party advertising; contextual advertising; or processing data for measurement.
- Service Provider – an entity that collects, processes, retains, or transfers covered data for the purpose of performing 1 or more services or functions on behalf of, and at the direction of, a covered entity.
- Third Party – any entity that receives covered data from another entity and is not a service provider with respect to such data. It does not include an entity that collects covered data from another entity if the two entities are related by common ownership or corporate control and share common branding.
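For readers who think in code, here is a minimal, purely illustrative Python sketch of the small business and large data holder thresholds summarized above. The `EntityProfile` fields and the `classify_under_apra` function are hypothetical names; the sketch ignores the device-count thresholds, the 90-day deletion condition, and many other statutory nuances, and it is not legal advice.

```python
# Illustrative sketch only: a simplified reading of the APRA draft's revenue and
# data-volume thresholds as summarized above. All names and logic are hypothetical.
from dataclasses import dataclass


@dataclass
class EntityProfile:
    avg_annual_revenue_usd: float    # average over the 3 preceding calendar years
    individuals_covered_data: int    # individuals whose covered data is handled
    individuals_sensitive_data: int  # individuals whose sensitive covered data is handled
    sells_covered_data: bool         # transfers covered data for revenue or anything of value


def classify_under_apra(e: EntityProfile) -> str:
    """Rough classification per the thresholds above (not legal advice)."""
    # Small business: <= $40M average annual revenue, <= 200,000 individuals'
    # covered data beyond a requested product/service, and no sale of covered data.
    if (e.avg_annual_revenue_usd <= 40_000_000
            and e.individuals_covered_data <= 200_000
            and not e.sells_covered_data):
        return "small business (largely excluded)"
    # Large data holder: >= $250M annual revenue AND covered data of more than
    # 5,000,000 individuals or sensitive covered data of more than 200,000 individuals.
    if (e.avg_annual_revenue_usd >= 250_000_000
            and (e.individuals_covered_data > 5_000_000
                 or e.individuals_sensitive_data > 200_000)):
        return "large data holder (additional obligations)"
    return "covered entity (baseline obligations)"


print(classify_under_apra(EntityProfile(300_000_000, 6_000_000, 50_000, True)))
# -> large data holder (additional obligations)
```

The point is simply that the APRA, unlike the GDPR, tiers obligations by revenue and data volume rather than applying a single baseline to every organization.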
The American Privacy Rights Act – General Principles
- Data Minimization: The APRA takes the same general approach to data minimization as the American Data Privacy and Protection Act (ADPPA) – data is not to be processed beyond what is “necessary, proportionate, and limited” to provide or maintain a specific product or service requested by an individual, or in accordance with a list of “permissible purposes.” The APRA has reworded the permissible purpose involving targeted advertising, but still appears to foreclose advertising based on sensitive data, which includes “information revealing an individual’s online activities over time and across websites or online services.”
- A covered entity cannot collect or transfer to a third party biometric or genetic information without the individual’s affirmative express consent, unless expressly allowed by a stated permitted purpose.
- A covered entity cannot transfer sensitive data to a third party without the individual’s affirmative express consent, unless expressly allowed by a stated permitted purpose.
- Permitted purposes include protecting data security; complying with legal obligations; effectuating a product recall or fulfilling a warranty; conducting market research (which requires affirmative express consent for consumer participation); de-identifying data for use in product improvement and research; preventing fraud and harassment; responding to ongoing or imminent security incidents or public safety incidents; processing previously collected nonsensitive covered data for advertising.
- Transparency: As under the GDPR, covered entities and service providers must have publicly available privacy policies detailing their data privacy and security practices.
- The privacy policies must identify the entity and disclose the categories of data collected, processed, or retained; the purposes of the data processing; the categories of service providers and third parties to which data is transferred; the name of any data brokers to which data is transferred; the length of time data is retained; data security practices; and the effective date of the privacy policy (a rough sketch of these elements follows this list).
- Large data holders are subject to additional requirements: they must retain and publish their privacy policies from the past 10 years and also provide a short-form notice of their policies.
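To make the list of required disclosures above more concrete, here is a small, purely illustrative sketch that models them as a data structure with a completeness check. The class and field names (`ApraPrivacyPolicy`, `missing_disclosures`, and so on) are hypothetical and simply mirror the bullet above; the draft’s actual text and the FTC’s eventual rules control what a compliant policy must contain.

```python
# Illustrative only: a hypothetical structure mirroring the disclosure elements
# summarized above; not the statutory text.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class ApraPrivacyPolicy:
    entity_identity: str
    data_categories: List[str]              # categories collected, processed, or retained
    processing_purposes: List[str]
    service_provider_categories: List[str]
    third_party_categories: List[str]
    data_broker_names: List[str]            # data brokers named individually
    retention_period: str
    security_practices_summary: str
    effective_date: date
    prior_versions_10y: List[str] = field(default_factory=list)  # large data holders only
    short_form_notice: str = ""                                  # large data holders only


def missing_disclosures(policy: ApraPrivacyPolicy) -> List[str]:
    """Return the names of required fields that are still empty (illustrative check)."""
    required = ["entity_identity", "data_categories", "processing_purposes",
                "service_provider_categories", "third_party_categories",
                "retention_period", "security_practices_summary"]
    return [name for name in required if not getattr(policy, name)]
```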
- Security: Covered entities and service providers must establish data security practices that are appropriate to the entity’s size, the nature and scope of the data practices, the volume and sensitivity of the data, and the state of the art of safeguards.
📈 The Act’s rigorous consent requirements for handling sensitive data like biometric and genetic information underline an increasing prioritization of individual control over personal information in response to growing privacy concerns.
The American Privacy Rights Act – Individual Control Rights
- The bill would also require covered entities to give individuals the right to access, correct, delete, and export their data, as well as to opt out of targeted advertising and data transfers.
- Covered entities must comply with individual control rights within specified timeframes, and large data holders must report metrics related to the requests they process (see the illustrative sketch after this list).
- The Act would prohibit the use of covered data to discriminate against consumers and provide consumers with the right to opt out of the use of algorithms for consequential decisions.
- Covered entities are prohibited from using dark patterns to divert an individual’s attention from a notice required by the Act, to impair the exercise of any right under the Act, or to obtain consent under the Act.
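As a rough illustration of what honoring these rights within fixed timeframes and reporting request metrics might look like operationally, here is a hypothetical sketch. The request types mirror the rights listed above, but the `RequestRecord` structure and the `response_window_days` parameter are placeholders, not the timeframes or reporting format the draft actually prescribes.

```python
# Illustrative sketch only: a hypothetical intake record for individual rights
# requests. The response deadline is a placeholder, not the draft's timeframe.
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum


class RightsRequest(Enum):
    ACCESS = "access"
    CORRECT = "correct"
    DELETE = "delete"
    EXPORT = "export"
    OPT_OUT_ADS = "opt_out_targeted_advertising"
    OPT_OUT_TRANSFERS = "opt_out_data_transfers"


@dataclass
class RequestRecord:
    request_type: RightsRequest
    received: date
    completed: date | None = None

    def due_by(self, response_window_days: int) -> date:
        # response_window_days is a placeholder; the APRA draft sets the actual timeframes.
        return self.received + timedelta(days=response_window_days)

    def overdue(self, today: date, response_window_days: int) -> bool:
        return self.completed is None and today > self.due_by(response_window_days)


def completion_metrics(records: list[RequestRecord]) -> dict[str, int]:
    """Simple counts per request type, akin to the metrics large data holders must report."""
    counts: dict[str, int] = {}
    for r in records:
        if r.completed is not None:
            counts[r.request_type.value] = counts.get(r.request_type.value, 0) + 1
    return counts
```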
📈 This focus on individual rights is coupled with specific prohibitions against discriminatory uses of data and the employment of dark patterns that manipulate consumer behavior. These measures collectively signal a legislative move towards ensuring more transparent, fair, and consumer-friendly data practices, emphasizing individual autonomy in managing personal data.
Privacy and Security Officers
- All covered entities must designate one or more covered employees to serve as privacy or data security officers.
- Large data holders are required to designate both a privacy and a data security officer.
Privacy Impact Assessment & Risk Assessment
- Covered entities and service providers must assess vulnerabilities and mitigate reasonably foreseeable risks to consumer data.
- Large data holders must conduct privacy impact assessments on a biennial basis.
Service Providers & Third-Parties
- Similar to the GDPR requirement that processors act on the instructions of the controller, service providers must adhere to the instructions of a covered entity and assist the entity in fulfilling its obligations under the Act.
- Service providers must cease data practices where they have actual knowledge that a covered entity is in violation of this Act.
- Service providers must maintain the security and confidentiality of covered data and allow for independent assessors to assess their security practices.
- Covered entities must exercise due diligence in the selection of service providers and in deciding to transfer covered data to a third party.
- Third parties may only process, retain, and transfer data received from another entity for a purpose consistent with what the covered entity disclosed in its privacy policy; or, for sensitive covered data, a purpose for which the consumer provided affirmative express consent.
📈 The focus on due diligence requirements for service providers and third parties represents a significant trend toward strengthening the accountability chain in data management. This provision ensures that covered entities must not only be compliant themselves but must also ensure that their partners uphold the same standards of data protection.
Data Brokers
- The FTC is directed to establish a data broker registry, and data brokers affecting the data of 5,000 or more individuals must register each calendar year.
Privacy Enhancing Technologies
- APRA would direct the FTC to establish a pilot program to encourage private sector use of privacy-enhancing technology to protect personal data.
📈 The data broker registry marks a key advancement in formally recognizing and regulating these entities within the privacy law framework. Furthermore, the encouragement for companies to adopt Privacy Enhancing Technologies (PETs) underscores a push towards innovation in data protection.
The American Privacy Rights Act – Enforcement
- State Enforcement: Section 18 of the APRA draft authorizes enforcement by state attorneys general, chief consumer protection officers, and any other “officer of a State.” These authorized state-level APRA enforcers may seek injunctive relief; civil penalties, damages, restitution, and other consumer compensation; attorneys’ fees and other litigation costs; and other relief, as appropriate.
- FTC Enforcement: Section 17 provides for FTC enforcement. The FTC is permitted in some areas and required in others to issue regulations under the APRA, including for privacy impact assessments, universal opt-out mechanisms, process-based data security, exceptions “to protect the rights of individuals,” exceptions to “alleviate undue burdens on covered entities,” and exceptions to “prevent unjust or unreasonable outcomes from the exercise of” the APRA draft’s access, correction, deletion, and portability rights.
- Private Right of Action: Section 19 of the APRA draft allows a limited private right of action (PRA) for violations of specific APRA provisions, including transfers of sensitive covered data. Individuals may recover actual damages, injunctive relief, declaratory relief, and reasonable attorneys’ fees and costs for violations of most of the APRA’s provisions; any amount a court orders an entity to pay may be offset by recovery for the same violation pursuant to an FTC or State action.
- 30 days prior to initiating an action, unless a “substantial privacy harm” is alleged, the individual must provide the covered entity with a “written notice identifying the specific provisions of [APRA]” that the individual alleges the covered entity violated.
📈 This suggests an evolving legal landscape where both the state and private citizens can actively pursue remedies for privacy violations, potentially leading to stricter compliance and enforcement practices across various sectors.
What’s next?
If the process continues as intended, additional committee hearings will be held, amendments will likely be made during mark-ups, and the language will likely be debated on the floor of both chambers before a vote. If the House and the Senate pass different versions of the bill, the differences will need to be ironed out before identical, compromise legislation is sent to the president for signature. Privacy in the US? Let’s see!
Noa Kahalon
Noa is a certified CIPM, CIPP/E, and a Fellow of Information Privacy (FIP) from the IAPP. Her background spans marketing, project management, operations, and law. She is the co-founder and COO of hoggo, an AI-driven Digital Governance platform that allows legal and compliance teams to connect, monitor, and automate digital governance across all business workflows.