The year 2023 promises to be another big year for privacy. In 2022, regulators focused on AI, dark patterns and aggressive remedies for allegedly deceptive and unfair data practices, such as disgorgement of algorithms developed through ill-gotten data, and these trends are likely to continue. Privacy professionals continue to focus on the privacy laws in five states coming into effect this year (California, Virginia, Colorado, Utah, Connecticut), while federal regulators continue to flex their muscles on privacy. Here are our top five predictions for privacy regulation in 2023:
1. Employee privacy will take center stage. The California Privacy Rights Act (CPRA) comes into effect in January, along with four other state privacy laws. Notably, it is the first state privacy law that will apply to employee data. In general, under CPRA, employees may request access, correction, or deletion of their personal information, and employers must provide notice to the employees of these and other rights. Employers have a limited time to respond to employee requests. Join representatives from Wilson Sonsini’s privacy and cybersecurity and employment practices for a webinar on February 2, 2023, to discuss these issues.
2. The Federal Trade Commission (FTC) will issue a Notice of Proposed Rulemaking on privacy. In 2022, the draft American Data Privacy and Protection Act (ADPPA) came close to passage. It would have been the first comprehensive privacy law in the United States. In August 2022, with prospects for ADPPA passage waning, the FTC launched its own privacy rulemaking, issuing an Advance Notice of Proposed Rulemaking that requested public comments on potential rules in the areas of personalized advertising, data security, and algorithmic discrimination. In 2023, the FTC will likely take the next step in that rulemaking process, and the public will get a glimpse of what proposed federal privacy rules might look like. We predict that the FTC will borrow heavily from sources like ADPPA, which focused on issues like data minimization, security, purpose limitation, and other “substantive” rights, as opposed to consent.
3. A broader range of companies will have to consider new children and teen privacy regulations. Thus far, when regulating children’s privacy, the U.S. approach, embodied in the Children’s Online Privacy Protection Act (COPPA), has been to focus notice, parental consent, deletion, and other requirements on services directed to children under 13 or where the operator has actual knowledge that a user is under 13. California has passed legislation adopting the U.K. approach of requiring age-appropriate design codes, under which companies whose services are likely to be accessed by children and teens under 18 must conduct data protection assessments and put in place appropriate protections. Services attractive to people under 18 will include a range of entities in the gaming, sports and entertainment, and e-commerce spaces, which will now likely have to comply with the new California law. And although a legal challenge has been filed, there is no indication that the law, which goes into effect on July 1, 2024, will be stayed. In addition, the law has served as inspiration for bills introduced in New York and New Jersey. The FTC is also set to revise its COPPA Rule; given the criticisms of the COPPA approach, we can expect to see some major revisions, perhaps more along the lines of age-appropriate design codes.
4. Regulators will seek to hold more top officials liable for privacy and security violations. The FTC named the CEO of Drizly, an alcohol delivery company acquired by Uber, in a complaint alleging that the company failed to implement reasonable security measures. The only fact the complaint points to in support of naming the CEO individually is that he “hired senior executives dedicated to finance, legal, marketing, retail, human resources, product, and analytics, but failed to hire a senior executive responsible for the security of consumers’ personal information…” With this case, the FTC is sending a clear signal: Those at the top need to make sure that the resources they spend on privacy and security are adequate to protect against threats to personal data.
5. Fintech companies will be subject to greater scrutiny on privacy and cybersecurity issues. Certain provisions of the updated Gramm-Leach-Bliley (GLB) Safeguards Rule, which contains security requirements for most fintech companies that maintain consumer data, will come into effect on June 9, 2023. The CFPB has begun a rulemaking to implement Section 1033 of the Dodd-Frank Act, outlining proposals that would require banks and other financial institutions to allow consumers to easily port their personal information, either to themselves or to a third party at the consumer’s direction. The proposal includes options to preserve privacy, such as a requirement that any third party be limited in how it can use consumer-permissioned data. The CFPB also issued a report on “Buy Now, Pay Later” companies, which allow consumers to split a retail transaction into smaller installments and repay over time. The report flags “data harvesting” as an important concern. Given all of this movement, we can expect to see the CFPB playing a more active role in the privacy landscape.
Wilson Sonsini Goodrich & Rosati routinely advises clients on privacy and cybersecurity issues. See here for our companion post on cybersecurity issues. For more information about the developments mentioned in these posts, or for any other information or advice concerning U.S. privacy and cybersecurity regulation, please contact Maneesha Mithal or another member of the firm’s privacy and cybersecurity practice.
Rebecca Weitzel contributed to the preparation of this Wilson Sonsini Alert.