U.S. Privacy Landscape: 2023 and Beyond

2023 marked another important year for the development of U.S. privacy policy. Transatlantic data flows received much-needed clarity after the European Commission announced its adequacy decision for the EU-U.S. Data Privacy Framework, and policymakers, including some agencies, finally began examining the role of data brokers in the digital economy. Users also received a major privacy boost after Google announced several changes to how it collects and stores location data, including shifting data storage from the cloud to users’ devices, setting auto-delete to three months by default, and “automatically encrypting your backed-up data so no one can read it, including Google.”

However, the continued absence of a comprehensive federal privacy law saw state lawmakers resume their active role in shaping U.S. privacy law, with seven states enacting comprehensive data privacy laws in 2023 – and even more sectoral privacy bills expected. There was some legislative overlap, though, as state and federal lawmakers both explored proposals aimed at protecting young people online. Looking ahead, 2024 represents another important period for U.S. privacy, especially given pending Supreme Court cases such as NetChoice & CCIA v. Paxton.

This post highlights a few important privacy developments to follow for 2024. 

Privacy-Enhancing Technologies

The rapid growth of generative Artificial Intelligence (AI) models and large language models, seen with the release of ChatGPT and similar products, helped demonstrate the positive impact such technologies can have on users. However, the widespread development and use of such technologies and related systems also raised concerns regarding user privacy and security, especially relating to training data. Addressing these concerns will be difficult, but recent initiatives and advancements in privacy-enhancing technologies (PETs) may offer some help. PETs are tools that attempt to address the privacy-utility trade-off – ensuring that data retains its utility while offering sufficient privacy protections. Some emerging PETs include federated learning, secure multiparty computation, homomorphic encryption, and zero-knowledge proofs. (For further information on the state of PETs, the Centre for Information Policy Leadership recently published a whitepaper on these powerful tools.)
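
To make the federated learning idea concrete, here is a minimal, hypothetical Python sketch of federated averaging: each simulated client trains on its own data and shares only model updates with a central aggregator, so raw data never leaves the client. The function names and toy data are illustrative assumptions, not drawn from any of the initiatives discussed above.

```python
import numpy as np

def local_update(weights, features, labels, lr=0.1, epochs=5):
    """One client's gradient steps on its own data; raw data stays on the device."""
    w = weights.copy()
    for _ in range(epochs):
        preds = features @ w
        grad = features.T @ (preds - labels) / len(labels)
        w -= lr * grad
    return w

def federated_average(global_weights, client_datasets):
    """The server aggregates only model updates, weighted by each client's dataset size."""
    updates, sizes = [], []
    for features, labels in client_datasets:
        updates.append(local_update(global_weights, features, labels))
        sizes.append(len(labels))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Toy example: three clients jointly fit a linear model without pooling their data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]
weights = np.zeros(3)
for _ in range(10):
    weights = federated_average(weights, clients)
```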

Further, President Biden’s executive order on artificial intelligence directed the federal government, among other things, to prioritize the research and development of PETs so that users’ privacy and security remain protected as AI continues to advance. The EO calls upon the National Institute of Standards and Technology (NIST) to research and provide guidance on PETs, including differential privacy – a privacy-preserving method that adds random “noise” to a dataset to protect the privacy of individual records. Recently, NIST released draft guidance on differential privacy that would allow “data to be publicly released without revealing the individuals within the dataset” – a promising approach to giving researchers access to useful data. Apple has also been using differential privacy (“secure aggregation”) to improve the Memories feature by learning about the “kinds of photos people take at frequently visited locations (iconic scenes) without personally identifiable data leaving their device.”
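
As a rough illustration of the “noise” idea, the sketch below implements the classic Laplace mechanism for a simple counting query in Python. The dataset, threshold, and epsilon value are hypothetical assumptions chosen for illustration; real deployments, such as those contemplated in NIST’s draft guidance, involve far more careful sensitivity analysis and privacy-budget accounting.

```python
import numpy as np

def laplace_count(values, threshold, epsilon):
    """Return a differentially private count of values above `threshold`.

    The true count is perturbed with Laplace noise scaled to the query's
    sensitivity (1 for a counting query), so any single individual's presence
    or absence changes the released answer only slightly.
    """
    true_count = sum(1 for v in values if v > threshold)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: release how many records exceed a value without exposing any one record.
ages = [23, 35, 41, 29, 62, 55, 38]
print(laplace_count(ages, threshold=40, epsilon=0.5))
```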

Privacy and AI Overlap 

The end of the 2023 legislative cycle was marked by growing interest among policymakers in engaging with the recent progress made in AI systems and related technologies.

While it is not clear whether lawmakers will pursue heavy-handed regulation or promote a risk-based approach, some policymakers have begun exploring ways to regulate AI technologies and systems through enforcement and privacy legislation. For instance, the California Privacy Protection Agency (CPPA) proposed draft regulations regarding a business’s use of automated decisionmaking technology (ADT) and individuals’ access and opt-out rights. However, the proposed draft rules would go beyond the statutory text and would grant individuals the right to opt out of various low-risk systems, including those that detect spam. Notably, the CPPA has still yet to adopt the complete set of final regulations implementing the California Privacy Rights Act, and the California Chamber of Commerce filed a lawsuit seeking to compel the agency to adopt the final rules promptly.

Last year, the Federal Trade Commission cautioned about the privacy and data security risks arising from a business’s use of consumers’ biometric information, including for training AI systems. The Commission reiterated the importance of establishing proper guardrails and security measures, and provided several factors for determining whether a business’s use of biometric information or technology could be unfair and in violation of the Federal Trade Commission (FTC) Act. In December, the FTC built upon this warning in a proposed settlement with Rite Aid that would ban the company from using facial recognition technology for surveillance purposes for five years. The FTC alleged that the company failed to take reasonable measures to prevent harm to consumers, including “generating thousands of false-positives.” The FTC will likely continue to pursue enforcement actions but may also, unfortunately, explore overly broad rulemaking.

Looking Ahead 

Congress will have a busy first few months shaping the U.S. privacy landscape. First, legislative efforts to protect young online users will resume this month, as the Senate Judiciary Committee will hold a hearing at the end of the month on child exploitation where various technology company executives are expected to testify. At the same time, recent measures including the Kids Online Safety Act and Children’s Online Privacy Protection Act (COPPA) 2.0 are also likely to resurface despite widespread opposition. States are also likely to continue their efforts, but the suits challenging the constitutionality of such laws, most recently in Ohio, will hopefully reaffirm that the protections of the First Amendment still apply online.

Second, Congress will have until April 19 to reauthorize Section 702 of the Foreign Intelligence Surveillance Act after a four-month extension was added to the National Defense Authorization Act. At the end of 2023, there were two competing reauthorization bills – one reported by the House Intelligence Committee and the other by the House Judiciary Committee. The two bills would reauthorize this authority in very different ways: the Intelligence Committee bill would broadly expand the scope of covered entities to nearly all businesses, while the Judiciary Committee bill would require agencies to obtain a court order before conducting U.S. person queries of Section 702 information.
