
Why Implementing Education Is a Logical Starting Point for Children’s Safety Online


Bolstering digital services’ mechanisms to promote children’s safety online

Protecting children online is rightfully a growing topic of discussion in state capitals and Congress, in the UK and the European Union, and around the globe. Safety and privacy are critical for all users of digital services, which is why technology firms are investing in industry-leading tools, settings, and programs built with these principles top of mind.

Despite this, the best tools for promoting minors’ safe use of digital services remain a subject of debate. Any approach to this objective should attempt to minimize unintended consequences, such as creating barriers to innovation or degrading the user experience. This complex subject is further compounded by First Amendment concerns particular to the U.S. legal system. As such, many leading digital service providers are working jointly and independently to advance online safety. While providers develop their own products and tools to advance child safety online, lawmakers are also pursuing a multidimensional approach, introducing proposals that would require digital citizenship instruction in classroom curricula.

Recent initiatives in the UK and EU do not reflect U.S. constitutional considerations and cultural contexts and, as a result, do not take into account fundamental rights in the U.S., including freedom of speech and access to information. Much as the EU’s General Data Protection Regulation (GDPR) influenced U.S. privacy proposals, some U.S. lawmakers are now introducing bills modeled on the UK’s “Age Appropriate Design Code.” To date, California is the only state to enact such legislation, shopped by the 5Rights Foundation, with last year’s passage of AB 2273. While the law seeks to require businesses to design their products for the children likely to access their online services, it does not provide a sufficient means of protecting children’s and teens’ privacy, safety, and right to access information. As written, the law may ultimately work against its own goals, as detailed below.

Part of the value of modern technology is that it allows people to connect across state and national borders. But an increasingly diverse patchwork of laws could present unnecessary compliance challenges that hamper the creation of innovative and diverse products for consumers. It is therefore imperative that measures targeting children’s and teens’ online safety not only be effective, but also provide clear roadmaps for compliance and avoid unintended consequences.

Does the AADC ensure the effective achievement of its goals while minimizing adverse unintended consequences? 

Any policy decision has the potential to cause broad ripple effects, and children’s online safety presents its own particular nuances. While the bill was moving through the legislature, the August 2022 bill analysis outlined several goals. However, given the law’s vague language and lack of definitions, significant questions remain as to how businesses can address those goals. Rather than establishing clear mechanisms and standards that would support increased children’s online safety, the law creates a moving compliance target with increased liability risk and the threat of heavy fines for violations.

For example, consider the goal to “further protect minors online.” The law provides no clear mechanisms or standards for implementing such protections and creates confusion with several poorly defined terms and concepts. First, it defines a child overly broadly as anyone under 18. AB 2273 explicitly acknowledges that different age groups warrant different treatment, yet it contains no additional language detailing what those different approaches should be or specifying the groups to which they apply, making the requirement difficult to operationalize. Second, the law does not specify how an online service may estimate a user’s age or how penalties will be enforced against those who fail to comply. Given the potential for hefty fines for misjudging a user’s age, businesses would be forced to increase age verification measures on all websites and collect additional sensitive information, which flies in the face of other data privacy considerations, namely data minimization principles.

The law also requires a business that provides an online service, product, or feature “likely to be accessed by children” to present any privacy information, terms of service, policies, and community standards concisely, prominently, and using “clear language suited to the age of children likely to access that online service, product, or feature.” Online services have no clear metric for reliably determining what is “likely to be accessed by children”; it is conceivable that children could access just about any product, whether intentionally or not. Likewise, “clear language suited to the age of children likely to access online services” is left undefined and invites significant subjective interpretation. If a child is anyone under 18, one should expect wide variation in reading comprehension across that age range: a 17-year-old presumably reads far better than a 5-year-old. Without definitions for such terms, the law is difficult to comply with.

AB 2273’s bill analysis also highlighted two other areas where lawmakers saw existing online protections falling short: 1) laws already on the books focused only on regulating the collection and sale of children’s personal information, rather than protecting children from manipulative design (dark patterns), adult content, or other potentially harmful design features; and 2) protections for minors were triggered only when an online platform had actual knowledge that children were accessing its website. However, AB 2273 neither provides examples of, nor requires, explicit mechanisms digital service providers could use to protect children from manipulative design, adult content, or other potentially harmful design features. Complying with such vague requirements is quite difficult, especially with hefty consequences looming for failure to meet an ambiguous standard. If the goal is to empower users without negatively impacting innovation, more guidance on best practices must be provided, along with specific and narrow standards.

To achieve meaningful children’s safety protections, businesses need a roadmap for how to properly comply and avoid unintentional violations. This law paints in broad strokes what is expected of businesses but does not explain how businesses may achieve those objectives. Instead, businesses are expected to verify ages to a “reasonable level of certainty” without that phrase ever being defined. Further, to ensure compliance and minimize legal risk, businesses may be forced to collect additional information about their consumers to verify who, in fact, is accessing their platforms.

No policy decision occurs in a vacuum. As such, effective solutions are often best tackled and driven from multiple fronts, including from industry itself.

During the drafting process, AB 2273 sought to “elevate child-centered design in online products and services that are likely to be accessed by children.” This model, which would oblige society and all of its structures to be designed as child-proof, should be subject to cost-benefit analysis, particularly when it comes to expressive products. Rather than reshaping the world wide web into a child-proof space, policymakers should be willing to implement risk-informed design instead.

Many online service providers already have real-time mechanisms in place that are specifically tailored to children’s needs while prioritizing safety and privacy. Online businesses are taking steps to ensure a safer and more trustworthy internet: in 2021, leading online businesses announced their participation in the Digital Trust & Safety Partnership (DTSP) to develop and implement best practices, and they recently reported on their efforts to implement these commitments. In addition, a number of participating digital services that minors may use have developed mechanisms to further promote children’s protection and privacy.

Lawmakers across the country are seizing upon this multidimensional approach by introducing proposals to require digital citizenship in classroom curricula. 

Structures that enhance children’s safety online should include industry’s existing efforts to support child safety and privacy across all platforms, and they could be bolstered by public school curricula focused on responsible online citizenship. Such curricula would give children the tools they need to operate safely and responsibly online and to better protect themselves and others. Legislation along these lines has already been passed in New Jersey and introduced in states including Florida, Texas, Missouri, Indiana, Minnesota, and New York, among many others, that would require public school curricula to address appropriate, responsible, and healthy online behavior, including cyberbullying and identity fraud prevention. While younger generations are increasingly savvy about operating in a digital world, it is unreasonable to assume that every child (or parent) has an inherent and robust knowledge of proper and responsible use. Leaders like North Carolina Attorney General Josh Stein have acknowledged this by providing helpful internet principles for parents to spark conversations with their kids about online safety and good decision-making.

Measures to improve children’s safety online should acknowledge the nuances involved. They should reflect a comprehensive approach that draws on industry’s knowledge and experience, meets children and parents where they are, and provides workable protections while enabling the development of products and services that benefit children, teens, and families. Any solution to this important issue should be narrowly tailored to the intended goal and avoid adverse consequences such as barriers to innovation and market entry, as well as new problems like violations of data minimization and privacy principles or a degraded user experience.
