
OPINION: Designing a safer online world for children is possible

Tuesday, 22 February 2022 14:30 GMT

Sandy Saputra, 19, one of TikTok's biggest Indonesian stars, uses his smartphone to record a video using the app in Jakarta, Indonesia, July 24, 2020. REUTERS/Willy Kurniawan


* Any views expressed in this opinion piece are those of the author and not of Thomson Reuters Foundation.

What can the US learn from the UK in terms of children’s online safety?

Assemblymember Buffy Wicks represents the 15th district in the California State Assembly. Elected in 2018 and re-elected in 2020, her district spans the communities of Oakland to Richmond, and includes the City of Berkeley. She is a Democrat.

Baroness Beeban Kidron OBE is the Founder and Chair of 5Rights Foundation. She is a Crossbench Peer in the UK’s House of Lords, where she introduced the Age Appropriate Design Code, which became UK law on 2nd September 2021.

Children, like all of us, now fully rely on technology. To get an education, to access health care, to navigate social relationships and understand the broader world is to be online. But for children, this dependence on technology is happening while their minds are still in formation. And while technology is essential, there are no guardrails. Our kids are riding in high-speed cars without speed limits, seatbelts, or car seats.

Consider friend suggestions recommending children’s profiles to strangers, targeted advertising that promotes age-restricted products to children, misinformation and racist content prioritized in teens’ feeds, in-game features incentivizing children to spend huge sums of money, and recommended posts featuring certain body types that contribute to negative body image among teens. To get the most out of their online experiences, kids are forced to endure inappropriate and sometimes harmful content, by design.

But another world is possible: children can navigate technology that is designed for their developmental abilities and that is safe by design and default, and tech companies can build those very experiences. How do we know? Because that world is already taking shape thanks to a law in the United Kingdom, the Age Appropriate Design Code.

The law, in effect since September and authored by Baroness Beeban Kidron, establishes rules of the road when it comes to the design of products and the collection of minors’ data. It has already triggered tangible reform within companies and the way they cater for the children who rely on their products and services daily.

Indeed, the UK Code’s introduction has led to the biggest raft of child safety innovations the sector has seen. Instagram has announced it will not allow unknown adults to direct message under 18-year-olds. Teens under 16 on TikTok will have their accounts set to private by default. The Google Play Store now prevents those under 18 from viewing and downloading apps rated as adult-only. The list goes on. Silicon Valley is innovating in the name of product safety, and our children will benefit.

But there’s a catch: currently, that law is only enforceable in the UK. Why shouldn’t children in the US have such rights to hold tech accountable?

Today that starts to change. Alongside my Republican colleague Assemblymember Jordan Cunningham, we are introducing (in the same rotunda that launched the nation’s first consumer privacy bill four years ago) a ground-breaking safety by design framework: the California Age Appropriate Design Code.

Introducing this bill in California is the natural evolution of the state’s priorities to protect data; 8 in 10 state voters wanted better protections for children ahead of the voter initiative on data privacy passed in 2020. And while California’s consumer privacy protections may still be nation-leading, they only prevent companies from selling data; they don’t incentivize new design or eliminate the incentive for those companies to collect data in the first place. This bill will ensure that digital products and services are safe for young people by design and by default.

There is already bipartisan and nearly universal support for protections for children online. And now we have a model that allows tech companies in the heart of Silicon Valley to do what they do best: innovate, and do so with children’s best interests in mind. No digital product or service should be designed with commercial interests as a priority, and children’s best interests as an afterthought. We can do better, and for our children we must.

Our Standards: The Thomson Reuters Trust Principles.
