Utah’s governor has signed into law two pieces of legislation that require social media companies to obtain parental consent for minors to use their apps.
HB 311 and SB 152 were signed by Gov. Spencer Cox on March 23, making Utah the first state in the U.S. to codify such protections for children.
Under the new laws, social media companies are also prohibited from showing the account of anyone under age 18 in search results, targeting minors with ads, and collecting, sharing, or using personal information from a minor's account, with some exceptions.
Additionally, the legislation bars companies from using designs or features that could cause a minor to become addicted to the app.
Social media companies must also prohibit direct messaging with certain accounts, limit the hours of access to the platform, and give a parent or guardian access to their child's account and direct messages.
Social media companies that fail to adhere to these laws are subject to fines of up to $250,000 per incident, depending on the violation.
Tech industry and advocacy organizations have pushed back against the legislation, arguing that “the majority of young Utahns will find themselves effectively locked out of much of the web,” according to the New York Post.
The bills come as concerns mount over the potential harm that social media may be having on adolescents.
On the day after the bills were signed, Cox shared a Twitter thread from social psychologist Jonathan Haidt, who said, “In the debate over whether social media caused the teen mental illness epidemic, the loudest voice is the complete absence of Gen Zers saying ‘no.’ I have spoken at many high schools. Not once did a student say that social media was on the whole good for them.”
Haidt added that though some teens point to some benefit of social media, “all see the massive waste of time and the devastation it causes to many of their friends and to their generation.”
The signing came as social media platform TikTok’s CEO gave testimony to address concerns over how the company handles user data, security, and other issues.
TikTok has recently come under fire for recommending suicide content to users as young as 13, with such content appearing on minors' feeds within three minutes of sign-up. Recommendations for self-harm, suicide, or eating disorder content are displayed to teen accounts every 66 seconds, according to a study by the Center for Countering Digital Hate.
“The results are every parent’s nightmare,” Imran Ahmed, CCDH’s chief executive, said upon publication of the study’s findings. “Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health.”