Bipartisan Bill Proposes Tech Companies Implement New Protections For Underage Users

Social media platforms used by those under the age of 16 would be required to make strict safeguards their default setting


A new bill introduced in the U.S. Senate would expand the steps tech companies and social media platforms would be required to take to mitigate harm to minors.

Republican Senator Marsha Blackburn of Tennessee and Democratic Senator Richard Blumenthal of Connecticut introduced the Kids Online Safety Act on Feb. 16. Both senators serve on the Senate Commerce Committee's consumer protection subcommittee, as ranking member and chair, respectively.

“This issue of what is happening online to children is something that comes up repeatedly, with them saying, there has to be something done about this,” Blackburn told reporters.

If passed, the bill would require online platforms that are “reasonably likely to be used by” kids under the age of 16 to adopt safeguards that would allow underage users or their parents to “control their experience and personal data.”

The goal of the policy is to reduce harms linked to minors' online activity and content exposure, including suicide, eating disorders and substance abuse.

Companies expected to be impacted by the legislation include Snap, Google and TikTok, as well as Facebook and its parent company, Meta.

Safeguards proposed by the senators include limiting the ability of other users to find minors online and reducing the amount of data companies collect from minors. Tech companies would also be required to let underage users opt out of algorithmic recommendation systems and to limit the amount of time they spend on a website or app.

Social media platforms and tech companies would be required by the law to make the most stringent version of the proposed protections their services' default settings. The bill also prohibits platforms from encouraging minors to deactivate the safeguards.

Additionally, the policy calls for the formation of a council composed of “parents, experts, tech representatives, enforcers and youth voices,” which would be “convened by the Commerce secretary to give advice on how to implement the law,” per CNBC.

The Federal Trade Commission and state attorneys general would also be responsible for creating guidelines regulating how platforms can conduct “market- and product-focused research on minors.”

The policy reflects widespread concern about the impact of internet exposure on the nation’s children. In 2021, Facebook paused development of Instagram for Kids after the project was denounced by lawmakers and parents over the possible harm it could bring to minors.

At a press conference, Blumenthal spoke optimistically about the policy’s potential impact.

“I think we are on the cusp of a new era for Big Tech imposing a sense of responsibility that has been completely lacking so far,” Blumenthal said. “And we know that it is not only feasible and possible, but that it works.”

“If the tech companies want to come to the table, we’re always ready to hear them,” the senator added. “But all too often in the past, they have unleashed their armies of lawyers and lobbyists to oppose legislation.”
