Irish Data Protection Commission Fines Meta $400 Million For Violating Children's Privacy


The Irish Data Protection Commission has fined Facebook parent company Meta $400 million for violating the privacy of children.

Instagram, according to the regulator, violated Europe’s General Data Protection Regulation (G.D.P.R.) by having public-by-default accounts for minors.

The G.D.P.R., which took effect in 2018, governs how companies handle the personal data of users in the European Union.

Two years ago, the commission launched an investigation into how Instagram handles the data of users between the ages of 13 and 17. It found that teenagers' accounts were set to public by default and that minors who used business accounts had their email addresses and phone numbers displayed publicly.

Meta says it has since changed the settings so that accounts belonging to minors are private by default. Under the updated settings, adults cannot message minors who do not follow them on the platform.

“This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private,” a Meta spokesperson told Politico. “Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them. We engaged fully with the DPC throughout their inquiry, and we’re carefully reviewing their final decision.”

In a statement to the New York Times, Meta confirmed that it plans to appeal the decision.

Because Meta's European headquarters is located in Ireland, the Irish regulator has led a number of inquiries into the company's compliance with the regulation.

In 2021, Meta was fined €225 million for violations involving WhatsApp. Earlier this year, the company was fined €17 million over a data breach.

According to the Politico report, the Irish DPC has at least six additional ongoing investigations into Meta-owned companies.

In September 2021, the Wall Street Journal published internal documents showing that Meta knew Instagram's algorithm and design were harming teenagers.

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” the researchers said in a March 2020 slide presentation posted to Facebook’s internal message board. “Comparisons on Instagram can change how young women view and describe themselves.”

“We make body image issues worse for one in three teen girls,” said a 2019 slide.

Facebook’s own research found that “among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram,” according to the report.
