Parents in the United Kingdom will now have the ability to more directly moderate their teenagers’ activity on Instagram.
Meta, the social media platform’s parent company, unveiled a collection of new controls for parents and guardians of users under the age of 18.
The new feature, called the Family Center, will allow parents to see who their teens follow, who follows their teens, and what accounts they have reported. Parents can also set a daily time limit from 15 minutes to two hours, after which the app’s screen will turn black, and schedule breaks in usage to prevent their teen from extended continuous use.
“The features will also include ‘nudges’ to encourage teenagers to switch to a different topic after repeatedly searching for the same thing and are aimed at encouraging them to discover something new,” reports Sky News.
Instagram also restricts the direct message feature for its underage users so they cannot be contacted by adults they do not follow.
The newest version of the feature, which went into effect in the U.K. on June 14, allows parents to send their children an invitation to initiate the tools. Previously, the supervisory programs relied on an invitation from the teenage users themselves.
“Supporting teens and their parents is one of our most important priorities,” the platform says on its website. “We’ve taken steps over the past several years to increase safety and protection for young people on Instagram.”
To create its enhanced safety features for children, Instagram worked with a number of organizations, including Internet Matters, the Child Mind Institute, the Cyber Peace Foundation, SaferNet, Young Leaders for Active Citizenship, ParentKind, the Suicide Prevention India Foundation, and Stop Hate Speech.
In September of 2021, The Wall Street Journal published a lengthy report on Meta executives’ awareness of the negative impact Instagram has on teenagers’ body image.
The impact of Instagram and other forms of social media on teenagers’ mental health has become an increasing concern for health experts and parents alike.
The technology company’s researchers found that “thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.”
The internal research reported by the Journal found that Instagram’s algorithm would curate content that glorified or promoted dieting, thinness, and disordered eating to teen girls with anorexia and other eating disorders.
“We know there are effects on young people’s feelings about their appearance, their body satisfaction, and social media platforms can increase risk for eating disorders and other mental health concerns such as depression and low self-esteem,” said Rachel Rodgers, an associate professor of applied psychology at Northeastern University, in an interview with the university’s news service. “There’s no doubt that there’s substantial research showing that these platforms can have a negative effect on young people.”
Meta is currently being sued by Kathleen and Jeff Spence, the parents of a 19-year-old girl from New York who allegedly became addicted to Instagram and developed an eating disorder, anxiety, and depression as a result.
The lawsuit was filed in U.S. District Court for the Northern District of California by the Social Media Victims Law Center. The Spence family says they were “emotionally and financially harmed by Meta’s addictive design and continued and harmful distribution and/or provision of multiple Instagram accounts to their minor child,” per 6ABC.
The new Instagram controls were made available to American parents in March. Meta plans to roll out the supervisory tools in Germany, France, Canada, Ireland, Australia, and Japan this month and globally by the end of 2022.