Washington: Facing outrage over its handling of internal research on Instagram's harm to teens, a Facebook executive is telling Congress that the company is working to protect young people on its platforms. And she disputes the way a recent newspaper story describes what the research shows.
“We have put in place multiple protections to create safe and age-appropriate experiences for people between the ages of 13 and 17,” Antigone Davis, Facebook’s head of global safety, said in written testimony Thursday for a Senate Commerce subcommittee.
From June to August this year, Facebook removed more than 600,000 Instagram accounts that didn't meet the minimum age requirement of 13, Davis said.
Davis was summoned by the panel amid mounting scrutiny of how Facebook has handled information indicating potential harm to some of its users, especially girls, while publicly downplaying the negative impacts.
The revelations in a report by The Wall Street Journal, based on internal research leaked by a whistleblower at Facebook, have set off a wave of anger from lawmakers, critics of Big Tech, child-development experts and parents. The outcry prompted Facebook to put on hold its work on a kids’ version of Instagram, which the company says is meant mainly for tweens aged 10 to 12. But it’s just a pause.
For some Instagram-devoted teens, the peer pressure generated by the visually focused app led to mental-health and body-image problems, and in some cases, eating disorders and suicidal thoughts. It was Facebook's own researchers who alerted the social network giant's executives to Instagram's destructive potential.
Davis says in her testimony that Facebook has a history of using its internal research as well as outside experts and groups to inform changes to its apps, with the goal of keeping young people safe on the platforms and ensuring that those who aren’t old enough to use them do not.
“This hearing will examine the toxic effects of Facebook and Instagram on young people and others, and is one of several that will ask tough questions about whether Big Tech companies are knowingly harming people and concealing that knowledge,” Sen. Richard Blumenthal, D-Conn., chairman of the consumer protection subcommittee, said in a statement. “Revelations about Facebook and others have raised profound questions about what can and should be done to protect people.”