A former Meta employee testified before a Senate subcommittee on Tuesday, alleging that the parent company of Facebook and Instagram was aware of the harassment and other harms teens face on its platforms, but failed to address them.
The employee, Arturo Béjar, worked on well-being for Instagram from 2019 to 2021 and was previously an engineering director for Facebook’s Protect and Care team from 2009 to 2015, he said.
Béjar testified before the Senate Judiciary Subcommittee on Privacy, Technology and the Law in a hearing on social media and its impact on adolescent mental health.
“It is time for the public and parents to understand the true level of harm these ‘products’ represent and it is time for young users to have the tools to report and suppress online abuse,” he said in written remarks released ahead of the hearing.
Béjar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.
The goal of his work at Meta was to influence the design of Facebook and Instagram in ways that nudge users toward more positive behaviors and provide tools for young people to manage unpleasant experiences, Béjar said at the hearing.
Meta said in a statement that it is committed to protecting young people online, noting its support of the same user surveys that Béjar cited in his testimony and its creation of tools such as anonymous notifications of potentially harmful content.
“Every day, countless people inside and outside Meta work on how to help keep young people safe online,” Meta’s statement said. “All this work continues.”
Béjar told senators that he met regularly with senior executives at the company, including CEO Mark Zuckerberg, and considered them to be supportive of the work at the time. However, he later concluded that management had decided “over and over again not to address this issue,” he testified.
In a 2021 email, Béjar pointed Zuckerberg and other senior executives to internal data revealing that 51% of Instagram users had reported having a bad or harmful experience on the platform in the past seven days and that 24.4% of children ages 13 to 15 had reported receiving unwanted sexual advances.
He also told them that his own 16-year-old daughter had been sent misogynistic comments and obscene photographs, without adequate tools to report those experiences to the company. The existence of the email was first reported by the Wall Street Journal.
In his testimony, Béjar recounted a meeting in which Meta’s chief product officer, Chris Cox, was able to cite precise statistics about teen harms off the top of his head.
“I found it heartbreaking because it meant they knew and they weren’t acting on it,” Béjar said.