Meta failed to act to protect teens, second whistleblower testifies

Arturo Bejar, former Facebook employee and consultant for Instagram, testifies before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law during a hearing to examine social media and the teen mental health crisis, Tuesday, Nov. 7, 2023, on Capitol Hill in Washington.

Stephanie Scarbrough | AP

A second Meta whistleblower testified before a Senate subcommittee on Tuesday, this time describing his fruitless efforts to flag to the company's top leadership the extent of the harmful effects its platforms could have on teens.

Arturo Bejar, a Facebook engineering director from 2009 to 2015 who later worked as a consultant at Instagram from 2019 to 2021, testified before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that top Meta officials did not do enough to stem the harms its youngest users experienced on the platforms.

Lawmakers on both sides of the aisle blamed tech lobbying for Congress' failure to pass laws protecting kids online. Despite broad support within Senate committees for bills that aim to protect kids on the internet, those measures have ultimately sat dormant, waiting for a vote on the Senate floor or for action in the House.

Bejar’s appearance shows the frustration among lawmakers who believe large tech companies operate with largely unchecked power.

Bejar’s allegations

Bejar recently came forward with allegations against the company in a Wall Street Journal interview. He follows in the footsteps of Frances Haugen, another former Meta employee who leaked internal documents and research to news organizations and the Senate to shed light on the company’s safety issues.

Meta leadership was aware of prevalent harms to its youngest users but declined to take adequate action to address them, Bejar told lawmakers on Tuesday.

Sen. Richard Blumenthal, D-Conn., said that, prior to the hearing, Bejar had recounted to him a conversation with Chief Product Officer Chris Cox. In that meeting, Bejar said, he brought up the research into platform harms to teens, and he recalled Cox acknowledging he was already aware of the statistics.

“When I returned in 2019, I thought they didn’t know,” Bejar testified. But after that meeting with Cox, he no longer believed it.

“I found it heartbreaking because it meant that they knew and they were not acting on it,” Bejar said.

Part of the issue, according to Bejar, is that Meta directs resources toward tackling a "very narrow definition of harm." He said it is important to break down the prevalence of different harms on the platform by user demographic in order to understand the true extent of harm to certain groups.

On the day that Haugen, the first Facebook whistleblower, testified in the Senate on October 5, 2021, Bejar emailed top Meta executives including Meta CEO Mark Zuckerberg, then-COO Sheryl Sandberg and Instagram CEO Adam Mosseri.

Bejar, who shared the email as part of a trove of documents with the committee, addressed the message to Zuckerberg, saying he’d already raised the issues to Sandberg, Mosseri and Cox.

In an email to Mosseri on Oct. 14, 2021, in which Bejar provided an outline of his points for a meeting scheduled for the next day, he highlighted a survey of 13- to 15-year-olds on Instagram.

According to the survey, 13% of respondents had received unwanted sexual advances on Instagram in the previous seven days alone, 26% had seen discrimination against people on Instagram based on various identities, and 21% felt worse about themselves because of others' posts on the platform.

Bejar wrote in the email to Zuckerberg that his teenage daughter had received unsolicited pictures of genitalia from male users since she was 14. His daughter said she would block the users who sent the photos.

“I asked her why boys keep doing that?” Bejar wrote in the email. “She said if the only thing that happens is they get blocked, why wouldn’t they?”

He advocated for funding and prioritizing efforts to understand what content fuels bad experiences for users, what percentage of that content violates policy, and what product changes the company could make to improve the experience on the platform.

Bejar said he never received a response from or met with Zuckerberg or Sandberg about the email.

“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” Meta spokesperson Andy Stone said in a statement. “The issues raised here regarding user perception surveys highlight one part of this effort, and surveys like these have led us to create features like anonymous notifications of potentially hurtful content and comment warnings. Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online. All of this work continues.”

Stone pointed to a tool called "Restrict," developed based on teen feedback. If one user restricts another, the restricted user's comments on the first user's posts are visible only to the restricted user. He also pointed to Meta's 2021 content distribution guidelines, created to address what the company calls borderline content, material that toes the line of its policies.

Blaming tech money for lack of new laws

Subcommittee Chair Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., positioned their bill, the Kids Online Safety Act (KOSA), as a key solution to the harms Bejar described. KOSA aims to put more responsibility on tech companies to design their products safely for kids.

“The time has come for the Congress to provide protection tools that parents and kids can use to disconnect from those algorithms, those black boxes that drive the toxic content,” Blumenthal told reporters before the hearing began.

He addressed concerns from some progressive groups that the bill could negatively impact vulnerable children, including LGBTQ youth, saying the sponsors had made changes to address those concerns.

“This measure is not about content or censorship. It is about the product design that drives that toxic content at kids,” Blumenthal said. “We’re not trying to come between kids and what they want to see, but simply enable them to disconnect from algorithms when it drives content that they don’t want.”

While some fear that advancing narrow legislation will further delay broad privacy protections in Congress, Blumenthal said, “We’ve reached a consensus now that we need to do the possible rather than aim for the ideal. I’m all in favor of a broader privacy bill, but let’s take it one step at a time, and the more bipartisan consensus we have on protecting children, the better positioned we’ll be to do a broader privacy bill.”

“It is an indictment of this body, to be honest with you, that we have not acted,” said Subcommittee Ranking Member Josh Hawley, R-Mo. “And we all know the reason why. Big Tech is the biggest, most powerful lobby in the United States Congress … They successfully shut down every meaningful piece of legislation.”

Judiciary Committee Chair Dick Durbin, D-Ill., slammed the chamber's failure to take up bills seeking to protect child safety online after they passed out of committee with overwhelming support.

Sen. Lindsey Graham, R-S.C., blamed Section 230, tech’s legal liability shield, for enabling tech’s lobbying practices. “The other bills are going nowhere until they believe they can be sued in court,” he said.

WATCH: Attorneys general around the country file lawsuit against Meta alleging addictive features