(WASHINGTON) — A Senate panel on Tuesday grilled executives from YouTube, TikTok and Snapchat on what the social media companies are doing to ensure young users’ safety in the wake of revelations about Facebook’s practices and allegations the platforms need to do more to prevent potentially harmful effects on kids.
“They have deepened America’s concern and outrage and have led to increasing calls for accountability, and there will be accountability,” Senate Commerce subcommittee Chair Richard Blumenthal, D-Conn., said in his opening remarks regarding the newly exposed details on the inner workings of social media giants.
“We’re hearing the same stories of harm” caused by YouTube, TikTok and Snapchat, Blumenthal said, calling this, “for Big Tech, a Big Tobacco moment.”
“This time is different,” he said.
The subcommittee is seeking information from executives at TikTok, Snap Inc. and YouTube on how critics say algorithms can magnify harm to children, with the goal of passing legislation aimed at protecting kids.
“You’re parents,” said Ranking Member Sen. Marsha Blackburn, R-Tenn., to the witnesses in her opening statement. “What would you do to protect your child?”
Tuesday’s hearing comes as the subcommittee expands its scope after hearing from Facebook whistleblower Frances Haugen earlier this month. She alleged that executives blatantly disregarded concerns when they learned their platforms could have harmful effects on foreign democracies and the mental health of children.
The hearing also marked the first time TikTok and Snapchat have testified before lawmakers, while Facebook has been called to more than 30 congressional hearings through the years and YouTube executives have already appeared before this Congress earlier in the year.
The social media executives on Tuesday vigorously defended how their platforms protect children from inappropriate content.
Here are some key takeaways:
Tech companies blasted for alleged lack of transparency
In a departure from a normally polarized Washington, senators on both sides of the aisle came together to grill the social media executives on transparency, focusing on whether they’d allow independent researchers access to study their algorithms, which some allege have exposed kids to harmful behavior and fueled eating disorders in young girls.
All three platforms said they have studied the potential negative impacts on children’s mental health.
Blumenthal asked, “If an academic researcher comes to you on child psychology and wants to determine whether one of your products causes teen mental health issues or addiction, do they get access to raw data from you without interference?”
Jennifer Stout, vice president for global public policy of Snapchat parent Snap Inc., said her company’s algorithms “operate very differently” from those of the other platforms under scrutiny, but ultimately signaled a willingness to support outside researchers, as did TikTok’s executive.
“Yes, senator, we believe transparency for the algorithms is incredibly important. We’re one of the first companies to publish publicly a deep dive into how our algorithm works,” said Michael Beckerman, a TikTok vice president and head of public policy for the Americas.
Leslie Miller, vice president for government affairs and public policy of YouTube’s owner Google, skirted the question and said that outside research “would depend on the details” — an answer that frustrated Blumenthal.
“I’m going to cite the difference between your response and Mr. Beckerman’s and Ms. Stout’s, which indicates certainly a strong hesitancy if not resistance,” Blumenthal said to Miller.
Overall, the executives pushed back against what senators characterized as a lack of transparency.
Stout said in her closing statement that the protection of children is the “highest priority,” and Miller said that at YouTube there is “no more important thing than the safety of kids online.”
But TikTok appeared the most open to congressional oversight, with Beckerman saying squarely in his closing statement, “We support stronger privacy rules to be put in place.”
Push for privacy legislation
While millions of young users log into the platforms every day, the bipartisan panel of senators appeared to agree that not enough is being done to protect them from harmful content.
Sen. Ed Markey, D-Mass., used the moment to push the companies to say whether they support his proposed privacy laws banning targeted ads aimed at kids and other potentially harmful features.
One piece of legislation he’s introduced would update the Children’s Online Privacy Protection Act, or COPPA, which prohibits internet companies from collecting personal information from anyone under the age of 13 without parental consent.
“Do you support it or not?” he asked the Snap executive.
“I think, senator, we’d love to talk to you a bit more about this,” Stout said.
“This is just what drives us crazy,” a heated Markey responded. “We want to talk, we want to talk, we want to talk. This bill’s been out there for years, and you still don’t have a view on it?”
“We like your approach,” Beckerman, from TikTok, said. “However, I think a piece that should be included is a better way to verify age across the internet and across apps, rather than the system that is in place now. And I think with that improvement, it would be something that we’d be happy to support.”
Miller wouldn’t commit on the record but said executives at YouTube have had “constructive” conversations internally.
Markey also pressed them on the Kids Internet Design and Safety Act, or KIDS Act, another piece of legislation he’s introduced to stop online practices such as manipulative marketing, noting the impact of social media influencers on children.
“They’re inherently manipulative to young kids, who often cannot tell that they’re really paid advertisements their heroes are pushing, that the hero is getting a monetary kickback,” Markey said. “Should we make it illegal?”
Miller said they would “need to stare at the details of such a bill,” to which Markey again noted, “It’s been around for a while.”
The TikTok executive agreed that there should be additional transparency and additional privacy laws, a sentiment Snap’s Stout echoed, though with the caveat, “We would be happy to look at them.”
When Blumenthal raised the Eliminating Abusive and Rampant Neglect of Interactive Technologies, or EARN IT, Act, which has bipartisan Senate support, Miller said YouTube executives “support the goals of comprehensive privacy legislation.” The senator shot back: “This is the topic that we’ve seen again and again and again, and again. ‘We support the goals.’ But that’s meaningless unless you support the legislation.”
Focus on potential real-world harm on kids
Building on the momentum from the Facebook hearing, the panel argued that social media platforms have been allowed to promote and glorify dangerous content that especially harms the nation’s most vulnerable: children.
While executives defended their platforms and listed actions that they’ve taken internally, senators on the committee highlighted several examples of inappropriate content slipping past those safeguards and getting in front of children.
Sen. Mike Lee, R-Utah, said his staff opened an account saying it was for a teenage girl, and when they opened the “Discover” page with its default settings, found concerning videos.
“They were immediately bombarded with content that I can most politely describe as wildly inappropriate for a child, including recommendations for, among other things, an invite to play an online sexualized video game, tips on why you shouldn’t go to bars alone,” he said, waving his hands with concern.
The Snap executive said the platform’s guidelines restrict sexual content to users 18 and older, “so I’m unclear as to why that content would’ve shown up for an account that was for a 14-year-old.”
Senators reminded the witnesses that Snapchat’s speed filter, which let users display how fast they were moving, remained available for eight years before the company removed it following catastrophic car crashes associated with the app.
Sen. Amy Klobuchar, D-Minn., pressed Snapchat over illegal drug sales on its platform in an argument for greater liability for tech companies, citing the case of Devin Norring, who authorities said died in Minnesota after taking a Percocet laced with fentanyl bought from a drug dealer on Snapchat.
“They can get on your platform and just find a way to buy it, and that is the problem,” she said. “Are you going to get drugs off Snapchat?”
Stout said it was a “top priority” and that it’s happening on other platforms, too.
“I think there are other ways to do this too, such as creating liability when this happens, so maybe that’ll make you work even faster, so we don’t lose another kid,” Klobuchar replied.
Citing a recent investigation by the Wall Street Journal, which found that TikTok’s algorithm can steer young users toward content glorifying eating disorders, drugs and violence, Klobuchar asked bluntly, “Have you stopped that?”
Beckerman said it’s something they’ve taken action on and are “constantly working on,” and repeated their support for the Children and Media Research Advancement Act, or CAMRA Act.
Blumenthal pressed TikTok on its effects on teens, saying his staff created TikTok accounts intended for dance videos, and within a week those accounts were flooded with content about suicidal ideation, self-injury, sex and eating disorders.
Beckerman suggested some of those challenges are overblown by the press and said that’s “not the typical TikTok experience.”
“We found pass-out videos,” Blumenthal said, pausing for dramatic effect. “We found them, so I have a lot of trouble crediting your response on that score.”
“This is stuff occurring in the real world,” he added later.
Copyright © 2021, ABC Audio. All rights reserved.