Lindsey Graham tells tech CEOs: ‘You have blood on your hands’ – live

Lindsey Graham to tech CEOs: ‘You have blood on your hands’

In opening statements at Wednesday’s hearing, Sen. Lindsey Graham (R-SC) addressed the parents in attendance, many of whom brought photos of children who they say died or suffered emotional damage as a result of social media.

“To all the victims who came and showed us photos of your loved ones: don’t quit, you’re making a difference,” he said. “Hopefully we can take your pain and turn it into something positive.”

Like Senator Dick Durbin, Graham took aim at the legal immunity that Section 230 affords social media firms, stating it is “now time to repeal” the measure.

“They’re destroying lives, and threatening democracy itself. These companies must be reined in, or the worst is yet to come,” he said.

Referencing the chief executives in attendance, he said: “I know you don’t mean it to be so, but you have blood on your hands.”

His statement drew applause from the passionate crowd, a rare outburst for a congressional hearing. Before the CEOs began their testimony, Durbin had requested that attendees not stand, shout, or applaud witnesses.

“We have a large audience, the largest I’ve seen in this room,” he said. “I know there is high emotion in this room, for justifiable reasons. But I ask you to please follow the traditions of the committee.”

Lindsey Graham at the hearing on Wednesday. Photograph: Bonnie Cash/UPI/Rex/Shutterstock



Graham demands to know if executives support legislation, pushes Zuckerberg on lawsuit questions

Graham pressed Meta CEO Mark Zuckerberg about a teenage Instagram user who died by suicide after being targeted by a sextortion ring on the platform, asking the executive whether he believes the victim’s family should be able to sue the company.

“I think that they can,” Zuckerberg responded. Graham retorted that such lawsuits are often thrown out due to Section 230, and promoted bipartisan legislation that would make it easier for victims to take companies to court.

In a tense line of questioning, Graham asked the executives whether they supported such legislation, and noted that they refused to answer directly with a “yes” or “no”.

“The bottom line, I’ve come to conclude, is that you aren’t going to support any of this,” he said. “If you’re waiting on these guys to solve the problem, we’re going to die waiting.”

X becomes first online platform to endorse the Stop CSAM Act

In her opening statements before Congress, X chief executive officer Linda Yaccarino said the company endorses the Stop CSAM Act, a bill introduced by Sen. Dick Durbin (D-IL) that would remove legal immunity for civil claims against internet companies over child sex abuse material.

“You have my personal commitment that X will be active and a part of this solution,” Yaccarino said. “X believes that the freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency.”

Durbin thanked her for being the “first social media company” to publicly endorse the act. “It is our honor, chairman,” she replied. Yaccarino said in her opening remarks that X had bolstered its resources for dealing with child sexual abuse material, though by how much is unclear: Elon Musk has reduced the size of the company by more than half, including its trust and safety teams.

Yaccarino stopped short of endorsing the Kids Online Safety Act (Kosa), another bill targeting section 230 immunity for social media firms, though she said X “supports the progress” of the bill. Snap, the parent company of Snapchat, recently endorsed the measure through its chief executive, Evan Spiegel.

Kosa and the Stop CSAM Act have been flagged by civil and digital rights groups for potential privacy and freedom-of-speech violations, due in part to their targeting of encryption tools and the potential for the bills to be used against LGBTQ content misidentified as dangerous.


Instagram whistleblower says emails show wellbeing was never priority at Meta

Dan Milmo

Arturo Béjar, a former senior engineer and consultant at Meta, testified in Washington in November about what he described as child safety failings at the company. He said emails released in advance of Wednesday’s hearing showed Zuckerberg and other executives failing to act on issues such as suicidal ideation.

“If this work was a priority, there would be a thousand engineers and product managers on well-being. This is a company with over 30,000 engineers,” he told the Guardian.


Meta rejected greater investment in child safety, internal documents show

Dan Milmo

Mark Zuckerberg was asked in 2021 by his senior lieutenant Sir Nick Clegg for more resources to improve teen wellbeing on Meta’s platforms, according to emails released by Sen. Richard Blumenthal (D-CT) ahead of Wednesday’s hearing.

Clegg, then vice-president for global affairs and communications at the Facebook and Instagram owner, asked for more investment in August 2021 to “strengthen our position on well-being across the company”. He said the matter was becoming urgent as “politicians in the US, UK, EU and Australia are publicly and privately expressing concerns about the impact of our products on young people’s mental health”, and that the company’s efforts to address wellbeing concerns among users were “being held back by a lack of investment”.

In one reply to Clegg, Sheryl Sandberg, then chief operating officer at Meta, said she was “supportive” of the investment request but “we have overall budgeting issues across the board so no promises on what will happen.”

“The hypocrisy is mind-boggling,” Blumenthal told The New York Times. “We’ve heard time and time again how much they care and are working on this, but the documents show a very different picture.”

In response to criticism of its efforts to safeguard children’s well-being, Meta has said it employs 40,000 people to improve trust and safety and has invested $20bn in such efforts since 2016.


Social media executives share prepared statements; Zuckerberg calls for Apple and Google to protect children via app stores

The congressional hearing investigating online sexual and other exploitation of children is under way, with executives from Meta, X, TikTok, Snap, and Discord sharing prepared statements.

Ahead of the statements, Sen. Dick Durbin (D-IL) noted that Mark Zuckerberg of Meta and Shou Zi Chew of TikTok were appearing voluntarily, whereas Linda Yaccarino of X (formerly Twitter), Evan Spiegel of Snap, and Jason Citron of Discord had to be subpoenaed and forced to appear.

“I hope this is not a sign of your commitment or lack of commitment to addressing the serious issue before us,” he said.

After being sworn in, the executives began their testimonies, focusing largely on what tools their platforms have released to protect kids online. Many of them mentioned their own children and expressed their commitment to protecting kids online.

“All of us here on this panel today throughout the tech industry have a solemn and urgent responsibility to ensure that everyone who uses our platform is protected from these criminals, both online and off,” Citron said in his opening statements.

Zuckerberg stated that Meta has introduced more than 30 such tools over the last eight years, including controls that let parents set time limits for app usage and see who their children are following and engaging with online. He added that Meta has spent $20bn on safety and security since 2016 and employs about 40,000 people to address such concerns.

“We build technologies to tackle the worst online risks and share it to help our whole industry get better,” he said.

Zuckerberg sought to shift more responsibility for children’s safety online to Apple and Google, which operate the world’s biggest app stores. His company made a similar point last year when it called for legislation that would require parental approval for app purchases and downloads by teens.

Shou Zi Chew of TikTok cited the app’s “robust community guidelines” and its family pairing tools, which let parents set screen time limits and filter out certain content. He said TikTok has more than 40,000 trust and safety professionals and expects to invest more than $2bn in trust and safety efforts in 2024 alone.

“Keeping kids safe online requires a collaborative effort as well as collective action,” he said. “We’ve shared the community’s concern and commitment to protect young people online. We welcome the opportunity to work with you on legislation to achieve this goal.”

Evan Spiegel of Snap, like other executives, acknowledged the victims of online harms in the room and the parents of children who have been affected.

“Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm,” he said. “I want to be clear that we understand our responsibility to keep our community safe.”


Family members of children who killed themselves after online sexual exploitation are attending the congressional hearing with portraits in hand.

Families inside the hearing room on Wednesday. Photograph: Katie McQue/The Guardian

The mood in the room is tense and somber. Dozens of family members of victims are in the audience, holding up photos of their loved ones in the hope of catching the attention of the CEOs as they filed into the room flanked by staff. One woman quietly wept into her handkerchief. The families have applauded senators’ jibes at the tech CEOs multiple times.

One opening remark by Durbin elicited laughter: “Coincidentally, some of these platforms have implemented common-sense child safety protections within the last week.”



Meta’s new parental tools will not protect vulnerable children, experts say

Tech firm gives parents greater control over their children’s online activities, but not all kids have consistent supervision

  • Meta introduced new parental supervision tools in July, but child protection and anti-sex trafficking organizations say the new measures offer little protection to the children most vulnerable to exploitation and shift responsibility for keeping users safe away from the company.

Meta estimates about 100,000 children using Facebook and Instagram receive online sexual harassment each day, including “pictures of adult genitalia”, according to internal company documents made public as part of a lawsuit by the New Mexico attorney general.


Congressional hearing opens with stories of exploited children

A Senate judiciary committee hearing on Wednesday exploring how social media firms allegedly fail to protect their youngest users opened with a video featuring the voices of children who described being sexually exploited on Facebook, Instagram, X, and other platforms. The video also featured parents who say their children killed themselves following sexual exploitation online.

“Big tech failed to protect me from online sexual exploitation,” one child said in the video.

“We need Congress to do something for our children and protect them,” a parent said.

In opening statements, Senator Dick Durbin said online child sexual exploitation is “a crisis in America”. He said executives in attendance on Wednesday represent tech companies that are “responsible for many of the dangers our children face online.” In attendance at the hearing are Mark Zuckerberg of Meta, Linda Yaccarino of X (formerly Twitter), Shou Zi Chew of TikTok, Evan Spiegel of Snap, and Jason Citron of Discord.

“Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety of all put our kids and grandkids at risk,” Durbin said.

The Senator called for legislation to address the harms, targeting section 230 of the Communications Decency Act, a law that exempts social media firms from legal liability for content and activity on their platforms.

“Let this hearing be a call to action: we need to get kids online safety legislation to the president’s desk,” Durbin said.


US surgeon general issues advisory on ‘profound’ risks of child social media use

In May, Dr Vivek Murthy called on tech companies and policymakers to take ‘immediate action’ to protect children’s mental health

  • Social media use by children and teenagers can pose a “profound risk of harm” to their mental health and wellbeing, the US surgeon general has warned. Murthy said that in the absence of robust independent research it is impossible to know whether social media is safe for children and adolescents.

  • “The bottom line is we do not have enough evidence to conclude that social media is, in fact, sufficiently safe for our kids. And that’s really important for parents to know,” he said.

How Facebook and Instagram became marketplaces for child sex trafficking

Our two-year investigation suggests that the tech giant Meta is struggling to prevent criminals from using its platforms to buy and sell children for sex.


‘The tide has turned’: why parents are suing US social media firms after their children’s deaths

Social media firms have faced scrutiny from Congress over their impact on young users, but parents who have lost kids to online harm are now leading the charge

‘Fundamentally against their safety’: the social media insiders fearing for their kids

Parents working for tech companies have a first-hand look at how the industry works – and the threats it poses to child safety

  • “I really can’t imagine a world where, as things stand today, these things are safe for a 13-year-old to use,” Arturo Béjar told the Guardian. Béjar left Facebook in 2015 after spending six years making it easier for users to report problems on the platform. But it wasn’t until after his departure that he witnessed what he described in recent congressional testimony as the “true level of harm” the products his former employer built are inflicting on children and teens – his own included. He discovered that his then 14-year-old daughter and her friends were routinely subjected to unwanted sexual advances, harassment and misogyny on Instagram, according to his testimony.

CEOs of Meta, X, TikTok, Snap and Discord arrive in Congress

Kari Paul

Chief executives from five major social media firms arrived in Congress on Wednesday morning to face questioning about alleged harms to young users caused by their platforms.

The hearing, titled “Big Tech and the Online Child Sexual Exploitation Crisis”, promises to “examine and investigate the plague of online child sexual exploitation”, according to a statement from the US Senate Judiciary Committee. In attendance are chief executive officers including Mark Zuckerberg of Meta, Linda Yaccarino of X (formerly Twitter), Shou Zi Chew of TikTok, Evan Spiegel of Snap, and Jason Citron of Discord.

In a speech on Tuesday preceding the hearing, Senator Dick Durbin said combating dangers faced by children online has been one of his “top priorities” as chair of the committee and said he plans to ask executives “what they’re doing to make their platforms inaccessible to child sex offenders”.

“As recently as last week, some have launched new child safety measures that are long overdue, but it should not take a hearing before the Senate Judiciary Committee to finally get these companies to prioritize child safety,” he said. “Because these changes are half measures at best, I welcome the opportunity to question them about what more needs to be done.”

Executives appearing in Congress are expected to highlight controls and tools introduced to manage children’s online experiences and mitigate harm. In prepared remarks, Zuckerberg stated that Meta has introduced more than 30 such tools over the last eight years, including controls that let parents set time limits for app usage and see who their children are following and engaging with online. He added that Meta has spent $20bn on safety and security since 2016 and employs around 40,000 people to address such concerns.

“We’re committed to protecting young people from abuse on our services, but this is an ongoing challenge,” he said. “As we improve defenses in one area, criminals shift their tactics, and we have to come up with new responses.”
