Whistleblower: Facebook's response to child abuse 'inadequate'


By Angus Crawford
BBC News

Image caption, The whistleblower said Facebook Groups is where a lot of "abhorrent behaviours occur" (image source: Getty Images)

A former Facebook employee has told US authorities the company's efforts to remove child abuse material from the platform were "inadequate" and "under-resourced".

The allegations are contained in documents seen by BBC News and submitted to the US Securities and Exchange Commission (SEC) two weeks ago.

The anonymous whistleblower says moderators are "not sufficiently trained and are ill prepared".

Facebook said in a statement: "We have no tolerance for this abhorrent abuse of children and use sophisticated technologies to combat it.

"We've funded and helped build the tools used by industry to investigate this terrible crime, rescue children and bring justice to victims."

It added that it had shared its anti-abuse technologies with other companies.

The revelations come after former insider Frances Haugen told the US Congress earlier this month that Facebook's platforms "harm children, stoke division and harm our democracy".

This week she also gave evidence to the UK parliamentary committee examining the proposed Online Safety Bill.

Senior executives from Facebook, Twitter, Google, YouTube and TikTok are also due to give evidence.

The latest revelations come from an unnamed whistleblower with inside knowledge of the teams within Facebook set up to intercept harmful material.

In a sworn statement to the SEC, which regulates securities markets and protects investors, the individual said there was no solution to illegal material at Facebook because there had not been "adequate assets devoted to the problem".

They claim that a small team set up to develop software which could detect indecent videos of children was broken up and redeployed, because it was seen as "too complex".

Facebook says it uses technologies known as PhotoDNA and VideoDNA, which automatically scan for known child abuse images. Each image recovered by law enforcement worldwide and referred to the US National Center for Missing and Exploited Children (NCMEC) is given a unique identifying code.
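
A simplified illustration of that hash-matching approach is sketched below. PhotoDNA's actual perceptual-hashing algorithm is proprietary and not public, so this sketch substitutes an ordinary cryptographic hash and a hypothetical database of known codes; a real system uses perceptual hashes that still match after an image is resized or re-encoded.

```python
import hashlib
from pathlib import Path

# Hypothetical database of identifying codes for known abuse images,
# of the kind referred to NCMEC by law enforcement. In production these
# would be perceptual hashes (e.g. PhotoDNA), not SHA-256 digests.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Compute an identifying code for an image file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_image(path: Path) -> bool:
    """True if the file's code appears in the known-image database."""
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    upload = Path("upload.jpg")  # hypothetical uploaded file
    if upload.exists() and matches_known_image(upload):
        print("Match found: escalate for review and reporting")
    else:
        print("No match against known-image database")
```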

Other accusations from the whistleblower include:

  • Facebook doesn't know the full scale of the problem of child abuse material because it "doesn't track it"
  • A constant question allegedly asked by senior managers was "what's the return on investment?"

The whistleblower told the SEC that this was a legitimate business question, "but not when it comes to public safety issues as critical as child sex abuse".

In the five-page legal document there was also a warning about Facebook "Groups", which were described as "facilitating harm".

The groups, many of which are visible only to members, are where "a lot of terrifying and abhorrent behaviours occur".

Paedophiles "use code words to describe the type of child, the type of sexual activity... they use Facebook's encrypted Messenger service or WhatsApp to share these codes, which change routinely.

"Facebook's system depends on a self-policing model that can't rationally or reasonably be enforced".

Facebook told the BBC that it does scan private groups for content that violates its policies and has 40,000 people working on safety and security, with an investment of more than $13bn (£9.4bn) since 2016.

It said it had actioned 25.7 million pieces of content for child sexual exploitation in Q2 of 2021.

Sir Peter Wanless, chief executive of the NSPCC, said: "These revelations raise profound and disturbing questions about Facebook's commitment to combat illegal child abuse on its services.

"For the first time, evidence from inside Facebook suggests they have abdicated their responsibility to comprehensively tackle child sexual abuse material".

The former employee concluded their statement by writing: "Unless there is... the credible threat of legislative and/or legal action, Facebook won't change".
