Rio Ferdinand says football is 'sliding backwards' because of racist abuse online

Rio Ferdinand says he "expected" racist abuse towards black England stars after their Euros penalty miss

Rio Ferdinand says football is "sliding backwards" and racism is being normalised because of the prevalence of racist abuse online.

The former England captain was speaking to a joint parliamentary committee seeking possible improvements to the government's draft Online Safety Bill.

Ferdinand, 42, said he had seen family members "disintegrate" after he was abused on social media.

"Self-esteem, your mental health is at risk," Ferdinand said.

The former Manchester United defender said it was "disheartening" to see levels of racism in football rising to the levels of the 1970s and 80s.

He cited the abuse received by England players Marcus Rashford, Bukayo Saka and Jadon Sancho after July's penalty shootout defeat by Italy in the Euro 2020 final.

"When those three players missed those penalties, the first thing I thought was 'let's see what happens on social media'," Ferdinand said.

"I expected [the abuse] to happen."

On Wednesday Ferdinand's brother Anton spoke to the Home Affairs select committee, asking whether it would take a tragedy for social media companies to act on online racist abuse.

At that hearing, representatives of Twitter and Instagram argued they were attempting to address the issue.

Katy Minshall, head of UK public policy and philanthropy at Twitter, said the company was starting to focus its work on the ease with which footballers could be contacted on social media, while Tara Hopkins, the director of public policy at Instagram, said 95% of hateful content was proactively removed from the platform.

'I have seen members of my family disintegrate'

Rio Ferdinand now works as a pundit after earning 83 England caps in a 14-year international career

Rio Ferdinand told the joint committee of MPs and peers that it was "baffling" that social media companies had tools that track copyright breaches on, for example, his YouTube channel but could not use that same technology to pick up certain emojis or words used in racially abusive posts.

He stressed that harmful content affected more than just the person who received it, saying: "I have seen members of my family disintegrate at times when it happens."

"I have to sit there with my kids and explain what the monkey emoji means in that context," Ferdinand added.

The former Leeds and West Ham player condemned the fact that perpetrators were allowed to remain anonymous online, saying it was "normalising racist behaviour".

"If you put it in the context of a young person who supports a certain player at whatever level, he is looking through that feed and seeing racist language," he continued.

"That young person then goes into his network of friends and 'it's fine, it's normal so I'll say that at school so it's OK'.

"When there are no repercussions, there is nothing done to expose that person for their ignorant language, then people are going to think it's normal."

Ferdinand agreed that social media companies profited from prejudice and said that placing the onus on victims to report abuse or block abusers was "an easy cop-out" for platforms.

'Layered' approach needed to verify users

Ferdinand spoke to the committee alongside Edleen John, the Football Association's director of international relations, corporate affairs and equality, diversity and inclusion, and Kick It Out chair Sanjay Bhandari.

John commented on how the draft bill could tackle the issue of identity verification for social media users, suggesting a "layered" approach.

"Social media companies seem to believe that it's a binary option where people have to provide all information or no information," she said.

John instead suggested "multiple mechanisms" could be used to tackle the issue, using identity verification alongside "default settings" and limiting the reach of an account.

She believed this approach would reduce the use of 'burner' accounts, which are set up to send abusive messages before being quickly deleted, with users able to set up a new account shortly after.

John said she "consistently receives platitudes" from social media organisations in response to concerns and cited anecdotal evidence of a player who was blocked by a platform for reporting abuse too many times.

Bhandari said the current system was "frictionless" and called for an amendment to the draft bill that would give communications regulator Ofcom "power to introduce codes of practice".

"You need to give Ofcom the power to regulate harmful but legal content," he said.

Bhandari added that the public response to the abuse of the England trio after the Euro 2020 final should offer a template for future regulations.

"The unified public condemnation of that says to us that what the public are demanding is that every piece of hate that was spat out that night needed to be taken off the platforms," he continued.

"The way in which to do that is to give a regulator the power to reflect contemporary social practices.

"We would have to put checks and balances in. We also have to balance against dealing with an evolving problem. We can't legislate through the rear-view mirror, we have to legislate through the windscreen, looking ahead."

Imran Ahmed, chief executive and founder of the Center for Countering Digital Hate (CCDH), also spoke at the hearing after a report from his organisation in July found Instagram failed to remove 94% of accounts that had sent racist messages to Rashford, Saka and Sancho after the Euros final defeat.

"When it comes to racism against footballers, the reason why the abuse matters isn't because they're a wealthy footballer," Ahmed said.

"Imagine what they would call me, my mum or anyone else from a minority. It is a sense of, 'these places aren't for you. These are our places'."

What is the draft Online Safety Bill?

The committee hearing is one of several steps in the process of the draft bill - which aims "to establish a new regulatory framework to tackle harmful content online" - being passed into law.

The draft bill places new duties on social media firms to remove harmful content quickly or potentially face multi-billion-pound fines.

Some campaigners say the plans will lead to censorship, while others warn fines do not go far enough.

The draft legislation has been two years in the making and is especially geared towards keeping children safe.

As well as racist abuse, it covers terrorism, disinformation, pornography, grooming, revenge porn, hate speech, images of child abuse and posts relating to suicide and eating disorders.

Late additions to the bill include provisions to tackle online scams, such as romance fraud and fake investment opportunities.
