Findings from study commissioned by PFA and carried out by data science company Signify include: targeted racist abuse peaked in May 2021; homophobia is the most common form of online discriminatory abuse targeting UK professional footballers
Thursday 5 August 2021 07:40, UK
Two in five Premier League players received abusive messages on Twitter last season, a new Professional Footballers' Association-funded study has found.
The players' union worked with ethical data science company Signify to monitor levels of abuse on the social media platform during the 2020-21 season.
The research found that 176 of the 400 players found to hold a Twitter account (44 per cent) received abuse.
Signify used machine learning systems, and the data revealed that 20 per cent of all abusive messages were sent to just four unnamed players.
The study analysed over six million posts and ran a deeper analysis of more than 20,000 flagged posts, identifying 1,781 explicitly abusive messages from 1,674 accounts, which were reported to Twitter.
The data found unmoderated racist online abuse increased by 48 per cent in the second half of the 2020-21 football season.
The survey found 50 per cent of abusive accounts came from the UK, while a third of the abusive accounts were affiliated to a UK club.
It also found players across the leagues faced homophobic, ableist and sexist abuse, with homophobic abuse representing nearly a third of all detected abuse.
Targeted abuse peaked in May 2021, shortly after a four-day football-wide social media boycott was held to draw further attention to online hate.
The report attributed the spike in May to an incident in the FA Cup final between Chelsea and Leicester, compounded by the clubs meeting again in the Premier League three days later.
It also found more than three-quarters of the 359 accounts sending explicitly racist abuse to Premier League, Women's Super League and English Football League players were still on the platform.
Only 56 per cent of racially abusive posts identified throughout the season had been removed, with some posts remaining live for months and, in some cases, for the full duration of the season.
PFA chief executive Maheta Molango said: "The time has come to move from analysis to action. The PFA's work with Signify clearly shows that the technology exists to identify abuse at scale and the people behind offensive accounts.
"Having access to this data means that real-world consequences can be pursued for online abuse. If the players' union can do this, so can the tech giants."
The report also found that Twitter applied a hierarchical order to its moderation.
Twenty-seven per cent of abusive posts directed at Premier League players are no longer visible, dropping to 17 per cent for abuse targeted at EFL players and 12 per cent for WSL players.
Watford captain and PFA Players' Board representative Troy Deeney said: "Social media companies are huge businesses with the best tech people. If they wanted to find solutions to online abuse, they could. This report shows they are choosing not to. When is enough, enough?"
Twitter does not believe the report fully or fairly reflects the steps it has taken to proactively enforce its rules.
The platform recently added new conversation settings that allow people on the platform, particularly those who have experienced abuse, to choose who can reply to the conversations they start.
A Twitter spokesperson said: "It is our top priority to keep everyone who uses Twitter safe and free from abuse. While we have made recent strides in giving people greater control to manage their safety, we know there is still work to be done.
"We continue to take action when we identify any tweets or accounts that violate the Twitter rules. We welcome people to freely express themselves on our service. However we have clear rules in place to address threats of violence, abuse and harassment and hateful conduct.
"For example, in the hours after the Euro 2020 final, using a combination of machine-learning-based automation and human review, we swiftly removed over 1,000 tweets and permanently suspended a number of accounts for violating our rules - the vast majority of which we detected ourselves proactively using technology.
"We welcome peer-reviewed research and reports from partners and third-party organisations that help to identify tweets or accounts that violate our policies, which contributes to our work to improve the health of the public conversation.
"What is important to note is that these types of reports are made possible because our public API (application programming interface) is open and accessible to all, which is not true of most other technology companies.
"We have engaged and continue to collaborate with our valued partners in football, to identify ways to tackle this issue collectively.
"We want to reiterate that abusive and hateful conduct has no place on our service and we will continue to take swift action on the minority that try to undermine the conversation for the majority. We will also continue to play our part in curbing this unacceptable behaviour - both online and offline."