Latest news with #KickItOut


New York Times
3 days ago
- New York Times
West Ham fined £120,000 after homophobic chanting against Chelsea
West Ham United have been fined £120,000 ($161,830) after supporters engaged in homophobic chanting during February's Premier League fixture against Chelsea.

An independent regulatory commission imposed the fine, an action plan and a formal warning on the Premier League club for misconduct after it was alleged they 'failed to ensure its spectators did not behave in an improper, offensive, abusive, indecent or insulting way with either an express or implied reference to sexual orientation'.

The Football Association (FA) received complaints from anti-discrimination charity Kick It Out following the fixture at Stamford Bridge on February 3 and charged West Ham on March 27 after it was alleged supporters engaged in the 'Chelsea rent boy' chant in the 62nd minute of the top-flight match. West Ham admitted the charge on March 28, a written reasons document for the case states.

An investigation into the incident suggested a 'very significant number' of supporters in the West Ham section had been involved in the 'plainly discriminatory and highly derogatory' chanting, therefore categorising it as 'mass chanting'. It continued for around 40 seconds.

The commission stated West Ham had not identified any of the perpetrators inside the stadium and added there was no evidence to suggest the club had made efforts during or after the game to do so.

A West Ham statement read: 'Homophobic chanting, which in this case amounted to the commission of a criminal offence, is not consistent with the values and beliefs of West Ham United and the vast majority of the club's supporters. The club has a zero-tolerance policy towards discriminatory, abusive and insulting behaviour, and those identified will, in addition to any criminal charges they face, be issued with club bans.'

West Ham were accused of having a 'very significant lack of adequate specific pre-match planning' for the Chelsea fixture, which the commission stated contributed to the club being unable to identify and impose sanctions on those responsible. Club personnel deployed in the away end were also accused of failing to detect the homophobic chanting. West Ham said staff did not receive complaints regarding the chanting during or after the game.

The West Ham statement continued: 'The club has already set in motion tangible actions to review and strengthen existing initiatives to continue to be strategic and proactive in the prevention and detection of any potential discriminatory or inappropriate words or behaviour in the future, at both home and away fixtures.'

In January 2023, the FA added the 'Chelsea rent boy' chant to its list of rule breaches and says it can pursue action against clubs whose supporters use it at matches. In 2022, the Crown Prosecution Service (CPS) confirmed that it considered the term a homophobic slur and therefore a hate crime.

Luton Town were also fined £120,000 after their supporters directed the chant towards former Chelsea midfielder Billy Gilmour during their Premier League fixture against Brighton & Hove Albion in August 2023.


The Star
18-05-2025
- The Star
Online hate, culture of abuse is becoming normalised, study warns
LONDON: Sportspeople and pundits believe online hate is becoming normalised and say it is significantly impacting how they do their jobs, live their lives and express themselves, according to a new report.

Contributors to a new report by UK watchdog Ofcom say online abuse has had profound offline consequences on them – prompting one individual to barricade themselves indoors, while others reported suffering from disordered eating and feelings of helplessness. Others said they self-censored online or while broadcasting for fear of being targeted, while some shied away from moving into on-screen roles at all because they feared doing so would increase their risk of being targeted.

Researchers for Ofcom spoke to seven individuals and conducted nine discussion groups with support from anti-discrimination charity Kick It Out. Participants included sportspeople, on-screen commentators, and professionals working in sport and broadcasting.

The respondents felt online abuse was becoming more common, sophisticated and normalised. They also highlighted how they felt the problem was rapidly evolving, with abusers able to evade filters with different phrases, terms and emojis.

One contributor to the report said: 'I didn't leave my house for a week because of the impact of online abuse, the sort of wave (of intensity) and the amount of people that are abusing you.

'And then the media writes about it and then it becomes this sort of overwhelming feeling of just dread that so many people are saying such horrible things about you, without you actually having done anything.'

Respondents felt abusers were becoming bolder because of a perceived lack of consequences for accounts that post it, and were being incentivised to post hateful and abusive content by the business models of online services that monetise engagement.

Among the named contributors to the report were former cricketer Azeem Rafiq, former footballer Eni Aluko and former rugby union referee Wayne Barnes. Rafiq said nothing could have prepared him for the volume of abuse he received when he spoke out about the racism he suffered while playing at Yorkshire.

Rafiq, who moved from the UK to Dubai because of the abuse, said in the report: 'The impact of this experience on me as a human being and on my mental health has damaged my life to such an extent, I'm not sure I'll ever be able to quantify it.'

Ofcom said the report was part of a broader programme of work to better understand the lived experience of groups and individuals who have been particularly impacted by online harm.

In March, duties came into force under the Online Safety Act that mean platforms must assess the risk of UK users encountering illegal material and use appropriate measures to protect them from it. Ofcom is currently assessing platforms' compliance with these new duties, and will take action if they fail to comply with them. Some platforms will also be subject to additional duties under the Act, such as providing adult users with features that enable them to reduce the likelihood of encountering certain types of legal but harmful content.

Participants in this report said they wanted platforms to enforce their terms of service and reduce online hate and abuse for all users, not just for those who choose to use specific tools. They said existing tools, such as blocking or muting, do not go far enough to help protect them and their families and friends against online hate and abuse.

Kick It Out chair Sanjay Bhandari said: 'The impact of online abuse is undeniable, and the rise in discriminatory social media reports to Kick It Out last season shows it's getting worse.

'Time and again, players and others across the game tell us about the mental toll this abuse takes, and we welcome this new report, which highlights just how deep that impact runs.

'This isn't about a few hateful comments. It's about a culture of abuse that has become normalised. It's about a social media ecosystem that too often enables and amplifies abuse.

'And it's about victims who feel imprisoned by that culture of abuse.'

Jessica Zucker, Ofcom's online safety director, said: 'The UK's new online safety laws mean tech firms now have to start protecting people on their sites and apps from illegal forms of abuse. And when all the rules are fully in force, some of the largest social media platforms will have to give users more control over what they see online.

'People with lived experience of harm online are at the heart of the rules we make and the action we take. We'll be pushing companies hard to make their services safer by design, and holding them to account if they don't.' – PA Media/dpa