X continues to suck at moderating hate speech, according to a new report

A new report from the Center for Countering Digital Hate (CCDH) suggests X is failing to remove posts that violate its own community rules regarding misinformation and hate speech.

  • TheFeatureCreature@lemmy.world · 8 months ago

    Of course it does. It’s literally owned and run by a fascist. I really wish people would stop expecting Twitter to run and behave like it used to. It does not, and it never will again.

  • Obinice@lemmy.world · 8 months ago

    Isn’t hate speech what they’re going for?

    They’re run by literal fascists now; presumably they want it to be a platform for their kind?

  • AutoTL;DR@lemmings.world (bot) · 8 months ago

    This is the best summary I could come up with:


    According to the CCDH, the reported posts, which largely promoted bigotry and incited violence against Muslims, Palestinians, and Jewish people, were collected from 101 separate X accounts.

    Just one of those accounts was suspended over its actions, and the posts that remained live had accrued a combined 24,043,693 views by the time the report was published.

    X filed a lawsuit against the CCDH in July of this year over claims the organization “unlawfully” scraped X data to create “flawed” studies about the platform.

    In a statement to The Verge, X’s head of business operations, Joe Benarroch, said the company was made aware of the CCDH’s report yesterday. He directed X users to a new blog post detailing the “proactive measures” the company has taken to maintain the safety of the platform during the ongoing Israel-Hamas war, including removing 3,000 accounts tied to violent entities in the region and taking action against more than 325,000 pieces of content that violated its terms of service.

    X claims that by measuring only account suspensions, the CCDH has not accurately represented its moderation efforts, and it urged the organization to “engage with X first” to ensure the safety of the X community.

    After publication, Benarroch questioned the methodology of the CCDH’s study, claiming the organization counts a post as “actioned” only after the account behind it has been suspended.


    The original article contains 476 words, the summary contains 224 words. Saved 53%. I’m a bot and I’m open source!