Sports Illustrated was publishing articles under seemingly fake bylines. We asked their owner about it — and they deleted everything.

  • 7112@lemmy.world · 7 months ago

    AI can be such a great tool, but they just use it for this crap… We need regulation

    • Fredselfish@lemmy.world · 7 months ago

      Unfortunately, our government is full of 80-year-olds whom we don't want regulating it, because any regulation would end up being whatever the corporate lobbyists want, and that's not what average Americans want.

      • 7112@lemmy.world · 7 months ago

        Sadly, you're right. At a minimum, we need a basic rule requiring that content made with AI be labeled as such.

    • r3df0x ✡️✝☪️@7.62x54r.ru · 7 months ago

      I don’t understand what the purpose of regulation would be.

      If a company publishes bad articles using AI, it will lose business and the problem will be self-correcting.

      • Very_Bad_Janet@kbin.social (OP) · 7 months ago

        In this case, the magazine created fake journalists with fake bios and used them in the bylines. I would think that's a kind of fraud, and it definitely hurts the magazine's trustworthiness. The article also covers other publications that have put fake journalists in the bylines of AI-produced articles. I think if they were more transparent, say by labeling an article as "AI-created content," it wouldn't be as much of an issue.

        This also doesn't address errors in the AI-produced articles (which could be solved by having a real-life fact checker or editor).

  • AutoTL;DR@lemmings.world (bot) · 7 months ago

    This is the best summary I could come up with:


    And even more strangely, his profile photo on Sports Illustrated is for sale on a website that sells AI-generated headshots, where he’s described as “neutral white young-adult male with short brown hair and blue eyes.”

    Ortiz isn’t the only AI-generated author published by Sports Illustrated, according to a person involved with the creation of the content who asked to be kept anonymous to protect them from professional repercussions.

    On its review section’s title page, for instance, the site still proudly flaunts the expertise of AI-generated contributors who have since been deleted, linking to writer profiles it describes as ranging “from stay-at-home dads to computer and information analysts.”

    Or look at “Denise McNamara,” the “information analyst” that TheStreet boasted about — “her extensive personal experience with electronics allows her to share her findings with others online” — whose profile picture is once again listed on the same AI headshot marketplace.

    Back in February, when the company first started publishing AI-generated health advice at its magazine Men’s Journal, we found that its first story was riddled with errors, prompting it to issue a massive correction.

    Needless to say, neither fake authors who are suddenly replaced with different names nor deplorable-quality AI-generated content with no disclosure amount to anything resembling good journalism, and to see it published by a once-iconic magazine like Sports Illustrated is disheartening.


    The original article contains 1,620 words, the summary contains 217 words. Saved 87%. I’m a bot and I’m open source!