• SomeGuy69@lemmy.world · 4 months ago

    “Yet another stepping stone on the road to objectifying women with AI,” Hugging Face AI researcher Dr. Sasha Luccioni told Ars Technica. “As a woman working in this field, I’m unsurprised but disappointed.”

    Can we all take a step back and look at the whole AI picture? There are thousands of people following a non-existent "woman": an AI contest in which no actual woman is involved, so no woman is exploited by an agency, a husband, or peer pressure, and everyone involved knows it's not real (as opposed to, say, a real human being suffering from anorexia). There's no pressure on a real human to undergo unnecessary surgery, etc.

    How exactly do you make the mental-gymnastics leap to twist this into the objectification of women? Maybe people just enjoy pretty things? It seems to be SFW too, so it's not even nude content. Not a single woman will feel worse because of unachievable AI beauty standards. Current non-AI standards are already unachievable for 99.999% (and more) of people. You can't objectively measure whether a person feels worse comparing themselves to photoshopped supermodels or to AI; both are equally impossible to live up to.

    But a lot of women will make less money by no longer being able to objectify themselves and sell content, as it's now just a click away.

    Of course this doesn’t solve the issue of AI and people creating unsolicited nudes, but that’s a separate problem. On the other hand, you can now post your real nudes online and pretend they’re AI. A lot of women should feel less pressured now, as real nudes lose a lot of their blackmail leverage.

    If you only want to see the negatives, then that’s all you’ll see.