‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

  • ByteJunk@lemmy.world · 7 months ago

    I don’t think they’re really comparable?

    These AI pictures are "make believe". They're just a guess at what someone might look like nude, based on what human bodies generally look like. Even if they appear realistic, it's still a "generic" nude, kind of how someone would fantasize about a person they're attracted to.

    Of course it’s creepy, and sharing them is clearly unacceptable as it’s certainly bullying and harassment. These AI nudes say more about those who share them than they do about who’s portrayed in them.

    However, sharing intimate videos without consent, and especially as revenge? That's a whole other level of fucked up. The AI nudes are ultimately "lies" about someone; they're fakes. Sharing an intimate video betrays someone's trust. It exposes something that is private but very real.