A.I. Is Making the Sexual Exploitation of Girls Even Worse
Parents, schools and our laws need to catch up to technology, fast.

    • VaultBoyNewVegas@lemmy.world · 10 months ago

      AI is definitely making things worse. When I was at school there was no tool for creating deepfakes of girls; now boys sneak a pic and use an app to undress them. That then gets shared, the girls find out, and obviously they become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls’ clothes. Neither of those was happening at such a scale in schools before, as it would create a shit show.

      Also, most jurisdictions don’t actually have strict AI laws yet, which is making it harder for authorities to deal with this. If you genuinely believe that AI isn’t at fault here, then you’re ignorant of what’s happening around the world.

      https://www.theguardian.com/technology/2024/feb/29/clothoff-deepfake-ai-pornography-app-names-linked-revealed That’s an article about one company that provides an app for deepfakes. It’s run through a shell corporation, so it isn’t easy to shut down through the law or to arrest the people behind it, and hundreds of teenage girls have already been affected by others creating non-consensual nudes of them.

      • tsonfeir@lemm.ee · 10 months ago

        When I was a kid I used to draw dirty pictures and beat off to them. AI image creation is a paintbrush.

        I very much disagree with using it to make convincing deepfakes of real people, but I struggle with laws restricting its use otherwise. Are images of ALL crimes illegal, or just the ones people dislike? Murder? I’d call that the worst crime, but we sure do love murder images.

      • DarkThoughts@fedia.io · 10 months ago

        AI is definitely making things worse. When I was at school there was no tool for creating deepfakes of girls; now boys sneak a pic and use an app to undress them. That then gets shared, the girls find out, and obviously they become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls’ clothes.

        I’m sorry, but this is bullshit. You could “photoshop” someone’s face or head onto someone else’s body before “AI” was a thing. Here’s a tutorial that lets you do it within minutes, or seconds if you know what you’re doing: https://www.photopea.com/tuts/swap-faces-online/

        That’s an article about one company that provides an app for deepfakes. It’s run through a shell corporation, so it isn’t easy to shut down through the law or to arrest the people behind it, and hundreds of teenage girls have already been affected by others creating non-consensual nudes of them.

        Also a very ignorant take. You can download Stable Diffusion for free and add a face swapper to it too. Generating decent-looking bodies might actually take you longer than just taking a nude photo of someone and using my previous editing method, though.

        • ominouslemon@lemm.ee · 10 months ago

          You could do everything before, that’s true, but you needed knowledge, time and effort, so the phenomenon was very limited. Now that it’s easy, the number of victims (if we can call them that) is huge. And that changes things. It’s always been wrong. Now it’s also a problem.

          • BringMeTheDiscoKing@lemmy.ca · 10 months ago

            This is right. To do it before, you had to be a bit smart and motivated. That’s a smaller cross-section of people. Now any nasty fuck with an app on their phone can bully and harass their classmates.

          • DarkThoughts@fedia.io · 10 months ago

            I’m not sure you listened to what I said or even attempted it yourself. The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do.

            In both cases the result should just be ignored as far as personal feelings go, and reported as far as legal matters go, or reported to your teachers. You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because “AI” is such a hot topic at the moment. In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.

            • ominouslemon@lemm.ee · 10 months ago (edited)

              The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do.

              You’re lying to yourself and you must know that, or you’re just making false assumptions. But let’s go through this step by step.

              Now with a “nudify” app:

              • install a free app
              • snap a picture
              • click a button
              • you have a fake nude

              Before:

              • snap a picture
              • go to a PC
              • buy Photoshop for $30/month (sure) or search for a pirated version, download a crack, install it and pray that it works
              • find a picture that fits with the person you’ve photographed
              • read a guide online
              • try to do it
              • you have (maybe) a bad fake nude

              That’s my first point. Second:

              the result should just be ignored as far as personal feelings go

              Tell it to the girl who killed herself because everyone thought that her leaked “nudes” were actual nudes. People do not work the way you think they do.

              You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because “AI” is such a hot topic at the moment

              True, you probably don’t need new laws. But the emergence of generative AI warrants a public discussion about its consequences. There IS a lot of hype around AI, but generative AI is here and is having/will have a tangible impact. You can be an AI skeptic but also recognise that some things are actually happening.

              In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.

              For this to happen, things will have to get WAY worse before they get better. And that means people will suffer and possibly kill themselves, like it’s already happened. Are we ready to let that happen?

              Also, we’re talking only about fake nudes, but if you think about the fact that GenAI is going to spread throughout every aspect of our world, your point becomes even more absurd.

              • DarkThoughts@fedia.io · 10 months ago

                buy Photoshop for $30/month (sure) or search for a pirated version, download a crack, install it and pray that it works

                I literally gave you a tutorial for a free web app, and you come here with a massive bad-faith text wall that starts with “you must buy/pirate literal Photoshop to do image editing”. Sorry, but the difficulty you are seeing here is not the method but the thing in front of your monitor.

                • find a picture that fits with the person you’ve photographed
                • read a guide online
                • try to do it
                • you have (maybe) a bad fake nude

                Speaking of lying to oneself… Where are your several attempts at nudifying, just to figure out that you actually need a good photo too, with good lighting, clothes that contrast well with the background, and a pose that is probably frontal and without arms obstructing, and then you pray the model manages to draw a half-decently realistic-looking body over it without any artifacts or mutations? Generative models like these aren’t magic; they have many faults, and you clearly show that you have absolutely no clue about the topic if you think you can just snap a picture and press a button.

                Tell it to the girl who killed herself because everyone thought that her leaked “nudes” were actual nudes. People do not work the way you think they do.

                As tragic as that is, a photoshopped picture would’ve resulted in the same outcome, and you can probably even find such cases if you dig through old news articles too.

                But the emergence of generative AI warrants a public discussion about its consequences.

                No, it only shows that we’ve slept on law enforcement related to digital topics for decades, thanks to all the boomers in politics who have even less of a clue about the topic than people like you, and all those who ridiculed everyone using computers before computers eventually reached mainstream audiences.

                The problem here is also that fear mongering and dishonest “discussions” like this only lead to draconian, overbearing laws that end up really hurting the wrong end of it, while doing little if anything about the actual issue behind it, which isn’t generative “AI” or manual photo editing. It’s akin to using the topics of terrorism or child pornography to ban things like encryption, or to implement web filters and other highly invasive mass surveillance methods such as data logging or facial recognition.

                Are we ready to let that happen?

                I mean, I’m not, and I wasn’t even before harassment of this type had any “AI” aspects to it. But you all were ready to let it happen, again, until the “AI” aspects started to cause the media hysteria. If you dig through articles about cases like this or of a similar nature (cyber-bullying was a popular umbrella term before), you’ll find that people simply did not take this kind of stuff seriously, so those acts went unpunished. That was, and still is, the main issue.

                Many, if not most, countries don’t even give you the rights to an image of yourself. How can you then expect someone who edits photos of a person and publishes them without the photographed person’s consent to be held legally liable? After all, what crime would they have committed if the exposed person in the picture isn’t even real, except for maybe the head, which was publicly visible at the time of the photo too?

                Also we’re talking only about fake nudes, but if you think about the fact that GenAI is going to spread throughout every aspect of our world, your point becomes even more absurd

                Here’s the thing: this is going to happen either way. That’s why we need to understand that we have to find ways to live with it. You could ban generative models, but that would mostly just stop the legal usage, while others would continue to use them illegally. Maybe your average inept zoomer kid would have trouble finding an app for his smartphone to do it, but it would still happen.

      • dgmib@lemmy.world · 10 months ago

        AI is a genie that can’t be put back into its bottle.

        Now that it exists, you can’t make it go away with laws. If you tried, at best all you’d do is push it to sketchy servers hosted outside the jurisdiction of whatever laws you passed.

        AI is making it easier than before to (ab)use someone by creating a nude (or worse) with their face. That is a genuine problem; it existed before AI, and it’s getting worse.

        I’m not saying you have to like it, but if you think laws will make it unavailable, you’re dreaming.