• RainfallSonata@lemmy.world (+212/-13) · 11 months ago

    I never understood how they were useful in the first place. But that’s kind of beside the point. I assume this is referencing AI, but due to the fact that you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.

    • Hildegarde@lemmy.world (+249/-1) · 11 months ago

      The point of verification photos is to ensure that NSFW subreddits only host content posted with consent. Many posts were just random nudes someone found, where the subject was not okay with having them posted.

      The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. They often require the paper be crumpled to make it infeasible to photoshop.

      If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.

      • RainfallSonata@lemmy.world (+71/-3) · edited · 11 months ago

        I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.

        • RedditWanderer@lemmy.world (+70/-1) · 11 months ago

          It’s been used way before the nsfw stuff and the advent of AI.

          Back in the day, if you were doing an AMA with a celeb, the picture was proof that this was the account the celeb was using. It didn’t need to be their account, and it was only useful for people with an identifiable face. If you were doing an AMA because you were some specialist or professional, showing your face and username didn’t prove anything; you had to provide paperwork to the mods.

          This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.

      • oce 🐆@jlai.lu (+30/-2) · 11 months ago

        Was it really that hard to Photoshop well enough to bypass mods who are not experts at photo forensics?

      • DominusOfMegadeus@sh.itjust.works (+13) · 11 months ago

        On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.

      • trolololol@lemmy.world (+2/-1) · 11 months ago

        How did traditional photo verification, before AI, know the image was not manipulated? In this post the paper is super flat, and I’ve seen many others like it.

        • Hildegarde@lemmy.world (+10) · 11 months ago

          From reading the verification rules from /r/gonewild they require the same paper card to be photographed from different angles while being bent slightly.

          Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads as the same in every image is much more difficult.

          • stebo@lemmy.dbzer0.com (+8/-1) · 11 months ago

            That last thing will still be difficult with AI. You can generate one image that looks convincing, but generating multiple images that are consistent? I doubt it.

            • EldritchFeminity@lemmy.blahaj.zone (+2) · 10 months ago

                I feel like you could do this right now by hand (if you have experience with 3d modelling) once you’ve generated an image. 3d modelling often includes creating a model from references, be they drawn or photographs.

                Plus, I just remembered that creating 3d models of everyday objects/people via photos from multiple angles has been a thing for a long time. You can make a setup that uses just your phone and some software to make 3d printable models of real objects. No reason preventing someone from using a series of AI generated images instead of photos they took, so long as you can generate a consistent enough series to get a base model you can do some touch-up by hand to fix anything that the software might’ve messed up. I remember a famous lady in the 3d printing space who I think used this sort of process to make a complete 3d model of her (naked) body, and then sold copies of it on her Patreon or something.

        • KneeTitts@lemmy.world (+1) · 10 months ago

          Just ask for multiple photos of the person in the same place. AI has a hard time with temporal coherence, so in each picture the room items will change, the face will change a bit (maybe a lot), hairstyles will change… etc.

    • EldritchFeminity@lemmy.blahaj.zone (+8/-1) · 10 months ago

      I had some trouble figuring out what exactly was going on as well, but the Stable Diffusion subreddit gave away that it was at least AI related, as that’s one of the popular AI programs. It wasn’t until I saw the tag though, that I really understood - Workflow Included. Meaning that the person included the steps they used to create the photo in question. Which means that the person in the photo was created using the AI program and is fake.

      The implications of this sort of stuff are massive too. How long until people are using AI to generate incriminating evidence to get people arrested on false charges, or the opposite: creating false evidence to get away with murder?

    • ysjet@lemmy.world (+5/-22) · 11 months ago

      Pretty sure it started because nsfw subreddit mods realized they could demand naked pictures of women that nobody else had access to, and it made their little mod become a big mod.

      • chaogomu@kbin.social (+35/-1) · 11 months ago

        Verification posts go back further than Reddit.

        They were used extensively on 4chan, because they were the only way to prove that a person posting was in fact that person. And yes, it was mostly people posting nudes, but it was more that they wanted credit.

        The reason it carried on to Reddit was because people were using the accounts to advertise patreon and onlyfans, and mods mostly wanted the people making money off the pictures to be the people who took those pictures.

        Also it was useful for AMA posts and other such where a celebrity was involved.

        • ysjet@lemmy.world (+1) · 10 months ago

          4chan was a bit different in that it was anonymous to begin with- and more to the point, it was self-volunteered verification, not a mod-driven requirement.

          As for reddit, mods were requiring private verification photos LONG before patreon and onlyfans even existed in the first place.

          AMAs, agreed.

      • hansl@lemmy.world (+3/-1) · 11 months ago

        “No no it’s not about consent it’s about someone being horny” is such a bad take… and bad taste.

        • ysjet@lemmy.world (+1) · edited · 10 months ago

          I hate to break this to you, but there were in fact subreddits that publicly stated they required you to privately DM mods full-body, full-face nudes in poses of the mod’s choice for verification.

          That ain’t me being in bad taste, it’s just me making a basic observation. In some subreddits it was about verification, yes. In some it was about consent. In some it was about the mods being horny. And in most, it was some combination of the three.

          To pretend that it didn’t happen is… well, casual erasure of sexual misconduct of the mods, frankly.

  • Margot Robbie@lemmy.world (+128/-6) · 10 months ago

    Due to having so many people trying to impersonate me on the internet, I’ve become somewhat of an expert on verification pictures.

    You can still easily tell that this is fake because, if you look closely, the details, especially the background clutter, are utterly nonsensical.

    1. The object over her right shoulder (your left), for example, looks like someone blended a webcam with a TV and a nightstand.
    2. Over her left shoulder (your right), her chair exists only on that one side and blends into the counter in the background.
    3. Is it a table lamp or a wall-mounted light?
    4. The doorframe in the background behind her head is not even aligned.
    5. Her clavicles are asymmetrical; I’ve never seen that on a real person.
    6. Her wispy hair strands. Real hair doesn’t appear out of thin air in loops.

    • Honytawk@lemmy.zip (+45/-1) · 10 months ago

      The point isn’t that you can spot it.

      The point is that the automated system can’t spot it.

      Or are you telling me there is a person looking at every verification photo, and if they did they would thoroughly scan the photo for imperfections?

      • Margot Robbie@lemmy.world (+25/-5) · 10 months ago

        The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system would be something like telling you to perform a random gesture on camera on the spot, like “turn your head slowly” or “open your mouth slowly” which would be trivial for a human to perform but near impossible for AI generators.
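A gesture prompt like that is essentially a liveness challenge: the server picks an unpredictable instruction and only accepts a response recorded before a short deadline. A minimal sketch in Python — the gesture list, TTL, and function names are illustrative assumptions, and actually judging the recorded gesture would still need a human or a vision model:

```python
import random
import time

GESTURES = [
    "turn your head slowly to the left",
    "open your mouth slowly",
    "raise two fingers beside your cheek",
    "close one eye and tilt your head back",
]

def issue_challenge(ttl_seconds=60):
    """Pick an unpredictable gesture and a deadline for the response."""
    return {
        "gesture": random.choice(GESTURES),
        "deadline": time.time() + ttl_seconds,
    }

def verify_response(challenge, performed_gesture, submitted_at):
    """Accept only the requested gesture, submitted before the deadline."""
    return (
        performed_gesture == challenge["gesture"]
        and submitted_at <= challenge["deadline"]
    )
```

The unpredictability is what matters: a generator can't pre-render the right clip if it doesn't know which gesture will be asked for until the clock is already running.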

        • curiousPJ@lemmy.world (+23) · 10 months ago

          but near impossible for AI generators.

          …I feel like this isn’t the first time I heard that statement before.

          • Margot Robbie@lemmy.world (+2) · 10 months ago

            It’s not that difficult to identify if you have a good understanding of photography principles. The lighting in this image is the biggest tell for me personally: I can’t visualize any lighting setup that could cast shadows in the directions shown in this picture. It instinctively looked wrong to me on first sight because of the impossible light sources.

            That’s the reason the picture looks WRONG, even if you can’t identify the reason why it looks wrong.

            I only focused on the nonsense background clutter because I think it’s easier for people who don’t work around cameras all day.

            • ObsidianBlk@lemmy.world (+10) · 10 months ago

              This is what makes this technology anxiety inducing at best…

              So, for yourself, you have no issues seeing the artificiality of the image due to your extensive exposure to and knowledge of photographic principles. This is fair… that said, I have read your earlier comment about the various issues with the photo as well as this one about light sources, and I keep going back to scrutinize those elements, and… for the life of me… I cannot pick out anything in the image that, to me, absolutely screams artificial.

              I’m fairly sure most people who look at these verification photos would be in a similar boat to me. Unless there’s something glaringly obvious (malformed hands, eyes in the wrong place, a sudden cthulhu-esque eldritch thing unnaturally prowling the background holding a stuffed teddy bear), I feel most people would accept an image like this at face value. Alternatively, you’ll get those same people so paranoid about AI-generated fakes that they’ll falsely flag a real image as fake because of one or two elements they can’t see clearly or have never seen before.

              And this is only the infancy of AI-generated art. Every year it gets better. In a decade, unless there are some heavy limitations on how the AI is trained (of which only public models would ever really have these limitations, as private models would be trained on whatever their developers saw fit… to shreds with what artists and copyright said), there would probably be no real way to tell a real image from a fake at all… photographic principles and all.

              Interesting times :D

        • iegod@lemm.ee (+12) · edited · 10 months ago

          near impossible for AI generators

          That’s not really the case, but more to the point, the gap is closing at a blistering pace. Approximately two years ago this stuff was in the distant future. One year ago the lid was blown open. Today we’re seeing real-time frame generation. This rallying against the tech is misguided. It needs to be embraced and understood. Trying to do otherwise is great folly, as everything will fall even further behind and lead to even larger misunderstandings. This isn’t theoretical. It’s already here. We can’t bury our heads in the sand.

        • EmergMemeHologram@startrek.website (+6) · 10 months ago

          If you look at gaussian splatting and diffusion morphs/videos, this is merely in the space of “not broadly on Hugging Face yet”, not impossible, or even difficult depending on the gesture.

          We’re months away from fully posable and animatable 3d models of these AI images. It already exists in demos and on arxiv, it runs on consumer hardware but not in realtime, so a video upload would work but a live stream would require renting a cloud GPU ($$$).

        • smooth_tea@lemmy.world (+5) · 10 months ago

          Having an AI act out random gestures is really not that different from generating an image based on a prompt if you think about it. The temporal element has already been done, the biggest factor right now is probably that it’s too computationally heavy to do in real time, but I can’t see that being a problem for more than a year.

      • nova_ad_vitum@lemmy.ca (+6) · 10 months ago

        More than that: these systems will eventually figure out how to not botch the background so obviously. Then what? As others have said, we could switch to verification videos. That will buy an extra year or two.

      • oce 🐆@jlai.lu (+5) · edited · 10 months ago

        I think so. I don’t think there would be more than a few dozen verifications to do every day; with a dozen mods, it seems doable in this context. It’s not like millions of users are asking for verification every day.

    • Smoogs@lemmy.world (+39) · 10 months ago

      Margot Robbie

      Due to having so many people trying to impersonate me on the internet

      Uh huh.

    • Aganim@lemmy.world (+25/-2) · edited · 10 months ago

      Her clavicles are asymmetrical, never seen that on a real person.

      Shit, are you telling me that every time I see myself in the mirror I’m actually looking at a string of AI generated images, generated in real-time? The matrix is real. 😱

      It’s either that, or my clavicles are actually very asymmetric. ☹️

      • Margot Robbie@lemmy.world (+9/-1) · 10 months ago

        What I meant is that her right clavicle (your left) is about an inch higher than her left.

        I could be wrong, of course, but I imagine if that condition actually exists, then it would be extremely painful.

        • Gutless2615@ttrpg.network (+19/-1) · 10 months ago

          You’re reaching. I don’t think this is as “easy” to tell as you’re making it out to be. You’re benefiting enormously from knowing the result before you begin extrapolating.

        • UrPartnerInCrime@sh.itjust.works (+9/-1) · 10 months ago

          I was agreeing with everything you’ve said, but I was in a pretty nasty bike accident years back which dislocated my clavicle. It now sits about half an inch higher, mainly on the neck side. I was freaked out at first, but the doctor said to just live with it, so it can happen.

        • Aganim@lemmy.world (+2) · edited · 10 months ago

          Yeah, I see what you mean, but my shoulders look almost exactly like that. Doesn’t hurt at all, just very annoying when carrying a backpack as the straps will always tend to slide off from my ‘drooping’ shoulder.

          But I agree with your comments about the background; that looks like a fever dream. And of course my situation isn’t the norm, so the shoulders/clavicles can be treated as a red flag. It’s just not definite proof, and care should be taken to realise some people might actually just be built weird.

    • Phoenixz@lemmy.ca (+19/-1) · 10 months ago

      Due to so many people trying to impersonate me on the Internet

      Yeah see, now I am not really sure if you’re the real Margot Robbie.

      Could you send me a verification picture?

      • Margot Robbie@lemmy.world (+14) · 10 months ago

        But then how will I astroturf (I mean, organically market) my current and future movies, like Golden Globe winning summer blockbuster, Barbie, now available on Blu-Ray and select streaming services, here if I get verified?

      • MystikIncarnate@lemmy.ca (+3) · 10 months ago

        Me neither. There’s clearly more pictures that aren’t included here, so maybe on one of those?

        The odd thing about the hair in that picture to me is that on the left side of the photo, there’s one piece that seems to go on a nearly 90 degree bend for seemingly no reason, mid air. I don’t generally see hair get… Kinked like that. I suppose it’s not outside the realm of possibility, but it’s odd at least.

        The rest of the hair seems fine to me, but I’m no expert.

        I will note however that the object(s) in the background on the left side of the photo look like a gigantic (novelty sized) point and shoot camera from the 90’s. The box on top is the viewfinder and there’s the impression of a circle below that which would be the lens.

        Just makes me giggle at the thought of such a large disposable camera.

        • TheHarpyEagle@lemmy.world (+4) · 10 months ago

          Curly hair can look like that when it’s curling tightly towards/away from you. It looks fairly natural to me.

    • problematicPanther@lemmy.world (+18/-1) · 10 months ago

      Every time I’d seen this photo, I only focused on the subject in the foreground. If I were the one verifying that the person in the photo is real, I’d have fallen for it; to me, the subject is entirely convincing. The issues you mentioned about the clavicles and hair, I think, kind of make it a bit more convincing. Nobody is completely symmetrical, for one, so seeing something like that, while not common, wouldn’t necessarily be uncommon. The hair, to me, just looks like normal person hair. Sometimes hair do be like that.

      • Riven@lemmy.dbzer0.com (+5) · 10 months ago

        Dude same, before I even read anything I was thinking ‘that’s a cute girl I didn’t know they started doing verifications on lemmy’ then I read and saw the whole hullabaloo.

    • lad@programming.dev (+11) · 10 months ago

      Didn’t get the 5th point; there’s only one clavicle visible. Am I missing something?

      • Bohurt@lemm.ee (+11) · 10 months ago

        Even so clavicles can be asymmetrical due to previous injury. We are pretty asymmetrical overall if you look closely enough.

        • TheFinn@discuss.tchncs.de (+2) · 10 months ago

          Well she’s Margot Robbie, so her clavicles are symmetrical af. She probably just assumes the rest of us are like that too 😓

    • Riven@lemmy.dbzer0.com (+6) · 10 months ago

      An easy ‘solution’ to fix the background is to just use a mild blurring tool. They’re verifying you not your house, it wouldn’t be sus to just have a mild messy blur around you.

      • Margot Robbie@lemmy.world (+6) · 10 months ago

        The bokeh effect is surprisingly hard to fake, actually, because it has to do with the physical properties of the camera lens. I think with a light Gaussian blur it would be even less convincing.
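For what it's worth, a plain Gaussian blur is just a weighted average with fixed weights, which is why it lacks the depth-dependent character of real lens bokeh. A toy 1-D version over grayscale pixel values, purely to illustrate the mechanism (not any particular editor's implementation):

```python
import math

def gaussian_kernel(radius, sigma):
    """1-D Gaussian weights, normalized so they sum to 1."""
    weights = [math.exp(-(i * i) / (2 * sigma * sigma))
               for i in range(-radius, radius + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def blur_row(pixels, radius=2, sigma=1.0):
    """Convolve one row of grayscale pixels with the kernel (edges clamped)."""
    kernel = gaussian_kernel(radius, sigma)
    out = []
    for i in range(len(pixels)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), len(pixels) - 1)
            acc += pixels[j] * w
        out.append(acc)
    return out
```

Every pixel gets the same kernel regardless of scene depth, whereas real bokeh varies with distance from the focal plane — which is exactly why a flat blur reads as fake to a trained eye.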

    • ManOMorphos@lemmy.world (+4/-1) · 10 months ago

      The “holes” on her cheeks are easy to miss but seriously unsettling close up. They’re not like freckles or blackheads but more like what termite tunnels look like in wood.

      • cheesepotatoes@lemmy.world (+7) · edited · 10 months ago

        Nah. They just look like big pores. There are a few giveaways here that it’s AI generated, but the pores aren’t one of them.

        Source: have big pores. Also, google images.

    • /home/pineapplelover@lemm.ee (+5) · 10 months ago

      I think AI can already do videos with people in them, though not without it looking less than completely natural, so there will be some discrepancies.

      • dustyData@lemmy.world (+9/-1) · 10 months ago

        AI video still looks like a fever dream. The AI can’t keep details consistent, especially in the background, from frame to frame. There are always parts that morph and look like they were conjured up by Van Gogh during a manic delirium. Maybe in a couple of years, and with some human grooming in the middle.

      • Coasting0942@reddthat.com (+1) · 10 months ago

        Its purpose is also to reduce the chances.

        If somebody is going to go to all the trouble of fooling a human, they probably aren’t going to just start spamming random pictures on the community for an instant moderator ban.

  • HiddenLayer5@lemmy.ml (+92) · edited · 11 months ago

    At some point the only way to verify someone will be to do what the Klingons did to rule out changelings: Cut them and see if they bleed.

  • yamanii@lemmy.world (+88/-1) · 11 months ago

    Can confirm. I made some random Korean dude on DALL-E to send to Instagram after it threatened to close my fake account, and it passed.

        • EldritchFeminity@lemmy.blahaj.zone (+7) · 10 months ago

          In the dark future, an underground market has formed to preserve the anonymity and privacy of the average person using holographic disguises of anthropomorphic figures that were in the distant past sometimes known as “furries.”

          • AceCephalon@pawb.social (+7) · 10 months ago

            Ah yes, even in the dark future, furries are making super advanced and useful technologies to be more furry.

      • ThePinkUnicorn@lemdro.id (+19) · 11 months ago

        There are projects that already exist with this sort of purpose. One I came across a while ago was DeepPrivacy, which uses deepfakes to replace your face and body in an image with AI-generated ones.

    • pythonoob@programming.dev (+12) · 11 months ago

      I’ve had an AI-generated mix between my face and an actor’s as my Facebook profile pic for a little over a year now, I think, and only my sister has called me out on it.

    • Ook the Librarian@lemmy.world (+7) · 11 months ago

      I’m in the same boat. I basically want to wear an ai mask. I don’t like cartoon face trackers or similar. I don’t have the hardware to render a video though, and I’m not going to buy server time.

    • Turun@feddit.de (+2) · 11 months ago

      Google automatic1111, it’s the program to run if you want to generate AI images. You can put in the original photo, use the built in editor and request the face of a pretty man/woman/elephant (for all I care) and it’ll generate a face and merge it with the surrounding image perfectly.

      Requires a graphics card with a few gigabytes of vram though, so there is a certain hardware requirement if you want to do this locally.

    • AA5B@lemmy.world (+1) · 11 months ago

      I really like “Bitmoji” on my iPhone as an interesting start in that direction. I can create my avatar, whether as similar to me or not, and use it as a filter on FaceTime where it follows a lot of my actual movement and expressions

        • BehindTheBarrier@programming.dev (+11) · 10 months ago

          I remember it being used at the “roast me” sub so the person verified they were actually the person in the image asking to be roasted.

        • Riskable@programming.dev (+8) · 10 months ago

          Example: When famous people do an AMA. E.g. Obama did an AMA on Reddit and he was verified with a photo that would be very easy to fake today using AI.

        • EnderMB@lemmy.world (+3) · 10 months ago

          There are plenty of people making Instagram and OF accounts of fully AI girls. They are hilariously fake, but looking at some of the thirsty comments their posts get, I’m inclined to say that subreddits like /r/AmIUgly and /r/rateme are likely to end up with lots of verification posts that result in lots of scams.

          Although, as already pointed out, verification posts have always been easy for people to scam with Photoshop.

  • wick@lemm.ee (+39) · 10 months ago

    I can finally realise my dream of commenting on r/blackpeopletwitter

  • psmgx@lemmy.world (+38/-1) · 11 months ago
    11 months ago

    Very rapidly the basis of truth in any discussion is going to get eroded.

    • Nora@lemmy.ml (+10) · edited · 11 months ago

      Micro-communities based on pre-(post-truth) connections, only allowing people into the community who can be confirmed by others?

      I’ve been thinking of starting a matrix community to get away from discord and it’s inevitable Botting.

  • Striker@lemmy.world (+37/-4) · 11 months ago
    11 months ago

    Isn’t there a trick where you can ask someone to do a specific hand gesture to get photos verified? That’ll still work, especially because AI makes fingers look wonky.

    • fidodo@lemmy.world (+70/-5) · 11 months ago
      11 months ago

      AI has been able to do fingers for months now. It’s moving very rapidly so it’s hard to keep up. It doesn’t do them perfectly 100% of the time, but that doesn’t matter since you can just regenerate it until it gets it right.

      • YoorWeb@lemmy.world (+15) · edited · 10 months ago

        “For your verification please close left eye and run two fingers through your hair while eating a cauliflower with whipped cream. Attach a paperclip to your left ear and write your username on your forehead using an orange marker.”

      • Paradachshund@lemmy.today (+13) · 11 months ago

        You could probably just set up a time for the person to send a photo, and then give them a keyword to write on the paper, and they must send it within a very short time. Combine that with a weird gesture and it’s going to be hard to get a convincing AI replica. Add another layer of difficulty and require photos from multiple angles doing the same things.
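The timed-keyword idea above sketches out naturally as a challenge-response with a random token and a short expiry window. A hedged sketch in Python (function names and window length are assumptions; reading the keyword off the submitted photo would still be the mod's job):

```python
import secrets
import time

def issue_keyword_challenge(window_seconds=120):
    """Hand the user a random keyword to write on the paper, valid briefly."""
    return {
        "keyword": secrets.token_hex(4),  # 8 hex chars, unguessable in advance
        "expires": time.time() + window_seconds,
    }

def check_submission(challenge, keyword_in_photo, received_at):
    """The photo must show the exact keyword and arrive inside the window."""
    return (
        keyword_in_photo == challenge["keyword"]
        and received_at <= challenge["expires"]
    )
```

The short window is what does the work: the faker has to produce a convincing image containing a string they only just learned, in minutes rather than at leisure.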

        • Vampiric_Luma@lemmy.ca (+10/-1) · 10 months ago
          10 months ago

          LoRAs can be supplied to the AI. These are data sets of specific ideas, like certain hand gestures or lighting levels; whatever style you need, you can fine-tune the general model with LoRAs.

          I have the minimum requirements to produce art and HQ output takes 2 minutes. Low-quality only takes seconds. I can fine-tune my art on a LQ level, then use the AI to upscale it back to HQ. This is me being desperate, too, using only local software and my own hardware.

          Do this through a service or a gpu farm and you can spit it out much quicker. The services I’ve used are easy to figure out and do great work for free* in a lot of cases.

          I think these suggestions will certainly be barriers, and I can think of some more stop-gaps, but they won’t stop everyone from slipping through the cracks, especially as passionate individuals who hyper-focus on technology we only think about in passing continue working on it.

        • fidodo@lemmy.world (+6) · 10 months ago

          Simpler thing is to just have the user take a video. I’ve already seen that in practice.

        • ExperimentalGuy@programming.dev (+2) · edited · 10 months ago

          I feel like there’s a way to get around that… like, if you really wanted, some sort of system to Photoshop the keyword onto the piece of paper. This would let you generate the image without having to worry about the AI generating the text.

          Edit: also, does anyone remember that one paper about a new AI architecture where you could put in some sort of negative image to additionally prompt an AI for a specific shape, output, or position?

          • Unkn8wn69@monero.town (+2) · 10 months ago

            Just write the keyword on paper and overlay it via Photoshop. Photopea literally has a one-click function for that; very easy to do. Just blank paper and a picture with enough light.

      • psud@lemmy.world (+12) · 10 months ago

        And it’ll get better if loads of verification posts are doing hand signs

    • Pasta Dental@sh.itjust.works (+8/-30) · 11 months ago

      Some AI models have already nailed fingers, so this won’t do anything. We need something we can verify without having to trust the other person. I hate to say it, but the blockchain might be one of the best ways to authenticate users to avoid bots.

  • qaz@lemmy.world (+32/-6) · 11 months ago

    That’s why you need a video with movement. AI still can’t do video right.

  • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net (+26/-1) · edited · 10 months ago

    My discord friends had some easy ways to defeat this.

    You could require multiple photos; it’s pretty hard to get AI to consistently generate photos that are 100% perfect. There are bound to be things wrong when trying to get AI to generate multiple photos of the same (non-celeb) person, which would make it obvious it’s fake.

    Another idea was to make it a short video instead of a still photo. For now, at least, AI absolutely sucks balls at making video.
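A hedged sketch of how the multiple-photos idea could be automated: run each submitted photo through any face-embedding model (the embeddings are assumed here as plain lists of floats, one per photo) and require every pair to stay above a similarity threshold. The 0.9 threshold and the function names are illustrative guesses, not tuned values from any real moderation system:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def photos_consistent(embeddings, threshold=0.9):
    """True if every pair of face embeddings is close enough to plausibly be
    the same person across all submitted photos."""
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            if cosine_similarity(embeddings[i], embeddings[j]) < threshold:
                return False
    return True
```

This only automates the "same person in every photo" half of the check; it says nothing about whether that person is real, which is why the thread keeps coming back to liveness-style challenges.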