Doesn’t look fake to me actually.
I mean, it looks fine to me too, though if you know that it’s generated and can look at it, I guess it’s possible to meet the lower bar of finding things that are off.
If someone is taking a picture with a sunset in the background, then their face should be in shadow.
https://live.staticflickr.com/7008/13532530895_a52ee219eb.jpg
Also the straps on the shirt are all sorts of fucky.
That’s true, but it could just as well be an instagram filter? So many photos nowadays are heavily processed.
Besides the lighting, the only thing I could find is the straps on her top, which seem to get a bit mixed up.

True, didn't think about that.
It does if you have seen a lot of AI images, especially SD.
It's cool how quickly the brain can pick out the subtle issues, despite how near-perfect the image is.
True, but only when you expect it. I've seen real pictures with weird lighting before, and if I hadn't known they were real, I would've assumed Photoshop. With some experience you know what to look for, but there have already been plenty of studies showing that AI-generated people can't reliably be distinguished from real people in pictures.
Definitely true
I wonder if we’re going to end up with a new field of forensic medicine determining if media is real based on subtle anatomy/biomechanics details. Even if the person is real, a particular photo or video might not be
Also, at what point does an image count as altered? Every camera has an ISP that improves image quality, so that presumably shouldn't count as an alteration, even though it can impact the image quite significantly. Then there are manual tools like Photoshop, which can do what the ISP does but also a lot more, and then there are "AI" tools that blur the lines even further. At least CNNs are just filter banks with learned kernels (to be very reductive). The cut is a bit clearer with diffusion image generation and similar tech, as that stuff is just clearly fake, but what about img-to-img diffusion?
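To unpack the "filter banks with learned kernels" point: a conv layer slides small kernels over the image exactly the way classic image filters do; the only difference is that the kernel values are learned instead of hand-designed. A minimal sketch (pure Python, no framework; the Sobel-style kernel is just a stand-in for what a trained network might learn):

```python
def conv2d(image, kernel):
    """Valid 2D cross-correlation, the same operation a conv layer applies."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Weighted sum of the kernel-sized window at (i, j).
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# Hand-designed vertical-edge detector; a CNN arrives at kernels like
# this on its own during training.
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

# 5x5 image: dark left half, bright right half.
img = [[0, 0, 10, 10, 10] for _ in range(5)]

# Strong response where the window straddles the dark/bright boundary,
# zero where the window sits on a flat region.
print(conv2d(img, sobel_x))
```

A real conv layer just runs many such kernels in parallel (the "bank") and stacks their responses, which is why the ISP-vs-CNN line is blurrier than it first looks.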
It was there already.