A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
Normalising CSAM does harm. Crap argument.
Is it Child Sexual Abuse Material if there are no children involved?
“Anime should also be banned”; “All anime characters should have to show a passport with their date of birth”.
There are billions of children. HTH
I think you responded to the wrong comment, because while it is true that there are 2.4B people under the age of 18 alive today, it doesn’t appear to have any relevance to what you were replying to.
Only if you assume that the only children harmed by CSAM are those used to produce CSAM.
Consumers of CSAM are (actual or potential) perpetrators of abuse. Normalising it is not an option.