• catloaf@lemm.ee · 4 months ago

    OnlyFans told Reuters that police do not have to subscribe to investigate a creator’s posts, but the platform only grants free access to accounts when there’s an active investigation. That means once police suspect that CSAM is being exchanged on an account, they get “full access” to review “account details, content, and direct messages,” Reuters reported.

    God forbid police have to do their fucking job.

  • Erasmus@lemmy.world · 4 months ago

    Headline should read:

    ‘Cops upset they have to pay to jerk it like everyone else.’

  • sunzu@kbin.run · 4 months ago

    These “cops” could check their local Catholic church or another place of worship, if they were looking to “detect” child abusers.

  • TheGrandNagus@lemmy.world · 4 months ago

    “Jeff, why the living fuck have you tried to claim 5 femboy OnlyFans premium-tier subscriptions on your work expenses account?!”

    sweating “uhhh… to… ummm… make sure it wasn’t CSAM…?”

    “Oh! Nice work, Jeff. Very proactive. For a second I thought something dodgy might have been going on.”

  • M500@lemmy.ml · 4 months ago

    So what will happen? Cops will get a see-all account, and then there will be record numbers of applicants to be a cop 😂

    • ruse8145@lemmy.sdf.org · 4 months ago

      Cops love the back door; it’s easier to flashbang the whole family they find at the wrong address.

  • rob200@lemmy.cafe · 4 months ago

    If OnlyFans is not encrypted, wouldn’t these cops have other ways to detect and tackle CSAM?

    Assuming it’s not encrypted, what’s stopping them from doing their job? The only excuse I can think of is that they’d be accessing copyrighted material, especially since it’s behind a paywall.

  • Gladaed@feddit.org · 4 months ago

    To be fair: they probably should have access to all imagery while on the clock and on surveilled computers.

    OnlyFans images are semi-public, so I don’t see a privacy issue. If someone uses the access to masturbate, or pirates content, they should be prosecuted, of course.

    Cops being pigs does not mean they should not have basic tools. This also does not mean that every police station needs such a workstation.

    • ruse8145@lemmy.sdf.org · 4 months ago

      It’s also probably pretty easy for OnlyFans to build software that detects and presents a tiny subset of candidate streams for review, as sketched below. This doesn’t seem like a hard problem; Google already does something similar for the entire crawler-accessible Internet.
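      As a rough illustration of that idea, here’s a minimal Python sketch of hash-based pre-filtering, the same basic approach behind industry tools like PhotoDNA and PDQ. Everything here is hypothetical: the `KNOWN_BAD_HASHES` set, the `uploads/` directory, and the use of exact SHA-256 are illustrative stand-ins (real systems use perceptual hashes so re-encoded copies still match), not anything OnlyFans actually runs.

      ```python
      # Hypothetical sketch: flag only uploads whose fingerprints match a
      # known-abuse-material hash list, so reviewers (or police with a
      # warrant) see a tiny candidate subset instead of the whole platform.
      import hashlib
      from pathlib import Path

      # Stand-in for a hash list such as those NCMEC distributes to platforms.
      # Real deployments use perceptual hashes (PhotoDNA, PDQ) rather than
      # SHA-256, so slightly edited or re-encoded copies also match.
      KNOWN_BAD_HASHES: set[str] = set()

      def fingerprint(path: Path) -> str:
          """Exact-match fingerprint of one uploaded file."""
          h = hashlib.sha256()
          with path.open("rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  h.update(chunk)
          return h.hexdigest()

      def candidates_for_review(uploads: list[Path]) -> list[Path]:
          """Return only the uploads that match the known-bad hash list."""
          return [p for p in uploads if fingerprint(p) in KNOWN_BAD_HASHES]

      if __name__ == "__main__":
          files = [p for p in Path("uploads").glob("*") if p.is_file()]
          for path in candidates_for_review(files):
              print(f"escalate for human review: {path}")
      ```

      The point is that the matching is cheap and automatic; only the handful of hits ever needs a human, which is exactly the “tiny subset of candidates” workflow described above.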

  • Corvidae@lemmy.world · 4 months ago

    I’ve never had an OnlyFans account, but I do have social media accounts. I have had the distinct impression in recent years of being stalked, followed online.

    However, that intensified monitoring seems to have only just begun. NCMEC just got access to OnlyFans in late 2023, the child safety group told Reuters. And NCMEC seemingly can’t scan the entire platform at once, telling Reuters that its access was “limited” exclusively “to OnlyFans accounts reported to its CyberTipline or connected to a missing child case.”

  • Alpha71@lemmy.world · 4 months ago

    Bit of a side note, but why did they go with CSAM, when CP is quicker and easier to say/type?

    • catloaf@lemm.ee · 4 months ago

      “Pornography” implies legitimacy and consent, from what I’ve read.

      • ZILtoid1991@lemmy.world · 4 months ago

        Also, “pornography” is a harder-to-define term that can be “circumvented” by calling the material “child erotica” or “child artistic nudity”; CSAM producers used those labels to get around anti-CSAM laws while they could. Today some distributors try to claim they’re “just recordings of a crime”.

    • CaptainBasculin@lemmy.ml · 4 months ago

      Unironically, I think that wording is being pushed by the porn giants so that it isn’t associated with porn.

      Though since the media and news outlets don’t like to use the word “porn” either, that made the new term easier to catch on.

    • dev_null@lemmy.ml · 4 months ago

      CP is a name from the perspective of the consumer; CSAM is a name from the perspective of the victim. Since we want to take the side of the victim, we use the term relevant to the victim.