https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
The report is here. There are some good points, and it’s definitely not an obvious smear job, but it’s not perfect either. And it is likely to be misused by less responsible authors who are interested in smearing Mastodon or other services not operated by major tech companies.
Those less responsible authors should be shown this study the same organization published last month, which found similar problems on Twitter:
In the course of the investigation, researchers found that despite the availability of image hashes to identify and remove known CSAM, Twitter experienced an apparent regression in its mitigation of the problem. Using PhotoDNA, a common detection system for identified instances of known CSAM, matches were identified on public profiles, bypassing safeguards that should have been in place to prevent the spread of such content. This gap was disclosed to Twitter’s Trust & Safety team, which responded to address the issue. However, the failure highlights the need for platforms to prioritize user safety and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation.
That being said, people who code for the Fediverse should see this report and pay particular attention to things like:
Current tools for addressing child sexual exploitation and abuse online—such as PhotoDNA and mechanisms for detecting abusive accounts or recidivism—were developed for centrally managed services and must be adapted for the unique architecture of the Fediverse and similar decentralized social media projects.
I honestly don’t know crap about coding, but this seems like a very solvable problem and something I’d very much like for the people who do to engage with. I would absolutely donate some money to support a project like this.
edit: I guess what I meant to say is I would absolutely donate some money to purchase API keys from Microsoft
I actually just saw that Dansup is working on adding optional opt-in support for PhotoDNA in pixelfed if an instance admin adds a PhotoDNA API key, I wonder if that was spurred on by this report. Hopefully Mastodon also looks into adding support.
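For anyone curious, here’s a rough sketch of what that kind of opt-in hook could look like. To be clear, the endpoint URL, header name, and response field below are all placeholders I made up, since the actual PhotoDNA Cloud Service contract isn’t publicly documented; the only part taken from the Pixelfed idea is the “feature is off unless the admin supplies a key” shape.

```python
# Hypothetical opt-in PhotoDNA check on media upload.
# ENDPOINT, header name, and "IsMatch" field are placeholders,
# NOT the real PhotoDNA Cloud Service API.
import os
import requests

PHOTODNA_KEY = os.environ.get("PHOTODNA_API_KEY")  # unset = feature disabled
MATCH_ENDPOINT = "https://example.invalid/photodna/match"  # placeholder URL

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload should be blocked and reported."""
    if not PHOTODNA_KEY:
        return False  # admin hasn't opted in, so skip the check entirely
    resp = requests.post(
        MATCH_ENDPOINT,
        headers={"Subscription-Key": PHOTODNA_KEY},  # placeholder header
        data=image_bytes,
        timeout=10,
    )
    resp.raise_for_status()
    return bool(resp.json().get("IsMatch", False))  # placeholder field
```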
Nice, yeah hopefully this feature or something that accomplishes the same thing spreads* throughout the Fediverse quickly
*Like, it would be really cool if there was a way to fight child porn that didn’t involve relying on a for-profit company, but chipping away at our screwed-up economic system is a lower priority than stopping child abuse
Or… license PhotoDNA of course!
Oh, surely they don’t charge for a tool to stop child abus-
Wow, say what you will about capitalism, but it really is an engine for innovation and coming up with new ways to make me lose faith in humanity
There might be other options:
https://www.hackerfactor.com/blog/index.php?archives/931-PhotoDNA-and-Limitations.html
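As a taste of how simple the non-proprietary end of the spectrum can be, here’s a difference hash (dHash), a well-known open perceptual hash. This sketch assumes the Pillow library, and the 8x8 size is just the conventional choice; it’s nowhere near as robust to edits as PhotoDNA claims to be, but it does survive rescaling and mild recompression.

```python
# Minimal difference hash (dHash) using Pillow. Unlike an exact
# cryptographic hash, it tolerates resizing and recompression.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    # Shrink to (size+1) x size grayscale, then compare each pixel to
    # its right-hand neighbor: 1 bit per comparison, 64 bits total.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left < right)
    return bits

def hamming(a: int, b: int) -> int:
    # Small Hamming distance between hashes = probably the same image.
    return bin(a ^ b).count("1")
```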
I was informed that, in the last few years, NCMEC has added additional solutions beyond PhotoDNA. This includes Google’s CSAI Match and Facebook’s open-source PDQ and TMK+PDQF image- and video-matching tools.
That’s good, but it is still just mind-blowing to me that we let a bunch of private for-profit companies take the lead on this. This is the sort of thing the FBI ought to be all over developing and maintaining and handing out to everyone if they weren’t a bunch of stupid assholes busy harassing environmentalists and police brutality protesters.
After a bit of reading, another option may simply be to include a “report” button that generates a hash of the image and federates the list. That being said, there may be a similarity algorithm under the hood of PhotoDNA that works better. Hard to say, since it’s all proprietary and pay-for-membership. Prices aren’t even listed publicly unless you use a cloud API.
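A minimal sketch of that report-button idea, assuming a plain SHA-256 hash (which only catches byte-identical re-uploads; you’d want a perceptual hash like the dHash above to catch re-encoded copies) and a made-up /api/blocked-hashes peer endpoint, since nothing like it exists in ActivityPub today:

```python
# Sketch of a report flow: hash the image, block it locally, and
# push the hash (never the image itself) to trusted peer instances.
# The /api/blocked-hashes route is hypothetical.
import hashlib
import requests

local_blocklist: set[str] = set()

def report_image(image_bytes: bytes, peers: list[str]) -> str:
    digest = hashlib.sha256(image_bytes).hexdigest()
    local_blocklist.add(digest)
    for peer in peers:
        requests.post(
            f"https://{peer}/api/blocked-hashes",  # hypothetical route
            json={"sha256": digest},
            timeout=5,
        )
    return digest

def is_blocked(image_bytes: bytes) -> bool:
    # Exact-match check against hashes federated from reports.
    return hashlib.sha256(image_bytes).hexdigest() in local_blocklist
```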
Yeah, I’m just discovering that it’s proprietary
Good to know my tax dollars went to helping Microsoft develop another product! /s
I appreciate you doing a level headed job of explaining this and dropping a link. Cheers!