Since Meta announced they would stop moderating posts, much of the mainstream discussion around social media has centered on whether a platform is responsible for the content posted on its service. That's a fair discussion, I think, though I favor less moderation in almost every instance.

But as I think about it, the problem is not moderation at all: we had very little moderation in the early days of the internet and social media, and yet people didn't believe the nonsense they saw online, unlike nowadays, when even official news outlets have reported on outright bullshit made up on social media. To me the problem is the goddamn algorithm that pushes people into bubbles that reinforce their views, correct or incorrect; anyone with two brain cells and an iota of understanding of how engagement algorithms work can see this. So why is the discussion about moderation and not about banning algorithms?

  • will_a113@lemmy.ml · 1 day ago
    Algorithm is just a fancy word for rules to sort by. “New” is an algorithm that says “sort by the timestamp of the submissions”. That one is pretty innocuous, I think. Likewise “Active”, which just says “sort by the last time someone commented” (or whatever). “Hot” and “Scaled”, though, involve business logic – rules that don’t have one technically correct solution, but involve decisions and preferences made by people to accomplish a certain aim. Again, in Lemmy’s case I don’t think either the “Hot” or “Scaled” algorithms should be too controversial – and if they are, you can review the source code, make comments or a PR for changes, or stand up your own Lemmy instance that does it the way you want to.

    For walled-garden SM sites like TikTok, Facebook and Twitter/X, though, we don’t know what logic the algorithm follows. We can speculate that it’s optimized to keep people using the service for longer, or to encourage them to come back more frequently, but for all intents and purposes those algorithms are black boxes, and we have to assume they’re working only for the benefit of the companies, not the users.
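
    The distinction the comment draws can be made concrete with a small sketch. This is not Lemmy’s actual ranking code; it is a simplified illustration in Rust (Lemmy’s language), and the `gravity` parameter and `hot_rank` formula here are assumptions chosen only to show where human preference enters the sort.

    ```rust
    // "New" has one technically correct answer: sort by submission time.
    // A "Hot"-style rank mixes score and age, and the decay constant is a
    // human choice about how fast posts should fall off the front page.

    struct Post {
        title: String,
        score: i64,     // upvotes minus downvotes
        age_hours: f64, // hours since submission
    }

    // "New": purely mechanical, no judgment calls involved.
    fn sort_new(posts: &mut Vec<Post>) {
        posts.sort_by(|a, b| a.age_hours.partial_cmp(&b.age_hours).unwrap());
    }

    // Hypothetical "Hot"-style score: log of the vote score divided by an
    // age penalty. The `gravity` value is the "business logic" knob.
    fn hot_rank(post: &Post, gravity: f64) -> f64 {
        (post.score.max(1) as f64).ln() / (post.age_hours + 2.0).powf(gravity)
    }

    fn sort_hot(posts: &mut Vec<Post>, gravity: f64) {
        posts.sort_by(|a, b| {
            hot_rank(b, gravity)
                .partial_cmp(&hot_rank(a, gravity))
                .unwrap()
        });
    }

    fn main() {
        let mut posts = vec![
            Post { title: "fresh but unpopular".into(), score: 2, age_hours: 1.0 },
            Post { title: "old but loved".into(), score: 500, age_hours: 48.0 },
        ];
        sort_new(&mut posts);
        println!("New puts first: {}", posts[0].title);
        sort_hot(&mut posts, 1.8);
        println!("Hot puts first: {}", posts[0].title);
    }
    ```

    On an open platform that knob is visible in the source and can be argued over in a PR; on a closed platform the equivalent knobs exist but are invisible, which is the black-box problem the comment describes.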