Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

Last week’s thread

(Semi-obligatory thanks to @dgerard for starting this)

  • self@awful.systems · 2 months ago

    we really shouldn’t have let Microsoft both fork an editor and buy GitHub, of course they were gonna turn one into a really shitty version of the other

    anyway check this extremely valuable suggestion from Copilot in one of their screenshots:

    The error message ‘userId and score are required’ is unclear. It should be more specific, such as ‘Missing userId or score in the request body’.

    aren’t you salivating for a Copilot subscription? it turns a lazy error message into… no that’s still lazy as shit actually, who is this for?

    • a human reading this still needs to consult external documentation to know what userId and score are
    • a machine can’t read this (see the sketch after this list)
    • if you’re going for consistent error messages or you’re looking to match the docs (extremely likely in a project that’s in production), arbitrarily changing that error so it doesn’t match anything else in the project probably isn’t a great idea, and we know LLMs don’t do consistency
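    to make the machine-readability point concrete, here’s a rough TypeScript sketch (names like ApiError and validateScoreSubmission are made up for illustration, not from whatever project is in the screenshot) of what a consistent, structured error could look like instead of yet another reworded string:

    // Hypothetical structured error: a stable code plus the actual missing fields,
    // so both humans and machines can act on it, and every endpoint can reuse the same shape.
    interface ApiError {
      code: string;             // stable, machine-matchable identifier
      message: string;          // human-readable summary
      missingFields?: string[]; // lets a client or a test assert on specifics
    }

    function validateScoreSubmission(body: { userId?: string; score?: number }): ApiError | null {
      // collect whichever required fields are absent from the request body
      const missing = (["userId", "score"] as const).filter((k) => body[k] === undefined);
      if (missing.length === 0) return null;
      return {
        code: "MISSING_REQUIRED_FIELD",
        message: `Missing required field(s): ${missing.join(", ")}`,
        missingFields: missing,
      };
    }

    // validateScoreSubmission({ score: 10 })
    // -> { code: "MISSING_REQUIRED_FIELD", message: "Missing required field(s): userId", missingFields: ["userId"] }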
    • V0ldek@awful.systems · 2 months ago

      I want someone to fork the Linux kernel and then unleash like 10 Copilots to make PRs and review each other. No human intervention. Then plot the number of critical security vulnerabilities introduced over time, assuming they can even keep it compilable for long enough.

          • Sailor Sega Saturn@awful.systems · 2 months ago

            Sshh don’t tell the investors, I’ve managed to be paid for a decade by updating my code to work with other people updating their code to work with other people updating their code, all without actually doing anything new.

            We as a profession have developed a careful balancing act where we’re always busy doing nothing. If the balance was off just a little someone might actually have to think about new features instead of, say, migrating from CGI to PHP to JavaScript to jQuery to AngularJS to Angular to React to ???, rejecting LLM generated changes, “fixing” the same bug year after year, or reverting reverts of reverts of reverts of reverts of changes.

            And thinking is hard.

      • froztbyte@awful.systems · 2 months ago

        that’d be an interesting experiment but also that’s $2400 you could spend on more useful things, like bootstrapping your whiskey collection

        • V0ldek@awful.systems · 2 months ago

          $2400 is hardly a number compared to whatever we’re already spending on genAI so fuck it