During covid times I heard many interesting conspiracy predictions, such as: the value of money will fall to zero, the whole of society will collapse, the vaccine will kill 99% of the population, etc. None of those things have happened yet, but can you add some other predictions to the list?

Actually, long before covid hit, there were all sorts of predictions floating around. You know, things like the 2008 recession will cause the whole economy to collapse and we’ll go straight into a Mad Max-style post-apocalyptic nightmare, or 9/11 was supposed to start WW3. I can’t even remember all the predictions I’ve heard over the years, but I’m sure you can help me out. Oh, I just remembered that someone said paper and metal money would disappear completely by year xyz. At the time that date was only a few years away, but now it’s more like 10 years in the past. Still waiting for that one to come true…

  • Moneo@lemmy.world

    Fully self-driving cars. It turns out it’s a lot harder than we thought to build a system that doesn’t get confused by edge cases.

    By the time they are widely legal, most people will probably (hopefully) have realized how stupid car dependency is.

    • Son_of_dad@lemmy.world

      I blame Musk for getting people excited about Mars and self-driving cars, back in the days before we realized he’s nothing but a lying piece-of-shit nepo douche.

      • yiliu@informis.land

        He’s far from the only one claiming self-driving cars are “just a year or two away”. There was a lot of that for a while.

      • tagliatelle@lemmy.world

        Who’s liable when it crashes? And it’s “better” than human drivers only in very limited situations, with a human driver behind the wheel ready to take control.

        • shrugal@lemm.ee

          I’d say if the human is supposed to observe and take control, then the human is liable, unless something about the autopilot made it impossible to intervene (e.g. no time to react). If it’s a completely autonomous autopilot then of course the manufacturer is liable; who else could it be?! But autopilots would probably have to pass safety tests before being allowed on the road, and you’d have to prove negligence or malicious intent by the manufacturer (e.g. faking test results). This would be similar to medicine, where the manufacturer simply can’t guarantee 100% safety.

          Regarding “better”, afaik it’s on average. So if you let 1000 humans and 1000 autopilots drive 1000 miles each, the autopilots will produce fewer accidents overall. Idk if autopilots get better or worse by allowing human intervention; a human could also take control at the wrong moment, after all.
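          Just to make “on average” concrete, here’s a toy simulation (a minimal sketch with completely made-up accident rates, not real data):

          ```python
          import random

          # Hypothetical per-mile accident rates, chosen only for illustration.
          HUMAN_RATE = 2e-5      # 2 accidents per 100k miles (made up)
          AUTOPILOT_RATE = 1e-5  # 1 accident per 100k miles (made up)

          def total_accidents(rate, drivers=1000, miles=1000):
              """Total accidents when `drivers` drivers each drive `miles` miles."""
              return sum(
                  1
                  for _ in range(drivers)
                  for _ in range(miles)
                  if random.random() < rate
              )

          random.seed(0)
          print("humans:    ", total_accidents(HUMAN_RATE))      # ~20 expected
          print("autopilots:", total_accidents(AUTOPILOT_RATE))  # ~10 expected
          ```

          With these made-up rates the autopilot total comes out around half the human one, even though any single driver (or run) can buck the average.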

          • sanguine_artichoke@midwest.social

            Tesla has allegedly gamed that by disengaging autopilot something like half a second before a crash, so the crash doesn’t add to the statistics of accidents that occurred while autopilot was on.
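            To see how that skews the numbers, here’s a hedged sketch (hypothetical crash records, not Tesla’s actual telemetry or methodology) of how the attribution window changes the headline count:

            ```python
            # Hypothetical records: seconds between autopilot disengaging and
            # impact; None means autopilot stayed engaged through the crash.
            crashes = [None, 0.4, 0.3, None, 12.0, 0.5, None, 45.0]

            def autopilot_crashes(records, window=0.0):
                """Crashes attributed to autopilot: engaged at impact, or
                disengaged no more than `window` seconds before it."""
                return sum(1 for t in records if t is None or t <= window)

            # Only count crashes with autopilot engaged at the moment of impact:
            print(autopilot_crashes(crashes))              # 3
            # Also attribute disengagements within the final 5 seconds:
            print(autopilot_crashes(crashes, window=5.0))  # 6
            ```

            Same underlying crashes, twice the “autopilot crash” count, depending purely on where you draw the cutoff.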

        • yiliu@informis.land

          > who’s liable when it crashes?

          This is potentially the killer app of self-driving. If it gets safe enough, the company offering self-driving cars can take responsibility for insurance (so long as you use the self-driving feature).