• ComradeSalad@lemmygrad.ml
    4 months ago

    I did want to say that when I said “secure”, I didn’t mean it was impregnable or somehow superior to modern protections, but secure in the sense that, on a closed system with minimal points of entry and no internet access, a system like Windows 95 at a nuclear plant has its obsolescence working to its advantage. It’s similar to how Russia, the US, and China all still operate their nuclear triads on analog technology and haven’t updated it since the late 50s.

    I did want to say that when I meant “secure”, I didn’t mean it was impregnable or somehow superior to modern protections, but secure in the sense that on a closed system with minimal points of entry and no internet access; a system like Windows 95 at a nuclear plant has its obsolescence work to its advantage. Similar to how Russia, the US, and China all still operate their nuclear triad on analog technology and haven’t updated their technology since the late 50s.

    There is also the problem of programs being specially designed and tailor-made for an OS, with virtually no way to update them unless the entire program is rebuilt from the ground up, something that can take tens of millions of dollars and months of time, not to mention the difficulty of the switch-over process.

    So it essentially becomes an “if it’s not broken, why fix it” situation.

    The 737 MAX is an abysmal failure in that regard, though. It should never have been allowed to fly in the condition it was launched in.

    • USSR Enjoyer@lemmygrad.ml
      4 months ago

      but secure in the sense that on a closed system with minimal points of entry and no internet access; a system like Windows 95 at a nuclear plant has its obsolescence work to its advantage.

      Any system can be airgapped. Windows 9x is among the worst systems on earth in terms of security; security effectively does not exist there because it wasn’t a design consideration. Keep in mind that the 9x line runs on top of MS-DOS, which has no concept of access control whatsoever. Even systems running NT4+ have mountains of extremely well-known vulnerabilities, which makes them trivial to exploit for any user with any form of access. The solution is to move up to something with a hardened security model that gets updates to fix CVEs. Operating systems gain no advantage whatsoever by virtue of age; in fact, old ones are very known quantities, and it’s an atrocity that they still run outside a VM at all, let alone in military and infrastructure settings.
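      To make the access-control point concrete, here’s a minimal sketch (assuming a POSIX system; the file and its contents are throwaway examples): a modern kernel attaches an owner and permission bits to every file and enforces them per-process, while FAT-era MS-DOS had nowhere to even store that information.

      ```python
      import os
      import stat
      import tempfile

      # Modern kernels enforce per-user access control on every file.
      # MS-DOS/FAT (and thus Windows 9x) had no owner or permission bits,
      # so any program could read or overwrite any file on disk.
      with tempfile.NamedTemporaryFile(delete=False) as f:
          f.write(b"operator credentials")
          path = f.name

      os.chmod(path, 0o600)  # owner-only read/write; other users are denied
      mode = stat.S_IMODE(os.stat(path).st_mode)
      print(oct(mode))  # 0o600
      os.remove(path)
      ```

      And that’s just the filesystem layer; memory protection and privilege separation are the same story.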

      There is such a thing as security through diversity, but that mostly applies when a widespread attack cannot affect all exposed systems due to differences and incompatibilities. When you know what you’re targeting, you tailor your attack to that particular system. Outdated operating systems are the easiest to tailor attacks for, because the existing methods are virtually guaranteed to work and there’s no need to develop anything novel.

      still operate their nuclear triad on analog technology and haven’t updated their technology since the late 50s.

      Purely electrical/analogue/solid-state systems don’t need updating because they operate on very different principles: a circuit opens or closes, impedance increases or decreases, a frequency goes up or down. Where there are no complex attack surfaces like network stacks and filesystems, the only threat model is physical access to controls and wiring, which can be tightly controlled with heavy door technology. It also speaks to the value of security through reduced complexity.

      There also is the problem of programs being specially designed and tailor made to an OS with virtually no way to update them unless the entire program is designed again from the ground up

      I have a good amount of experience with that kind of issue, so I can tell you that many industrial systems are vulnerable because someone bought a very expensive piece of scientific or manufacturing hardware with a proprietary interface that can only be driven by a proprietary software bridge. A hydraulic press will outlast the software (and frequently the company that wrote it), leaving you stuck running Windows 3.1 until you get the budget to replace it or someone reverse-engineers the protocol and writes new control software. One of many reasons you should never trust or run non-open-source software.
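      For what it’s worth, once a protocol like that has been reverse-engineered, the replacement control software is often just unglamorous byte-framing. A hypothetical sketch (the frame layout, command code, and checksum here are invented for illustration, not any real device’s protocol):

      ```python
      # Hypothetical frame format for a reverse-engineered serial protocol:
      # STX (0x02), command byte, 16-bit big-endian value,
      # additive checksum over the payload, ETX (0x03).
      def build_frame(command: int, value: int) -> bytes:
          payload = bytes([command, (value >> 8) & 0xFF, value & 0xFF])
          checksum = sum(payload) & 0xFF  # checksum covers only the payload
          return bytes([0x02]) + payload + bytes([checksum, 0x03])

      # e.g. an invented "set pressure" command (0x10) with a setpoint of 1500
      print(build_frame(0x10, 1500).hex())  # 021005dcf103
      ```

      From there it’s a plain serial write from any modern OS, and the Win3.1 box can finally be retired.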

      So it essentially becomes an “If it’s not broken, why fix it situation”.

      That’s not a bad philosophy when it’s actually true, which it never is when the underlying issue is cost. But yeah, that’s the general hubris: it saves money until it doesn’t.