cross-posted from: https://lemmy.world/post/7123708

In this article, you will discover the ISO images that Debian offers and learn where and how to download them. I’ll also provide some useful tips on how to use Jigdo to archive the complete Debian repository into ISO images.

    • TCB13@lemmy.worldOP · 8 months ago

      Because you’re missing the point: if you want to build offline archives you can’t use something that is online by definition.

      It really baffles me how careless people are about depending on the internet 24/7 these days, and how it goes right over your head that maybe, if you have to deploy dozens of machines a day (automated or manually), you might not want to be re-downloading the same stuff every single time, and that a few packages aren’t even available on the ISOs provided at debian.org. Additionally, some people are required to work on air-gapped networks and systems that can’t have internet access to download packages…

      Since you didn’t care to actually read most of the article, I’ll just say this:

      Larger images, such as Blu-ray, aren’t available for HTTP download; however, the Debian project provides Jigdo template files that can be used to build those images from the repositories. Jigdo makes it possible and easy to create offline images containing all the packages the repository has to offer.
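      In practice, building one of those Blu-ray images looks roughly like this - a sketch only, with the release number in the path as a placeholder (check cdimage.debian.org for the current .jigdo files):

        # jigdo-lite ships in the "jigdo-file" package
        sudo apt install jigdo-file

        # point it at the .jigdo file for the image you want; it will ask for
        # "Files to scan" (press Enter to skip) and for a Debian mirror to pull
        # the packages from, then assemble the ISO in the current directory
        jigdo-lite https://cdimage.debian.org/debian-cd/current/amd64/jigdo-bd/debian-VERSION-amd64-BD-1.jigdo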

      This is just perfect, a community about self-hosting that can’t even self-host the basic software they require lol

      • computergeek125@lemmy.world · 8 months ago

        I’ve got nothing against downloading things only once - I have a few dozen VMs at home. But once you reach a certain point, maintaining offline ISOs for updating can become a chore, and larger ISOs naturally take longer to write to flash install media. And once you get a big enough network, homogenizing on a single distro can become problematic: some software just works better on certain distros.

        I’ll admit I did miss the point of this post at first, wondering why there was a post about downloading Debian when their website is pretty straightforward - the title caught me off guard and doesn’t quite match what’s actually inside, which is much more involved than a simple download.

        Therein lies the wrinkle: there’s a wide spectrum of selfhosters in this community, everyone from people getting their first VM server online with a bit of scripted container magic all the way to senior+ IT and software engineers who can write GUI front ends to make Linux a router (source: skimming the community’s first page). For a lot of folks, re-downloading every time is an OK middle ground because it just works, and they’re counting on the internet existing in general to remotely access their gear once it’s deployed.

        Not everyone is always going to pick the “best” or “most efficient” route, because in my experience as a professional IT engineer, people tend towards the easy solution simply because it’s straightforward. And from a security perspective, I’m just happy if people choose to update their servers regularly. I’d rather see them inefficient but secure than efficient and out of date every cycle.

        At home, I use a personal package mirror for that. It also has the benefit of running periodic replications on a schedule* to be available as a target that automatic updates work from. It’s a bit harder to set up than a single offline ISO, but once it’s up it’s fairly low maintenance. Off-hand, I think I keep around a few versions each of Ubuntu, Debian, Rocky, Alma, EPEL, Cygwin, Xen, and Proxmox - a representative set of most of my network, covering any OS where I either have three or more nodes or where the OS lives on a network with internet access blocked (such as my management network). vCenter serves as its own mirror for my ESXi hosts, and I use Gitea as a Docker registry and for CI/CD.
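        For anyone wondering what that can look like, here’s a minimal sketch using apt-mirror - just one of several tools that can do this, and the suites/paths below are examples rather than my exact setup:

          # /etc/apt/mirror.list - mirror one Debian suite into a local path
          set base_path /var/spool/apt-mirror
          deb http://deb.debian.org/debian bookworm main contrib non-free-firmware
          clean http://deb.debian.org/debian

          # /etc/cron.d/apt-mirror - replicate nightly on a schedule
          0 4 * * * apt-mirror /usr/bin/apt-mirror > /var/spool/apt-mirror/var/cron.log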

        I also have a library of ISOs on an SMB share sorted by distro and architecture. These are generally the net install versions or the DVD versions that get the OS installed enough to use a package repo.

        I’ve worked on full air-gap systems before, and those can just be a chore in general. Updating via ISO can sometimes be the best way, because everything else is blocked at the firewall.

        *Before anyone corrects me, yes I am aware you can set up something similar to generate ISOs

        • TCB13@lemmy.worldOP · 8 months ago

          At home, I use a personal package mirror for that. It also has the benefit of running periodic replications on a schedule* to be available as a target that automatic updates work from

          Yes, that’s a great option if the machine hosting it can access the internet… but I have to keep ISOs for the other situation, plus older versions that might get audited at any time. Either way, Jigdo can be used to update ISO images as well.
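          Roughly, refreshing an existing image looks like this - a sketch with placeholder file names, not an exact recipe:

            # mount the old image read-only so jigdo-lite can reuse its packages
            sudo mkdir -p /mnt/old-bd
            sudo mount -o loop,ro debian-OLD-amd64-BD-1.iso /mnt/old-bd

            # run jigdo-lite against the newer .jigdo template; when it asks for
            # "Files to scan", answer /mnt/old-bd so only new or changed packages
            # get downloaded from the mirror before the fresh ISO is assembled
            jigdo-lite debian-NEW-amd64-BD-1.jigdo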