i got sick again, so the financial update and this thread are late. i’ll get the financial update up at a later point, or i might just combine it with january’s, since there’s not much to report as far as i can tell

  • silentdanni@beehaw.org · 3 points · 1 year ago

    Ah man, dealing with depression can be really hard. I hope things are getting better for you and you’re happily trying out your new clothes.

    What are your plans for your new homelab, if I may ask?

    Hang in there, sir.

    • MangoKangaroo@beehaw.org · 2 points · 1 year ago

      Thank you for your kind words. Every day gets brighter.

      For the homelab, I’m not 100% sure yet. At minimum I’m going to get a Synology NAS to replace my ancient Lenovo EMC2. I really wanted to get some hardware for running Llama 2 and KoboldCpp, but I’m struggling to find something that’s equal parts quiet (I live in a studio), affordable(ish), and up to the minimum specs I’d need. I was unironically considering a Mac Mini with a rack converter because of the energy efficiency and powerful iGPU, but sadly they only ship with up to 32GB of RAM. Since my reading suggests I’d want at least 64GB of RAM for Llama 2’s 70B version, I’m having to find some other way of doing things. I just wish I didn’t live in a studio so I could grab a secondhand rackmount server without worrying about noise levels. 😭
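      For what it’s worth, the 64GB figure lines up with a quick back-of-envelope estimate: the weights alone need roughly parameter-count × bytes-per-weight of memory, before you even count the KV cache and runtime overhead. A rough sketch (the quantization levels shown are my own illustrative picks, not exact requirements for any particular build):

      ```python
      def weights_gb(params_billion: float, bits_per_weight: int) -> float:
          """Approximate memory for the model weights alone, in GB (decimal)."""
          return params_billion * 1e9 * bits_per_weight / 8 / 1e9

      # Llama 2 70B at a few common precision / quantization levels:
      for bits in (16, 8, 4):
          print(f"{bits}-bit: ~{weights_gb(70, bits):.0f} GB for weights")
      # 16-bit: ~140 GB, 8-bit: ~70 GB, 4-bit: ~35 GB
      ```

      So even aggressively quantized to 4-bit, a 70B model wants roughly 35 GB plus context overhead, which is why a 32GB machine comes up short.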

      • silentdanni@beehaw.org · 2 points · 1 year ago

        I have the same problem; my flat is only about 50 m². Judging by the way things are going, I think there’s a chance Nvidia will release some consumer-grade hardware aimed at LLMs in the near-ish future. Until they reveal their next lineup, I’m just sticking to the cloud for running LLMs, even though that may seem like a poor financial decision.

        I’m also hoping to get my hands on some Raspberry Pis. I’d like to build a toy k3s cluster at some point and maybe run my own Mastodon instance. :)

        • MangoKangaroo@beehaw.org · 1 point · edited · 1 year ago

          Well, at least I’m not the only one whose homelab ambitions are being crushed by their apartment layout. I think I’m going to end up with a 2U compute rack, which means I’ll probably limp along on one or two low-profile consumer GPUs. Now if only I could work out the details of the actual rack server hardware…

          A Raspberry Pi cluster is interesting! My only real exposure to using Pis in a homelab was an old 1B I used for Pi-hole. It was great right up until it stopped working.