I’ve had a pretty poor experience with it myself, so I wanna see what the Linux community thinks about this.

    • SLGC@lemmy.world · 7 months ago

      I’ll emphasize that this goes for any kind of machine learning model that can benefit from CUDA, which means a large number of gaming computers already meet the prerequisites. Installation is straightforward (though it does require some familiarity with the tooling), and I hope to see more ML applications for hobbyists in the near future. Image generation and locally hosted GPT models come to mind; see the sketch below for a quick way to check whether your GPU is ready.
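
      As a rough illustration of how low the barrier is (this is a minimal sketch, assuming an NVIDIA GPU with a working driver and PyTorch installed, which is only one of several ML stacks you could use), a quick check confirms whether CUDA workloads such as image generation or a local GPT-style model will actually run on the card:

      ```python
      # Minimal sketch: verify that a CUDA-capable GPU is visible to PyTorch.
      # Assumes PyTorch was installed with CUDA support (e.g. via pip or conda).
      import torch

      if torch.cuda.is_available():
          # Report the first visible GPU and the CUDA runtime PyTorch was built against.
          print(f"CUDA device: {torch.cuda.get_device_name(0)}")
          print(f"CUDA runtime: {torch.version.cuda}")
      else:
          # Without a usable CUDA device, models will fall back to (much slower) CPU execution.
          print("No CUDA device found; workloads will run on CPU.")
      ```

      If this prints your GPU's name, tools like local image generators or LLM runners that build on the same stack should be able to use it as well.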