The most visible winner of the artificial intelligence boom achieved its dominance by becoming a one-stop shop for A.I. development, from chips to software to other services.
It seems self-hosted AI always needs at least 13GB of VRAM. Any GPU with more than 12GB of VRAM conveniently costs something like $1k per GB for every GB of VRAM beyond 12GB (sort of like how any boat longer than 18 feet usually costs $10k per foot for every foot of length beyond 18ft). There are projects that do it all on CPU, but still, AI GPU stuff is bullshit.
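For a rough sense of where a number like 13GB comes from, here's a back-of-envelope sketch: weight memory is roughly parameter count times bytes per parameter, plus some headroom for KV cache and activations. The 20% overhead factor and per-quantization byte sizes below are assumptions for illustration, not measurements.

```python
# Rough VRAM estimate: weights = params * bytes_per_param,
# times an assumed ~20% overhead for KV cache / activations.
def estimate_vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    weights_gb = params_billion * bytes_per_param  # 1B params * 1 byte ~= 1 GB
    return weights_gb * overhead

for quant, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"13B model @ {quant}: ~{estimate_vram_gb(13, bytes_per_param):.1f} GB")
# fp16 ~31 GB, int8 ~16 GB, int4 ~8 GB -- a 13B model at 8-bit sits
# right around that 12GB consumer-card ceiling.
```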