Affordable hardware for running your local AI
I've been considering buying a PC, or building one, to run my own LLMs locally without having to subscribe to a service and share my data with external vendors.
I was wondering what an affordable small home-lab server/PC would look like that's "good enough", whatever that turns out to mean.
I've tried quantised models and they work quite well, but they're usually quite slow on CPU; GPUs, on the other hand, have crazy prices and little VRAM… Perhaps offloading part of the model to the GPU might be an option, but this, like many other factors, needs to be tuned to the final PC build. Thanks!
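For context on the offloading idea: a sketch of how it's commonly done, assuming a llama.cpp-based setup (one popular runtime among several) with GPU support compiled in and a quantised GGUF model already downloaded. The model path here is hypothetical.

```shell
# Partial GPU offload with llama.cpp (hypothetical model path).
# -ngl sets how many transformer layers are placed in GPU VRAM;
# the remaining layers stay in system RAM and run on the CPU.
# Raising -ngl until VRAM is nearly full usually gives the best
# tokens/sec on a small-VRAM card.
./llama-cli -m ./models/model-q4_k_m.gguf -ngl 20 -p "Hello"
```

The point is that even a modest GPU speeds things up noticeably, since only as many layers as fit in VRAM need to be offloaded.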