SocialistVibes01@lemmy.ml to Linux@lemmy.ml · English · edited 15 hours ago

Which specs are as low as reasonably possible for local LLM models? Do you recommend some distro in particular?
Eager Eagle@lemmy.world · English · 10 points · edited 14 hours ago
Heavily depends on the model and quantization level.

Choose the model you want on this website and it'll give you the specs you'll likely need to run it:

https://runthisllm.com/

Any/most distros will do, especially if you run it in Docker.

If you're going with Intel cards (best $ per GB of VRAM right now), you could get a decent machine for under $3k.
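To make the "depends on the model and quantization level" point concrete, here's a rough back-of-envelope sketch (my own heuristic, not from the site above): VRAM needed is roughly parameter count × bytes per weight, plus some overhead for the KV cache and activations. Real usage varies with context length and runtime, so treat this as a ballpark only.

```python
def estimate_vram_gb(params_billion: float, quant_bits: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB for a local LLM.

    params_billion: model size in billions of parameters
    quant_bits: quantization bit width (e.g. 4 for Q4, 16 for fp16)
    overhead: fudge factor (~20%) for KV cache and activations --
              an assumption, tune for your context length and runtime
    """
    # 1e9 params * (bits / 8) bytes per param ~= that many GB of weights
    weight_gb = params_billion * quant_bits / 8
    return weight_gb * overhead

# An 8B model: ~4.8 GB at 4-bit vs ~19.2 GB at fp16 --
# which is why quantization is what makes low-spec machines viable.
print(round(estimate_vram_gb(8, 4), 1))
print(round(estimate_vram_gb(8, 16), 1))
```

That gap between 4-bit and fp16 is the whole game on a budget card: the same 8B model either fits in a cheap GPU's VRAM or doesn't.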