StubbornCassette8@feddit.nl to LocalLLaMA@sh.itjust.works • Mistral AI just dropped their new model, Mistral Large 2 • 4 months ago
What are the hardware requirements for these larger LLMs? Is it worth quantizing them to run on lower-end hardware for self-hosting? Not sure how much doing so would hurt their usefulness.
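Rough napkin math for the weights alone, assuming Mistral Large 2 really is ~123B parameters (and ignoring KV cache and other runtime overhead):

```python
# Back-of-envelope memory estimate for a ~123B-parameter model.
# Only counts the weights; KV cache and activations add more on top.
params_billion = 123

bytes_per_param = {
    "fp16/bf16": 2.0,   # weights as typically released
    "8-bit (Q8)": 1.0,  # roughly 1 byte per parameter
    "4-bit (Q4)": 0.5,  # roughly 0.5 byte per parameter
}

for name, bpp in bytes_per_param.items():
    gb = params_billion * bpp  # 1B params at 1 byte/param ≈ 1 GB
    print(f"{name}: ~{gb:.0f} GB of (V)RAM just for the weights")
```

If that's in the right ballpark, even a 4-bit quant is ~60+ GB, so it seems like multi-GPU or CPU offload (e.g. splitting layers between VRAM and system RAM with llama.cpp) territory. Curious what setups people are actually running this on.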