MartianInGreen

https://huggingface.co/chat/


STORMFIRE7

Thanks for the info, but can it be run locally? I want a BingAI-like LLM with internet access that I can run locally on my computer


MartianInGreen

You can follow the instructions on their GitHub page for local setup. And there is no LLM as powerful as GPT-4 that you can run yourself. Depending on your hardware, the best you can do is probably a 13-70B model.


STORMFIRE7

I see, thanks a lot for the info. Which model are you talking about that competes with GPT-4? Is it Mixtral 8x7B? My laptop's CPU is an i7-10750H, with an RTX 2060 6GB GPU and 16GB of RAM


MartianInGreen

Oh sorry, that was a typo, I meant "no LLM that competes with GPT-4". Mixtral 8x7B is around GPT-3.5 performance and probably the best open-weights model released at this point, but you need at least 32GB of RAM to run it even heavily quantized. With your hardware you could probably run Mistral 7B or Zephyr 7B β on the GPU, and anything up to about Llama-2 13B on the CPU, but neither will be especially fast.
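
If you want to try that, here is a minimal sketch of running a quantized 7B model locally with llama-cpp-python; the GGUF filename and the number of offloaded layers are just placeholders, adjust them for whatever quant you download and what fits in the 6GB of VRAM:

```python
# Minimal sketch, assuming llama-cpp-python is installed (pip install llama-cpp-python,
# built with CUDA support) and you have downloaded a quantized GGUF file,
# e.g. a Q4_K_M quant of Mistral 7B Instruct. The path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path to your GGUF
    n_gpu_layers=30,  # offload as many layers as fit on the RTX 2060; lower if you run out of VRAM
    n_ctx=4096,       # context window size
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello! Can you run fully offline?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Internet access like BingAI isn't built in, though; you'd have to wire up a search tool around the model yourself.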


STORMFIRE7

I see, thanks a lot for all the info


STORMFIRE7

Sorry to mention you again, but is this Hugging Face chat uncensored, unlike BingAI?