
ChatGPT asked me to be polite when I asked it to draw me an image!

What kind of computer do you need to run it? I assume the processing load falls mostly on the CPU. How heavy is the LLM? Can it run smoothly from an HDD? Also, what UI would you suggest for general, not specific, use?
Mainly the GPU and memory (VRAM). For example, LLMs run smoothly on my MacBook with a Max chip and 64 GB of RAM, but any PC with a recent high-end GPU will suffice. As for storage, SSDs perform far better than HDDs, but an HDD should work.
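As a rough back-of-envelope for "will it fit in my VRAM": model weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus some overhead for activations and the KV cache. The helper below is a hypothetical sketch of that rule of thumb (the 1.2× overhead factor is an assumption, not a measured figure):

```python
def est_vram_gb(n_params_billion: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weights only, scaled by an overhead factor
    (assumed 1.2x) for activations and KV cache. Not exact, just a sanity check."""
    return n_params_billion * bits_per_weight / 8 * overhead

# A 7B model quantized to 4 bits per weight:
print(round(est_vram_gb(7, 4), 1))   # roughly 4 GB
# The same model at 16-bit precision needs several times more:
print(round(est_vram_gb(7, 16), 1))  # roughly 17 GB
```

This is why quantized (4-bit or 5-bit) builds are what most people run locally: they bring a 7B model within reach of a consumer GPU.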

For a user interface, you might want to try LM Studio paired with the Mixtral 8x7B model. Check out the LM Studio site ("Discover and run local LLMs") for more details.
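Beyond the chat UI, LM Studio can also serve a loaded model over an OpenAI-compatible local HTTP API (by default at http://localhost:1234/v1), so you can script against it. A minimal sketch, assuming the local server is running and a model is loaded (the model name and port here are illustrative defaults, not guaranteed for your setup):

```python
import json
from urllib import request

def build_chat_payload(prompt: str,
                       model: str = "mixtral-8x7b-instruct",
                       temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask_local_llm(prompt: str, base_url: str = "http://localhost:1234/v1") -> str:
    """POST the prompt to the local server and return the reply text.
    Requires LM Studio's local server to be running."""
    req = request.Request(
        base_url + "/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI schema, most existing OpenAI client code can be pointed at the local server just by changing the base URL.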
 

This thread gives you a pretty good idea of how it works with Python. Keep in mind this is only with ChatGPT 3.5, so 4.0, which includes the Code Interpreter, will likely yield even better results.

Which local model is as good as this? I haven't seen any on LM Studio either. I'll check out Mistral again. Using image-generation models from Civitai (or something of the sort) works perfectly fine locally, but text generation has always been a little lacking, in my opinion.
 

Nice share... at least it can do something.
 