Ollama - run quantized versions of most models! (Tutorials included)


parthp


For those short on GPU resources, https://ollama.com/ offers a vast library of quantized versions of top LLMs.

Multiple quantization levels of each model are available, so you can trade output quality for memory footprint depending on your hardware.
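Once a model is pulled, Ollama serves it over a local HTTP API (default port 11434). Below is a minimal sketch of querying that API with only the standard library; the model tag `llama3:8b-instruct-q4_K_M` is illustrative of how a quantization level is selected via the tag suffix, and actual tag names vary per model (check the Ollama library page).

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # The quantization level is chosen via the model tag, e.g. the
    # "q4_K_M" suffix in "llama3:8b-instruct-q4_K_M" (illustrative tag).
    # "stream": False asks for one complete JSON response instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    # POST the JSON payload to the locally running Ollama server and
    # return the generated text from the "response" field.
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Requires `ollama pull <tag>` and a running `ollama serve` first.
    print(generate("llama3:8b-instruct-q4_K_M", "Why is the sky blue?"))
```

The same payload shape works for any model in the library; only the tag changes when you switch quantization levels.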

Tutorials for using Ollama with Google Colab:

