parthp · Posted August 29

For those who are short on GPU resources, https://ollama.com/ offers a vast library of quantized versions of top LLMs, with multiple quantization levels available for each model. A minimal usage sketch follows the tutorial links below.

Tutorials for using Ollama with Google Colab:
https://pub.towardsai.net/running-ollama-on-google-colab-free-tier-a-step-by-step-guide-9ef74b1f8f7a
https://github.com/SkkJodhpur/Gen-ai/blob/main/Models/Ollama_3_A_Step_by_Step_Guide/Ollama_3_A_Step_by_Step_Guide.ipynb
https://digitalitskills.com/running-ollama-on-google-colab-and-using-llm-models-for-free/
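For a quick sense of how a pulled model is queried, here is a minimal sketch against Ollama's local REST API. It assumes an Ollama server is already running on the default port (11434) and that the quantized model tag shown has been fetched with `ollama pull`; the tag itself is only an example, any entry from the Ollama library works.

import requests

# Example quantized model tag; substitute whichever model/quantization you pulled.
MODEL = "llama3.1:8b-instruct-q4_0"

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": MODEL,
        "prompt": "Explain quantization in one sentence.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=300,
)
response.raise_for_status()
print(response.json()["response"])  # generated text

The Colab tutorials linked above follow the same pattern: start the Ollama server in the notebook, pull a quantized model, then send requests to the local API.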