# Ollama
<div align="center" style="margin-top:20px;margin-bottom:20px;">
<img src="https://github.com/infiniflow/ragflow/assets/12318111/2019e7ee-1e8a-412e-9349-11bbf702e549" width="130"/>
</div>
One-click deployment of local LLMs with [Ollama](https://github.com/ollama/ollama).
## Install
- [Ollama on Linux](https://github.com/ollama/ollama/blob/main/docs/linux.md)
- [Ollama Windows Preview](https://github.com/ollama/ollama/blob/main/docs/windows.md)
- [Docker](https://hub.docker.com/r/ollama/ollama)
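
If you go the Docker route, a minimal sketch of starting the Ollama container looks like the following (the volume name and container name `ollama` follow the conventions on the Docker Hub page; adjust them to your setup):

```bash
# Start Ollama in the background, persist pulled models in a named volume,
# and expose the default API port 11434 on the host.
docker run -d \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```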
## Launch Ollama
Decide which LLM you want to deploy ([here's a list of supported LLMs](https://ollama.com/library)), say, **mistral**:
```bash
$ ollama run mistral
```
Or, if Ollama runs in Docker:
```bash
$ docker exec -it ollama ollama run mistral
```
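
In either case, you can sanity-check that the Ollama server is up and that the model has been pulled by hitting its HTTP API on the default port 11434:

```bash
# The root endpoint answers with "Ollama is running" when the server is up.
curl http://localhost:11434

# Lists locally available models as JSON; mistral should appear after the pull.
curl http://localhost:11434/api/tags
```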
## Use Ollama in RAGFlow
- Go to 'Settings > Model Providers > Models to be added > Ollama'.
<div align="center" style="margin-top:20px;margin-bottom:20px;">
<img src="https://github.com/infiniflow/ragflow/assets/12318111/a9df198a-226d-4f30-b8d7-829f00256d46" width="1300"/>
</div>
> Base URL: Enter the base URL where the Ollama service is accessible, for example `http://<your-ollama-endpoint-domain>:11434` (see the connectivity check at the end of this section).
- Use Ollama Models.
<div align="center" style="margin-top:20px;margin-bottom:20px;">
<img src="https://github.com/infiniflow/ragflow/assets/12318111/60ff384e-5013-41ff-a573-9a543d237fd3" width="530"/>
</div>
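
Note that when RAGFlow itself runs in Docker, `localhost` inside the RAGFlow container refers to the container, not to the machine hosting Ollama, so the base URL usually needs a host-reachable address. One way to confirm that the URL you entered is reachable from RAGFlow is to curl it from inside the RAGFlow container (the container name `ragflow-server` is an assumption; substitute whatever name your deployment uses):

```bash
# Replace <your-ollama-endpoint-domain> with the host or IP you entered as Base URL.
# A reachable Ollama endpoint replies with "Ollama is running".
docker exec -it ragflow-server \
  curl http://<your-ollama-endpoint-domain>:11434
```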