

Ollama

Ollama gives you one-command deployment of local LLMs.

Install

Launch Ollama
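If you run Ollama in Docker (as the `docker exec` command further below assumes), one way to launch it is via the official image. This is a sketch assuming Docker is installed; the container name `ollama` and port 11434 match the defaults used elsewhere in this guide:

```shell
# Start the official Ollama image in the background.
# Publishes the default Ollama port 11434 and persists downloaded models
# in a named volume so they survive container restarts.
docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama
```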

Decide which LLM you want to deploy (see the list of supported LLMs), say, mistral:

$ ollama run mistral

Or, if Ollama is running in a Docker container named ollama:

$ docker exec -it ollama ollama run mistral
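Before wiring Ollama into RAGFlow, you can sanity-check that the service is reachable. This is a sketch assuming Ollama is listening on its default port 11434 on the local machine:

```shell
# List the locally available models via Ollama's REST API; a JSON response
# confirms the service is up and shows which models have been pulled.
curl http://localhost:11434/api/tags
```

If this fails, RAGFlow will not be able to reach the base URL you configure in the next step.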

Use Ollama in RAGFlow

  • Go to ‘Settings > Model Providers > Models to be added > Ollama’.

Base URL: Enter the base URL where the Ollama service is accessible, e.g., http://<your-ollama-endpoint-domain>:11434.

  • Use Ollama Models.