
Fix issue with `keep_alive=-1` for ollama chat model by allowing a user to set an additional configuration option (#9017)

### What problem does this PR solve?

Fixes an issue with `keep_alive=-1` for the Ollama chat model by allowing a user
to set an additional configuration option. This is a non-breaking change
because the previous default value (`keep_alive=-1`) is retained.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
- [x] Performance Improvement
- [x] Other (please describe):
  - An additional configuration option has been added to control the behavior of
RAGFlow when working with an Ollama LLM
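The resolution order introduced by the patch can be sketched as a standalone helper (a minimal sketch; `resolve_keep_alive` is a hypothetical name, while the `ollama_keep_alive` kwarg, the `OLLAMA_KEEP_ALIVE` environment variable, and the `-1` default all come from the diff):

```python
import os

def resolve_keep_alive(**kwargs):
    # Resolution order from the patch:
    # 1. explicit "ollama_keep_alive" kwarg (per-model configuration),
    # 2. OLLAMA_KEEP_ALIVE environment variable,
    # 3. the previous hard-coded default of -1 (keep the model loaded indefinitely).
    return kwargs.get("ollama_keep_alive", int(os.environ.get("OLLAMA_KEEP_ALIVE", -1)))

os.environ.pop("OLLAMA_KEEP_ALIVE", None)        # ensure a clean environment
print(resolve_keep_alive())                      # -1 (previous default)
os.environ["OLLAMA_KEEP_ALIVE"] = "300"
print(resolve_keep_alive())                      # 300 (from the environment)
print(resolve_keep_alive(ollama_keep_alive=60))  # 60 (explicit kwarg wins)
```

Because the kwarg is checked first, existing deployments that set nothing continue to get `keep_alive=-1`, which is why the change is non-breaking.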
tags/v0.20.0
Viktor Dmitriyev, 3 months ago
Commit b47dcc9108
1 changed file with 3 additions and 2 deletions

rag/llm/chat_model.py (+3, −2)

@@ -663,6 +663,7 @@ class OllamaChat(Base):
         self.client = Client(host=base_url) if not key or key == "x" else Client(host=base_url, headers={"Authorization": f"Bearer {key}"})
         self.model_name = model_name
+        self.keep_alive = kwargs.get("ollama_keep_alive", int(os.environ.get("OLLAMA_KEEP_ALIVE", -1)))
def _clean_conf(self, gen_conf):
options = {}
@@ -679,7 +680,7 @@
             ctx_size = self._calculate_dynamic_ctx(history)

             gen_conf["num_ctx"] = ctx_size
-            response = self.client.chat(model=self.model_name, messages=history, options=gen_conf, keep_alive=-1)
+            response = self.client.chat(model=self.model_name, messages=history, options=gen_conf, keep_alive=self.keep_alive)
             ans = response["message"]["content"].strip()
             token_count = response.get("eval_count", 0) + response.get("prompt_eval_count", 0)
             return ans, token_count
@@ -706,7 +707,7 @@

         ans = ""
         try:
-            response = self.client.chat(model=self.model_name, messages=history, stream=True, options=options, keep_alive=-1)
+            response = self.client.chat(model=self.model_name, messages=history, stream=True, options=options, keep_alive=self.keep_alive)
             for resp in response:
                 if resp["done"]:
                     token_count = resp.get("prompt_eval_count", 0) + resp.get("eval_count", 0)
