
Fix: claude max tokens. (#6484)

### What problem does this PR solve?

#6458

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
tags/v0.18.0
Kevin Hu, 7 months ago · commit 095fc84cf2
1 file changed, 2 insertions(+) and 2 deletions(-)

rag/llm/chat_model.py (+2 −2)

@@ -1443,7 +1443,7 @@ class AnthropicChat(Base):
             del gen_conf["presence_penalty"]
         if "frequency_penalty" in gen_conf:
             del gen_conf["frequency_penalty"]
-        gen_conf["max_tokens"] = 8196
+        gen_conf["max_tokens"] = 8192
         if "haiku" in self.model_name or "opus" in self.model_name:
             gen_conf["max_tokens"] = 4096

@@ -1477,7 +1477,7 @@ class AnthropicChat(Base):
             del gen_conf["presence_penalty"]
         if "frequency_penalty" in gen_conf:
             del gen_conf["frequency_penalty"]
-        gen_conf["max_tokens"] = 8196
+        gen_conf["max_tokens"] = 8192
         if "haiku" in self.model_name or "opus" in self.model_name:
             gen_conf["max_tokens"] = 4096
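
The corrected branch can be sketched as a standalone helper. The function name is illustrative; only the constants, keys, and conditions come from the diff. The fix replaces the typo `8196` with `8192` (2**13), the cap Anthropic models actually accept:

```python
def clamp_anthropic_gen_conf(gen_conf: dict, model_name: str) -> dict:
    """Sketch of the patched AnthropicChat logic: drop generation options
    the Anthropic API does not accept, then cap max_tokens per model."""
    # Anthropic's chat API does not take these OpenAI-style penalties.
    for key in ("presence_penalty", "frequency_penalty"):
        gen_conf.pop(key, None)
    # Corrected default cap: 8192 (2**13), not the typo 8196.
    gen_conf["max_tokens"] = 8192
    # Smaller cap for haiku/opus model families, as in the diff.
    if "haiku" in model_name or "opus" in model_name:
        gen_conf["max_tokens"] = 4096
    return gen_conf
```

Passing the off-by-four value `8196` makes the API reject the request, since it exceeds the model's maximum output length, which is what issue #6458 reports.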

