
Fix: claude max tokens. (#6484)

### What problem does this PR solve?

#6458

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
tags/v0.18.0
Kevin Hu, 7 months ago
Parent commit: 095fc84cf2
No account linked to committer's email address
1 changed file with 2 additions and 2 deletions
rag/llm/chat_model.py (+2, −2)

```diff
             del gen_conf["presence_penalty"]
         if "frequency_penalty" in gen_conf:
             del gen_conf["frequency_penalty"]
-        gen_conf["max_tokens"] = 8196
+        gen_conf["max_tokens"] = 8192
         if "haiku" in self.model_name or "opus" in self.model_name:
             gen_conf["max_tokens"] = 4096
```


```diff
             del gen_conf["presence_penalty"]
         if "frequency_penalty" in gen_conf:
             del gen_conf["frequency_penalty"]
-        gen_conf["max_tokens"] = 8196
+        gen_conf["max_tokens"] = 8192
         if "haiku" in self.model_name or "opus" in self.model_name:
             gen_conf["max_tokens"] = 4096
```
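The fix corrects a typo: `8196` is not a valid Claude output cap, and the intended default is `8192`, with a lower `4096` cap for model names containing "haiku" or "opus". As a standalone illustration (not the actual `ChatModel` class from the repo; the function name here is hypothetical), the corrected logic can be sketched as:

```python
def clamp_claude_max_tokens(gen_conf: dict, model_name: str) -> dict:
    """Drop unsupported sampling penalties and cap max_tokens for Claude models.

    Mirrors the logic as fixed in this commit; a sketch, assuming gen_conf is a
    plain dict of generation settings passed to the Anthropic client.
    """
    # Anthropic's API does not accept these OpenAI-style penalty parameters.
    for key in ("presence_penalty", "frequency_penalty"):
        gen_conf.pop(key, None)
    # Default output cap: 8192 tokens (the commit fixes a "8196" typo).
    gen_conf["max_tokens"] = 8192
    # haiku/opus variants used a lower 4096-token output cap here.
    if "haiku" in model_name or "opus" in model_name:
        gen_conf["max_tokens"] = 4096
    return gen_conf
```

Requesting an invalid `max_tokens` such as 8196 would be rejected by the API for models whose hard limit is 8192, which is why the off-by-four typo mattered.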


