
fix: o1 model error, use max_completion_tokens instead of max_tokens. (#12037)

Co-authored-by: 刘江波 <jiangbo721@163.com>
tags/0.15.0
jiangbo721, 10 months ago
commit c98d91e44d

api/core/model_runtime/model_providers/azure_openai/llm/llm.py  +6 -3

@@ -113,7 +113,7 @@ class AzureOpenAILargeLanguageModel(_CommonAzureOpenAI, LargeLanguageModel):
         try:
             client = AzureOpenAI(**self._to_credential_kwargs(credentials))

-            if "o1" in model:
+            if model.startswith("o1"):
                 client.chat.completions.create(
                     messages=[{"role": "user", "content": "ping"}],
                     model=model,
@@ -311,7 +311,10 @@ class AzureOpenAILargeLanguageModel(_CommonAzureOpenAI, LargeLanguageModel):
         prompt_messages = self._clear_illegal_prompt_messages(model, prompt_messages)

         block_as_stream = False
-        if "o1" in model:
+        if model.startswith("o1"):
+            if "max_tokens" in model_parameters:
+                model_parameters["max_completion_tokens"] = model_parameters["max_tokens"]
+                del model_parameters["max_tokens"]
             if stream:
                 block_as_stream = True
                 stream = False
@@ -404,7 +407,7 @@ class AzureOpenAILargeLanguageModel(_CommonAzureOpenAI, LargeLanguageModel):
             ]
         )

-        if "o1" in model:
+        if model.startswith("o1"):
             system_message_count = len([m for m in prompt_messages if isinstance(m, SystemPromptMessage)])
             if system_message_count > 0:
                 new_prompt_messages = []
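
Why the parameter rename matters: o1-series models reject the legacy max_tokens parameter and expect max_completion_tokens instead, so the commit moves the value across before the API call. A minimal sketch of the same remap in isolation (the model name and parameter dict below are illustrative, not taken from the commit; pop is equivalent to the assignment-plus-del in the diff):

# Hypothetical inputs standing in for what the runtime receives.
model = "o1-mini"
model_parameters = {"max_tokens": 512, "temperature": 1}

# The remap introduced by this commit: move max_tokens to
# max_completion_tokens for o1-family deployments.
if model.startswith("o1"):
    if "max_tokens" in model_parameters:
        model_parameters["max_completion_tokens"] = model_parameters.pop("max_tokens")

print(model_parameters)  # {'temperature': 1, 'max_completion_tokens': 512}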

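The model check itself also changed from substring containment to a prefix test. Azure deployment names are user-chosen, so substring matching can misfire on unrelated deployments; a hypothetical name shows the difference:

model = "demo1-gpt-4"          # hypothetical deployment name that happens to contain "o1"

print("o1" in model)           # True  -- old check, false positive
print(model.startswith("o1"))  # False -- new check, correct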