
Fix: Tried to fix the fid mis match under some cases (#7426)

### What problem does this PR solve?

https://github.com/infiniflow/ragflow/issues/7407

Based on this context, I think there are cases where some LLMs end up with a mismatched factory id (the wrong "@xxx" suffix gets appended).
So when a lookup with the fid cannot fetch the LLM, falling back to a lookup by name alone should be able to fetch it.

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
tags/v0.19.0
Stephen Hu, 6 months ago
parent commit 2dbcc0a1bf
1 changed file with 4 additions and 0 deletions

api/db/services/llm_service.py  +4 −0



```diff
  model_config = cls.get_api_key(tenant_id, mdlnm)
  mdlnm, fid = TenantLLMService.split_model_name_and_factory(mdlnm)
+ if not model_config:  # for some cases seems fid mismatch
+     model_config = cls.get_api_key(tenant_id, mdlnm)
  if model_config:
      model_config = model_config.to_dict()
      llm = LLMService.query(llm_name=mdlnm) if not fid else LLMService.query(llm_name=mdlnm, fid=fid)
+     if not llm and fid:  # for some cases seems fid mismatch
+         llm = LLMService.query(llm_name=mdlnm)
      if llm:
          model_config["is_tools"] = llm[0].is_tools
  if not model_config:
```
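The fallback pattern the diff adds can be sketched in isolation. This is a minimal, self-contained illustration, not ragflow's actual code: `REGISTRY`, `query_llm`, and `lookup_with_fallback` are hypothetical stand-ins for `LLMService.query` and the surrounding logic, and the `split_model_name_and_factory` here is only an assumed approximation of the real helper.

```python
# Model names may carry a factory suffix, e.g. "deepseek-chat@DeepSeek".
# If the stored record has no (or a different) "@factory" part, a lookup
# that filters on fid fails; the fix retries with the bare name alone.

def split_model_name_and_factory(mdlnm: str):
    """Split 'name@factory' into (name, fid); fid is None if absent."""
    if "@" in mdlnm:
        name, _, fid = mdlnm.rpartition("@")
        return name, fid
    return mdlnm, None

# Toy in-memory registry standing in for the database-backed LLMService.
REGISTRY = [
    {"llm_name": "deepseek-chat", "fid": "DeepSeek", "is_tools": True},
]

def query_llm(llm_name, fid=None):
    """Return matching records; fid=None means 'match any factory'."""
    return [r for r in REGISTRY
            if r["llm_name"] == llm_name and (fid is None or r["fid"] == fid)]

def lookup_with_fallback(mdlnm: str):
    name, fid = split_model_name_and_factory(mdlnm)
    llm = query_llm(name, fid) if fid else query_llm(name)
    if not llm and fid:  # fid mismatch: retry by name alone
        llm = query_llm(name)
    return llm
```

With this sketch, `lookup_with_fallback("deepseek-chat@WrongFactory")` still finds the record: the fid-filtered query comes back empty, so the second query by bare name succeeds, which is exactly the behavior the patch is after.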
