
Refa: Update LLM stream response type to Generator (#9420)

### What problem does this PR solve?

Change return type of _generate_streamly from str to Generator[str,
None, None] to properly type hint streaming responses.

### Type of change

- [x] Refactoring
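
For context, a minimal sketch (not the RAGFlow code itself; the chunk source below is a stand-in) of why `Generator[str, None, None]` is the accurate annotation: a function whose body `yield`s string chunks returns a generator object when called, so the old `-> str` hint misstated what callers actually receive.

```python
from typing import Generator

def stream_answer(prompt: str) -> Generator[str, None, None]:
    """Illustrative only: yield an answer in incremental text chunks,
    the way a streaming LLM response is produced."""
    for chunk in ("Hel", "lo, ", "world!"):  # stand-in for model deltas
        yield chunk

# A type checker now sees an iterable of str, not a single str:
answer = "".join(stream_answer("hi"))
```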
tags/v0.20.2
Liu An committed 2 months ago · commit d7b4e84cda
1 file changed, 2 insertions(+), 2 deletions(-)
agent/component/llm.py (+2 −2)

```diff
--- a/agent/component/llm.py
+++ b/agent/component/llm.py
@@ ... @@
 import logging
 import os
 import re
-from typing import Any
+from typing import Any, Generator
 
 import json_repair
 from copy import deepcopy
@@ ... @@
         return self.chat_mdl.chat(msg[0]["content"], msg[1:], self._param.gen_conf(), **kwargs)
         return self.chat_mdl.chat(msg[0]["content"], msg[1:], self._param.gen_conf(), images=self.imgs, **kwargs)
 
-    def _generate_streamly(self, msg:list[dict], **kwargs) -> str:
+    def _generate_streamly(self, msg:list[dict], **kwargs) -> Generator[str, None, None]:
         ans = ""
         last_idx = 0
         endswith_think = False
```
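
On the caller side, the new annotation matches how the stream is consumed. A hypothetical usage sketch follows (`llm_component` and the message list are placeholders, and whether each yielded value is a delta or a cumulative answer depends on the underlying model wrapper):

```python
# Hypothetical caller: iterate the generator and accumulate the streamed text.
msg = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello"},
]
answer = ""
for piece in llm_component._generate_streamly(msg):  # placeholder component instance
    answer += piece
```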
