
fix(web): optimize prompt change logic for LLM nodes (#20841) (#20865)

tags/1.4.2
HyaCinth committed 4 months ago
commit fc6e2d14a5
2 changed files with 3 additions and 3 deletions:
  1. web/app/components/workflow/nodes/llm/use-config.ts (+2, −2)
  2. web/app/components/workflow/types.ts (+1, −1)

web/app/components/workflow/nodes/llm/use-config.ts (+2, −2)

@@ -247,11 +247,11 @@ const useConfig = (id: string, payload: LLMNodeType) => {
   }, [inputs, setInputs])
 
   const handlePromptChange = useCallback((newPrompt: PromptItem[] | PromptItem) => {
-    const newInputs = produce(inputRef.current, (draft) => {
+    const newInputs = produce(inputs, (draft) => {
       draft.prompt_template = newPrompt
     })
     setInputs(newInputs)
-  }, [setInputs])
+  }, [inputs, setInputs])
 
   const handleMemoryChange = useCallback((newMemory?: Memory) => {
     const newInputs = produce(inputs, (draft) => {
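The hunk above changes `handlePromptChange` to build its next state from `inputs` and to list `inputs` in the `useCallback` dependency array. The hazard this class of fix addresses is a stale closure: a callback memoized without the state it reads keeps operating on the snapshot captured at creation time. Below is a minimal sketch of that hazard — it is not Dify's actual code; plain closures stand in for React's `useCallback`, and the `Inputs` shape, `makeHandler`, and field names are illustrative assumptions.

```typescript
// Illustrative shape standing in for the LLM node's inputs.
type Inputs = { prompt_template: string; memory: string };

// Stands in for a useCallback whose dependency array omits `inputs`:
// the returned handler closes over `inputs` as it was at creation time.
function makeHandler(inputs: Inputs) {
  return (newPrompt: string): Inputs => ({ ...inputs, prompt_template: newPrompt });
}

let state: Inputs = { prompt_template: "p1", memory: "m1" };

const staleHandler = makeHandler(state); // created once, never recreated
state = { ...state, memory: "m2" };      // another handler updates memory

// The stale handler rebuilds from the old snapshot, dropping the memory update:
const staleResult = staleHandler("p2");  // memory is still "m1"

// Recreating the handler from current state (analogous to deps
// `[inputs, setInputs]`) preserves the latest memory value:
const freshResult = makeHandler(state)("p2"); // memory is "m2"
```

Listing `inputs` in the dependency array means React recreates the callback whenever `inputs` changes, so each invocation starts from the latest state rather than a stale snapshot.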

web/app/components/workflow/types.ts (+1, −1)

@@ -198,7 +198,7 @@ export type InputVar = {
   hint?: string
   options?: string[]
   value_selector?: ValueSelector
-  hide: boolean
+  hide?: boolean
 } & Partial<UploadFileSetting>
 
 export type ModelConfig = {
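The second hunk relaxes `InputVar.hide` from required to optional. The practical effect is that object literals of this type no longer have to spell out `hide`, and readers must treat an omitted value as falsy. A small sketch, using a trimmed-down stand-in type (`InputVarLite` is an assumption, not the real `InputVar`):

```typescript
// Trimmed stand-in for InputVar after the change: `hide` is optional.
type InputVarLite = { label: string; hide?: boolean };

// Valid once `hide` is optional; with `hide: boolean` this literal
// would fail type-checking.
const v: InputVarLite = { label: "query" };

// Consumers default an omitted flag to false.
const hidden = v.hide ?? false;
```

This matches how the workflow UI treats the flag: an input variable is shown unless `hide` is explicitly set.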
