Browse Source

fix(web): optimize prompt change logic for LLM nodes (#20841) (#20865)

tags/1.4.2
HyaCinth 4 months ago
parent
commit fc6e2d14a5
No account linked to committer's email address

web/app/components/workflow/nodes/llm/use-config.ts  +2 −2

@@ -247,11 +247,11 @@ const useConfig = (id: string, payload: LLMNodeType) => {
   }, [inputs, setInputs])

   const handlePromptChange = useCallback((newPrompt: PromptItem[] | PromptItem) => {
-    const newInputs = produce(inputRef.current, (draft) => {
+    const newInputs = produce(inputs, (draft) => {
       draft.prompt_template = newPrompt
     })
     setInputs(newInputs)
-  }, [setInputs])
+  }, [inputs, setInputs])

   const handleMemoryChange = useCallback((newMemory?: Memory) => {
     const newInputs = produce(inputs, (draft) => {
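The dependency-array change above touches a classic React pitfall: a memoized callback must either list the state it reads in its dependency array or read that state through an always-current ref, otherwise it can capture a stale snapshot and silently overwrite newer updates. A minimal sketch of that hazard, with a toy `produce` stand-in and hypothetical names (this is not the Dify code):

```typescript
// Sketch of the stale-capture hazard behind memoized state handlers.
type Inputs = { prompt_template: string; other: string }

// Toy stand-in for immer's produce: shallow-copy, then mutate the draft.
function produce(base: Inputs, recipe: (draft: Inputs) => void): Inputs {
  const draft = { ...base }
  recipe(draft)
  return draft
}

let state: Inputs = { prompt_template: 'a', other: 'x' }

// Stand-in for a useCallback that closed over `inputs` at creation time.
const makeHandler = (inputs: Inputs) => (newPrompt: string) => {
  state = produce(inputs, (draft) => { draft.prompt_template = newPrompt })
}

const staleHandler = makeHandler(state)  // captured before `other` changed
state = { ...state, other: 'y' }         // an unrelated update lands

staleHandler('b')         // writes back the OLD snapshot: `other: 'y'` is lost
console.log(state.other)  // 'x'

const freshHandler = makeHandler(state)  // re-created against current state
freshHandler('c')
console.log(state)  // { prompt_template: 'c', other: 'x' }
```

Listing the state in the dependency array recreates the handler on every change (correct but more churn); reading through a ref keeps one stable handler that always sees the latest value. Either choice avoids the lost-update shown above.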

web/app/components/workflow/types.ts  +1 −1

@@ -198,7 +198,7 @@ export type InputVar = {
   hint?: string
   options?: string[]
   value_selector?: ValueSelector
-  hide: boolean
+  hide?: boolean
 } & Partial<UploadFileSetting>

 export type ModelConfig = {
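The second hunk makes `hide` optional on `InputVar`. A small illustration of what that permits, using a simplified stand-in type (the real `InputVar` has many more fields):

```typescript
// Simplified stand-in for InputVar; only the fields needed for the example.
type InputVar = {
  label: string
  hide?: boolean  // optional: existing literals that omit it still type-check
}

const visible: InputVar = { label: 'query' }             // legal once hide is optional
const hidden: InputVar = { label: 'secret', hide: true }

// Consumers treat a missing `hide` as "not hidden".
const isHidden = (v: InputVar): boolean => v.hide ?? false

console.log(isHidden(visible)) // false
console.log(isHidden(hidden))  // true
```

With a required `hide: boolean`, every existing object literal of this type would have to be updated to set the field explicitly; the `?` plus a `?? false` default at the point of use is the lighter-touch fix.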
