
Dify Backend API

Usage

[!IMPORTANT] In the v0.6.12 release, we deprecated pip as the package management tool for the Dify API backend service and replaced it with Poetry.

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   cp middleware.env.example middleware.env
   # change the profile to other vector database if you are not using weaviate
   docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
   cd ../api
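
To confirm the middleware containers came up, you can list the project's containers (an optional check; assumes Docker Compose V2):

   docker compose -p dify ps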
  2. Copy .env.example to .env
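For example, from the api directory:

   cp .env.example .env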
  3. Generate a SECRET_KEY in the .env file.
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
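
To quickly confirm the key was written, you can print the line back out (optional check):

   grep '^SECRET_KEY=' .env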
  4. Create environment.

Dify API service uses Poetry to manage dependencies. You can execute poetry shell to activate the environment.
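
For example, to activate it:

   poetry shell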

  5. Install dependencies
   poetry env use 3.10
   poetry install

If a contributor has forgotten to update the dependencies in pyproject.toml, you can run the following shell commands instead.

   poetry shell                                        # activate the current environment
   poetry add $(cat requirements.txt)                  # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev  # install development dependencies and update pyproject.toml
  6. Run the database migrations

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
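
If you want to verify which revision the database is on afterwards, Flask-Migrate (which provides the db commands used above) offers a current subcommand:

   poetry run python -m flask db current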
  7. Start backend
   poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
  8. Start the Dify web service.
  9. Set up your application by visiting http://localhost:3000
  10. If you need to debug local async processing, start the worker service.
   poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion

The Celery worker handles async tasks such as dataset importing and document indexing.

Testing

  1. Install dependencies for both the backend and the test environment
   poetry install --with dev
  2. Run the tests locally with mocked system environment variables defined in the tool.pytest_env section of pyproject.toml
   cd ../
   poetry run -C api bash dev/pytest/pytest_all_tests.sh