Dify Backend API

Usage

[!IMPORTANT] In the v0.6.12 release, we deprecated pip as the package management tool for the Dify API backend service and replaced it with Poetry.
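If Poetry is not installed yet, one common way to install it (per Poetry's own documentation) is via pipx:

   pipx install poetry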

  1. Start the docker-compose stack

The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using docker-compose.

   cd ../docker
   docker-compose -f docker-compose.middleware.yaml -p dify up -d
   cd ../api
  2. Copy .env.example to .env
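For example:

   cp .env.example .env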
  3. Generate a SECRET_KEY in the .env file.
On Linux (GNU sed):

   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env

On macOS (BSD sed):

   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
  4. Create environment.

The Dify API service uses Poetry to manage dependencies. You can run poetry shell to activate the environment.
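From the api directory:

   poetry shell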

  5. Install dependencies
   poetry env use 3.10
   poetry install

If a contributor forgot to update the dependencies in pyproject.toml, you can run the following shell commands instead.

   poetry shell                                       # activate the current environment
   poetry add $(cat requirements.txt)                 # install production dependencies and update pyproject.toml
   poetry add $(cat requirements-dev.txt) --group dev # install development dependencies and update pyproject.toml
  6. Run migrate

Before the first launch, migrate the database to the latest version.

   poetry run python -m flask db upgrade
  7. Start backend
   poetry run python -m flask run --host 0.0.0.0 --port=5001 --debug
  8. Start Dify web service.
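The web service lives in the ../web directory and has its own README with the exact steps. A minimal sketch, assuming a standard Node.js/npm setup:

   cd ../web
   npm install
   npm run dev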
  9. Set up your application by visiting http://localhost:3000
  10. If you need to debug local async processing, please start the worker service.
   poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail

The celery worker handles the async tasks, e.g. dataset importing and document indexing.

Testing

  1. Install dependencies for both the backend and the test environment
   poetry install --with dev
  2. Run the tests locally with mocked system environment variables set in the tool.pytest_env section of pyproject.toml
   cd ../
   poetry run -C api bash dev/pytest/pytest_all_tests.sh
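To iterate on a single test suite instead of the full run, you can also invoke pytest directly. A sketch, assuming the unit tests live under api/tests/unit_tests:

   cd api
   poetry run python -m pytest tests/unit_tests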