# Dify Backend API

## Usage

> [!IMPORTANT]
>
> In the v1.3.0 release, `poetry` has been replaced with `uv` as the package manager for the Dify API backend service.

1. Start the docker-compose stack

   The backend requires some middleware, including PostgreSQL, Redis, and Weaviate, which can be started together using `docker-compose`.

   ```bash
   cd ../docker
   cp middleware.env.example middleware.env
   # Use the mysql profile if you are not using PostgreSQL; likewise, swap the
   # weaviate profile for your vector database of choice.
   docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
   cd ../api
   ```
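
   Before moving on, it can help to confirm the middleware containers actually came up (the `-p dify` project name matches the compose command above):

   ```bash
   docker compose -p dify ps
   ```

   A container stuck in a restart loop here usually points at a port conflict or a missing value in `middleware.env`.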
    
2. Copy `.env.example` to `.env`

   ```bash
   cp .env.example .env
   ```

> [!IMPORTANT]
>
> When the frontend and backend run on different subdomains, set `COOKIE_DOMAIN` to the site's top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
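
For instance, with the frontend at `app.example.com` and the API at `api.example.com` (hypothetical hosts), the relevant `.env` entry would be:

```bash
# .env — example.com is a placeholder for your real domain
COOKIE_DOMAIN=example.com
```

Both services then receive cookies scoped to `.example.com`.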

3. Generate a `SECRET_KEY` in the `.env` file.

   On Linux:

   ```bash
   sed -i "/^SECRET_KEY=/c\SECRET_KEY=$(openssl rand -base64 42)" .env
   ```

   On macOS:

   ```bash
   secret_key=$(openssl rand -base64 42)
   sed -i '' "/^SECRET_KEY=/c\\
   SECRET_KEY=${secret_key}" .env
   ```
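
   `openssl rand -base64 42` encodes 42 random bytes as a 56-character base64 string on a single line, which is what the `sed` commands above splice into `.env`. A quick sanity check:

   ```bash
   # 42 bytes -> (42 / 3) * 4 = 56 base64 characters, no padding, single line
   key=$(openssl rand -base64 42)
   echo "${#key}"  # prints 56
   ```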
    
4. Create the environment.

   The Dify API service uses `uv` to manage dependencies. First, install the `uv` package manager if you don't have it already.

   ```bash
   pip install uv
   # Or on macOS
   brew install uv
   ```

5. Install dependencies

   ```bash
   uv sync --dev
   ```

6. Run database migrations

   Before the first launch, migrate the database to the latest version.

   ```bash
   uv run flask db upgrade
   ```

7. Start the backend

   ```bash
   uv run flask run --host 0.0.0.0 --port=5001 --debug
   ```

8. Start the Dify web service.

9. Set up your application by visiting http://localhost:3000.

10. If you need to handle and debug async tasks (e.g. dataset importing and document indexing), start the worker service:

```bash
uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor,retention
```
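
The `-Q` flag above subscribes the worker to every queue. When debugging a single feature, a worker listening on just the relevant queues (names taken from the list above) is often easier to follow, e.g. for dataset importing:

```bash
# Worker limited to the dataset queues, single thread, verbose logging
uv run celery -A app.celery worker -P threads -c 1 --loglevel DEBUG \
  -Q dataset,priority_dataset
```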

Additionally, if you want to debug the Celery scheduled tasks, run the following command in another terminal to start the beat service:

```bash
uv run celery -A app.celery beat
```

## Testing

1. Install dependencies for both the backend and the test environment

   ```bash
   uv sync --dev
   ```

2. Run the tests locally. System environment variables are mocked via the `tool.pytest_env` section in `pyproject.toml`; see `Claude.md` for details.

   ```bash
   uv run pytest                           # Run all tests
   uv run pytest tests/unit_tests/         # Unit tests only
   uv run pytest tests/integration_tests/  # Integration tests

   # Code quality
   ../dev/reformat               # Run all formatters and linters
   uv run ruff check --fix ./    # Fix linting issues
   uv run ruff format ./         # Format code
   uv run basedpyright .         # Type checking
   ```
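
To iterate on a single test, pytest's usual selection flags work as expected (the file path and keyword below are illustrative, not actual test names):

```bash
# Run one test file, stop at the first failure, show print output
uv run pytest tests/unit_tests/path/to/test_file.py -x -s
# Or select tests by keyword expression
uv run pytest -k "workflow" tests/unit_tests/
```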