Commit

Merge branch 'main' into feat/new-login
* main: (64 commits)
  chore: Enable Japanese descriptions for Tools (#8646)
  Make WORKFLOW_* configurable as environment variables. (#8644)
  feat: add deepseek-v2.5 for model provider siliconflow (#8639)
  docs: fix predefined_model_scale_out.md redirect error (#8633)
  feat: add qwen2.5 for model provider siliconflow (#8630)
  fix: send message error when chatting with opening statement (#8627)
  fix: llm_generator.py JSONDecodeError (#8504)
  fix: commands.py (#8483)
  fix: redundant check for available_document_count (#8491)
  chore: enhance configuration descriptions (#8624)
  chore: add Gemini newest experimental models (close #7121) (#8621)
  feat: support o1 series models for openrouter (#8358)
  fix: form input add tabIndex (#8478)
  Add model parameter translation (#8509)
  feat(tools/cogview):  Updated cogview tool to support cogview-3 and the latest cogview-3-plus (#8382)
  Add Fireworks AI as new model provider (#8428)
  feat:use xinference tts stream mode (#8616)
  docs: Add Japanese documentation for tools (#8469)
  feat: regenerate in `Chat`, `agent` and `Chatflow` app (#7661)
  feat: update pyproject.toml (#8368)
  ...
ZhouhaoJiang committed Sep 23, 2024
2 parents 799ff30 + 03fdf5e commit f6f6bb1
Showing 450 changed files with 16,703 additions and 1,542 deletions.
9 changes: 9 additions & 0 deletions .gitignore
@@ -153,6 +153,9 @@ docker-legacy/volumes/etcd/*
docker-legacy/volumes/minio/*
docker-legacy/volumes/milvus/*
docker-legacy/volumes/chroma/*
+docker-legacy/volumes/opensearch/data/*
+docker-legacy/volumes/pgvectors/data/*
+docker-legacy/volumes/pgvector/data/*

docker/volumes/app/storage/*
docker/volumes/certbot/*
@@ -164,6 +167,12 @@ docker/volumes/etcd/*
docker/volumes/minio/*
docker/volumes/milvus/*
docker/volumes/chroma/*
+docker/volumes/opensearch/data/*
+docker/volumes/myscale/data/*
+docker/volumes/myscale/log/*
+docker/volumes/unstructured/*
+docker/volumes/pgvector/data/*
+docker/volumes/pgvecto_rs/data/*

docker/nginx/conf.d/default.conf
docker/middleware.env
4 changes: 2 additions & 2 deletions CONTRIBUTING_CN.md
@@ -36,7 +36,7 @@
| 被团队成员标记为高优先级的功能 | 高优先级 |
|[community feedback board](https://github.com/langgenius/dify/discussions/categories/feedbacks) 内反馈的常见功能请求 | 中等优先级 |
| 非核心功能和小幅改进 | 低优先级 |
-| 有价值当不紧急 | 未来功能 |
+| 有价值但不紧急 | 未来功能 |

### 其他任何事情(例如 bug 报告、性能优化、拼写错误更正):
* 立即开始编码。
@@ -138,7 +138,7 @@ Dify 的后端使用 Python 编写，使用 [Flask](https://flask.palletsproject
├── models // 描述数据模型和 API 响应的形状
├── public // 如 favicon 等元资源
├── service // 定义 API 操作的形状
-├── test
+├── test
├── types // 函数参数和返回值的描述
└── utils // 共享的实用函数
```
4 changes: 1 addition & 3 deletions api/README.md
@@ -65,14 +65,12 @@

8. Start Dify [web](../web) service.
9. Setup your application by visiting `http://localhost:3000`...
-10. If you need to debug local async processing, please start the worker service.
+10. If you need to handle and debug the async tasks (e.g. dataset importing and documents indexing), please start the worker service.

```bash
poetry run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion
```

-The started celery app handles the async tasks, e.g. dataset importing and documents indexing.
-
## Testing

1. Install dependencies for both the backend and the test environment
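The worker command in the hunk above consumes the queues listed after `-Q`. As a rough sketch of how an async task ends up on one of those queues, the snippet below defines a hypothetical Celery task routed to the `dataset` queue; the broker URL and task name are illustrative assumptions, not code from this commit.

```python
# Hedged sketch: a hypothetical task routed to the "dataset" queue that the
# worker command above would consume. Names and broker URL are assumptions.
from celery import Celery

celery_app = Celery("app", broker="redis://localhost:6379/0")  # assumed broker URL

@celery_app.task(queue="dataset")
def index_document(document_id: str) -> None:
    # Placeholder for real document-indexing logic.
    print(f"indexing document {document_id}")

# Enqueuing from application code; a worker started with `-Q dataset,...` picks it up.
index_document.delay("doc-123")
```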
130 changes: 62 additions & 68 deletions api/commands.py

Large diffs are not rendered by default.

12 changes: 6 additions & 6 deletions api/configs/deploy/__init__.py
@@ -4,30 +4,30 @@

class DeploymentConfig(BaseSettings):
    """
-    Deployment configs
+    Configuration settings for application deployment
    """

    APPLICATION_NAME: str = Field(
-        description="application name",
+        description="Name of the application, used for identification and logging purposes",
        default="langgenius/dify",
    )

    DEBUG: bool = Field(
-        description="whether to enable debug mode.",
+        description="Enable debug mode for additional logging and development features",
        default=False,
    )

    TESTING: bool = Field(
-        description="",
+        description="Enable testing mode for running automated tests",
        default=False,
    )

    EDITION: str = Field(
-        description="deployment edition",
+        description="Deployment edition of the application (e.g., 'SELF_HOSTED', 'CLOUD')",
        default="SELF_HOSTED",
    )

    DEPLOY_ENV: str = Field(
-        description="deployment environment, default to PRODUCTION.",
+        description="Deployment environment (e.g., 'PRODUCTION', 'DEVELOPMENT'), default to PRODUCTION",
        default="PRODUCTION",
    )
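As a hedged illustration of how a `BaseSettings` class such as `DeploymentConfig` resolves its values, the sketch below shows environment variables overriding the declared defaults; the `pydantic_settings` import path and the standalone class are assumptions based on common pydantic usage, not taken from this diff.

```python
# Minimal sketch (assumed pydantic-settings behaviour, not code from this commit).
import os

from pydantic import Field
from pydantic_settings import BaseSettings


class DeploymentConfig(BaseSettings):
    APPLICATION_NAME: str = Field(default="langgenius/dify")
    DEBUG: bool = Field(default=False)
    EDITION: str = Field(default="SELF_HOSTED")


os.environ["DEBUG"] = "true"       # environment variables override the defaults
os.environ["EDITION"] = "CLOUD"

config = DeploymentConfig()
print(config.APPLICATION_NAME)       # "langgenius/dify" (default retained)
print(config.DEBUG, config.EDITION)  # True CLOUD
```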
6 changes: 3 additions & 3 deletions api/configs/enterprise/__init__.py
@@ -4,17 +4,17 @@

class EnterpriseFeatureConfig(BaseSettings):
    """
-    Enterprise feature configs.
+    Configuration for enterprise-level features.
    **Before using, please contact [email protected] by email to inquire about licensing matters.**
    """

    ENTERPRISE_ENABLED: bool = Field(
-        description="whether to enable enterprise features."
+        description="Enable or disable enterprise-level features."
        "Before using, please contact [email protected] by email to inquire about licensing matters.",
        default=False,
    )

    CAN_REPLACE_LOGO: bool = Field(
-        description="whether to allow replacing enterprise logo.",
+        description="Allow customization of the enterprise logo.",
        default=False,
    )
13 changes: 7 additions & 6 deletions api/configs/extra/notion_config.py
@@ -6,30 +6,31 @@

class NotionConfig(BaseSettings):
    """
-    Notion integration configs
+    Configuration settings for Notion integration
    """

    NOTION_CLIENT_ID: Optional[str] = Field(
-        description="Notion client ID",
+        description="Client ID for Notion API authentication. Required for OAuth 2.0 flow.",
        default=None,
    )

    NOTION_CLIENT_SECRET: Optional[str] = Field(
-        description="Notion client secret key",
+        description="Client secret for Notion API authentication. Required for OAuth 2.0 flow.",
        default=None,
    )

    NOTION_INTEGRATION_TYPE: Optional[str] = Field(
-        description="Notion integration type, default to None, available values: internal.",
+        description="Type of Notion integration."
+        " Set to 'internal' for internal integrations, or None for public integrations.",
        default=None,
    )

    NOTION_INTERNAL_SECRET: Optional[str] = Field(
-        description="Notion internal secret key",
+        description="Secret key for internal Notion integrations. Required when NOTION_INTEGRATION_TYPE is 'internal'.",
        default=None,
    )

    NOTION_INTEGRATION_TOKEN: Optional[str] = Field(
-        description="Notion integration token",
+        description="Integration token for Notion API access. Used for direct API calls without OAuth flow.",
        default=None,
    )
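The descriptions above distinguish internal integrations (a single secret token) from public ones (OAuth 2.0 client credentials). The helper below is a purely hypothetical sketch of how that branching could look; it is not part of this commit.

```python
# Hypothetical helper (not from the Dify codebase): pick Notion credentials
# based on NOTION_INTEGRATION_TYPE.
from typing import Optional


def resolve_notion_credentials(
    integration_type: Optional[str],
    internal_secret: Optional[str],
    client_id: Optional[str],
    client_secret: Optional[str],
) -> dict:
    if integration_type == "internal":
        # Internal integrations authenticate with a single secret token.
        return {"auth_mode": "internal", "token": internal_secret}
    # Public integrations use the OAuth 2.0 flow with client credentials.
    return {"auth_mode": "oauth", "client_id": client_id, "client_secret": client_secret}
```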
11 changes: 7 additions & 4 deletions api/configs/extra/sentry_config.py
@@ -6,20 +6,23 @@

class SentryConfig(BaseSettings):
    """
-    Sentry configs
+    Configuration settings for Sentry error tracking and performance monitoring
    """

    SENTRY_DSN: Optional[str] = Field(
-        description="Sentry DSN",
+        description="Sentry Data Source Name (DSN)."
+        " This is the unique identifier of your Sentry project, used to send events to the correct project.",
        default=None,
    )

    SENTRY_TRACES_SAMPLE_RATE: NonNegativeFloat = Field(
-        description="Sentry trace sample rate",
+        description="Sample rate for Sentry performance monitoring traces."
+        " Value between 0.0 and 1.0, where 1.0 means 100% of traces are sent to Sentry.",
        default=1.0,
    )

    SENTRY_PROFILES_SAMPLE_RATE: NonNegativeFloat = Field(
-        description="Sentry profiles sample rate",
+        description="Sample rate for Sentry profiling."
+        " Value between 0.0 and 1.0, where 1.0 means 100% of profiles are sent to Sentry.",
        default=1.0,
    )
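For context on how these settings are typically consumed, the snippet below shows a generic `sentry_sdk.init` call using a DSN and the two sample rates; the DSN is a placeholder and the wiring is a usage sketch, not code from this commit.

```python
# Generic usage sketch (not from this commit): initialise the Sentry SDK with
# the values described by SentryConfig.
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,    # 1.0 sends 100% of performance traces
    profiles_sample_rate=1.0,  # 1.0 sends 100% of profiles
)
```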