Compare commits

...

1516 Commits

Author SHA1 Message Date
zhsama
b8236b29f3 Merge branch 'feat/end-user-oauth' into deploy/end-user-oauth 2025-12-04 17:50:16 +08:00
zhsama
eb0e63c336 Merge remote-tracking branch 'origin/main' into feat/end-user-oauth
# Conflicts:
#	web/app/components/app/configuration/config/agent/agent-tools/index.tsx
2025-12-04 16:16:04 +08:00
kenwoodjw
f62926f0ca fix: bump pyarrow to 17.0.0, werkzeug to 3.1.4, urllib3 to 2.5.0 (#29089)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-12-04 15:39:31 +08:00
NFish
b033bb02fc chore: upgrade React to 19.2.1, fix CVE-2025-55182 (#29121) 2025-12-04 14:44:52 +08:00
Co-authored-by: zhsama <torvalds@linux.do>
2025-12-04 14:44:52 +08:00
zyssyz123
031cba81b4 Fix/app list compatible (#29123) 2025-12-04 14:44:24 +08:00
yangzheli
693ab6ad82 fix(web): disable tooltip delay to avoid tooltip flickering (#29104) 2025-12-04 14:16:56 +08:00
NFish
541fd7daa2 chore: update Next.js dev dependencies to 15.5.7 (#29120) 2025-12-04 14:16:45 +08:00
Boris Polonsky
61d79a1502 feat: Unify environment variables for database connection and authentication (#29092) 2025-12-04 14:16:11 +08:00
Yunlu Wen
03357ff1ec fix: catch error in response converter (#29056)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-04 11:25:16 +08:00
dependabot[bot]
b4bed94cc5 chore(deps): bump next from 15.5.6 to 15.5.7 in /web (#29105)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-04 10:14:50 +08:00
wangxiaolei
e924dc7b30 chore: ignore redis lock not owned error (#29064)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-04 10:14:28 +08:00
longbingljw
4b969bdce3 fix: mysql does not support 'returning' (#29069) 2025-12-04 10:14:19 +08:00
非法操作
d07afb38a0 fix: trigger call workflow_as_tool error (#29058) 2025-12-04 10:13:18 +08:00
hj24
5bb715ee2f fix: remove chat conversation api dead arg message_count_gte (#29097) 2025-12-04 10:12:47 +08:00
非法操作
3e5f683e90 feat: dark theme icon support (#28858) 2025-12-04 09:29:00 +08:00
zhsama
31481581e8 refactor: simplify marketplace component structure by removing unused… (#29095) 2025-12-03 21:30:24 +08:00
zhsama
16dc744908 Merge remote-tracking branch 'origin/main' into feat/end-user-oauth 2025-12-03 18:44:20 +08:00
yyh
2e0c2e8482 refactor/marketplace react query (#29028)
Co-authored-by: zhsama <torvalds@linux.do>
2025-12-03 18:30:20 +08:00
zhsama
0343374d52 feat: add ReactScan component for enhanced development scanning (#29086) 2025-12-03 18:19:12 +08:00
Joel
c1fe394c0e fix: slow education verify API check may cause page redirects when modal closes (#29078) 2025-12-03 17:11:57 +08:00
zhsama
f3258bab9e Merge remote-tracking branch 'origin/main' into feat/end-user-oauth 2025-12-03 16:36:24 +08:00
Joel
876f48df76 chore: remove useless mock files (#29068) 2025-12-03 15:34:11 +08:00
Coding On Star
fbb2d076f4 integrate Amplitude analytics into the application (#29049)
Co-authored-by: CodingOnStar <hanxujiang@dify.ai>
Co-authored-by: Joel <iamjoel007@gmail.com>
2025-12-03 14:22:12 +08:00
非法操作
c7d2a13524 fix: improve chat message log feedback (#29045)
Co-authored-by: yyh <yuanyouhuilyz@gmail.com>
2025-12-03 13:42:40 +08:00
kenwoodjw
9b9588f20d fix: CVE-2025-64718 (#29027)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-12-02 21:49:57 +08:00
wangxiaolei
d6bbf0f975 chore: enhance test (#29002)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-02 21:49:08 +08:00
wangxiaolei
f48522e923 feat: add x-trace-id to http responses and logs (#29015)
Introduce a trace id in HTTP responses and logs to facilitate debugging.
2025-12-02 17:22:34 +08:00
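A minimal sketch of the idea behind the x-trace-id change above, assuming a Flask app (Dify's API is Flask-based). The `X-Trace-Id` header matches the commit title; `TraceIdFilter` and the log format are illustrative, not Dify's actual code:

```python
import logging
import uuid

from flask import Flask, g, has_app_context

app = Flask(__name__)

class TraceIdFilter(logging.Filter):
    """Copy the current request's trace id onto every log record."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.trace_id = g.get("trace_id", "-") if has_app_context() else "-"
        return True

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s [%(trace_id)s] %(message)s"))
handler.addFilter(TraceIdFilter())
app.logger.addHandler(handler)

@app.before_request
def assign_trace_id() -> None:
    g.trace_id = uuid.uuid4().hex

@app.after_request
def add_trace_header(response):
    # Echo the trace id back so clients can quote it when reporting issues.
    response.headers["X-Trace-Id"] = g.trace_id
    return response
```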
yyh
f8b10c2272 Refactor apps service toward TanStack Query (#29004) 2025-12-02 15:18:33 +08:00
carribean
369892634d [Bugfix] Fixed an issue with UUID type queries in MySQL databases (#28941) 2025-12-02 14:37:23 +08:00
yyh
8e5cb86409 Stop showing slash commands in general Go to Anything search (#29012) 2025-12-02 14:24:21 +08:00
zhsama
472f8a8a2b Merge branch 'origin-main' into feat/end-user-oauth 2025-12-02 13:30:53 +08:00
Gritty_dev
a85afe4d07 feat: complete test script of plugin manager (#28967)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-12-02 11:25:08 +08:00
wangxiaolei
e8f93380d1 Fix validation (#28985)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-02 10:25:52 +08:00
yyh
0a22bc5d05 fix(web): use atomic selectors in AccessControlItem (#28983) 2025-12-01 19:23:42 +08:00
yyh
626d4f3e35 fix(web): use atomic selectors to fix Zustand v5 infinite loop (#28977) 2025-12-01 15:45:50 +08:00
dependabot[bot]
f4db5f9973 chore(deps): bump faker from 32.1.0 to 38.2.0 in /api (#28964)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-01 15:45:39 +08:00
Gritty_dev
70dabe318c feat: complete test script of mail send task (#28963) 2025-12-01 15:45:22 +08:00
dependabot[bot]
f94972f662 chore(deps): bump @lexical/list from 0.36.2 to 0.38.2 in /web (#28961)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-01 15:44:52 +08:00
zhsama
c8807d3f89 feat: add service connection panel and translations for service connection messages 2025-12-01 15:23:46 +08:00
wangxiaolei
d162f7e5ef feat(api): automatically NODE_TYPE_CLASSES_MAPPING generation from node class definitions (#28525) 2025-12-01 14:14:19 +08:00
zhsama
17e1de18d5 Merge branch 'feat/end-user-oauth' into deploy/end-user-oauth 2025-12-01 12:26:29 +08:00
zhsama
5e6053b367 Merge branch 'origin-main' into feat/end-user-oauth 2025-12-01 12:26:01 +08:00
dependabot[bot]
2f8cb2a1af chore(deps): bump @lexical/text from 0.36.2 to 0.38.2 in /web (#28960)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-01 09:56:58 +08:00
Stephen Zhou
b91d22375f fix: moving focus after navigations (#28937) 2025-12-01 09:55:04 +08:00
yyh
a087ace697 chore(web): upgrade zustand from v4.5.7 to v5.0.9 (#28943) 2025-12-01 09:53:19 +08:00
Conner Mo
0af8a7b958 feat: enhance OceanBase vector database with SQL injection fixes, unified processing, and improved error handling (#28951) 2025-12-01 09:51:47 +08:00
Gritty_dev
861098714b feat: complete test script of plugin runtime (#28955)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-12-01 09:51:31 +08:00
dependabot[bot]
63b345110e chore(deps): bump echarts-for-react from 3.0.2 to 3.0.5 in /web (#28958)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-01 09:51:22 +08:00
zhsama
f5e36a8a2b Merge branch 'origin-main' into feat/end-user-oauth 2025-12-01 02:25:44 +08:00
Asuka Minato
247069c7e9 refactor: port reqparse to Pydantic model (#28913)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-30 16:09:42 +09:00
Gritty_dev
bb096f4ae3 Feat/ implement test script of content moderation (#28923) 2025-11-30 12:43:58 +08:00
Lê Quốc Bình
a37497ffb5 fix(web): prevent navbar clearing app state on cmd+click (#28935) 2025-11-30 12:43:47 +08:00
github-actions[bot]
02adf4ff06 chore(i18n): translate i18n files and update type definitions (#28933)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-30 12:43:02 +08:00
Conner Mo
acbc886ecd fix: implement score_threshold filtering for OceanBase vector search (#28536)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-29 18:50:21 +08:00
CrabSAMA
0a2d478749 Feat: Add "Open Workflow" link in workflow side panel (#28898) 2025-11-29 18:47:12 +08:00
莫小帅
95528ad8e5 fix: ensure "No apps found" text is visible on small screens (#28929)
Co-authored-by: yyh <92089059+lyzno1@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-29 17:21:39 +08:00
Gritty_dev
ddad2460f3 feat: complete test script of dataset indexing task (#28897)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 21:31:03 +08:00
zhsama
660efda7f5 feat: enhance AgentTools component with expandable provider sections and improved UI interactions 2025-11-28 18:30:55 +08:00
zhsama
5e96e4dda6 feat: introduce CredentialConfigHeader and EndUserCredentialSection components for enhanced plugin authentication UI 2025-11-28 18:12:11 +08:00
Charles Yao
a8491c26ea fix: add explicit default to httpx.timeout (#28836) 2025-11-28 04:02:07 -06:00
aka James4u
0aed7afdc0 feat: Add comprehensive unit tests for TagService with extensive docu… (#28885)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-28 18:01:01 +08:00
Gritty_dev
18b800a33b feat: complete test script of sensitive word filter (#28879)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 18:00:54 +08:00
hsparks-codes
c64fe595d3 test: add comprehensive unit tests for ExternalDatasetService (#28872) 2025-11-28 17:59:02 +08:00
zhsama
a31eea8389 feat: refactor AgentTools component to group tools by provider and enhance UI interactions 2025-11-28 16:42:52 +08:00
-LAN-
dd3b1ccd45 refactor(workflow): remove redundant get_base_node_data() method (#28803) 2025-11-28 15:38:46 +08:00
hsparks-codes
6f927b4a62 test: add comprehensive unit tests for RecommendedAppService (#28869) 2025-11-28 15:10:24 +08:00
Gritty_dev
c76bb8ffa0 feat: complete test script of file upload (#28843)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 15:10:12 +08:00
hsparks-codes
4dcd871cef test: add comprehensive unit tests for AudioService (#28860) 2025-11-28 14:43:35 +08:00
hsparks-codes
abe1d31ae0 test: add comprehensive unit tests for SavedMessageService (#28845) 2025-11-28 14:42:54 +08:00
hsparks-codes
2d71fff2b2 test: add comprehensive unit tests for TagService (#28854) 2025-11-28 14:41:57 +08:00
-LAN-
c4f61b8ae7 Fix CODEOWNERS workflow owner handle (#28866) 2025-11-28 14:41:20 +08:00
非法操作
c51ab6ec37 fix: the consistency of the go-to-anything interaction (#28857) 2025-11-28 14:29:15 +08:00
hsparks-codes
1fc2255219 test: add comprehensive unit tests for EndUserService (#28840) 2025-11-28 14:22:19 +08:00
zhsama
6aa0c9e5cc Merge branch 'refs/heads/origin-main' into feat/end-user-oauth 2025-11-28 14:19:42 +08:00
Gritty_dev
037389137d feat: complete test script of indexing runner (#28828)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-28 14:18:59 +08:00
非法操作
8cd3e84c06 chore: bump dify plugin version in docker.middleware (#28847) 2025-11-28 13:55:13 +08:00
-LAN-
b3c6ac1430 chore: assign code owners to frontend and backend modules in CODEOWNERS (#28713) 2025-11-28 12:42:58 +08:00
hsparks-codes
68bb97919a feat: add comprehensive unit tests for MessageService (#28837)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-28 12:36:15 +08:00
Gritty_dev
f268d7c7be feat: complete test script of website crawl (#28826) 2025-11-28 12:34:27 +08:00
aka James4u
d695a79ba1 test: add comprehensive unit tests for DocumentIndexingTaskProxy (#28830)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-28 12:30:54 +08:00
Gritty_dev
cd5a745bd2 feat: complete test script of notion provider (#28833) 2025-11-28 12:30:45 +08:00
aka James4u
51e5f422c4 test: add comprehensive unit tests for VectorService and Vector classes (#28834)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 12:30:02 +08:00
hsparks-codes
ec3b2b40c2 test: add comprehensive unit tests for FeedbackService (#28771)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:33:56 +08:00
Bowen Liang
67ae3e9253 docker: use COPY --chown in api Dockerfile to avoid adding layers by explicit chown calls (#28756) 2025-11-28 11:33:06 +08:00
aka James4u
d38e3b7792 test: add unit tests for document service status management (#28804)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:25:36 +08:00
Gritty_dev
43d27edef2 feat: complete test script of embedding service (#28817)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:24:30 +08:00
Satoshi Dev
94b87eac72 feat: add comprehensive unit tests for provider models (#28702)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:24:20 +08:00
yyh
fd31af6012 fix(ci): use dynamic branch name for i18n workflow to prevent race condition (#28823)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-28 11:23:28 +08:00
yyh
228deccec2 chore: update packageManager version in package.json to pnpm@10.24.0 (#28820) 2025-11-28 11:23:20 +08:00
Gritty_dev
639f1d31f7 feat: complete test script of text splitter (#28813)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:22:52 +08:00
aka James4u
ec786fe236 test: add unit tests for document service validation and configuration (#28810)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:21:45 +08:00
Gritty_dev
fe3a6ef049 feat: complete test script of reranker (#28806)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-28 11:21:35 +08:00
-LAN-
8b761319f6 Refactor workflow nodes to use generic node_data (#28782) 2025-11-27 20:46:56 +08:00
github-actions[bot]
002d8769b0 chore: translate i18n files and update type definitions (#28784)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-27 20:28:17 +08:00
GuanMu
5aba111297 Feat zen mode (#28794) 2025-11-27 20:10:50 +08:00
-LAN-
dc9b3a7e03 refactor: rename VariableAssignerNodeData to VariableAggregatorNodeData (#28780) 2025-11-27 17:45:48 +08:00
zhsama
8fe47a3c04 Merge remote-tracking branch 'origin/main' into feat/end-user-oauth 2025-11-27 17:45:28 +08:00
zhsama
cb4670cd68 feat: enhance agent tools with end user credential support 2025-11-27 17:45:05 +08:00
zhsama
1400b9c6e2 refactor(plugin-auth): enhance end-user type selection UI and remove unused state 2025-11-27 17:45:05 +08:00
zhsama
b7d9483bc2 feat(end-user): implement end-user credential management in plugin-auth component
- Added support for end-user credentials with options for OAuth and API Key.
- Introduced new props in PluginAuth for managing end-user credential types and their states.
- Updated workflow types to include end-user credential fields.
- Enhanced UI to allow users to select and manage end-user credentials.
- Added translations for new UI elements related to end-user credentials.
2025-11-27 17:45:05 +08:00
Joel
5f2e0d6347 pref: reduce next step components reRender (#28783) 2025-11-27 17:12:00 +08:00
Coding On Star
1f72571c06 edit analyze-component (#28781)
Co-authored-by: CodingOnStar <hanxujiang@dify.ai>
Co-authored-by: 姜涵煦 <hanxujiang@jianghanxudeMacBook-Pro.local>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 16:54:44 +08:00
CrabSAMA
820925a866 feat(workflow): workflow as tool output schema (#26241)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-11-27 16:50:48 +08:00
Joel
299bd351fd perf: reduce reRender in candidate node (#28776) 2025-11-27 15:57:36 +08:00
-LAN-
13bf6547ee Refactor: centralize node data hydration (#27771) 2025-11-27 15:41:56 +08:00
wangxiaolei
1b733abe82 feat: create logs immediately when workflows start (not at completion) (#28701) 2025-11-27 15:22:33 +08:00
aka James4u
5782e26ab2 test: add unit tests for dataset service update/delete operations (#28757)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 15:01:43 +08:00
aka James4u
38d329e75a test: add unit tests for dataset permission service (#28760)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 15:00:55 +08:00
非法操作
58f448a926 chore: remove outdated model config doc (#28765) 2025-11-27 14:40:06 +08:00
Gritty_dev
7a7fea40d9 feat: complete test script of dataset retrieval (#28762)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 14:39:33 +08:00
Gritty_dev
0309545ff1 Feat/test script of workflow service (#28726)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-27 11:23:55 +08:00
-LAN-
6deabfdad3 Use naive_utc_now in graph engine tests (#28735)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 11:23:20 +08:00
非法操作
f9b4c31344 fix: MCP tool time configuration not working (#28740) 2025-11-27 11:22:49 +08:00
majinghe
8d8800e632 upgrade docker compose milvus version to 2.6.0 to fix installation error (#26618)
Co-authored-by: crazywoola <427733928@qq.com>
2025-11-27 11:01:14 +08:00
aka James4u
4ca4493084 Add comprehensive unit tests for MetadataService (dataset metadata CRUD operations and filtering) (#28748)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 11:00:10 +08:00
aka James4u
7efa0df1fd Add comprehensive API/controller tests for dataset endpoints (list, create, update, delete, documents, segments, hit testing, external datasets) (#28750)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 10:59:17 +08:00
Will
b786e101e5 fix: querying and setting the system default model (#28743) 2025-11-27 11:58:35 +09:00
Will
09a8046b10 fix: querying webhook trigger issue (#28753) 2025-11-27 10:56:21 +08:00
NeatGuyCoding
2f6b3f1c5f hotfix: fix _extract_filename for RFC 5987 (#26230)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-27 10:54:00 +08:00
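For context on the RFC 5987 fix above: `Content-Disposition` may carry a `filename*=charset''percent-encoded` parameter in addition to plain `filename=`. A simplified stand-in for extracting it (not Dify's actual `_extract_filename`; the RFC's optional language tag is omitted for brevity):

```python
from urllib.parse import unquote

def extract_rfc5987_filename(disposition: str) -> str | None:
    """Pull a filename out of a Content-Disposition header's filename* param."""
    for part in disposition.split(";"):
        part = part.strip()
        if part.lower().startswith("filename*="):
            value = part.split("=", 1)[1]
            charset, _, encoded = value.partition("''")  # charset''pct-encoded
            return unquote(encoded, encoding=charset or "utf-8")
    return None

# "UTF-8''%E2%82%AC%20rates.txt" decodes to "€ rates.txt"
print(extract_rfc5987_filename("attachment; filename*=UTF-8''%E2%82%AC%20rates.txt"))
```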
jiangbo721
2551f6f279 feat: add APP_DEFAULT_ACTIVE_REQUESTS as the default value for APP_AC… (#26930) 2025-11-27 10:51:48 +08:00
Gritty_dev
01afa56166 chore: enhance the test script of current billing service (#28747)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 10:37:24 +08:00
Satoshi Dev
5815950092 add unit tests for iteration node (#28719)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 10:36:47 +08:00
Satoshi Dev
766e16b26f add unit tests for code node (#28717) 2025-11-27 10:36:37 +08:00
Gritty_dev
0fdb4e7c12 chore: enhance the test script of conversation service (#28739)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 09:57:52 +08:00
aka James4u
64babb35e2 feat: Add comprehensive unit tests for DatasetCollectionBindingService (dataset collection binding methods) (#28724)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-27 09:55:42 +08:00
-LAN-
38522e5dfa fix: use default_factory for callable defaults in ORM dataclasses (#28730) 2025-11-27 09:39:49 +09:00
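The `default_factory` fix above addresses a classic Python pitfall: a callable or mutable default is evaluated once, at class-definition time. A small illustration with a hypothetical `Record` type:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Record:
    # Wrong: created_at=datetime.now() would bake in the import-time value,
    # and tags=[] would raise ValueError (mutable default). default_factory
    # defers the call to instantiation time and gives each instance its own object.
    created_at: datetime = field(default_factory=datetime.now)
    tags: list[str] = field(default_factory=list)

a, b = Record(), Record()
a.tags.append("x")
assert b.tags == []  # each instance gets its own list
```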
Charles Yao
9df9db3a8f feat: use enums instead of literals (#28733) 2025-11-26 13:31:12 -06:00
Charles Yao
bf4f9b04bf Merge branch 'feat/end-user-oauth' into feat/end-user-add-dsl-field 2025-11-26 13:29:54 -06:00
Charles Yao
a659cbf71d ECO-184: use enums instead of literals 2025-11-26 13:28:23 -06:00
Charles Yao
72514904ea Revert "feat: add dsl field for end user oauth" (#28732) 2025-11-26 13:24:07 -06:00
Charles Yao
f3fbd4f90e Revert "feat: add dsl field for end user oauth" 2025-11-26 13:23:58 -06:00
Charles Yao
5947cc2bab feat: add dsl field for end user oauth (#28687) 2025-11-26 13:23:27 -06:00
aka James4u
4ccc150fd1 test: add comprehensive unit tests for ExternalDatasetService (external knowledge API integration) (#28716)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 23:33:46 +08:00
crazywoola
a4c57017d5 add: badges (#28722) 2025-11-26 23:30:41 +08:00
Satoshi Dev
b2a7cec644 add unit tests for template transform node (#28595)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 22:50:20 +08:00
Gritty_dev
ddc5cbe865 feat: complete test script of dataset service (#28710)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-26 22:48:08 +08:00
XlKsyt
1e23957657 fix(ops): add streaming metrics and LLM span for agent-chat traces (#28320)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-26 22:45:20 +08:00
Asuka Minato
2731b04ff9 Pydantic models (#28697)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-26 22:44:14 +08:00
Satoshi Dev
e8ca80a61a add unit tests for list operator node (#28597)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 22:43:30 +08:00
aka James4u
e76129b5a4 test: add comprehensive unit tests for HitTestingService Fix: #28667 (#28668)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 22:42:58 +08:00
非法操作
6635ea62c2 fix: changing an existing node to a webhook node raises 404 (#28686) 2025-11-26 22:41:52 +08:00
Yuichiro Utsumi
6b8c649876 fix: prevent auto-scrolling from stopping in chat (#28690)
Signed-off-by: Yuichiro Utsumi <utsumi.yuichiro@fujitsu.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-26 22:39:29 +08:00
GuanMu
af587f3869 chore: update packageManager version to pnpm@10.23.0 (#28708) 2025-11-26 22:37:05 +08:00
QuantumGhost
1c1f124891 Enhanced GraphEngine Pause Handling (#28196)
This commit: 

1. Convert `pause_reason` to `pause_reasons` in `GraphExecution` and relevant classes. Change the field from a scalar value to a list that can contain multiple `PauseReason` objects, ensuring all pause events are properly captured.
2. Introduce a new `WorkflowPauseReason` model to record reasons associated with a specific `WorkflowPause`.

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-26 19:59:34 +08:00
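A minimal sketch of the scalar-to-list change described in the commit body, with simplified stand-ins for `GraphExecution` and `PauseReason` (the enum values are assumptions, not Dify's actual ones):

```python
from dataclasses import dataclass, field
from enum import Enum

class PauseReasonType(Enum):
    HUMAN_INPUT_REQUIRED = "human_input_required"  # hypothetical values
    SCHEDULED = "scheduled"

@dataclass
class PauseReason:
    type: PauseReasonType
    node_id: str

@dataclass
class GraphExecution:
    # Previously a single optional pause_reason; now a list, so every pause
    # event raised during a run is captured rather than only the first.
    pause_reasons: list[PauseReason] = field(default_factory=list)

    def record_pause(self, reason: PauseReason) -> None:
        self.pause_reasons.append(reason)
```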
-LAN-
b353a126d8 chore: bump version to 1.10.1 (#28696) 2025-11-26 18:32:10 +08:00
Joel
ef0e1031b0 perf: reduce the number of useNodes re-renders (#28682)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-26 16:52:47 +08:00
Eric Guo
d7010f582f Fix 500 error in knowledge base when selecting weightedScore and clicking retrieve. (#28586)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-26 16:44:00 +08:00
-LAN-
d696b9f35e Use pnpm dev in dev/start-web (#28684) 2025-11-26 16:24:01 +08:00
Ethan Lee
665d49d375 Fixes session scope bug in FileService.delete_file (#27911)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-11-26 16:21:33 +08:00
Charles Yao
31109735a3 fix: add dependency 2025-11-26 01:31:10 -06:00
Charles Yao
df025ac400 Merge branch 'feat/end-user-oauth' into feat/end-user-add-dsl-field 2025-11-26 01:30:18 -06:00
Charles Yao
f4a3e290cb fix: change end user table to uuidv7 (#28689) 2025-11-26 01:29:15 -06:00
Charles Yao
66e85ca16c fix: change end user table to user uuid v7 2025-11-26 01:27:07 -06:00
-LAN-
26a1c84881 chore: upgrade system libraries and Python dependencies (#28624)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Xiyuan Chen <52963600+GareArc@users.noreply.github.com>
2025-11-26 15:25:28 +08:00
Coding On Star
dbecba710b frontend auto testing rules (#28679)
Co-authored-by: CodingOnStar <hanxujiang@dify.ai>
Co-authored-by: 姜涵煦 <hanxujiang@jianghanxudeMacBook-Pro.local>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-26 15:18:07 +08:00
Charles Yao
ac8136cca4 ECO-184: add new dsl field to tool node, ready for enduser auth 2025-11-26 01:13:57 -06:00
Charles Yao
8c111de6a9 ECO-184: add new dsl field to tool node, ready for enduser auth 2025-11-26 01:13:35 -06:00
CrabSAMA
591414307a fix: workflow-as-tool files field returning empty (#27925)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
2025-11-26 14:00:36 +08:00
非法操作
1241cab113 chore: enhance the hint when the user triggers an invalid webhook request (#28671) 2025-11-26 14:00:16 +08:00
wangxiaolei
490b7ac43c fix: feedback like/dislike not displayed in logs (#28652) 2025-11-26 13:59:47 +08:00
Charles Yao
d784a0432c fix: migration file conflicts (#28665) 2025-11-25 21:27:26 -06:00
Gritty_dev
0f521b26ae Feat/add test script for tool models (#28653) 2025-11-26 09:43:39 +08:00
aka James4u
e4ec4e1470 test: add comprehensive unit tests for SegmentService - Fix: #28656 (#28568)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-26 09:43:00 +08:00
yangzheli
203c2f0456 feat(web): update marketplace description & icon (#28662) 2025-11-26 09:42:09 +08:00
Charles Yao
1636b228db ECO-183: fix migration file conflicts 2025-11-25 17:24:49 -06:00
yangzheli
b502d30e77 fix(web): resolve readme-panel display and styling issues (#28658) 2025-11-26 02:21:50 +09:00
Kevin9703
a486c47b1e fix: ensure advanced-chat workflows stop correctly (#27803)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-25 20:09:03 +08:00
墨绿色
f76a3f545c Feat/add weaviate tokenization configurable (#28159)
Co-authored-by: lijiezhao <lijiezhao@perfect99.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-25 20:07:45 +08:00
Asuka Minato
b5650b579d fix [Chore/Refactor] Generate complete API documentation using Flask-RESTX #24421 (#28649)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-25 20:04:27 +08:00
Byron.wang
83702762c8 use no-root user in docker image by default (#26419) 2025-11-25 19:59:45 +08:00
Xiu-Lan
abc13ef762 Feat/web workflow improvements (#27981)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: johnny0120 <johnny0120@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Wood <tuiskuwood@outlook.com>
2025-11-25 19:54:40 +08:00
Yeuoly
ce00388278 fix(TriggerProviderIdentity): avoid nullable tags (#28646) 2025-11-25 19:37:06 +08:00
非法操作
4a76318877 fix: draft-running any node raises 500 (#28636) 2025-11-25 18:09:02 +08:00
yyh
e073e755f9 Fix start tab marketplace trigger search and plugin list scroll (#28645) 2025-11-25 18:08:46 +08:00
Novice
57b405c4c2 fix(style): update ExternalDataToolModal to support dark mode using semantic tokens (#28630) 2025-11-25 15:58:43 +08:00
非法操作
2181ffdc89 fix: chatflow log details always navigate to page first (#28626) 2025-11-25 15:54:15 +08:00
zhsama
d5beccf0da feat: add deployment workflow for end-user development environment 2025-11-25 15:05:55 +08:00
yyh
82dac2eba0 chore: add missing translations (#28631) 2025-11-25 14:52:17 +08:00
zhsama
e36d460d67 Merge branch 'refs/heads/origin-main' into feat/end-user-oauth 2025-11-25 14:38:40 +08:00
yyh
58be008676 chore: refactor i18n scripts and remove extra keys (#28618) 2025-11-25 13:23:19 +08:00
Charles Yao
68f195c5e9 Merge branch 'main' into feat/end-user-oauth 2025-11-24 21:29:42 -06:00
Jax
eed38c8b2a Fix(workflow): Prevent token overcount caused by loop/iteration (#28406) 2025-11-25 09:56:59 +08:00
NeatGuyCoding
6bd114285c fix: i18n: executions translation (#28610)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-25 09:39:17 +08:00
Gritty_dev
25698ccd54 Feat/test workflow models (#28604) 2025-11-25 09:38:27 +08:00
Charles Yao
200e801182 ECO-171: remove unused index statement from migration script 2025-11-24 11:12:46 -06:00
Charles Yao
5bcf5be874 ECO-171: remove unused index statement from migration script 2025-11-24 11:11:15 -06:00
Charles Yao
b60ba0b192 Merge branch 'main' into feat/end-user-oauth 2025-11-24 11:04:39 -06:00
Maries
bb3aa0178d fix: update plugin verification logic to use unique identifier instea… (#28608) 2025-11-25 00:40:25 +08:00
Asuka Minato
751ce4ec41 more typed orm (#28577)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-24 21:01:46 +08:00
NeatGuyCoding
da98a38b14 fix: i18n: standardize trigger events terminology in billing translations (#28543)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-24 21:01:32 +08:00
yyh
034e3e85e9 Fix Node.js SDK routes and multipart handling (#28573) 2025-11-24 21:00:40 +08:00
changkeke
aab95d0626 fix: Failed to load API definition (#28509)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
2025-11-24 21:44:09 +09:00
Joel
15ea27868e pref: workflow (#28591) 2025-11-24 17:02:18 +08:00
Charles Yao
26dc4d43bf feat: add new table of end user oauth (#28351) 2025-11-24 01:28:40 -06:00
Charles Yao
0e355079fa ECO-171: removing duplicated index 2025-11-24 01:26:54 -06:00
Charles Yao
b51bf33b1e ECO-171: adding comments for expire_at as clarification 2025-11-24 00:02:37 -06:00
Charles Yao
30594978f9 ECO-171: update to be compatible with mysql 2025-11-23 23:43:59 -06:00
-LAN-
2e2d7a5345 Apply suggestion from @gemini-code-assist[bot]
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-24 13:23:31 +08:00
Charles Yao
cf2457a03c ECO-171: update credential cast 2025-11-23 23:17:24 -06:00
Charles Yao
4590dab046 ECO-171: update migration script with sql alchemy auto naming 2025-11-23 23:10:24 -06:00
Charles Yao
f2df8af4c8 ECO-171: update to use server default time stamp 2025-11-23 23:00:10 -06:00
NeatGuyCoding
bcbd3de336 fix: i18n: stop running translation (#28571)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-24 12:45:06 +08:00
ice
a0daab2711 feat(seo): add meaningful <h1> headings across all public pages (#28569)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-24 12:42:04 +08:00
Charles Yao
5b93ed3fcd ECO-171: update unique constraints 2025-11-23 21:29:23 -06:00
非法操作
e1d11681c0 fix: plugin auto update display issues (#28564) 2025-11-24 11:08:40 +08:00
wangxiaolei
8a995d0c21 chore: not using db.session.get (#28555)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 11:06:06 +08:00
Asuka Minato
6241b87f90 more typed orm (#28519)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-24 10:50:20 +08:00
Gritty_dev
2c9e435558 feat: complete app models test script (#28549)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-24 10:50:09 +08:00
诗浓
b12057b7e5 fix: add COMPOSE_PROFILES param to middleware.env.example file (#28541)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 10:49:33 +08:00
Charles Yao
5fd83e9425 ECO-171: remove postgresql specific features 2025-11-23 20:41:13 -06:00
yyh
2445d04d19 chore: fix de-DE translations (#28552) 2025-11-24 10:11:19 +08:00
dependabot[bot]
a58986eb06 chore(deps): bump clickhouse-connect from 0.7.19 to 0.10.0 in /api (#28559)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-24 10:11:00 +08:00
aka James4u
a39b151d5f feat: add comprehensive unit tests for dataset service retrieval/list… (#28539)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 10:08:43 +08:00
Chen Jiaju
3841e8578f fix: use default values for optional workflow input variables (#28546) (#28527)
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-24 10:08:26 +08:00
Asuka Minato
e0824c2d93 api -> console_ns (#28246)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-24 10:04:11 +08:00
github-actions[bot]
c75a4e6309 chore: translate i18n files and update type definitions (#28528)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: yyh <92089059+lyzno1@users.noreply.github.com>
2025-11-23 15:47:57 +08:00
github-actions[bot]
1dfde240cb chore: translate i18n files and update type definitions (#28518)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-22 13:54:08 +08:00
Yuki Watanabe
c6e6f3b7cb feat: MLflow tracing (#26093)
Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-22 13:53:58 +08:00
aka James4u
ea320ce055 feat: add comprehensive unit tests for dataset service creation methods (#28522)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-22 13:38:35 +08:00
Charles Yao
8f6937eea6 ECO-171: resolving comments 2025-11-21 17:05:16 -06:00
Charles Yao
74fc026fe0 merge main 2025-11-21 09:12:46 -06:00
55Kamiryo
6d3ed468d8 feat: add a stop run button to the published app UI (#27509)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-21 22:26:30 +08:00
Asuka Minato
a6c6bcf95c more typed orm (#28507) 2025-11-21 21:45:51 +08:00
Gritty_dev
63b8bbbab3 feat: complete test script for dataset models (#28512)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 21:37:25 +08:00
goofy
33ff01d04c Support nodes referencing multiple structured_output variables in single-step run (#26661)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-21 17:54:57 +08:00
Charles Liu
ae126fd56f Fix/24655 (#26527)
Co-authored-by: charles liu <dearcharles.liu@gmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-21 17:49:12 +08:00
非法操作
9fed2dc065 fix: Code editor throws dozens of errors (#28500) 2025-11-21 16:48:35 +08:00
wangxiaolei
2e0964e0b0 fix(api): SegmentType.is_valid() raises AssertionError for SegmentType.GROUP (#28249)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 16:43:09 +09:00
Asuka Minato
237bb4595b more typed orm (#28494)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-21 16:42:27 +09:00
耐小心
4486b54680 Clean up legacy conditions data in if-else nodes to prevent misjudgments (#28148)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-21 14:26:57 +08:00
Asuka Minato
1a2f8dfcb4 use deco (#28153)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-21 14:25:53 +08:00
Asuka Minato
3c30d0f41b more typed orm (#28331)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 14:23:32 +09:00
Charles Yao
bbd466eaba ECO-171: fix comments 2025-11-20 23:08:00 -06:00
GuanMu
5f61ca5e6f feat: Implement partial update for document metadata, allowing merging of new values with existing ones. (#28390)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 12:58:20 +08:00
耐小心
06466cb73a fix: fix numeric type conversion issue in if-else condition comparison (#28155)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 12:58:08 +08:00
Charles Yao
153609b968 ECO-171: remove server defaults 2025-11-20 22:57:37 -06:00
Gritty_dev
c5b6219006 Feat/add test script for account models (#28479) 2025-11-21 12:54:50 +08:00
znn
ae5b5a6aa9 disable sticky scroll (#28248) 2025-11-21 11:24:26 +08:00
yangzheli
a4c4d18f42 fix(api): add session_id validation for webapp JWT authentication (#28297)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 11:23:52 +08:00
Charles Yao
6cd7ab4719 Update api/models/tools.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-20 21:18:25 -06:00
Charles Yao
39de9e7248 Update api/models/tools.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-20 21:18:14 -06:00
Charles Yao
5e93a61865 ECO-171: edited migration script 2025-11-20 21:11:12 -06:00
Charles Yao
76069b5d6d ECO-171: add migration script 2025-11-20 21:11:12 -06:00
Asuka Minato
3cf19dc07f add two test examples (#28236)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-21 10:36:41 +08:00
github-actions[bot]
73c58e4cbb chore: translate i18n files and update type definitions (#28478)
Co-authored-by: asukaminato0721 <30024051+asukaminato0721@users.noreply.github.com>
2025-11-21 10:35:04 +08:00
张哲芳
c2043d0f6d fix: allow API to access conversations created before upgrade to 1.10.0 (#28462)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-21 10:34:55 +08:00
wangxiaolei
cad2991946 feat: support redis 7.0 shared pub and sub (#28333) 2025-11-21 10:33:52 +08:00
GuanMu
e260815c5e fix: adjust overflow styles in EditMetadataBatchModal for better layout (#28445) 2025-11-21 10:30:52 +08:00
lyzno1
b4e7239ac7 fix: correct trigger events limit modal wording (#28463) 2025-11-21 03:23:08 +09:00
lyzno1
4b6f4081d6 fix: treat -1 as unlimited for API rate limit and trigger events (#28460) 2025-11-21 03:22:00 +09:00
Maries
d1c9183d3b fix: correct monitor and fix trigger billing rate limit (#28465) 2025-11-20 20:37:10 +08:00
Yeuoly
2f9705eb6f refactor: remove TimeSliceLayer before the release of HITL (#28441)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-20 18:20:20 +08:00
lyzno1
0e3fab1f9f fix: add missing particle in Japanese trigger events translation (#28452) 2025-11-20 16:59:30 +08:00
hj24
2431ddfde6 Feat integrate partner stack (#28353)
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 15:58:05 +08:00
Junyan Qin (Chin)
1e4e963d8c chore: update celery command for debugging trigger (#28443) 2025-11-20 15:43:22 +08:00
17hz
522508df28 fix: add app_id to Redis cache keys for trigger nodes to ensure uniqueness (#28243)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 13:34:05 +08:00
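The fix above boils down to widening the cache-key namespace. A sketch of the idea with an illustrative key layout (not Dify's actual format):

```python
# Scoping trigger-node cache keys by app_id so two apps that happen to reuse
# the same node id can no longer collide in Redis.
def trigger_cache_key(app_id: str, node_id: str) -> str:
    return f"trigger:node:{app_id}:{node_id}"

# Before the fix, a key shaped like f"trigger:node:{node_id}" was shared across apps.
assert trigger_cache_key("app-a", "n1") != trigger_cache_key("app-b", "n1")
```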
17hz
859f73c19d fix: add .ts and .mjs to EditorConfig indent rules (#28397)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 13:27:00 +08:00
17hz
82c11e36ea fix: remove deprecated UnsafeUnwrappedHeaders usage (#28219)
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-20 13:20:41 +08:00
yangzheli
a6cd2ad880 fix(web): remove StatusPanel's internal useStore to fix context issues (#28348) 2025-11-20 12:50:46 +08:00
Gritty_dev
b2a604b801 Add Comprehensive Unit Tests for Console Auth Controllers (#28349)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 12:50:16 +08:00
CrabSAMA
7c060fc35c fix: lazy init audioplayer to fix bug where a message without TTS also switched the audio source (#28433) 2025-11-20 12:48:11 +08:00
GuanMu
48e39b60a8 fix: update table alias in document service display status test asser… (#28436) 2025-11-20 12:47:45 +08:00
Chen Jiaju
f038aa4746 fix: resolve CSRF token cookie name mismatch in browser (#28228) (#28378)
Co-authored-by: Claude <noreply@anthropic.com>
2025-11-20 11:40:35 +08:00
yangzheli
4833d39ab3 fix(workflow): validate node compatibility when importing dsl between chatflows and workflows (#28012) 2025-11-20 11:40:24 +08:00
Anubhav Singh
fa910be0f6 Fix duration displayed for workflow steps on Weave dashboard (#28289) 2025-11-20 11:37:01 +08:00
yangzheli
bc274e7300 refactor(web): remove redundant dataset card-item components and related code (#28199) 2025-11-20 11:36:41 +08:00
yihong
7b1fc4d2e6 fix: add `make test` shortcut for backend unit tests (#28380)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-11-20 11:33:42 +08:00
github-actions[bot]
204d5f1bb9 chore: translate i18n files and update type definitions (#28429)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:32:55 +08:00
Will
8fc1c7d994 chore: remove redundant reimports (#28415)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
2025-11-20 11:28:29 +08:00
yangzheli
879869d3e3 fix(web): fix checkbox unselectable bug & optimize document-list/app-annotation styles (#28244) 2025-11-20 11:28:20 +08:00
GuanMu
1d2cdf3489 feat: add display status filtering to document list and API (#28342)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:27:44 +08:00
yangzheli
a5d0e68675 feat(workflow): optimize workflow canvas pan and scroll behavior (#28250)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:27:30 +08:00
github-actions[bot]
605e543372 chore: translate i18n files and update type definitions (#28425)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:26:49 +08:00
-LAN-
c432f601ab fix: change TenantApi endpoint from GET to POST (#27858)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:22:37 +08:00
lyzno1
e8d03a422d fix: improve email code sign-in experience (#28307) 2025-11-20 11:19:15 +08:00
Novice
6be013e072 feat: implement RFC-compliant OAuth discovery with dynamic scope selection for MCP providers (#28294)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-20 11:18:16 +08:00
znn
014cbaf387 make the question classifier node expandable/collapsible (#26772)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-11-20 11:17:34 +08:00
XlKsyt
1be38183e5 fix(frontend): add missing vertical type to divider in provider config modal (#28387)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-20 11:17:04 +08:00
ice
8bab42e224 style(web): fix vertical alignment of search button on apps page (#28398) 2025-11-20 11:14:09 +08:00
wangxiaolei
99e9fc751b refactor: refactor python sdk (#28118)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-20 11:10:53 +08:00
Maries
a1b735a4c0 feat: trigger billing (#28335)
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-20 10:15:23 +08:00
longbingljw
c0b7ffd5d0 feat: mysql adaptation for metadb (#28188) 2025-11-20 09:44:39 +08:00
Maries
012877d8d4 fix: address user input preparation in workflow app generator (#28410)
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-11-20 02:09:40 +08:00
Jyong
41bb6f3109 Revert "add vdb-test workflow run filter" (#28382)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-19 20:02:24 +08:00
Asuka Minato
adf673d031 Apply suggestion from @gemini-code-assist[bot]
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-19 17:18:50 +09:00
Lloyd-Pottiger
88c9b18cb6 fix(docker): start-up TiFlash (#28376) 2025-11-19 13:59:56 +08:00
-LAN-
6efdc94661 refactor: consume events after pause/abort and improve API clarity (#28328)
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
2025-11-18 19:04:11 +08:00
github-actions[bot]
68526c09fc chore: translate i18n files and update type definitions (#28284)
Co-authored-by: zhsama <33454514+zhsama@users.noreply.github.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-11-18 18:52:36 +08:00
kenwoodjw
a78bc507c0 fix: dataset metadata counts when documents are deleted (#28305)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-11-18 17:36:07 +08:00
Joel
e83c7438cb doc: add doc for env config when site and backend are in different domains (#28318)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-18 17:29:54 +08:00
Jyong
82068a6918 add vdb-test workflow run filter (#28336) 2025-11-18 17:22:15 +08:00
Asuka Minato
108bcbeb7c add cnt script and one more example (#28272)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-18 16:44:14 +09:00
非法操作
c4b02be6d3 fix: published webhook can't receive inputs (#28205) 2025-11-18 11:14:26 +08:00
lyzno1
30eebf804f chore: remove unused style.module.css from app-icon component (#28302) 2025-11-18 10:36:39 +08:00
Yessenia-d
ad7fdd18d0 fix: update currentTriggerPlugin check in BasePanel component (#28287) 2025-11-17 17:19:35 +08:00
zhsama
5d2fbf5215 Perf/mutual node UI (#28282) 2025-11-17 16:23:04 +08:00
非法操作
4a89403566 fix: click log panel of log page cause whole page crash (#28218) 2025-11-14 16:38:43 +09:00
crazywoola
e0c05b2123 add icon for forum (#28164) 2025-11-14 16:38:19 +09:00
lyzno1
85b99580ea fix: card view render (#28189) 2025-11-14 14:16:11 +08:00
lyzno1
15fbedfcad feat: add icon gallery stories (#28214)
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
2025-11-14 13:34:23 +08:00
非法操作
1e6d0de48b fix: knowledge pipeline cannot be published (#28203) 2025-11-14 09:47:37 +08:00
Anubhav Singh
cad751c00c Upgrade weave version to fix weave configuration failure (#28197) 2025-11-14 09:47:21 +08:00
Maries
a47276ac24 chore: bump to 1.10.0 (#28186)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-13 22:36:04 +08:00
yangzheli
20403c69b2 refactor(web): remove redundant add-tool-modal components and related code (#27996) 2025-11-13 20:21:04 +08:00
hoffer
ffc04f2a9b fix: StreamableHTTPTransport got invalid json exception when receiving a ping event from MCP server #28111 (#28116) 2025-11-13 20:19:48 +08:00
Asuka Minato
d1580791e4 TypedBase + TypedDict (#28137)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-13 20:18:51 +08:00
NeatGuyCoding
c74eb4fcf3 minor fix(rag): return early when pushing empty tasks to avoid Redis DataError (#28027)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-13 20:18:11 +08:00
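The early-return fix above guards a batch push: redis-py refuses an `rpush` call with zero values (the `DataError` the commit names). A minimal sketch, assuming a plain `redis.Redis` client:

```python
import redis

r = redis.Redis()

def push_tasks(queue_key: str, tasks: list[str]) -> int:
    if not tasks:  # empty batch: nothing to enqueue, avoid redis DataError
        return 0
    return r.rpush(queue_key, *tasks)
```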
NeatGuyCoding
a798534337 fix(web): fix unit promotion in formatNumberAbbreviated (#27918)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-13 20:17:26 +08:00
GuanMu
470883858e fix: adjust padding in AgentNode and NodeComponent for consistent layout (#28175) 2025-11-13 20:16:56 +08:00
GuanMu
4f4911686d fix: update start-worker alias to include additional queues for bette… (#28179) 2025-11-13 20:16:44 +08:00
GuanMu
6d479dcdbb fix: update package manager version to 10.22.0 (#28181) 2025-11-13 20:16:00 +08:00
zhsama
24348c40a6 feat: enhance start node metadata to be undeletable in chat mode (#28173) 2025-11-13 18:11:15 +08:00
yihong
a39b50adbb fix: skip tests if no database is running (#28102)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-13 15:57:13 +08:00
李龙飞
81832c14ee Fix: Correctly handle merged cells in DOCX tables to prevent content duplication and loss (#27871)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-13 15:56:24 +08:00
zhsama
b86022c64a feat: add draft trigger detection to app model and UI (#28163)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-13 15:43:58 +08:00
breath57
45e816a9f6 fix(knowledge-base): regenerate child chunks not working completely (#27934) 2025-11-13 15:36:27 +08:00
Joel
667b1c37a3 fix: can still invite when api is pending (#28161) 2025-11-13 15:28:32 +08:00
Chen Yu
b75d533f9b fix(moderation): change OpenAI moderation model to omni-moderation-la… (#28119)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-13 15:21:44 +08:00
CrabSAMA
aece55d82f fix: error when clearing values of INTEGER and FLOAT types (#27954)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-13 15:21:34 +08:00
kenwoodjw
c432b398f4 fix: missing pipeline_templates.json when HOSTED_FETCH_PIPELINE_TEMPLATES_MODE is builtin (#27946)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-13 15:04:35 +08:00
katakyo
9cb2645793 fix: update input field width for retry configuration in RetryOnPanel (#28142) 2025-11-13 15:00:22 +08:00
ye4241
6ac61bd585 fix: correct spelling of "模板" in translation files (#28151) 2025-11-13 14:58:10 +08:00
非法操作
b02165ffe6 fix: inconsistent behaviour of zoom in button and shortcut (#27944) 2025-11-13 14:37:27 +08:00
Asuka Minato
6c576e2c66 add doc (#28016)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-13 13:38:45 +09:00
yangzheli
b0e7e7752f refactor(web): reuse the same edit-custom-collection-modal component, and fix the pop up error (#28003) 2025-11-13 11:44:21 +08:00
mnasrautinno
2799b79e8c fix: app's ai site text to speech api (#28091) 2025-11-13 11:44:04 +08:00
Harry
2fa6684c4d Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-13 10:59:57 +08:00
Maries
805a1479f9 fix: simplify graph structure validation in WorkflowService (#28146)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-13 10:59:31 +08:00
-LAN-
fe6538b08d chore: disable workflow logs auto-cleanup by default (#28136)
This PR changes the default value of `WORKFLOW_LOG_CLEANUP_ENABLED` from `true` to `false` across all configuration files.

## Motivation

Setting the default to `false` provides safer default behavior by:

- Preventing unintended data loss for new installations
- Giving users explicit control over when to enable log cleanup
- Following the opt-in principle for data deletion features

Users who need automatic cleanup can enable it by setting `WORKFLOW_LOG_CLEANUP_ENABLED=true` in their configuration.
2025-11-12 22:55:02 +08:00
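A sketch of what the opt-in default looks like in a pydantic-settings style config class (the class name is illustrative; the field name matches the documented env var):

```python
from pydantic_settings import BaseSettings

class WorkflowLogConfig(BaseSettings):
    # Cleanup is now opt-in: new installations keep their workflow logs unless
    # WORKFLOW_LOG_CLEANUP_ENABLED=true is set explicitly in the environment.
    WORKFLOW_LOG_CLEANUP_ENABLED: bool = False

config = WorkflowLogConfig()
if config.WORKFLOW_LOG_CLEANUP_ENABLED:
    print("workflow log auto-cleanup is on")
```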
Asuka Minato
1bbb9d6644 convert to TypeBase (#27935)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-12 21:50:13 +08:00
Gritty_dev
5c06e285ec test: create hooks and utils test scripts, modify clipboard test script (#27928) 2025-11-12 21:47:06 +08:00
Gen Sato
19c92fd670 Add file type validation to paste upload (#28017) 2025-11-12 19:27:56 +08:00
非法操作
6026bd873b fix: variable assigner can't assign float number (#28068) 2025-11-12 19:27:36 +08:00
Bowen Liang
1369119a0c fix: cpu core count determination in basedpyright-check script on macOS (#28058) 2025-11-12 19:27:27 +08:00
Yeuoly
b76e17b25d feat: introduce trigger functionality (#27644)
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: Stream <Stream_2@qq.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
Co-authored-by: zhsama <torvalds@linux.do>
Co-authored-by: Harry <xh001x@hotmail.com>
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: yessenia <yessenia.contact@gmail.com>
Co-authored-by: hjlarry <hjlarry@163.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: WTW0313 <twwu@dify.ai>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-12 17:59:37 +08:00
Yeuoly
87439b8fec Merge main 2025-11-12 17:49:17 +08:00
Yeuoly
08034532f6 Update trigger.py 2025-11-12 17:42:55 +08:00
Maries
c5f47ebccd Merge branch 'main' into feat/trigger 2025-11-12 17:14:35 +08:00
Jyong
ca7794305b add transform-datasource-credentials command online check (#28124)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Garfield Dai <dai.hai@foxmail.com>
2025-11-12 17:13:44 +08:00
Yeuoly
6744306818 Merge branch 'main' into feat/trigger 2025-11-12 17:04:31 +08:00
QuantumGhost
fd255e81e1 feat(api): Introduce WorkflowResumptionContext for pause state management (#28122)
Certain metadata (including but not limited to `InvokeFrom`, `call_depth`, and `streaming`) is required when resuming a paused workflow. However, these fields are not part of `GraphRuntimeState` and were not saved in the previous implementation of `PauseStatePersistenceLayer`.

This commit addresses this limitation by introducing a `WorkflowResumptionContext` model that wraps both the `*GenerateEntity` and `GraphRuntimeState`. This approach provides:

- A structured container for all necessary resumption data
- Better separation of concerns between execution state and persistence
- Enhanced extensibility for future metadata additions
- Clearer naming that distinguishes from `GraphRuntimeState`

The `WorkflowResumptionContext` model makes extending the pause state easier while maintaining backward compatibility and proper version management for the entire execution state ecosystem.

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-12 17:00:02 +08:00
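A rough Pydantic sketch of the wrapper described above; everything beyond the names quoted in the commit (fields, types, the version tag) is an assumption, not the real Dify model:

```python
from pydantic import BaseModel

class GraphRuntimeState(BaseModel):
    # Stand-in for the real runtime state, which carries far more fields.
    node_states: dict[str, str] = {}

class WorkflowGenerateEntity(BaseModel):
    # Stand-in for the *GenerateEntity family mentioned in the commit.
    invoke_from: str
    call_depth: int = 0
    streaming: bool = True

class WorkflowResumptionContext(BaseModel):
    """Single container for everything needed to resume a paused workflow."""
    version: str = "1.0"  # assumed version tag for forward compatibility
    generate_entity: WorkflowGenerateEntity
    graph_runtime_state: GraphRuntimeState

ctx = WorkflowResumptionContext(
    generate_entity=WorkflowGenerateEntity(invoke_from="service-api"),
    graph_runtime_state=GraphRuntimeState(),
)
serialized = ctx.model_dump_json()  # persisted by the pause-state layer
restored = WorkflowResumptionContext.model_validate_json(serialized)
```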
lyzno1
3ff14ccc89 Merge branch 'main' into feat/trigger 2025-11-12 16:49:08 +08:00
Joel
09d31d1263 chore: improve the user experience of not login into apps (#28120) 2025-11-12 16:47:45 +08:00
Yeuoly
9c30f16e4b Update api/tasks/trigger_processing_tasks.py
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-12 16:42:12 +08:00
Harry
c31933c163 chore: downgrade dify-api version to 1.9.2 in uv.lock 2025-11-12 16:20:22 +08:00
Harry
574eb1a10a chore: downgrade version to 1.9.2 in pyproject.toml (ready for merge) 2025-11-12 16:18:19 +08:00
Harry
e6ac783fc3 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-12 16:01:19 +08:00
Jyong
47dc26f011 fix document index test (#28113) 2025-11-12 16:00:10 +08:00
Harry
689a75f44a Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-12 15:07:05 +08:00
湛露先生
123bb3ec08 When a graph_engine worker hits an exception, keep the node_id for deep res… (#26205)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-11-12 15:03:45 +08:00
Joel
90f77282e3 chore: non-SaaS version can query long log time range (#28109) 2025-11-12 14:45:56 +08:00
hjlarry
a25f469bde fix(trigger): Prevent marketplace tool downloads from retriggering on tab change 2025-11-12 13:40:56 +08:00
lyzno1
044ee7ef54 feat(workflow): always render featured tools section 2025-11-12 10:37:39 +08:00
hjlarry
2725f28fa8 fix(trigger): incorrect behavior when node uninstalled on the canvas 2025-11-12 10:12:49 +08:00
Charles Yao
c493e08df1 add new table for end user oauth 2025-11-11 20:05:11 -06:00
zhsama
36ad784251 feat(workflow-header): add conditional logic to disable publish and refresh actions based on workflow node presence 2025-11-11 20:08:09 +08:00
lyzno1
0a39e5c092 Fix modal query sync for settings & pricing 2025-11-11 19:32:48 +08:00
lyzno1
9169a5e35b Merge branch 'main' into feat/trigger 2025-11-11 18:05:23 +08:00
Jyong
5208867ccc fix document enable (#28081) 2025-11-11 17:50:45 +08:00
zhsama
bfdcb79e19 feat(card-view): enhance CardView to conditionally render AppCards based on trigger node presence in workflow 2025-11-11 16:54:06 +08:00
zhsama
c37cce000f refactor: replace TRIGGER_NODE_TYPES with isTriggerNode utility for improved node type checks across workflow components 2025-11-11 16:54:06 +08:00
autofix-ci[bot]
6d3fb9b769 [autofix.ci] apply automated fixes 2025-11-11 08:37:13 +00:00
Harry
c04913ecf8 refactor(tests): remove redundant graph validation tests from WorkflowService unit tests
- Deleted tests for graph initialization and error propagation that were deemed unnecessary.
- Cleaned up the test suite to improve maintainability and focus on essential validation scenarios.
2025-11-11 16:35:19 +08:00
lyzno1
3a84a64c32 refactor: unify account setting tab constants and tighten modal types 2025-11-11 16:16:41 +08:00
lyzno1
b344d4add1 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-11 16:16:22 +08:00
lyzno1
405a4ec9f8 feat: add URL parameter support for settings modal using action=showSettings 2025-11-11 16:16:06 +08:00
lyzno1
edc7ccc795 chore: add type-check to pre-commit (#28005) 2025-11-11 16:14:39 +08:00
Harry
8bb11a588c fix(tests): fix end node missing outputs 2025-11-11 16:14:00 +08:00
Harry
44f451bd7d refactor(api): improve graph validation logic in WorkflowService
- Updated the validate_graph_structure method to handle empty graph cases gracefully.
- Introduced a variable for workflow_id to ensure consistent handling of unknown workflow IDs.
- Enhanced code readability and maintainability by refining the method's structure.
2025-11-11 16:14:00 +08:00
zhsama
35d914e755 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-11-11 15:22:39 +08:00
zhsama
9c37f8c1cb feat(trigger): add support for trigger nodes and user input node management in workflow components 2025-11-11 15:21:04 +08:00
lyzno1
9de0e3c3a7 fix: add missing TimePicker type definitions for notClearable, triggerFullWidth, showTimezone and placement 2025-11-11 15:18:14 +08:00
lyzno1
707c94f86e feat: add URL parameter support for pricing modal using action=showPricing 2025-11-11 15:15:40 +08:00
lyzno1
81afd087f6 feat: add trigger events, workflow execution, and start nodes to billing plan features
- Add three new feature items to cloud plan list:
  - Trigger Events (varies by plan: 3K for sandbox, 20K/month for pro, unlimited for team)
  - Workflow Execution (standard/faster/priority based on plan)
  - Start Nodes (limited to 2 for sandbox, unlimited for pro/team)
- Add i18n translations for en-US and zh-Hans
- Position new items below document processing priority and above divider
2025-11-11 15:00:25 +08:00
lyzno1
0f952f328f feat: add api rate limit and trigger events billing card 2025-11-11 15:00:25 +08:00
autofix-ci[bot]
50619fba0a [autofix.ci] apply automated fixes 2025-11-11 06:54:03 +00:00
Harry
aad31bb703 feat(api): enhance workflow validation and structure checks
- Added a new validation class to ensure that trigger nodes do not coexist with UserInput (start) nodes in the workflow graph.
- Implemented a method in WorkflowService to validate the graph structure before persisting workflows, leveraging the new validation logic.
- Updated unit tests to cover the new validation scenarios and ensure proper error propagation.
2025-11-11 14:52:13 +08:00
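A hedged sketch of the structural check described above: reject a graph that mixes trigger nodes with a UserInput (start) node. The node-type strings and the function shape are assumptions:

```python
# Assumed node-type identifiers; the real ids live in the Dify node registry.
TRIGGER_NODE_TYPES = {"trigger-webhook", "trigger-schedule", "trigger-event"}

def validate_graph_structure(graph: dict) -> None:
    """Raise before persisting if trigger nodes coexist with a start node."""
    node_types = {node.get("data", {}).get("type") for node in graph.get("nodes", [])}
    if node_types & TRIGGER_NODE_TYPES and "start" in node_types:
        raise ValueError("Trigger nodes cannot coexist with a UserInput (start) node.")
```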
hjlarry
7484a020e1 fix(trigger): subscription schema using bool field causes pydantic error 2025-11-11 14:05:11 +08:00
autofix-ci[bot]
186828c13a [autofix.ci] apply automated fixes 2025-11-11 04:47:22 +00:00
Harry
203fb95391 chore(api): update dependencies and default queue configurations
- Updated `revision` in `uv.lock` from 3 to 2.
- Added `croniter` package version 6.0.0 with dependencies in `uv.lock`.
- Updated `dify-api` version to 1.10.0rc1 and added `croniter` as a dependency.
- Modified default queue names in `entrypoint.sh` for both CLOUD and SELF_HOSTED editions to include `priority_dataset`.
2025-11-11 12:45:02 +08:00
Harry
a94e650ffd Merge remote-tracking branch 'origin/main' into feat/trigger
# Conflicts:
#	api/docker/entrypoint.sh
#	api/uv.lock
#	dev/start-worker
#	docker/.env.example
#	docker/docker-compose.yaml
#	web/app/(commonLayout)/app/(appDetailLayout)/[appId]/overview/chart-view.tsx
#	web/app/components/base/date-and-time-picker/date-picker/index.tsx
#	web/app/components/base/date-and-time-picker/types.ts
2025-11-11 12:42:01 +08:00
Ali Saleh
c9798f6425 fix(api): Trace Hierarchy, Span Status, and Broken Workflow for Arize & Phoenix Integration (#27937)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-11 11:49:19 +08:00
crazywoola
20ecf7f1d0 chore: remove unused enterprise bot from the readme (#28073)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-11 10:52:27 +08:00
github-actions[bot]
9dcb780fcb chore: translate i18n files and update type definitions (#28054)
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
2025-11-11 09:32:53 +08:00
Will
1cb7b09933 chore: Remove trailing space from migration filename (#28040) 2025-11-11 09:32:42 +08:00
Joel
2c62a77cf4 Chore: change query log time range (#28052)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-10 18:39:12 +08:00
QuantumGhost
b9bc48d8dd feat(api): Introduce Broadcast Channel (#27835)
This PR introduces a `BroadcastChannel` abstraction with broadcasting and at-most-once delivery semantics, serving as the communication component between Celery workers and the API server.

It also includes a reference implementation backed by Redis PubSub.

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-10 17:23:21 +08:00
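A compact sketch of what a Redis-PubSub-backed channel with these semantics can look like: every live subscriber sees each message (broadcast), and a consumer that is not subscribed at publish time simply misses it (at-most-once). Class and method names are assumptions, not the actual Dify interface:

```python
import redis  # pip install redis

class RedisBroadcastChannel:
    """Broadcast with at-most-once delivery: messages published while nobody
    is listening are dropped, never queued."""

    def __init__(self, client: redis.Redis, topic: str) -> None:
        self._client = client
        self._topic = topic

    def publish(self, payload: str) -> None:
        self._client.publish(self._topic, payload)

    def subscribe(self):
        pubsub = self._client.pubsub(ignore_subscribe_messages=True)
        pubsub.subscribe(self._topic)
        for message in pubsub.listen():  # blocks; yields raw payloads
            yield message["data"]

# Usage: the API server calls publish(); each Celery worker iterates subscribe().
```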
Will
ed234e311b fix workflow default updated_at (#28047) 2025-11-10 18:20:38 +09:00
hjlarry
00fdd06179 fix(trigger): subscription schema config does not display field description 2025-11-10 13:43:56 +08:00
huangzhuo1949
9843fec393 fix: elasticsearch_vector version (#28028)
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-10 13:17:13 +09:00
lyzno1
62fbc90389 refactor: update free plan rate limit description in pricing modal 2025-11-10 10:48:32 +08:00
Will
aa4cabdeb5 feat: Add Audio Content Support for MCP Tools (#27979) 2025-11-10 10:12:11 +08:00
NeatGuyCoding
eea713b668 Fix typo in weaviate comment, improve time test precision, and add security tests for get-icon utility (#27919)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-10 10:11:54 +08:00
hjlarry
f19a21da11 fix prepare userinput logic 2025-11-10 09:59:20 +08:00
dependabot[bot]
fc62538a94 chore(deps): bump scipy-stubs from 1.16.2.3 to 1.16.3.0 in /api (#28025)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-11-10 09:54:56 +08:00
Asuka Minato
7994144df7 add onupdate=func.current_timestamp() (#28014)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-10 01:48:52 +09:00
Kenn
e153c483b6 fix: the model list encountered two children with the same key (#27956)
Co-authored-by: haokai <haokai@shuwen.com>
2025-11-09 21:39:59 +08:00
wangxiaolei
422bb4d4bb fix: https://github.com/langgenius/dify/issues/27939 (#27985) 2025-11-09 21:39:05 +08:00
OneZero-Y
87a80d7613 docs: clarify how to obtain workflow_id for version execution (#28007)
Signed-off-by: OneZero-Y <aukovyps@163.com>
2025-11-09 21:38:06 +08:00
zhsama
7401792063 feat(last-run): add handling for Listening status in run result calculation 2025-11-07 16:39:30 +08:00
kenwoodjw
e91105ca87 fix: bump brotli to 1.2.0, resolving CVE-2025-6176 (#27950)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-11-07 15:57:29 +08:00
zhsama
79e46c8a81 feat(step-run): add resolvedStatus calculation for improved run result handling 2025-11-07 15:19:40 +08:00
zhsama
7658c92cf9 feat(trigger): improve trigger node in useOneStepRun for getting system variables 2025-11-07 13:17:57 +08:00
hj24
37903722fe refactor: implement tenant self queue for rag tasks (#27559)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-11-06 21:25:50 +08:00
QuantumGhost
f4c82d0010 fix(api): fix VariablePool.get adding unexpected keys to variable_dictionary (#26767)
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-06 18:30:35 +08:00
NFish
fe50093c18 fix: prevent fetch version info in enterprise edition (#27923) 2025-11-06 17:59:53 +08:00
Jyong
4317af1e90 fix jina reader transform (#27922) 2025-11-06 17:35:53 +08:00
zhsama
85a5c78b80 feat: enhance workflow log components with detailed trigger metadata and type safety 2025-11-06 16:16:34 +08:00
lyzno1
9d7b47c784 trigger CI
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
2025-11-06 15:41:54 +08:00
lyzno1
e4c6ed9c60 Merge branch 'main' into feat/trigger 2025-11-06 15:25:19 +08:00
zhsama
fcfade4778 fix: update checkValidFns type to Partial and ensure error message fallback in useOneStepRun 2025-11-06 14:51:32 +08:00
zhsama
000e8bd12b fix: improve error handling in useOneStepRun and useWorkflowRun to provide structured error messages 2025-11-06 14:05:45 +08:00
Harry
ed8da2c760 fix: return structured error response for PluginInvokeError in workflow trigger APIs 2025-11-06 12:41:17 +08:00
zhsama
fb6dc14e9b refactor: simplify syncWorkflowDraft parameters 2025-11-06 12:19:09 +08:00
hjlarry
77e6e98234 fix CI 2025-11-06 09:46:43 +08:00
red_sun
61a0fcc2ea fix: agent puts out the output of workflow-tool twice (#26835) (#27087) 2025-11-06 09:41:05 +08:00
lyzno1
fb3699ec5e fix(web): resolve type checks in app operations 2025-11-05 20:08:59 +08:00
Jyong
f627348b11 fix jina reader credential migration command (#27883) 2025-11-05 18:42:07 +08:00
zhsama
4601be8b67 fix: update hasConnectedUserInput function to use specific types for nodes and edges 2025-11-05 18:12:57 +08:00
Cursx
87fb9a6b69 fix Version 2.0.0-beta.2: Chat annotations Api Error #25506 (#27206)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
2025-11-05 17:37:19 +08:00
Yongtao Huang
97a2e2ec2e Fix: correct DraftWorkflowApi.post response model (#27289)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 17:20:40 +08:00
Boris Polonsky
68d357d7f6 Add WEAVIATE_GRPC_ENDPOINT as designed in weaviate migration guide (#27861)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-05 17:19:08 +08:00
zhsama
cc4d4adfb9 feat: enhance workflow draft processing by adding hydration and sanitization functions 2025-11-05 16:54:49 +08:00
lyzno1
6a08623949 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-05 16:50:53 +08:00
Harry
f0127ffc9a feat: enhance trigger metadata structure by adding type field and updating trigger_metadata handling 2025-11-05 16:49:29 +08:00
Harry
f1e513830c feat: implement logging for failed trigger invocations in workflow processing 2025-11-05 16:49:29 +08:00
Harry
7de533a643 feat: add start-web script to automate web project setup and execution 2025-11-05 16:49:29 +08:00
crazywoola
a103ad3ee7 bump vite to 6.4.1 (#27877) 2025-11-05 16:33:19 +08:00
wangjifeng
f65d5a9761 Fix/template transformer line number (#27867)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-05 15:21:47 +08:00
github-actions[bot]
6e0a5f5bbd chore: translate i18n files and update type definitions (#27868)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-05 15:17:53 +08:00
crazywoola
22f858152f feat: change feedback to forum (#27862) 2025-11-05 14:51:57 +08:00
autofix-ci[bot]
052127c473 [autofix.ci] apply automated fixes 2025-11-05 05:07:06 +00:00
Harry
7a4be5c0d2 Update package metadata in uv.lock: increment revision to 2, add upload times for sdist and wheels, and update aiohttp sdist version to 3.13.2. 2025-11-05 13:00:28 +08:00
Harry
a6208feed8 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-05 12:59:25 +08:00
Gritty_dev
775d2e14fc test: create new test scripts and update some existing test scripts o… (#27850) 2025-11-05 11:09:24 +08:00
lyzno1
c8f55549d7 Remove global pnpm installation from script 2025-11-05 10:27:45 +08:00
johnny0120
744b287e67 fix: avoid passing empty uniqueIdentifier to InstallFromMarketplace (#27802)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-05 10:22:22 +08:00
crazywoola
c0fc5d98f0 fix: installation_id is missing on the tools page (#27849) 2025-11-05 10:19:12 +08:00
lyzno1
132a86dcb3 fix: output node vars check 2025-11-05 10:15:57 +08:00
Elliott
08ea79d730 fix(web): increase z-index of PortalToFollowElemContent (#27823) 2025-11-05 09:32:15 +08:00
yangzheli
f31b821cc0 fix(web): improve the consistency of the inputs-form UI (#27837) 2025-11-05 09:29:13 +08:00
Novice
34be16874f feat: add validation to prevent saving empty opening statement in conversation opener modal (#27843) 2025-11-05 09:28:49 +08:00
aka James4u
e9738b891f test: adding some web tests (#27792) 2025-11-04 21:06:44 +08:00
zhsama
9f59baed10 fix(urlValidation): remove specific check for Dify cloud trigger debug URLs 2025-11-04 18:34:41 +08:00
zhsama
ce56286329 feat(validation): implement isPrivateOrLocalAddress utility and integrate into webhook components for improved URL validation 2025-11-04 18:32:19 +08:00
Harry
6e76f2aff2 fix(api): update TriggerInvokeEventResponse to use Field for default value of cancelled 2025-11-04 18:15:13 +08:00
Harry
49edd58722 fix(trigger): enhance credential handling by decrypting and masking subscription properties and parameters 2025-11-04 18:15:13 +08:00
zhsama
6a28aee13e Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-11-04 16:51:37 +08:00
zhengchangchun
829796514a fix: knowledge base reference information is overwritten when using mu… (#27799)
Co-authored-by: zhengchangchun <zhengchangchun@corp.netease.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-04 16:40:44 +08:00
WTW0313
79c70d09c9 feat(marketplace): introduce IS_MARKETPLACE flag and update X-Dify-Version header logic 2025-11-04 16:09:50 +08:00
Novice
ef1db35f80 feat: implement file extension blacklist for upload security (#27540) 2025-11-04 15:45:22 +08:00
zhsama
b9bb97887b fix(workflow): handle node inspection variable deletion when not fetched 2025-11-04 15:25:15 +08:00
Cursx
f9c67621ca fix: agent puts out the output of workflow-tool twice (#26835) (#27706) 2025-11-04 14:24:51 +08:00
Guangdong Liu
e29e8e3180 feat: enhance annotation API to support optional message_id and content fields (#27460)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-04 14:11:09 +08:00
red_sun
7a81e720d4 fix: iteration node cannot be viewed (#27759) (#27786) 2025-11-04 12:37:31 +08:00
XlKsyt
55600c0eb1 feat: add metrics logging and improve MeterProvider lifecycle for tencent APM (#27733)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-04 12:35:53 +08:00
kenwoodjw
35e41d7d68 fix: bump pyobvector to 0.2.17 (#27791)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-11-04 12:25:50 +08:00
Ponder
b610cf9a11 feat: add segments max number limit for SegmentApi.post (#27745) 2025-11-04 10:27:58 +08:00
-LAN-
c8e9edc024 refactor(api): set default value for EasyUIBasedAppGenerateEntity.query (#27712) 2025-11-04 10:22:43 +08:00
49
471cd760d7 fix: improve infinite scroll observer responsiveness (#27546)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-11-04 10:15:27 +08:00
墨绿色
7f48c57edf fix: embedding model does not change in datasets weight settings (#27694)
Co-authored-by: lijiezhao <lijiezhao@perfect99.com>
2025-11-04 10:00:36 +08:00
NeatGuyCoding
6569801162 extract parse_time_range for console app stats related queries (#27626)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-11-04 10:00:12 +08:00
国昊
9dd83f50a7 Fix issue #27697: add env variable in docker-compose (template) and make it take effect. (#27704) 2025-11-04 09:58:59 +08:00
CrabSAMA
59c56b1b0d fix: File model adds known extra fields, fixing issue about the tool of… (#27607) 2025-11-04 09:57:25 +08:00
lyzno1
7df6d9f1aa Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-04 09:56:02 +08:00
Tianzhi Jin
94cd2de940 fix(api): return timestamp as integer in document api (#27761) 2025-11-04 09:55:47 +08:00
lyzno1
587f83bc34 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-04 09:55:43 +08:00
heyszt
3c23375607 refactor: Use Repository Pattern for Model Layer (#27663) 2025-11-04 09:53:22 +08:00
lyzno1
d81b2e6820 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-04 09:52:08 +08:00
dependabot[bot]
56047f638f chore(deps): bump dayjs from 1.11.18 to 1.11.19 in /web (#27735)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-11-04 09:47:57 +08:00
vicen
9c01d3e775 fix: two web bugs for json-schema-config-modal (#27718) 2025-11-04 09:45:28 +08:00
lyzno1
8315e0c74b Merge remote-tracking branch 'origin/main' into feat/trigger 2025-11-04 09:44:13 +08:00
海狸大師
c85c87f3da fix(i18n/zh-Hant): unify terminology and improve translation consistency (#27717) 2025-11-04 09:42:26 +08:00
-LAN-
eaa02e3d55 Add SQLAlchemy Mapped annotations to MessageFeedback (#27768) 2025-11-04 09:39:59 +08:00
yihong
0219222a60 fix: pin litellm version to avoid build issue (#27742)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-11-04 09:39:03 +08:00
yangzheli
dba659b220 fix(web): fix issues with links, Chinese translations, and styling on the logs page (#27669) 2025-11-04 09:38:15 +08:00
Bowen Liang
ee6458768e cleanup orphan packages in packages stage of api dockerfile (#27617) 2025-11-04 09:36:52 +08:00
Shemol
ed3d02dc6d web(markdown): support <think> without trailing newline in preprocessThinkTag (#27776)
Signed-off-by: SherlockShemol <shemol@163.com>
2025-11-04 09:35:54 +08:00
CrabSAMA
95471b1188 fix(ui): fix empty placeholder shown when a plugin installs successfully (#27780) 2025-11-04 09:35:14 +08:00
aka James4u
6190cfbfd8 feat: localization for hi-IN (#27783)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-11-04 09:34:41 +08:00
hjlarry
3cc6690356 fix(trigger): global vars dialogue_count and timestamp auto-converted to string 2025-11-03 17:12:58 +08:00
hjlarry
6507263b28 fix(trigger): timestamp should be number type 2025-11-03 15:40:33 +08:00
hjlarry
8fa0bb48df fix CI 2025-11-03 14:57:01 +08:00
aka James4u
11f2f95103 Added it-IT for Italian (#27665) 2025-11-03 11:51:45 +08:00
-LAN-
2abbc14703 refactor: replace hardcoded user plan strings with CloudPlan enum (#27675) 2025-11-03 11:51:09 +08:00
dependabot[bot]
b2b2816ade chore(deps): bump tablestore from 6.2.0 to 6.3.7 in /api (#27736) 2025-11-03 11:50:39 +08:00
hjlarry
637a675681 fix(trigger): workflow checklist does not work for knowledge pipeline 2025-11-03 09:42:33 +08:00
-LAN-
4461df1bd9 refactor(api): add SQLAlchemy 2.x Mapped type hints to Message model (#27709)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-11-01 01:16:07 +08:00
lyzno1
085ada86e6 chore: trigger ci
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
2025-10-31 20:15:14 +08:00
lyzno1
f59d430219 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-31 20:01:16 +08:00
lyzno1
c415e5b893 Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-31 20:00:51 +08:00
lyzno1
67b6b3612c fix: trigger docs link 2025-10-31 20:00:39 +08:00
katakyo
f7f6b4a8b0 i18n(ja-JP): Use 「公開」 ("Publish") for App Overview “Launch” action label (#27680) 2025-10-31 11:23:38 +08:00
Yeuoly
229b0e190f bump version 2025-10-30 23:21:56 +08:00
autofix-ci[bot]
09d412cf2a [autofix.ci] apply automated fixes 2025-10-30 15:18:20 +00:00
Harry
2842cbf1e1 refactor: update error handling to use BadRequest for plugin invocation errors 2025-10-30 23:16:14 +08:00
Harry
e2543bcf30 refactor: remove unused error imports in TriggerManager 2025-10-30 23:05:08 +08:00
Harry
3f75aa6848 refactor: simplify error handling in TriggerManager 2025-10-30 23:04:54 +08:00
Yeuoly
57719f3ce9 fix: docker env 2025-10-30 22:21:14 +08:00
Harry
45677ac57c bump plugin daemon image version to 0.4.0-local 2025-10-30 22:19:24 +08:00
Yeuoly
4eacbf37ff bump docker compose daemon version 2025-10-30 21:22:09 +08:00
Yeuoly
ef256ac276 bump version 2025-10-30 21:20:58 +08:00
hjlarry
2733e04039 fix CI 2025-10-30 20:28:47 +08:00
hjlarry
e49ec82258 fix CI 2025-10-30 20:25:35 +08:00
hjlarry
cf301eb1d9 fix CI 2025-10-30 20:23:22 +08:00
hjlarry
98b9ba2b2e fix CI 2025-10-30 20:22:01 +08:00
hjlarry
2126c64468 fix CI 2025-10-30 20:17:11 +08:00
hjlarry
271a1b4f98 fix CI 2025-10-30 20:10:49 +08:00
hjlarry
9be3c62c04 fix CI 2025-10-30 20:08:16 +08:00
lyzno1
04bfa235a9 fix: test 2025-10-30 19:49:08 +08:00
lyzno1
3b37ae1b4e fix: dotenv lint 2025-10-30 19:37:45 +08:00
Yeuoly
c1cb93cd26 fix 2025-10-30 18:55:08 +08:00
Yeuoly
75fa161c46 apply fix 2025-10-30 18:49:06 +08:00
Yeuoly
d6d82cff33 apply linter 2025-10-30 18:48:16 +08:00
lyzno1
5c266fecf9 fix: types 2025-10-30 18:36:08 +08:00
Yeuoly
7244978b24 fix 2025-10-30 18:33:46 +08:00
Yeuoly
623021dcff cleanup 2025-10-30 18:32:05 +08:00
zhsama
5af165fce9 fix: change timestamp type to integer 2025-10-30 18:24:30 +08:00
lyzno1
9503fafc53 fix 2025-10-30 18:15:03 +08:00
lyzno1
99fac21bdb fix: type 2025-10-30 18:11:52 +08:00
yessenia
bc95678c5e fix(trigger): appmode type 2025-10-30 18:03:12 +08:00
lyzno1
3f34f38635 fix: types 2025-10-30 18:02:58 +08:00
lyzno1
30f771369b fix: types 2025-10-30 18:01:12 +08:00
hjlarry
20bd059a6c fix CI 2025-10-30 17:58:59 +08:00
lyzno1
f5eb406394 fix: types 2025-10-30 17:58:31 +08:00
hjlarry
cbebac1d45 fix: webhook container tests 2025-10-30 17:46:59 +08:00
lyzno1
030da43ae3 fix: type 2025-10-30 17:44:43 +08:00
yessenia
b7f1394403 fix(trigger): add default icon 2025-10-30 17:42:10 +08:00
lyzno1
ceb6a09387 exclude test file in tsc 2025-10-30 17:39:02 +08:00
Yeuoly
14ad800967 Revert "rm type check"
This reverts commit 34d1f86f76.
2025-10-30 17:34:45 +08:00
lyzno1
34d1f86f76 rm type check 2025-10-30 17:28:38 +08:00
lyzno1
b9b9f8eae3 fix: type 2025-10-30 17:24:36 +08:00
lyzno1
0de8596afe Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:14:11 +08:00
autofix-ci[bot]
4dbd26ff66 [autofix.ci] apply automated fixes 2025-10-30 09:14:08 +00:00
Yeuoly
d018ef9033 apply autofix to autofix CI 2025-10-30 17:11:49 +08:00
lyzno1
979c985804 Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:11:08 +08:00
Yeuoly
291e9a3aee fix: ruff 2025-10-30 17:10:21 +08:00
Yeuoly
5861ca773e fix: mapping to dict 2025-10-30 17:09:40 +08:00
autofix-ci[bot]
eb3b5f751a [autofix.ci] apply automated fixes 2025-10-30 09:08:11 +00:00
lyzno1
9bbfbf1c5f Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 17:06:27 +08:00
zhsama
8cbd124b80 delete: remove cron-parser unit tests 2025-10-30 17:05:41 +08:00
lyzno1
d137d0eed0 rm test 2025-10-30 17:05:27 +08:00
zhsama
58c5db3b00 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-30 17:05:14 +08:00
lyzno1
8750796f9f Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-30 16:59:56 +08:00
lyzno1
7d4bb45f94 fix(workflow): align plugin lock overlay with install availability 2025-10-30 16:55:07 +08:00
zhsama
db744444f2 fix(CI): fix CI errors 2025-10-30 16:54:17 +08:00
kenwoodjw
41be581594 fix: python package vulnerability (#27645)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-10-30 16:43:07 +08:00
Yeuoly
b25d379ef4 cleanup 2025-10-30 16:33:39 +08:00
zhsama
e1e95f7ccd fix(CI): fix CI errors 2025-10-30 16:33:04 +08:00
Yeuoly
edd50420ec apply test fix 2025-10-30 16:24:43 +08:00
Yeuoly
9af8fe085b fix: apply docker template 2025-10-30 16:20:26 +08:00
zhsama
ed6bb121bb fix: update types 2025-10-30 16:16:55 +08:00
zhsama
4635b99153 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-30 16:16:21 +08:00
zhsama
282fde9a04 feat(next.config): add console log removal configuration for production 2025-10-30 16:16:11 +08:00
Yeuoly
e9078eedbd fix: Variable e is not accessed (reportUnusedVariable) 2025-10-30 16:14:13 +08:00
Yeuoly
501698d844 Potential fix for code scanning alert no. 243: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:11:32 +08:00
Yeuoly
dd089b1b21 fix: coding style 2025-10-30 16:10:54 +08:00
Yeuoly
6260a1a28c fix: cycle imports 2025-10-30 16:09:39 +08:00
Yeuoly
8bc5035624 Potential fix for code scanning alert no. 211: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:57 +08:00
Yeuoly
2dbfd9ea5a Potential fix for code scanning alert no. 241: Use of a broken or weak cryptographic hashing algorithm on sensitive data
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:44 +08:00
Yeuoly
08e61d76d6 Potential fix for code scanning alert no. 244: Information exposure through an exception
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-30 16:00:24 +08:00
Yeuoly
447127cee4 Update api/controllers/console/app/workflow.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:59:37 +08:00
Yeuoly
49ebbd05b5 Update api/controllers/console/app/workflow.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:58:58 +08:00
Yeuoly
defea962f6 Update api/controllers/console/workspace/trigger_providers.py
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-30 15:58:41 +08:00
Yeuoly
b3866288e0 fix: docker compose 2025-10-30 15:54:25 +08:00
QuantumGhost
20ad5b7ac2 docs(api): update docs about gevent setup in app.py (#27611)
Add a warning about top-level importing in gunicorn.conf.py

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-30 15:43:08 +08:00
lyzno1
bed2ce69bb Merge branch 'feat/trigger' of https://github.com/langgenius/dify into feat/trigger 2025-10-30 15:35:39 +08:00
lyzno1
4d37d61851 feat: dim node 2025-10-30 15:34:50 +08:00
lyzno1
8a48db6d0d Improve workflow tool install flow 2025-10-30 15:34:49 +08:00
lyzno1
ff0f645e54 Fix plugin install detection for tool nodes 2025-10-30 15:34:49 +08:00
lyzno1
6e0765fbaf feat: add install check for tools, triggers and datasources 2025-10-30 15:34:49 +08:00
yessenia
1d03e0e9fc fix(trigger): hide input params when no subscription 2025-10-30 15:28:34 +08:00
Yeuoly
cac60a25bb cleanup: migrations 2025-10-30 15:27:02 +08:00
Yeuoly
57c65ec625 fix: typing 2025-10-30 14:58:30 +08:00
Yeuoly
ffc3c61d00 merge workflow pausing 2025-10-30 14:54:14 +08:00
Yeuoly
aa3b16a136 fix: migrations 2025-10-30 14:45:26 +08:00
Yeuoly
6e0b408dd5 Merge branch 'main' into feat/trigger 2025-10-30 14:43:27 +08:00
QuantumGhost
a1c0bd7a1c feat(api): Introduce workflow pause state management (#27298)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-30 14:41:09 +08:00
lyzno1
be9eeff6c2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-30 12:14:47 +08:00
Wu Tianwei
fd7c4e8a6d feat: enhance pipeline template list with marketplace feature toggle (#27604) 2025-10-30 11:02:54 +08:00
quicksand
41e549af14 fix(weaviate): skip init checks to prevent PyPI requests on each search (#27624)
Co-authored-by: Claude <noreply@anthropic.com>
2025-10-30 09:59:08 +08:00
hjlarry
ca9d92b1e5 fix(variable): global var panel not closed when opening history mode 2025-10-30 09:53:00 +08:00
issac2e
b7360140ee fix: resolve stale closure values in LLM node callbacks (#27612) (#27614)
Co-authored-by: liuchen15 <liuchen15@gaotu.cn>
2025-10-30 09:38:39 +08:00
kurokobo
c71f7c7613 fix(http_request): set response.text if there is no file (#27610) 2025-10-30 09:34:59 +08:00
yangzheli
c905c47775 fix(web): add a scrollbar when the setting-modal content overflows (#27620) 2025-10-30 09:31:24 +08:00
hjlarry
0607db41e5 fix(variable): draft-running a workflow causes the global var panel to misalign 2025-10-30 09:02:38 +08:00
hjlarry
48b1829b14 chore: improve toggle env/conversation/global var panel 2025-10-29 22:08:04 +08:00
hjlarry
6767a8f72c chore: i18n for system var 2025-10-29 21:10:26 +08:00
Wu Tianwei
4ca7ba000c refactor: update install status handling in plugin installation process (#27594) 2025-10-29 18:31:02 +08:00
Harry
1e477af05f feat(trigger): add system variables to webhook node outputs
Enhanced the TriggerWebhookNode to include system variables as outputs. This change allows for better accessibility of system variables during node execution, improving the overall functionality of the webhook trigger process. A TODO comment has been added to address future improvements for direct access to system variables.
2025-10-29 18:15:36 +08:00
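Conceptually, the webhook change merges system variables into the node's output mapping; a toy illustration with assumed names, not the actual TriggerWebhookNode code:

```python
def build_webhook_outputs(payload: dict, system_variables: dict) -> dict:
    # Expose sys.* alongside the webhook payload so downstream nodes can
    # reference values such as sys.timestamp without extra plumbing.
    outputs = dict(payload)
    outputs.update({f"sys.{key}": value for key, value in system_variables.items()})
    return outputs
```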
Harry
9b5e5f0f50 refactor(api): replace dict type hints with Mapping for improved type safety
Updated type hints in several services to use Mapping instead of dict for better compatibility with various dictionary-like objects. Adjusted credential handling to ensure consistent encryption and decryption processes across ToolManager, DatasourceProviderService, ApiToolManageService, BuiltinToolManageService, and MCPToolManageService. This change enhances code clarity and adheres to strong typing practices.
2025-10-29 18:10:38 +08:00
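The `Mapping` switch in a nutshell: accepting the read-only ABC instead of a concrete `dict` lets callers pass any dictionary-like object and signals the function will not mutate its argument. A small illustration (the function itself is hypothetical):

```python
from collections.abc import Mapping

def encrypt_credentials(credentials: Mapping[str, str]) -> dict[str, str]:
    # Mapping accepts plain dicts, MappingProxyType, frozen config objects,
    # etc., and the annotation promises we only read from the argument.
    return {key: f"encrypted({value})" for key, value in credentials.items()}
```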
Harry
fb12f31df2 feat(trigger): system variables for trigger nodes
Added a timestamp field to the SystemVariable model and updated the WorkflowAppRunner to include the current timestamp during execution. Enhanced node type checks to recognize trigger nodes in various services, ensuring proper handling of system variables and node outputs in TriggerEventNode and TriggerScheduleNode. This improves the overall workflow execution context and maintains consistency across node types.
2025-10-29 18:10:38 +08:00
Xiyuan Chen
f260627660 feat: use id for webapp (#27576) 2025-10-29 01:45:40 -07:00
yessenia
db2c6678e4 fix(trigger): show subscription url & add readme in trigger plugin node 2025-10-29 16:16:29 +08:00
zhsama
bc3421add8 refactor(variable): update global variable names and types for consistency 2025-10-29 15:53:37 +08:00
XlKsyt
1e9142c213 feat: enhance tencent trace integration with LLM core metrics (#27126)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-29 15:53:30 +08:00
zhsama
61d8809a0f fix(workflow): enhance validation before running workflows by integrating warning notifications 2025-10-29 15:53:13 +08:00
Jyong
82890fe38e add uninstalled recommended tools detail (#27537)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-29 15:33:41 +08:00
Blackoutta
7dc7c8af98 improve: speed up tracing config decryption process (#27549) 2025-10-29 15:33:16 +08:00
quicksand
addebc465a fix: resolve 500 error when updating document chunk settings (#27551) (#27574) 2025-10-29 15:31:18 +08:00
lyzno1
d37cc9f9c8 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-29 15:16:28 +08:00
Vivec
5ab315aeaf fix: set conditional capabilities upon MCP client session initialization (#26234)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-10-29 15:11:45 +08:00
lyzno1
0db082f6d0 feat(workflow): persist RAG recommendation panel collapse state 2025-10-29 15:10:45 +08:00
lyzno1
c94dc52310 fix: remove duplicate RAG tool heading and fix select callback type 2025-10-29 14:57:38 +08:00
非法操作
f092bc1912 chore: add more stories (#27403)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-29 14:33:43 +08:00
hjlarry
bebcbfd80e chore: improve deletion of app-related tables 2025-10-29 14:29:59 +08:00
Harry
dfc5e3609d refactor(trigger): streamline OAuth client existence check
Replaced the method for checking the existence of a system OAuth client with a new dedicated method `is_oauth_system_client_exists` in the TriggerProviderService. This improves code clarity and encapsulates the logic for verifying the presence of a system-level OAuth client. Updated the TriggerOAuthClientManageApi to utilize the new method for better readability.
2025-10-29 14:22:56 +08:00
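The encapsulation amounts to replacing "fetch the record and inspect it" call sites with a yes/no query; a trivial sketch with a stand-in store instead of the real persistence layer:

```python
_SYSTEM_OAUTH_CLIENTS: dict[str, dict] = {}  # stand-in for the real storage

class TriggerProviderService:
    @staticmethod
    def is_oauth_system_client_exists(provider: str) -> bool:
        # Callers ask a boolean question instead of handling the client record.
        return provider in _SYSTEM_OAUTH_CLIENTS
```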
lyzno1
fa5765ae82 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-29 12:56:31 +08:00
lyzno1
852d851996 fix(workflow): add empty array validation for required checklist fields in trigger plugin
The checkValid function was not properly validating required checklist fields when they had empty array values. This caused required fields to pass validation even when no options were selected.

Added array length check to the constant type validation to ensure required checklist fields must have at least one selected option.
2025-10-29 12:36:43 +08:00
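The rule itself is small; the actual fix lives in the web checkValid TypeScript, but the logic, illustrated here in Python for consistency with the other sketches, is just an added length check:

```python
def check_valid_checklist(value, required: bool) -> bool:
    # A required checklist field must be a non-empty list; merely being
    # present (as an empty array) is no longer enough to pass validation.
    if not required:
        return True
    return isinstance(value, list) and len(value) > 0

assert check_valid_checklist([], required=True) is False      # previously passed
assert check_valid_checklist(["opt-a"], required=True) is True
```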
hjlarry
0b599b44b0 chore: when deleting an app, also delete related trigger tables 2025-10-29 12:15:34 +08:00
lyzno1
f06dc3ef90 fix: localize workflow block search filters 2025-10-29 11:55:30 +08:00
Jianwei Mao
23b49b8304 fix issue #27388, add missing env variable: ENFORCE_LANGGENIUS_PLUGIN… (#27545) 2025-10-29 10:40:59 +08:00
NeatGuyCoding
9e97248ede fix unit test using enum (#27575)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-10-29 10:26:40 +08:00
Asuka Minato
d532b06310 example of using api.model (#27514) 2025-10-29 10:25:15 +08:00
GuanMu
07a2281730 chore: add web type check step to GitHub Actions workflow (#27498) 2025-10-29 10:20:37 +08:00
Eric Guo
42385f3ffa Sync celery queue name list (#27554) 2025-10-29 10:19:57 +08:00
yangzheli
c597234374 fix(workflow): doc extractor node now correctly extracts mdx files (#27570) 2025-10-29 10:19:29 +08:00
lyzno1
f9df61e648 feat: add inline code copy styling for variable inspect webhook url 2025-10-29 10:14:50 +08:00
lyzno1
6e76e02dba fix: trigger plugin help link 2025-10-29 09:35:45 +08:00
zhsama
dc24450e29 feat(workflow): add webhook debug URL display in variable inspection 2025-10-29 04:38:27 +08:00
zhsama
8bcecce627 feat(workflow): add toast notifications for warning nodes during execution 2025-10-29 01:40:27 +08:00
zhsama
66cb963df3 feat(workflow): enhance validation by integrating warning nodes into last run checks. 2025-10-29 01:28:31 +08:00
Harry
5c95c77604 refactor(trigger): streamline workflow argument handling in DraftWorkflowTriggerNodeApi
- Simplified retrieval of workflow arguments by directly accessing event.workflow_args.
- Removed unnecessary conditional checks for user inputs, ensuring cleaner code.
- Enhanced TriggerEventNode to use deepcopy for user inputs to prevent unintended mutations.
2025-10-29 01:04:37 +08:00
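Why the deepcopy matters: node execution may mutate nested input structures, and without a copy those mutations leak back into the caller's dict. A minimal demonstration:

```python
from copy import deepcopy

user_inputs = {"profile": {"tags": ["vip"]}}

def run_node(inputs: dict) -> None:
    inputs["profile"]["tags"].append("processed")  # in-place mutation

run_node(deepcopy(user_inputs))
print(user_inputs["profile"]["tags"])  # ['vip'] -- the original is untouched
```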
zhsama
a264a609db feat(workflow): integrate workflow run validation before execution 2025-10-29 00:36:48 +08:00
zhsama
3a876fd437 Merge branch 'main' into feat/trigger 2025-10-29 00:08:50 +08:00
zhsama
13bc68a646 feat(trigger): enhance runScheduleSingleRun to handle API response 2025-10-29 00:07:08 +08:00
Harry
b41538d8c7 feat(trigger): reinforce schedule trigger debugging with cron calculation
- Implemented a caching mechanism for schedule trigger debug events using Redis to optimize performance.
- Added methods to create and manage schedule debug runtime configurations, including cron expression handling.
- Updated the ScheduleTriggerDebugEventPoller to utilize the new caching and event creation logic.
- Removed the deprecated build_schedule_pool_key function from event handling.
2025-10-28 23:34:08 +08:00
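The `croniter` library (pulled into this branch's dependencies, per the uv.lock commit above) handles the cron-expression arithmetic; a small example of computing the next run times a debug poller could cache in Redis:

```python
from datetime import datetime
from croniter import croniter  # pip install croniter

itr = croniter("*/5 * * * *", datetime(2025, 10, 28, 23, 0))  # every 5 minutes
print(itr.get_next(datetime))  # 2025-10-28 23:05:00
print(itr.get_next(datetime))  # 2025-10-28 23:10:00
```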
zhsama
720480d05e chore(tests): remove deprecated test files for schedule and webhook triggers 2025-10-28 22:42:29 +08:00
yessenia
71b1af69c5 feat(trigger): request condition param 2025-10-28 22:35:03 +08:00
NeatGuyCoding
3de73f07c6 fix sl translation (#27555)
Signed-off-by: tech-leader <tech@sabegeek.com>
Co-authored-by: tech-leader <tech@sabegeek.com>
2025-10-28 18:48:12 +08:00
Harry
18fd79fbe6 feat(trigger): add event_name to PluginTriggerMetadata for enhanced trigger handling
- Introduced event_name attribute in PluginTriggerMetadata to improve metadata clarity.
- Updated dispatch_triggered_workflow function to include event_name when dispatching triggered workflows.
2025-10-28 18:32:06 +08:00
Harry
c16421df27 refactor: improve trigger metadata handling and streamline workflow service
- Updated ScheduleTriggerDebugEventPoller to include an empty files list in workflow_args.
- Enhanced WorkflowAppService to handle trigger metadata more effectively, including a new method for processing metadata and removing the deprecated _safe_json_loads function.
- Adjusted PluginTriggerMetadata to use icon_filename and icon_dark_filename for better clarity.
- Simplified async workflow task parameters by changing triggered_from to trigger_from for consistency.
2025-10-28 17:50:06 +08:00
Harry
0d686fc6ae refactor: streamline trigger event node metadata handling and update async workflow service for JSON serialization
- Removed unnecessary input data from the TriggerEventNode's metadata.
- Updated AsyncWorkflowService to use model_dump_json() for trigger metadata serialization.
- Added a comment in WorkflowAppService to address the large size of the workflow_app_log table and the use of an additional details field.
2025-10-28 17:50:06 +08:00
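`model_dump_json()` is standard Pydantic v2. A hedged sketch of serializing trigger metadata for storage, reusing field names mentioned in the commits above; the model shape is otherwise assumed:

```python
from pydantic import BaseModel

class PluginTriggerMetadata(BaseModel):
    event_name: str
    icon_filename: str | None = None
    icon_dark_filename: str | None = None

meta = PluginTriggerMetadata(event_name="issue.opened", icon_filename="github.svg")
stored = meta.model_dump_json(exclude_none=True)  # compact JSON for the log row
print(stored)  # {"event_name":"issue.opened","icon_filename":"github.svg"}
```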
Novice
0caeaf6e5c chore: improve mcp server url validation (#27558) 2025-10-28 17:30:01 +08:00
yessenia
db352c0a18 Merge branch 'main' into feat/trigger 2025-10-28 17:11:15 +08:00
yessenia
bf7b18d442 feat(trigger): dynamic options opt 2025-10-28 16:20:01 +08:00
lyzno1
7d56ca5294 fix: add time-picker placement prop 2025-10-28 15:49:10 +08:00
zhsama
0b1015e221 feat(workflow): enhance variable inspector to support schedule trigger events with next execution time display 2025-10-28 14:12:20 +08:00
Joel
3395297c3e chore: warning messages too long in model config caused ui issue (#27542) 2025-10-28 13:58:31 +08:00
zhaobingshuang
e60a7c7143 fix(command): The vdb migrate command cannot be stopped (#27536) 2025-10-28 11:56:06 +08:00
lyzno1
96f0d648fa feat: invalidate trigger plugin queries after marketplace installs 2025-10-28 11:31:02 +08:00
lyzno1
c4996f9563 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-28 11:28:06 +08:00
Wu Tianwei
0e62a66cc2 feat: Introduce RAG tool recommendations and refactor related components for improved plugin management (#27259)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-28 10:22:16 +08:00
Eric Guo
ff32dff163 Enabled cross-subdomain console sessions by making the cookie domain configurable and aligning the frontend so it reads the shared CSRF cookie. (#27190) 2025-10-28 10:04:24 +08:00
heyszt
543c5236e7 refactor: Decouple Domain Models from Direct Database Access (#27316)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-28 09:59:30 +08:00
yalei
341b3ae7c9 Sync log detail drawer with conversation_id query parameter, so that we can share a specific conversation (#27518)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-28 09:59:16 +08:00
quicksand
f01907aac2 fix: knowledge sync from website error (#27534) 2025-10-28 09:46:33 +08:00
yangzheli
a7c855cab8 fix(workflow): resolve note node copy/duplicate errors (#27528)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-28 09:26:12 +08:00
crazywoola
29afc0657d Fix/27468: in Dify 1.9.2 the iframe embed cannot pass the user id in system variable (#27524)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-28 09:19:54 +08:00
yessenia
850c5fec32 fix(trigger): invalid subscription 2025-10-27 21:27:08 +08:00
QuantumGhost
d9860b8907 fix(api): Disable SSE events truncation for service api (#27484)
Disable SSE events truncation for service api invocations to ensure backward compatibility.

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-27 21:15:44 +08:00
zhsama
b1f79c34d1 fix(workflow): add support for schedule triggers in workflow run hook 2025-10-27 20:52:44 +08:00
yessenia
96a461646e fix(trigger): readme portal zindex 2025-10-27 20:05:02 +08:00
zhsama
5df94fd866 fix(workflow): enhance node-run to include schedule triggers 2025-10-27 19:44:26 +08:00
Asuka Minato
dc1ae57dc6 example for 24421 doc (#27511)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-27 17:39:52 +08:00
github-actions[bot]
d6bd2a9bdb chore: translate i18n files and update type definitions (#27503)
Co-authored-by: Nov1c444 <66365942+Nov1c444@users.noreply.github.com>
2025-10-27 17:39:43 +08:00
lyzno1
e074ba84d1 fix(workflow): avoid nested buttons in subscription selector to stop hydration warning 2025-10-27 17:23:58 +08:00
zxhlyh
c9eed67cf6 Feat/mcp authentication (#27508) 2025-10-27 17:08:18 +08:00
Novice
0ded6303c1 feat: implement MCP specification 2025-06-18 (#25766)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-27 17:07:51 +08:00
lyzno1
1335be8d60 Revert "feat: propagate trigger metadata for plugin icons across UI"
This reverts commit 3bd62f3fdf.
2025-10-27 17:06:40 +08:00
lyzno1
c79d75b32d Revert "fix: display plugin trigger labels in logs using i18n metadata"
This reverts commit 651cc81cfe.
2025-10-27 17:06:35 +08:00
lyzno1
f18054847e Revert "fix: workflow_trigger"
This reverts commit cc219cc81c.
2025-10-27 17:06:30 +08:00
lyzno1
b2b81f3822 Revert "fix: trigger by display translations"
This reverts commit 33daedd7aa.
2025-10-27 17:06:25 +08:00
yessenia
90753b2782 fix(trigger): readme style 2025-10-27 16:33:40 +08:00
hjlarry
c05fa9963a fix plugin name incorrectly encoded 2025-10-27 16:08:47 +08:00
Novice
b6e0abadab feat: add flatten_output configuration to iteration node (#27502) 2025-10-27 16:04:24 +08:00
Harry
9de7a7d48f fix(trigger): update outputs in TriggerEventNode to use inputs directly 2025-10-27 15:35:09 +08:00
GuanMu
43bcf40f80 refactor: update installed app component to handle missing params and improve type safety (#27331) 2025-10-27 14:38:58 +08:00
yessenia
29cddc449f fix(trigger): add clickOutsideNotClose prop 2025-10-27 14:30:08 +08:00
zhsama
dfed14ba67 refactor: simplify syncWorkflowDraft parameters by removing payload sanitization 2025-10-27 13:46:27 +08:00
KVOJJJin
f06025a342 Fix: upload limit in knowledge (#27480)
Co-authored-by: jyong <718720800@qq.com>
2025-10-27 13:35:54 +08:00
lyzno1
440262a51b fix: serialize workflow draft sync operations (#27487) 2025-10-27 13:29:40 +08:00
Harry
d705fece9d fix(plugin): update trigger field type to allow None and add field validator for parameters in EventEntity 2025-10-27 12:02:22 +08:00
lyzno1
d08cc48368 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-27 11:37:07 +08:00
lyzno1
b94ad084c3 feat: surface featured trigger recommendations in start tab (#27319) 2025-10-27 11:33:02 +08:00
dependabot[bot]
24fb95b050 chore(deps-dev): bump @happy-dom/jest-environment from 20.0.7 to 20.0.8 in /web (#27465)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-27 10:42:42 +08:00
dependabot[bot]
49fca63927 chore(deps): bump testcontainers from 4.10.0 to 4.13.2 in /api (#27469)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-27 10:41:36 +08:00
wangxiaolei
ce5fe86430 feat: add env NEXT_PUBLIC_ENABLE_SINGLE_DOLLAR_LATEX (#27070) 2025-10-27 10:36:03 +08:00
Harry
9453148233 chore(migrations): remove obsolete migration files for workflow webhook and schedule plan tables 2025-10-27 00:36:27 +08:00
Harry
1857d0e53f chore(migrations): remove obsolete migration files for workflow trigger logs, app triggers, and plugin triggers 2025-10-27 00:28:07 +08:00
Tanaka Kisuke
666586b59c improve opensearch index deletion #27231 (#27336)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-26 23:57:21 +08:00
Harry
ae422c2628 fix(trigger): simplify return logic in TriggerProviderService by removing unnecessary None return 2025-10-26 23:48:50 +08:00
Harry
d933116e46 fix(workflow): improve error handling in DraftWorkflowTriggerNodeApi by returning JSON response 2025-10-26 23:48:50 +08:00
yihong
8a2851551a fix: dev container warning (#27444)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-10-26 19:26:55 +08:00
yalei
a2fe4a28c3 rm useless router.replace (#27386) 2025-10-26 19:26:46 +08:00
yangzheli
417ebd160b fix(web): update the tip in the file-uploader component (#27452) 2025-10-26 19:26:09 +08:00
lyzno1
5cf4afd7b2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-26 11:54:17 +08:00
MelodicGin
82be305680 Bugfix: Windows compatibility issue with 'cp' command not found when running pnpm start. (#25670) (#25672)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-26 11:53:56 +08:00
-LAN-
03002f4971 Add Swagger docs for file download endpoints (#27374)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-25 19:23:27 +09:00
lyzno1
f7853f3b27 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-25 14:58:32 +08:00
lyzno1
913d85302c fix: hide replay button for non app-run workflow logs 2025-10-25 14:42:26 +08:00
lyzno1
33daedd7aa fix: trigger by display translations 2025-10-25 14:36:29 +08:00
lyzno1
cc219cc81c fix: workflow_trigger 2025-10-25 14:21:52 +08:00
lyzno1
945295adc3 chore: add some ja-JP translations 2025-10-25 14:00:21 +08:00
lyzno1
651cc81cfe fix: display plugin trigger labels in logs using i18n metadata 2025-10-25 12:41:37 +08:00
lyzno1
3bd62f3fdf feat: propagate trigger metadata for plugin icons across UI 2025-10-25 12:15:21 +08:00
lyzno1
e3484c8dc3 fix: ruff format 2025-10-25 12:08:22 +08:00
lyzno1
eecbe533a1 fix: ruff check 2025-10-25 12:07:46 +08:00
Harry
4221e99362 update(docker): add triggered workflow dispatcher and refresh executor to default Celery queues 2025-10-24 21:31:38 +08:00
Stream
5c69521973 feat: align with params 2025-10-24 21:20:12 +08:00
Stream
ffcaa67a56 feat: align with path 2025-10-24 20:47:54 +08:00
Stream
64a070f6b0 feat: align with path 2025-10-24 19:56:06 +08:00
Stream
c61656c759 fix: request param 2025-10-24 19:40:50 +08:00
Stream
6d34e4e99b fix: request path 2025-10-24 19:33:19 +08:00
Stream
d3a767364b fix: request path 2025-10-24 19:15:18 +08:00
github-actions[bot]
1e7e8a8988 chore: translate i18n files and update type definitions (#27423)
Co-authored-by: douxc <7553076+douxc@users.noreply.github.com>
2025-10-24 19:09:16 +08:00
Stream
f3c6d1ca1d fix: param passing 2025-10-24 18:27:23 +08:00
NFish
a715d5ac23 hide brand name in enterprise use (#27422) 2025-10-24 17:17:38 +08:00
quicksand
398c8117fe fix: rag pipeline priority_pipeline always queuing (#27416) 2025-10-24 16:32:23 +08:00
-LAN-
f45c18ee35 fix(graph_engine): NodeRunRetrieverResourceEvent is not handled (#27405) 2025-10-24 16:20:27 +08:00
zhsama
6098dc0242 feat(workflow): enhance listening descriptions for plugin and webhook triggers 2025-10-24 16:18:07 +08:00
非法操作
15c1db42dd fix: workflow can't publish tool when has checkbox parameter (#27394) 2025-10-24 15:33:43 +08:00
zhsama
29ec3c7d5c feat(workflow): update listening descriptions when trigger nodes start test-run 2025-10-24 15:32:23 +08:00
Alfred
a31c01f8d9 fix: correct HTML br tags in README.md (#27399) 2025-10-24 15:31:05 +08:00
Alfred
62753cdf13 Fix typo in docker/.env.example: 'defualt' -> 'default' (#27400) 2025-10-24 15:28:59 +08:00
zhsama
4597ab4efb Merge branch 'refs/heads/main' into feat/trigger 2025-10-24 14:44:15 +08:00
-LAN-
dc7ce125ad chore: disable postgres timeouts for docker workflows (#27397) 2025-10-24 13:46:36 +08:00
Novice
eabdb09f8e fix: support webapp passport token with end_user_id in web API auth (#27396) 2025-10-24 13:29:47 +08:00
Yunlu Wen
fa6d03c979 Fix/refresh token (#27381) 2025-10-24 13:09:34 +08:00
lyzno1
1dddcf1194 fix(workflow): resolve occasional issues with syncing historical states or clearing DSL in draft mode (#27391) 2025-10-24 12:33:14 +08:00
Stream
c4c38a51d9 fix: merge 2025-10-24 11:58:13 +08:00
yessenia
8a8c0703b1 feat: add datasource node readme 2025-10-24 11:46:58 +08:00
yessenia
1b74869b04 fix: plugin readme params 2025-10-24 10:48:59 +08:00
Novice
634fb192ef fix: remove unnecessary Flask context preservation to avoid circular import in audio service (#27380) 2025-10-24 10:41:14 +08:00
crazywoola
a4b38e7521 Revert "Sync log detail drawer with conversation_id query parameter, so that we can share a specific conversation" (#27382) 2025-10-24 10:40:41 +08:00
lyzno1
f065504ed6 fix(app-overview): soften tooltip styling 2025-10-24 10:34:16 +08:00
lyzno1
3f5485605f chore: update docs link 2025-10-24 10:31:09 +08:00
lyzno1
399bb522e0 chore: add ts-node for test 2025-10-24 10:20:58 +08:00
lyzno1
9fffa9a996 refactor(workflow): clean up entry node status and colocate store types 2025-10-24 10:20:38 +08:00
lyzno1
aee9a8366f refactor: move marketplace footer outside scroll containers 2025-10-24 10:07:02 +08:00
lyzno1
c3eec7ea8a Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-24 09:47:28 +08:00
yessenia
4b4ec3438f feat: add plugin readme 2025-10-24 01:18:58 +08:00
-LAN-
8ff6de91b0 Fix UpdatedVariable truncation crash (#27359)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-10-23 23:18:20 +08:00
Will
7fa0ad3161 fix: Render variables in Question Classifier class names (#27356) 2025-10-23 22:56:08 +08:00
-LAN-
53b21eea61 Promote GraphRuntimeState snapshot loading to class factory (#27222) 2025-10-23 22:29:02 +08:00
非法操作
2f3a61b51b fix: missing import dsl version incompatible modal (#27338) 2025-10-23 20:34:41 +08:00
zhsama
9aa43c9165 feat(workflow): enhance trigger node handling with event listening and state management 2025-10-23 18:21:22 +08:00
zhsama
4ae23ed0f9 feat(workflow): remove unused trigger status logic and simplify entry node status handling 2025-10-23 18:19:06 +08:00
Yeuoly
efe68d5aa6 Merge branch 'main' into feat/trigger 2025-10-23 18:05:59 +08:00
quicksand
8bca7814f4 fix: resolve AssertionError in workflows/run endpoint (#27318) (#27323) 2025-10-23 17:57:54 +09:00
zhsama
1604db02b5 fix(workflow): change description field to be required in TriggerPluginNodePayload 2025-10-23 16:56:00 +08:00
zhsama
e7192de9c0 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-23 16:46:35 +08:00
zhsama
f822b38a00 feat(workflow): integrate payload sanitization for workflow draft synchronization 2025-10-23 16:45:28 +08:00
yessenia
5a5c7f38d1 fix(plugin): stop loading when uninstall fails 2025-10-23 16:18:52 +08:00
zhsama
42a9a88ae2 refactor(trigger-plugin): enhance variable type resolution and encapsulate output variable logic in a dedicated function 2025-10-23 16:01:18 +08:00
zhsama
aea3fc6281 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-23 15:33:31 +08:00
zhsama
bcdc11396a refactor(trigger-plugin): streamline variable type resolution and output variable construction 2025-10-23 15:33:19 +08:00
hjlarry
ecd1d44d23 chore: update trigger dsl version to 0.5.0 2025-10-23 15:05:39 +08:00
hjlarry
6df786248c fix: _raw var not displayed on the panel when draft-running the webhook node 2025-10-23 13:35:45 +08:00
lyzno1
37e75f7791 Ensure workflow tools tab always shows marketplace footer 2025-10-23 13:08:59 +08:00
zlyszx
92c81b1833 fix: document word_count appear negative (#27313)
Co-authored-by: zlyszx <zlyszx>
2025-10-23 12:32:34 +08:00
lyzno1
7ada2385b3 feat: add toggle behavior for featured tools 2025-10-23 12:27:42 +08:00
lyzno1
a77aab96f5 fix: align all workflow trigger docs link 2025-10-23 12:17:27 +08:00
Stream
13af48800b fix: merge 2025-10-23 12:13:33 +08:00
Stream
6e7fb59638 Merge branch 'feat/plugin-readme' into feat/trigger
# Conflicts:
#	api/controllers/console/workspace/plugin.py
#	api/core/plugin/entities/plugin_daemon.py
2025-10-23 12:10:44 +08:00
lyzno1
44553d412c chore: bump pnpm version (#27315) 2025-10-23 12:07:58 +08:00
lyzno1
863b4f8fe9 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-23 11:54:35 +08:00
Wu Tianwei
95ce224df0 fix: enhance checklist functionality with embedding and rerank model lists (#27312) 2025-10-23 11:33:58 +08:00
yalei
8555635967 Sync log detail drawer with conversation_id query parameter, so that we can share a specific conversation (#26980)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-23 11:22:40 +08:00
Yunlu Wen
e843fe8aa6 fix: rename cookie for webapp (#27264)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-23 11:03:48 +08:00
非法操作
b198c9474a chore: improve storybooks (#27306) 2025-10-23 11:00:45 +08:00
yessenia
949ac9d930 feat(trigger): add formitem desc 2025-10-23 10:57:42 +08:00
yessenia
06d1a2e2fd feat(trigger): remove Redundant comp & triggers api no cache 2025-10-23 10:57:41 +08:00
lyzno1
d478f62b49 Optimize workflow tool sync after plugin install (#27280) 2025-10-23 09:58:54 +08:00
Wu Tianwei
4bb00b83d9 fix: Downgrade @monaco-editor/loader to v1.5.0 (#27282) 2025-10-22 20:18:24 +08:00
zhsama
128bc2241d feat(checkbox): adjust styles for checkbox component layout 2025-10-22 17:35:04 +08:00
ZalterCitty
c91cbf6b97 feat: compatible custom avatar url (#26975)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-22 17:21:17 +08:00
Asuka Minato
f6ede6f1c1 Add threading option to basedpyright checks (#27203) 2025-10-22 17:09:46 +08:00
Maries
65976b27fe fix: improve plugin invoke error (#27137)
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
2025-10-22 17:09:24 +08:00
-LAN-
2d73ee64a3 Refine variable truncator type hints (#27220) 2025-10-22 17:08:55 +08:00
GuanMu
c61c2b0abd Fix type error (#27274) 2025-10-22 17:08:27 +08:00
Harry
b2730d680c fix(trigger): add missing fields in TriggerEventNode configuration 2025-10-22 16:55:55 +08:00
lyzno1
52b180104a Limit workflow tool/trigger search to provider and item names 2025-10-22 16:53:23 +08:00
Harry
9cdb62da93 fix(trigger): reset inputs in TriggerEventNode to an empty dictionary 2025-10-22 15:58:43 +08:00
Joel
5af08edfda chore: add missing icon 2025-10-22 15:38:54 +08:00
Harry
24fa5f33d7 fix(trigger): update input handling in TriggerEventNode to correctly retrieve and set outputs 2025-10-22 15:37:00 +08:00
yessenia
caf0bf34dd fix(trigger): add event node status & min-height in modal 2025-10-22 15:31:03 +08:00
lyzno1
181a1ae7f3 fix: hover tooltip 2025-10-22 15:09:01 +08:00
lyzno1
e18ecead2c fix: hide All tools when searching 2025-10-22 15:07:46 +08:00
lyzno1
36a26adab2 fix: install from marketplace 2025-10-22 14:59:49 +08:00
-LAN-
40d3332690 fix: preserve share code headers after login redirect (#27225)
Co-authored-by: yunlu.wen <yunlu.wen@dify.ai>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-22 14:59:08 +08:00
zhsama
bc2edf5107 refactor(workflow): replace fetch function with base/post in use-one-step-run and use-workflow-run hooks in polling 2025-10-22 14:55:57 +08:00
lyzno1
50bbac5973 Add double-arrow icon swap on featured tools “Show more” hover 2025-10-22 14:47:19 +08:00
lyzno1
45b221659b Tighten featured tools header arrow and add “All tools” section divider 2025-10-22 14:41:18 +08:00
lyzno1
16957f14f1 fix: featured icons 2025-10-22 14:33:00 +08:00
lyzno1
0d7dde0639 Align featured tool hover layout and widen action dropdown 2025-10-22 14:19:22 +08:00
Cris
8e45753c68 fix: restore correct numeric values for ParamsAutoGenerated (#27252) 2025-10-22 13:36:29 +08:00
GuanMu
73e217ab0d Fix type error (#27250) 2025-10-22 13:06:15 +08:00
zhsama
37805184d9 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-22 12:58:15 +08:00
Yeuoly
df9932088f avoid time slice strategy in community edition 2025-10-22 12:50:11 +08:00
zhsama
d101a83be8 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-22 12:49:41 +08:00
lyzno1
94ea289c75 fix: suggestions tools list 2025-10-22 12:46:22 +08:00
lyzno1
e2539e91eb fix: view docs 2025-10-22 12:46:22 +08:00
lyzno1
77e9bae3ff feat(workflow): polish featured tools recommendations 2025-10-22 12:46:21 +08:00
lyzno1
d99644237b chore: align help link translations 2025-10-22 12:46:21 +08:00
lyzno1
5cb268e99b feat: suggestions ui 2025-10-22 12:46:21 +08:00
lyzno1
f179b03d6e fix: constrain rag pipeline datasource selector width 2025-10-22 12:46:21 +08:00
lyzno1
28fe58f3dd feat: try to add tools suggestions 2025-10-22 12:46:21 +08:00
Yeuoly
14acd05846 fix 2025-10-22 12:41:19 +08:00
zhsama
ccce135bf5 fix(workflow): add setShowVariableInspectPanel for specific block types in useLastRun hook 2025-10-22 12:38:03 +08:00
Yeuoly
cb5607fc8c refactor: TimeSliceLayer 2025-10-22 12:13:12 +08:00
Yeuoly
7f70d1de1c ASYNC_WORKFLOW_SCHEDULER_GRANULARITY 2025-10-22 12:10:12 +08:00
Yeuoly
c36173f5a9 fix: typing 2025-10-22 11:55:26 +08:00
Yeuoly
7acbe981e2 fix: incorrect elapsed_time 2025-10-22 11:49:02 +08:00
Alain
26ff59172e fix: fix OpenAPI Schema Import Pydantic Validation Errors for Complex Default Values (#27159)
Co-authored-by: Alain <yinxulai@hoymail.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-22 11:45:31 +08:00
Yeuoly
dd6ab7c68c feat: support pausing workflow trigger log 2025-10-22 11:45:16 +08:00
GuanMu
bebb4ffbaa Fix type error (#27217) 2025-10-22 11:43:37 +08:00
github-actions[bot]
523da66134 chore: translate i18n files and update type definitions (#27243)
Co-authored-by: WTW0313 <30284043+WTW0313@users.noreply.github.com>
2025-10-22 11:41:49 +08:00
Joel
a1ea256e79 fix: global icon in inspect 2025-10-22 11:36:01 +08:00
Joel
14942c9ee9 fix: page crash 2025-10-22 11:28:28 +08:00
Joel
e1ca7a9bdb chore: hide useless error info in login page (#27245) 2025-10-22 11:20:31 +08:00
Joel
b0b316ed48 fix: rag pipeline not showing sys vars 2025-10-22 11:07:08 +08:00
Nite Knite
9a8cf709ba chore: adjust the route scope for loading Zendesk scripts (#27244)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-22 11:05:27 +08:00
Wu Tianwei
f909040567 feat: Enhance knowledge base node validation by adding checks for embedding and reranking models (#27241) 2025-10-22 10:49:49 +08:00
Garfield Dai
845adb664a knowledge-pipeline-for-enterprise (#27240) 2025-10-22 10:29:27 +08:00
hjlarry
871cfbd40c fix: CredentialsSchema missing help field display 2025-10-22 09:23:59 +08:00
-LAN-
0c6cae2d59 chore: align version identifiers with 1.9.2 (#27212) 2025-10-21 20:12:07 +08:00
yessenia
9a3ca0ce3b fix(trigger): check subscription removed 2025-10-21 20:01:16 +08:00
zhsama
c90df5c12c refactor(entry-node): remove showIndicator prop and related logic for cleaner component structure 2025-10-21 19:53:29 +08:00
yessenia
f4acc78f66 fix(trigger): deal with empty manualPropertiesSchema 2025-10-21 19:49:42 +08:00
Jyong
a893ee0ffc Feat/add celery prefetch setting (#27218) 2025-10-21 19:40:36 +08:00
Yeuoly
3d5e2c5ca1 feat(trigger): add suspend/timeslice layers and workflow CFS scheduler
- add suspend, timeslice, and trigger post engine layers
- introduce CFS workflow scheduler tasks and supporting entities
- update async workflow, trigger, and webhook services to wire in the new scheduling flow
2025-10-21 19:20:54 +08:00
zhsama
55bf9196dc feat(trigger): add TriggerSchedule to node type checks for workflow execution 2025-10-21 18:57:57 +08:00
yessenia
18a52b4937 fix(trigger): subscription removed in workflow 2025-10-21 18:43:15 +08:00
Joel
439727746c fix: trigger timestamp show place 2025-10-21 18:21:37 +08:00
Joel
04b55177b5 feat: support show global vars 2025-10-21 17:59:37 +08:00
Jyong
82b63cc6e2 add billing enable check (#27213) 2025-10-21 17:49:38 +08:00
GuanMu
c327cfa86e fix(storybook): add required handler props and fix TypeScript errors in component stories (#27187) 2025-10-21 17:44:26 +08:00
zhsama
2793ede875 feat: update checkbox component in the panel and refactor form types for checkbox and boolean 2025-10-21 17:28:25 +08:00
Guangdong Liu
82219c1162 fix: eagerly load EndUser attributes to prevent DetachedInstanceError (#27162)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-10-21 17:12:17 +08:00
yessenia
dc4801c014 refactor(trigger): refactor app mode type to enum 2025-10-21 16:50:18 +08:00
Nite Knite
cfc3f1527a chore: switch support channels according to configuration (#27195)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-21 16:23:49 +08:00
-LAN-
caf1a5fbab Fix variable truncator handling for UpdatedVariable (#27197)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-21 16:23:17 +08:00
-LAN-
4a6398fc1f Fix: surface workflow container LLM usage (#27021) 2025-10-21 16:05:26 +08:00
feelshana
2bcf96565a Feature: during account initialization, set the interface language to be consistent with the display language (#27029) (#27042)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-21 15:53:12 +08:00
Nite Knite
9a9d6a4a2b chore: update support channels (#27188)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-21 15:48:02 +08:00
Jyong
05f66fcf0d remove built-in pipeline template user field (#27184)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-21 15:30:58 +08:00
yessenia
d5e2649608 fix(trigger): disable some options when no start node 2025-10-21 15:25:36 +08:00
Guangdong Liu
ea8245a91b fix: handle exceptions during loop break condition evaluation (#26961)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Novice <novice12185727@gmail.com>
2025-10-21 15:25:01 +08:00
-LAN-
759a932bb7 Fix: release WorkflowTool database sessions promptly (#26893) 2025-10-21 15:17:17 +08:00
Joel
4102f0bc9d feat: vars to new place 2025-10-21 15:03:16 +08:00
Joel
25e4203cb1 main 2025-10-21 14:44:24 +08:00
Joel
e1a3ead941 main 2025-10-21 14:42:27 +08:00
zhsama
6251090893 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-21 14:39:10 +08:00
zhsama
aa5e04b70e Merge branch 'main' into feat/trigger
# Conflicts:
#	web/app/components/workflow/hooks-store/store.ts
#	web/package.json
#	web/pnpm-lock.yaml
2025-10-21 14:36:07 +08:00
Harry
8ac25c29ee feat(trigger): implement subscription refresh logic with enhanced error handling and logging 2025-10-21 14:11:27 +08:00
Harry
f4517d667b feat(trigger): enhance trigger provider refresh task with locking mechanism and due filter logic 2025-10-21 14:11:27 +08:00
Yeuoly
dc2481c805 feat: docs 2025-10-21 13:56:31 +08:00
Joel
fb6f05c267 fix: infinite jump to login url (#27178) 2025-10-21 13:25:20 +08:00
Yunlu Wen
ff9b74efeb fix: remove login status api (#27177)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-21 13:24:57 +08:00
Yeuoly
8d7435a51b docs: introduce agent skill for trigger 2025-10-21 12:23:19 +08:00
lyzno1
bb28c718df fix: correct webhook trigger node id parsing 2025-10-21 11:48:50 +08:00
Joel
d6e7543ba6 fix: passport outdate caused webapp reload (#27175)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-21 11:47:28 +08:00
lyzno1
1b7a5b6209 fix: immer breaking change 2025-10-21 11:42:31 +08:00
Eric Guo
e45d5700ec Fix vs code and using min version after bump @remixicon/react and @monaco-editor/loader (#27008) 2025-10-21 11:41:44 +08:00
lyzno1
448622b4fd fix: pnpm lock file 2025-10-21 11:39:39 +08:00
crazywoola
e9dda03e8d fix: immer version and ref in code base (#27130) 2025-10-21 11:38:44 +08:00
lyzno1
8d3d177932 fix: pnpm lock file 2025-10-21 11:35:41 +08:00
lyzno1
f0af4d692a fix: breaking change 2025-10-21 11:32:20 +08:00
-LAN-
4e6682bd85 Add workflow graph validation checks (#27106)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-21 11:27:12 +08:00
Asuka Minato
32c715c4d0 rm type ignore (#25715)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
2025-10-21 11:26:58 +08:00
lyzno1
075173e67d fix(workflow): reset onboarding auto-open flag across flows 2025-10-21 11:19:36 +08:00
Yeuoly
f02d575379 Merge branch 'main' into feat/trigger 2025-10-21 11:09:26 +08:00
yessenia
735ebf6c59 fix(trigger): oauth client params 2025-10-21 09:27:10 +08:00
Harry
96f0b7abe3 fix(trigger): handle missing 'inputs' key in trigger data retrieval 2025-10-20 21:47:49 +08:00
zhsama
eb1686f04b Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-20 20:27:02 +08:00
zhsama
d4b5d9a02a feat(trigger): add trigger validation logic and utility functions for improved checklist integration 2025-10-20 20:26:40 +08:00
Joel
c11cdf7468 fix: infinite reload (#27150) 2025-10-20 21:18:26 +09:00
Harry
f87f77ce7b feat(trigger): add configuration for trigger provider refresh task 2025-10-20 20:02:12 +08:00
Harry
24619e74f6 fix(trigger): update error handling and credential expiration field 2025-10-20 20:02:12 +08:00
GuanMu
6217c96576 Fix type error (#27152)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-20 19:35:46 +08:00
zhsama
f5c1646f79 fix(dynamic-options): fix the dynamic options in plugin trigger 2025-10-20 19:34:41 +08:00
zhsama
e26d77e78c fix(checklist): enhance type safety by refining BlockEnum usage in checklist components 2025-10-20 19:34:41 +08:00
yessenia
8e1e81732a fix(trigger): formitem boolean layout 2025-10-20 19:27:21 +08:00
Bohan Feng
977690590e fix: parameter extractor instructions placeholder not replaced (#26235) (#27135) 2025-10-20 19:39:20 +09:00
非法操作
fd845c8b6c chore: add more stories (#27142)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-20 18:30:52 +08:00
yessenia
801f8c1592 fix(trigger): oauth client default values 2025-10-20 18:21:38 +08:00
Joel
d7d9abb007 chore: use new api to check login status (#27143)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-20 16:54:31 +08:00
Harry
fd4b234171 feat: improve oauth client info api 2025-10-20 16:50:03 +08:00
Harry
dff536ab6d feat(trigger): trigger plugin protocol improvements 2025-10-20 16:50:03 +08:00
hjlarry
a152ce45d3 fix: start/stop button on the node control not work 2025-10-20 16:43:16 +08:00
Yeuoly
6a164f8811 refactor: use EnumText 2025-10-20 15:48:11 +08:00
github-actions[bot]
9f22b2726b chore: translate i18n files and update type definitions (#27141)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-20 15:45:06 +08:00
Yeuoly
a03ff39f3e chore: add to .env.example 2025-10-20 15:42:29 +08:00
-LAN-
f28b519556 Allow custom app headers in CORS configuration (#27133)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-20 15:39:07 +08:00
Yeuoly
a6373e357a fix: typing 2025-10-20 15:38:54 +08:00
croatialu
762cf91133 feat(web): Add parameter rendering to MCP tool item component (#27099) 2025-10-20 15:37:30 +08:00
GuanMu
9dd3dcff2b Fix type error 5 (#27139)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-20 15:35:13 +08:00
Yeuoly
538b639bef fix: unify trigger url generation 2025-10-20 15:34:51 +08:00
yessenia
fe0457b257 fix(trigger): show text 2025-10-20 14:17:52 +08:00
Guangdong Liu
34fbcc9457 fix: ensure document re-querying in indexing process for consistency (#27077) 2025-10-20 14:12:39 +08:00
yangzheli
9cc8ac981b fix(web): improve UI consistency and remove related unused icons (#27004) 2025-10-20 14:03:16 +08:00
zhsama
d5b228f234 fix(end-node): adjust required status and update end node terminology to output in i18n 2025-10-20 14:00:14 +08:00
zyssyz123
1153dcef69 fix: delete migrate sync data script (#27061) 2025-10-20 14:54:24 +09:00
white-loub
f811471b18 fix: support structured output in streaming mode for LLM node (#27089) 2025-10-20 14:53:25 +09:00
hj24
2382229c7d fix variable-truncator max size comments (#27129) 2025-10-20 14:52:40 +09:00
crazywoola
f0e739be43 fix: immer version and ref in code base (#27130) 2025-10-20 14:49:26 +09:00
lyzno1
1c2f95eeb6 fix(migrations): chain messages.app_mode upgrade after plugin trigger 2025-10-20 13:40:37 +08:00
-LAN-
4dccdf9478 Ensure suggested questions parser returns typed sequence (#27104)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-20 13:01:09 +08:00
GuanMu
4c37d650d3 fix: update attribute types to allow undefined values in icon utilities (#27121) 2025-10-20 12:55:37 +08:00
Guangdong Liu
1b334e6966 fix: handle None values in dataset and document deletion logic (#27083) 2025-10-20 12:52:48 +08:00
crazywoola
d463bd6323 Revert "chore(deps): bump immer from 9.0.21 to 10.1.3 in /web" (#27119) 2025-10-20 11:28:45 +08:00
GuanMu
8c298b33cd Fix frontend type error (#27116) 2025-10-20 11:27:18 +08:00
非法操作
dc1a380888 chore: improve storybook (#27111) 2025-10-20 10:17:17 +08:00
dependabot[bot]
7e9be4d3d9 chore(deps): bump immer from 9.0.21 to 10.1.3 in /web (#27113)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-20 10:16:35 +08:00
dependabot[bot]
5579521ffc chore(deps-dev): bump cross-env from 7.0.3 to 10.1.0 in /web (#27112)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-20 10:12:30 +08:00
dependabot[bot]
ab1059134d chore(deps): bump pydantic-settings from 2.9.1 to 2.11.0 in /api (#27114)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-20 10:12:16 +08:00
dependabot[bot]
fe2ac66a52 chore(deps): bump html-to-image from 1.11.11 to 1.11.13 in /web (#27109)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-20 09:37:10 +08:00
dependabot[bot]
f87db2652b chore(deps): bump @lexical/selection from 0.36.2 to 0.37.0 in /web (#27108)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-20 09:37:02 +08:00
-LAN-
3f9f02b9e7 docs: mention backend lint gate in AGENTS (#27102)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-20 09:36:41 +08:00
lyzno1
81b3436ec4 fix(trigger): resolve circular import in models 2025-10-20 09:23:11 +08:00
-LAN-
578247ffbc feat(graph_engine): Support pausing workflow graph executions (#26585)
Signed-off-by: -LAN- <laipz8200@outlook.com>
2025-10-19 21:33:41 +08:00
-LAN-
9a5f214623 refactor: replace localStorage with HTTP-only cookies for auth tokens (#24365)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Yunlu Wen <wylswz@163.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: GareArc <chen4851@purdue.edu>
Co-authored-by: NFish <douxc512@gmail.com>
Co-authored-by: Davide Delbianco <davide.delbianco@outlook.com>
Co-authored-by: minglu7 <1347866672@qq.com>
Co-authored-by: Ponder <ruan.lj@foxmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: heyszt <270985384@qq.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: Guangdong Liu <liugddx@gmail.com>
Co-authored-by: Eric Guo <eric.guocz@gmail.com>
Co-authored-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: XlKsyt <caixuesen@outlook.com>
Co-authored-by: Dhruv Gorasiya <80987415+DhruvGorasiya@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
Co-authored-by: hj24 <mambahj24@gmail.com>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Tonlo <123lzs123@gmail.com>
Co-authored-by: Yusuke Yamada <yamachu.dev@gmail.com>
Co-authored-by: Novice <novice12185727@gmail.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: Ademílson Tonato <ademilsonft@outlook.com>
Co-authored-by: znn <jubinkumarsoni@gmail.com>
Co-authored-by: yangzheli <43645580+yangzheli@users.noreply.github.com>
2025-10-19 21:29:04 +08:00
QuantumGhost
141ca8904a fix(api): ensure JSON responses are properly serialized in ApiTool (#27097)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-19 18:56:02 +08:00
Asuka Minato
4488c090b2 fluent api (#27093)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-19 12:54:41 +09:00
Bowen Liang
59c1fde351 doc: add Grafana dashboard template link to docs (#27079)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-18 23:24:35 +08:00
GuanMu
cf7ff76165 fix(web): resolve TypeScript type errors in workflow components (#27086) 2025-10-18 23:09:00 +08:00
Yeuoly
3e4f2bcf14 optimize: TriggerDispatchResponse 2025-10-18 20:40:59 +08:00
Yeuoly
c7696964b9 fix: refine 2025-10-18 20:27:22 +08:00
Yeuoly
fb8ecf7b5a refactor: move out enums to specific file 2025-10-18 20:22:21 +08:00
Yeuoly
e3c2345b21 fix: typing 2025-10-18 20:17:23 +08:00
Yeuoly
bfe0d14409 fix: typing 2025-10-18 20:16:10 +08:00
Yeuoly
c7498c3a11 fix: typing 2025-10-18 20:14:00 +08:00
Yeuoly
5fba41688a refactor: cleaning up terrible data 2025-10-18 20:12:20 +08:00
Yeuoly
b63b9c32f7 refactor: models 2025-10-18 20:06:46 +08:00
Yeuoly
65c6203ad7 fix: correct building reference 2025-10-18 19:54:06 +08:00
Yeuoly
3a18337129 refactor: confusing abstract class 2025-10-18 19:47:23 +08:00
Yeuoly
b6b433626e fix: typing 2025-10-18 19:43:00 +08:00
Yeuoly
5d6b9b0cb1 refactor 2025-10-18 19:41:53 +08:00
Yeuoly
6d09330f98 chore: rename PluginTriggerManager to PluginTriggerClient 2025-10-18 19:33:08 +08:00
Yeuoly
5df9afa91a fix: typing 2025-10-18 19:32:08 +08:00
Yeuoly
30a341331f chore: unify request handling 2025-10-18 19:29:00 +08:00
Yeuoly
31cf4b6619 fix: query parameter dose not exist in workflow 2025-10-18 19:19:36 +08:00
Yeuoly
dd0da3218c feat: introduce payload field to plugin trigger processing 2025-10-18 19:15:46 +08:00
Yeuoly
11c9219848 chore: better exception handling 2025-10-18 19:15:09 +08:00
Yeuoly
b1ffd2ef2b refine: use enum reference to avoid plain text declarations 2025-10-18 19:14:24 +08:00
Yeuoly
86cf7952fb refactor: add typing annotation 2025-10-18 19:13:07 +08:00
Yeuoly
d790d2b6bc feat: introduce payload field to TriggerDispatchResponse and a better typing 2025-10-18 19:12:43 +08:00
Yeuoly
a711a8e759 refactor: better typing 2025-10-18 19:11:50 +08:00
Yeuoly
8a18b6e13b refactor webhook service enduser operations 2025-10-18 19:11:15 +08:00
Yeuoly
95aeb61d7c fix: missing backwards invocation 2025-10-18 19:10:22 +08:00
Yeuoly
e8b0144cf7 refactor: move common end user operations out of wraps.py into EndUserService 2025-10-18 19:09:55 +08:00
yessenia
2c8c1860ca fix(trigger): show event output 2025-10-18 16:28:26 +08:00
Yeuoly
5edfbd5305 fix: meaningless error messages 2025-10-18 16:27:12 +08:00
lyzno1
4ceae655bd fix: prevent selecting time text in picker 2025-10-18 15:50:15 +08:00
Jacky Su
ac79691d69 Feat/add status filter to workflow runs (#26850)
Co-authored-by: Jacky Su <jacky_su@trendmicro.com>
2025-10-18 12:15:29 +08:00
GuanMu
1a37989769 Fix type-check error (#27051) 2025-10-18 12:03:40 +08:00
Amy
830f891a74 Fix json in md when using question classifier node (#26992)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-18 11:58:40 +08:00
Eric Guo
5937a66e22 Sync same logic for datasets. (#27056) 2025-10-18 11:49:20 +08:00
wangxiaolei
894e38f713 fix: https://github.com/langgenius/dify/issues/27063 (#27074) 2025-10-18 11:47:04 +08:00
Guangdong Liu
e4b5b0e5fd feat: implement strict type validation for remote file uploads (#27010) 2025-10-18 11:44:11 +08:00
Guangdong Liu
598dd1f816 fix: allow optional config parameter and conditionally include message file ID (#26960) 2025-10-18 11:43:24 +08:00
Yongtao Huang
35e24d4d14 Chore: remove redundant tenant lookup in APIBasedExtensionAPI.post (#27067)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
2025-10-18 09:54:52 +08:00
lyzno1
6ae76d108b feat: add cursor pointer to marketplace actions 2025-10-17 21:31:40 +08:00
lyzno1
9cc3cfb63e fix: hide footer from all start block when search not found 2025-10-17 21:28:57 +08:00
lyzno1
58e4c0793a feat: align tool selector empty state with start blocks 2025-10-17 21:25:28 +08:00
Harry
80f2c1be67 fix(trigger): enhance error handling and refactor end user creation in trigger workflows
- Improved error handling in `TriggerSubscriptionListApi` to return a 404 response for ValueErrors.
- Refactored end user creation logic in `service_api/wraps.py` to use `get_or_create_end_user` for better clarity and consistency.
- Introduced a new method `create_end_user_batch` for batch creation of end users, optimizing database interactions.
- Updated various trigger-related services to utilize the new end user handling, ensuring proper user context during trigger dispatching.
2025-10-17 21:00:57 +08:00
lyzno1
8a5174d078 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-17 19:21:15 +08:00
zhsama
d0f357a690 feat(workflow): enhance listening functionality with multiple trigger node support 2025-10-17 19:09:55 +08:00
zhsama
fbe3df5658 fix(plugin-detail-panel): update provider reference to use trigger identity name 2025-10-17 18:23:35 +08:00
yessenia
21e3ef91eb fix(trigger): show event detail 2025-10-17 18:23:04 +08:00
zhsama
3f116dc74b feat(variable-inspect): improve listening description resolution in Listening component 2025-10-17 18:11:26 +08:00
hjlarry
32731c4622 render other input types in autoCommonParametersSchema 2025-10-17 18:09:14 +08:00
GuanMu
fea2ffb3ba fix: improve URL validation logic in validateRedirectUrl function (#27058) 2025-10-17 17:46:28 +08:00
zhsama
3c1f0e1aec fix(trigger): fix authentication status check 2025-10-17 17:13:07 +08:00
Joel
685e48636d fix: if tag show global vars problem 2025-10-17 16:57:42 +08:00
Joel
7c4edaa636 fix: variableValid in prompt editor 2025-10-17 16:48:27 +08:00
Joel
35867707d0 fix: global var type render in node 2025-10-17 15:24:49 +08:00
Wu Tianwei
64f55d55a1 fix: update TopK and Score Threshold components to use InputNumber and improve value handling (#27045) 2025-10-17 14:58:30 +08:00
2h0ng
bfda4ce7e6 Merge commit from fork 2025-10-17 14:58:15 +08:00
zhsama
5b884d750f feat(trigger): add run all triggers test-run and implement TriggerType enum 2025-10-17 14:56:05 +08:00
Harry
bc0d5f4e41 fix(trigger): enhance subscription retrieval error handling in TriggerService
- Added exception handling for `get_subscription_by_endpoint` to return a 404 response when the plugin is not found and a 500 response for other errors.
- Improved overall robustness of the subscription retrieval process.
2025-10-17 14:43:43 +08:00
Harry
f20452622a fix(trigger): improve event retrieval handling in PluginTriggerProviderController
- Updated the `get_event` method to return `None` instead of raising a ValueError when an event is not found, enhancing error handling.
- Adjusted the `get_event_parameters` method to handle cases where the event may be `None`, returning an empty dictionary instead of causing an error.
- Improved type hinting for better clarity and type safety.
2025-10-17 14:43:43 +08:00
GuanMu
4f7cb7cd2a Fix type error (#27044) 2025-10-17 14:42:58 +08:00
Joel
6ba26cf7b5 fix: global var show in node 2025-10-17 14:39:30 +08:00
NeatGuyCoding
6517323add Feature: add test containers based tests for mail register tasks (#27040)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-10-17 14:29:56 +08:00
Joel
7510e0654b fix: show global vars in picker 2025-10-17 14:24:20 +08:00
NFish
531a0b755a fix: show 'Invalid email or password' error tip when web app login failed (#27034) 2025-10-17 14:03:34 +08:00
Joel
564bb22d8b feat: system var icon 2025-10-17 13:57:26 +08:00
lyzno1
5e2d5f0d83 feat: allow trigger schedule TimePicker to stretch with panel 2025-10-17 13:52:26 +08:00
hjlarry
d90ffbcf14 rm unused ensureWebhookRawVariable 2025-10-17 13:49:33 +08:00
Joel
91bb8ae4d2 fix: happy-dom security issues (#27037) 2025-10-17 13:42:56 +08:00
hjlarry
771cc72dcf fix auto generate webhook url 2025-10-17 13:41:03 +08:00
-LAN-
04c91111e9 fix(trigger): trigger node is marked as 'branch' type 2025-10-17 13:37:46 +08:00
yessenia
5a13daefdb fix(trigger): close portal after select a subscription 2025-10-17 13:31:00 +08:00
lyzno1
c033c05ec1 fix: resolve trigger plugin icons in workflow checklist 2025-10-17 12:55:41 +08:00
hjlarry
5b2f323a87 improve webhook request headers 2025-10-17 11:27:48 +08:00
Joel
b855d95430 feat: can choose global vars 2025-10-17 11:02:27 +08:00
yessenia
fe4b63210e fix(trigger): oauth client config 2025-10-17 10:52:42 +08:00
GuanMu
8cafc20098 Fix type error (#27024) 2025-10-17 10:46:43 +08:00
Joel
84c09ec59d chore: user input output vars show 2025-10-17 10:21:11 +08:00
hjlarry
40e17ef801 fix merge main cause current_user not defined 2025-10-17 09:49:09 +08:00
-LAN-
9d5300440c Restore coverage for skipped workflow tests (#27018)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-17 09:11:48 +08:00
Guangdong Liu
58524d6d2b fix: remove unnecessary properties from condition draft (#27009)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-17 09:11:03 +08:00
Asuka Minato
19cc6ea993 fix 27003 (#27005)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-17 09:10:16 +08:00
quicksand
d7f0a31e24 Fix: User Context Loss When Invoking Workflow Tool Node in Knowledge … (#26495) 2025-10-17 09:09:45 +08:00
Yongtao Huang
312974aa20 Chore: remove unused class-level variables in DatasourceManager (#27011)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-17 09:07:28 +08:00
Dhruv Gorasiya
d19c100166 fix: logical error in Weaviate distance calculation (#27019)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-17 09:06:50 +08:00
Dhruv Gorasiya
a8ad80c405 Fixed Weaviate no module found issue (issue #26938) (#26964)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-16 22:41:48 +08:00
GuanMu
650e38e17f refactor: improve TypeScript types for NodeCardProps and debug configuration context (#27001) 2025-10-16 22:16:01 +08:00
-LAN-
24612adf2c Fix dispatcher idle hang and add pytest timeouts (#26998) 2025-10-16 22:15:03 +08:00
yessenia
f1fcb92691 feat(trigger): add category trigger 2025-10-16 18:30:54 +08:00
lyzno1
3865555113 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-16 18:30:33 +08:00
lyzno1
95e46806a4 fix: marketplace item install hover 2025-10-16 18:01:00 +08:00
lyzno1
c9c3d03878 fix: keep start tab search results restorable 2025-10-16 17:56:32 +08:00
lyzno1
b28ec4be6e fix: start block ui 2025-10-16 17:48:24 +08:00
lyzno1
29d7023fae - Update all-tools.tsx so provider search results keep only relevant items: full list retained when the provider matches; otherwise the provider is cloned with just matching tools.
- Mirror the same filtering strategy for Start-tab trigger plugins in trigger-plugin/list.tsx, ensuring only matching events render when searching.
2025-10-16 17:46:44 +08:00
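A minimal sketch of the filtering rule this commit describes, assuming simplified provider/tool shapes; the names below are illustrative, not the actual all-tools.tsx code:

```ts
// Hypothetical shapes; the real all-tools.tsx types carry more fields.
type Tool = { name: string }
type Provider = { name: string; tools: Tool[] }

// Keep a provider whole when its own name matches the query; otherwise
// clone it with only the tools whose names match, dropping the provider
// entirely when nothing matches.
function filterProviders(providers: Provider[], query: string): Provider[] {
  const q = query.toLowerCase()
  return providers.flatMap((provider) => {
    if (provider.name.toLowerCase().includes(q))
      return [provider] // provider itself matches: retain the full tool list
    const tools = provider.tools.filter(t => t.name.toLowerCase().includes(q))
    return tools.length > 0 ? [{ ...provider, tools }] : []
  })
}
```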
lyzno1
22f6c23780 refactor: remove empty search placeholder from tool selector 2025-10-16 17:39:35 +08:00
hjlarry
548db29a47 add var name check for webhook node 2025-10-16 16:59:46 +08:00
Xiyuan Chen
06649f6c21 Update email templates to improve clarity and consistency in messagin… (#26970) 2025-10-16 01:42:22 -07:00
hjlarry
1089c5bf04 add _webhook_raw to downstream nodes 2025-10-16 16:35:05 +08:00
Yongtao Huang
8b61f5e9c4 Fix: avoid duplicate response_chunk update in convert_stream_simple_response (#26965)
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-16 15:53:07 +08:00
GuanMu
6432898e7a refactor: update TypeScript definitions for custom JSX elements and clean up global declarations in emoji picker (#26985) 2025-10-16 15:51:39 +08:00
hjlarry
559cf6583f fix: adding a candidate webhook node raises an error 2025-10-16 15:33:18 +08:00
yessenia
b04f92715c feat(trigger): plugin category type 2025-10-16 15:30:04 +08:00
Harry
671aba6ab7 fix(trigger): handle missing subscription constructor gracefully in PluginTriggerProviderController
- Updated the logic in `PluginTriggerProviderController` to return an empty list instead of raising a ValueError when the subscription constructor is not found, improving error handling and flow.
2025-10-16 15:09:13 +08:00
Harry
beaeb30dcc fix(trigger): enhance credential encryption handling in TriggerProviderService
- Introduced conditional initialization of credential_encrypter based on credential_type to prevent errors when unauthorized.
- Updated the encryption logic to handle cases where credential_encrypter may be None, ensuring robustness in credential processing.
2025-10-16 15:07:05 +08:00
hjlarry
56abca1f41 webhook i18n 2025-10-16 14:52:15 +08:00
Asuka Minato
cced33d068 use deco to avoid current_user (#26077)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-16 15:45:51 +09:00
zhsama
52d5f219e1 fix(workflow): include trigger node type in available blocks check 2025-10-16 14:24:44 +08:00
Harry
d4516e942c fix(trigger): improve error handling in DraftWorkflowTriggerNodeApi and update input class naming
- Removed specific exception handling for ValueError and PluginInvokeError in `DraftWorkflowTriggerNodeApi`, allowing a more general exception to be raised.
- Renamed `PluginTriggerInput` to `TriggerEventInput` in `TriggerEventNodeData` for better clarity and consistency.
- Updated validation logic in `TriggerEventInput` to ensure correct type checks for input values.
2025-10-16 14:04:44 +08:00
zhsama
1c17a16830 feat(trigger): format event_parameters and improve 2025-10-16 14:00:21 +08:00
Xiyuan Chen
bd01af6415 fix: update load balancing configurations with new credential IDs and… (#26900) 2025-10-15 21:15:26 -07:00
wellCh4n
35011b810d feat: run with params from logs (#26787)
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-10-16 11:01:11 +08:00
Xin Zhang
f295c7532c fix plugin installation permissions when using a local pkg (#26822)
Co-authored-by: zhangx1n <zhangxin@dify.ai>
2025-10-16 10:58:28 +08:00
zyssyz123
7065b67d07 add app mode for message (#26876) 2025-10-16 10:19:49 +08:00
lyzno1
1f6ab13fc5 fix(workflow): auto run single start node without dropdown 2025-10-16 09:37:18 +08:00
lyzno1
7344df87e5 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-15 20:47:20 +08:00
lyzno1
29353bd7c2 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-15 20:47:02 +08:00
yessenia
7b6f5d6860 fix(trigger): show tool credentials in workflow 2025-10-15 20:42:14 +08:00
lyzno1
2ccb20bf3a fix(workflow): gate “publish as tool” on published user input node validity 2025-10-15 20:26:12 +08:00
lyzno1
34b7e5cbca fix: enable scrolling in start selector tab 2025-10-15 19:09:23 +08:00
yessenia
a595e2df06 fix(trigger): skip validation when updating properties 2025-10-15 18:44:05 +08:00
zhsama
729e0e9b1e feat(workflow): add disableVariableInsertion prop to form input and trigger components 2025-10-15 18:20:13 +08:00
zhsama
c03b790888 feat(trigger): add event_parameters to PluginTriggerNode configuration 2025-10-15 18:14:43 +08:00
zhsama
112b5f63dd feat(workflow): enhance single run handling 2025-10-15 18:14:33 +08:00
Harry
334e5f19bf fix(trigger): handle missing subscription constructor in trigger subscription builder
- Updated the `TriggerSubscriptionBuilderService` to return an empty dictionary when the subscription constructor is not available, improving robustness in subscription handling.
2025-10-15 17:44:51 +08:00
Harry
35bbf67175 refactor(trigger): Rename and replace PluginTriggerNode with TriggerEventNode
- Updated references from `PluginTriggerNode` to `TriggerEventNode` across multiple files to reflect the new naming convention.
- Modified `PluginTriggerNodeData` to `TriggerEventNodeData`, including changes to event parameters for better clarity and consistency in data handling.
- Removed the deprecated `trigger_plugin_node.py` file as part of the refactor.
2025-10-15 17:30:42 +08:00
yessenia
9aec255ee9 feat(trigger): update subscription list after saving draft 2025-10-15 17:22:14 +08:00
Harry
b07e80e6ae fix(trigger): update error type for event handling in trigger manager
- Changed the error type check from "TriggerIgnoreEventError" to "EventIgnoreError" in the `TriggerManager` class to improve clarity in error handling during trigger invocations.
2025-10-15 17:14:44 +08:00
Harry
ad2b910d73 refactor(trigger): Enhance error handling and parameter resolution in trigger workflows
- Improved error handling in `DraftWorkflowTriggerRunApi`, `DraftWorkflowTriggerNodeApi`, and `DraftWorkflowTriggerRunAllApi` to raise exceptions directly, providing clearer error messages.
- Introduced `get_event_parameters` method in `PluginTriggerProviderController` to retrieve event parameters for triggers.
- Updated `PluginTriggerNodeData` to include a new method for resolving parameters based on event schemas, ensuring better validation and handling of trigger inputs.
- Refactored `TriggerService` to utilize the new parameter resolution method, enhancing the clarity and reliability of trigger invocations.
2025-10-15 17:05:51 +08:00
GuanMu
c0b50ef61d chore: remove unused icon components and related features from the co… (#26933) 2025-10-15 16:48:02 +08:00
-LAN-
1d8cca4fa2 Fix: check external commands after node completion (#26891)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-15 16:47:43 +08:00
Wu Tianwei
3474c179e6 fix: enhance dataset menu and add service API translations (#26931) 2025-10-15 16:46:46 +08:00
GuanMu
433dad7e1a chore: add type-check script to package.json for TypeScript validation (#26929) 2025-10-15 16:37:46 +08:00
github-actions[bot]
be7ee380bc chore: translate i18n files and update type definitions (#26916)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-15 16:36:39 +08:00
yessenia
f28a7218cd fix(trigger): optimize subscription entry in workflow 2025-10-15 16:13:00 +08:00
lyzno1
4164e1191e fix: hide checklist navigation for missing nodes 2025-10-15 16:10:34 +08:00
Harry
bd31c6f90b refactor(trigger): Reinstate DraftWorkflowTriggerNodeApi with improved structure
- Restored the `DraftWorkflowTriggerNodeApi` class to handle polling for trigger events in draft workflows.
- Enhanced the implementation to utilize `TriggerDebugEvent` and `TriggerDebugEventPoller` for better event management.
- Improved error handling and response structure for node execution, ensuring clarity in API responses.
- Updated API documentation to reflect the restored functionality and parameters.
2025-10-15 14:45:00 +08:00
Harry
8f7bef9509 fix(trigger): Update API routes for draft workflow trigger
- Changed the endpoint for triggering draft workflows from `/trigger/plugin/run` to `/trigger/run` in both backend and frontend to ensure consistency and clarity in the API structure.
- Adjusted the URL construction in the `useWorkflowRun` hook to reflect the updated route.
2025-10-15 14:44:00 +08:00
Harry
06c91fbcbd refactor(trigger): Unify the Trigger Debug interface and event handling and enhance error management
- Updated `DraftWorkflowTriggerNodeApi` to utilize the new `TriggerDebugEvent` and `TriggerDebugEventPoller` for improved event polling.
- Removed deprecated `poll_debug_event` methods from `TriggerService`, `ScheduleService`, and `WebhookService`, consolidating functionality into the new event structure.
- Enhanced error handling in `invoke_trigger_event` to utilize `TriggerPluginInvokeError` for better clarity on invocation issues.
- Updated frontend API routes to reflect changes in trigger event handling, ensuring consistency across the application.
2025-10-15 14:41:53 +08:00
yangzheli
cff5de626b feat(agent): similar to the start node of workflow, agent variables also support drag-and-drop (#26899) 2025-10-15 13:07:51 +08:00
znn
4d8b8f9210 allow editing of hidden inputs in preview (#24370)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-10-15 11:19:53 +08:00
Harry
dab4e521af feat(trigger): enhance trigger event handling and introduce new debug event polling
- Refactored the `DraftWorkflowTriggerNodeApi` and related services to utilize the new `TriggerService` for polling debug events, improving modularity and clarity.
- Added `poll_debug_event` methods in `TriggerService`, `ScheduleService`, and `WebhookService` to streamline event handling for different trigger types.
- Introduced `ScheduleDebugEvent` and updated `PluginTriggerDebugEvent` to include a more structured approach for event data.
- Enhanced the `invoke_trigger_event` method to improve error handling and data validation during trigger invocations.
- Updated frontend API calls to align with the new event structure, removing deprecated parameters for cleaner integration.
2025-10-15 11:04:09 +08:00
Ademílson Tonato
a16ef7e73c refactor: Update Firecrawl to use v2 API (#24734)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-15 10:48:54 +08:00
kenwoodjw
c39dae06d4 fix: workflow token usage (#26723)
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
2025-10-15 10:39:51 +08:00
Novice
f906e70f6b chore: remove redundant dependencies (#26907) 2025-10-15 09:55:39 +08:00
lyzno1
5139119307 chore: bump pnpm version (#26905)
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
2025-10-15 09:55:05 +08:00
lyzno1
b20f61356c Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-15 09:53:03 +08:00
Yusuke Yamada
1b537f904a fix: replace CodeGroup's POST /meta with GET /site (#26886) 2025-10-15 09:43:10 +08:00
-LAN-
556b631c54 Normalize null metadata handling in tool entities (#26890) 2025-10-15 09:42:22 +08:00
NeatGuyCoding
49df9ceaf3 minor fix: test cases for alibabacloud mysql and chinese translations (#26902)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-15 09:41:12 +08:00
Tonlo
92ec1ac27a Fix/remove logo in withoutbrand template (#26882) 2025-10-15 09:40:33 +08:00
-LAN-
e74097afdf Remove unused after_request hooks from console API keys (#26896) 2025-10-15 00:43:11 +08:00
Asuka Minato
8ddc4f2292 example to auto rollback (#26200)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-15 00:42:55 +09:00
yessenia
4ec23eea00 fix: add i18n key 2025-10-14 21:23:24 +08:00
Yeuoly
270fd9cb07 fix: incorrect entity reference 2025-10-14 20:14:13 +08:00
非法操作
7b51320346 fix: when creating a provider credential, set the provider record to valid (#26868)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-14 19:42:48 +08:00
GuanMu
9e39be0770 fix: correct indentation in JSON payloads (#26871) 2025-10-14 19:41:01 +08:00
GuanMu
3e5e87930c feat: add Knip configuration for dead code detection and remove unused icon components (#26758)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 19:06:31 +08:00
yessenia
c7c5e07d43 fix(trigger): add tooltip when only one creation type 2025-10-14 18:39:22 +08:00
zhsama
c1ba83f0d4 feat(trigger): add validation for subscription in PluginTrigger node 2025-10-14 18:13:02 +08:00
zhsama
d71200ee32 feat: enhance block selector and change block components with flow type handling 2025-10-14 16:42:21 +08:00
yessenia
16ac05ebd5 feat: support search in checkbox list 2025-10-14 16:24:44 +08:00
zhsama
ac77b9b735 Merge remote-tracking branch 'origin/feat/trigger' into feat/trigger 2025-10-14 15:28:35 +08:00
zhsama
0fa4b77ff8 feat(style): adjust minimum and maximum width for block-selector and data source components 2025-10-14 15:23:28 +08:00
Harry
6773dda657 feat(trigger): enhance trigger handling with new data validation and logging improvements
- Added validation for `PluginTriggerData` and `ScheduleTriggerData` in the `WorkflowService` to support new trigger types.
- Updated debug event return strings in `PluginTriggerDebugEvent` and `WebhookDebugEvent` for clarity and consistency.
- Enhanced logging in `dispatch_triggered_workflows_async` to include subscription and provider IDs, improving traceability during trigger dispatching.
2025-10-14 14:36:52 +08:00
hj24
15a5ba67f1 fix: use account id in workflow app log filter (#26811)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 14:32:40 +08:00
zhsama
bf42386c5b feat(trigger): add PluginTrigger node support and enhance output variable handling 2025-10-14 11:55:12 +08:00
github-actions[bot]
9e3b4dc90d chore: translate i18n files and update type definitions (#26859)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-10-14 10:43:28 +08:00
Dhruv Gorasiya
48c42a9fba Weaviate update version (#25447)
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-14 10:39:53 +08:00
XlKsyt
0b35bc1ede feat: add Tencent Cloud APM tracing integration (#25657)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 10:21:17 +08:00
Davide Delbianco
8e01bb40fe fix: Do not show the toggle button for chat input when all input hidden (#26826)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 10:15:06 +08:00
Guangdong Liu
9d21772820 fix: Validate transfer method in file mapping and improve file input handling (#26848)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 10:10:31 +08:00
NeatGuyCoding
b745839bdb Feature add test containers mail owner transfer task (#26854)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-14 10:01:47 +08:00
Eric Guo
59ad6e02ce Add timeout so any plugin daemon call (including the SSE path) that legitimately takes longer than 5s works correctly. (#26852) 2025-10-14 09:23:27 +08:00
Guangdong Liu
a3b33cbe28 refactor: streamline database session usage in batch_create_segment_to_index_task (#26795)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-14 09:22:48 +08:00
Davide Delbianco
7b8540281a fix: Chat Opener visibility flickering (#26836) 2025-10-14 09:21:00 +08:00
Asuka Minato
0a6b78f883 Use hook to get userid (#26839)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-14 09:20:37 +08:00
heyszt
56ee8f7d64 fix: files/support-type JSON serialization error (#26842) 2025-10-14 09:20:19 +08:00
Harry
90fc06a494 refactor(trigger): update TriggerApiEntity description type to TypeWithI18N
- Changed the description field type in `TriggerApiEntity` from `TriggerDescription` to `TypeWithI18N` for improved internationalization support.
- Adjusted the usage of the description field in the `convertToTriggerWithProvider` function to align with the new type definition.
2025-10-13 22:24:12 +08:00
Harry
8dfe693529 refactor(trigger): rename TriggerApiEntity to EventApiEntity and update related references
- Changed `TriggerApiEntity` to `EventApiEntity` in the trigger provider and subscription models to better reflect its purpose.
- Updated the description field type from `EventDescription` to `I18nObject` for improved consistency in event descriptions.
- Adjusted imports and references across multiple files to accommodate the renaming and type changes, ensuring proper functionality in trigger processing.
2025-10-13 21:10:31 +08:00
yessenia
d65d27a6bb fix: creating button style 2025-10-13 20:53:06 +08:00
lyzno1
e6a6bde8e2 feat(i18n): add draft reminder to app overview tooltips 2025-10-13 20:18:54 +08:00
lyzno1
c7d0a7be04 feat(trigger): enable triggers by default after workflow publish 2025-10-13 19:59:39 +08:00
Harry
e0f1b03cf0 fix(trigger): clear subscription_id in trigger plugin processing
- Updated the `AppDslService` to clear the `subscription_id` when processing nodes of type `TRIGGER_PLUGIN`. This change ensures that sensitive subscription data is not retained unnecessarily, enhancing data security during workflow execution.
2025-10-13 18:42:54 +08:00
Harry
902737b262 feat(trigger): enhance subscription decryption in trigger processing
- Added functionality to decrypt subscription credentials and properties within the `dispatch_triggered_workflows_async` method. This ensures that sensitive data is securely handled before processing, improving the overall security of trigger invocations.
2025-10-13 18:10:53 +08:00
Harry
429cd05a0f fix(trigger): serialize subscription model in trigger invocation
- Updated the `PluginTriggerManager` to serialize the `subscription` parameter using `model_dump()` before passing it during trigger invocation. This change ensures that the subscription data is correctly formatted for processing.
2025-10-13 18:07:51 +08:00
Harry
46e7e99c5a feat(trigger): add subscription parameter to trigger invocation methods
- Enhanced `PluginTriggerManager`, `PluginTriggerProviderController`, and `TriggerManager` to accept a `subscription` parameter in their trigger invocation methods.
- Updated `TriggerService` to pass the subscription entity when invoking trigger events, improving the handling of subscription-related data during trigger execution.
2025-10-13 17:47:40 +08:00
Davide Delbianco
3cfcd32876 chore: Fix 25795 (#26823)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-13 17:44:51 +08:00
Davide Delbianco
06dcb55a9d chore: Don't show chat input area scrollbar overflow (#26828) 2025-10-13 17:43:46 +08:00
Ponder
ec6cafd7aa feat: Cache AppQueueManager.is_stopped() to reduce unnecessary Redis … (#26778) 2025-10-13 17:41:16 +08:00
Davide Delbianco
6e9858960d chore: Fix chat-input-area resize (#26824) 2025-10-13 17:36:15 +08:00
minglu7
150a8276b9 fix: avoid closing shared session during embeddings (#26830) 2025-10-13 17:36:00 +08:00
Davide Delbianco
c6a90d4bb3 fix: Don't hide chat streaming loader on '\n' content (#26829) 2025-10-13 17:31:52 +08:00
Davide Delbianco
c71fd7113c chore: Correct padding in embedded chatbot (#26832) 2025-10-13 17:29:47 +08:00
lyzno1
d19ce15f3d Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-13 17:28:47 +08:00
lyzno1
49af7eb370 fix(trigger-schedule): make timezone field optional to match actual usage 2025-10-13 17:28:40 +08:00
lyzno1
8e235dc92c feat(workflow): hide timezone in node next execution, keep in panel next 5 executions 2025-10-13 17:28:40 +08:00
lyzno1
3b3963b055 refactor(workflow): remove timezone required validation as it is auto-filled by use-config 2025-10-13 17:28:40 +08:00
lyzno1
378c2afcd3 fix(workflow): remove hardcoded UTC timezone from new schedule node to use user timezone 2025-10-13 17:28:40 +08:00
lyzno1
d709f20e1f fix(workflow): update community feedback link to plugin request template 2025-10-13 17:28:40 +08:00
lyzno1
99d9657af8 feat(workflow): integrate timezone display into execution time format for better readability 2025-10-13 17:28:40 +08:00
lyzno1
62efdd7f7a fix(workflow): preserve saved timezone in trigger-schedule to match backend fixed-timezone design 2025-10-13 17:28:39 +08:00
lyzno1
ebcf98c137 revert(workflow): remove timezone label from trigger-schedule node display 2025-10-13 17:28:39 +08:00
lyzno1
7560e2427d fix(timezone): support half-hour and 45-minute timezone offsets
Critical regression fix for convertTimezoneToOffsetStr:

Issues Fixed:
- Previous regex /^([+-]?\d{1,2}):00/ only matched :00 offsets
- This caused half-hour offsets (e.g., India +05:30) to return UTC+0
- Even if matched, parseInt only parsed hours, losing minute info

Changes:
- Update regex to /^([+-]?\d{1,2}):(\d{2})/ to match all offset formats
- Parse both hours and minutes separately
- Output format: "UTC+5:30" for non-zero minutes, "UTC+8" for whole hours
- Preserve leading zeros in minute part (e.g., "UTC+5:30" not "UTC+5:3")

Test Coverage:
- Added 8 comprehensive tests covering:
  * Default/invalid timezone handling
  * Whole hour offsets (positive/negative)
  * Zero offset (UTC)
  * Half-hour offsets (India +5:30, Australia +9:30)
  * 45-minute offset (Chatham +12:45)
  * Leading zero preservation in minutes

All 14 tests passing. Verified with timezone.json entries at lines 967, 1135, 1251.
2025-10-13 17:28:39 +08:00
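The helper itself lives in the web frontend (TypeScript); the following is a Python rendering of the fixed logic for illustration only, assuming inputs are offset strings such as "+05:30" taken from the timezone table:

```python
import re

_OFFSET_RE = re.compile(r"^([+-]?\d{1,2}):(\d{2})")  # matches :30 and :45, not just :00


def convert_offset_to_label(offset: str) -> str:
    match = _OFFSET_RE.match(offset)
    if not match:
        return "UTC+0"  # fallback for invalid input, as in the tests above
    hours = int(match.group(1))   # int() keeps the sign and drops leading zeros
    minutes = match.group(2)      # kept as text so "30" and "45" survive intact
    sign = "+" if hours >= 0 else ""  # negative hours already carry "-"
    if minutes == "00":
        return f"UTC{sign}{hours}"        # whole hours: "UTC+8", "UTC-11"
    return f"UTC{sign}{hours}:{minutes}"  # "UTC+5:30", "UTC+12:45"


assert convert_offset_to_label("+05:30") == "UTC+5:30"   # India
assert convert_offset_to_label("+12:45") == "UTC+12:45"  # Chatham
assert convert_offset_to_label("-11:00") == "UTC-11"
assert convert_offset_to_label("+00:00") == "UTC+0"
```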
lyzno1
920a608e5d fix(trigger-schedule): prevent timezone label truncation in node
- Change layout to ensure timezone label always visible with shrink-0
- Time text can truncate but timezone label stays intact
- Improves readability in constrained node space
2025-10-13 17:28:39 +08:00
lyzno1
4dfb8b988c feat(time-picker): add showTimezone prop with comprehensive tests
- Add showTimezone prop to TimePickerProps for optional inline timezone display
- Integrate TimezoneLabel component into TimePicker when showTimezone=true
- Add 6 comprehensive test cases covering all showTimezone scenarios:
  * Default behavior (no timezone label)
  * Explicit disable with showTimezone=false
  * Enable with showTimezone=true
  * Inline prop correctly passed
  * No display when timezone is missing
  * Correct styling classes applied
- Update trigger-schedule panel to use showTimezone prop
- All 15 tests passing with good coverage
2025-10-13 17:28:39 +08:00
lyzno1
af6dae3498 fix(timezone): fix UTC offset display bug and add timezone labels
- Fixed convertTimezoneToOffsetStr() that only extracted first digit
  * UTC-11 was incorrectly displayed as UTC-1, UTC+10 as UTC+0
  * Now correctly extracts full offset using regex and removes leading zeros
- Created reusable TimezoneLabel component with inline mode support
- Added comprehensive unit tests with 100% coverage
- Integrated timezone labels into 3 locations:
  * Panel time picker (next to time input)
  * Node next execution display
  * Panel next 5 executions list
2025-10-13 17:28:39 +08:00
yessenia
ee21b4d435 feat: support copy to clipboard in input component 2025-10-13 17:21:26 +08:00
zhsama
654adccfbf fix(trigger): implement plugin single run functionality and update node status handling 2025-10-13 17:02:44 +08:00
Harry
b283a2b3d9 feat(trigger): add API endpoint to retrieve trigger plugin icons and enhance workflow response handling
- Introduced `TriggerProviderIconApi` to fetch icons for trigger plugins based on tenant and provider ID.
- Updated `WorkflowResponseConverter` to include trigger plugin icons in the response.
- Implemented `get_trigger_plugin_icon` method in `TriggerManager` for icon retrieval logic.
- Adjusted `Node` class to correctly set provider information for trigger plugins.
- Modified TypeScript types to accommodate new provider ID field in workflow nodes.
2025-10-13 16:50:32 +08:00
NFish
5fc104a992 Fix/web app permission check (#26821) 2025-10-13 16:17:42 +08:00
lyzno1
cce729916a fix(trigger-schedule): pass time string directly to TimePicker to avoid double timezone conversion 2025-10-13 16:00:13 +08:00
yessenia
4f8bf97935 fix: creating modal style 2025-10-13 14:54:24 +08:00
zhsama
ba88c7b25b fix(workflow): handle plugin run mode correctly by setting status 2025-10-13 14:50:12 +08:00
yessenia
0ec5d53e5b fix(trigger): log style 2025-10-13 14:46:08 +08:00
lyzno1
f3b415c095 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-13 13:21:51 +08:00
fenglin
d1de3cfb94 fix: use enum .value strings in retrieval-setting API to fix JSON serialization error (#26785)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-13 13:01:44 +08:00
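The underlying pitfall: `json.dumps` cannot serialize a plain `Enum` member, so handlers must emit `.value` (or use `StrEnum`, whose members are themselves strings). A small self-contained illustration; the member names are assumptions:

```python
import json
from enum import Enum


class RetrievalMethod(Enum):  # member names are illustrative
    SEMANTIC_SEARCH = "semantic_search"
    FULL_TEXT_SEARCH = "full_text_search"


try:
    json.dumps({"method": RetrievalMethod.SEMANTIC_SEARCH})
except TypeError as e:
    print(f"raises: {e}")  # Object of type RetrievalMethod is not JSON serializable

# Returning the .value string serializes cleanly:
print(json.dumps({"method": RetrievalMethod.SEMANTIC_SEARCH.value}))
```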
Harry
6fb657a89e refactor(subscription): enhance subscription count handling in selector view
- Introduced a subscriptionCount variable to improve readability and performance when checking the number of subscriptions.
- Updated the rendering logic to use subscriptionCount, ensuring consistent and clear display of subscription information in the component.
2025-10-13 11:22:25 +08:00
屈定
44d36f2460 fix: external knowledge url check ssrf (#26789)
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-13 11:19:00 +08:00
Wu Tianwei
9088f151d9 fix: invalid data source list in plugin refresh hook (#26813) 2025-10-13 11:17:46 +08:00
Wu Tianwei
c692962650 fix: update tooltip for chunk structure in knowledge base component (#26808) 2025-10-13 10:44:10 +08:00
Wu Tianwei
f0a60a9000 feat: enhance DataSources component with marketplace plugin integration and search filtering (#26810)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-13 10:43:51 +08:00
AsperforMias
2f50f3fd4b refactor: use libs.login current_user in console controllers (#26745)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-13 10:33:33 +08:00
Asuka Minato
24cd7bbc62 fix RetrievalMethod StrEnum (#26768)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-13 10:29:37 +08:00
Guangdong Liu
d299e75e1b refactor: use dynamic max characters for chunking in extractors (#26782) 2025-10-13 10:22:59 +08:00
yangzheli
f86b6658c9 perf(web): split constant files to improve web performance (#26794) 2025-10-13 10:22:34 +08:00
Asuka Minato
0a56d65581 Issue 23579 (#26777)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-13 10:16:12 +08:00
Yuto Yamada
dfc03bac9f Fix typo: reponse to response (#26792) 2025-10-13 10:04:19 +08:00
dependabot[bot]
81e1376e08 chore(deps): bump opik from 1.7.43 to 1.8.72 in /api (#26804)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 10:00:35 +08:00
dependabot[bot]
f50c85d536 chore(deps-dev): bump knip from 5.64.1 to 5.64.3 in /web (#26802)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 10:00:03 +08:00
dependabot[bot]
5830c69694 chore(deps): bump @lexical/utils from 0.36.2 to 0.37.0 in /web (#26801)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-13 09:59:31 +08:00
Harry
90240cb6db refactor(subscription): optimize subscription count handling in list view
- Replaced direct length checks on subscriptions with a computed subscriptionCount variable for improved readability and performance.
- Updated the CreateSubscriptionButton to conditionally render based on the new subscriptionCount variable, enhancing clarity in the component logic.
- Adjusted className logic for the button to account for multiple supported methods, ensuring better user experience.
2025-10-12 23:56:27 +08:00
Harry
cca48f07aa feat(trigger): implement atomic update and verification for subscription builders
- Introduced atomic operations for updating and verifying subscription builders to prevent race conditions.
- Added distributed locking mechanism to ensure data consistency during concurrent updates and builds.
- Refactored existing methods to utilize the new atomic update and verification logic, enhancing the reliability of trigger subscription handling.
2025-10-12 21:27:38 +08:00
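A sketch of the locking pattern described, using redis-py's built-in lock; the key format borrows the `trigger:subscription:builder:` prefix seen in later commits, and everything else is an assumption rather than the actual service code:

```python
from contextlib import contextmanager

import redis

r = redis.Redis()  # assumes a reachable Redis instance


@contextmanager
def subscription_builder_lock(subscription_id: str, timeout: int = 10):
    # The lock auto-expires after `timeout` seconds, so a crashed worker
    # cannot hold the builder forever.
    lock = r.lock(f"trigger:subscription:builder:lock:{subscription_id}", timeout=timeout)
    if not lock.acquire(blocking=True, blocking_timeout=5):
        raise TimeoutError("another worker is updating this subscription builder")
    try:
        yield
    finally:
        lock.release()


def update_and_verify(subscription_id: str, changes: dict) -> None:
    with subscription_builder_lock(subscription_id):
        # Read, modify, write, and verify inside one critical section so
        # concurrent builds cannot interleave with the update.
        ...
```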
Harry
beff639c3d fix(trigger): improve trigger subscription query with AppTrigger join
- Updated the trigger subscription query to join with the AppTrigger model, ensuring only enabled app triggers are considered.
- Enhanced the filtering criteria for retrieving subscribers based on the AppTrigger status, improving the accuracy of the trigger subscription handling.
2025-10-12 19:24:54 +08:00
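In SQLAlchemy 2.0 style the described query is roughly the following; the models are compact stand-ins for the real `TriggerSubscription`/`AppTrigger` tables, with assumed column names:

```python
from sqlalchemy import Boolean, ForeignKey, String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class TriggerSubscription(Base):
    __tablename__ = "trigger_subscriptions"
    id: Mapped[str] = mapped_column(String, primary_key=True)


class AppTrigger(Base):
    __tablename__ = "app_triggers"
    id: Mapped[str] = mapped_column(String, primary_key=True)
    subscription_id: Mapped[str] = mapped_column(ForeignKey("trigger_subscriptions.id"))
    enabled: Mapped[bool] = mapped_column(Boolean, default=False)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    # The join restricts results to subscriptions backed by an *enabled*
    # app trigger, instead of returning every subscription row.
    stmt = (
        select(TriggerSubscription)
        .join(AppTrigger, AppTrigger.subscription_id == TriggerSubscription.id)
        .where(AppTrigger.enabled.is_(True))
    )
    subscribers = session.scalars(stmt).all()
```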
Harry
00359830c2 refactor(trigger): streamline response handling in trigger subscription dispatch
- Removed the redundant response extraction from the dispatch call and directly assigned the response to a variable for clarity.
- Enhanced logging by appending the request log after dispatching, ensuring better traceability of requests and responses in the trigger subscription workflow.
2025-10-11 22:16:18 +08:00
Harry
f23e098b9a fix(trigger): handle exceptions in trigger subscription dispatch
- Wrapped the dispatch call in a try-except block to catch exceptions and return a 500 error response if an error occurs.
- Enhanced logging of the request and error response for better traceability in the trigger subscription workflow.
2025-10-11 22:13:36 +08:00
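The shape of that guard, sketched as a Flask-style endpoint (the route, `dispatch`, and logging details are placeholders, not the actual service code):

```python
import logging

from flask import Flask, jsonify

logger = logging.getLogger(__name__)
app = Flask(__name__)


def dispatch(subscription_id: str) -> dict:  # placeholder for the real dispatcher
    raise RuntimeError("plugin daemon unreachable")


@app.route("/triggers/<subscription_id>", methods=["POST"])
def trigger_endpoint(subscription_id: str):
    try:
        response = dispatch(subscription_id)
    except Exception:
        # Log the full traceback for traceability, but return a stable 500
        # instead of letting the exception escape unformatted.
        logger.exception("trigger subscription dispatch failed: %s", subscription_id)
        return jsonify({"error": "internal server error"}), 500
    return jsonify(response)
```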
Harry
42f75b6602 feat(trigger): enhance trigger subscription handling with credential support
- Added `credentials` and `credential_type` parameters to various methods in `PluginTriggerManager`, `PluginTriggerProviderController`, and `TriggerManager` to support improved credential management for trigger subscriptions.
- Updated the `Subscription` model to include `parameters` for better subscription data handling.
- Refactored related services to accommodate the new credential handling, ensuring consistency across the trigger workflow.
2025-10-11 21:12:27 +08:00
yessenia
4f65cc312d feat: delete confirm opt 2025-10-11 20:19:27 +08:00
yessenia
854a091f82 feat: add validation status for formitem 2025-10-11 19:50:05 +08:00
zhsama
63dbc7c63d fix(trigger): update provider_id reference to plugin_id in useToolIcon hook 2025-10-11 19:05:57 +08:00
crazywoola
0173496a77 fix: happy-dom version (#26764)
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-10-11 18:59:31 +08:00
zhsama
a4e80640fe chore(trigger): remove debug console logs 2025-10-11 18:54:47 +08:00
zhsama
fe0a139c89 fix(trigger): update provider_id references to plugin_id in BasePanel component 2025-10-11 18:52:15 +08:00
lyzno1
30c5b47699 refactor: simplify InlineDeleteConfirm component structure (#26771) 2025-10-11 18:18:18 +08:00
NeatGuyCoding
e3191d4e91 fix enum and type (#26756)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-11 17:46:44 +08:00
lyzno1
a9b3539b90 feat: migrate Python SDK to httpx with async/await support (#26726)
Signed-off-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-11 17:45:42 +08:00
github-actions[bot]
5217017e69 chore: translate i18n files and update type definitions (#26763)
Co-authored-by: asukaminato0721 <30024051+asukaminato0721@users.noreply.github.com>
2025-10-11 17:23:40 +08:00
zhsama
ac2616545b fix(trigger): update provider_id field in TriggerPluginActionItem component 2025-10-11 17:10:29 +08:00
zhsama
c9e7922a14 refactor(trigger): update trigger-related types and field names / values 2025-10-11 17:06:43 +08:00
lyzno1
bd5df5cf1c feat: add InlineDeleteConfirm base component (#26762) 2025-10-11 17:33:31 +09:00
yessenia
12a7402291 fix: create button not working in manual creation mode 2025-10-11 15:36:37 +08:00
yessenia
33d7b48e49 fix: error when fetching info while switching plugins 2025-10-11 15:00:35 +08:00
Guangdong Liu
456dbfe7d7 feat: add tracking for updated_by and updated_at fields in app models (#26736) 2025-10-11 13:48:57 +08:00
Harry
ee89e9eb2f refactor(trigger): update type parameter naming in PluginTriggerManager
- Changed the parameter name from 'type' to 'type_' in multiple method calls within the PluginTriggerManager class to avoid conflicts with the built-in type function and improve code clarity.
2025-10-11 13:09:25 +08:00
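The motivation, in miniature: a parameter named `type` shadows the builtin inside the function body, so any later `type(...)` call hits the argument instead. An illustrative sketch, not the manager's actual signature:

```python
def invoke(provider: str, type_: str, payload: dict):
    # Had the parameter been named `type`, this check would call the string
    # argument instead of the builtin and fail at runtime.
    if type(payload) is not dict:
        raise TypeError("payload must be a dict")
    print(f"invoking {provider}/{type_}")


invoke("github", "webhook", {"event": "push"})
```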
GuanMu
586f210d6e chore: remove unused dependencies for dagre from package.json and pnpm-lock.yaml (#26755) 2025-10-11 13:01:05 +08:00
Harry
e793f9e871 refactor(trigger): remove unnecessary whitespace in trigger-related files
- Cleaned up the code by removing extraneous whitespace in `trigger.py` and `workflow_plugin_trigger_service.py`, improving readability and maintaining code style consistency.
2025-10-11 12:44:54 +08:00
Maries
275a0f9ddd chore(workflows): update deployment configurations for trigger dev (#26753) 2025-10-11 12:43:09 +08:00
Harry
18b02370a2 chore(workflows): update deployment configurations
- Modified the build-push workflow to trigger on all branches under "deploy/**" for broader deployment coverage.
- Changed the SSH host secret in the deploy-dev workflow from RAG_SSH_HOST to DEV_SSH_HOST for improved clarity.
- Removed the obsolete deploy-rag-dev workflow to streamline the CI/CD process.
2025-10-11 12:26:31 +08:00
Harry
d53399e546 refactor(trigger): rename trigger-related fields and methods for consistency
- Updated the naming convention from 'trigger_name' to 'event_name' across various models and services to align with the new event-driven architecture.
- Refactored methods in PluginTriggerManager and PluginTriggerProviderController to use 'invoke_trigger_event' instead of 'invoke_trigger'.
- Adjusted database migration scripts to reflect changes in the schema, including the addition of 'event_name' and 'subscription_id' fields in the workflow_plugin_triggers table.
- Removed deprecated trigger-related methods in WorkflowPluginTriggerService to streamline the codebase.
2025-10-11 12:26:08 +08:00
carribean
cbf2ba6cec Feature integrate alibabacloud mysql vector (#25994)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-11 10:47:28 +08:00
Asuka Minato
1bd621f819 remove .value (#26633)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-11 09:08:29 +08:00
Asuka Minato
bb6a331490 change all to httpx (#26119)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-10 23:41:16 +08:00
Asuka Minato
3922ad876f part of add type to orm (#26262)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-10 23:40:54 +08:00
jiangbo721
fdb53fdeb1 fix: Set ApiTool’s do_http_request to not retry. (#26721) 2025-10-10 23:39:25 +08:00
GuanMu
3fb5a7bff1 fix: add z-index class to PortalToFollowElemContent for proper layering in dataset extra info component (#26729) 2025-10-10 23:39:13 +08:00
heyszt
6157c67cfe fix: sync aliyun icon SVG files (#26719) 2025-10-10 23:38:45 +08:00
GuanMu
fbc745764a chore: update packageManager version in package.json to pnpm@10.18.2 (#26731) 2025-10-10 23:37:40 +08:00
Arno Ren
78f09801b5 fix: #26668 restore manual tool parameter values (#26733)
Co-authored-by: renzeyu1 <renzeyu1@lixiang.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-10 23:37:10 +08:00
yessenia
622d12137a feat: change subscription field in workflow 2025-10-10 20:58:56 +08:00
lyzno1
bae8e44b32 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-10 19:43:23 +08:00
zhsama
24b5387fd1 fix(workflow): streamline stopping workflow run process 2025-10-10 18:45:43 +08:00
Harry
0c65824cad fix(workflow): update API route for DraftWorkflowTriggerRunApi
- Changed the route from "/apps/<uuid:app_id>/workflows/draft/trigger/run" to "/apps/<uuid:app_id>/workflows/draft/trigger/plugin/run" to reflect the new plugin-based trigger structure.
- Updated corresponding URL in the useWorkflowRun hook to maintain consistency across the application.
2025-10-10 18:13:28 +08:00
Bowen Liang
d0dd81cf84 chore: bump ruff to 0.14 (#26063) 2025-10-10 18:10:23 +08:00
Harry
31c9d9da3f fix(workflow): enhance response structure in DraftWorkflowTriggerRunApi
- Added a "retry_in" field to the response when no event is found, improving the API's feedback during workflow execution.
2025-10-10 18:05:50 +08:00
Harry
8f854e6a45 fix(workflow): add root_node_id to DraftWorkflowTriggerRunApi for improved response handling
- Included root_node_id in the API call to enhance the response structure during workflow execution.
2025-10-10 18:05:50 +08:00
yessenia
75b3f5ac5a feat: change subscription field 2025-10-10 17:37:20 +08:00
zhsama
323e183775 refactor(trigger): improve config value formatting in PluginTriggerNode 2025-10-10 17:28:41 +08:00
znn
65b832c46c pan and zoom during workflow execution (#24254) 2025-10-10 17:07:25 +08:00
znn
a90b60c36f removing horus eye and adding mcp icon (#25323)
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: crazywoola <427733928@qq.com>
2025-10-10 17:00:03 +08:00
诗浓
94a07706ec fix: restore None guards for _environment_variables/_conversation_variables getters (#25633) 2025-10-10 16:32:09 +08:00
Asuka Minato
ab2eacb6c1 use model_validate (#26182)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-10 17:30:13 +09:00
Xiyuan Chen
aead192743 Fix/token exp when exchange main (#26708) 2025-10-10 01:24:36 -07:00
Asuka Minato
c1e8584b97 feat: Refactor api.add_resource to @console_ns.route decorator (#26386)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
2025-10-10 16:23:39 +08:00
Asuka Minato
8a2b208299 Refactor account models to use SQLAlchemy 2.0 dataclass mapping (#26415)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-10 17:12:12 +09:00
znn
2b6882bd97 fix chunks 2 (#26623)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-10 16:01:33 +08:00
Guangdong Liu
aa51662d98 refactor(api): add new endpoints for workspace management and update routing (#26465)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-10 15:59:14 +08:00
github-actions[bot]
3068526797 chore: translate i18n files and update type definitions (#26709)
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
2025-10-10 15:55:24 +08:00
Jyong
298d8c2d88 Update deploy-dev.yml (#26712) 2025-10-10 15:54:33 +08:00
fenglin
294e01a8c1 Fix/tool provider tag internationalization (#26710)
Co-authored-by: qiaofenglin <qiaofenglin@baidu.com>
2025-10-10 15:52:09 +08:00
Harry
380ef52331 refactor(trigger): update API and service to use 'event' terminology
- Renamed 'trigger_name' to 'event_name' in the DraftWorkflowTriggerNodeApi for consistency with the new naming convention.
- Added 'provider_id' to the API request model to enhance functionality.
- Updated the PluginTriggerDebugEvent and TriggerDebugService to reflect changes in naming and improve address formatting.
- Adjusted frontend utility to align with the updated variable names.
2025-10-10 15:48:42 +08:00
Coding On Star
3a5aa4587c feat(billing): add tax information tooltips in pricing footer (#26705)
Co-authored-by: CodingOnStar <hanxujiang@dify.ai>
2025-10-10 15:34:56 +08:00
lyzno1
b8862293b6 fix: resolve semantic conflict in TimePicker notClearable logic 2025-10-10 15:17:19 +08:00
lyzno1
85f1cf1d90 Merge branch 'main' into feat/trigger 2025-10-10 15:16:00 +08:00
lyzno1
1d4e36d58f fix: display correct icon for trigger nodes in listening panel 2025-10-10 15:04:58 +08:00
Harry
90ae5e5865 refactor(trigger): enhance update method to use explicit None checks
- Updated the `update` method in `SubscriptionBuilderUpdater` to use 'is not None' checks instead of truthy evaluations for better handling of empty values.
- This change improves clarity and ensures that empty dictionaries or strings are correctly processed during updates.
2025-10-10 14:52:03 +08:00
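The distinction matters because empty strings and empty dicts are falsy yet still meaningful updates. A minimal sketch of the pattern, with assumed field names:

```python
class SubscriptionBuilderUpdater:  # illustrative skeleton
    def __init__(self):
        self.name = "old-name"
        self.parameters = {"retries": 3}

    def update(self, name: str | None = None, parameters: dict | None = None):
        # `if name:` would silently skip "" and {}; `is not None` applies them.
        if name is not None:
            self.name = name
        if parameters is not None:
            self.parameters = parameters


u = SubscriptionBuilderUpdater()
u.update(name="", parameters={})
assert u.name == "" and u.parameters == {}  # empty values are honored
```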
crazywoola
cf1778e696 fix: issue w/ timepicker (#26696)
Co-authored-by: lyzno1 <yuanyouhuilyz@gmail.com>
Co-authored-by: lyzno1 <92089059+lyzno1@users.noreply.github.com>
2025-10-10 13:17:33 +08:00
yihong
54db4c176a fix: drop useless logic (#26678)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-10-10 12:59:28 +08:00
Novice
5d3e8a31d0 fix: restore array flattening behavior in iteration node (#26695) 2025-10-10 10:54:32 +08:00
zhsama
755fb96a33 feat(trigger): add plugin trigger test-run handling to workflow 2025-10-10 10:43:13 +08:00
Nan LI
885dff82e3 feat: update HTTP timeout configurations and enhance timeout input handling in UI (#26685) 2025-10-10 09:00:06 +08:00
Asuka Minato
3c4aa24198 Refactor: Remove unnecessary casts and tighten type checking (#26625)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-09 22:11:14 +08:00
GuanMu
33b0814323 refactor(types): remove any usages and strengthen typings across web and base (#26677) 2025-10-09 21:36:42 +08:00
Tianyi Jing
45ae511036 fix: add missing toType to toolCredentialToFormSchemas (#26681)
Signed-off-by: jingfelix <jingfelix@outlook.com>
2025-10-09 21:23:15 +08:00
Harry
b8ca480b07 refactor(trigger): update variable names for clarity and consistency
- Renamed variables related to triggers to use 'trigger' terminology consistently across the codebase.
- Adjusted filtering logic in `TriggerPluginList` to reference 'events' instead of 'triggers' for improved clarity.
- Updated the `getTriggerIcon` function to reflect the new naming conventions and ensure proper icon rendering.
2025-10-09 12:23:48 +08:00
Asuka Minato
0fa063c640 Refactor: Remove reportUnnecessaryContains from pyrightconfig.json (#26626)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
2025-10-09 10:22:41 +08:00
Bowen Liang
40d35304ea fix: check allowed file extensions in rag transform pipeline and use set type instead of list for performance in file extensions (#26593) 2025-10-09 10:21:56 +08:00
耐小心
89821d66bb feat: add HTTPX client instrumentation for OpenTelemetry (#26651) 2025-10-09 09:24:47 +08:00
yihong
09d84e900c fix: drop useless logger code (#26650)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
2025-10-09 09:24:10 +08:00
Asuka Minato
a8746bff30 fix oxlint warnings (#26634) 2025-10-09 09:23:34 +08:00
非法操作
c4d8bf0ce9 fix: missing LLM node output var description (#26648) 2025-10-09 09:22:45 +08:00
lyzno1
8a5fbf183b Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-09 09:07:01 +08:00
Harry
91318d3d04 refactor(trigger): rename trigger references to event for consistency
- Updated variable names and types from 'trigger' to 'event' across multiple files to enhance clarity and maintain consistency in the codebase.
- Adjusted related data structures and API responses to reflect the new naming convention.
- Improved type annotations and error handling in the workflow trigger run API and associated services.
2025-10-09 03:12:35 +08:00
非法操作
9cca605bac chore: improve bool input of start node (#26647) 2025-10-08 19:09:03 +08:00
NeatGuyCoding
dbd23f91e5 Feature add test containers mail invite task (#26637)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-10-08 18:40:19 +08:00
Asuka Minato
9387cc088c feat: remove unused python dependency (#26629)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-08 18:38:38 +08:00
Harry
a33d04d1ac refactor(trigger): unify debug event handling and improve polling mechanism
- Introduced a base class for debug events to streamline event handling.
- Refactored `TriggerDebugService` to support multiple event types through a generic dispatch/poll interface.
- Updated webhook and plugin trigger debug services to utilize the new event structure.
- Enhanced the dispatch logic in `dispatch_triggered_workflows_async` to accommodate the new event model.
2025-10-08 17:31:16 +08:00
lyzno1
02222752f0 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-07 18:25:43 +08:00
Asuka Minato
11f7a89e25 refactor: Enable type checking for dataset config manager (#26494)
Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-07 15:50:44 +09:00
Yadong (Adam) Zhang
654d522b31 perf(web): improve app workflow build performance. (#26310) 2025-10-07 14:21:08 +08:00
Ponder
31e6ef77a6 feat: optimize the page jump logic to prevent unnecessary jumps. (#26481)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-10-07 14:20:12 +08:00
dependabot[bot]
e56c847210 chore(deps): bump esdk-obs-python from 3.24.6.1 to 3.25.8 in /api (#26604)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 14:17:56 +08:00
dependabot[bot]
e00172199a chore(deps-dev): bump babel-loader from 9.2.1 to 10.0.0 in /web (#26601)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 14:17:05 +08:00
yihong
04f47836d8 fix: two functions comments doc is not right (#26624)
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-07 14:16:14 +08:00
Charles Liu
faaca822e4 fix bug 26613: get wrong credentials with multiple authorizations plugin (#26615)
Co-authored-by: charles liu <dearcharles.liu@gmail.com>
2025-10-07 12:49:44 +08:00
NeatGuyCoding
dc0f053925 Feature add test containers mail inner task (#26622)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-07 12:48:11 +08:00
NeatGuyCoding
517726da3a Feature add test containers mail change mail task (#26570)
Signed-off-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
2025-10-06 20:25:31 +08:00
Will
1d6c03eddf delete unnecessary db merge (#26588) 2025-10-06 20:24:24 +08:00
dependabot[bot]
fdfccd1205 chore(deps): bump azure-storage-blob from 12.13.0 to 12.26.0 in /api (#26603)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 20:22:53 +08:00
dependabot[bot]
b30e7ced0a chore(deps): bump react-easy-crop from 5.5.0 to 5.5.3 in /web (#26602)
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-06 20:22:32 +08:00
Will
11770439be chore: remove explicit dependency on the fastapi framework (#26609) 2025-10-06 20:21:51 +08:00
lyzno1
04d94e3337 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-06 19:12:16 +08:00
hjlarry
b98c36db48 fix trigger-related API 404 2025-10-06 14:36:07 +08:00
hjlarry
d05d11e67f add webhook node draft single run 2025-10-06 14:35:12 +08:00
Will
d89c5f7146 chore: Avoid directly using OpenAI dependencies (#26590)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-10-06 10:40:38 +08:00
-LAN-
4a475bf1cd chore: Raise default string length limits (#26592)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
2025-10-06 10:40:13 +08:00
Bowen Liang
10be9cfbbf chore: fix basedpyright style warning for opendal.layers imports (#26596) 2025-10-06 10:39:28 +08:00
lyzno1
3370736e09 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-04 11:30:26 +08:00
hjlarry
cc5a315039 fix pyright exception 2025-10-02 20:26:29 +08:00
hjlarry
6ea10cdaaf debug webhook doesn't require publishing the app 2025-10-02 20:07:57 +08:00
lyzno1
9643fa1c9a fix: use StopCircle icon in variable inspect listening panel 2025-10-02 10:02:19 +08:00
lyzno1
937a58d0dd Merge remote-tracking branch 'origin/main' into feat/trigger 2025-10-02 09:18:21 +08:00
hjlarry
d9faa1329a move workflow_plugin_trigger_service to trigger sub dir 2025-10-02 00:31:33 +08:00
hjlarry
fec09e7ed3 move trigger_service to trigger sub dir 2025-10-02 00:29:53 +08:00
hjlarry
31b15b492e move trigger_debug_service to trigger sub dir 2025-10-02 00:27:48 +08:00
hjlarry
f96bd4eb18 move schedule service to trigger sub dir 2025-10-02 00:24:32 +08:00
hjlarry
a4109088c9 move webhook service to trigger sub dir 2025-10-02 00:18:37 +08:00
hjlarry
f827e8e1b7 add more code comment 2025-10-02 00:14:35 +08:00
hjlarry
82f2f76dc4 ruff format code 2025-10-01 23:39:46 +08:00
hjlarry
e6a44a0860 can debug when webhook is disabled 2025-10-01 23:39:37 +08:00
hjlarry
604651873e refactor webhook service 2025-10-01 12:46:42 +08:00
lyzno1
9114881623 fix: update frontend trigger field mapping from triggers to events
- Update TriggerProviderApiEntity type to use events field (aligned with backend commit 32f4d1af8)
- Update conversion function in use-triggers.ts to map provider.events to TriggerWithProvider.triggers
- Fix trigger-events-list.tsx to use providerInfo.events (TriggerProviderApiEntity type)
- Fix parameters-form.tsx to use provider.triggers (TriggerWithProvider type)
2025-10-01 09:53:45 +08:00
hjlarry
080cdda4fa backend support for webhook query params 2025-09-30 21:21:39 +08:00
Harry
32f4d1af8b Refactor: Rename triggers to events in trigger-related entities and services
- Updated class and variable names from 'triggers' to 'events' across multiple files to improve clarity and consistency.
- Adjusted related data structures and methods to reflect the new naming convention, including changes in API entities, service methods, and trigger management logic.
- Ensured all references to triggers are replaced with events to align with the updated terminology.
2025-09-30 20:18:33 +08:00
lyzno1
1bfa8e6662 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-30 18:56:21 +08:00
lyzno1
7c97ea4a9e fix: correct entry node alignment for wrapper offset
- Add ENTRY_NODE_WRAPPER_OFFSET constant (x: 0, y: 21) for Start/Trigger nodes
- Implement getNodeAlignPosition() to calculate actual inner node positions
- Fix horizontal/vertical helpline rendering to account for wrapper offset
- Fix snap-to-align logic to properly align inner nodes instead of wrapper
- Correct helpline width/height calculation by subtracting offset for entry nodes
- Ensure backward compatibility: only affects Start/Trigger nodes with EntryNodeContainer wrapper

This fix ensures that Start and Trigger nodes (which have an EntryNodeContainer wrapper
with status indicator) align based on their inner node boundaries rather than the wrapper
boundaries, matching the alignment behavior of regular nodes.
2025-09-30 18:36:49 +08:00
lyzno1
bea11b08d7 refactor: hide workflow features button in workflow mode, keep it visible in chatflow mode 2025-09-30 17:51:01 +08:00
lyzno1
8547032a87 Revert "refactor: app publisher"
This reverts commit 8feef2c1a9.
2025-09-30 17:46:27 +08:00
hjlarry
43574c852d add variable type to webhook request parameters panel 2025-09-30 16:31:21 +08:00
hjlarry
5ecc006805 add listening status for variable panel 2025-09-30 15:18:07 +08:00
lyzno1
15413108f0 chore: remove unused empty enums.py file 2025-09-30 13:52:33 +08:00
lyzno1
831c888b84 feat: sort output variables by table display order in webhook trigger 2025-09-30 12:34:09 +08:00
lyzno1
f0ed09a8d4 feat: add output variables display to webhook trigger node (#26478)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-30 12:26:42 +08:00
hjlarry
a80f30f9ef add nginx /triggers endpoint 2025-09-30 11:08:14 +08:00
hjlarry
fd2f0df097 use useStore for isListening status 2025-09-30 10:48:38 +08:00
lyzno1
d72a3e1879 fix: translations 2025-09-30 10:01:33 +08:00
lyzno1
4a6903fdb4 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-30 08:00:16 +08:00
zxhlyh
8106df1d7d fix: types 2025-09-29 20:53:50 +08:00
Harry
5e3e6b0bd8 refactor(api): update subscription handling in trigger provider
- Replaced SubscriptionSchema with SubscriptionConstructor in various parts of the trigger provider implementation to streamline subscription management.
- Enhanced the PluginTriggerProviderController to utilize the new subscription constructor for retrieving default properties and credential schemas.
- Removed the deprecated get_provider_subscription_schema method from TriggerManager.
- Updated TriggerSubscriptionBuilderService to reflect changes in subscription handling, ensuring compatibility with the new structure.

These changes improve the clarity and maintainability of the subscription handling within the trigger provider architecture.
2025-09-29 18:28:10 +08:00
Harry
a06d2892f8 fix(plugin): handle optional property in llm_description assignment
- Updated the llm_description assignment in the ToolParameter to safely access the en_US property of paramDescription, ensuring it defaults to an empty string if not present. This change improves the robustness of the parameter handling in the plugin detail panel.
2025-09-29 18:28:10 +08:00
Harry
e377e90666 feat(api): add CHECKBOX parameter type to plugin and tool entities
- Introduced CHECKBOX as a new parameter type in CommonParameterType and PluginParameterType.
- Updated as_normal_type and cast_parameter_value functions to handle CHECKBOX type.
- Enhanced ToolParameter class to include CHECKBOX for consistency across parameter types.

These changes expand the parameter capabilities within the API, allowing for more versatile input options.
2025-09-29 18:28:10 +08:00
Harry
19cc67561b refactor(api): improve error handling in trigger providers
- Removed unnecessary ValueError handling in TriggerSubscriptionBuilderCreateApi and TriggerSubscriptionBuilderBuildApi, allowing for more streamlined exception management.
- Updated TriggerSubscriptionBuilderVerifyApi and TriggerSubscriptionBuilderBuildApi to raise ValueError with the original exception context for better debugging.
- Enhanced trigger_endpoint in trigger.py to log errors and return a JSON response for not found endpoints, improving user feedback and error reporting.

These changes enhance the robustness and clarity of error handling across the API.
2025-09-29 18:28:10 +08:00
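"Raise ValueError with the original exception context" is Python's `raise ... from e` chaining, which keeps the causing traceback attached for debugging. A small self-contained example of the idiom (the payload parsing is illustrative):

```python
import json


def build_subscription(raw: str) -> dict:
    try:
        return json.loads(raw)
    except json.JSONDecodeError as e:
        # Chain the original error so debuggers see both tracebacks.
        raise ValueError(f"invalid subscription payload: {raw!r}") from e


try:
    build_subscription("{not json")
except ValueError as e:
    print(e, "| caused by:", repr(e.__cause__))
```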
hjlarry
92f2ca1866 add listening status in the run panel result 2025-09-29 17:55:53 +08:00
hjlarry
1949074e2f add shortcut for open test run panel 2025-09-29 14:39:44 +08:00
hjlarry
1c0068e95b fix can't stop webhook debug 2025-09-29 13:34:05 +08:00
lyzno1
4b43196295 feat: add specialized trigger icons to workflow logs
- Create TriggerByDisplay component with appropriate colored icons
- Add dedicated Code icon for debugging triggers (blue background)
- Add KnowledgeRetrieval icon for RAG pipeline triggers (green background)
- Use existing webhook, schedule, and plugin icons with proper colors
- Add comprehensive i18n translations for Chinese, Japanese, and English
- Integrate icon display into workflow logs table
- Follow project color standards from block-icon component
2025-09-29 12:53:35 +08:00
lyzno1
2c3cf9a25e Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-29 12:13:39 +08:00
lyzno1
67fbfc0b8f fix: adjust MoreActions menu position based on sidebar state 2025-09-29 12:13:18 +08:00
hjlarry
6e6198c64e debug webhook node 2025-09-29 09:28:19 +08:00
lyzno1
6b677c16ce refactor: use Tailwind className for MiniMap node colors instead of CSS variables 2025-09-29 08:09:38 +08:00
yessenia
973b937ba5 feat: add subscription in node 2025-09-28 22:40:31 +08:00
zhsama
48597ef193 feat: enhance minimap node color handling 2025-09-28 21:11:46 +08:00
zhsama
ffbc007f82 feat(i18n): add tooltip and placeholder for callback URL in plugin-trigger translations 2025-09-28 20:13:10 +08:00
lyzno1
8fc88f8cbf Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-28 19:32:33 +08:00
zhsama
a4b932c78b feat: integrate chat mode detection in ChangeBlock component 2025-09-28 17:10:09 +08:00
hjlarry
2ff4af8ce3 add debug run schedule node 2025-09-28 16:37:37 +08:00
zhsama
6795015d00 refactor: enhance type definitions and update import paths in form input and trigger components 2025-09-28 15:42:38 +08:00
zhsama
b100ce15cd refactor: update import paths and remove unused props in block selector components 2025-09-28 15:21:44 +08:00
yessenia
3edf1e2f59 feat: add checkbox list 2025-09-28 15:12:17 +08:00
lyzno1
4d49db0ff9 Unify SearchBox styles with Input component and add autoFocus 2025-09-28 14:33:27 +08:00
lyzno1
7da22e4915 Add toast notifications to TriggerCard toggle operations 2025-09-28 14:21:51 +08:00
lyzno1
8d4a9df6b1 fix: more button dropdown menu visibility and auto-close behavior 2025-09-28 14:15:33 +08:00
lyzno1
f620e78b20 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-28 13:40:46 +08:00
hjlarry
8df80781d9 `_node_type` has changed to `node_type` in the new version 2025-09-28 09:36:45 +08:00
hjlarry
edec065fee fix can't subtract offset-naive and offset-aware datetimes 2025-09-28 09:10:21 +08:00
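That error message is Python's own: subtracting a naive `datetime` from an aware one raises `TypeError`. A minimal reproduction and the usual fix, normalizing both sides to UTC-aware (the `UTC` alias requires Python 3.11+):

```python
from datetime import UTC, datetime

aware = datetime.now(UTC)
naive = datetime.utcnow()  # naive: carries no tzinfo

try:
    print(aware - naive)
except TypeError as e:
    print(e)  # can't subtract offset-naive and offset-aware datetimes

# Fix: attach a timezone so both operands are aware.
print(aware - naive.replace(tzinfo=UTC))
```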
lyzno1
0fe529c3aa Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-27 21:32:38 +08:00
zhsama
bcfdd07f85 feat(plugin): enhance trigger events list with dynamic tool integration
- Refactor TriggerEventsList component to utilize provider information for dynamic tool rendering.
- Implement locale-aware text handling for trigger descriptions and labels.
- Introduce utility functions for better management of tool parameters and trigger descriptions.
- Improve user experience by ensuring consistent display of trigger events based on available provider data.

This update enhances the functionality and maintainability of the trigger events list, aligning with the project's metadata-driven approach.
2025-09-26 23:27:27 +08:00
lyzno1
a9a118aaf9 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-26 22:48:15 +08:00
lyzno1
60c86dd8d1 fix(workflow): replace hardcoded trigger node logic with metadata-driven approach
- Add isStart: true to all trigger nodes (TriggerWebhook, TriggerSchedule, TriggerPlugin)
- Replace hardcoded BlockEnum checks in use-checklist.ts with metadata-driven logic
- Update trigger node tests to validate metadata instead of obsolete methods
- Add webhook URL validation to TriggerWebhook node
- Ensure backward compatibility with existing workflow configurations

This change migrates from hardcoded trigger node identification to a
centralized metadata-driven approach, improving maintainability and
consistency across the workflow system.
2025-09-26 22:35:21 +08:00
lyzno1
8feef2c1a9 refactor: app publisher 2025-09-26 22:06:05 +08:00
lyzno1
4ba99db88c feat: Restore complete test run functionality and fix workflow block selector system
This comprehensive restore includes:

## Test Run System Restoration
- Restore test-run-menu.tsx component with multi-trigger support and keyboard shortcuts
- Restore use-dynamic-test-run-options.tsx hook for dynamic trigger option generation
- Restore workflow-entry.ts utilities for entry node detection and validation
- Integrate complete test run functionality back into run-mode.tsx

## Block Selector System Fixes
- Fix workflow block selector constants by uncommenting BLOCKS and START_BLOCKS arrays
- Restore proper i18n translations for trigger node descriptions using workflow.blocksAbout keys
- Filter trigger types from Blocks tab to prevent duplication with Start tab
- Fix trigger node handle display to match start node behavior (hide left input handles)

## Workflow Validation System Improvements
- Restore unified workflow validation using correct getValidTreeNodes(nodes, edges) signature
- Remove duplicate Start node validation from isRequired mechanism
- Eliminate "user input must be added" validation error by setting Start node isRequired: false
- Fix end node connectivity validation to properly detect valid workflow chains

## Component Integration
- Verify all dependencies exist (TriggerAll icon, useAllTriggerPlugins hook)
- Maintain keyboard shortcut integration (Alt+R, ~, 0-9 keys)
- Preserve portal-based dropdown positioning and tooltip structure
- Support multiple trigger types: user_input, schedule, webhook, plugin, all

This restores the complete test run functionality that was missing from feat/trigger branch
by systematically analyzing and restoring components from feat/trigger-backup-before-merge.
2025-09-26 21:34:08 +08:00
lyzno1
b4801adfbd refactor(workflow): Remove Start node from isRequired mechanism
- Set Start node isRequired: false since entry node validation is handled by unified logic
- Remove conditional skip logic in checklist since Start is no longer in isRequiredNodesType
- Cleaner separation of concerns: unified entry node check vs individual required nodes
- Eliminates architectural inconsistency where Start was both individually required and part of group validation
2025-09-26 21:09:48 +08:00
lyzno1
08e8f8676e fix(workflow): Remove duplicate Start node validation
- Skip Start node requirement in isRequiredNodesType loop since it's already covered by unified entry node validation
- Eliminates duplicate 'User Input must be added' error when trigger nodes are present
- Both useChecklist and useChecklistBeforePublish now consistently handle entry node validation
- Resolves UI showing redundant validation errors for Start vs Trigger nodes
2025-09-26 21:08:21 +08:00
lyzno1
2dca0c20db fix: restore unified workflow validation system
Major fixes to workflow checklist validation:

## Fixed getValidTreeNodes function (workflow.ts)
- Restore original function signature: (nodes, edges) instead of (startNode, nodes, edges)
- Re-implement automatic start node discovery for all entry types
- Unified traversal from Start, TriggerWebhook, TriggerSchedule, TriggerPlugin nodes
- Single call now discovers all valid connected nodes correctly

## Simplified useChecklist validation (use-checklist.ts)
- Remove complex manual start node iteration and result aggregation
- Unified entry node validation concept for all start node types
- Remove dependency on getStartNodes() utility
- Simplified validation logic matching backup branch approach

## Resolved Issues
- End node connectivity: Now correctly detects connections from any entry node
- Unified entry validation: All start types (Start/Triggers) validated consistently
- Simplified architecture: Restored proven validation approach from backup branch

This restores the reliable workflow validation system while maintaining trigger node support.
2025-09-26 20:54:28 +08:00
lyzno1
6f57aa3f53 fix: hide left input handles for all trigger node types
- Extend handle hiding logic to include TriggerWebhook, TriggerSchedule, TriggerPlugin
- Make trigger nodes behave like Start nodes without left-side input handles
- Apply fix to both main workflow and preview node handle components
- Ensures consistent UX where all start-type nodes have no input handles
2025-09-26 20:39:29 +08:00
lyzno1
1aafe915e4 fix: trigger tooltip descriptions and filter trigger types from Nodes tab
- Fix trigger tooltip descriptions to use workflow.blocksAbout translations
- Filter TriggerWebhook/TriggerSchedule/TriggerPlugin from Blocks component
- Ensure trigger types only appear in Start tab, not Nodes tab
2025-09-26 20:28:59 +08:00
lyzno1
6d4d25ee6f feat(workflow): Restore block selector functionality
- Restore BLOCKS constant array and useBlocks hook
- Add intelligent fallback mechanism for blocks prop
- Fix metadata access in StartBlocks tooltip
- Restore defaultActiveTab support in NodeSelector
- Improve component robustness with graceful degradation
- Fix TypeScript errors and component interfaces

Phase 1-3 of atomic refactoring complete:
- Critical fixes: Constants, hooks, components
- Interface fixes: Props, tabs, modal integration
- Architecture improvements: Metadata, wrappers
2025-09-26 20:05:59 +08:00
yessenia
6b94d30a5f fix: oauth subscription 2025-09-26 17:44:57 +08:00
lyzno1
1a9798c559 fix(workflow): Fix onboarding node creation after knowledge pipeline refactor (#26289) 2025-09-26 16:43:36 +08:00
lyzno1
764436ed8e feat(workflow): Enable keyboard delete for all node types including Start
Removes explicit Start node exclusion from handleNodesDelete function:
- Remove BlockEnum.Start filter from bundled nodes selection
- Remove BlockEnum.Start filter from selected node detection
- Allows DEL/Backspace keys to delete Start nodes same as other nodes
- Button delete already worked, now keyboard delete works too

Fixes: Start nodes can now be deleted via both button and keyboard shortcuts

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 14:10:28 +08:00
lyzno1
2a1c5ff57b feat(workflow): Enable entry node deletion and fix draft sync
Complete workflow liberalization following PR #24627:

1. Remove Start node deletion restriction by removing isUndeletable property
2. Fix draft sync blocking when no Start node exists
3. Restore isWorkflowDataLoaded protection to prevent race conditions
4. Ensure all entry nodes (Start + 3 trigger types) have equal deletion rights

This allows workflows with only trigger nodes and fixes the issue where
added nodes would disappear after page refresh due to sync API blocking.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 13:54:52 +08:00
lyzno1
cc4ba1a3a9 chore: add settings.local.json to .gitignore 2025-09-26 13:26:51 +08:00
lyzno1
d68a9f1850 Merge remote-tracking branch 'origin/main' into feat/trigger
Resolve merge conflict in use-workflow.ts:
- Keep trigger branch workflow-entry utilities imports
- Preserve SUPPORT_OUTPUT_VARS_NODE from main branch
- Remove unused PARALLEL_DEPTH_LIMIT import

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-26 13:17:14 +08:00
Harry
4f460160d2 refactor(api): reorganize migration files 2025-09-26 12:31:44 +08:00
Harry
d5ff89f6d3 refactor(api): enhance request handling and time management
- Initialized `response` variable in `trigger.py` to ensure proper handling in the trigger endpoint.
- Updated `http_parser.py` to conditionally set `CONTENT_TYPE` and `CONTENT_LENGTH` headers for improved robustness.
- Changed `datetime.utcnow()` to `datetime.now(UTC)` in `sqlalchemy_workflow_trigger_log_repository.py` and `rate_limiter.py` for consistent time zone handling.
- Refactored `async_workflow_service.py` to use the public method `get_tenant_owner_timezone` for better encapsulation.
- Simplified subscription retrieval logic in `plugin_parameter_service.py` for clarity.

These changes improve code reliability and maintainability while ensuring accurate time management and request processing.
2025-09-25 19:46:52 +08:00
Harry
452588dded refactor(api): fix pyright check
- Replaced `is_editor` checks with `has_edit_permission` in `workflow_trigger.py` and `workflow.py` to enhance clarity and consistency in permission handling.
- Updated the rate limiter to use `datetime.now(UTC)` instead of `datetime.utcnow()` for accurate time handling.
- Added `__all__` declaration in `trigger/__init__.py` for better module export management.
- Initialized `debug_dispatched` variable in `trigger_processing_tasks.py` to ensure proper tracking during workflow dispatching.

These changes improve code readability and maintainability while ensuring correct permission checks and time management.
2025-09-25 18:32:22 +08:00
Harry
aef862d9ce refactor(api): remove unused PluginTriggerApi route
- Removed the `PluginTriggerApi` resource route from `workflow_trigger.py` to streamline the API and improve maintainability. This change contributes to a cleaner and more organized codebase.
2025-09-25 18:23:17 +08:00
Harry
896f3252b8 refactor(api): refactor all
- Replaced direct imports of `TriggerProviderID` and `ToolProviderID` from `core.plugin.entities.plugin` with imports from `models.provider_ids` for better organization.
- Refactored workflow node classes to inherit from a unified `Node` class, improving consistency and maintainability.
- Removed unused code and comments to clean up the implementation, particularly in the `workflow_trigger.py` and `builtin_tools_manage_service.py` files.

These changes enhance the clarity and structure of the codebase, facilitating easier future modifications.
2025-09-25 18:22:30 +08:00
yessenia
6853a699e1 Merge branch 'main' into feat/trigger 2025-09-25 17:43:39 +08:00
yessenia
cd07eef639 Merge remote-tracking branch 'origin/main' into feat/trigger 2025-09-25 17:14:24 +08:00
Harry
ef9a741781 feat(trigger): enhance trigger management with new error handling and response structure
- Added `TriggerInvokeError` and `TriggerIgnoreEventError` for better error categorization during trigger invocation.
- Updated `TriggerInvokeResponse` to include a `cancelled` field, indicating if a trigger was ignored.
- Enhanced `TriggerManager` to handle specific errors and return appropriate responses.
- Refactored `dispatch_triggered_workflows` to improve workflow execution logic and error handling.

These changes improve the robustness and clarity of the trigger management system.
2025-09-23 16:01:59 +08:00
Harry
c5de91ba94 refactor(trigger): update cache expiration constants and log key format
- Renamed validation-related constants to builder-related ones for clarity.
- Updated cache expiration from milliseconds to seconds for consistency.
- Adjusted log key format to reflect the builder context instead of validation.

These changes enhance the readability and maintainability of the TriggerSubscriptionBuilderService.
2025-09-22 13:37:46 +08:00
Harry
bc1e6e011b fix(trigger): update cache key format in TriggerSubscriptionBuilderService
- Changed the cache key format in the `encode_cache_key` method from `trigger:subscription:validation:{subscription_id}` to `trigger:subscription:builder:{subscription_id}` to better reflect its purpose.

This update improves clarity in cache key usage for trigger subscriptions.
2025-09-22 13:37:46 +08:00
lyzno1
906028b1fb fix: start node validation 2025-09-22 12:58:20 +08:00
lyzno1
034602969f feat(schedule-trigger): enhance cron parser with mature library and comprehensive testing (#26002) 2025-09-22 10:01:48 +08:00
非法操作
4ca14bfdad chore: improve webhook (#25998) 2025-09-21 12:16:31 +08:00
lyzno1
59f56d8c94 feat: schedule trigger default daily midnight (#25937) 2025-09-19 08:05:00 +08:00
yessenia
63d26f0478 fix: api key params 2025-09-18 17:35:34 +08:00
yessenia
eae65e55ce feat: oauth config opt & add dynamic options 2025-09-18 17:12:48 +08:00
lyzno1
0edf06329f fix: apply suggestions 2025-09-18 17:04:02 +08:00
lyzno1
6943a379c9 feat: show placeholder '--' for invalid cron expressions in node display
- Return '--' placeholder when cron mode has empty or invalid expressions
- Prevents displaying fallback dates that confuse users
- Maintains consistent UX for invalid schedule configurations
2025-09-18 17:04:02 +08:00
lyzno1
e49534b70c fix: make frequency optional 2025-09-18 17:04:02 +08:00
lyzno1
344616ca2f fix: clear opposite mode data only when editing, preserve data during mode switching 2025-09-18 17:04:02 +08:00
lyzno1
0e287a9c93 chore: add missing translations 2025-09-18 13:25:57 +08:00
lyzno1
8141f53af5 fix: add preventDefaultSubmit prop to BaseForm to prevent unwanted page refresh on Enter key 2025-09-18 12:48:26 +08:00
lyzno1
5a6cb0d887 feat: enhance API key modal step indicator with active dots and improved styling 2025-09-18 12:44:11 +08:00
lyzno1
26e7677595 fix: align width and use rounded xl 2025-09-18 12:08:21 +08:00
yessenia
814b0e1fe8 feat: oauth config init 2025-09-18 00:00:50 +08:00
Harry
a173dc5c9d feat(provider): add multiple option support in ProviderConfig
- Introduced a new field `multiple` in the `ProviderConfig` class to allow for multiple selections, enhancing the configuration capabilities for providers.
- This addition improves flexibility in provider settings and aligns with the evolving requirements for provider configurations.
2025-09-17 22:12:01 +08:00
Harry
a567facf2b refactor(trigger): streamline encrypter creation in TriggerProviderService
- Replaced calls to `create_trigger_provider_encrypter` and `create_trigger_provider_oauth_encrypter` with a unified `create_provider_encrypter` method, simplifying the encrypter creation process.
- Updated the parameters passed to the new method to enhance configuration management and cache handling.

These changes improve code clarity and maintainability in the trigger provider service.
2025-09-17 21:47:11 +08:00
Harry
e76d80defe fix(trigger): update client parameter handling in TriggerProviderService
- Modified the `create_provider_encrypter` call to include a cache assignment, ensuring proper management of encryption resources.
- Added a cache deletion step after updating client parameters, enhancing the integrity of the parameter handling process.

These changes improve the reliability of client parameter updates within the trigger provider service.
2025-09-17 20:57:52 +08:00
Harry
4a17025467 fix(trigger): update session management in TriggerProviderService
- Changed session management in `TriggerProviderService` from `autoflush=True` to `expire_on_commit=False` for improved control over session state.
- This change enhances the reliability of database interactions by preventing automatic expiration of objects after commit, ensuring data consistency during trigger operations.

These updates contribute to better session handling and stability in trigger-related functionalities.
2025-09-16 18:01:44 +08:00
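What `expire_on_commit=False` buys, in a compact self-contained demo; the model is a stand-in for the real provider table:

```python
from sqlalchemy import String, create_engine
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class TriggerProvider(Base):  # stand-in for the real model
    __tablename__ = "trigger_providers"
    id: Mapped[str] = mapped_column(String, primary_key=True)


engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine, expire_on_commit=False) as session:
    provider = TriggerProvider(id="p1")
    session.add(provider)
    session.commit()

# With the default expire_on_commit=True, commit expires the instance, and
# this attribute access would need a refresh against the now-closed session
# (DetachedInstanceError); with False, the committed values stay readable.
print(provider.id)
```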
Harry
bd1fcd3525 feat(trigger): add TriggerProviderInfoApi and enhance trigger provider service
- Introduced `TriggerProviderInfoApi` to retrieve information for a specific trigger provider, improving API capabilities.
- Added `get_trigger_provider` method in `TriggerProviderService` to fetch trigger provider details, enhancing data retrieval.
- Updated route configurations to include the new API endpoint for trigger provider information.

These changes enhance the functionality and usability of trigger provider interactions within the application.
2025-09-16 17:03:52 +08:00
Harry
0cb0cea167 feat(trigger): enhance trigger plugin data structure and error handling
- Added `plugin_unique_identifier` to `PluginTriggerData` and `TriggerProviderApiEntity` to improve identification of trigger plugins.
- Introduced `PluginTriggerDispatchData` for structured dispatch data in Celery tasks, enhancing the clarity of trigger dispatching.
- Updated `dispatch_triggered_workflows_async` to utilize the new dispatch data structure, improving error handling and logging for trigger invocations.
- Enhanced metadata handling in `TriggerPluginNode` to include trigger information, aiding in debugging and tracking.

These changes improve the robustness and maintainability of trigger plugin interactions within the workflow system.
2025-09-16 15:39:40 +08:00
Harry
ee68a685a7 fix(workflow): enforce non-nullable arguments in DraftWorkflowTriggerRunApi
- Updated the argument definitions in the DraftWorkflowTriggerRunApi to include `nullable=False` for `node_id`, `trigger_name`, and `subscription_id`. This change ensures that these fields are always provided in the request, improving the robustness of the API.

This fix enhances input validation and prevents potential errors related to missing arguments.
2025-09-16 11:25:16 +08:00
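In Flask-RESTful's `reqparse`, which this API layer appears to use, `nullable=False` rejects an explicit JSON `null` on top of `required=True` rejecting a missing field. A sketch of the hardened argument definitions (exact types and locations are assumptions):

```python
from flask_restful import reqparse

parser = reqparse.RequestParser()
for field in ("node_id", "trigger_name", "subscription_id"):
    # required=True -> field must be present; nullable=False -> it may not be null
    parser.add_argument(field, type=str, required=True, nullable=False, location="json")
```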
Harry
c78bd492af feat(trigger): add supported creation methods to TriggerProviderApiEntity
- Introduced a new field `supported_creation_methods` in `TriggerProviderApiEntity` to specify the available methods for creating triggers, including OAUTH, APIKEY, and MANUAL.
- Updated the `PluginTriggerProviderController` to populate this field based on the entity's schemas, enhancing the API's clarity and usability.

These changes improve the flexibility and configurability of trigger providers within the application.
2025-09-15 17:01:29 +08:00
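A plausible Pydantic shape for the new field, shown purely as a sketch (the enum members follow the commit message; the names and defaults are assumptions):

```python
from enum import StrEnum

from pydantic import BaseModel, Field


class CreationMethod(StrEnum):
    OAUTH = "oauth"
    APIKEY = "apikey"
    MANUAL = "manual"


class TriggerProviderApiEntity(BaseModel):  # reduced stand-in for the real entity
    name: str
    supported_creation_methods: list[CreationMethod] = Field(default_factory=list)


entity = TriggerProviderApiEntity(
    name="github",
    supported_creation_methods=[CreationMethod.OAUTH, CreationMethod.APIKEY],
)
```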
Harry
6857bb4406 feat(trigger): implement plugin trigger synchronization and subscription management in workflow
- Added a new event handler for syncing plugin trigger relationships when a draft workflow is synced, ensuring that the database reflects the current state of plugin triggers.
- Introduced subscription management features in the frontend, allowing users to select, add, and remove subscriptions for trigger plugins.
- Updated various components to support subscription handling, including the addition of new UI elements for subscription selection and removal.
- Enhanced internationalization support by adding new translation keys related to subscription management.

These changes improve the overall functionality and user experience of trigger plugins within workflows.
2025-09-15 15:49:07 +08:00
Harry
dcf3ee6982 fix(trigger): update trigger label assignment for improved clarity
- Changed the label assignment in the convertToTriggerWithProvider function from trigger.description.human to trigger.identity.label, ensuring the label reflects the correct identity format.

This update enhances the accuracy of trigger data representation in the application.
2025-09-15 14:50:56 +08:00
Harry
76850749e4 feat(trigger): enhance trigger debugging with polling API and new subscription retrieval
- Refactored DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi to implement polling for trigger events instead of listening, improving responsiveness and reliability.
- Introduced TriggerSubscriptionBuilderGetApi to retrieve subscription instances for trigger providers, enhancing the API's capabilities.
- Removed deprecated trigger event classes and streamlined event handling in TriggerDebugService, ensuring a cleaner architecture.
- Updated Queue and Stream entities to reflect the changes in trigger event handling, improving overall clarity and maintainability.

These enhancements significantly improve the trigger debugging experience and API usability.
2025-09-14 19:12:31 +08:00
yessenia
91e5e33440 feat: add modal style opt 2025-09-12 20:22:33 +08:00
lyzno1
11e55088c9 fix: restore id prop passing to node children in BaseNode (#25520) 2025-09-11 17:54:31 +08:00
Harry
57c0bc9fb6 feat(trigger): refactor trigger debug event handling and improve response structures
- Renamed and refactored trigger debug event classes to enhance clarity and consistency, including changes from `TriggerDebugEventData` to `TriggerEventData` and related response classes.
- Updated `DraftWorkflowTriggerNodeApi` and `DraftWorkflowTriggerRunApi` to utilize the new event structures, improving the handling of trigger events.
- Removed the `TriggerDebugEventGenerator` class, consolidating event generation directly within the API logic for streamlined processing.
- Enhanced error handling and response formatting for trigger events, ensuring structured outputs for better integration and debugging.

This refactor improves the overall architecture of trigger debugging, making it more intuitive and maintainable.
2025-09-11 16:55:58 +08:00
Harry
c3ebb22a4b feat(trigger): add workflows_in_use field to TriggerProviderSubscriptionApiEntity
- Introduced a new field `workflows_in_use` to the TriggerProviderSubscriptionApiEntity to track the number of workflows utilizing each subscription.
- Enhanced the TriggerProviderService to populate this field by querying the WorkflowPluginTrigger model for usage counts associated with each subscription.

This addition improves the visibility of subscription usage within the trigger provider context.
2025-09-11 16:55:58 +08:00
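Populating such a per-subscription counter is typically one grouped query rather than a lookup per row; a SQLAlchemy sketch under that assumption, with a reduced stand-in for the real `WorkflowPluginTrigger` model:

```python
from sqlalchemy import func, select
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column


class Base(DeclarativeBase):
    pass


class WorkflowPluginTrigger(Base):  # reduced stand-in for the real model
    __tablename__ = "workflow_plugin_triggers"
    id: Mapped[int] = mapped_column(primary_key=True)
    subscription_id: Mapped[str]


def count_workflows_in_use(session, subscription_ids: list[str]) -> dict[str, int]:
    # One grouped COUNT(*) instead of N point lookups.
    rows = session.execute(
        select(WorkflowPluginTrigger.subscription_id, func.count())
        .where(WorkflowPluginTrigger.subscription_id.in_(subscription_ids))
        .group_by(WorkflowPluginTrigger.subscription_id)
    ).all()
    counts = dict.fromkeys(subscription_ids, 0)  # subscriptions with no rows -> 0
    counts.update({sub_id: n for sub_id, n in rows})
    return counts
```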
Harry
1562d00037 feat(trigger): implement trigger debugging functionality
- Added DraftWorkflowTriggerNodeApi and DraftWorkflowTriggerRunApi for debugging trigger nodes and workflows.
- Enhanced TriggerDebugService to manage trigger debugging sessions and event listening.
- Introduced structured event responses for trigger debugging, including listening started, received, node finished, and workflow started events.
- Updated Queue and Stream entities to support new trigger debug events.
- Refactored trigger input handling to streamline the process of creating inputs from trigger data.

This implementation improves the debugging capabilities for trigger nodes and workflows, providing clearer event handling and structured responses.
2025-09-11 16:55:58 +08:00
Harry
e9e843b27d fix(tool): standardize tool naming across components
- Updated references from `trigger_name` to `tool_name` in multiple components for consistency.
- Adjusted type definitions to reflect the change in naming convention, enhancing clarity in the codebase.
2025-09-11 16:55:57 +08:00
Harry
ec33b9908e fix(trigger): improve formatting of OAuth client response in TriggerOAuthClientManageApi
- Refactored the return statement in the TriggerOAuthClientManageApi to enhance readability and maintainability.
- Ensured consistent formatting of the response structure for better clarity in API responses.
2025-09-11 16:55:57 +08:00
yessenia
67004368d9 feat: sub card style 2025-09-11 16:22:59 +08:00
Stream
94ecbd44e4 feat: add API endpoint to extract plugin assets 2025-09-11 14:48:42 +08:00
Stream
ba76312248 feat: adapt to plugin_daemon endpoint 2025-09-11 14:46:12 +08:00
yessenia
50bff270b6 feat: add subscription 2025-09-10 23:21:33 +08:00
Harry
bd5cf1c272 fix(trigger): enhance OAuth client response in TriggerOAuthClientManageApi
- Integrated TriggerManager to retrieve the trigger provider's OAuth client schema.
- Updated the return structure to include the redirect URI and OAuth client schema for improved API response clarity.
2025-09-10 17:35:30 +08:00
Yeuoly
d22404994a chore: add comments on generate_webhook_id 2025-09-10 17:23:29 +08:00
Yeuoly
9898730cc5 feat: add webhook node limit validation (max 5 per workflow)
- Add MAX_WEBHOOK_NODES_PER_WORKFLOW constant set to 5
- Validate webhook node count in sync_webhook_relationships method
- Raise ValueError when workflow exceeds webhook node limit
- Block workflow save when limit is exceeded to ensure data integrity
- Provide clear error message indicating current count and maximum allowed

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:22:09 +08:00
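The validation described above amounts to a count-and-raise guard along these lines (a sketch; how nodes are represented in the graph payload is an assumption):

```python
# Sketch of the webhook node limit check; node shape is illustrative.
MAX_WEBHOOK_NODES_PER_WORKFLOW = 5


def validate_webhook_node_count(nodes: list[dict]) -> None:
    webhook_nodes = [n for n in nodes if n.get("data", {}).get("type") == "webhook"]
    if len(webhook_nodes) > MAX_WEBHOOK_NODES_PER_WORKFLOW:
        # Raising here blocks the save, keeping the stored graph within the limit.
        raise ValueError(
            f"Workflow has {len(webhook_nodes)} webhook nodes; "
            f"the maximum allowed is {MAX_WEBHOOK_NODES_PER_WORKFLOW}."
        )
```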
Yeuoly
b0f1e55a87 refactor: remove triggered_by field from webhook triggers and use automatic sync
- Remove triggered_by field from WorkflowWebhookTrigger model
- Replace manual webhook creation/deletion APIs with automatic sync via WebhookService
- Keep only GET API for retrieving webhook information
- Use same webhook ID for both debug and production environments (differentiated by endpoint)
- Add sync_webhook_relationships to automatically manage webhook lifecycle
- Update tests to remove triggered_by references
- Clean up unused imports and fix type checking issues

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-10 17:17:19 +08:00
Harry
6566824807 fix(trigger): update return type in TriggerSubscriptionBuilderService
- Changed the return type of the method in `TriggerSubscriptionBuilderService` from `SubscriptionBuilder` to `SubscriptionBuilderApiEntity` for improved clarity and alignment with API entity structures.
- Updated the return statement to utilize the new method for converting the builder to the API entity.
2025-09-10 15:48:32 +08:00
Harry
9249a2af0d fix(trigger): update event data publishing in TriggerDebugService
- Changed the event data publishing method in `TriggerDebugService` to use `model_dump()` for improved data structure handling when publishing to Redis Pub/Sub.
2025-09-10 15:48:32 +08:00
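Concretely, publishing a Pydantic model over Redis Pub/Sub via `model_dump()` might look like this sketch; the event model's fields and the channel naming are illustrative:

```python
import json

import redis
from pydantic import BaseModel

redis_client = redis.Redis()


class TriggerEventData(BaseModel):  # illustrative event shape
    session_id: str
    payload: dict


def publish_debug_event(event: TriggerEventData) -> None:
    # model_dump(mode="json") yields JSON-safe primitives (e.g. datetimes
    # become strings), so the dict serializes without custom encoders.
    redis_client.publish(
        f"trigger_debug:{event.session_id}",
        json.dumps(event.model_dump(mode="json")),
    )
```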
Yeuoly
112fc3b1d1 fix: clear schedule config when exporting data 2025-09-10 13:50:37 +08:00
Yeuoly
37299b3bd7 fix: rename migration 2025-09-10 13:41:50 +08:00
Yeuoly
8f65ce995a fix: migrations 2025-09-10 13:38:34 +08:00
诗浓
4a743e6dc1 feat: add workflow schedule trigger support (#24428)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-09-10 13:24:23 +08:00
lyzno1
07dda61929 fix/tooltip and onboarding ui (#25451) 2025-09-10 10:40:14 +08:00
Harry
0d8438ef40 fix(trigger): add 'trigger' category key to plugin constants to avoid errors 2025-09-10 10:34:33 +08:00
Yeuoly
96bb638969 fix: limits 2025-09-09 23:32:51 +08:00
lyzno1
e74962272e fix: only workflow use trigger api (#25443) 2025-09-09 23:14:10 +08:00
Harry
5a15419baf feat(trigger): implement debug session capabilities for trigger nodes
- Added `DraftWorkflowTriggerNodeApi` to handle debugging of trigger nodes, allowing for real-time event listening and session management.
- Introduced `TriggerDebugService` for managing debug sessions and event dispatching using Redis Pub/Sub.
- Updated `TriggerService` to support dispatching events to debug sessions and refactored related methods for improved clarity and functionality.
- Enhanced data structures in `request.py` and `entities.py` to accommodate new debug event data requirements.

These changes significantly improve the debugging capabilities for trigger nodes in draft workflows, facilitating better development and troubleshooting processes.
2025-09-09 21:27:31 +08:00
Harry
e8403977b9 feat(plugin): add triggers field to PluginDeclaration for enhanced functionality
- Introduced a new `triggers` field in the `PluginDeclaration` class to support trigger functionalities within plugins.
- This addition improves the integration of triggers in the plugin architecture, aligning with recent updates to the trigger entity structures.

These changes enhance the overall capabilities of the plugin system.
2025-09-09 17:22:11 +08:00
Harry
add2ca85f2 refactor(trigger): update plugin and trigger entity structures
- Removed unnecessary newline in `TriggerPluginNode` class for consistency.
- Made `provider` in `TriggerIdentity` optional to enhance flexibility.
- Added `trigger` field to `PluginDeclaration` and updated `PluginCategory` to include `Trigger`, improving the integration of trigger functionalities within the plugin architecture.

These changes streamline the entity definitions and enhance the overall structure of the trigger and plugin components.
2025-09-09 17:16:44 +08:00
lyzno1
fbb7b02e90 fix(webhook): prevent SimpleSelect from resetting user selections (#25423)
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
2025-09-09 17:11:11 +08:00
lyzno1
249b62c9de fix: workflow header (#25411) 2025-09-09 15:34:15 +08:00
lyzno1
b433322e8d feat/trigger plugin apikey (#25388) 2025-09-09 15:01:06 +08:00
lyzno1
1c8850fc95 feat: adjust scroll to selected node position to top-left area (#25403) 2025-09-09 14:58:42 +08:00
Harry
dc16f1b65a refactor(trigger): simplify provider path handling in workflow components
- Updated various components to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted related calls to `invalidateSubscriptions` and other functions to reflect this change.

These modifications enhance code clarity and streamline the handling of provider information in the trigger plugin components.
2025-09-09 00:17:20 +08:00
Harry
ff30395dc1 fix(OAuthClientConfigModal): simplify provider path handling in OAuth configuration
- Updated the provider path handling in `OAuthClientConfigModal` to directly use `provider.name` instead of constructing a path with `provider.plugin_id` and `provider.name`.
- Adjusted the corresponding calls to `invalidateOAuthConfig` and `configureTriggerOAuth` to reflect this change.

These modifications enhance code clarity and streamline the OAuth configuration process in the trigger plugin component.
2025-09-09 00:10:04 +08:00
Harry
8e600f3302 feat(trigger): optimize trigger parameter schema handling in useConfig
- Refactored the trigger parameter schema construction in `useConfig` to utilize a Map for improved efficiency and clarity.
- Updated the return value to ensure unique schema entries, enhancing the integrity of the trigger configuration.

These changes streamline the management of trigger parameters, improving performance and maintainability in the workflow component.
2025-09-08 23:39:44 +08:00
Harry
5a1e0a8379 feat(FormInputItem): enhance UI components for improved user experience
- Added loading indicator using `RiLoader4Line` to `FormInputItem` for better feedback during option fetching.
- Refactored button and option styles for improved accessibility and visual consistency.
- Updated text color classes to enhance readability based on loading state and selection.

These changes improve the overall user experience and visual clarity of the form input components.
2025-09-08 23:19:33 +08:00
Harry
2a3ce6baa9 feat(trigger): enhance plugin and trigger integration with updated naming conventions
- Refactored `PluginFetchDynamicSelectOptionsApi` to replace the `extra` argument with `credential_id`, improving clarity in dynamic option fetching.
- Updated `ProviderConfigEncrypter` to rename `mask_tool_credentials` to `mask_credentials` for consistency, and added a new method to maintain backward compatibility.
- Enhanced `PluginParameterService` to utilize `credential_id` for fetching subscriptions, improving the handling of trigger credentials.
- Adjusted various components and types in the frontend to replace `tool_name` with `trigger_name`, ensuring consistency across the application.
- Introduced `multiple` property in `TriggerParameter` to support multi-select functionality.

These changes improve the integration of triggers and plugins, enhance code clarity, and align naming conventions across the codebase.
2025-09-08 23:14:50 +08:00
Harry
01b2f9cff6 feat: add providerType prop to form components for dynamic behavior
- Introduced `providerType` prop in `FormInputItem`, `ToolForm`, and `ToolFormItem` components to support both 'tool' and 'trigger' types, enhancing flexibility in handling different provider scenarios.
- Updated the `useFetchDynamicOptions` function to accept `provider_type` as 'tool' | 'trigger', allowing for more dynamic option fetching based on the provider type.

These changes improve the adaptability of the form components and streamline the integration of different provider types in the workflow.
2025-09-08 18:29:48 +08:00
Harry
ac38614171 refactor(trigger): streamline trigger provider verification and update imports
- Updated `TriggerSubscriptionBuilderVerifyApi` to directly return the result of `verify_trigger_subscription_builder`, improving clarity.
- Refactored import statement in `trigger_plugin/__init__.py` to point to the correct module, enhancing code organization.
- Removed the obsolete `node.py` file, cleaning up the codebase by eliminating unused components.

These changes enhance the maintainability and clarity of the trigger provider functionality.
2025-09-08 18:25:04 +08:00
Harry
eb95c5cd07 feat(trigger): enhance subscription builder management and update API
- Introduced `SubscriptionBuilderUpdater` class to streamline updates to subscription builders, encapsulating properties like name, parameters, and credentials.
- Refactored API endpoints to utilize the new updater class, improving code clarity and maintainability.
- Adjusted OAuth handling to create and update subscription builders more effectively, ensuring proper credential management.

This change enhances the overall functionality and organization of the trigger subscription builder API.
2025-09-08 15:09:47 +08:00
lyzno1
a799b54b9e feat: initialize trigger status at application level to prevent canvas refresh state issues (#25329) 2025-09-08 09:34:28 +08:00
lyzno1
98ba0236e6 feat: implement trigger plugin authentication UI (#25310) 2025-09-07 21:53:22 +08:00
lyzno1
b6c552df07 fix: add stable sorting for trigger list to prevent position changes (#25328) 2025-09-07 21:52:41 +08:00
lyzno1
e2827e475d feat: implement trigger-plugin support with real-time status sync (#25326) 2025-09-07 21:29:53 +08:00
lyzno1
58cbd337b5 fix: improve test run menu and checklist ui (#25300) 2025-09-06 22:54:36 +08:00
lyzno1
a91e59d544 feat: implement trigger plugin frontend integration (#25283) 2025-09-06 16:18:46 +08:00
Harry
814787677a feat(trigger): update plugin trigger API and model to use trigger_name
- Modified `PluginTriggerApi` to accept `trigger_name` as a JSON argument and return encoded plugin triggers.
- Updated `WorkflowPluginTrigger` model to replace `trigger_id` with `trigger_name` for better clarity.
- Adjusted `WorkflowPluginTriggerService` to handle the new `trigger_name` field and ensure proper error handling for subscriptions.
- Enhanced `workflow_trigger_fields` to include `trigger_name` in the plugin trigger schema.

This change improves the API's clarity and aligns the model with the updated naming conventions.
2025-09-05 15:56:13 +08:00
Harry
85caa5bd0c fix(trigger): clean up whitespace in encryption utility and trigger provider service
- Removed unnecessary blank lines in `encryption.py` and `trigger_provider_service.py` for improved code readability.
- This minor adjustment enhances the overall code quality without altering functionality.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-05 15:56:13 +08:00
lyzno1
e04083fc0e feat: add icon support for trigger plugin workflow nodes (#25241) 2025-09-05 15:50:54 +08:00
Harry
cf532e5e0d feat(trigger): add context caching for trigger providers
- Add plugin_trigger_providers and plugin_trigger_providers_lock to contexts module
- Implement caching mechanism in TriggerManager.get_trigger_provider() method
- Cache fetched trigger providers to reduce repeated daemon calls
- Use double-check locking pattern for thread-safe cache access

This follows the same pattern as ToolManager.get_plugin_provider() to improve performance
by avoiding redundant requests to the daemon when accessing trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-05 14:30:10 +08:00
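The double-check locking pattern named in this commit, reduced to its core as a standalone sketch (the daemon fetch is stubbed; the cache scope and names are assumptions):

```python
import threading

_provider_cache: dict[str, object] = {}
_provider_lock = threading.Lock()


def fetch_provider_from_daemon(provider_id: str):
    return {"id": provider_id}  # stub standing in for the plugin daemon request


def get_trigger_provider(provider_id: str):
    # First check without the lock: the common, contention-free path.
    cached = _provider_cache.get(provider_id)
    if cached is not None:
        return cached
    with _provider_lock:
        # Second check under the lock: another thread may have filled the
        # entry between our first check and acquiring the lock.
        cached = _provider_cache.get(provider_id)
        if cached is not None:
            return cached
        provider = fetch_provider_from_daemon(provider_id)
        _provider_cache[provider_id] = provider
        return provider
```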
Harry
c097fc2c48 refactor(trigger): add uuid import to trigger provider service
- Imported `uuid` in `trigger_provider_service.py` to support unique identifier generation.
- This change prepares the service for future enhancements that may require UUID functionality.
2025-09-05 14:30:10 +08:00
Harry
0371d71409 feat(trigger): enhance trigger subscription management and cache handling
- Added `name` parameter to `TriggerSubscriptionBuilderCreateApi` for better subscription identification.
- Implemented `delete_cache_for_subscription` function to clear cache associated with trigger subscriptions.
- Updated `WorkflowPluginTriggerService` to check for existing subscriptions before creating new plugin triggers, improving error handling.
- Refactored `TriggerProviderService` to utilize the new cache deletion method during provider deletion.

This improves the overall management of trigger subscriptions and enhances cache efficiency.
2025-09-05 14:30:10 +08:00
非法操作
81ef7343d4 chore: (trigger) refactor webhook service (#25229) 2025-09-05 14:00:20 +08:00
zhangxuhe1
8e4b59c90c feat: improve trigger plugin UI layout and responsiveness (#25232) 2025-09-05 14:00:14 +08:00
非法操作
68f73410fc chore: (trigger) add WEBHOOK_REQUEST_BODY_MAX_SIZE (#25217) 2025-09-05 12:23:11 +08:00
lyzno1
88af8ed374 fix: block selector ui (#25228) 2025-09-05 12:22:13 +08:00
Harry
015f82878e feat(trigger): integrate plugin icon retrieval into trigger provider
- Added `get_plugin_icon_url` method in `PluginService` to fetch plugin icons.
- Updated `PluginTriggerProviderController` to use the new method for icon handling.
- Refactored `ToolTransformService` to utilize `PluginService` for consistent icon URL generation.

This enhances the trigger provider's ability to manage plugin icons effectively.
2025-09-05 12:01:41 +08:00
Harry
3874e58dc2 refactor(trigger): enhance trigger provider deletion process and session management 2025-09-05 11:31:57 +08:00
lyzno1
9f8c159583 feat(trigger): implement trigger plugin block selector following tools pattern (#25204) 2025-09-05 10:20:47 +08:00
非法操作
d8f6f9ce19 chore: (trigger) change content type from form to application/octet-stream (#25167) 2025-09-05 09:54:07 +08:00
Harry
eab03e63d4 refactor(trigger): rename request logs API and enhance logging functionality
- Renamed `TriggerSubscriptionBuilderRequestLogsApi` to `TriggerSubscriptionBuilderLogsApi` for clarity.
- Updated the API endpoint to retrieve logs for subscription builders.
- Enhanced logging functionality in `TriggerSubscriptionBuilderService` to append and list logs more effectively.
- Refactored trigger processing tasks to improve naming consistency and clarity in logging.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 21:11:25 +08:00
非法操作
461829274a feat: (trigger) support file upload in webhook (#25159) 2025-09-04 18:33:42 +08:00
Harry
e751c0c535 refactor(trigger): update trigger provider API and clean up unused classes
- Renamed the API endpoint for trigger providers from `/workspaces/current/trigger-providers` to `/workspaces/current/triggers` for consistency.
- Removed unused `TriggerProviderCredentialsCache` and `TriggerProviderOAuthClientParamsCache` classes to streamline the codebase.
- Enhanced the `TriggerProviderApiEntity` to include additional properties and improved the conversion logic in `PluginTriggerProviderController`.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 17:45:15 +08:00
lyzno1
1fffc79c32 fix: prevent empty workflow draft sync during page navigation (#25140) 2025-09-04 17:13:49 +08:00
非法操作
83fab4bc19 chore: (webhook) clear the body variables when the content type changes (#25136) 2025-09-04 15:09:54 +08:00
Harry
f60e28d2f5 feat(trigger): enhance user role validation and add request logs API for trigger providers
- Updated user role validation in PluginTriggerApi and WebhookTriggerApi to assert current_user as an Account and check tenant ID.
- Introduced TriggerSubscriptionBuilderRequestLogsApi to retrieve request logs for subscription instances, ensuring proper user authentication and error handling.
- Added new API endpoint for accessing request logs related to trigger providers.

🤖 Generated with [Claude Code](https://claude.ai/code)
2025-09-04 14:44:02 +08:00
Harry
a62d7aa3ee feat(trigger): add plugin trigger workflow support and refactor trigger system
- Add new workflow plugin trigger service for managing plugin-based triggers
- Implement trigger provider encryption utilities for secure credential storage
- Add custom trigger errors module for better error handling
- Refactor trigger provider and manager classes for improved plugin integration
- Update API endpoints to support plugin trigger workflows
- Add database migration for plugin trigger workflow support

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-04 13:20:43 +08:00
非法操作
cc84a45244 chore: (webhook) use variable instead of InputVar (#25119) 2025-09-04 11:10:42 +08:00
cathy
5cf3d24018 fix(webhook): selected type ui style (#25106) 2025-09-04 10:59:08 +08:00
lyzno1
4bdbe617fe fix: uuidv7 (#25097) 2025-09-04 08:44:14 +08:00
lyzno1
33c867fd8c feat(workflow): enhance webhook status code input with increment/decrement controls (#25099) 2025-09-03 22:26:00 +08:00
非法操作
2013ceb9d2 chore: validate param type of application/json when calling a webhook (#25074) 2025-09-03 15:49:07 +08:00
非法操作
7120c6414c fix: content type of webhook (#25032) 2025-09-03 15:13:01 +08:00
Harry
5ce7b2d98d refactor(migrations): remove obsolete plugin_trigger migration file
- Deleted the plugin_trigger migration file as it is no longer needed in the codebase.
- Updated model imports in `__init__.py` to include new trigger-related classes for better organization.
2025-09-03 15:02:17 +08:00
Harry
cb82198271 refactor(trigger): update trigger provider classes and API endpoints
- Renamed classes for trigger subscription management to improve clarity, including TriggerProviderSubscriptionListApi to TriggerSubscriptionListApi and TriggerSubscriptionsDeleteApi to TriggerSubscriptionDeleteApi.
- Updated API endpoint paths to reflect the new naming conventions for trigger subscriptions.
- Removed deprecated TriggerOAuthRefreshTokenApi class to streamline the codebase.
- Added trigger_providers import to the console controller for better organization.
2025-09-03 14:53:27 +08:00
Harry
5e5ffaa416 feat(tool-form): add extraParams prop to ToolForm and ToolFormItem components
- Introduced extraParams prop to both ToolForm and ToolFormItem components for enhanced flexibility in passing additional parameters.
- Updated component usage to accommodate the new prop, improving the overall functionality of the tool forms.
2025-09-03 14:53:27 +08:00
Harry
4b253e1f73 feat(trigger): plugin trigger workflow 2025-09-03 14:53:27 +08:00
Harry
dd929dbf0e fix(dynamic_select): implement function 2025-09-03 14:53:27 +08:00
Harry
97a9d34e96 feat(trigger): introduce plugin trigger management and enhance trigger processing
- Remove the debug endpoint for cleaner API structure
- Add support for TRIGGER_PLUGIN in NodeType enumeration
- Implement WorkflowPluginTrigger model to map plugin triggers to workflow nodes
- Enhance TriggerService to process plugin triggers and store trigger data in Redis
- Update node mapping to include TriggerPluginNode for workflow execution

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
602070ec9c refactor(trigger): improve method signature formatting in TriggerService
- Adjust the formatting of the `process_triggered_workflows` method signature for better readability
- Ensure consistent style across method definitions in the TriggerService class

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
afd8989150 feat(trigger): introduce subscription builder and enhance trigger management
- Refactor trigger provider classes to improve naming consistency, including renaming classes for subscription management
- Implement new TriggerSubscriptionBuilderService for creating and verifying subscription builders
- Update API endpoints to support subscription builder creation and verification
- Enhance data models to include new attributes for subscription builders
- Remove the deprecated TriggerSubscriptionValidationService to streamline the codebase

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
694197a701 refactor(trigger): clean up imports and optimize trigger-related code
- Remove unused imports in trigger-related files for better clarity and maintainability
- Streamline import statements across various modules to enhance code quality

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
2f08306695 feat(trigger): enhance trigger subscription management and processing
- Refactor trigger provider classes to improve naming consistency and clarity
- Introduce new methods for managing trigger subscriptions, including validation and dispatching
- Update API endpoints to reflect changes in subscription handling
- Implement logging and request management for endpoint interactions
- Enhance data models to support subscription attributes and lifecycle management

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
6acc77d86d feat(trigger): refactor trigger provider to subscription model
- Rename classes and methods to reflect the transition from credentials to subscriptions
- Update API endpoints for managing trigger subscriptions
- Modify data models and entities to support subscription attributes
- Enhance service methods for listing, adding, updating, and deleting subscriptions
- Adjust encryption utilities to handle subscription data

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
5ddd5e49ee feat(trigger): enhance subscription schema and provider configuration
- Update ProviderConfig to allow a list as a default value
- Introduce SubscriptionSchema for better organization of subscription-related configurations
- Modify TriggerProviderApiEntity to use Optional for subscription_schema
- Add custom_model_schema to TriggerProviderEntity for additional configuration options

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:27 +08:00
Harry
72f9e77368 refactor(trigger): clean up and optimize trigger-related code
- Remove unused classes and imports in encryption utilities
- Simplify method signatures for better readability
- Enhance code quality by adding newlines for clarity
- Update tests to reflect changes in import paths

Co-authored-by: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
Harry
a46c9238fa feat(trigger): implement complete OAuth authorization flow for trigger providers
- Add OAuth authorization URL generation API endpoint
- Implement OAuth callback handler for credential storage
- Support both system-level and tenant-level OAuth clients
- Add trigger provider credential encryption utilities
- Refactor trigger entities into separate modules
- Update trigger provider service with OAuth client management
- Add credential cache for trigger providers

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-09-03 14:53:26 +08:00
Harry
87120ad4ac feat(trigger): add trigger provider management and webhook handling functionality 2025-09-03 14:53:26 +08:00
非法操作
7544b5ec9a fix: delete var of webhook (#25038) 2025-09-03 14:49:56 +08:00
非法操作
ff4a62d1e7 chore: limit webhook status code 200~399 (#25045) 2025-09-03 14:48:18 +08:00
lyzno1
41daa51988 fix: missing key for translation path (#25059) 2025-09-03 14:43:40 +08:00
cathy
d522350c99 fix(webhook-trigger): request array type adjustment (#25005) 2025-09-02 23:20:12 +08:00
lyzno1
1d1bb9451e fix: prevent workflow canvas clearing due to race condition and viewport errors (#25003) 2025-09-02 20:53:44 +08:00
lyzno1
1fce1a61d4 feat(workflow-log): enhance workflow logs UI with sorting and status filters (#24978) 2025-09-02 16:43:11 +08:00
非法操作
883a6caf96 feat: add 'triggered by' info to app log (#24973) 2025-09-02 16:04:08 +08:00
非法操作
a239c39f09 fix: webhook http method should be case insensitive (#24957) 2025-09-02 14:47:24 +08:00
lyzno1
e925a8ab99 fix(app-cards): restrict toggle enable to Start nodes only (#24918) 2025-09-01 22:52:23 +08:00
Yeuoly
bccaf939e6 fix: migrations 2025-09-01 18:07:21 +08:00
Yeuoly
676648e0b3 Merge branch 'main' into feat/trigger 2025-09-01 18:05:31 +08:00
cathy
4ae19e6dde fix(webhook-trigger): remove error handling (#24902) 2025-09-01 17:11:49 +08:00
非法操作
4d0ff5c281 feat: implement variable synchronization for webhook node (#24874)
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 16:58:06 +08:00
lyzno1
327b354cc2 refactor: unify trigger node architecture and clean up technical debt (#24886)
Co-authored-by: hjlarry <hjlarry@163.com>
Co-authored-by: Claude <noreply@anthropic.com>
2025-09-01 15:47:44 +08:00
lyzno1
6d307cc9fc Fix test run shortcut consistency and improve dropdown styling (#24849) 2025-09-01 14:47:21 +08:00
lyzno1
adc7134af5 fix: improve TimePicker footer layout and button styling (#24831) 2025-09-01 13:34:53 +08:00
cathy
10f19cd0c2 fix(webhook): add content-type aware parameter type handling (#24865) 2025-09-01 10:06:26 +08:00
lyzno1
9ed45594c6 fix: improve schedule trigger and quick settings app-operation btns ui (#24843) 2025-08-31 16:59:49 +08:00
非法操作
c138f4c3a6 fix: check AppTrigger status before webhook execution (#24829)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-30 16:40:21 +08:00
lyzno1
a35be05790 Fix workflow card toggle logic and implement minimal state UI (#24822) 2025-08-30 16:35:34 +08:00
lyzno1
60b5ed8e5d fix: enhance webhook trigger panel UI consistency and user experience (#24780) 2025-08-29 17:41:42 +08:00
lyzno1
d8ddbc4d87 feat: enhance webhook trigger panel UI consistency and interactivity (#24759)
Co-authored-by: hjlarry <hjlarry@163.com>
2025-08-29 14:24:23 +08:00
非法操作
19c0fc85e2 feat: when add/delete webhook trigger call the API (#24755) 2025-08-29 14:23:50 +08:00
lyzno1
a58df35ead fix: improve trigger card layout spacing and remove dividers (#24756) 2025-08-29 13:37:44 +08:00
lyzno1
9789bd02d8 feat: implement trigger card component with auto-refresh (#24743) 2025-08-29 11:57:08 +08:00
lyzno1
d94e54923f Improve tooltip design for trigger blocks (#24724) 2025-08-28 23:18:00 +08:00
lyzno1
64c7be59b7 Improve workflow block selector search functionality (#24707) 2025-08-28 17:21:34 +08:00
非法操作
89ad6ad902 feat: add app trigger list api (#24693) 2025-08-28 15:23:08 +08:00
lyzno1
4f73bc9693 fix(schedule): add time logic to weekly frequency mode for consistent behavior with daily mode (#24673) 2025-08-28 14:40:11 +08:00
lyzno1
add6b79231 UI enhancements for workflow checklist component (#24647) 2025-08-28 10:10:10 +08:00
lyzno1
c90dad566f feat: enhance workflow error handling and internationalization (#24648) 2025-08-28 09:41:22 +08:00
lyzno1
5cbe6bf8f8 fix(schedule): correct weekly frequency weekday calculation algorithm (#24641) 2025-08-27 18:20:09 +08:00
Yeuoly
4ef6ff217e fix: improve code quality in webhook services and controllers (#24634)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:50:51 +08:00
lyzno1
87abfbf515 Allow empty workflows and improve workflow validation (#24627) 2025-08-27 17:49:09 +08:00
lyzno1
73e65fd838 feat: align trigger webhook style with schedule node and fix selection border truncation (#24635) 2025-08-27 17:47:14 +08:00
Yeuoly
e53edb0fc2 refactor: optimize TenantDailyRateLimiter to use UTC internally with timezone-aware error messages (#24632)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-27 17:35:04 +08:00
非法操作
17908fbf6b fix: only workflow should display start modal (#24623) 2025-08-27 16:20:31 +08:00
zhangxuhe1
3dae108f84 refactor(sidebar): Restructure app operations with toggle functionality (#24625) 2025-08-27 16:20:17 +08:00
lyzno1
5bbf685035 feat: fix i18n missing keys and merge upstream/main (#24615)
Signed-off-by: -LAN- <laipz8200@outlook.com>
Signed-off-by: kenwoodjw <blackxin55+@gmail.com>
Signed-off-by: Yongtao Huang <yongtaoh2022@gmail.com>
Signed-off-by: yihong0618 <zouzou0208@gmail.com>
Signed-off-by: zhanluxianshen <zhanluxianshen@163.com>
Co-authored-by: -LAN- <laipz8200@outlook.com>
Co-authored-by: GuanMu <ballmanjq@gmail.com>
Co-authored-by: Davide Delbianco <davide.delbianco@outlook.com>
Co-authored-by: NeatGuyCoding <15627489+NeatGuyCoding@users.noreply.github.com>
Co-authored-by: kenwoodjw <blackxin55+@gmail.com>
Co-authored-by: Yongtao Huang <yongtaoh2022@gmail.com>
Co-authored-by: Yongtao Huang <99629139+hyongtao-db@users.noreply.github.com>
Co-authored-by: Qiang Lee <18018968632@163.com>
Co-authored-by: 李强04 <liqiang04@gaotu.cn>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: Asuka Minato <i@asukaminato.eu.org>
Co-authored-by: Matri Qi <matrixdom@126.com>
Co-authored-by: huayaoyue6 <huayaoyue@163.com>
Co-authored-by: Bowen Liang <liangbowen@gf.com.cn>
Co-authored-by: znn <jubinkumarsoni@gmail.com>
Co-authored-by: crazywoola <427733928@qq.com>
Co-authored-by: crazywoola <100913391+crazywoola@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: yihong <zouzou0208@gmail.com>
Co-authored-by: Muke Wang <shaodwaaron@gmail.com>
Co-authored-by: wangmuke <wangmuke@kingsware.cn>
Co-authored-by: Wu Tianwei <30284043+WTW0313@users.noreply.github.com>
Co-authored-by: quicksand <quicksandzn@gmail.com>
Co-authored-by: 非法操作 <hjlarry@163.com>
Co-authored-by: zxhlyh <jasonapring2015@outlook.com>
Co-authored-by: Eric Guo <eric.guocz@gmail.com>
Co-authored-by: Zhedong Cen <cenzhedong2@126.com>
Co-authored-by: jiangbo721 <jiangbo721@163.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: hjlarry <25834719+hjlarry@users.noreply.github.com>
Co-authored-by: lxsummer <35754229+lxjustdoit@users.noreply.github.com>
Co-authored-by: 湛露先生 <zhanluxianshen@163.com>
Co-authored-by: Guangdong Liu <liugddx@gmail.com>
Co-authored-by: QuantumGhost <obelisk.reg+git@gmail.com>
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: Yessenia-d <yessenia.contact@gmail.com>
Co-authored-by: huangzhuo1949 <167434202+huangzhuo1949@users.noreply.github.com>
Co-authored-by: huangzhuo <huangzhuo1@xiaomi.com>
Co-authored-by: 17hz <0x149527@gmail.com>
Co-authored-by: Amy <1530140574@qq.com>
Co-authored-by: Joel <iamjoel007@gmail.com>
Co-authored-by: Nite Knite <nkCoding@gmail.com>
Co-authored-by: Yeuoly <45712896+Yeuoly@users.noreply.github.com>
Co-authored-by: Petrus Han <petrus.hanks@gmail.com>
Co-authored-by: iamjoel <2120155+iamjoel@users.noreply.github.com>
Co-authored-by: Kalo Chin <frog.beepers.0n@icloud.com>
Co-authored-by: Ujjwal Maurya <ujjwalsbx@gmail.com>
Co-authored-by: Maries <xh001x@hotmail.com>
2025-08-27 15:07:28 +08:00
非法操作
a63d1e87b1 feat: webhook trigger backend api (#24387)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-27 14:42:45 +08:00
lyzno1
7129de98cd feat: implement workflow onboarding modal system (#24551) 2025-08-27 13:31:22 +08:00
非法操作
2984dbc0df fix: can't open service api when workflow has no start node (#24564) 2025-08-26 18:06:11 +08:00
非法操作
392db7f611 fix: can't save when workflow only has a trigger node (#24546) 2025-08-26 16:41:47 +08:00
lyzno1
5a427b8daa refactor: rename RunAllTriggers icon to TriggerAll for semantic clarity (#24478)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-25 17:51:04 +08:00
Yeuoly
18f2e6f166 refactor: Use specific error types for workflow execution (#24475)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-25 16:19:12 +08:00
lyzno1
e78903302f feat(trigger-schedule): simplify timezone handling with user-centric approach (#24401) 2025-08-24 21:03:59 +08:00
cathy
4084ade86c refactor(trigger-webhook): remove redundant WebhookParam type and sim… (#24390) 2025-08-24 00:21:47 +08:00
cathy
6b0d919dbd feat: webhook trigger frontend (#24311) 2025-08-23 23:54:41 +08:00
Yeuoly
a7b558b38b feat/trigger: support specifying root node (#24388)
Co-authored-by: Claude <noreply@anthropic.com>
2025-08-23 20:44:03 +08:00
Yeuoly
6aed7e3ff4 feat/trigger universal entry (#24358)
Co-authored-by: Claude <noreply@anthropic.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
2025-08-23 20:18:08 +08:00
lyzno1
8e93a8a2e2 refactor: comprehensive schedule trigger component redesign (#24359)
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-23 11:03:18 +08:00
Yeuoly
e38a86e37b Merge branch 'main' into feat/trigger 2025-08-22 20:11:49 +08:00
lyzno1
392e3530bf feat: replace mock data with dynamic workflow options in test run dropdown (#24320) 2025-08-22 16:36:09 +08:00
lyzno1
833c902b2b feat(workflow): Plugin Trigger Node with Unified Entry Node System (#24205) 2025-08-20 23:49:10 +08:00
lyzno1
6eaea64b3f feat: implement multi-select monthly trigger schedule (#24247) 2025-08-20 06:23:30 -07:00
lyzno1
5303b50737 fix: initialize recur fields when switching to hourly frequency (#24181) 2025-08-20 09:32:05 +08:00
lyzno1
6acbcfe679 UI improvements: fix translation and custom icons for schedule trigger (#24167) 2025-08-19 18:27:07 +08:00
lyzno1
16ef5ebb97 fix: remove duplicate weekdays keys in i18n workflow files (#24157) 2025-08-19 14:55:16 +08:00
lyzno1
acfb95f9c2 Refactor Start node UI to User Input and optimize EntryNodeContainer (#24156) 2025-08-19 14:40:24 +08:00
lyzno1
aacea166d7 fix: resolve merge conflict between Features removal and validation enhancement (#24150) 2025-08-19 13:47:38 +08:00
lyzno1
f7bb3b852a feat: implement Schedule Trigger validation with multi-start node topology support (#24134) 2025-08-19 11:55:15 +08:00
lyzno1
d4ff1e031a Remove workflow features button (#24085) 2025-08-19 09:32:07 +08:00
lyzno1
6a3d135d49 fix: simplify trigger-schedule hourly mode calculation and improve UI consistency (#24082)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 23:37:57 +08:00
lyzno1
5c4bf7aabd feat: Test Run dropdown with dynamic trigger selection (#24113) 2025-08-18 17:46:36 +08:00
lyzno1
e9c7dc7464 feat: update workflow run button to Test Run with keyboard shortcut (#24071) 2025-08-18 10:44:17 +08:00
lyzno1
74ad21b145 feat: comprehensive trigger node system with Schedule Trigger implementation (#24039)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-18 09:23:16 +08:00
lyzno1
f214eeb7b1 feat: add scroll to selected node button in workflow header (#24030)
Co-authored-by: zhangxuhe1 <xuhezhang6@gmail.com>
2025-08-16 19:26:44 +08:00
lyzno1
ae25f90f34 Replace export button with more actions button in workflow control panel (#24033) 2025-08-16 19:25:18 +08:00
2842 changed files with 243029 additions and 43052 deletions

.cursorrules Normal file

@@ -0,0 +1,6 @@
# Cursor Rules for Dify Project
## Automated Test Generation
- Use `web/testing/testing.md` as the canonical instruction set for generating frontend automated tests.
- When proposing or saving tests, re-read that document and follow every requirement.


@@ -11,7 +11,7 @@
"nodeGypDependencies": true,
"version": "lts"
},
"ghcr.io/devcontainers-contrib/features/npm-package:1": {
"ghcr.io/devcontainers-extra/features/npm-package:1": {
"package": "typescript",
"version": "latest"
},


@@ -6,11 +6,10 @@ cd web && pnpm install
pipx install uv
echo "alias start-api=\"cd $WORKSPACE_ROOT/api && uv run python -m flask run --host 0.0.0.0 --port=5001 --debug\"" >> ~/.bashrc
echo "alias start-worker=\"cd $WORKSPACE_ROOT/api && uv run python -m celery -A app.celery worker -P gevent -c 1 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion,plugin,workflow_storage\"" >> ~/.bashrc
echo "alias start-worker=\"cd $WORKSPACE_ROOT/api && uv run python -m celery -A app.celery worker -P threads -c 1 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor\"" >> ~/.bashrc
echo "alias start-web=\"cd $WORKSPACE_ROOT/web && pnpm dev\"" >> ~/.bashrc
echo "alias start-web-prod=\"cd $WORKSPACE_ROOT/web && pnpm build && pnpm start\"" >> ~/.bashrc
echo "alias start-containers=\"cd $WORKSPACE_ROOT/docker && docker-compose -f docker-compose.middleware.yaml -p dify --env-file middleware.env up -d\"" >> ~/.bashrc
echo "alias stop-containers=\"cd $WORKSPACE_ROOT/docker && docker-compose -f docker-compose.middleware.yaml -p dify --env-file middleware.env down\"" >> ~/.bashrc
source /home/vscode/.bashrc


@@ -29,7 +29,7 @@ trim_trailing_whitespace = false
# Matches multiple files with brace expansion notation
# Set default charset
[*.{js,tsx}]
[*.{js,jsx,ts,tsx,mjs}]
indent_style = space
indent_size = 2

.github/CODEOWNERS vendored Normal file

@@ -0,0 +1,226 @@
# CODEOWNERS
# This file defines code ownership for the Dify project.
# Each line is a file pattern followed by one or more owners.
# Owners can be @username, @org/team-name, or email addresses.
# For more information, see: https://docs.github.com/en/repositories/managing-your-repositorys-settings-and-features/customizing-your-repository/about-code-owners
* @crazywoola @laipz8200 @Yeuoly
# Backend (default owner, more specific rules below will override)
api/ @QuantumGhost
# Backend - Workflow - Engine (Core graph execution engine)
api/core/workflow/graph_engine/ @laipz8200 @QuantumGhost
api/core/workflow/runtime/ @laipz8200 @QuantumGhost
api/core/workflow/graph/ @laipz8200 @QuantumGhost
api/core/workflow/graph_events/ @laipz8200 @QuantumGhost
api/core/workflow/node_events/ @laipz8200 @QuantumGhost
api/core/model_runtime/ @laipz8200 @QuantumGhost
# Backend - Workflow - Nodes (Agent, Iteration, Loop, LLM)
api/core/workflow/nodes/agent/ @Nov1c444
api/core/workflow/nodes/iteration/ @Nov1c444
api/core/workflow/nodes/loop/ @Nov1c444
api/core/workflow/nodes/llm/ @Nov1c444
# Backend - RAG (Retrieval Augmented Generation)
api/core/rag/ @JohnJyong
api/services/rag_pipeline/ @JohnJyong
api/services/dataset_service.py @JohnJyong
api/services/knowledge_service.py @JohnJyong
api/services/external_knowledge_service.py @JohnJyong
api/services/hit_testing_service.py @JohnJyong
api/services/metadata_service.py @JohnJyong
api/services/vector_service.py @JohnJyong
api/services/entities/knowledge_entities/ @JohnJyong
api/services/entities/external_knowledge_entities/ @JohnJyong
api/controllers/console/datasets/ @JohnJyong
api/controllers/service_api/dataset/ @JohnJyong
api/models/dataset.py @JohnJyong
api/tasks/rag_pipeline/ @JohnJyong
api/tasks/add_document_to_index_task.py @JohnJyong
api/tasks/batch_clean_document_task.py @JohnJyong
api/tasks/clean_document_task.py @JohnJyong
api/tasks/clean_notion_document_task.py @JohnJyong
api/tasks/document_indexing_task.py @JohnJyong
api/tasks/document_indexing_sync_task.py @JohnJyong
api/tasks/document_indexing_update_task.py @JohnJyong
api/tasks/duplicate_document_indexing_task.py @JohnJyong
api/tasks/recover_document_indexing_task.py @JohnJyong
api/tasks/remove_document_from_index_task.py @JohnJyong
api/tasks/retry_document_indexing_task.py @JohnJyong
api/tasks/sync_website_document_indexing_task.py @JohnJyong
api/tasks/batch_create_segment_to_index_task.py @JohnJyong
api/tasks/create_segment_to_index_task.py @JohnJyong
api/tasks/delete_segment_from_index_task.py @JohnJyong
api/tasks/disable_segment_from_index_task.py @JohnJyong
api/tasks/disable_segments_from_index_task.py @JohnJyong
api/tasks/enable_segment_to_index_task.py @JohnJyong
api/tasks/enable_segments_to_index_task.py @JohnJyong
api/tasks/clean_dataset_task.py @JohnJyong
api/tasks/deal_dataset_index_update_task.py @JohnJyong
api/tasks/deal_dataset_vector_index_task.py @JohnJyong
# Backend - Plugins
api/core/plugin/ @Mairuis @Yeuoly @Stream29
api/services/plugin/ @Mairuis @Yeuoly @Stream29
api/controllers/console/workspace/plugin.py @Mairuis @Yeuoly @Stream29
api/controllers/inner_api/plugin/ @Mairuis @Yeuoly @Stream29
api/tasks/process_tenant_plugin_autoupgrade_check_task.py @Mairuis @Yeuoly @Stream29
# Backend - Trigger/Schedule/Webhook
api/controllers/trigger/ @Mairuis @Yeuoly
api/controllers/console/app/workflow_trigger.py @Mairuis @Yeuoly
api/controllers/console/workspace/trigger_providers.py @Mairuis @Yeuoly
api/core/trigger/ @Mairuis @Yeuoly
api/core/app/layers/trigger_post_layer.py @Mairuis @Yeuoly
api/services/trigger/ @Mairuis @Yeuoly
api/models/trigger.py @Mairuis @Yeuoly
api/fields/workflow_trigger_fields.py @Mairuis @Yeuoly
api/repositories/workflow_trigger_log_repository.py @Mairuis @Yeuoly
api/repositories/sqlalchemy_workflow_trigger_log_repository.py @Mairuis @Yeuoly
api/libs/schedule_utils.py @Mairuis @Yeuoly
api/services/workflow/scheduler.py @Mairuis @Yeuoly
api/schedule/trigger_provider_refresh_task.py @Mairuis @Yeuoly
api/schedule/workflow_schedule_task.py @Mairuis @Yeuoly
api/tasks/trigger_processing_tasks.py @Mairuis @Yeuoly
api/tasks/trigger_subscription_refresh_tasks.py @Mairuis @Yeuoly
api/tasks/workflow_schedule_tasks.py @Mairuis @Yeuoly
api/tasks/workflow_cfs_scheduler/ @Mairuis @Yeuoly
api/events/event_handlers/sync_plugin_trigger_when_app_created.py @Mairuis @Yeuoly
api/events/event_handlers/update_app_triggers_when_app_published_workflow_updated.py @Mairuis @Yeuoly
api/events/event_handlers/sync_workflow_schedule_when_app_published.py @Mairuis @Yeuoly
api/events/event_handlers/sync_webhook_when_app_created.py @Mairuis @Yeuoly
# Backend - Async Workflow
api/services/async_workflow_service.py @Mairuis @Yeuoly
api/tasks/async_workflow_tasks.py @Mairuis @Yeuoly
# Backend - Billing
api/services/billing_service.py @hj24 @zyssyz123
api/controllers/console/billing/ @hj24 @zyssyz123
# Backend - Enterprise
api/configs/enterprise/ @GarfieldDai @GareArc
api/services/enterprise/ @GarfieldDai @GareArc
api/services/feature_service.py @GarfieldDai @GareArc
api/controllers/console/feature.py @GarfieldDai @GareArc
api/controllers/web/feature.py @GarfieldDai @GareArc
# Backend - Database Migrations
api/migrations/ @snakevash @laipz8200
# Frontend
web/ @iamjoel
# Frontend - App - Orchestration
web/app/components/workflow/ @iamjoel @zxhlyh
web/app/components/workflow-app/ @iamjoel @zxhlyh
web/app/components/app/configuration/ @iamjoel @zxhlyh
web/app/components/app/app-publisher/ @iamjoel @zxhlyh
# Frontend - WebApp - Chat
web/app/components/base/chat/ @iamjoel @zxhlyh
# Frontend - WebApp - Completion
web/app/components/share/text-generation/ @iamjoel @zxhlyh
# Frontend - App - List and Creation
web/app/components/apps/ @JzoNgKVO @iamjoel
web/app/components/app/create-app-dialog/ @JzoNgKVO @iamjoel
web/app/components/app/create-app-modal/ @JzoNgKVO @iamjoel
web/app/components/app/create-from-dsl-modal/ @JzoNgKVO @iamjoel
# Frontend - App - API Documentation
web/app/components/develop/ @JzoNgKVO @iamjoel
# Frontend - App - Logs and Annotations
web/app/components/app/workflow-log/ @JzoNgKVO @iamjoel
web/app/components/app/log/ @JzoNgKVO @iamjoel
web/app/components/app/log-annotation/ @JzoNgKVO @iamjoel
web/app/components/app/annotation/ @JzoNgKVO @iamjoel
# Frontend - App - Monitoring
web/app/(commonLayout)/app/(appDetailLayout)/\[appId\]/overview/ @JzoNgKVO @iamjoel
web/app/components/app/overview/ @JzoNgKVO @iamjoel
# Frontend - App - Settings
web/app/components/app-sidebar/ @JzoNgKVO @iamjoel
# Frontend - RAG - Hit Testing
web/app/components/datasets/hit-testing/ @JzoNgKVO @iamjoel
# Frontend - RAG - List and Creation
web/app/components/datasets/list/ @iamjoel @WTW0313
web/app/components/datasets/create/ @iamjoel @WTW0313
web/app/components/datasets/create-from-pipeline/ @iamjoel @WTW0313
web/app/components/datasets/external-knowledge-base/ @iamjoel @WTW0313
# Frontend - RAG - Orchestration (general rule first, specific rules below override)
web/app/components/rag-pipeline/ @iamjoel @WTW0313
web/app/components/rag-pipeline/components/rag-pipeline-main.tsx @iamjoel @zxhlyh
web/app/components/rag-pipeline/store/ @iamjoel @zxhlyh
# Frontend - RAG - Documents List
web/app/components/datasets/documents/list.tsx @iamjoel @WTW0313
web/app/components/datasets/documents/create-from-pipeline/ @iamjoel @WTW0313
# Frontend - RAG - Segments List
web/app/components/datasets/documents/detail/ @iamjoel @WTW0313
# Frontend - RAG - Settings
web/app/components/datasets/settings/ @iamjoel @WTW0313
# Frontend - Ecosystem - Plugins
web/app/components/plugins/ @iamjoel @zhsama
# Frontend - Ecosystem - Tools
web/app/components/tools/ @iamjoel @Yessenia-d
# Frontend - Ecosystem - MarketPlace
web/app/components/plugins/marketplace/ @iamjoel @Yessenia-d
# Frontend - Login and Registration
web/app/signin/ @douxc @iamjoel
web/app/signup/ @douxc @iamjoel
web/app/reset-password/ @douxc @iamjoel
web/app/install/ @douxc @iamjoel
web/app/init/ @douxc @iamjoel
web/app/forgot-password/ @douxc @iamjoel
web/app/account/ @douxc @iamjoel
# Frontend - Service Authentication
web/service/base.ts @douxc @iamjoel
# Frontend - WebApp Authentication and Access Control
web/app/(shareLayout)/components/ @douxc @iamjoel
web/app/(shareLayout)/webapp-signin/ @douxc @iamjoel
web/app/(shareLayout)/webapp-reset-password/ @douxc @iamjoel
web/app/components/app/app-access-control/ @douxc @iamjoel
# Frontend - Explore Page
web/app/components/explore/ @CodingOnStar @iamjoel
# Frontend - Personal Settings
web/app/components/header/account-setting/ @CodingOnStar @iamjoel
web/app/components/header/account-dropdown/ @CodingOnStar @iamjoel
# Frontend - Analytics
web/app/components/base/ga/ @CodingOnStar @iamjoel
# Frontend - Base Components
web/app/components/base/ @iamjoel @zxhlyh
# Frontend - Utils and Hooks
web/utils/classnames.ts @iamjoel @zxhlyh
web/utils/time.ts @iamjoel @zxhlyh
web/utils/format.ts @iamjoel @zxhlyh
web/utils/clipboard.ts @iamjoel @zxhlyh
web/hooks/use-document-title.ts @iamjoel @zxhlyh
# Frontend - Billing and Education
web/app/components/billing/ @iamjoel @zxhlyh
web/app/education-apply/ @iamjoel @zxhlyh
# Frontend - Workspace
web/app/components/header/account-dropdown/workplace-selector/ @iamjoel @zxhlyh

.github/copilot-instructions.md vendored Normal file

@@ -0,0 +1,12 @@
# Copilot Instructions
GitHub Copilot must follow the unified frontend testing requirements documented in `web/testing/testing.md`.
Key reminders:
- Generate tests using the mandated tech stack, naming, and code style (AAA pattern, `fireEvent`, descriptive test names, cleans up mocks).
- Cover rendering, prop combinations, and edge cases by default; extend coverage for hooks, routing, async flows, and domain-specific components when applicable.
- Target >95% line and branch coverage and 100% function/statement coverage.
- Apply the project's mocking conventions for i18n, toast notifications, and Next.js utilities.
Any suggestions from Copilot that conflict with `web/testing/testing.md` should be revised before acceptance.


@@ -39,25 +39,11 @@ jobs:
- name: Install dependencies
run: uv sync --project api --dev
- name: Run Unit tests
run: |
uv run --project api bash dev/pytest/pytest_unit_tests.sh
- name: Run pyrefly check
run: |
cd api
uv add --dev pyrefly
uv run pyrefly check || true
- name: Coverage Summary
run: |
set -x
# Extract coverage percentage and create a summary
TOTAL_COVERAGE=$(python -c 'import json; print(json.load(open("coverage.json"))["totals"]["percent_covered_display"])')
# Create a detailed coverage summary
echo "### Test Coverage Summary :test_tube:" >> $GITHUB_STEP_SUMMARY
echo "Total Coverage: ${TOTAL_COVERAGE}%" >> $GITHUB_STEP_SUMMARY
uv run --project api coverage report --format=markdown >> $GITHUB_STEP_SUMMARY
- name: Run dify config tests
run: uv run --project api dev/pytest/pytest_config_tests.py
@@ -76,7 +62,7 @@ jobs:
compose-file: |
docker/docker-compose.middleware.yaml
services: |
db
db_postgres
redis
sandbox
ssrf_proxy
@@ -93,3 +79,19 @@ jobs:
- name: Run TestContainers
run: uv run --project api bash dev/pytest/pytest_testcontainers.sh
- name: Run Unit tests
run: |
uv run --project api bash dev/pytest/pytest_unit_tests.sh
- name: Coverage Summary
run: |
set -x
# Extract coverage percentage and create a summary
TOTAL_COVERAGE=$(python -c 'import json; print(json.load(open("coverage.json"))["totals"]["percent_covered_display"])')
# Create a detailed coverage summary
echo "### Test Coverage Summary :test_tube:" >> $GITHUB_STEP_SUMMARY
echo "Total Coverage: ${TOTAL_COVERAGE}%" >> $GITHUB_STEP_SUMMARY
uv run --project api coverage report --format=markdown >> $GITHUB_STEP_SUMMARY


@@ -2,6 +2,8 @@ name: autofix.ci
on:
pull_request:
branches: ["main"]
push:
branches: ["main"]
permissions:
contents: read
@@ -26,10 +28,17 @@ jobs:
# Format code
uv run ruff format ..
- name: count migration progress
run: |
cd api
./cnt_base.sh
- name: ast-grep
run: |
uvx --from ast-grep-cli sg --pattern 'db.session.query($WHATEVER).filter($HERE)' --rewrite 'db.session.query($WHATEVER).where($HERE)' -l py --update-all
uvx --from ast-grep-cli sg --pattern 'session.query($WHATEVER).filter($HERE)' --rewrite 'session.query($WHATEVER).where($HERE)' -l py --update-all
uvx --from ast-grep-cli sg -p '$A = db.Column($$$B)' -r '$A = mapped_column($$$B)' -l py --update-all
uvx --from ast-grep-cli sg -p '$A : $T = db.Column($$$B)' -r '$A : $T = mapped_column($$$B)' -l py --update-all
# Convert Optional[T] to T | None (ignoring quoted types)
cat > /tmp/optional-rule.yml << 'EOF'
id: convert-optional-to-union


@@ -4,8 +4,7 @@ on:
push:
branches:
- "main"
- "deploy/dev"
- "deploy/enterprise"
- "deploy/**"
- "build/**"
- "release/e-*"
- "hotfix/**"


@@ -8,7 +8,7 @@ concurrency:
cancel-in-progress: true
jobs:
db-migration-test:
db-migration-test-postgres:
runs-on: ubuntu-latest
steps:
@@ -45,7 +45,7 @@ jobs:
compose-file: |
docker/docker-compose.middleware.yaml
services: |
db
db_postgres
redis
- name: Prepare configs
@@ -57,3 +57,60 @@ jobs:
env:
DEBUG: true
run: uv run --directory api flask upgrade-db
db-migration-test-mysql:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Setup UV and Python
uses: astral-sh/setup-uv@v6
with:
enable-cache: true
python-version: "3.12"
cache-dependency-glob: api/uv.lock
- name: Install dependencies
run: uv sync --project api
- name: Ensure Offline migration are supported
run: |
# upgrade
uv run --directory api flask db upgrade 'base:head' --sql
# downgrade
uv run --directory api flask db downgrade 'head:base' --sql
- name: Prepare middleware env for MySQL
run: |
cd docker
cp middleware.env.example middleware.env
sed -i 's/DB_TYPE=postgresql/DB_TYPE=mysql/' middleware.env
sed -i 's/DB_HOST=db_postgres/DB_HOST=db_mysql/' middleware.env
sed -i 's/DB_PORT=5432/DB_PORT=3306/' middleware.env
sed -i 's/DB_USERNAME=postgres/DB_USERNAME=mysql/' middleware.env
- name: Set up Middlewares
uses: hoverkraft-tech/compose-action@v2.0.2
with:
compose-file: |
docker/docker-compose.middleware.yaml
services: |
db_mysql
redis
- name: Prepare configs for MySQL
run: |
cd api
cp .env.example .env
sed -i 's/DB_TYPE=postgresql/DB_TYPE=mysql/' .env
sed -i 's/DB_PORT=5432/DB_PORT=3306/' .env
sed -i 's/DB_USERNAME=postgres/DB_USERNAME=root/' .env
- name: Run DB Migration
env:
DEBUG: true
run: uv run --directory api flask upgrade-db

View File

@@ -18,7 +18,7 @@ jobs:
- name: Deploy to server
uses: appleboy/ssh-action@v0.1.8
with:
host: ${{ secrets.RAG_SSH_HOST }}
host: ${{ secrets.SSH_HOST }}
username: ${{ secrets.SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
script: |

View File

@@ -0,0 +1,28 @@
name: Deploy Trigger Dev
permissions:
contents: read
on:
workflow_run:
workflows: ["Build and Push API & Web"]
branches:
- "deploy/end-user-oauth"
types:
- completed
jobs:
deploy:
runs-on: ubuntu-latest
if: |
github.event.workflow_run.conclusion == 'success' &&
github.event.workflow_run.head_branch == 'deploy/end-user-oauth'
steps:
- name: Deploy to server
uses: appleboy/ssh-action@v0.1.8
with:
host: ${{ secrets.TRIGGER_SSH_HOST }}
username: ${{ secrets.SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
script: |
${{ vars.SSH_SCRIPT || secrets.SSH_SCRIPT }}

View File

@@ -1,4 +1,4 @@
name: Deploy RAG Dev
name: Deploy Trigger Dev
permissions:
contents: read
@@ -7,7 +7,7 @@ on:
workflow_run:
workflows: ["Build and Push API & Web"]
branches:
- "deploy/rag-dev"
- "deploy/trigger-dev"
types:
- completed
@@ -16,12 +16,12 @@ jobs:
runs-on: ubuntu-latest
if: |
github.event.workflow_run.conclusion == 'success' &&
github.event.workflow_run.head_branch == 'deploy/rag-dev'
github.event.workflow_run.head_branch == 'deploy/trigger-dev'
steps:
- name: Deploy to server
uses: appleboy/ssh-action@v0.1.8
with:
host: ${{ secrets.RAG_SSH_HOST }}
host: ${{ secrets.TRIGGER_SSH_HOST }}
username: ${{ secrets.SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
script: |

View File

@@ -1,6 +1,7 @@
#!/bin/bash
yq eval '.services.weaviate.ports += ["8080:8080"]' -i docker/docker-compose.yaml
yq eval '.services.weaviate.ports += ["50051:50051"]' -i docker/docker-compose.yaml
yq eval '.services.qdrant.ports += ["6333:6333"]' -i docker/docker-compose.yaml
yq eval '.services.chroma.ports += ["8000:8000"]' -i docker/docker-compose.yaml
yq eval '.services["milvus-standalone"].ports += ["19530:19530"]' -i docker/docker-compose.yaml
@@ -13,4 +14,4 @@ yq eval '.services.tidb.ports += ["4000:4000"]' -i docker/tidb/docker-compose.ya
yq eval '.services.oceanbase.ports += ["2881:2881"]' -i docker/docker-compose.yaml
yq eval '.services.opengauss.ports += ["6600:6600"]' -i docker/docker-compose.yaml
echo "Ports exposed for sandbox, weaviate, tidb, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch, couchbase, opengauss"
echo "Ports exposed for sandbox, weaviate (HTTP 8080, gRPC 50051), tidb, qdrant, chroma, milvus, pgvector, pgvecto-rs, elasticsearch, couchbase, opengauss"

View File

@@ -103,6 +103,11 @@ jobs:
run: |
pnpm run lint
- name: Web type check
if: steps.changed-files.outputs.any_changed == 'true'
working-directory: ./web
run: pnpm run type-check
docker-compose-template:
name: Docker Compose Template
runs-on: ubuntu-latest

View File

@@ -20,22 +20,22 @@ jobs:
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 2
fetch-depth: 0
token: ${{ secrets.GITHUB_TOKEN }}
- name: Check for file changes in i18n/en-US
id: check_files
run: |
recent_commit_sha=$(git rev-parse HEAD)
second_recent_commit_sha=$(git rev-parse HEAD~1)
changed_files=$(git diff --name-only $recent_commit_sha $second_recent_commit_sha -- 'i18n/en-US/*.ts')
git fetch origin "${{ github.event.before }}" || true
git fetch origin "${{ github.sha }}" || true
changed_files=$(git diff --name-only "${{ github.event.before }}" "${{ github.sha }}" -- 'i18n/en-US/*.ts')
echo "Changed files: $changed_files"
if [ -n "$changed_files" ]; then
echo "FILES_CHANGED=true" >> $GITHUB_ENV
file_args=""
for file in $changed_files; do
filename=$(basename "$file" .ts)
file_args="$file_args --file=$filename"
file_args="$file_args --file $filename"
done
echo "FILE_ARGS=$file_args" >> $GITHUB_ENV
echo "File arguments: $file_args"
@@ -77,12 +77,15 @@ jobs:
uses: peter-evans/create-pull-request@v6
with:
token: ${{ secrets.GITHUB_TOKEN }}
commit-message: Update i18n files and type definitions based on en-US changes
title: 'chore: translate i18n files and update type definitions'
commit-message: 'chore(i18n): update translations based on en-US changes'
title: 'chore(i18n): translate i18n files and update type definitions'
body: |
This PR was automatically created to update i18n files and TypeScript type definitions based on changes in en-US locale.
**Triggered by:** ${{ github.sha }}
**Changes included:**
- Updated translation files for all locales
- Regenerated TypeScript type definitions for type safety
branch: chore/automated-i18n-updates
branch: chore/automated-i18n-updates-${{ github.sha }}
delete-branch: true

View File

@@ -51,13 +51,13 @@ jobs:
- name: Expose Service Ports
run: sh .github/workflows/expose_service_ports.sh
- name: Set up Vector Store (TiDB)
uses: hoverkraft-tech/compose-action@v2.0.2
with:
compose-file: docker/tidb/docker-compose.yaml
services: |
tidb
tiflash
# - name: Set up Vector Store (TiDB)
# uses: hoverkraft-tech/compose-action@v2.0.2
# with:
# compose-file: docker/tidb/docker-compose.yaml
# services: |
# tidb
# tiflash
- name: Set up Vector Stores (Weaviate, Qdrant, PGVector, Milvus, PgVecto-RS, Chroma, MyScale, ElasticSearch, Couchbase, OceanBase)
uses: hoverkraft-tech/compose-action@v2.0.2
@@ -83,8 +83,8 @@ jobs:
ls -lah .
cp api/tests/integration_tests/.env.example api/tests/integration_tests/.env
- name: Check VDB Ready (TiDB)
run: uv run --project api python api/tests/integration_tests/vdb/tidb_vector/check_tiflash_ready.py
# - name: Check VDB Ready (TiDB)
# run: uv run --project api python api/tests/integration_tests/vdb/tidb_vector/check_tiflash_ready.py
- name: Test Vector Stores
run: uv run --project api bash dev/pytest/pytest_vdb.sh

11
.gitignore vendored
View File

@@ -6,6 +6,9 @@ __pycache__/
# C extensions
*.so
# *db files
*.db
# Distribution / packaging
.Python
build/
@@ -97,6 +100,7 @@ __pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat-schedule.db
celerybeat.pid
# SageMath parsed files
@@ -182,6 +186,8 @@ docker/volumes/couchbase/*
docker/volumes/oceanbase/*
docker/volumes/plugin_daemon/*
docker/volumes/matrixone/*
docker/volumes/mysql/*
docker/volumes/seekdb/*
!docker/volumes/oceanbase/init.d
docker/nginx/conf.d/default.conf
@@ -234,4 +240,7 @@ scripts/stress-test/reports/
# mcp
.playwright-mcp/
.serena/
.serena/
# settings
*.local.json

View File

@@ -8,8 +8,7 @@
"module": "flask",
"env": {
"FLASK_APP": "app.py",
"FLASK_ENV": "development",
"GEVENT_SUPPORT": "True"
"FLASK_ENV": "development"
},
"args": [
"run",
@@ -28,9 +27,7 @@
"type": "debugpy",
"request": "launch",
"module": "celery",
"env": {
"GEVENT_SUPPORT": "True"
},
"env": {},
"args": [
"-A",
"app.celery",
@@ -40,7 +37,7 @@
"-c",
"1",
"-Q",
"dataset,generation,mail,ops_trace",
"dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor",
"--loglevel",
"INFO"
],

View File

@@ -0,0 +1,5 @@
# Windsurf Testing Rules
- Use `web/testing/testing.md` as the single source of truth for frontend automated testing.
- Honor every requirement in that document when generating or accepting tests.
- When proposing or saving tests, re-read that document and follow every requirement.

View File

@@ -14,7 +14,7 @@ The codebase is split into:
- Run backend CLI commands through `uv run --project api <command>`.
- Backend QA gate requires passing `make lint`, `make type-check`, and `uv run --project api --dev dev/pytest/pytest_unit_tests.sh` before review.
- Before submission, all backend modifications must pass local checks: `make lint`, `make type-check`, and `uv run --project api --dev dev/pytest/pytest_unit_tests.sh`.
- Use Makefile targets for linting and formatting; `make lint` and `make type-check` cover the required checks.

View File

@@ -77,6 +77,8 @@ How we prioritize:
For setting up the frontend service, please refer to our comprehensive [guide](https://github.com/langgenius/dify/blob/main/web/README.md) in the `web/README.md` file. This document provides detailed instructions to help you set up the frontend environment properly.
**Testing**: All React components must have comprehensive test coverage. See [web/testing/testing.md](https://github.com/langgenius/dify/blob/main/web/testing/testing.md) for the canonical frontend testing guidelines and follow every requirement described there.
#### Backend
For setting up the backend service, kindly refer to our detailed [instructions](https://github.com/langgenius/dify/blob/main/api/README.md) in the `api/README.md` file. This document contains step-by-step guidance to help you get the backend up and running smoothly.

View File

@@ -70,6 +70,11 @@ type-check:
@uv run --directory api --dev basedpyright
@echo "✅ Type check complete"
test:
@echo "🧪 Running backend unit tests..."
@uv run --project api --dev dev/pytest/pytest_unit_tests.sh
@echo "✅ Tests complete"
# Build Docker images
build-web:
@echo "Building web Docker image: $(WEB_IMAGE):$(VERSION)..."
@@ -119,6 +124,7 @@ help:
@echo " make check - Check code with ruff"
@echo " make lint - Format and fix code with ruff"
@echo " make type-check - Run type checking with basedpyright"
@echo " make test - Run backend unit tests"
@echo ""
@echo "Docker Build Targets:"
@echo " make build-web - Build web Docker image"
@@ -128,4 +134,4 @@ help:
@echo " make build-push-all - Build and push all Docker images"
# Phony targets
.PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help format check lint type-check
.PHONY: build-web build-api push-web push-api build-all push-all build-push-all dev-setup prepare-docker prepare-web prepare-api dev-clean help format check lint type-check test

View File

@@ -36,6 +36,12 @@
<img alt="Issues closed" src="https://img.shields.io/github/issues-search?query=repo%3Alanggenius%2Fdify%20is%3Aclosed&label=issues%20closed&labelColor=%20%237d89b0&color=%20%235d6b98"></a>
<a href="https://github.com/langgenius/dify/discussions/" target="_blank">
<img alt="Discussion posts" src="https://img.shields.io/github/discussions/langgenius/dify?labelColor=%20%239b8afb&color=%20%237a5af8"></a>
<a href="https://insights.linuxfoundation.org/project/langgenius-dify" target="_blank">
<img alt="LFX Health Score" src="https://insights.linuxfoundation.org/api/badge/health-score?project=langgenius-dify"></a>
<a href="https://insights.linuxfoundation.org/project/langgenius-dify" target="_blank">
<img alt="LFX Contributors" src="https://insights.linuxfoundation.org/api/badge/contributors?project=langgenius-dify"></a>
<a href="https://insights.linuxfoundation.org/project/langgenius-dify" target="_blank">
<img alt="LFX Active Contributors" src="https://insights.linuxfoundation.org/api/badge/active-contributors?project=langgenius-dify"></a>
</p>
<p align="center">
@@ -63,7 +69,7 @@ Dify is an open-source platform for developing LLM applications. Its intuitive i
> - CPU >= 2 Core
> - RAM >= 4 GiB
</br>
<br/>
The easiest way to start the Dify server is through [Docker Compose](docker/docker-compose.yaml). Before running Dify with the following commands, make sure that [Docker](https://docs.docker.com/get-docker/) and [Docker Compose](https://docs.docker.com/compose/install/) are installed on your machine:
@@ -109,15 +115,15 @@ All of Dify's offerings come with corresponding APIs, so you could effortlessly
## Using Dify
- **Cloud </br>**
- **Cloud <br/>**
We host a [Dify Cloud](https://dify.ai) service for anyone to try with zero setup. It provides all the capabilities of the self-deployed version, and includes 200 free GPT-4 calls in the sandbox plan.
- **Self-hosting Dify Community Edition</br>**
- **Self-hosting Dify Community Edition<br/>**
Quickly get Dify running in your environment with this [starter guide](#quick-start).
Use our [documentation](https://docs.dify.ai) for further references and more in-depth instructions.
- **Dify for enterprise / organizations</br>**
We provide additional enterprise-centric features. [Log your questions for us through this chatbot](https://udify.app/chat/22L1zSxg6yW1cWQg) or [send us an email](mailto:business@dify.ai?subject=%5BGitHub%5DBusiness%20License%20Inquiry) to discuss enterprise needs. </br>
- **Dify for enterprise / organizations<br/>**
We provide additional enterprise-centric features. [Send us an email](mailto:business@dify.ai?subject=%5BGitHub%5DBusiness%20License%20Inquiry) to discuss your enterprise needs. <br/>
> For startups and small businesses using AWS, check out [Dify Premium on AWS Marketplace](https://aws.amazon.com/marketplace/pp/prodview-t22mebxzwjhu6) and deploy it to your own AWS VPC with one click. It's an affordable AMI offering with the option to create apps with custom logo and branding.
@@ -129,8 +135,18 @@ Star Dify on GitHub and be instantly notified of new releases.
## Advanced Setup
### Custom configurations
If you need to customize the configuration, please refer to the comments in our [.env.example](docker/.env.example) file and update the corresponding values in your `.env` file. Additionally, you might need to make adjustments to the `docker-compose.yaml` file itself, such as changing image versions, port mappings, or volume mounts, based on your specific deployment environment and requirements. After making any changes, please re-run `docker-compose up -d`. You can find the full list of available environment variables [here](https://docs.dify.ai/getting-started/install-self-hosted/environments).
### Metrics Monitoring with Grafana
Import the dashboard to Grafana, using Dify's PostgreSQL database as data source, to monitor metrics in granularity of apps, tenants, messages, and more.
- [Grafana Dashboard by @bowenliang123](https://github.com/bowenliang123/dify-grafana-dashboard)
### Deployment with Kubernetes
If you'd like to configure a highly-available setup, there are community-contributed [Helm Charts](https://helm.sh/) and YAML files which allow Dify to be deployed on Kubernetes.
- [Helm Chart by @LeoQuote](https://github.com/douban/charts/tree/master/charts/dify)

View File

@@ -27,6 +27,9 @@ FILES_URL=http://localhost:5001
# Example: INTERNAL_FILES_URL=http://api:5001
INTERNAL_FILES_URL=http://127.0.0.1:5001
# TRIGGER URL
TRIGGER_URL=http://localhost:5001
# The time in seconds after the signature is rejected
FILES_ACCESS_TIMEOUT=300
@@ -69,12 +72,15 @@ REDIS_CLUSTERS_PASSWORD=
# celery configuration
CELERY_BROKER_URL=redis://:difyai123456@localhost:${REDIS_PORT}/1
CELERY_BACKEND=redis
# PostgreSQL database configuration
# Database configuration
DB_TYPE=postgresql
DB_USERNAME=postgres
DB_PASSWORD=difyai123456
DB_HOST=localhost
DB_PORT=5432
DB_DATABASE=dify
SQLALCHEMY_POOL_PRE_PING=true
SQLALCHEMY_POOL_TIMEOUT=30
@@ -156,9 +162,11 @@ SUPABASE_URL=your-server-url
# CORS configuration
WEB_API_CORS_ALLOW_ORIGINS=http://localhost:3000,*
CONSOLE_CORS_ALLOW_ORIGINS=http://localhost:3000,*
# When the frontend and backend run on different subdomains, set COOKIE_DOMAIN to the site's top-level domain (e.g., `example.com`). Leading dots are optional.
COOKIE_DOMAIN=
# Vector database configuration
# Supported values are `weaviate`, `qdrant`, `milvus`, `myscale`, `relyt`, `pgvector`, `pgvecto-rs`, `chroma`, `opensearch`, `oracle`, `tencent`, `elasticsearch`, `elasticsearch-ja`, `analyticdb`, `couchbase`, `vikingdb`, `oceanbase`, `opengauss`, `tablestore`,`vastbase`,`tidb`,`tidb_on_qdrant`,`baidu`,`lindorm`,`huawei_cloud`,`upstash`, `matrixone`.
# Supported values are `weaviate`, `oceanbase`, `qdrant`, `milvus`, `myscale`, `relyt`, `pgvector`, `pgvecto-rs`, `chroma`, `opensearch`, `oracle`, `tencent`, `elasticsearch`, `elasticsearch-ja`, `analyticdb`, `couchbase`, `vikingdb`, `opengauss`, `tablestore`,`vastbase`,`tidb`,`tidb_on_qdrant`,`baidu`,`lindorm`,`huawei_cloud`,`upstash`, `matrixone`.
VECTOR_STORE=weaviate
# Prefix used to create collection name in vector database
VECTOR_INDEX_NAME_PREFIX=Vector_index
@@ -168,6 +176,18 @@ WEAVIATE_ENDPOINT=http://localhost:8080
WEAVIATE_API_KEY=WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
WEAVIATE_GRPC_ENABLED=false
WEAVIATE_BATCH_SIZE=100
WEAVIATE_TOKENIZATION=word
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
OCEANBASE_VECTOR_PORT=2881
OCEANBASE_VECTOR_USER=root@test
OCEANBASE_VECTOR_PASSWORD=difyai123456
OCEANBASE_VECTOR_DATABASE=test
OCEANBASE_MEMORY_LIMIT=6G
OCEANBASE_ENABLE_HYBRID_SEARCH=false
OCEANBASE_FULLTEXT_PARSER=ik
SEEKDB_MEMORY_LIMIT=2G
# Qdrant configuration, use `http://localhost:6333` for local mode or `https://your-qdrant-cluster-url.qdrant.io` for remote mode
QDRANT_URL=http://localhost:6333
@@ -334,14 +354,14 @@ LINDORM_PASSWORD=admin
LINDORM_USING_UGC=True
LINDORM_QUERY_TIMEOUT=1
# OceanBase Vector configuration
OCEANBASE_VECTOR_HOST=127.0.0.1
OCEANBASE_VECTOR_PORT=2881
OCEANBASE_VECTOR_USER=root@test
OCEANBASE_VECTOR_PASSWORD=difyai123456
OCEANBASE_VECTOR_DATABASE=test
OCEANBASE_MEMORY_LIMIT=6G
OCEANBASE_ENABLE_HYBRID_SEARCH=false
# AlibabaCloud MySQL Vector configuration
ALIBABACLOUD_MYSQL_HOST=127.0.0.1
ALIBABACLOUD_MYSQL_PORT=3306
ALIBABACLOUD_MYSQL_USER=root
ALIBABACLOUD_MYSQL_PASSWORD=root
ALIBABACLOUD_MYSQL_DATABASE=dify
ALIBABACLOUD_MYSQL_MAX_CONNECTION=5
ALIBABACLOUD_MYSQL_HNSW_M=6
# openGauss configuration
OPENGAUSS_HOST=127.0.0.1
@@ -359,6 +379,12 @@ UPLOAD_IMAGE_FILE_SIZE_LIMIT=10
UPLOAD_VIDEO_FILE_SIZE_LIMIT=100
UPLOAD_AUDIO_FILE_SIZE_LIMIT=50
# Comma-separated list of file extensions blocked from upload for security reasons.
# Extensions should be lowercase without dots (e.g., exe,bat,sh,dll).
# Empty by default to allow all file types.
# Recommended: exe,bat,cmd,com,scr,vbs,ps1,msi,dll
UPLOAD_FILE_EXTENSION_BLACKLIST=
# Model configuration
MULTIMODAL_SEND_FORMAT=base64
PROMPT_GENERATION_MAX_TOKENS=512
@@ -425,10 +451,13 @@ CODE_EXECUTION_SSL_VERIFY=True
CODE_EXECUTION_POOL_MAX_CONNECTIONS=100
CODE_EXECUTION_POOL_MAX_KEEPALIVE_CONNECTIONS=20
CODE_EXECUTION_POOL_KEEPALIVE_EXPIRY=5.0
CODE_EXECUTION_CONNECT_TIMEOUT=10
CODE_EXECUTION_READ_TIMEOUT=60
CODE_EXECUTION_WRITE_TIMEOUT=10
CODE_MAX_NUMBER=9223372036854775807
CODE_MIN_NUMBER=-9223372036854775808
CODE_MAX_STRING_LENGTH=80000
TEMPLATE_TRANSFORM_MAX_LENGTH=80000
CODE_MAX_STRING_LENGTH=400000
TEMPLATE_TRANSFORM_MAX_LENGTH=400000
CODE_MAX_STRING_ARRAY_LENGTH=30
CODE_MAX_OBJECT_ARRAY_LENGTH=30
CODE_MAX_NUMBER_ARRAY_LENGTH=1000
@@ -445,6 +474,9 @@ HTTP_REQUEST_NODE_MAX_BINARY_SIZE=10485760
HTTP_REQUEST_NODE_MAX_TEXT_SIZE=1048576
HTTP_REQUEST_NODE_SSL_VERIFY=True
# Webhook request configuration
WEBHOOK_REQUEST_BODY_MAX_SIZE=10485760
# Respect X-* headers to redirect clients
RESPECT_XFORWARD_HEADERS_ENABLED=false
@@ -500,7 +532,7 @@ API_WORKFLOW_NODE_EXECUTION_REPOSITORY=repositories.sqlalchemy_api_workflow_node
API_WORKFLOW_RUN_REPOSITORY=repositories.sqlalchemy_api_workflow_run_repository.DifyAPISQLAlchemyWorkflowRunRepository
# Workflow log cleanup configuration
# Enable automatic cleanup of workflow run logs to manage database size
WORKFLOW_LOG_CLEANUP_ENABLED=true
WORKFLOW_LOG_CLEANUP_ENABLED=false
# Number of days to retain workflow run logs (default: 30 days)
WORKFLOW_LOG_RETENTION_DAYS=30
# Batch size for workflow log cleanup operations (default: 100)
@@ -508,6 +540,7 @@ WORKFLOW_LOG_CLEANUP_BATCH_SIZE=100
# App configuration
APP_MAX_EXECUTION_TIME=1200
APP_DEFAULT_ACTIVE_REQUESTS=0
APP_MAX_ACTIVE_REQUESTS=0
# Celery beat configuration
@@ -522,6 +555,12 @@ ENABLE_CLEAN_MESSAGES=false
ENABLE_MAIL_CLEAN_DOCUMENT_NOTIFY_TASK=false
ENABLE_DATASETS_QUEUE_MONITOR=false
ENABLE_CHECK_UPGRADABLE_PLUGIN_TASK=true
ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK=true
# Interval time in minutes for polling scheduled workflows(default: 1 min)
WORKFLOW_SCHEDULE_POLLER_INTERVAL=1
WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE=100
# Maximum number of scheduled workflows to dispatch per tick (0 for unlimited)
WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK=0
# Position configuration
POSITION_TOOL_PINS=
@@ -593,3 +632,9 @@ SWAGGER_UI_PATH=/swagger-ui.html
# Whether to encrypt dataset IDs when exporting DSL files (default: true)
# Set to false to export dataset IDs as plain text for easier cross-environment import
DSL_EXPORT_ENCRYPT_DATASET_ID=true
# Tenant isolated task queue configuration
TENANT_ISOLATED_TASK_CONCURRENCY=1
# Maximum number of segments for dataset segments API (0 for unlimited)
DATASET_MAX_SEGMENTS_PER_REQUEST=0

View File

@@ -16,6 +16,7 @@ layers =
graph
nodes
node_events
runtime
entities
containers =
core.workflow

View File

@@ -81,7 +81,6 @@ ignore = [
"SIM113", # enumerate-for-loop
"SIM117", # multiple-with-statements
"SIM210", # if-expr-with-true-false
"UP038", # deprecated and not recommended by Ruff, https://docs.astral.sh/ruff/rules/non-pep604-isinstance/
]
[lint.per-file-ignores]

View File

@@ -54,7 +54,7 @@
"--loglevel",
"DEBUG",
"-Q",
"dataset,generation,mail,ops_trace,app_deletion"
"dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor"
]
}
]

62
api/AGENTS.md Normal file
View File

@@ -0,0 +1,62 @@
# Agent Skill Index
Start with the section that best matches your need. Each entry lists the problems it solves plus key files/concepts so you know what to expect before opening it.
______________________________________________________________________
## Platform Foundations
- **[Infrastructure Overview](agent_skills/infra.md)**\
When to read this:
- You need to understand where a feature belongs in the architecture.
- You're wiring storage, Redis, vector stores, or OTEL.
- You're about to add CLI commands or async jobs.\
What it covers: configuration stack (`configs/app_config.py`, remote settings), storage entry points (`extensions/ext_storage.py`, `core/file/file_manager.py`), Redis conventions (`extensions/ext_redis.py`), plugin runtime topology, vector-store factory (`core/rag/datasource/vdb/*`), observability hooks, SSRF proxy usage, and core CLI commands.
- **[Coding Style](agent_skills/coding_style.md)**\
When to read this:
- You're writing or reviewing backend code and need the authoritative checklist.
- You're unsure about Pydantic validators, SQLAlchemy session usage, or logging patterns.
- You want the exact lint/type/test commands used in PRs.\
Includes: Ruff & BasedPyright commands, no-annotation policy, session examples (`with Session(db.engine, ...)`), `@field_validator` usage, logging expectations, and the rule set for file size, helpers, and package management.
______________________________________________________________________
## Plugin & Extension Development
- **[Plugin Systems](agent_skills/plugin.md)**\
When to read this:
- You're building or debugging a marketplace plugin.
- You need to know how manifests, providers, daemons, and migrations fit together.\
What it covers: plugin manifests (`core/plugin/entities/plugin.py`), installation/upgrade flows (`services/plugin/plugin_service.py`, CLI commands), runtime adapters (`core/plugin/impl/*` for tool/model/datasource/trigger/endpoint/agent), daemon coordination (`core/plugin/entities/plugin_daemon.py`), and how provider registries surface capabilities to the rest of the platform.
- **[Plugin OAuth](agent_skills/plugin_oauth.md)**\
When to read this:
- You must integrate OAuth for a plugin or datasource.
- You're handling credential encryption or refresh flows.\
Topics: credential storage, encryption helpers (`core/helper/provider_encryption.py`), OAuth client bootstrap (`services/plugin/oauth_service.py`, `services/plugin/plugin_parameter_service.py`), and how console/API layers expose the flows.
______________________________________________________________________
## Workflow Entry & Execution
- **[Trigger Concepts](agent_skills/trigger.md)**\
When to read this:
- You're debugging why a workflow didn't start.
- You're adding a new trigger type or hook.
- You need to trace async execution, draft debugging, or webhook/schedule pipelines.\
Details: Start-node taxonomy, webhook & schedule internals (`core/workflow/nodes/trigger_*`, `services/trigger/*`), async orchestration (`services/async_workflow_service.py`, Celery queues), debug event bus, and storage/logging interactions.
______________________________________________________________________
## Additional Notes for Agents
- All skill docs assume you follow the coding style guide—run Ruff/BasedPyright/tests listed there before submitting changes.
- When you cannot find an answer in these briefs, search the codebase using the paths referenced (e.g., `core/plugin/impl/tool.py`, `services/dataset_service.py`).
- If you run into cross-cutting concerns (tenancy, configuration, storage), check the infrastructure guide first; it links to most supporting modules.
- Keep multi-tenancy and configuration central: everything flows through `configs.dify_config` and `tenant_id`.
- When touching plugins or triggers, consult both the system overview and the specialised doc to ensure you adjust lifecycle, storage, and observability consistently.

View File

@@ -15,7 +15,11 @@ FROM base AS packages
# RUN sed -i 's@deb.debian.org@mirrors.aliyun.com@g' /etc/apt/sources.list.d/debian.sources
RUN apt-get update \
&& apt-get install -y --no-install-recommends gcc g++ libc-dev libffi-dev libgmp-dev libmpfr-dev libmpc-dev
&& apt-get install -y --no-install-recommends \
# basic environment
g++ \
# for building gmpy2
libmpfr-dev libmpc-dev
# Install Python dependencies
COPY pyproject.toml uv.lock ./
@@ -44,14 +48,22 @@ ENV PYTHONIOENCODING=utf-8
WORKDIR /app/api
# Create non-root user
ARG dify_uid=1001
RUN groupadd -r -g ${dify_uid} dify && \
useradd -r -u ${dify_uid} -g ${dify_uid} -s /bin/bash dify && \
chown -R dify:dify /app
RUN \
apt-get update \
# Install dependencies
&& apt-get install -y --no-install-recommends \
# basic environment
curl nodejs libgmp-dev libmpfr-dev libmpc-dev \
curl nodejs \
# for gmpy2 \
libgmp-dev libmpfr-dev libmpc-dev \
# For Security
expat libldap-2.5-0 perl libsqlite3-0 zlib1g \
expat libldap-2.5-0=2.5.13+dfsg-5 perl libsqlite3-0=3.40.1-2+deb12u2 zlib1g=1:1.2.13.dfsg-1 \
# install fonts to support the use of tools like pypdfium2
fonts-noto-cjk \
# install a package to improve the accuracy of guessing mime type and file extension
@@ -63,24 +75,29 @@ RUN \
# Copy Python environment and packages
ENV VIRTUAL_ENV=/app/api/.venv
COPY --from=packages ${VIRTUAL_ENV} ${VIRTUAL_ENV}
COPY --from=packages --chown=dify:dify ${VIRTUAL_ENV} ${VIRTUAL_ENV}
ENV PATH="${VIRTUAL_ENV}/bin:${PATH}"
# Download nltk data
RUN python -c "import nltk; nltk.download('punkt'); nltk.download('averaged_perceptron_tagger')"
RUN mkdir -p /usr/local/share/nltk_data && NLTK_DATA=/usr/local/share/nltk_data python -c "import nltk; nltk.download('punkt'); nltk.download('averaged_perceptron_tagger'); nltk.download('stopwords')" \
&& chmod -R 755 /usr/local/share/nltk_data
ENV TIKTOKEN_CACHE_DIR=/app/api/.tiktoken_cache
RUN python -c "import tiktoken; tiktoken.encoding_for_model('gpt2')"
RUN python -c "import tiktoken; tiktoken.encoding_for_model('gpt2')" \
&& chown -R dify:dify ${TIKTOKEN_CACHE_DIR}
# Copy source code
COPY . /app/api/
COPY --chown=dify:dify . /app/api/
# Prepare entrypoint script
COPY --chown=dify:dify --chmod=755 docker/entrypoint.sh /entrypoint.sh
# Copy entrypoint
COPY docker/entrypoint.sh /entrypoint.sh
RUN chmod +x /entrypoint.sh
ARG COMMIT_SHA
ENV COMMIT_SHA=${COMMIT_SHA}
ENV NLTK_DATA=/usr/local/share/nltk_data
USER dify
ENTRYPOINT ["/bin/bash", "/entrypoint.sh"]

View File

@@ -15,8 +15,8 @@
```bash
cd ../docker
cp middleware.env.example middleware.env
# change the profile to other vector database if you are not using weaviate
docker compose -f docker-compose.middleware.yaml --profile weaviate -p dify up -d
# change the profile to mysql if you are not using postgres; change the profile to another vector database if you are not using weaviate
docker compose -f docker-compose.middleware.yaml --profile postgresql --profile weaviate -p dify up -d
cd ../api
```
@@ -26,6 +26,10 @@
cp .env.example .env
```
> [!IMPORTANT]
>
> When the frontend and backend run on different subdomains, set COOKIE_DOMAIN to the site's top-level domain (e.g., `example.com`). The frontend and backend must be under the same top-level domain in order to share authentication cookies.
1. Generate a `SECRET_KEY` in the `.env` file.
bash for Linux
@@ -80,7 +84,7 @@
1. If you need to handle and debug the async tasks (e.g. dataset importing and document indexing), please start the worker service.
```bash
uv run celery -A app.celery worker -P gevent -c 2 --loglevel INFO -Q dataset,generation,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation
uv run celery -A app.celery worker -P threads -c 2 --loglevel INFO -Q dataset,priority_dataset,priority_pipeline,pipeline,mail,ops_trace,app_deletion,plugin,workflow_storage,conversation,workflow,schedule_poller,schedule_executor,triggered_workflow_dispatcher,trigger_refresh_executor
```
Additionally, if you want to debug the celery scheduled tasks, you can run the following command in another terminal to start the beat service:

View File

@@ -0,0 +1,115 @@
## Linter
- Always follow `.ruff.toml`.
- Run `uv run ruff check --fix --unsafe-fixes`.
- Keep each line under 100 characters (including spaces).
## Code Style
- `snake_case` for variables and functions.
- `PascalCase` for classes.
- `UPPER_CASE` for constants.
## Rules
- Use Pydantic v2 standard.
- Use `uv` for package management.
- Do not override dunder methods like `__init__`, `__iadd__`, etc.
- Never launch services (`uv run app.py`, `flask run`, etc.); running tests under `tests/` is allowed.
- Prefer simple functions over classes for lightweight helpers.
- Keep files below 800 lines; split when necessary.
- Keep code readable—no clever hacks.
- Never use `print`; log with `logger = logging.getLogger(__name__)`.
## Guiding Principles
- Mirror the project's layered architecture: controller → service → core/domain.
- Reuse existing helpers in `core/`, `services/`, and `libs/` before creating new abstractions.
- Optimise for observability: deterministic control flow, clear logging, actionable errors.
## SQLAlchemy Patterns
- Models inherit from `models.base.Base`; never create ad-hoc metadata or engines.
- Open sessions with context managers:
```python
from sqlalchemy.orm import Session
with Session(db.engine, expire_on_commit=False) as session:
stmt = select(Workflow).where(
Workflow.id == workflow_id,
Workflow.tenant_id == tenant_id,
)
workflow = session.execute(stmt).scalar_one_or_none()
```
- Use SQLAlchemy expressions; avoid raw SQL unless necessary.
- Introduce repository abstractions only for very large tables (e.g., workflow executions) to support alternative storage strategies.
- Always scope queries by `tenant_id` and protect write paths with safeguards (`FOR UPDATE`, row counts, etc.).
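- Example of a guarded write (a minimal sketch; the `Workflow` import path and the `status` column are assumptions for illustration):
```python
from sqlalchemy import select
from sqlalchemy.orm import Session

from extensions.ext_database import db
from models.workflow import Workflow  # assumed import path

def pause_workflow(workflow_id: str, tenant_id: str) -> None:
    with Session(db.engine, expire_on_commit=False) as session:
        # Scope by tenant_id and lock the row (FOR UPDATE) before writing.
        stmt = (
            select(Workflow)
            .where(Workflow.id == workflow_id, Workflow.tenant_id == tenant_id)
            .with_for_update()
        )
        workflow = session.execute(stmt).scalar_one_or_none()
        if workflow is None:
            return
        workflow.status = "paused"  # hypothetical column, illustration only
        session.commit()
```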
## Storage & External IO
- Access storage via `extensions.ext_storage.storage`.
- Use `core.helper.ssrf_proxy` for outbound HTTP fetches.
- Background tasks that touch storage must be idempotent and log the relevant object identifiers.
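- Sketch of the pattern (assumes `ssrf_proxy` mirrors httpx's `get` helper and that `storage.save` takes a key plus bytes):
```python
from core.helper import ssrf_proxy
from extensions.ext_storage import storage

def fetch_and_store(url: str, storage_key: str) -> None:
    # Outbound fetch goes through the SSRF-safe client, never plain httpx.
    response = ssrf_proxy.get(url, timeout=10)
    response.raise_for_status()
    # Idempotent: re-saving the same key with the same bytes is harmless,
    # and the key identifies the object in logs.
    storage.save(storage_key, response.content)
```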
## Pydantic Usage
- Define DTOs with Pydantic v2 models and forbid extras by default.
- Use `@field_validator` / `@model_validator` for domain rules.
- Example:
```python
from pydantic import BaseModel, ConfigDict, HttpUrl, field_validator
class TriggerConfig(BaseModel):
endpoint: HttpUrl
secret: str
model_config = ConfigDict(extra="forbid")
@field_validator("secret")
def ensure_secret_prefix(cls, value: str) -> str:
if not value.startswith("dify_"):
raise ValueError("secret must start with dify_")
return value
```
## Generics & Protocols
- Use `typing.Protocol` to define behavioural contracts (e.g., cache interfaces).
- Apply generics (`TypeVar`, `Generic`) for reusable utilities like caches or providers.
- Validate dynamic inputs at runtime when generics cannot enforce safety alone.
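- Example (an illustrative cache contract, not an actual Dify module):
```python
from typing import Generic, Protocol, TypeVar

T = TypeVar("T")

class Cache(Protocol[T]):
    """Behavioural contract any cache implementation must satisfy."""

    def get(self, key: str) -> T | None: ...

    def put(self, key: str, value: T) -> None: ...

class InMemoryCache(Generic[T]):
    """Dict-backed implementation that structurally satisfies Cache[T]."""

    def __init__(self) -> None:
        self._data: dict[str, T] = {}

    def get(self, key: str) -> T | None:
        return self._data.get(key)

    def put(self, key: str, value: T) -> None:
        self._data[key] = value
```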
## Error Handling & Logging
- Raise domain-specific exceptions (`services/errors`, `core/errors`) and translate to HTTP responses in controllers.
- Declare `logger = logging.getLogger(__name__)` at module top.
- Include tenant/app/workflow identifiers in log context.
- Log retryable events at `warning`, terminal failures at `error`.
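- Example (a sketch; the exception and identifiers are illustrative, real domain errors live under `services/errors`):
```python
import logging

logger = logging.getLogger(__name__)

class WorkflowPublishError(Exception):
    """Illustrative domain error; controllers translate it to an HTTP response."""

def publish_workflow(tenant_id: str, app_id: str, workflow_id: str) -> None:
    try:
        ...  # orchestrate the publish
    except TimeoutError:
        # Retryable event: warning, with identifying context.
        logger.warning(
            "publish timed out, retrying (tenant=%s app=%s workflow=%s)",
            tenant_id, app_id, workflow_id,
        )
        raise
    except WorkflowPublishError:
        # Terminal failure: error, same context.
        logger.error(
            "publish failed (tenant=%s app=%s workflow=%s)",
            tenant_id, app_id, workflow_id,
        )
        raise
```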
## Tooling & Checks
- Format/lint: `uv run --project api --dev ruff format ./api` and `uv run --project api --dev ruff check --fix --unsafe-fixes ./api`.
- Type checks: `uv run --directory api --dev basedpyright`.
- Tests: `uv run --project api --dev dev/pytest/pytest_unit_tests.sh`.
- Run all of the above before submitting your work.
## Controllers & Services
- Controllers: parse input via Pydantic, invoke services, return serialised responses; no business logic.
- Services: coordinate repositories, providers, background tasks; keep side effects explicit.
- Avoid repositories unless necessary; direct SQLAlchemy usage is preferred for typical tables.
- Document non-obvious behaviour with concise comments.
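- Example of the split (all names here are hypothetical):
```python
from pydantic import BaseModel, ConfigDict

class RenameAppPayload(BaseModel):
    name: str
    model_config = ConfigDict(extra="forbid")

class AppService:
    """Hypothetical service: owns orchestration and explicit side effects."""

    def rename(self, tenant_id: str, app_id: str, name: str) -> None:
        ...  # update the row via SQLAlchemy, scoped by tenant_id

def rename_app_controller(tenant_id: str, app_id: str, raw_body: dict) -> dict:
    # Controller: parse input via Pydantic, invoke the service, serialise.
    payload = RenameAppPayload.model_validate(raw_body)
    AppService().rename(tenant_id=tenant_id, app_id=app_id, name=payload.name)
    return {"id": app_id, "name": payload.name}
```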
## Miscellaneous
- Use `configs.dify_config` for configuration—never read environment variables directly.
- Maintain tenant awareness end-to-end; `tenant_id` must flow through every layer touching shared resources.
- Queue async work through `services/async_workflow_service`; implement tasks under `tasks/` with explicit queue selection.
- Keep experimental scripts under `dev/`; do not ship them in production builds.

96
api/agent_skills/infra.md Normal file
View File

@@ -0,0 +1,96 @@
## Configuration
- Import `configs.dify_config` for every runtime toggle. Do not read environment variables directly.
- Add new settings to the proper mixin inside `configs/` (deployment, feature, middleware, etc.) so they load through `DifyConfig`.
- Remote overrides come from the optional providers in `configs/remote_settings_sources`; keep defaults in code safe when the value is missing.
- Example: logging pulls targets from `extensions/ext_logging.py`, and model provider URLs are assembled in `services/entities/model_provider_entities.py`.
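- Minimal usage sketch (the flag read here appears in `.env.example`; any other name would be an assumption):
```python
from configs import dify_config

def cleanup_enabled() -> bool:
    # Runtime toggles come from dify_config, never os.environ.
    return bool(dify_config.WORKFLOW_LOG_CLEANUP_ENABLED)
```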
## Dependencies
- Runtime dependencies live in `[project].dependencies` inside `pyproject.toml`. Optional clients go into the `storage`, `tools`, or `vdb` groups under `[dependency-groups]`.
- Always pin versions and keep the list alphabetised. Shared tooling (lint, typing, pytest) belongs in the `dev` group.
- When code needs a new package, explain why in the PR and run `uv lock` so the lockfile stays current.
## Storage & Files
- Use `extensions.ext_storage.storage` for all blob IO; it already respects the configured backend.
- Convert files for workflows with helpers in `core/file/file_manager.py`; they handle signed URLs and multimodal payloads.
- When writing controller logic, delegate upload quotas and metadata to `services/file_service.py` instead of touching storage directly.
- All outbound HTTP fetches (webhooks, remote files) must go through the SSRF-safe client in `core/helper/ssrf_proxy.py`; it wraps `httpx` with the allow/deny rules configured for the platform.
## Redis & Shared State
- Access Redis through `extensions.ext_redis.redis_client`. For locking, reuse `redis_client.lock`.
- Prefer higher-level helpers when available: rate limits use `libs.helper.RateLimiter`, provider metadata uses caches in `core/helper/provider_cache.py`.
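- Locking sketch (key name and timeout are illustrative):
```python
from extensions.ext_redis import redis_client

def refresh_provider_cache(tenant_id: str) -> None:
    # redis-py lock with an expiry so a crashed worker cannot wedge it.
    lock = redis_client.lock(f"provider_cache_refresh:{tenant_id}", timeout=60)
    if not lock.acquire(blocking=False):
        return  # another worker is already refreshing
    try:
        ...  # rebuild the cached value
    finally:
        lock.release()
```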
## Models
- SQLAlchemy models sit in `models/` and inherit from the shared declarative `Base` defined in `models/base.py` (metadata configured via `models/engine.py`).
- `models/__init__.py` exposes grouped aggregates: account/tenant models, app and conversation tables, datasets, providers, workflow runs, triggers, etc. Import from there to avoid deep path churn.
- Follow the DDD boundary: persistence objects live in `models/`, repositories under `repositories/` translate them into domain entities, and services consume those repositories.
- When adding a table, create the model class, register it in `models/__init__.py`, wire a repository if needed, and generate an Alembic migration as described below.
## Vector Stores
- Vector client implementations live in `core/rag/datasource/vdb/<provider>`, with a common factory in `core/rag/datasource/vdb/vector_factory.py` and enums in `core/rag/datasource/vdb/vector_type.py`.
- Retrieval pipelines call these providers through `core/rag/datasource/retrieval_service.py` and dataset ingestion flows in `services/dataset_service.py`.
- The CLI helper `flask vdb-migrate` orchestrates bulk migrations using routines in `commands.py`; reuse that pattern when adding new backend transitions.
- To add another store, mirror the provider layout, register it with the factory, and include any schema changes in Alembic migrations.
## Observability & OTEL
- OpenTelemetry settings live under the observability mixin in `configs/observability`. Toggle exporters and sampling via `dify_config`, not ad-hoc env reads.
- HTTP, Celery, Redis, SQLAlchemy, and httpx instrumentation is initialised in `extensions/ext_app_metrics.py` and `extensions/ext_request_logging.py`; reuse these hooks when adding new workers or entrypoints.
- When creating background tasks or external calls, propagate tracing context with helpers in the existing instrumented clients (e.g. use the shared `httpx` session from `core/helper/http_client_pooling.py`).
- If you add a new external integration, ensure spans and metrics are emitted by wiring the appropriate OTEL instrumentation package in `pyproject.toml` and configuring it in `extensions/`.
## Ops Integrations
- Langfuse support and other tracing bridges live under `core/ops/opik_trace`. Config toggles sit in `configs/observability`, while exporters are initialised in the OTEL extensions mentioned above.
- External monitoring services should follow this pattern: keep client code in `core/ops`, expose switches via `dify_config`, and hook initialisation in `extensions/ext_app_metrics.py` or sibling modules.
- Before instrumenting new code paths, check whether existing context helpers (e.g. `extensions/ext_request_logging.py`) already capture the necessary metadata.
## Controllers, Services, Core
- Controllers only parse HTTP input and call a service method. Keep business rules in `services/`.
- Services enforce tenant rules, quotas, and orchestration, then call into `core/` engines (workflow execution, tools, LLMs).
- When adding a new endpoint, search for an existing service to extend before introducing a new layer. Example: workflow APIs pipe through `services/workflow_service.py` into `core/workflow`.
## Plugins, Tools, Providers
- In Dify a plugin is a tenant-installable bundle that declares one or more providers (tool, model, datasource, trigger, endpoint, agent strategy) plus its resource needs and version metadata. The manifest (`core/plugin/entities/plugin.py`) mirrors what you see in the marketplace documentation.
- Installation, upgrades, and migrations are orchestrated by `services/plugin/plugin_service.py` together with helpers such as `services/plugin/plugin_migration.py`.
- Runtime loading happens through the implementations under `core/plugin/impl/*` (tool/model/datasource/trigger/endpoint/agent). These modules normalise plugin providers so that downstream systems (`core/tools/tool_manager.py`, `services/model_provider_service.py`, `services/trigger/*`) can treat builtin and plugin capabilities the same way.
- For remote execution, plugin daemons (`core/plugin/entities/plugin_daemon.py`, `core/plugin/impl/plugin.py`) manage lifecycle hooks, credential forwarding, and background workers that keep plugin processes in sync with the main application.
- Acquire tool implementations through `core/tools/tool_manager.py`; it resolves builtin, plugin, and workflow-as-tool providers uniformly, injecting the right context (tenant, credentials, runtime config).
- To add a new plugin capability, extend the relevant `core/plugin/entities` schema and register the implementation in the matching `core/plugin/impl` module rather than importing the provider directly.
## Async Workloads
See `agent_skills/trigger.md` for more detailed documentation.
- Enqueue background work through `services/async_workflow_service.py`. It routes jobs to the tiered Celery queues defined in `tasks/`.
- Workers boot from `celery_entrypoint.py` and execute functions in `tasks/workflow_execution_tasks.py`, `tasks/trigger_processing_tasks.py`, etc.
- Scheduled workflows poll from `schedule/workflow_schedule_tasks.py`. Follow the same pattern if you need new periodic jobs.
## Database & Migrations
- SQLAlchemy models live under `models/` and map directly to migration files in `migrations/versions`.
- Generate migrations with `uv run --project api flask db revision --autogenerate -m "<summary>"`, then review the diff; never hand-edit the database outside Alembic.
- Apply migrations locally using `uv run --project api flask db upgrade`; production deploys expect the same history.
- If you add tenant-scoped data, confirm the upgrade includes tenant filters or defaults consistent with the service logic touching those tables.
## CLI Commands
- Maintenance commands from `commands.py` are registered on the Flask CLI. Run them via `uv run --project api flask <command>`.
- Use the built-in `db` commands from Flask-Migrate for schema operations (`flask db upgrade`, `flask db stamp`, etc.). Only fall back to custom helpers if you need their extra behaviour.
- Custom entries such as `flask reset-password`, `flask reset-email`, and `flask vdb-migrate` handle self-hosted account recovery and vector database migrations.
- Before adding a new command, check whether an existing service can be reused and ensure the command guards edition-specific behaviour (many enforce `SELF_HOSTED`). Document any additions in the PR.
- Ruff helpers are run directly with `uv`: `uv run --project api --dev ruff format ./api` for formatting and `uv run --project api --dev ruff check ./api` (add `--fix` if you want automatic fixes).
## When You Add Features
- Check for an existing helper or service before writing a new util.
- Uphold tenancy: every service method should receive the tenant ID from controller wrappers such as `controllers/console/wraps.py`.
- Update or create tests alongside behaviour changes (`tests/unit_tests` for fast coverage, `tests/integration_tests` when touching orchestrations).
- Run `uv run --project api --dev ruff check ./api`, `uv run --directory api --dev basedpyright`, and `uv run --project api --dev dev/pytest/pytest_unit_tests.sh` before submitting changes.

View File

@@ -0,0 +1 @@
// TBD

View File

@@ -0,0 +1 @@
// TBD

View File

@@ -0,0 +1,53 @@
## Overview
Trigger is a collection of nodes that we call `Start` nodes. The concept of `Start` matches `RootNode` in the workflow engine (`core/workflow/graph_engine`): a `Start` node is the entry point of a workflow, and every workflow run begins from one.
## Trigger nodes
- `UserInput`
- `Trigger Webhook`
- `Trigger Schedule`
- `Trigger Plugin`
### UserInput
Before the `Trigger` concept was introduced, this was simply the `Start` node; it has since been renamed to the `UserInput` node to avoid confusion. It has a strong relation with `ServiceAPI` in `controllers/service_api/app`.
1. The `UserInput` node declares a list of arguments that must be provided by the user; these are ultimately converted into variables in the workflow variable pool.
1. `ServiceAPI` accepts those arguments and passes them through to the `UserInput` node.
1. For its detailed implementation, please refer to `core/workflow/nodes/start`
### Trigger Webhook
Inside the Webhook node, Dify provides a UI panel that lets users define an HTTP manifest (`core/workflow/nodes/trigger_webhook/entities.py`.`WebhookData`). Dify also generates a random webhook id for each `Trigger Webhook` node; the implementation lives in `core/trigger/utils/endpoint.py`. `webhook-debug` is the debug mode for webhooks, found in `controllers/trigger/webhook.py`.
Finally, requests to the `webhook` endpoint are converted into variables in the workflow variable pool during workflow execution.
### Trigger Schedule
The `Trigger Schedule` node lets users define a schedule that triggers the workflow; the detailed manifest is in `core/workflow/nodes/trigger_schedule/entities.py`. A poller and an executor handle millions of schedules; see `docker/entrypoint.sh` and `schedule/workflow_schedule_task.py` for details.
To achieve this, a `WorkflowSchedulePlan` model was introduced in `models/trigger.py`, and `events/event_handlers/sync_workflow_schedule_when_app_published.py` syncs workflow schedule plans when an app is published.
### Trigger Plugin
The `Trigger Plugin` node lets users define their own distributed trigger plugins: whenever a request is received, Dify forwards it to the plugin and waits for the parsed variables.
1. Requests are saved in storage by `services/trigger/trigger_request_service.py`, referenced by `services/trigger/trigger_service.py`.`TriggerService`.`process_endpoint`.
1. Plugins accept those requests and parse variables from them; see `core/plugin/impl/trigger.py` for details.
Dify also introduces a `subscription` concept: an endpoint address from Dify is bound to a third-party webhook service such as `GitHub`, `Slack`, `Linear`, `GoogleDrive`, or `Gmail`. Once a subscription is created, Dify continually receives requests from these platforms and handles them one by one.
## Worker Pool / Async Task
Every event that triggers a new workflow run is handled asynchronously; the unified entrypoint is `services/async_workflow_service.py`.`AsyncWorkflowService`.`trigger_workflow_async`.
The underlying infrastructure is `celery`, already configured in `docker/entrypoint.sh`; the consumers live in `tasks/async_workflow_tasks.py`. Three queues handle different user tiers: `PROFESSIONAL_QUEUE`, `TEAM_QUEUE`, and `SANDBOX_QUEUE`.
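The dispatch can be pictured roughly like this (a sketch only: the task and queue wiring are assumptions, not the real `AsyncWorkflowService` API):
```python
from celery import Celery

celery_app = Celery(broker="redis://localhost:6379/1")  # broker URL illustrative

@celery_app.task
def run_triggered_workflow(workflow_id: str, inputs: dict) -> None:
    ...  # execute the workflow run

def trigger_workflow_async(workflow_id: str, inputs: dict, tier: str) -> None:
    # Route to a tier-specific queue so heavy sandbox traffic cannot
    # starve professional-tier workloads.
    queue = {
        "professional": "PROFESSIONAL_QUEUE",
        "team": "TEAM_QUEUE",
    }.get(tier, "SANDBOX_QUEUE")
    run_triggered_workflow.apply_async(args=[workflow_id, inputs], queue=queue)
```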
## Debug Strategy
Dify divides users into two groups: builders and end users.
Builders are the users who create workflows, and for them debugging is a critical part of workflow development. As the start nodes of workflows, trigger nodes can `listen` for events from `WebhookDebug`, `Schedule`, and `Plugin`; the debugging process is implemented in `controllers/console/app/workflow.py`.`DraftWorkflowTriggerNodeApi`.
A polling process is a combination of single `poll` operations: each `poll` fetches events cached in `Redis` and returns `None` if no event is found. In more detail, `core/trigger/debug/event_bus.py` handles the polling process, and `core/trigger/debug/event_selectors.py` selects the event poller based on the trigger type.
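The polling semantics can be sketched as follows (not the actual `event_bus` API):
```python
import time
from typing import Callable, TypeVar

E = TypeVar("E")

def poll_until_event(
    poll_once: Callable[[], E | None],
    timeout: float = 60.0,
    interval: float = 1.0,
) -> E | None:
    """Combine single `poll` operations into one polling process.

    `poll_once` stands in for the Redis-backed fetch described above;
    it returns None when no cached event is available yet.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        event = poll_once()
        if event is not None:
            return event
        time.sleep(interval)
    return None
```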

View File

@@ -1,7 +1,7 @@
import sys
def is_db_command():
def is_db_command() -> bool:
if len(sys.argv) > 1 and sys.argv[0].endswith("flask") and sys.argv[1] == "db":
return True
return False
@@ -13,23 +13,12 @@ if is_db_command():
app = create_migrations_app()
else:
# It seems that JetBrains Python debugger does not work well with gevent,
# so we need to disable gevent in debug mode.
# If you are using debugpy and set GEVENT_SUPPORT=True, you can debug with gevent.
# if (flask_debug := os.environ.get("FLASK_DEBUG", "0")) and flask_debug.lower() in {"false", "0", "no"}:
# from gevent import monkey
# Gunicorn and Celery handle monkey patching automatically in production by
# specifying the `gevent` worker class. Manual monkey patching is not required here.
#
# # gevent
# monkey.patch_all()
# See `api/docker/entrypoint.sh` (lines 33 and 47) for details.
#
# from grpc.experimental import gevent as grpc_gevent # type: ignore
#
# # grpc gevent
# grpc_gevent.init_gevent()
# import psycogreen.gevent # type: ignore
#
# psycogreen.gevent.patch_psycopg()
# For third-party library patching, refer to `gunicorn.conf.py` and `celery_entrypoint.py`.
from app_factory import create_app

View File

@@ -1,6 +1,8 @@
import logging
import time
from opentelemetry.trace import get_current_span
from configs import dify_config
from contexts.wrapper import RecyclableContextVar
from dify_app import DifyApp
@@ -18,6 +20,7 @@ def create_flask_app_with_configs() -> DifyApp:
"""
dify_app = DifyApp(__name__)
dify_app.config.from_mapping(dify_config.model_dump())
dify_app.config["RESTX_INCLUDE_ALL_MODELS"] = True
# add before request hook
@dify_app.before_request
@@ -25,8 +28,25 @@ def create_flask_app_with_configs() -> DifyApp:
# add an unique identifier to each request
RecyclableContextVar.increment_thread_recycles()
# add after request hook for injecting X-Trace-Id header from OpenTelemetry span context
@dify_app.after_request
def add_trace_id_header(response):
try:
span = get_current_span()
ctx = span.get_span_context() if span else None
if ctx and ctx.is_valid:
trace_id_hex = format(ctx.trace_id, "032x")
# Avoid duplicates if some middleware added it
if "X-Trace-Id" not in response.headers:
response.headers["X-Trace-Id"] = trace_id_hex
except Exception:
# Never break the response due to tracing header injection
logger.warning("Failed to add trace ID to response header", exc_info=True)
return response
# Capture the decorator's return value to avoid pyright reportUnusedFunction
_ = before_request
_ = add_trace_id_header
return dify_app
@@ -50,6 +70,7 @@ def initialize_extensions(app: DifyApp):
ext_commands,
ext_compress,
ext_database,
ext_forward_refs,
ext_hosting_provider,
ext_import_modules,
ext_logging,
@@ -74,6 +95,7 @@ def initialize_extensions(app: DifyApp):
ext_warnings,
ext_import_modules,
ext_orjson,
ext_forward_refs,
ext_set_secretkey,
ext_compress,
ext_code_based_extension,

7
api/cnt_base.sh Executable file
View File

@@ -0,0 +1,7 @@
#!/bin/bash
set -euxo pipefail
for pattern in "Base" "TypeBase"; do
printf "%s " "$pattern"
grep "($pattern):" -r --include='*.py' --exclude-dir=".venv" --exclude-dir="tests" . | wc -l
done

View File

@@ -15,12 +15,12 @@ from sqlalchemy.orm import sessionmaker
from configs import dify_config
from constants.languages import languages
from core.helper import encrypter
from core.plugin.entities.plugin_daemon import CredentialType
from core.plugin.impl.plugin import PluginInstaller
from core.rag.datasource.vdb.vector_factory import Vector
from core.rag.datasource.vdb.vector_type import VectorType
from core.rag.index_processor.constant.built_in_field import BuiltInField
from core.rag.models.document import Document
from core.tools.entities.tool_entities import CredentialType
from core.tools.utils.system_oauth_encryption import encrypt_system_oauth_params
from events.app_event import app_was_created
from extensions.ext_database import db
@@ -321,6 +321,8 @@ def migrate_knowledge_vector_database():
)
datasets = db.paginate(select=stmt, page=page, per_page=50, max_per_page=50, error_out=False)
if not datasets.items:
break
except SQLAlchemyError:
raise
@@ -1227,6 +1229,55 @@ def setup_system_tool_oauth_client(provider, client_params):
click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))
@click.command("setup-system-trigger-oauth-client", help="Setup system trigger oauth client.")
@click.option("--provider", prompt=True, help="Provider name")
@click.option("--client-params", prompt=True, help="Client Params")
def setup_system_trigger_oauth_client(provider, client_params):
"""
Setup system trigger oauth client
"""
from models.provider_ids import TriggerProviderID
from models.trigger import TriggerOAuthSystemClient
provider_id = TriggerProviderID(provider)
provider_name = provider_id.provider_name
plugin_id = provider_id.plugin_id
try:
# json validate
click.echo(click.style(f"Validating client params: {client_params}", fg="yellow"))
client_params_dict = TypeAdapter(dict[str, Any]).validate_json(client_params)
click.echo(click.style("Client params validated successfully.", fg="green"))
click.echo(click.style(f"Encrypting client params: {client_params}", fg="yellow"))
click.echo(click.style(f"Using SECRET_KEY: `{dify_config.SECRET_KEY}`", fg="yellow"))
oauth_client_params = encrypt_system_oauth_params(client_params_dict)
click.echo(click.style("Client params encrypted successfully.", fg="green"))
except Exception as e:
click.echo(click.style(f"Error parsing client params: {str(e)}", fg="red"))
return
deleted_count = (
db.session.query(TriggerOAuthSystemClient)
.filter_by(
provider=provider_name,
plugin_id=plugin_id,
)
.delete()
)
if deleted_count > 0:
click.echo(click.style(f"Deleted {deleted_count} existing oauth client params.", fg="yellow"))
oauth_client = TriggerOAuthSystemClient(
provider=provider_name,
plugin_id=plugin_id,
encrypted_oauth_params=oauth_client_params,
)
db.session.add(oauth_client)
db.session.commit()
click.echo(click.style(f"OAuth client params setup successfully. id: {oauth_client.id}", fg="green"))
def _find_orphaned_draft_variables(batch_size: int = 1000) -> list[str]:
"""
Find draft variables that reference non-existent apps.
@@ -1420,7 +1471,10 @@ def setup_datasource_oauth_client(provider, client_params):
@click.command("transform-datasource-credentials", help="Transform datasource credentials.")
def transform_datasource_credentials():
@click.option(
"--environment", prompt=True, help="the environment to transform datasource credentials", default="online"
)
def transform_datasource_credentials(environment: str):
"""
Transform datasource credentials
"""
@@ -1431,9 +1485,14 @@ def transform_datasource_credentials():
notion_plugin_id = "langgenius/notion_datasource"
firecrawl_plugin_id = "langgenius/firecrawl_datasource"
jina_plugin_id = "langgenius/jina_datasource"
notion_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(notion_plugin_id) # pyright: ignore[reportPrivateUsage]
firecrawl_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(firecrawl_plugin_id) # pyright: ignore[reportPrivateUsage]
jina_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(jina_plugin_id) # pyright: ignore[reportPrivateUsage]
if environment == "online":
notion_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(notion_plugin_id) # pyright: ignore[reportPrivateUsage]
firecrawl_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(firecrawl_plugin_id) # pyright: ignore[reportPrivateUsage]
jina_plugin_unique_identifier = plugin_migration._fetch_plugin_unique_identifier(jina_plugin_id) # pyright: ignore[reportPrivateUsage]
else:
notion_plugin_unique_identifier = None
firecrawl_plugin_unique_identifier = None
jina_plugin_unique_identifier = None
oauth_credential_type = CredentialType.OAUTH2
api_key_credential_type = CredentialType.API_KEY
@@ -1521,6 +1580,14 @@ def transform_datasource_credentials():
auth_count = 0
for firecrawl_tenant_credential in firecrawl_tenant_credentials:
auth_count += 1
if not firecrawl_tenant_credential.credentials:
click.echo(
click.style(
f"Skipping firecrawl credential for tenant {tenant_id} due to missing credentials.",
fg="yellow",
)
)
continue
# get credential api key
credentials_json = json.loads(firecrawl_tenant_credential.credentials)
api_key = credentials_json.get("config", {}).get("api_key")
@@ -1576,6 +1643,14 @@ def transform_datasource_credentials():
auth_count = 0
for jina_tenant_credential in jina_tenant_credentials:
auth_count += 1
if not jina_tenant_credential.credentials:
click.echo(
click.style(
f"Skipping jina credential for tenant {tenant_id} due to missing credentials.",
fg="yellow",
)
)
continue
# get credential api key
credentials_json = json.loads(jina_tenant_credential.credentials)
api_key = credentials_json.get("config", {}).get("api_key")
@@ -1583,7 +1658,7 @@ def transform_datasource_credentials():
"integration_secret": api_key,
}
datasource_provider = DatasourceProvider(
provider="jina",
provider="jinareader",
tenant_id=tenant_id,
plugin_id=jina_plugin_id,
auth_type=api_key_credential_type.value,

View File

@@ -73,14 +73,14 @@ class AppExecutionConfig(BaseSettings):
description="Maximum allowed execution time for the application in seconds",
default=1200,
)
APP_DEFAULT_ACTIVE_REQUESTS: NonNegativeInt = Field(
description="Default number of concurrent active requests per app (0 for unlimited)",
default=0,
)
APP_MAX_ACTIVE_REQUESTS: NonNegativeInt = Field(
description="Maximum number of concurrent active requests per app (0 for unlimited)",
default=0,
)
APP_DAILY_RATE_LIMIT: NonNegativeInt = Field(
description="Maximum number of requests per app per day",
default=5000,
)
class CodeExecutionSandboxConfig(BaseSettings):
@@ -150,7 +150,7 @@ class CodeExecutionSandboxConfig(BaseSettings):
CODE_MAX_STRING_LENGTH: PositiveInt = Field(
description="Maximum allowed length for strings in code execution",
default=80000,
default=400_000,
)
CODE_MAX_STRING_ARRAY_LENGTH: PositiveInt = Field(
@@ -174,6 +174,33 @@ class CodeExecutionSandboxConfig(BaseSettings):
)
class TriggerConfig(BaseSettings):
"""
Configuration for trigger
"""
WEBHOOK_REQUEST_BODY_MAX_SIZE: PositiveInt = Field(
description="Maximum allowed size for webhook request bodies in bytes",
default=10485760,
)
class AsyncWorkflowConfig(BaseSettings):
"""
Configuration for async workflow
"""
ASYNC_WORKFLOW_SCHEDULER_GRANULARITY: int = Field(
description="Granularity for async workflow scheduler, "
"sometime, few users could block the queue due to some time-consuming tasks, "
"to avoid this, workflow can be suspended if needed, to achieve"
"this, a time-based checker is required, every granularity seconds, "
"the checker will check the workflow queue and suspend the workflow",
default=120,
ge=1,
)
class PluginConfig(BaseSettings):
"""
Plugin configs
@@ -189,6 +216,11 @@ class PluginConfig(BaseSettings):
default="plugin-api-key",
)
PLUGIN_DAEMON_TIMEOUT: PositiveFloat | None = Field(
description="Timeout in seconds for requests to the plugin daemon (set to None to disable)",
default=300.0,
)
INNER_API_KEY_FOR_PLUGIN: str = Field(description="Inner api key for plugin", default="inner-api-key")
PLUGIN_REMOTE_INSTALL_HOST: str = Field(
@@ -258,6 +290,8 @@ class EndpointConfig(BaseSettings):
description="Template url for endpoint plugin", default="http://localhost:5002/e/{hook_id}"
)
TRIGGER_URL: str = Field(description="Template url for triggers", default="http://localhost:5001")
class FileAccessConfig(BaseSettings):
"""
@@ -326,12 +360,42 @@ class FileUploadConfig(BaseSettings):
default=10,
)
inner_UPLOAD_FILE_EXTENSION_BLACKLIST: str = Field(
description=(
"Comma-separated list of file extensions that are blocked from upload. "
"Extensions should be lowercase without dots (e.g., 'exe,bat,sh,dll'). "
"Empty by default to allow all file types."
),
validation_alias=AliasChoices("UPLOAD_FILE_EXTENSION_BLACKLIST"),
default="",
)
@computed_field # type: ignore[misc]
@property
def UPLOAD_FILE_EXTENSION_BLACKLIST(self) -> set[str]:
"""
Parse and return the blacklist as a set of lowercase extensions.
Returns an empty set if no blacklist is configured.
"""
if not self.inner_UPLOAD_FILE_EXTENSION_BLACKLIST:
return set()
return {
ext.strip().lower().strip(".")
for ext in self.inner_UPLOAD_FILE_EXTENSION_BLACKLIST.split(",")
if ext.strip()
}
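
The upload service that consumes this computed blacklist is not part of this hunk; a minimal, purely illustrative sketch of how the parsed set might be used:

def is_extension_blocked(filename: str, blacklist: set[str]) -> bool:
    # Compare the lowercased final extension against the parsed blacklist set.
    ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return ext in blacklist

# e.g. UPLOAD_FILE_EXTENSION_BLACKLIST="exe, bat,sh" parses to {"exe", "bat", "sh"}
blacklist = {"exe", "bat", "sh"}
assert is_extension_blocked("payload.EXE", blacklist)
assert not is_extension_blocked("report.pdf", blacklist)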
class HttpConfig(BaseSettings):
"""
HTTP-related configurations for the application
"""
COOKIE_DOMAIN: str = Field(
description="Explicit cookie domain for console/service cookies when sharing across subdomains",
default="",
)
API_COMPRESSION_ENABLED: bool = Field(
description="Enable or disable gzip compression for HTTP responses",
default=False,
@@ -362,11 +426,11 @@ class HttpConfig(BaseSettings):
)
HTTP_REQUEST_MAX_READ_TIMEOUT: int = Field(
ge=1, description="Maximum read timeout in seconds for HTTP requests", default=60
ge=1, description="Maximum read timeout in seconds for HTTP requests", default=600
)
HTTP_REQUEST_MAX_WRITE_TIMEOUT: int = Field(
ge=1, description="Maximum write timeout in seconds for HTTP requests", default=20
ge=1, description="Maximum write timeout in seconds for HTTP requests", default=600
)
HTTP_REQUEST_NODE_MAX_BINARY_SIZE: PositiveInt = Field(
@@ -489,7 +553,10 @@ class LoggingConfig(BaseSettings):
LOG_FORMAT: str = Field(
description="Format string for log messages",
default="%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] [%(filename)s:%(lineno)d] - %(message)s",
default=(
"%(asctime)s.%(msecs)03d %(levelname)s [%(threadName)s] "
"[%(filename)s:%(lineno)d] %(trace_id)s - %(message)s"
),
)
LOG_DATEFORMAT: str | None = Field(
@@ -543,7 +610,7 @@ class UpdateConfig(BaseSettings):
class WorkflowVariableTruncationConfig(BaseSettings):
WORKFLOW_VARIABLE_TRUNCATION_MAX_SIZE: PositiveInt = Field(
# 100KB
# 1000 KiB
1024_000,
description="Maximum size for variable to trigger final truncation.",
)
@@ -582,6 +649,11 @@ class WorkflowConfig(BaseSettings):
default=200 * 1024,
)
TEMPLATE_TRANSFORM_MAX_LENGTH: PositiveInt = Field(
description="Maximum number of characters allowed in Template Transform node output",
default=400_000,
)
# GraphEngine Worker Pool Configuration
GRAPH_ENGINE_MIN_WORKERS: PositiveInt = Field(
description="Minimum number of workers per GraphEngine instance",
@@ -766,7 +838,7 @@ class MailConfig(BaseSettings):
MAIL_TEMPLATING_TIMEOUT: int = Field(
description="""
Timeout for email templating in seconds. Used to prevent infinite loops in malicious templates.
Only available in sandbox mode.""",
default=3,
)
@@ -905,6 +977,11 @@ class DataSetConfig(BaseSettings):
default=True,
)
DATASET_MAX_SEGMENTS_PER_REQUEST: NonNegativeInt = Field(
description="Maximum number of segments for dataset segments API (0 for unlimited)",
default=0,
)
class WorkspaceConfig(BaseSettings):
"""
@@ -980,6 +1057,44 @@ class CeleryScheduleTasksConfig(BaseSettings):
description="Enable check upgradable plugin task",
default=True,
)
ENABLE_WORKFLOW_SCHEDULE_POLLER_TASK: bool = Field(
description="Enable workflow schedule poller task",
default=True,
)
WORKFLOW_SCHEDULE_POLLER_INTERVAL: int = Field(
description="Workflow schedule poller interval in minutes",
default=1,
)
WORKFLOW_SCHEDULE_POLLER_BATCH_SIZE: int = Field(
description="Maximum number of schedules to process in each poll batch",
default=100,
)
WORKFLOW_SCHEDULE_MAX_DISPATCH_PER_TICK: int = Field(
description="Maximum schedules to dispatch per tick (0=unlimited, circuit breaker)",
default=0,
)
# Trigger provider refresh (simple version)
ENABLE_TRIGGER_PROVIDER_REFRESH_TASK: bool = Field(
description="Enable trigger provider refresh poller",
default=True,
)
TRIGGER_PROVIDER_REFRESH_INTERVAL: int = Field(
description="Trigger provider refresh poller interval in minutes",
default=1,
)
TRIGGER_PROVIDER_REFRESH_BATCH_SIZE: int = Field(
description="Max trigger subscriptions to process per tick",
default=200,
)
TRIGGER_PROVIDER_CREDENTIAL_THRESHOLD_SECONDS: int = Field(
description="Proactive credential refresh threshold in seconds",
default=60 * 60,
)
TRIGGER_PROVIDER_SUBSCRIPTION_THRESHOLD_SECONDS: int = Field(
description="Proactive subscription refresh threshold in seconds",
default=60 * 60,
)
class PositionConfig(BaseSettings):
@@ -1078,7 +1193,7 @@ class AccountConfig(BaseSettings):
class WorkflowLogConfig(BaseSettings):
WORKFLOW_LOG_CLEANUP_ENABLED: bool = Field(default=True, description="Enable workflow run log cleanup")
WORKFLOW_LOG_CLEANUP_ENABLED: bool = Field(default=False, description="Enable workflow run log cleanup")
WORKFLOW_LOG_RETENTION_DAYS: int = Field(default=30, description="Retention days for workflow run logs")
WORKFLOW_LOG_CLEANUP_BATCH_SIZE: int = Field(
default=100, description="Batch size for workflow run log cleanup operations"
@@ -1097,12 +1212,21 @@ class SwaggerUIConfig(BaseSettings):
)
class TenantIsolatedTaskQueueConfig(BaseSettings):
TENANT_ISOLATED_TASK_CONCURRENCY: int = Field(
description="Number of tasks allowed to be delivered concurrently from isolated queue per tenant",
default=1,
)
class FeatureConfig(
# place the configs in alphabet order
AppExecutionConfig,
AuthConfig, # Changed from OAuthConfig to AuthConfig
BillingConfig,
CodeExecutionSandboxConfig,
TriggerConfig,
AsyncWorkflowConfig,
PluginConfig,
MarketplaceConfig,
DataSetConfig,
@@ -1121,6 +1245,7 @@ class FeatureConfig(
RagEtlConfig,
RepositoryConfig,
SecurityConfig,
TenantIsolatedTaskQueueConfig,
ToolConfig,
UpdateConfig,
WorkflowConfig,

View File

@@ -18,6 +18,7 @@ from .storage.opendal_storage_config import OpenDALStorageConfig
from .storage.supabase_storage_config import SupabaseStorageConfig
from .storage.tencent_cos_storage_config import TencentCloudCOSStorageConfig
from .storage.volcengine_tos_storage_config import VolcengineTOSStorageConfig
from .vdb.alibabacloud_mysql_config import AlibabaCloudMySQLConfig
from .vdb.analyticdb_config import AnalyticdbConfig
from .vdb.baidu_vector_config import BaiduVectorDBConfig
from .vdb.chroma_config import ChromaConfig
@@ -104,6 +105,12 @@ class KeywordStoreConfig(BaseSettings):
class DatabaseConfig(BaseSettings):
# Database type selector
DB_TYPE: Literal["postgresql", "mysql", "oceanbase"] = Field(
description="Database type to use. OceanBase is MySQL-compatible.",
default="postgresql",
)
DB_HOST: str = Field(
description="Hostname or IP address of the database server.",
default="localhost",
@@ -139,12 +146,12 @@ class DatabaseConfig(BaseSettings):
default="",
)
SQLALCHEMY_DATABASE_URI_SCHEME: str = Field(
description="Database URI scheme for SQLAlchemy connection.",
default="postgresql",
)
@computed_field # type: ignore[prop-decorator]
@property
def SQLALCHEMY_DATABASE_URI_SCHEME(self) -> str:
return "postgresql" if self.DB_TYPE == "postgresql" else "mysql+pymysql"
@computed_field # type: ignore[misc]
@computed_field # type: ignore[prop-decorator]
@property
def SQLALCHEMY_DATABASE_URI(self) -> str:
db_extras = (
@@ -197,21 +204,21 @@ class DatabaseConfig(BaseSettings):
default=os.cpu_count() or 1,
)
@computed_field # type: ignore[misc]
@computed_field # type: ignore[prop-decorator]
@property
def SQLALCHEMY_ENGINE_OPTIONS(self) -> dict[str, Any]:
# Parse DB_EXTRAS for 'options'
db_extras_dict = dict(parse_qsl(self.DB_EXTRAS))
options = db_extras_dict.get("options", "")
# Always include timezone
timezone_opt = "-c timezone=UTC"
if options:
# Merge user options and timezone
merged_options = f"{options} {timezone_opt}"
else:
merged_options = timezone_opt
connect_args = {"options": merged_options}
connect_args = {}
# Use the dynamic SQLALCHEMY_DATABASE_URI_SCHEME property
if self.SQLALCHEMY_DATABASE_URI_SCHEME.startswith("postgresql"):
timezone_opt = "-c timezone=UTC"
if options:
merged_options = f"{options} {timezone_opt}"
else:
merged_options = timezone_opt
connect_args = {"options": merged_options}
return {
"pool_size": self.SQLALCHEMY_POOL_SIZE,
@@ -330,6 +337,7 @@ class MiddlewareConfig(
ClickzettaConfig,
HuaweiCloudConfig,
MilvusConfig,
AlibabaCloudMySQLConfig,
MyScaleConfig,
OpenSearchConfig,
OracleConfig,

View File

@@ -0,0 +1,54 @@
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class AlibabaCloudMySQLConfig(BaseSettings):
"""
Configuration settings for AlibabaCloud MySQL vector database
"""
ALIBABACLOUD_MYSQL_HOST: str = Field(
description="Hostname or IP address of the AlibabaCloud MySQL server (e.g., 'localhost' or 'mysql.aliyun.com')",
default="localhost",
)
ALIBABACLOUD_MYSQL_PORT: PositiveInt = Field(
description="Port number on which the AlibabaCloud MySQL server is listening (default is 3306)",
default=3306,
)
ALIBABACLOUD_MYSQL_USER: str = Field(
description="Username for authenticating with AlibabaCloud MySQL (default is 'root')",
default="root",
)
ALIBABACLOUD_MYSQL_PASSWORD: str = Field(
description="Password for authenticating with AlibabaCloud MySQL (default is an empty string)",
default="",
)
ALIBABACLOUD_MYSQL_DATABASE: str = Field(
description="Name of the AlibabaCloud MySQL database to connect to (default is 'dify')",
default="dify",
)
ALIBABACLOUD_MYSQL_MAX_CONNECTION: PositiveInt = Field(
description="Maximum number of connections in the connection pool",
default=5,
)
ALIBABACLOUD_MYSQL_CHARSET: str = Field(
description="Character set for AlibabaCloud MySQL connection (default is 'utf8mb4')",
default="utf8mb4",
)
ALIBABACLOUD_MYSQL_DISTANCE_FUNCTION: str = Field(
description="Distance function used for vector similarity search in AlibabaCloud MySQL "
"(e.g., 'cosine', 'euclidean')",
default="cosine",
)
ALIBABACLOUD_MYSQL_HNSW_M: PositiveInt = Field(
description="Maximum number of connections per layer for HNSW vector index (default is 6, range: 3-200)",
default=6,
)
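
Usage sketch for the new config class: pydantic-settings resolves each field from the environment variable of the same name, so the vector-store options are configured purely via env vars. The demo class below is a trimmed stand-in, not the shipped config.

import os
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings

class _Demo(BaseSettings):
    ALIBABACLOUD_MYSQL_PORT: PositiveInt = Field(default=3306)

os.environ["ALIBABACLOUD_MYSQL_PORT"] = "3307"
assert _Demo().ALIBABACLOUD_MYSQL_PORT == 3307  # env var overrides the default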

View File

@@ -1,23 +1,24 @@
from enum import Enum
from enum import StrEnum
from typing import Literal
from pydantic import Field, PositiveInt
from pydantic_settings import BaseSettings
class AuthMethod(StrEnum):
"""
Authentication method for OpenSearch
"""
BASIC = "basic"
AWS_MANAGED_IAM = "aws_managed_iam"
class OpenSearchConfig(BaseSettings):
"""
Configuration settings for OpenSearch
"""
class AuthMethod(Enum):
"""
Authentication method for OpenSearch
"""
BASIC = "basic"
AWS_MANAGED_IAM = "aws_managed_iam"
OPENSEARCH_HOST: str | None = Field(
description="Hostname or IP address of the OpenSearch server (e.g., 'localhost' or 'opensearch.example.com')",
default=None,
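
Why the move from a nested Enum to a module-level StrEnum matters: StrEnum members compare equal to their string values, so existing string-based configuration checks keep working. A minimal check:

from enum import StrEnum

class AuthMethod(StrEnum):
    BASIC = "basic"
    AWS_MANAGED_IAM = "aws_managed_iam"

assert AuthMethod.BASIC == "basic"  # StrEnum members equal their string value
assert AuthMethod("aws_managed_iam") is AuthMethod.AWS_MANAGED_IAM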

View File

@@ -22,7 +22,17 @@ class WeaviateConfig(BaseSettings):
default=True,
)
WEAVIATE_GRPC_ENDPOINT: str | None = Field(
description="URL of the Weaviate gRPC server (e.g., 'grpc://localhost:50051' or 'grpcs://weaviate.example.com:443')",
default=None,
)
WEAVIATE_BATCH_SIZE: PositiveInt = Field(
description="Number of objects to be processed in a single batch operation (default is 100)",
default=100,
)
WEAVIATE_TOKENIZATION: str | None = Field(
description="Tokenization for Weaviate (default is word)",
default="word",
)

View File

@@ -1,4 +1,5 @@
from configs import dify_config
from libs.collection_utils import convert_to_lower_and_upper_set
HIDDEN_VALUE = "[__HIDDEN__]"
UNKNOWN_VALUE = "[__UNKNOWN__]"
@@ -6,24 +7,39 @@ UUID_NIL = "00000000-0000-0000-0000-000000000000"
DEFAULT_FILE_NUMBER_LIMITS = 3
IMAGE_EXTENSIONS = ["jpg", "jpeg", "png", "webp", "gif", "svg"]
IMAGE_EXTENSIONS.extend([ext.upper() for ext in IMAGE_EXTENSIONS])
IMAGE_EXTENSIONS = convert_to_lower_and_upper_set({"jpg", "jpeg", "png", "webp", "gif", "svg"})
VIDEO_EXTENSIONS = ["mp4", "mov", "mpeg", "webm"]
VIDEO_EXTENSIONS.extend([ext.upper() for ext in VIDEO_EXTENSIONS])
VIDEO_EXTENSIONS = convert_to_lower_and_upper_set({"mp4", "mov", "mpeg", "webm"})
AUDIO_EXTENSIONS = ["mp3", "m4a", "wav", "amr", "mpga"]
AUDIO_EXTENSIONS.extend([ext.upper() for ext in AUDIO_EXTENSIONS])
AUDIO_EXTENSIONS = convert_to_lower_and_upper_set({"mp3", "m4a", "wav", "amr", "mpga"})
_doc_extensions: list[str]
_doc_extensions: set[str]
if dify_config.ETL_TYPE == "Unstructured":
_doc_extensions = ["txt", "markdown", "md", "mdx", "pdf", "html", "htm", "xlsx", "xls", "vtt", "properties"]
_doc_extensions.extend(("doc", "docx", "csv", "eml", "msg", "pptx", "xml", "epub"))
_doc_extensions = {
"txt",
"markdown",
"md",
"mdx",
"pdf",
"html",
"htm",
"xlsx",
"xls",
"vtt",
"properties",
"doc",
"docx",
"csv",
"eml",
"msg",
"pptx",
"xml",
"epub",
}
if dify_config.UNSTRUCTURED_API_URL:
_doc_extensions.append("ppt")
_doc_extensions.add("ppt")
else:
_doc_extensions = [
_doc_extensions = {
"txt",
"markdown",
"md",
@@ -37,5 +53,18 @@ else:
"csv",
"vtt",
"properties",
]
DOCUMENT_EXTENSIONS = _doc_extensions + [ext.upper() for ext in _doc_extensions]
}
DOCUMENT_EXTENSIONS: set[str] = convert_to_lower_and_upper_set(_doc_extensions)
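
The libs.collection_utils helper itself is not shown in this diff; judging from how it replaces the old list-plus-upper() pattern, a plausible implementation produces both lowercase and uppercase variants of every extension:

def convert_to_lower_and_upper_set(values) -> set[str]:
    # Assumed sketch: normalize to lowercase, then union in uppercase twins.
    lowered = {v.lower() for v in values}
    return lowered | {v.upper() for v in lowered}

assert convert_to_lower_and_upper_set({"jpg", "png"}) == {"jpg", "png", "JPG", "PNG"}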
# console
COOKIE_NAME_ACCESS_TOKEN = "access_token"
COOKIE_NAME_REFRESH_TOKEN = "refresh_token"
COOKIE_NAME_CSRF_TOKEN = "csrf_token"
# webapp
COOKIE_NAME_WEBAPP_ACCESS_TOKEN = "webapp_access_token"
COOKIE_NAME_PASSPORT = "passport"
HEADER_NAME_CSRF_TOKEN = "X-CSRF-Token"
HEADER_NAME_APP_CODE = "X-App-Code"
HEADER_NAME_PASSPORT = "X-App-Passport"

View File

@@ -31,3 +31,9 @@ def supported_language(lang):
error = f"{lang} is not a valid language."
raise ValueError(error)
def get_valid_language(lang: str | None) -> str:
if lang and lang in languages:
return lang
return languages[0]
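
Usage sketch for the new fallback helper, assuming languages is the ordered list of supported codes (subset shown here) with the default locale first:

languages = ["en-US", "zh-Hans", "ja-JP"]  # illustrative subset

def get_valid_language(lang: str | None) -> str:
    if lang and lang in languages:
        return lang
    return languages[0]

assert get_valid_language("ja-JP") == "ja-JP"
assert get_valid_language("xx") == "en-US"  # unknown codes fall back to the default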

File diff suppressed because one or more lines are too long

View File

@@ -9,6 +9,7 @@ if TYPE_CHECKING:
from core.model_runtime.entities.model_entities import AIModelEntity
from core.plugin.entities.plugin_daemon import PluginModelProviderEntity
from core.tools.plugin_tool.provider import PluginToolProviderController
from core.trigger.provider import PluginTriggerProviderController
"""
@@ -41,3 +42,11 @@ datasource_plugin_providers: RecyclableContextVar[dict[str, "DatasourcePluginPro
datasource_plugin_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
ContextVar("datasource_plugin_providers_lock")
)
plugin_trigger_providers: RecyclableContextVar[dict[str, "PluginTriggerProviderController"]] = RecyclableContextVar(
ContextVar("plugin_trigger_providers")
)
plugin_trigger_providers_lock: RecyclableContextVar[Lock] = RecyclableContextVar(
ContextVar("plugin_trigger_providers_lock")
)

View File

@@ -25,6 +25,12 @@ class UnsupportedFileTypeError(BaseHTTPException):
code = 415
class BlockedFileExtensionError(BaseHTTPException):
error_code = "file_extension_blocked"
description = "The file extension is blocked for security reasons."
code = 400
class TooManyFilesError(BaseHTTPException):
error_code = "too_many_files"
description = "Only one file is allowed."

View File

@@ -24,7 +24,7 @@ except ImportError:
)
else:
warnings.warn("To use python-magic guess MIMETYPE, you need to install `libmagic`", stacklevel=2)
magic = None # type: ignore
magic = None # type: ignore[assignment]
from pydantic import BaseModel

View File

@@ -66,6 +66,7 @@ from .app import (
workflow_draft_variable,
workflow_run,
workflow_statistic,
workflow_trigger,
)
# Import auth controllers
@@ -126,6 +127,7 @@ from .workspace import (
models,
plugin,
tool_providers,
trigger_providers,
workspace,
)
@@ -196,6 +198,7 @@ __all__ = [
"statistic",
"tags",
"tool_providers",
"trigger_providers",
"version",
"website",
"workflow",
@@ -203,5 +206,6 @@ __all__ = [
"workflow_draft_variable",
"workflow_run",
"workflow_statistic",
"workflow_trigger",
"workspace",
]

View File

@@ -12,9 +12,10 @@ P = ParamSpec("P")
R = TypeVar("R")
from configs import dify_config
from constants.languages import supported_language
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.wraps import only_edition_cloud
from extensions.ext_database import db
from libs.token import extract_access_token
from models.model import App, InstalledApp, RecommendedApp
@@ -24,19 +25,9 @@ def admin_required(view: Callable[P, R]):
if not dify_config.ADMIN_API_KEY:
raise Unauthorized("API key is invalid.")
auth_header = request.headers.get("Authorization")
if auth_header is None:
auth_token = extract_access_token(request)
if not auth_token:
raise Unauthorized("Authorization header is missing.")
if " " not in auth_header:
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
auth_scheme, auth_token = auth_header.split(None, 1)
auth_scheme = auth_scheme.lower()
if auth_scheme != "bearer":
raise Unauthorized("Invalid Authorization header format. Expected 'Bearer <api-key>' format.")
if auth_token != dify_config.ADMIN_API_KEY:
raise Unauthorized("API key is invalid.")
@@ -47,10 +38,10 @@ def admin_required(view: Callable[P, R]):
@console_ns.route("/admin/insert-explore-apps")
class InsertExploreAppListApi(Resource):
@api.doc("insert_explore_app")
@api.doc(description="Insert or update an app in the explore list")
@api.expect(
api.model(
@console_ns.doc("insert_explore_app")
@console_ns.doc(description="Insert or update an app in the explore list")
@console_ns.expect(
console_ns.model(
"InsertExploreAppRequest",
{
"app_id": fields.String(required=True, description="Application ID"),
@@ -64,21 +55,23 @@ class InsertExploreAppListApi(Resource):
},
)
)
@api.response(200, "App updated successfully")
@api.response(201, "App inserted successfully")
@api.response(404, "App not found")
@console_ns.response(200, "App updated successfully")
@console_ns.response(201, "App inserted successfully")
@console_ns.response(404, "App not found")
@only_edition_cloud
@admin_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("app_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("desc", type=str, location="json")
parser.add_argument("copyright", type=str, location="json")
parser.add_argument("privacy_policy", type=str, location="json")
parser.add_argument("custom_disclaimer", type=str, location="json")
parser.add_argument("language", type=supported_language, required=True, nullable=False, location="json")
parser.add_argument("category", type=str, required=True, nullable=False, location="json")
parser.add_argument("position", type=int, required=True, nullable=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("app_id", type=str, required=True, nullable=False, location="json")
.add_argument("desc", type=str, location="json")
.add_argument("copyright", type=str, location="json")
.add_argument("privacy_policy", type=str, location="json")
.add_argument("custom_disclaimer", type=str, location="json")
.add_argument("language", type=supported_language, required=True, nullable=False, location="json")
.add_argument("category", type=str, required=True, nullable=False, location="json")
.add_argument("position", type=int, required=True, nullable=False, location="json")
)
args = parser.parse_args()
app = db.session.execute(select(App).where(App.id == args["app_id"])).scalar_one_or_none()
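
The chained form introduced here works because flask_restx's RequestParser.add_argument returns the parser itself, so the fluent style builds the same argument list as the repeated statements it replaces (no request context is needed just to construct the parser):

from flask_restx import reqparse

parser = (
    reqparse.RequestParser()
    .add_argument("app_id", type=str, required=True, location="json")
    .add_argument("desc", type=str, location="json")
)
assert isinstance(parser, reqparse.RequestParser)  # add_argument returned self
assert [arg.name for arg in parser.args] == ["app_id", "desc"]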
@@ -138,10 +131,10 @@ class InsertExploreAppListApi(Resource):
@console_ns.route("/admin/insert-explore-apps/<uuid:app_id>")
class InsertExploreAppApi(Resource):
@api.doc("delete_explore_app")
@api.doc(description="Remove an app from the explore list")
@api.doc(params={"app_id": "Application ID to remove"})
@api.response(204, "App removed successfully")
@console_ns.doc("delete_explore_app")
@console_ns.doc(description="Remove an app from the explore list")
@console_ns.doc(params={"app_id": "Application ID to remove"})
@console_ns.response(204, "App removed successfully")
@only_edition_cloud
@admin_required
def delete(self, app_id):

View File

@@ -1,5 +1,4 @@
import flask_restx
from flask_login import current_user
from flask_restx import Resource, fields, marshal_with
from flask_restx._http import HTTPStatus
from sqlalchemy import select
@@ -8,12 +7,12 @@ from werkzeug.exceptions import Forbidden
from extensions.ext_database import db
from libs.helper import TimestampField
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from models.dataset import Dataset
from models.model import ApiToken, App
from . import api, console_ns
from .wraps import account_initialization_required, setup_required
from . import console_ns
from .wraps import account_initialization_required, edit_permission_required, setup_required
api_key_fields = {
"id": fields.String,
@@ -25,6 +24,12 @@ api_key_fields = {
api_key_list = {"data": fields.List(fields.Nested(api_key_fields), attribute="items")}
api_key_item_model = console_ns.model("ApiKeyItem", api_key_fields)
api_key_list_model = console_ns.model(
"ApiKeyList", {"data": fields.List(fields.Nested(api_key_item_model), attribute="items")}
)
def _get_resource(resource_id, tenant_id, resource_model):
if resource_model == App:
@@ -53,11 +58,13 @@ class BaseApiKeyListResource(Resource):
token_prefix: str | None = None
max_keys = 10
@marshal_with(api_key_list)
@marshal_with(api_key_list_model)
def get(self, resource_id):
assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
_, current_tenant_id = current_account_with_tenant()
_get_resource(resource_id, current_tenant_id, self.resource_model)
keys = db.session.scalars(
select(ApiToken).where(
ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id
@@ -65,14 +72,13 @@ class BaseApiKeyListResource(Resource):
).all()
return {"items": keys}
@marshal_with(api_key_fields)
@marshal_with(api_key_item_model)
@edit_permission_required
def post(self, resource_id):
assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
if not current_user.is_editor:
raise Forbidden()
_, current_tenant_id = current_account_with_tenant()
_get_resource(resource_id, current_tenant_id, self.resource_model)
current_key_count = (
db.session.query(ApiToken)
.where(ApiToken.type == self.resource_type, getattr(ApiToken, self.resource_id_field) == resource_id)
@@ -89,7 +95,7 @@ class BaseApiKeyListResource(Resource):
key = ApiToken.generate_api_key(self.token_prefix or "", 24)
api_token = ApiToken()
setattr(api_token, self.resource_id_field, resource_id)
api_token.tenant_id = current_user.current_tenant_id
api_token.tenant_id = current_tenant_id
api_token.token = key
api_token.type = self.resource_type
db.session.add(api_token)
@@ -104,13 +110,11 @@ class BaseApiKeyResource(Resource):
resource_model: type | None = None
resource_id_field: str | None = None
def delete(self, resource_id, api_key_id):
def delete(self, resource_id: str, api_key_id: str):
assert self.resource_id_field is not None, "resource_id_field must be set"
resource_id = str(resource_id)
api_key_id = str(api_key_id)
_get_resource(resource_id, current_user.current_tenant_id, self.resource_model)
current_user, current_tenant_id = current_account_with_tenant()
_get_resource(resource_id, current_tenant_id, self.resource_model)
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
@@ -135,28 +139,23 @@ class BaseApiKeyResource(Resource):
@console_ns.route("/apps/<uuid:resource_id>/api-keys")
class AppApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_app_api_keys")
@api.doc(description="Get all API keys for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
@console_ns.doc("get_app_api_keys")
@console_ns.doc(description="Get all API keys for an app")
@console_ns.doc(params={"resource_id": "App ID"})
@console_ns.response(200, "Success", api_key_list_model)
def get(self, resource_id): # type: ignore
"""Get all API keys for an app"""
return super().get(resource_id)
@api.doc("create_app_api_key")
@api.doc(description="Create a new API key for an app")
@api.doc(params={"resource_id": "App ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
@console_ns.doc("create_app_api_key")
@console_ns.doc(description="Create a new API key for an app")
@console_ns.doc(params={"resource_id": "App ID"})
@console_ns.response(201, "API key created successfully", api_key_item_model)
@console_ns.response(400, "Maximum keys exceeded")
def post(self, resource_id): # type: ignore
"""Create a new API key for an app"""
return super().post(resource_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = "app"
resource_model = App
resource_id_field = "app_id"
@@ -165,19 +164,14 @@ class AppApiKeyListResource(BaseApiKeyListResource):
@console_ns.route("/apps/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class AppApiKeyResource(BaseApiKeyResource):
@api.doc("delete_app_api_key")
@api.doc(description="Delete an API key for an app")
@api.doc(params={"resource_id": "App ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
@console_ns.doc("delete_app_api_key")
@console_ns.doc(description="Delete an API key for an app")
@console_ns.doc(params={"resource_id": "App ID", "api_key_id": "API key ID"})
@console_ns.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for an app"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = "app"
resource_model = App
resource_id_field = "app_id"
@@ -185,28 +179,23 @@ class AppApiKeyResource(BaseApiKeyResource):
@console_ns.route("/datasets/<uuid:resource_id>/api-keys")
class DatasetApiKeyListResource(BaseApiKeyListResource):
@api.doc("get_dataset_api_keys")
@api.doc(description="Get all API keys for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(200, "Success", api_key_list)
def get(self, resource_id):
@console_ns.doc("get_dataset_api_keys")
@console_ns.doc(description="Get all API keys for a dataset")
@console_ns.doc(params={"resource_id": "Dataset ID"})
@console_ns.response(200, "Success", api_key_list_model)
def get(self, resource_id): # type: ignore
"""Get all API keys for a dataset"""
return super().get(resource_id)
@api.doc("create_dataset_api_key")
@api.doc(description="Create a new API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID"})
@api.response(201, "API key created successfully", api_key_fields)
@api.response(400, "Maximum keys exceeded")
def post(self, resource_id):
@console_ns.doc("create_dataset_api_key")
@console_ns.doc(description="Create a new API key for a dataset")
@console_ns.doc(params={"resource_id": "Dataset ID"})
@console_ns.response(201, "API key created successfully", api_key_item_model)
@console_ns.response(400, "Maximum keys exceeded")
def post(self, resource_id): # type: ignore
"""Create a new API key for a dataset"""
return super().post(resource_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = "dataset"
resource_model = Dataset
resource_id_field = "dataset_id"
@@ -215,19 +204,14 @@ class DatasetApiKeyListResource(BaseApiKeyListResource):
@console_ns.route("/datasets/<uuid:resource_id>/api-keys/<uuid:api_key_id>")
class DatasetApiKeyResource(BaseApiKeyResource):
@api.doc("delete_dataset_api_key")
@api.doc(description="Delete an API key for a dataset")
@api.doc(params={"resource_id": "Dataset ID", "api_key_id": "API key ID"})
@api.response(204, "API key deleted successfully")
@console_ns.doc("delete_dataset_api_key")
@console_ns.doc(description="Delete an API key for a dataset")
@console_ns.doc(params={"resource_id": "Dataset ID", "api_key_id": "API key ID"})
@console_ns.response(204, "API key deleted successfully")
def delete(self, resource_id, api_key_id):
"""Delete an API key for a dataset"""
return super().delete(resource_id, api_key_id)
def after_request(self, resp):
resp.headers["Access-Control-Allow-Origin"] = "*"
resp.headers["Access-Control-Allow-Credentials"] = "true"
return resp
resource_type = "dataset"
resource_model = Dataset
resource_id_field = "dataset_id"

View File

@@ -1,35 +1,39 @@
from flask_restx import Resource, fields, reqparse
from flask import request
from flask_restx import Resource, fields
from pydantic import BaseModel, Field
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
from services.advanced_prompt_template_service import AdvancedPromptTemplateService
class AdvancedPromptTemplateQuery(BaseModel):
app_mode: str = Field(..., description="Application mode")
model_mode: str = Field(..., description="Model mode")
has_context: str = Field(default="true", description="Whether has context")
model_name: str = Field(..., description="Model name")
console_ns.schema_model(
AdvancedPromptTemplateQuery.__name__,
AdvancedPromptTemplateQuery.model_json_schema(ref_template="#/definitions/{model}"),
)
@console_ns.route("/app/prompt-templates")
class AdvancedPromptTemplateList(Resource):
@api.doc("get_advanced_prompt_templates")
@api.doc(description="Get advanced prompt templates based on app mode and model configuration")
@api.expect(
api.parser()
.add_argument("app_mode", type=str, required=True, location="args", help="Application mode")
.add_argument("model_mode", type=str, required=True, location="args", help="Model mode")
.add_argument("has_context", type=str, default="true", location="args", help="Whether has context")
.add_argument("model_name", type=str, required=True, location="args", help="Model name")
)
@api.response(
@console_ns.doc("get_advanced_prompt_templates")
@console_ns.doc(description="Get advanced prompt templates based on app mode and model configuration")
@console_ns.expect(console_ns.models[AdvancedPromptTemplateQuery.__name__])
@console_ns.response(
200, "Prompt templates retrieved successfully", fields.List(fields.Raw(description="Prompt template data"))
)
@api.response(400, "Invalid request parameters")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
def get(self):
parser = reqparse.RequestParser()
parser.add_argument("app_mode", type=str, required=True, location="args")
parser.add_argument("model_mode", type=str, required=True, location="args")
parser.add_argument("has_context", type=str, required=False, default="true", location="args")
parser.add_argument("model_name", type=str, required=True, location="args")
args = parser.parse_args()
args = AdvancedPromptTemplateQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
return AdvancedPromptTemplateService.get_prompt(args)
return AdvancedPromptTemplateService.get_prompt(args.model_dump())
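
A sketch of the validation pattern above: request.args.to_dict(flat=True) yields a plain {str: str} mapping (repeated query keys collapse to their first value) that Pydantic validates in place of reqparse, with field defaults applying when the query string omits a key. The class below is a trimmed stand-in for AdvancedPromptTemplateQuery.

from pydantic import BaseModel, Field

class Query(BaseModel):
    app_mode: str
    has_context: str = Field(default="true")

args = Query.model_validate({"app_mode": "chat"})  # e.g. ?app_mode=chat
assert args.model_dump() == {"app_mode": "chat", "has_context": "true"}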

View File

@@ -1,6 +1,6 @@
from flask_restx import Resource, fields, reqparse
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from libs.helper import uuid_value
@@ -8,29 +8,29 @@ from libs.login import login_required
from models.model import AppMode
from services.agent_service import AgentService
parser = (
reqparse.RequestParser()
.add_argument("message_id", type=uuid_value, required=True, location="args", help="Message UUID")
.add_argument("conversation_id", type=uuid_value, required=True, location="args", help="Conversation UUID")
)
@console_ns.route("/apps/<uuid:app_id>/agent/logs")
class AgentLogApi(Resource):
@api.doc("get_agent_logs")
@api.doc(description="Get agent execution logs for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("message_id", type=str, required=True, location="args", help="Message UUID")
.add_argument("conversation_id", type=str, required=True, location="args", help="Conversation UUID")
@console_ns.doc("get_agent_logs")
@console_ns.doc(description="Get agent execution logs for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(parser)
@console_ns.response(
200, "Agent logs retrieved successfully", fields.List(fields.Raw(description="Agent log entries"))
)
@api.response(200, "Agent logs retrieved successfully", fields.List(fields.Raw(description="Agent log entries")))
@api.response(400, "Invalid request parameters")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.AGENT_CHAT])
def get(self, app_model):
"""Get agent logs"""
parser = reqparse.RequestParser()
parser.add_argument("message_id", type=uuid_value, required=True, location="args")
parser.add_argument("conversation_id", type=uuid_value, required=True, location="args")
args = parser.parse_args()
return AgentService.get_agent_logs(app_model, args["conversation_id"], args["message_id"])

View File

@@ -1,33 +1,34 @@
from typing import Literal
from flask import request
from flask_login import current_user
from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from werkzeug.exceptions import Forbidden
from controllers.common.errors import NoFileUploadedError, TooManyFilesError
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
edit_permission_required,
setup_required,
)
from extensions.ext_redis import redis_client
from fields.annotation_fields import (
annotation_fields,
annotation_hit_history_fields,
build_annotation_model,
)
from libs.helper import uuid_value
from libs.login import login_required
from services.annotation_service import AppAnnotationService
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>")
class AnnotationReplyActionApi(Resource):
@api.doc("annotation_reply_action")
@api.doc(description="Enable or disable annotation reply for an app")
@api.doc(params={"app_id": "Application ID", "action": "Action to perform (enable/disable)"})
@api.expect(
api.model(
@console_ns.doc("annotation_reply_action")
@console_ns.doc(description="Enable or disable annotation reply for an app")
@console_ns.doc(params={"app_id": "Application ID", "action": "Action to perform (enable/disable)"})
@console_ns.expect(
console_ns.model(
"AnnotationReplyActionRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold for annotation matching"),
@@ -36,21 +37,21 @@ class AnnotationReplyActionApi(Resource):
},
)
)
@api.response(200, "Action completed successfully")
@api.response(403, "Insufficient permissions")
@console_ns.response(200, "Action completed successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@edit_permission_required
def post(self, app_id, action: Literal["enable", "disable"]):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
parser = reqparse.RequestParser()
parser.add_argument("score_threshold", required=True, type=float, location="json")
parser.add_argument("embedding_provider_name", required=True, type=str, location="json")
parser.add_argument("embedding_model_name", required=True, type=str, location="json")
parser = (
reqparse.RequestParser()
.add_argument("score_threshold", required=True, type=float, location="json")
.add_argument("embedding_provider_name", required=True, type=str, location="json")
.add_argument("embedding_model_name", required=True, type=str, location="json")
)
args = parser.parse_args()
if action == "enable":
result = AppAnnotationService.enable_app_annotation(args, app_id)
@@ -61,18 +62,16 @@ class AnnotationReplyActionApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotation-setting")
class AppAnnotationSettingDetailApi(Resource):
@api.doc("get_annotation_setting")
@api.doc(description="Get annotation settings for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotation settings retrieved successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("get_annotation_setting")
@console_ns.doc(description="Get annotation settings for an app")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Annotation settings retrieved successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, app_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
result = AppAnnotationService.get_app_annotation_setting_by_app_id(app_id)
return result, 200
@@ -80,11 +79,11 @@ class AppAnnotationSettingDetailApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotation-settings/<uuid:annotation_setting_id>")
class AppAnnotationSettingUpdateApi(Resource):
@api.doc("update_annotation_setting")
@api.doc(description="Update annotation settings for an app")
@api.doc(params={"app_id": "Application ID", "annotation_setting_id": "Annotation setting ID"})
@api.expect(
api.model(
@console_ns.doc("update_annotation_setting")
@console_ns.doc(description="Update annotation settings for an app")
@console_ns.doc(params={"app_id": "Application ID", "annotation_setting_id": "Annotation setting ID"})
@console_ns.expect(
console_ns.model(
"AnnotationSettingUpdateRequest",
{
"score_threshold": fields.Float(required=True, description="Score threshold"),
@@ -93,20 +92,17 @@ class AppAnnotationSettingUpdateApi(Resource):
},
)
)
@api.response(200, "Settings updated successfully")
@api.response(403, "Insufficient permissions")
@console_ns.response(200, "Settings updated successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def post(self, app_id, annotation_setting_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_setting_id = str(annotation_setting_id)
parser = reqparse.RequestParser()
parser.add_argument("score_threshold", required=True, type=float, location="json")
parser = reqparse.RequestParser().add_argument("score_threshold", required=True, type=float, location="json")
args = parser.parse_args()
result = AppAnnotationService.update_app_annotation_setting(app_id, annotation_setting_id, args)
@@ -115,19 +111,17 @@ class AppAnnotationSettingUpdateApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotation-reply/<string:action>/status/<uuid:job_id>")
class AnnotationReplyActionStatusApi(Resource):
@api.doc("get_annotation_reply_action_status")
@api.doc(description="Get status of annotation reply action job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID", "action": "Action type"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("get_annotation_reply_action_status")
@console_ns.doc(description="Get status of annotation reply action job")
@console_ns.doc(params={"app_id": "Application ID", "job_id": "Job ID", "action": "Action type"})
@console_ns.response(200, "Job status retrieved successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@edit_permission_required
def get(self, app_id, job_id, action):
if not current_user.is_editor:
raise Forbidden()
job_id = str(job_id)
app_annotation_job_key = f"{action}_app_annotation_job_{str(job_id)}"
cache_result = redis_client.get(app_annotation_job_key)
@@ -145,24 +139,22 @@ class AnnotationReplyActionStatusApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotations")
class AnnotationApi(Resource):
@api.doc("list_annotations")
@api.doc(description="Get annotations for an app with pagination")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
@console_ns.doc("list_annotations")
@console_ns.doc(description="Get annotations for an app with pagination")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
.add_argument("keyword", type=str, location="args", default="", help="Search keyword")
)
@api.response(200, "Annotations retrieved successfully")
@api.response(403, "Insufficient permissions")
@console_ns.response(200, "Annotations retrieved successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, app_id):
if not current_user.is_editor:
raise Forbidden()
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
keyword = request.args.get("keyword", default="", type=str)
@@ -178,45 +170,48 @@ class AnnotationApi(Resource):
}
return response, 200
@api.doc("create_annotation")
@api.doc(description="Create a new annotation for an app")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("create_annotation")
@console_ns.doc(description="Create a new annotation for an app")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"CreateAnnotationRequest",
{
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"message_id": fields.String(description="Message ID (optional)"),
"question": fields.String(description="Question text (required when message_id not provided)"),
"answer": fields.String(description="Answer text (use 'answer' or 'content')"),
"content": fields.String(description="Content text (use 'answer' or 'content')"),
"annotation_reply": fields.Raw(description="Annotation reply data"),
},
)
)
@api.response(201, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@console_ns.response(201, "Annotation created successfully", build_annotation_model(console_ns))
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@marshal_with(annotation_fields)
@edit_permission_required
def post(self, app_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
parser = reqparse.RequestParser()
parser.add_argument("question", required=True, type=str, location="json")
parser.add_argument("answer", required=True, type=str, location="json")
parser = (
reqparse.RequestParser()
.add_argument("message_id", required=False, type=uuid_value, location="json")
.add_argument("question", required=False, type=str, location="json")
.add_argument("answer", required=False, type=str, location="json")
.add_argument("content", required=False, type=str, location="json")
.add_argument("annotation_reply", required=False, type=dict, location="json")
)
args = parser.parse_args()
annotation = AppAnnotationService.insert_app_annotation_directly(args, app_id)
annotation = AppAnnotationService.up_insert_app_annotation_from_message(args, app_id)
return annotation
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def delete(self, app_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
# Use request.args.getlist to get annotation_ids array directly
@@ -241,46 +236,51 @@ class AnnotationApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotations/export")
class AnnotationExportApi(Resource):
@api.doc("export_annotations")
@api.doc(description="Export all annotations for an app")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Annotations exported successfully", fields.List(fields.Nested(annotation_fields)))
@api.response(403, "Insufficient permissions")
@console_ns.doc("export_annotations")
@console_ns.doc(description="Export all annotations for an app")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(
200,
"Annotations exported successfully",
console_ns.model("AnnotationList", {"data": fields.List(fields.Nested(build_annotation_model(console_ns)))}),
)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, app_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_list = AppAnnotationService.export_annotation_list_by_app_id(app_id)
response = {"data": marshal(annotation_list, annotation_fields)}
return response, 200
parser = (
reqparse.RequestParser()
.add_argument("question", required=True, type=str, location="json")
.add_argument("answer", required=True, type=str, location="json")
)
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>")
class AnnotationUpdateDeleteApi(Resource):
@api.doc("update_delete_annotation")
@api.doc(description="Update or delete an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.response(200, "Annotation updated successfully", annotation_fields)
@api.response(204, "Annotation deleted successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("update_delete_annotation")
@console_ns.doc(description="Update or delete an annotation")
@console_ns.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@console_ns.response(200, "Annotation updated successfully", build_annotation_model(console_ns))
@console_ns.response(204, "Annotation deleted successfully")
@console_ns.response(403, "Insufficient permissions")
@console_ns.expect(parser)
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@edit_permission_required
@marshal_with(annotation_fields)
def post(self, app_id, annotation_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_id = str(annotation_id)
parser = reqparse.RequestParser()
parser.add_argument("question", required=True, type=str, location="json")
parser.add_argument("answer", required=True, type=str, location="json")
args = parser.parse_args()
annotation = AppAnnotationService.update_app_annotation_directly(args, app_id, annotation_id)
return annotation
@@ -288,10 +288,8 @@ class AnnotationUpdateDeleteApi(Resource):
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def delete(self, app_id, annotation_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
annotation_id = str(annotation_id)
AppAnnotationService.delete_app_annotation(app_id, annotation_id)
@@ -300,20 +298,18 @@ class AnnotationUpdateDeleteApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import")
class AnnotationBatchImportApi(Resource):
@api.doc("batch_import_annotations")
@api.doc(description="Batch import annotations from CSV file")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Batch import started successfully")
@api.response(403, "Insufficient permissions")
@api.response(400, "No file uploaded or too many files")
@console_ns.doc("batch_import_annotations")
@console_ns.doc(description="Batch import annotations from CSV file")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Batch import started successfully")
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(400, "No file uploaded or too many files")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@edit_permission_required
def post(self, app_id):
if not current_user.is_editor:
raise Forbidden()
app_id = str(app_id)
# check file
if "file" not in request.files:
@@ -332,19 +328,17 @@ class AnnotationBatchImportApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotations/batch-import-status/<uuid:job_id>")
class AnnotationBatchImportStatusApi(Resource):
@api.doc("get_batch_import_status")
@api.doc(description="Get status of batch import job")
@api.doc(params={"app_id": "Application ID", "job_id": "Job ID"})
@api.response(200, "Job status retrieved successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("get_batch_import_status")
@console_ns.doc(description="Get status of batch import job")
@console_ns.doc(params={"app_id": "Application ID", "job_id": "Job ID"})
@console_ns.response(200, "Job status retrieved successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@edit_permission_required
def get(self, app_id, job_id):
if not current_user.is_editor:
raise Forbidden()
job_id = str(job_id)
indexing_cache_key = f"app_annotation_batch_import_{str(job_id)}"
cache_result = redis_client.get(indexing_cache_key)
@@ -361,25 +355,32 @@ class AnnotationBatchImportStatusApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/annotations/<uuid:annotation_id>/hit-histories")
class AnnotationHitHistoryListApi(Resource):
@api.doc("list_annotation_hit_histories")
@api.doc(description="Get hit histories for an annotation")
@api.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@api.expect(
api.parser()
@console_ns.doc("list_annotation_hit_histories")
@console_ns.doc(description="Get hit histories for an annotation")
@console_ns.doc(params={"app_id": "Application ID", "annotation_id": "Annotation ID"})
@console_ns.expect(
console_ns.parser()
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size")
)
@api.response(
200, "Hit histories retrieved successfully", fields.List(fields.Nested(annotation_hit_history_fields))
@console_ns.response(
200,
"Hit histories retrieved successfully",
console_ns.model(
"AnnotationHitHistoryList",
{
"data": fields.List(
fields.Nested(console_ns.model("AnnotationHitHistoryItem", annotation_hit_history_fields))
)
},
),
)
@api.response(403, "Insufficient permissions")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, app_id, annotation_id):
if not current_user.is_editor:
raise Forbidden()
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
app_id = str(app_id)

View File

@@ -1,96 +1,295 @@
import uuid
from typing import cast
from typing import Literal
from flask_login import current_user
from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
from flask import request
from flask_restx import Resource, fields, marshal, marshal_with
from pydantic import BaseModel, Field, field_validator
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import BadRequest, Forbidden, abort
from werkzeug.exceptions import BadRequest
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
edit_permission_required,
enterprise_license_required,
is_admin_or_owner_required,
setup_required,
)
from core.ops.ops_trace_manager import OpsTraceManager
from core.workflow.enums import NodeType
from extensions.ext_database import db
from fields.app_fields import app_detail_fields, app_detail_fields_with_site, app_pagination_fields
from libs.login import login_required
from fields.app_fields import (
deleted_tool_fields,
model_config_fields,
model_config_partial_fields,
site_fields,
tag_fields,
)
from fields.workflow_fields import workflow_partial_fields as _workflow_partial_fields_dict
from libs.helper import AppIconUrlField, TimestampField
from libs.login import current_account_with_tenant, login_required
from libs.validators import validate_description_length
from models import Account, App
from models import App, Workflow
from services.app_dsl_service import AppDslService, ImportMode
from services.app_service import AppService
from services.enterprise.enterprise_service import EnterpriseService
from services.feature_service import FeatureService
ALLOW_CREATE_APP_MODES = ["chat", "agent-chat", "advanced-chat", "workflow", "completion"]
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class AppListQuery(BaseModel):
page: int = Field(default=1, ge=1, le=99999, description="Page number (1-99999)")
limit: int = Field(default=20, ge=1, le=100, description="Page size (1-100)")
mode: Literal["completion", "chat", "advanced-chat", "workflow", "agent-chat", "channel", "all"] = Field(
default="all", description="App mode filter"
)
name: str | None = Field(default=None, description="Filter by app name")
tag_ids: list[str] | None = Field(default=None, description="Comma-separated tag IDs")
is_created_by_me: bool | None = Field(default=None, description="Filter by creator")
@field_validator("tag_ids", mode="before")
@classmethod
def validate_tag_ids(cls, value: str | list[str] | None) -> list[str] | None:
if not value:
return None
if isinstance(value, str):
items = [item.strip() for item in value.split(",") if item.strip()]
elif isinstance(value, list):
items = [str(item).strip() for item in value if item and str(item).strip()]
else:
raise TypeError("Unsupported tag_ids type.")
if not items:
return None
try:
return [str(uuid.UUID(item)) for item in items]
except ValueError as exc:
raise ValueError("Invalid UUID format in tag_ids.") from exc
class CreateAppPayload(BaseModel):
name: str = Field(..., min_length=1, description="App name")
description: str | None = Field(default=None, description="App description (max 400 chars)")
mode: Literal["chat", "agent-chat", "advanced-chat", "workflow", "completion"] = Field(..., description="App mode")
icon_type: str | None = Field(default=None, description="Icon type")
icon: str | None = Field(default=None, description="Icon")
icon_background: str | None = Field(default=None, description="Icon background color")
@field_validator("description")
@classmethod
def validate_description(cls, value: str | None) -> str | None:
if value is None:
return value
return validate_description_length(value)
class UpdateAppPayload(BaseModel):
name: str = Field(..., min_length=1, description="App name")
description: str | None = Field(default=None, description="App description (max 400 chars)")
icon_type: str | None = Field(default=None, description="Icon type")
icon: str | None = Field(default=None, description="Icon")
icon_background: str | None = Field(default=None, description="Icon background color")
use_icon_as_answer_icon: bool | None = Field(default=None, description="Use icon as answer icon")
max_active_requests: int | None = Field(default=None, description="Maximum active requests")
@field_validator("description")
@classmethod
def validate_description(cls, value: str | None) -> str | None:
if value is None:
return value
return validate_description_length(value)
class CopyAppPayload(BaseModel):
name: str | None = Field(default=None, description="Name for the copied app")
description: str | None = Field(default=None, description="Description for the copied app")
icon_type: str | None = Field(default=None, description="Icon type")
icon: str | None = Field(default=None, description="Icon")
icon_background: str | None = Field(default=None, description="Icon background color")
@field_validator("description")
@classmethod
def validate_description(cls, value: str | None) -> str | None:
if value is None:
return value
return validate_description_length(value)
class AppExportQuery(BaseModel):
include_secret: bool = Field(default=False, description="Include secrets in export")
workflow_id: str | None = Field(default=None, description="Specific workflow ID to export")
class AppNamePayload(BaseModel):
name: str = Field(..., min_length=1, description="Name to check")
class AppIconPayload(BaseModel):
icon: str | None = Field(default=None, description="Icon data")
icon_background: str | None = Field(default=None, description="Icon background color")
class AppSiteStatusPayload(BaseModel):
enable_site: bool = Field(..., description="Enable or disable site")
class AppApiStatusPayload(BaseModel):
enable_api: bool = Field(..., description="Enable or disable API")
class AppTracePayload(BaseModel):
enabled: bool = Field(..., description="Enable or disable tracing")
tracing_provider: str = Field(..., description="Tracing provider")
def reg(cls: type[BaseModel]):
console_ns.schema_model(cls.__name__, cls.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
reg(AppListQuery)
reg(CreateAppPayload)
reg(UpdateAppPayload)
reg(CopyAppPayload)
reg(AppExportQuery)
reg(AppNamePayload)
reg(AppIconPayload)
reg(AppSiteStatusPayload)
reg(AppApiStatusPayload)
reg(AppTracePayload)
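Editor's note: schema_model (unlike console_ns.model) registers a raw JSON schema, here the one Pydantic emits, under the class name, so later decorators can look it up via console_ns.models[...] for Swagger without hand-maintaining flask-restx field dicts. A minimal sketch of the pattern under the same assumptions (app, namespace, and model names are illustrative):

from flask import Flask
from flask_restx import Api, Namespace, Resource
from pydantic import BaseModel, Field

class EchoPayload(BaseModel):
    text: str = Field(..., min_length=1)

app = Flask(__name__)
api = Api(app)
ns = Namespace("demo", path="/demo")
api.add_namespace(ns)

# Swagger 2.0 wants "#/definitions/{model}" refs; Pydantic defaults to "#/$defs/{model}",
# hence the ref_template override mirrored by DEFAULT_REF_TEMPLATE_SWAGGER_2_0 above.
ns.schema_model(EchoPayload.__name__,
                EchoPayload.model_json_schema(ref_template="#/definitions/{model}"))

@ns.route("/echo")
class Echo(Resource):
    @ns.expect(ns.models[EchoPayload.__name__])
    def post(self):
        payload = EchoPayload.model_validate(ns.payload)  # validate the JSON body
        return {"text": payload.text}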
# Register models for flask_restx to avoid dict type issues in Swagger
# Register base models first
tag_model = console_ns.model("Tag", tag_fields)
workflow_partial_model = console_ns.model("WorkflowPartial", _workflow_partial_fields_dict)
model_config_model = console_ns.model("ModelConfig", model_config_fields)
model_config_partial_model = console_ns.model("ModelConfigPartial", model_config_partial_fields)
deleted_tool_model = console_ns.model("DeletedTool", deleted_tool_fields)
site_model = console_ns.model("Site", site_fields)
app_partial_model = console_ns.model(
"AppPartial",
{
"id": fields.String,
"name": fields.String,
"max_active_requests": fields.Raw(),
"description": fields.String(attribute="desc_or_prompt"),
"mode": fields.String(attribute="mode_compatible_with_agent"),
"icon_type": fields.String,
"icon": fields.String,
"icon_background": fields.String,
"icon_url": AppIconUrlField,
"model_config": fields.Nested(model_config_partial_model, attribute="app_model_config", allow_null=True),
"workflow": fields.Nested(workflow_partial_model, allow_null=True),
"use_icon_as_answer_icon": fields.Boolean,
"created_by": fields.String,
"created_at": TimestampField,
"updated_by": fields.String,
"updated_at": TimestampField,
"tags": fields.List(fields.Nested(tag_model)),
"access_mode": fields.String,
"create_user_name": fields.String,
"author_name": fields.String,
"has_draft_trigger": fields.Boolean,
},
)
app_detail_model = console_ns.model(
"AppDetail",
{
"id": fields.String,
"name": fields.String,
"description": fields.String,
"mode": fields.String(attribute="mode_compatible_with_agent"),
"icon": fields.String,
"icon_background": fields.String,
"enable_site": fields.Boolean,
"enable_api": fields.Boolean,
"model_config": fields.Nested(model_config_model, attribute="app_model_config", allow_null=True),
"workflow": fields.Nested(workflow_partial_model, allow_null=True),
"tracing": fields.Raw,
"use_icon_as_answer_icon": fields.Boolean,
"created_by": fields.String,
"created_at": TimestampField,
"updated_by": fields.String,
"updated_at": TimestampField,
"access_mode": fields.String,
"tags": fields.List(fields.Nested(tag_model)),
},
)
app_detail_with_site_model = console_ns.model(
"AppDetailWithSite",
{
"id": fields.String,
"name": fields.String,
"description": fields.String,
"mode": fields.String(attribute="mode_compatible_with_agent"),
"icon_type": fields.String,
"icon": fields.String,
"icon_background": fields.String,
"icon_url": AppIconUrlField,
"enable_site": fields.Boolean,
"enable_api": fields.Boolean,
"model_config": fields.Nested(model_config_model, attribute="app_model_config", allow_null=True),
"workflow": fields.Nested(workflow_partial_model, allow_null=True),
"api_base_url": fields.String,
"use_icon_as_answer_icon": fields.Boolean,
"max_active_requests": fields.Integer,
"created_by": fields.String,
"created_at": TimestampField,
"updated_by": fields.String,
"updated_at": TimestampField,
"deleted_tools": fields.List(fields.Nested(deleted_tool_model)),
"access_mode": fields.String,
"tags": fields.List(fields.Nested(tag_model)),
"site": fields.Nested(site_model),
},
)
app_pagination_model = console_ns.model(
"AppPagination",
{
"page": fields.Integer,
"limit": fields.Integer(attribute="per_page"),
"total": fields.Integer,
"has_more": fields.Boolean(attribute="has_next"),
"data": fields.List(fields.Nested(app_partial_model), attribute="items"),
},
)
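Editor's note: registering these response shapes as namespace models instead of bare field dicts gives Swagger real definitions rather than opaque dict types, while marshal keeps working on ORM-style objects. A rough sketch of how the app_pagination_model-style attribute remapping behaves, using stand-in objects (everything here is illustrative):

from types import SimpleNamespace
from flask_restx import fields, marshal

page_model = {
    "page": fields.Integer,
    "limit": fields.Integer(attribute="per_page"),    # rename via attribute=
    "has_more": fields.Boolean(attribute="has_next"),
    "data": fields.List(fields.Nested({"id": fields.String}), attribute="items"),
}

pagination = SimpleNamespace(page=1, per_page=20, has_next=False,
                             items=[SimpleNamespace(id="app-1")])
print(marshal(pagination, page_model))
# -> {'page': 1, 'limit': 20, 'has_more': False, 'data': [{'id': 'app-1'}]}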
@console_ns.route("/apps")
class AppListApi(Resource):
@api.doc("list_apps")
@api.doc(description="Get list of applications with pagination and filtering")
@api.expect(
api.parser()
.add_argument("page", type=int, location="args", help="Page number (1-99999)", default=1)
.add_argument("limit", type=int, location="args", help="Page size (1-100)", default=20)
.add_argument(
"mode",
type=str,
location="args",
choices=["completion", "chat", "advanced-chat", "workflow", "agent-chat", "channel", "all"],
default="all",
help="App mode filter",
)
.add_argument("name", type=str, location="args", help="Filter by app name")
.add_argument("tag_ids", type=str, location="args", help="Comma-separated tag IDs")
.add_argument("is_created_by_me", type=bool, location="args", help="Filter by creator")
)
@api.response(200, "Success", app_pagination_fields)
@console_ns.doc("list_apps")
@console_ns.doc(description="Get list of applications with pagination and filtering")
@console_ns.expect(console_ns.models[AppListQuery.__name__])
@console_ns.response(200, "Success", app_pagination_model)
@setup_required
@login_required
@account_initialization_required
@enterprise_license_required
def get(self):
"""Get app list"""
current_user, current_tenant_id = current_account_with_tenant()
def uuid_list(value):
try:
return [str(uuid.UUID(v)) for v in value.split(",")]
except ValueError:
abort(400, message="Invalid UUID format in tag_ids.")
parser = reqparse.RequestParser()
parser.add_argument("page", type=inputs.int_range(1, 99999), required=False, default=1, location="args")
parser.add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
parser.add_argument(
"mode",
type=str,
choices=[
"completion",
"chat",
"advanced-chat",
"workflow",
"agent-chat",
"channel",
"all",
],
default="all",
location="args",
required=False,
)
parser.add_argument("name", type=str, location="args", required=False)
parser.add_argument("tag_ids", type=uuid_list, location="args", required=False)
parser.add_argument("is_created_by_me", type=inputs.boolean, location="args", required=False)
args = parser.parse_args()
args = AppListQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
args_dict = args.model_dump()
# get app list
app_service = AppService()
app_pagination = app_service.get_paginate_apps(current_user.id, current_user.current_tenant_id, args)
app_pagination = app_service.get_paginate_apps(current_user.id, current_tenant_id, args_dict)
if not app_pagination:
return {"data": [], "total": 0, "page": 1, "limit": 20, "has_more": False}
@@ -104,71 +303,75 @@ class AppListApi(Resource):
if str(app.id) in res:
app.access_mode = res[str(app.id)].access_mode
return marshal(app_pagination, app_pagination_fields), 200
workflow_capable_app_ids = [
str(app.id) for app in app_pagination.items if app.mode in {"workflow", "advanced-chat"}
]
draft_trigger_app_ids: set[str] = set()
if workflow_capable_app_ids:
draft_workflows = (
db.session.execute(
select(Workflow).where(
Workflow.version == Workflow.VERSION_DRAFT,
Workflow.app_id.in_(workflow_capable_app_ids),
)
)
.scalars()
.all()
)
trigger_node_types = {
NodeType.TRIGGER_WEBHOOK,
NodeType.TRIGGER_SCHEDULE,
NodeType.TRIGGER_PLUGIN,
}
for workflow in draft_workflows:
try:
for _, node_data in workflow.walk_nodes():
if node_data.get("type") in trigger_node_types:
draft_trigger_app_ids.add(str(workflow.app_id))
break
except Exception:
continue
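Editor's note: the block above issues one query for all draft workflows of workflow-capable apps, then walks each graph looking for trigger-type nodes; exceptions are swallowed per workflow so a single corrupt draft cannot break the listing. A simplified, self-contained sketch of the same scan over plain dicts; walk_nodes and the trigger type strings are stand-ins for the Dify internals, assumed to yield (node_id, node_data) pairs with a string "type":

TRIGGER_TYPES = {"trigger-webhook", "trigger-schedule", "trigger-plugin"}  # assumed values

def walk_nodes(graph: dict):
    # Stand-in for Workflow.walk_nodes(): yield (node_id, node_data) pairs.
    for node in graph.get("nodes", []):
        yield node.get("id"), node.get("data", {})

drafts = {
    "app-1": {"nodes": [{"id": "n1", "data": {"type": "trigger-webhook"}}]},
    "app-2": {"nodes": [{"id": "n1", "data": {"type": "llm"}}]},
}

flagged: set[str] = set()
for app_id, graph in drafts.items():
    try:
        if any(data.get("type") in TRIGGER_TYPES for _, data in walk_nodes(graph)):
            flagged.add(app_id)
    except Exception:
        continue  # one broken draft must not fail the whole app list

print(flagged)  # {'app-1'}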
@api.doc("create_app")
@api.doc(description="Create a new application")
@api.expect(
api.model(
"CreateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"mode": fields.String(required=True, enum=ALLOW_CREATE_APP_MODES, description="App mode"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App created successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
for app in app_pagination.items:
app.has_draft_trigger = str(app.id) in draft_trigger_app_ids
return marshal(app_pagination, app_pagination_model), 200
@console_ns.doc("create_app")
@console_ns.doc(description="Create a new application")
@console_ns.expect(console_ns.models[CreateAppPayload.__name__])
@console_ns.response(201, "App created successfully", app_detail_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_detail_fields)
@marshal_with(app_detail_model)
@cloud_edition_billing_resource_check("apps")
@edit_permission_required
def post(self):
"""Create app"""
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, location="json")
parser.add_argument("description", type=validate_description_length, location="json")
parser.add_argument("mode", type=str, choices=ALLOW_CREATE_APP_MODES, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
if "mode" not in args or args["mode"] is None:
raise BadRequest("mode is required")
current_user, current_tenant_id = current_account_with_tenant()
args = CreateAppPayload.model_validate(console_ns.payload)
app_service = AppService()
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
if current_user.current_tenant_id is None:
raise ValueError("current_user.current_tenant_id cannot be None")
app = app_service.create_app(current_user.current_tenant_id, args, current_user)
app = app_service.create_app(current_tenant_id, args.model_dump(), current_user)
return app, 201
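Editor's note: the create handler now validates the JSON body in one step. console_ns.payload returns the parsed request JSON and CreateAppPayload.model_validate enforces the required fields and the mode enum, so the old manual "mode is required" check and the is_editor guard collapse into the model and the @edit_permission_required decorator. A sketch of the failure modes, assuming pydantic v2:

from typing import Literal
from pydantic import BaseModel, Field, ValidationError

class CreateAppPayload(BaseModel):
    name: str = Field(..., min_length=1)
    mode: Literal["chat", "agent-chat", "advanced-chat", "workflow", "completion"]

try:
    CreateAppPayload.model_validate({"name": "demo"})   # mode missing
except ValidationError as e:
    print(e.error_count(), "error(s)")                  # 1 error(s)

ok = CreateAppPayload.model_validate({"name": "demo", "mode": "workflow"})
print(ok.model_dump())  # plain dict, ready for AppService.create_app(...)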
@console_ns.route("/apps/<uuid:app_id>")
class AppApi(Resource):
@api.doc("get_app_detail")
@api.doc(description="Get application details")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Success", app_detail_fields_with_site)
@console_ns.doc("get_app_detail")
@console_ns.doc(description="Get application details")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Success", app_detail_with_site_model)
@setup_required
@login_required
@account_initialization_required
@enterprise_license_required
@get_app_model
@marshal_with(app_detail_fields_with_site)
@marshal_with(app_detail_with_site_model)
def get(self, app_model):
"""Get app detail"""
app_service = AppService()
@@ -181,79 +384,50 @@ class AppApi(Resource):
return app_model
@api.doc("update_app")
@api.doc(description="Update application details")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"UpdateAppRequest",
{
"name": fields.String(required=True, description="App name"),
"description": fields.String(description="App description (max 400 chars)"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
"use_icon_as_answer_icon": fields.Boolean(description="Use icon as answer icon"),
"max_active_requests": fields.Integer(description="Maximum active requests"),
},
)
)
@api.response(200, "App updated successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@api.response(400, "Invalid request parameters")
@console_ns.doc("update_app")
@console_ns.doc(description="Update application details")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[UpdateAppPayload.__name__])
@console_ns.response(200, "App updated successfully", app_detail_with_site_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields_with_site)
@edit_permission_required
@marshal_with(app_detail_with_site_model)
def put(self, app_model):
"""Update app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, nullable=False, location="json")
parser.add_argument("description", type=validate_description_length, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
parser.add_argument("use_icon_as_answer_icon", type=bool, location="json")
parser.add_argument("max_active_requests", type=int, location="json")
args = parser.parse_args()
args = UpdateAppPayload.model_validate(console_ns.payload)
app_service = AppService()
# Construct ArgsDict from parsed arguments
from services.app_service import AppService as AppServiceType
args_dict: AppServiceType.ArgsDict = {
"name": args["name"],
"description": args.get("description", ""),
"icon_type": args.get("icon_type", ""),
"icon": args.get("icon", ""),
"icon_background": args.get("icon_background", ""),
"use_icon_as_answer_icon": args.get("use_icon_as_answer_icon", False),
"max_active_requests": args.get("max_active_requests", 0),
args_dict: AppService.ArgsDict = {
"name": args.name,
"description": args.description or "",
"icon_type": args.icon_type or "",
"icon": args.icon or "",
"icon_background": args.icon_background or "",
"use_icon_as_answer_icon": args.use_icon_as_answer_icon or False,
"max_active_requests": args.max_active_requests or 0,
}
app_model = app_service.update_app(app_model, args_dict)
return app_model
@api.doc("delete_app")
@api.doc(description="Delete application")
@api.doc(params={"app_id": "Application ID"})
@api.response(204, "App deleted successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("delete_app")
@console_ns.doc(description="Delete application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(204, "App deleted successfully")
@console_ns.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def delete(self, app_model):
"""Delete app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
app_service = AppService()
app_service.delete_app(app_model)
@@ -262,55 +436,37 @@ class AppApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/copy")
class AppCopyApi(Resource):
@api.doc("copy_app")
@api.doc(description="Create a copy of an existing application")
@api.doc(params={"app_id": "Application ID to copy"})
@api.expect(
api.model(
"CopyAppRequest",
{
"name": fields.String(description="Name for the copied app"),
"description": fields.String(description="Description for the copied app"),
"icon_type": fields.String(description="Icon type"),
"icon": fields.String(description="Icon"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(201, "App copied successfully", app_detail_fields_with_site)
@api.response(403, "Insufficient permissions")
@console_ns.doc("copy_app")
@console_ns.doc(description="Create a copy of an existing application")
@console_ns.doc(params={"app_id": "Application ID to copy"})
@console_ns.expect(console_ns.models[CopyAppPayload.__name__])
@console_ns.response(201, "App copied successfully", app_detail_with_site_model)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields_with_site)
@edit_permission_required
@marshal_with(app_detail_with_site_model)
def post(self, app_model):
"""Copy app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
current_user, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=validate_description_length, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
args = CopyAppPayload.model_validate(console_ns.payload or {})
with Session(db.engine) as session:
import_service = AppDslService(session)
yaml_content = import_service.export_dsl(app_model=app_model, include_secret=True)
account = cast(Account, current_user)
result = import_service.import_app(
account=account,
import_mode=ImportMode.YAML_CONTENT.value,
account=current_user,
import_mode=ImportMode.YAML_CONTENT,
yaml_content=yaml_content,
name=args.get("name"),
description=args.get("description"),
icon_type=args.get("icon_type"),
icon=args.get("icon"),
icon_background=args.get("icon_background"),
name=args.name,
description=args.description,
icon_type=args.icon_type,
icon=args.icon,
icon_background=args.icon_background,
)
session.commit()
@@ -322,178 +478,131 @@ class AppCopyApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/export")
class AppExportApi(Resource):
@api.doc("export_app")
@api.doc(description="Export application configuration as DSL")
@api.doc(params={"app_id": "Application ID to export"})
@api.expect(
api.parser()
.add_argument("include_secret", type=bool, location="args", default=False, help="Include secrets in export")
.add_argument("workflow_id", type=str, location="args", help="Specific workflow ID to export")
)
@api.response(
@console_ns.doc("export_app")
@console_ns.doc(description="Export application configuration as DSL")
@console_ns.doc(params={"app_id": "Application ID to export"})
@console_ns.expect(console_ns.models[AppExportQuery.__name__])
@console_ns.response(
200,
"App exported successfully",
api.model("AppExportResponse", {"data": fields.String(description="DSL export data")}),
console_ns.model("AppExportResponse", {"data": fields.String(description="DSL export data")}),
)
@api.response(403, "Insufficient permissions")
@console_ns.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, app_model):
"""Export app"""
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
# Add include_secret params
parser = reqparse.RequestParser()
parser.add_argument("include_secret", type=inputs.boolean, default=False, location="args")
parser.add_argument("workflow_id", type=str, location="args")
args = parser.parse_args()
args = AppExportQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
return {
"data": AppDslService.export_dsl(
app_model=app_model, include_secret=args["include_secret"], workflow_id=args.get("workflow_id")
app_model=app_model,
include_secret=args.include_secret,
workflow_id=args.workflow_id,
)
}
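Editor's note: the export handler feeds raw query strings straight into AppExportQuery; pydantic v2 coerces "true", "1", "false", "0" and similar into bool, which is the job inputs.boolean used to do for reqparse. A quick illustration with the same model shape as the diff:

from pydantic import BaseModel, Field

class AppExportQuery(BaseModel):
    include_secret: bool = Field(default=False)
    workflow_id: str | None = Field(default=None)

print(AppExportQuery.model_validate({"include_secret": "true"}).include_secret)  # True
print(AppExportQuery.model_validate({"include_secret": "0"}).include_secret)     # False
print(AppExportQuery.model_validate({}).include_secret)                          # False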
@console_ns.route("/apps/<uuid:app_id>/name")
class AppNameApi(Resource):
@api.doc("check_app_name")
@api.doc(description="Check if app name is available")
@api.doc(params={"app_id": "Application ID"})
@api.expect(api.parser().add_argument("name", type=str, required=True, location="args", help="Name to check"))
@api.response(200, "Name availability checked")
@console_ns.doc("check_app_name")
@console_ns.doc(description="Check if app name is available")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[AppNamePayload.__name__])
@console_ns.response(200, "Name availability checked")
@setup_required
@login_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields)
@marshal_with(app_detail_model)
@edit_permission_required
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, location="json")
args = parser.parse_args()
args = AppNamePayload.model_validate(console_ns.payload)
app_service = AppService()
app_model = app_service.update_app_name(app_model, args["name"])
app_model = app_service.update_app_name(app_model, args.name)
return app_model
@console_ns.route("/apps/<uuid:app_id>/icon")
class AppIconApi(Resource):
@api.doc("update_app_icon")
@api.doc(description="Update application icon")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppIconRequest",
{
"icon": fields.String(required=True, description="Icon data"),
"icon_type": fields.String(description="Icon type"),
"icon_background": fields.String(description="Icon background color"),
},
)
)
@api.response(200, "Icon updated successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("update_app_icon")
@console_ns.doc(description="Update application icon")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[AppIconPayload.__name__])
@console_ns.response(200, "Icon updated successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields)
@marshal_with(app_detail_model)
@edit_permission_required
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
args = parser.parse_args()
args = AppIconPayload.model_validate(console_ns.payload or {})
app_service = AppService()
app_model = app_service.update_app_icon(app_model, args.get("icon") or "", args.get("icon_background") or "")
app_model = app_service.update_app_icon(app_model, args.icon or "", args.icon_background or "")
return app_model
@console_ns.route("/apps/<uuid:app_id>/site-enable")
class AppSiteStatus(Resource):
@api.doc("update_app_site_status")
@api.doc(description="Enable or disable app site")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppSiteStatusRequest", {"enable_site": fields.Boolean(required=True, description="Enable or disable site")}
)
)
@api.response(200, "Site status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@console_ns.doc("update_app_site_status")
@console_ns.doc(description="Enable or disable app site")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[AppSiteStatusPayload.__name__])
@console_ns.response(200, "Site status updated successfully", app_detail_model)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields)
@marshal_with(app_detail_model)
@edit_permission_required
def post(self, app_model):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("enable_site", type=bool, required=True, location="json")
args = parser.parse_args()
args = AppSiteStatusPayload.model_validate(console_ns.payload)
app_service = AppService()
app_model = app_service.update_app_site_status(app_model, args["enable_site"])
app_model = app_service.update_app_site_status(app_model, args.enable_site)
return app_model
@console_ns.route("/apps/<uuid:app_id>/api-enable")
class AppApiStatus(Resource):
@api.doc("update_app_api_status")
@api.doc(description="Enable or disable app API")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppApiStatusRequest", {"enable_api": fields.Boolean(required=True, description="Enable or disable API")}
)
)
@api.response(200, "API status updated successfully", app_detail_fields)
@api.response(403, "Insufficient permissions")
@console_ns.doc("update_app_api_status")
@console_ns.doc(description="Enable or disable app API")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[AppApiStatusPayload.__name__])
@console_ns.response(200, "API status updated successfully", app_detail_model)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@is_admin_or_owner_required
@account_initialization_required
@get_app_model
@marshal_with(app_detail_fields)
@marshal_with(app_detail_model)
def post(self, app_model):
# The role of the current user in the ta table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("enable_api", type=bool, required=True, location="json")
args = parser.parse_args()
args = AppApiStatusPayload.model_validate(console_ns.payload)
app_service = AppService()
app_model = app_service.update_app_api_status(app_model, args["enable_api"])
app_model = app_service.update_app_api_status(app_model, args.enable_api)
return app_model
@console_ns.route("/apps/<uuid:app_id>/trace")
class AppTraceApi(Resource):
@api.doc("get_app_trace")
@api.doc(description="Get app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Trace configuration retrieved successfully")
@console_ns.doc("get_app_trace")
@console_ns.doc(description="Get app tracing configuration")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Trace configuration retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@@ -503,36 +612,24 @@ class AppTraceApi(Resource):
return app_trace_config
@api.doc("update_app_trace")
@api.doc(description="Update app tracing configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"AppTraceRequest",
{
"enabled": fields.Boolean(required=True, description="Enable or disable tracing"),
"tracing_provider": fields.String(required=True, description="Tracing provider"),
},
)
)
@api.response(200, "Trace configuration updated successfully")
@api.response(403, "Insufficient permissions")
@console_ns.doc("update_app_trace")
@console_ns.doc(description="Update app tracing configuration")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[AppTracePayload.__name__])
@console_ns.response(200, "Trace configuration updated successfully")
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def post(self, app_id):
# add app trace
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("enabled", type=bool, required=True, location="json")
parser.add_argument("tracing_provider", type=str, required=True, location="json")
args = parser.parse_args()
args = AppTracePayload.model_validate(console_ns.payload)
OpsTraceManager.update_app_tracing_config(
app_id=app_id,
enabled=args["enabled"],
tracing_provider=args["tracing_provider"],
enabled=args.enabled,
tracing_provider=args.tracing_provider,
)
return {"result": "success"}

View File

@@ -1,20 +1,20 @@
from typing import cast
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx import Resource, fields, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
edit_permission_required,
setup_required,
)
from extensions.ext_database import db
from fields.app_fields import app_import_check_dependencies_fields, app_import_fields
from libs.login import login_required
from models import Account
from fields.app_fields import (
app_import_check_dependencies_fields,
app_import_fields,
leaked_dependency_fields,
)
from libs.login import current_account_with_tenant, login_required
from models.model import App
from services.app_dsl_service import AppDslService, ImportStatus
from services.enterprise.enterprise_service import EnterpriseService
@@ -22,36 +22,52 @@ from services.feature_service import FeatureService
from .. import console_ns
# Register models for flask_restx to avoid dict type issues in Swagger
# Register base model first
leaked_dependency_model = console_ns.model("LeakedDependency", leaked_dependency_fields)
app_import_model = console_ns.model("AppImport", app_import_fields)
# For nested models, need to replace nested dict with registered model
app_import_check_dependencies_fields_copy = app_import_check_dependencies_fields.copy()
app_import_check_dependencies_fields_copy["leaked_dependencies"] = fields.List(fields.Nested(leaked_dependency_model))
app_import_check_dependencies_model = console_ns.model(
"AppImportCheckDependencies", app_import_check_dependencies_fields_copy
)
parser = (
reqparse.RequestParser()
.add_argument("mode", type=str, required=True, location="json")
.add_argument("yaml_content", type=str, location="json")
.add_argument("yaml_url", type=str, location="json")
.add_argument("name", type=str, location="json")
.add_argument("description", type=str, location="json")
.add_argument("icon_type", type=str, location="json")
.add_argument("icon", type=str, location="json")
.add_argument("icon_background", type=str, location="json")
.add_argument("app_id", type=str, location="json")
)
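Editor's note: hoisting the RequestParser to module scope builds it once at import time instead of once per request, and since add_argument returns the parser itself, the chained form is equivalent to the old statement-per-line form. The parser holds no request state (parse_args reads from the active request context), so sharing one instance is safe. Minimal sketch:

from flask_restx import reqparse

# Built once; parse_args() reads from the current request context on each call.
parser = (
    reqparse.RequestParser()
    .add_argument("mode", type=str, required=True, location="json")
    .add_argument("name", type=str, location="json")
)

# Inside a resource method:
#     args = parser.parse_args()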
@console_ns.route("/apps/imports")
class AppImportApi(Resource):
@console_ns.expect(parser)
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_import_fields)
@marshal_with(app_import_model)
@cloud_edition_billing_resource_check("apps")
@edit_permission_required
def post(self):
# Check user role first
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("mode", type=str, required=True, location="json")
parser.add_argument("yaml_content", type=str, location="json")
parser.add_argument("yaml_url", type=str, location="json")
parser.add_argument("name", type=str, location="json")
parser.add_argument("description", type=str, location="json")
parser.add_argument("icon_type", type=str, location="json")
parser.add_argument("icon", type=str, location="json")
parser.add_argument("icon_background", type=str, location="json")
parser.add_argument("app_id", type=str, location="json")
current_user, _ = current_account_with_tenant()
args = parser.parse_args()
# Create service with session
with Session(db.engine) as session:
import_service = AppDslService(session)
# Import app
account = cast(Account, current_user)
account = current_user
result = import_service.import_app(
account=account,
import_mode=args["mode"],
@@ -70,9 +86,9 @@ class AppImportApi(Resource):
EnterpriseService.WebAppAuth.update_app_access_mode(result.app_id, "private")
# Return appropriate status code based on result
status = result.status
if status == ImportStatus.FAILED.value:
if status == ImportStatus.FAILED:
return result.model_dump(mode="json"), 400
elif status == ImportStatus.PENDING.value:
elif status == ImportStatus.PENDING:
return result.model_dump(mode="json"), 202
return result.model_dump(mode="json"), 200
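Editor's note: the status checks drop .value because result.status is now the ImportStatus enum member itself; comparing against ImportStatus.FAILED directly is then correct, and comparing against the bare string would only keep working if the enum subclassed str. A small sketch of the status-to-HTTP mapping (enum values assumed):

from enum import Enum

class ImportStatus(Enum):
    COMPLETED = "completed"
    PENDING = "pending"
    FAILED = "failed"

def http_status(status: ImportStatus) -> int:
    if status == ImportStatus.FAILED:
        return 400
    if status == ImportStatus.PENDING:
        return 202  # accepted, awaiting confirmation
    return 200

print(http_status(ImportStatus.PENDING))  # 202
print(ImportStatus.FAILED == "failed")    # False: a plain Enum never equals its value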
@@ -82,22 +98,22 @@ class AppImportConfirmApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_import_fields)
@marshal_with(app_import_model)
@edit_permission_required
def post(self, import_id):
# Check user role first
if not current_user.is_editor:
raise Forbidden()
current_user, _ = current_account_with_tenant()
# Create service with session
with Session(db.engine) as session:
import_service = AppDslService(session)
# Confirm import
account = cast(Account, current_user)
account = current_user
result = import_service.confirm_import(import_id=import_id, account=account)
session.commit()
# Return appropriate status code based on result
if result.status == ImportStatus.FAILED.value:
if result.status == ImportStatus.FAILED:
return result.model_dump(mode="json"), 400
return result.model_dump(mode="json"), 200
@@ -108,11 +124,9 @@ class AppImportCheckDependenciesApi(Resource):
@login_required
@get_app_model
@account_initialization_required
@marshal_with(app_import_check_dependencies_fields)
@marshal_with(app_import_check_dependencies_model)
@edit_permission_required
def get(self, app_model: App):
if not current_user.is_editor:
raise Forbidden()
with Session(db.engine) as session:
import_service = AppDslService(session)
result = import_service.check_dependencies(app_model=app_model)

View File

@@ -5,7 +5,7 @@ from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import InternalServerError
import services
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.error import (
AppUnavailableError,
AudioTooLargeError,
@@ -36,16 +36,16 @@ logger = logging.getLogger(__name__)
@console_ns.route("/apps/<uuid:app_id>/audio-to-text")
class ChatMessageAudioApi(Resource):
@api.doc("chat_message_audio_transcript")
@api.doc(description="Transcript audio to text for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.response(
@console_ns.doc("chat_message_audio_transcript")
@console_ns.doc(description="Transcript audio to text for chat messages")
@console_ns.doc(params={"app_id": "App ID"})
@console_ns.response(
200,
"Audio transcription successful",
api.model("AudioTranscriptResponse", {"text": fields.String(description="Transcribed text from audio")}),
console_ns.model("AudioTranscriptResponse", {"text": fields.String(description="Transcribed text from audio")}),
)
@api.response(400, "Bad request - No audio uploaded or unsupported type")
@api.response(413, "Audio file too large")
@console_ns.response(400, "Bad request - No audio uploaded or unsupported type")
@console_ns.response(413, "Audio file too large")
@setup_required
@login_required
@account_initialization_required
@@ -89,11 +89,11 @@ class ChatMessageAudioApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/text-to-audio")
class ChatMessageTextApi(Resource):
@api.doc("chat_message_text_to_speech")
@api.doc(description="Convert text to speech for chat messages")
@api.doc(params={"app_id": "App ID"})
@api.expect(
api.model(
@console_ns.doc("chat_message_text_to_speech")
@console_ns.doc(description="Convert text to speech for chat messages")
@console_ns.doc(params={"app_id": "App ID"})
@console_ns.expect(
console_ns.model(
"TextToSpeechRequest",
{
"message_id": fields.String(description="Message ID"),
@@ -103,19 +103,21 @@ class ChatMessageTextApi(Resource):
},
)
)
@api.response(200, "Text to speech conversion successful")
@api.response(400, "Bad request - Invalid parameters")
@console_ns.response(200, "Text to speech conversion successful")
@console_ns.response(400, "Bad request - Invalid parameters")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def post(self, app_model: App):
try:
parser = reqparse.RequestParser()
parser.add_argument("message_id", type=str, location="json")
parser.add_argument("text", type=str, location="json")
parser.add_argument("voice", type=str, location="json")
parser.add_argument("streaming", type=bool, location="json")
parser = (
reqparse.RequestParser()
.add_argument("message_id", type=str, location="json")
.add_argument("text", type=str, location="json")
.add_argument("voice", type=str, location="json")
.add_argument("streaming", type=bool, location="json")
)
args = parser.parse_args()
message_id = args.get("message_id", None)
@@ -154,20 +156,23 @@ class ChatMessageTextApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/text-to-audio/voices")
class TextModesApi(Resource):
@api.doc("get_text_to_speech_voices")
@api.doc(description="Get available TTS voices for a specific language")
@api.doc(params={"app_id": "App ID"})
@api.expect(api.parser().add_argument("language", type=str, required=True, location="args", help="Language code"))
@api.response(200, "TTS voices retrieved successfully", fields.List(fields.Raw(description="Available voices")))
@api.response(400, "Invalid language parameter")
@console_ns.doc("get_text_to_speech_voices")
@console_ns.doc(description="Get available TTS voices for a specific language")
@console_ns.doc(params={"app_id": "App ID"})
@console_ns.expect(
console_ns.parser().add_argument("language", type=str, required=True, location="args", help="Language code")
)
@console_ns.response(
200, "TTS voices retrieved successfully", fields.List(fields.Raw(description="Available voices"))
)
@console_ns.response(400, "Invalid language parameter")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
try:
parser = reqparse.RequestParser()
parser.add_argument("language", type=str, required=True, location="args")
parser = reqparse.RequestParser().add_argument("language", type=str, required=True, location="args")
args = parser.parse_args()
response = AudioService.transcript_tts_voices(

View File

@@ -1,11 +1,13 @@
import logging
from typing import Any, Literal
from flask import request
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
from flask_restx import Resource
from pydantic import BaseModel, Field, field_validator
from werkzeug.exceptions import InternalServerError, NotFound
import services
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.error import (
AppUnavailableError,
CompletionRequestError,
@@ -15,9 +17,8 @@ from controllers.console.app.error import (
ProviderQuotaExceededError,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from controllers.web.error import InvokeRateLimitError as InvokeRateLimitHttpError
from core.app.apps.base_app_queue_manager import AppQueueManager
from core.app.entities.app_invoke_entities import InvokeFrom
from core.errors.error import (
ModelCurrentlyNotSupportError,
@@ -32,48 +33,66 @@ from libs.login import current_user, login_required
from models import Account
from models.model import AppMode
from services.app_generate_service import AppGenerateService
from services.app_task_service import AppTaskService
from services.errors.llm import InvokeRateLimitError
logger = logging.getLogger(__name__)
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class BaseMessagePayload(BaseModel):
inputs: dict[str, Any]
model_config_data: dict[str, Any] = Field(..., alias="model_config")
files: list[Any] | None = Field(default=None, description="Uploaded files")
response_mode: Literal["blocking", "streaming"] = Field(default="blocking", description="Response mode")
retriever_from: str = Field(default="dev", description="Retriever source")
class CompletionMessagePayload(BaseMessagePayload):
query: str = Field(default="", description="Query text")
class ChatMessagePayload(BaseMessagePayload):
query: str = Field(..., description="User query")
conversation_id: str | None = Field(default=None, description="Conversation ID")
parent_message_id: str | None = Field(default=None, description="Parent message ID")
@field_validator("conversation_id", "parent_message_id")
@classmethod
def validate_uuid(cls, value: str | None) -> str | None:
if value is None:
return value
return uuid_value(value)
console_ns.schema_model(
CompletionMessagePayload.__name__,
CompletionMessagePayload.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
console_ns.schema_model(
ChatMessagePayload.__name__, ChatMessagePayload.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0)
)
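Editor's note: model_config is a reserved name in Pydantic v2 (it configures the model class itself), so the payload stores the field as model_config_data with alias="model_config". Validation reads the wire name, and model_dump(by_alias=True) writes it back, so downstream code still sees args["model_config"]. Sketch, assuming pydantic v2 (the protected_namespaces override below is only to silence the "model_" naming warning in this standalone example):

from typing import Any
from pydantic import BaseModel, ConfigDict, Field

class Payload(BaseModel):
    model_config = ConfigDict(protected_namespaces=())  # illustrative; not shown in the diff
    inputs: dict[str, Any]
    model_config_data: dict[str, Any] = Field(..., alias="model_config")

p = Payload.model_validate({"inputs": {}, "model_config": {"model": {"name": "gpt-x"}}})
print(p.model_config_data)                 # {'model': {'name': 'gpt-x'}}
print(p.model_dump(by_alias=True).keys())  # dict_keys(['inputs', 'model_config'])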
# define completion message api for user
@console_ns.route("/apps/<uuid:app_id>/completion-messages")
class CompletionMessageApi(Resource):
@api.doc("create_completion_message")
@api.doc(description="Generate completion message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"CompletionMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(description="Query text", default=""),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Completion generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App not found")
@console_ns.doc("create_completion_message")
@console_ns.doc(description="Generate completion message for debugging")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[CompletionMessagePayload.__name__])
@console_ns.response(200, "Completion generated successfully")
@console_ns.response(400, "Invalid request parameters")
@console_ns.response(404, "App not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def post(self, app_model):
parser = reqparse.RequestParser()
parser.add_argument("inputs", type=dict, required=True, location="json")
parser.add_argument("query", type=str, location="json", default="")
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("model_config", type=dict, required=True, location="json")
parser.add_argument("response_mode", type=str, choices=["blocking", "streaming"], location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
args = parser.parse_args()
args_model = CompletionMessagePayload.model_validate(console_ns.payload)
args = args_model.model_dump(exclude_none=True, by_alias=True)
streaming = args["response_mode"] != "blocking"
streaming = args_model.response_mode != "blocking"
args["auto_generate_name"] = False
try:
@@ -108,10 +127,10 @@ class CompletionMessageApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/completion-messages/<string:task_id>/stop")
class CompletionMessageStopApi(Resource):
@api.doc("stop_completion_message")
@api.doc(description="Stop a running completion message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@console_ns.doc("stop_completion_message")
@console_ns.doc(description="Stop a running completion message generation")
@console_ns.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@console_ns.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@@ -119,57 +138,36 @@ class CompletionMessageStopApi(Resource):
def post(self, app_model, task_id):
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
AppTaskService.stop_task(
task_id=task_id,
invoke_from=InvokeFrom.DEBUGGER,
user_id=current_user.id,
app_mode=AppMode.value_of(app_model.mode),
)
return {"result": "success"}, 200
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageApi(Resource):
@api.doc("create_chat_message")
@api.doc(description="Generate chat message for debugging")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"ChatMessageRequest",
{
"inputs": fields.Raw(required=True, description="Input variables"),
"query": fields.String(required=True, description="User query"),
"files": fields.List(fields.Raw(), description="Uploaded files"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"conversation_id": fields.String(description="Conversation ID"),
"parent_message_id": fields.String(description="Parent message ID"),
"response_mode": fields.String(enum=["blocking", "streaming"], description="Response mode"),
"retriever_from": fields.String(default="dev", description="Retriever source"),
},
)
)
@api.response(200, "Chat message generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(404, "App or conversation not found")
@console_ns.doc("create_chat_message")
@console_ns.doc(description="Generate chat message for debugging")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[ChatMessagePayload.__name__])
@console_ns.response(200, "Chat message generated successfully")
@console_ns.response(400, "Invalid request parameters")
@console_ns.response(404, "App or conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT])
@edit_permission_required
def post(self, app_model):
if not isinstance(current_user, Account):
raise Forbidden()
args_model = ChatMessagePayload.model_validate(console_ns.payload)
args = args_model.model_dump(exclude_none=True, by_alias=True)
if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("inputs", type=dict, required=True, location="json")
parser.add_argument("query", type=str, required=True, location="json")
parser.add_argument("files", type=list, required=False, location="json")
parser.add_argument("model_config", type=dict, required=True, location="json")
parser.add_argument("conversation_id", type=uuid_value, location="json")
parser.add_argument("parent_message_id", type=uuid_value, required=False, location="json")
parser.add_argument("response_mode", type=str, choices=["blocking", "streaming"], location="json")
parser.add_argument("retriever_from", type=str, required=False, default="dev", location="json")
args = parser.parse_args()
streaming = args["response_mode"] != "blocking"
streaming = args_model.response_mode != "blocking"
args["auto_generate_name"] = False
external_trace_id = get_external_trace_id(request)
@@ -210,10 +208,10 @@ class ChatMessageApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/chat-messages/<string:task_id>/stop")
class ChatMessageStopApi(Resource):
@api.doc("stop_chat_message")
@api.doc(description="Stop a running chat message generation")
@api.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@api.response(200, "Task stopped successfully")
@console_ns.doc("stop_chat_message")
@console_ns.doc(description="Stop a running chat message generation")
@console_ns.doc(params={"app_id": "Application ID", "task_id": "Task ID to stop"})
@console_ns.response(200, "Task stopped successfully")
@setup_required
@login_required
@account_initialization_required
@@ -221,6 +219,12 @@ class ChatMessageStopApi(Resource):
def post(self, app_model, task_id):
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
AppQueueManager.set_stop_flag(task_id, InvokeFrom.DEBUGGER, current_user.id)
AppTaskService.stop_task(
task_id=task_id,
invoke_from=InvokeFrom.DEBUGGER,
user_id=current_user.id,
app_mode=AppMode.value_of(app_model.mode),
)
return {"result": "success"}, 200

View File

@@ -1,116 +1,376 @@
from datetime import datetime
from typing import Literal
import pytz # pip install pytz
import sqlalchemy as sa
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
from flask import abort, request
from flask_restx import Resource, fields, marshal_with
from pydantic import BaseModel, Field, field_validator
from sqlalchemy import func, or_
from sqlalchemy.orm import joinedload
from werkzeug.exceptions import Forbidden, NotFound
from werkzeug.exceptions import NotFound
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
from extensions.ext_database import db
from fields.conversation_fields import (
conversation_detail_fields,
conversation_message_detail_fields,
conversation_pagination_fields,
conversation_with_summary_pagination_fields,
)
from libs.datetime_utils import naive_utc_now
from libs.helper import DatetimeString
from libs.login import login_required
from models import Account, Conversation, EndUser, Message, MessageAnnotation
from fields.conversation_fields import MessageTextField
from fields.raws import FilesContainedField
from libs.datetime_utils import naive_utc_now, parse_time_range
from libs.helper import TimestampField
from libs.login import current_account_with_tenant, login_required
from models import Conversation, EndUser, Message, MessageAnnotation
from models.model import AppMode
from services.conversation_service import ConversationService
from services.errors.conversation import ConversationNotExistsError
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class BaseConversationQuery(BaseModel):
keyword: str | None = Field(default=None, description="Search keyword")
start: str | None = Field(default=None, description="Start date (YYYY-MM-DD HH:MM)")
end: str | None = Field(default=None, description="End date (YYYY-MM-DD HH:MM)")
annotation_status: Literal["annotated", "not_annotated", "all"] = Field(
default="all", description="Annotation status filter"
)
page: int = Field(default=1, ge=1, le=99999, description="Page number")
limit: int = Field(default=20, ge=1, le=100, description="Page size (1-100)")
@field_validator("start", "end", mode="before")
@classmethod
def blank_to_none(cls, value: str | None) -> str | None:
if value == "":
return None
return value
class CompletionConversationQuery(BaseConversationQuery):
pass
class ChatConversationQuery(BaseConversationQuery):
sort_by: Literal["created_at", "-created_at", "updated_at", "-updated_at"] = Field(
default="-updated_at", description="Sort field and direction"
)
console_ns.schema_model(
CompletionConversationQuery.__name__,
CompletionConversationQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
console_ns.schema_model(
ChatConversationQuery.__name__,
ChatConversationQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
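Editor's note: the conversation endpoints move their filters into Pydantic query models as well. blank_to_none runs with mode="before" so an empty ?start= arrives as None instead of failing later date parsing, and Literal pins annotation_status and sort_by to the same choice lists reqparse used to enforce. Illustrative check:

from typing import Literal
from pydantic import BaseModel, Field, field_validator

class ChatConversationQuery(BaseModel):
    start: str | None = Field(default=None)
    sort_by: Literal["created_at", "-created_at", "updated_at", "-updated_at"] = "-updated_at"

    @field_validator("start", mode="before")
    @classmethod
    def blank_to_none(cls, v):
        return None if v == "" else v

q = ChatConversationQuery.model_validate({"start": "", "sort_by": "-created_at"})
print(q.start, q.sort_by)  # None -created_at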
# Register models for flask_restx to avoid dict type issues in Swagger
# Register in dependency order: base models first, then dependent models
# Base models
simple_account_model = console_ns.model(
"SimpleAccount",
{
"id": fields.String,
"name": fields.String,
"email": fields.String,
},
)
feedback_stat_model = console_ns.model(
"FeedbackStat",
{
"like": fields.Integer,
"dislike": fields.Integer,
},
)
status_count_model = console_ns.model(
"StatusCount",
{
"success": fields.Integer,
"failed": fields.Integer,
"partial_success": fields.Integer,
},
)
message_file_model = console_ns.model(
"MessageFile",
{
"id": fields.String,
"filename": fields.String,
"type": fields.String,
"url": fields.String,
"mime_type": fields.String,
"size": fields.Integer,
"transfer_method": fields.String,
"belongs_to": fields.String(default="user"),
"upload_file_id": fields.String(default=None),
},
)
agent_thought_model = console_ns.model(
"AgentThought",
{
"id": fields.String,
"chain_id": fields.String,
"message_id": fields.String,
"position": fields.Integer,
"thought": fields.String,
"tool": fields.String,
"tool_labels": fields.Raw,
"tool_input": fields.String,
"created_at": TimestampField,
"observation": fields.String,
"files": fields.List(fields.String),
},
)
simple_model_config_model = console_ns.model(
"SimpleModelConfig",
{
"model": fields.Raw(attribute="model_dict"),
"pre_prompt": fields.String,
},
)
model_config_model = console_ns.model(
"ModelConfig",
{
"opening_statement": fields.String,
"suggested_questions": fields.Raw,
"model": fields.Raw,
"user_input_form": fields.Raw,
"pre_prompt": fields.String,
"agent_mode": fields.Raw,
},
)
# Models that depend on simple_account_model
feedback_model = console_ns.model(
"Feedback",
{
"rating": fields.String,
"content": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account": fields.Nested(simple_account_model, allow_null=True),
},
)
annotation_model = console_ns.model(
"Annotation",
{
"id": fields.String,
"question": fields.String,
"content": fields.String,
"account": fields.Nested(simple_account_model, allow_null=True),
"created_at": TimestampField,
},
)
annotation_hit_history_model = console_ns.model(
"AnnotationHitHistory",
{
"annotation_id": fields.String(attribute="id"),
"annotation_create_account": fields.Nested(simple_account_model, allow_null=True),
"created_at": TimestampField,
},
)
# Simple message detail model
simple_message_detail_model = console_ns.model(
"SimpleMessageDetail",
{
"inputs": FilesContainedField,
"query": fields.String,
"message": MessageTextField,
"answer": fields.String,
},
)
# Message detail model that depends on multiple models
message_detail_model = console_ns.model(
"MessageDetail",
{
"id": fields.String,
"conversation_id": fields.String,
"inputs": FilesContainedField,
"query": fields.String,
"message": fields.Raw,
"message_tokens": fields.Integer,
"answer": fields.String(attribute="re_sign_file_url_answer"),
"answer_tokens": fields.Integer,
"provider_response_latency": fields.Float,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account_id": fields.String,
"feedbacks": fields.List(fields.Nested(feedback_model)),
"workflow_run_id": fields.String,
"annotation": fields.Nested(annotation_model, allow_null=True),
"annotation_hit_history": fields.Nested(annotation_hit_history_model, allow_null=True),
"created_at": TimestampField,
"agent_thoughts": fields.List(fields.Nested(agent_thought_model)),
"message_files": fields.List(fields.Nested(message_file_model)),
"metadata": fields.Raw(attribute="message_metadata_dict"),
"status": fields.String,
"error": fields.String,
"parent_message_id": fields.String,
},
)
# Conversation models
conversation_fields_model = console_ns.model(
"Conversation",
{
"id": fields.String,
"status": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_end_user_session_id": fields.String(),
"from_account_id": fields.String,
"from_account_name": fields.String,
"read_at": TimestampField,
"created_at": TimestampField,
"updated_at": TimestampField,
"annotation": fields.Nested(annotation_model, allow_null=True),
"model_config": fields.Nested(simple_model_config_model),
"user_feedback_stats": fields.Nested(feedback_stat_model),
"admin_feedback_stats": fields.Nested(feedback_stat_model),
"message": fields.Nested(simple_message_detail_model, attribute="first_message"),
},
)
conversation_pagination_model = console_ns.model(
"ConversationPagination",
{
"page": fields.Integer,
"limit": fields.Integer(attribute="per_page"),
"total": fields.Integer,
"has_more": fields.Boolean(attribute="has_next"),
"data": fields.List(fields.Nested(conversation_fields_model), attribute="items"),
},
)
conversation_message_detail_model = console_ns.model(
"ConversationMessageDetail",
{
"id": fields.String,
"status": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account_id": fields.String,
"created_at": TimestampField,
"model_config": fields.Nested(model_config_model),
"message": fields.Nested(message_detail_model, attribute="first_message"),
},
)
conversation_with_summary_model = console_ns.model(
"ConversationWithSummary",
{
"id": fields.String,
"status": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_end_user_session_id": fields.String,
"from_account_id": fields.String,
"from_account_name": fields.String,
"name": fields.String,
"summary": fields.String(attribute="summary_or_query"),
"read_at": TimestampField,
"created_at": TimestampField,
"updated_at": TimestampField,
"annotated": fields.Boolean,
"model_config": fields.Nested(simple_model_config_model),
"message_count": fields.Integer,
"user_feedback_stats": fields.Nested(feedback_stat_model),
"admin_feedback_stats": fields.Nested(feedback_stat_model),
"status_count": fields.Nested(status_count_model),
},
)
conversation_with_summary_pagination_model = console_ns.model(
"ConversationWithSummaryPagination",
{
"page": fields.Integer,
"limit": fields.Integer(attribute="per_page"),
"total": fields.Integer,
"has_more": fields.Boolean(attribute="has_next"),
"data": fields.List(fields.Nested(conversation_with_summary_model), attribute="items"),
},
)
conversation_detail_model = console_ns.model(
"ConversationDetail",
{
"id": fields.String,
"status": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account_id": fields.String,
"created_at": TimestampField,
"updated_at": TimestampField,
"annotated": fields.Boolean,
"introduction": fields.String,
"model_config": fields.Nested(model_config_model),
"message_count": fields.Integer,
"user_feedback_stats": fields.Nested(feedback_stat_model),
"admin_feedback_stats": fields.Nested(feedback_stat_model),
},
)
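# Illustrative sketch (not part of this diff): why the models above are
# registered in dependency order. fields.Nested() takes the already-built
# model object, so a base model must exist before any model that embeds it.
# "Author" and "Book" are hypothetical names.
#
#     from flask_restx import Namespace, fields
#
#     ns = Namespace("example")
#     author = ns.model("Author", {"name": fields.String})  # base model first
#     book = ns.model(
#         "Book",
#         {"title": fields.String, "author": fields.Nested(author, allow_null=True)},
#     )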
@console_ns.route("/apps/<uuid:app_id>/completion-conversations")
class CompletionConversationApi(Resource):
@api.doc("list_completion_conversations")
@api.doc(description="Get completion conversations with pagination and filtering")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
)
@api.response(200, "Success", conversation_pagination_fields)
@api.response(403, "Insufficient permissions")
@console_ns.doc("list_completion_conversations")
@console_ns.doc(description="Get completion conversations with pagination and filtering")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[CompletionConversationQuery.__name__])
@console_ns.response(200, "Success", conversation_pagination_model)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@marshal_with(conversation_pagination_fields)
@marshal_with(conversation_pagination_model)
@edit_permission_required
def get(self, app_model):
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
parser.add_argument("page", type=int_range(1, 99999), default=1, location="args")
parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
args = parser.parse_args()
current_user, _ = current_account_with_tenant()
args = CompletionConversationQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
query = sa.select(Conversation).where(
Conversation.app_id == app_model.id, Conversation.mode == "completion", Conversation.is_deleted.is_(False)
)
if args["keyword"]:
if args.keyword:
query = query.join(Message, Message.conversation_id == Conversation.id).where(
or_(
Message.query.ilike(f"%{args['keyword']}%"),
Message.answer.ilike(f"%{args['keyword']}%"),
Message.query.ilike(f"%{args.keyword}%"),
Message.answer.ilike(f"%{args.keyword}%"),
)
)
account = current_user
assert account.timezone is not None
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
query = query.where(Conversation.created_at >= start_datetime_utc)
if end_datetime_utc:
end_datetime_utc = end_datetime_utc.replace(second=59)
query = query.where(Conversation.created_at < end_datetime_utc)
# FIXME, the type ignore in this file
if args["annotation_status"] == "annotated":
if args.annotation_status == "annotated":
query = query.options(joinedload(Conversation.message_annotations)).join( # type: ignore
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
)
elif args["annotation_status"] == "not_annotated":
elif args.annotation_status == "not_annotated":
query = (
query.outerjoin(MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id)
.group_by(Conversation.id)
@@ -119,49 +379,46 @@ class CompletionConversationApi(Resource):
query = query.order_by(Conversation.created_at.desc())
conversations = db.paginate(query, page=args["page"], per_page=args["limit"], error_out=False)
conversations = db.paginate(query, page=args.page, per_page=args.limit, error_out=False)
return conversations
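# Illustrative request against the endpoint above (base URL, app id and token
# are placeholders); it exercises the query parameters validated by
# CompletionConversationQuery and returns the ConversationPagination payload.
#
#     import requests
#
#     resp = requests.get(
#         "https://example.com/console/api/apps/<app_id>/completion-conversations",
#         params={"keyword": "refund", "start": "2025-12-01 00:00",
#                 "end": "2025-12-04 23:59", "annotation_status": "all",
#                 "page": 1, "limit": 20},
#         headers={"Authorization": "Bearer <console-token>"},
#     )
#     print(resp.json()["total"])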
@console_ns.route("/apps/<uuid:app_id>/completion-conversations/<uuid:conversation_id>")
class CompletionConversationDetailApi(Resource):
@api.doc("get_completion_conversation")
@api.doc(description="Get completion conversation details with messages")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_message_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@console_ns.doc("get_completion_conversation")
@console_ns.doc(description="Get completion conversation details with messages")
@console_ns.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@console_ns.response(200, "Success", conversation_message_detail_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@marshal_with(conversation_message_detail_fields)
@marshal_with(conversation_message_detail_model)
@edit_permission_required
def get(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
return _get_conversation(app_model, conversation_id)
@api.doc("delete_completion_conversation")
@api.doc(description="Delete a completion conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@console_ns.doc("delete_completion_conversation")
@console_ns.doc(description="Delete a completion conversation")
@console_ns.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@console_ns.response(204, "Conversation deleted successfully")
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
@edit_permission_required
def delete(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
current_user, _ = current_account_with_tenant()
conversation_id = str(conversation_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
@@ -171,63 +428,21 @@ class CompletionConversationDetailApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/chat-conversations")
class ChatConversationApi(Resource):
@api.doc("list_chat_conversations")
@api.doc(description="Get chat conversations with pagination, filtering and summary")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("keyword", type=str, location="args", help="Search keyword")
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
.add_argument(
"annotation_status",
type=str,
location="args",
choices=["annotated", "not_annotated", "all"],
default="all",
help="Annotation status filter",
)
.add_argument("message_count_gte", type=int, location="args", help="Minimum message count")
.add_argument("page", type=int, location="args", default=1, help="Page number")
.add_argument("limit", type=int, location="args", default=20, help="Page size (1-100)")
.add_argument(
"sort_by",
type=str,
location="args",
choices=["created_at", "-created_at", "updated_at", "-updated_at"],
default="-updated_at",
help="Sort field and direction",
)
)
@api.response(200, "Success", conversation_with_summary_pagination_fields)
@api.response(403, "Insufficient permissions")
@console_ns.doc("list_chat_conversations")
@console_ns.doc(description="Get chat conversations with pagination, filtering and summary")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[ChatConversationQuery.__name__])
@console_ns.response(200, "Success", conversation_with_summary_pagination_model)
@console_ns.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@marshal_with(conversation_with_summary_pagination_fields)
@marshal_with(conversation_with_summary_pagination_model)
@edit_permission_required
def get(self, app_model):
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument(
"annotation_status", type=str, choices=["annotated", "not_annotated", "all"], default="all", location="args"
)
parser.add_argument("message_count_gte", type=int_range(1, 99999), required=False, location="args")
parser.add_argument("page", type=int_range(1, 99999), required=False, default=1, location="args")
parser.add_argument("limit", type=int_range(1, 100), required=False, default=20, location="args")
parser.add_argument(
"sort_by",
type=str,
choices=["created_at", "-created_at", "updated_at", "-updated_at"],
required=False,
default="-updated_at",
location="args",
)
args = parser.parse_args()
current_user, _ = current_account_with_tenant()
args = ChatConversationQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
subquery = (
db.session.query(
@@ -239,8 +454,8 @@ class ChatConversationApi(Resource):
query = sa.select(Conversation).where(Conversation.app_id == app_model.id, Conversation.is_deleted.is_(False))
if args["keyword"]:
keyword_filter = f"%{args['keyword']}%"
if args.keyword:
keyword_filter = f"%{args.keyword}%"
query = (
query.join(
Message,
@@ -260,58 +475,43 @@ class ChatConversationApi(Resource):
)
account = current_user
assert account.timezone is not None
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
match args.sort_by:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at >= start_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at >= start_datetime_utc)
if end_datetime_utc:
end_datetime_utc = end_datetime_utc.replace(second=59)
match args.sort_by:
case "updated_at" | "-updated_at":
query = query.where(Conversation.updated_at <= end_datetime_utc)
case "created_at" | "-created_at" | _:
query = query.where(Conversation.created_at <= end_datetime_utc)
if args["annotation_status"] == "annotated":
if args.annotation_status == "annotated":
query = query.options(joinedload(Conversation.message_annotations)).join( # type: ignore
MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id
)
elif args["annotation_status"] == "not_annotated":
elif args.annotation_status == "not_annotated":
query = (
query.outerjoin(MessageAnnotation, MessageAnnotation.conversation_id == Conversation.id)
.group_by(Conversation.id)
.having(func.count(MessageAnnotation.id) == 0)
)
if args["message_count_gte"] and args["message_count_gte"] >= 1:
query = (
query.options(joinedload(Conversation.messages)) # type: ignore
.join(Message, Message.conversation_id == Conversation.id)
.group_by(Conversation.id)
.having(func.count(Message.id) >= args["message_count_gte"])
)
if app_model.mode == AppMode.ADVANCED_CHAT:
query = query.where(Conversation.invoke_from != InvokeFrom.DEBUGGER)
match args.sort_by:
case "created_at":
query = query.order_by(Conversation.created_at.asc())
case "-created_at":
@@ -323,49 +523,46 @@ class ChatConversationApi(Resource):
case _:
query = query.order_by(Conversation.created_at.desc())
conversations = db.paginate(query, page=args["page"], per_page=args["limit"], error_out=False)
conversations = db.paginate(query, page=args.page, per_page=args.limit, error_out=False)
return conversations
@console_ns.route("/apps/<uuid:app_id>/chat-conversations/<uuid:conversation_id>")
class ChatConversationDetailApi(Resource):
@api.doc("get_chat_conversation")
@api.doc(description="Get chat conversation details")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(200, "Success", conversation_detail_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@console_ns.doc("get_chat_conversation")
@console_ns.doc(description="Get chat conversation details")
@console_ns.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@console_ns.response(200, "Success", conversation_detail_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@marshal_with(conversation_detail_fields)
@marshal_with(conversation_detail_model)
@edit_permission_required
def get(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
conversation_id = str(conversation_id)
return _get_conversation(app_model, conversation_id)
@api.doc("delete_chat_conversation")
@api.doc(description="Delete a chat conversation")
@api.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@api.response(204, "Conversation deleted successfully")
@api.response(403, "Insufficient permissions")
@api.response(404, "Conversation not found")
@console_ns.doc("delete_chat_conversation")
@console_ns.doc(description="Delete a chat conversation")
@console_ns.doc(params={"app_id": "Application ID", "conversation_id": "Conversation ID"})
@console_ns.response(204, "Conversation deleted successfully")
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Conversation not found")
@setup_required
@login_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@account_initialization_required
@edit_permission_required
def delete(self, app_model, conversation_id):
if not current_user.is_editor:
raise Forbidden()
current_user, _ = current_account_with_tenant()
conversation_id = str(conversation_id)
try:
if not isinstance(current_user, Account):
raise ValueError("current_user must be an Account instance")
ConversationService.delete(app_model, conversation_id, current_user)
except ConversationNotExistsError:
raise NotFound("Conversation Not Exists.")
@@ -374,6 +571,7 @@ class ChatConversationDetailApi(Resource):
def _get_conversation(app_model, conversation_id):
current_user, _ = current_account_with_tenant()
conversation = (
db.session.query(Conversation)
.where(Conversation.id == conversation_id, Conversation.app_id == app_model.id)


@@ -1,47 +1,68 @@
from flask import request
from flask_restx import Resource, fields, marshal_with
from pydantic import BaseModel, Field
from sqlalchemy import select
from sqlalchemy.orm import Session
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from fields.conversation_variable_fields import (
conversation_variable_fields,
paginated_conversation_variable_fields,
)
from libs.login import login_required
from models import ConversationVariable
from models.model import AppMode
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class ConversationVariablesQuery(BaseModel):
conversation_id: str = Field(..., description="Conversation ID to filter variables")
console_ns.schema_model(
ConversationVariablesQuery.__name__,
ConversationVariablesQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
# Register models for flask_restx to avoid dict type issues in Swagger
# Register base model first
conversation_variable_model = console_ns.model("ConversationVariable", conversation_variable_fields)
# For nested models, need to replace nested dict with registered model
paginated_conversation_variable_fields_copy = paginated_conversation_variable_fields.copy()
paginated_conversation_variable_fields_copy["data"] = fields.List(
fields.Nested(conversation_variable_model), attribute="data"
)
paginated_conversation_variable_model = console_ns.model(
"PaginatedConversationVariable", paginated_conversation_variable_fields_copy
)
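# Illustrative sketch (not part of this diff) of the copy-and-replace step
# above: nesting a plain fields dict would leave the "data" items untyped in
# Swagger, so the entry is swapped for fields.Nested(<registered model>)
# before calling ns.model(). "Item" is a hypothetical model.
#
#     from flask_restx import Namespace, fields
#
#     ns = Namespace("example")
#     item_model = ns.model("Item", {"id": fields.String})
#     paginated = {"page": fields.Integer,
#                  "data": fields.List(fields.Nested(item_model))}
#     paginated_model = ns.model("PaginatedItem", paginated)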
@console_ns.route("/apps/<uuid:app_id>/conversation-variables")
class ConversationVariablesApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
"conversation_id", type=str, location="args", help="Conversation ID to filter variables"
)
)
@api.response(200, "Conversation variables retrieved successfully", paginated_conversation_variable_fields)
@console_ns.doc("get_conversation_variables")
@console_ns.doc(description="Get conversation variables for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[ConversationVariablesQuery.__name__])
@console_ns.response(200, "Conversation variables retrieved successfully", paginated_conversation_variable_model)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.ADVANCED_CHAT)
@marshal_with(paginated_conversation_variable_fields)
@marshal_with(paginated_conversation_variable_model)
def get(self, app_model):
parser = reqparse.RequestParser()
parser.add_argument("conversation_id", type=str, location="args")
args = parser.parse_args()
args = ConversationVariablesQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
stmt = (
select(ConversationVariable)
.where(ConversationVariable.app_id == app_model.id)
.order_by(ConversationVariable.created_at)
)
if args["conversation_id"]:
stmt = stmt.where(ConversationVariable.conversation_id == args["conversation_id"])
else:
raise ValueError("conversation_id is required")
stmt = stmt.where(ConversationVariable.conversation_id == args.conversation_id)
# NOTE: This is a temporary solution to avoid performance issues.
page = 1


@@ -1,9 +1,10 @@
from collections.abc import Sequence
from typing import Any
from flask_restx import Resource
from pydantic import BaseModel, Field
from controllers.console import console_ns
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
@@ -12,50 +13,80 @@ from controllers.console.app.error import (
)
from controllers.console.wraps import account_initialization_required, setup_required
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.helper.code_executor.code_node_provider import CodeNodeProvider
from core.helper.code_executor.javascript.javascript_code_provider import JavascriptCodeProvider
from core.helper.code_executor.python3.python3_code_provider import Python3CodeProvider
from core.llm_generator.llm_generator import LLMGenerator
from core.model_runtime.errors.invoke import InvokeError
from extensions.ext_database import db
from libs.login import current_account_with_tenant, login_required
from models import App
from services.workflow_service import WorkflowService
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class RuleGeneratePayload(BaseModel):
instruction: str = Field(..., description="Rule generation instruction")
model_config_data: dict[str, Any] = Field(..., alias="model_config", description="Model configuration")
no_variable: bool = Field(default=False, description="Whether to exclude variables")
class RuleCodeGeneratePayload(RuleGeneratePayload):
code_language: str = Field(default="javascript", description="Programming language for code generation")
class RuleStructuredOutputPayload(BaseModel):
instruction: str = Field(..., description="Structured output generation instruction")
model_config_data: dict[str, Any] = Field(..., alias="model_config", description="Model configuration")
class InstructionGeneratePayload(BaseModel):
flow_id: str = Field(..., description="Workflow/Flow ID")
node_id: str = Field(default="", description="Node ID for workflow context")
current: str = Field(default="", description="Current instruction text")
language: str = Field(default="javascript", description="Programming language (javascript/python)")
instruction: str = Field(..., description="Instruction for generation")
model_config_data: dict[str, Any] = Field(..., alias="model_config", description="Model configuration")
ideal_output: str = Field(default="", description="Expected ideal output")
class InstructionTemplatePayload(BaseModel):
type: str = Field(..., description="Instruction template type")
def reg(cls: type[BaseModel]):
console_ns.schema_model(cls.__name__, cls.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
reg(RuleGeneratePayload)
reg(RuleCodeGeneratePayload)
reg(RuleStructuredOutputPayload)
reg(InstructionGeneratePayload)
reg(InstructionTemplatePayload)
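# Illustrative use (not part of this diff) of the payload classes registered
# via reg() above; the sample payload values are hypothetical. model_validate
# applies the "model_config" alias and the no_variable default exactly as the
# endpoints below rely on.
#
#     payload = {
#         "instruction": "Summarize the user's question",
#         "model_config": {"provider": "openai", "name": "gpt-4o-mini"},
#     }
#     args = RuleGeneratePayload.model_validate(payload)
#     assert args.model_config_data["provider"] == "openai"
#     assert args.no_variable is False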
@console_ns.route("/rule-generate")
class RuleGenerateApi(Resource):
@api.doc("generate_rule_config")
@api.doc(description="Generate rule configuration using LLM")
@api.expect(
api.model(
"RuleGenerateRequest",
{
"instruction": fields.String(required=True, description="Rule generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
},
)
)
@api.response(200, "Rule configuration generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@console_ns.doc("generate_rule_config")
@console_ns.doc(description="Generate rule configuration using LLM")
@console_ns.expect(console_ns.models[RuleGeneratePayload.__name__])
@console_ns.response(200, "Rule configuration generated successfully")
@console_ns.response(400, "Invalid request parameters")
@console_ns.response(402, "Provider quota exceeded")
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
parser.add_argument("no_variable", type=bool, required=True, default=False, location="json")
args = parser.parse_args()
args = RuleGeneratePayload.model_validate(console_ns.payload)
_, current_tenant_id = current_account_with_tenant()
account = current_user
try:
rules = LLMGenerator.generate_rule_config(
tenant_id=account.current_tenant_id,
instruction=args["instruction"],
model_config=args["model_config"],
no_variable=args["no_variable"],
tenant_id=current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
no_variable=args.no_variable,
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -71,42 +102,25 @@ class RuleGenerateApi(Resource):
@console_ns.route("/rule-code-generate")
class RuleCodeGenerateApi(Resource):
@api.doc("generate_rule_code")
@api.doc(description="Generate code rules using LLM")
@api.expect(
api.model(
"RuleCodeGenerateRequest",
{
"instruction": fields.String(required=True, description="Code generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"no_variable": fields.Boolean(required=True, default=False, description="Whether to exclude variables"),
"code_language": fields.String(
default="javascript", description="Programming language for code generation"
),
},
)
)
@api.response(200, "Code rules generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@console_ns.doc("generate_rule_code")
@console_ns.doc(description="Generate code rules using LLM")
@console_ns.expect(console_ns.models[RuleCodeGeneratePayload.__name__])
@console_ns.response(200, "Code rules generated successfully")
@console_ns.response(400, "Invalid request parameters")
@console_ns.response(402, "Provider quota exceeded")
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
parser.add_argument("no_variable", type=bool, required=True, default=False, location="json")
parser.add_argument("code_language", type=str, required=False, default="javascript", location="json")
args = parser.parse_args()
args = RuleCodeGeneratePayload.model_validate(console_ns.payload)
_, current_tenant_id = current_account_with_tenant()
account = current_user
try:
code_result = LLMGenerator.generate_code(
tenant_id=account.current_tenant_id,
instruction=args["instruction"],
model_config=args["model_config"],
code_language=args["code_language"],
tenant_id=current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
code_language=args.code_language,
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -122,35 +136,24 @@ class RuleCodeGenerateApi(Resource):
@console_ns.route("/rule-structured-output-generate")
class RuleStructuredOutputGenerateApi(Resource):
@api.doc("generate_structured_output")
@api.doc(description="Generate structured output rules using LLM")
@api.expect(
api.model(
"StructuredOutputGenerateRequest",
{
"instruction": fields.String(required=True, description="Structured output generation instruction"),
"model_config": fields.Raw(required=True, description="Model configuration"),
},
)
)
@api.response(200, "Structured output generated successfully")
@api.response(400, "Invalid request parameters")
@api.response(402, "Provider quota exceeded")
@console_ns.doc("generate_structured_output")
@console_ns.doc(description="Generate structured output rules using LLM")
@console_ns.expect(console_ns.models[RuleStructuredOutputPayload.__name__])
@console_ns.response(200, "Structured output generated successfully")
@console_ns.response(400, "Invalid request parameters")
@console_ns.response(402, "Provider quota exceeded")
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
args = parser.parse_args()
args = RuleStructuredOutputPayload.model_validate(console_ns.payload)
_, current_tenant_id = current_account_with_tenant()
account = current_user
try:
structured_output = LLMGenerator.generate_structured_output(
tenant_id=account.current_tenant_id,
instruction=args["instruction"],
model_config=args["model_config"],
tenant_id=current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
@@ -166,101 +169,79 @@ class RuleStructuredOutputGenerateApi(Resource):
@console_ns.route("/instruction-generate")
class InstructionGenerateApi(Resource):
@api.doc("generate_instruction")
@api.doc(description="Generate instruction for workflow nodes or general use")
@api.expect(
api.model(
"InstructionGenerateRequest",
{
"flow_id": fields.String(required=True, description="Workflow/Flow ID"),
"node_id": fields.String(description="Node ID for workflow context"),
"current": fields.String(description="Current instruction text"),
"language": fields.String(default="javascript", description="Programming language (javascript/python)"),
"instruction": fields.String(required=True, description="Instruction for generation"),
"model_config": fields.Raw(required=True, description="Model configuration"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Instruction generated successfully")
@api.response(400, "Invalid request parameters or flow/workflow not found")
@api.response(402, "Provider quota exceeded")
@console_ns.doc("generate_instruction")
@console_ns.doc(description="Generate instruction for workflow nodes or general use")
@console_ns.expect(console_ns.models[InstructionGeneratePayload.__name__])
@console_ns.response(200, "Instruction generated successfully")
@console_ns.response(400, "Invalid request parameters or flow/workflow not found")
@console_ns.response(402, "Provider quota exceeded")
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("flow_id", type=str, required=True, default="", location="json")
parser.add_argument("node_id", type=str, required=False, default="", location="json")
parser.add_argument("current", type=str, required=False, default="", location="json")
parser.add_argument("language", type=str, required=False, default="javascript", location="json")
parser.add_argument("instruction", type=str, required=True, nullable=False, location="json")
parser.add_argument("model_config", type=dict, required=True, nullable=False, location="json")
parser.add_argument("ideal_output", type=str, required=False, default="", location="json")
args = parser.parse_args()
code_template = (
Python3CodeProvider.get_default_code()
if args["language"] == "python"
else (JavascriptCodeProvider.get_default_code())
if args["language"] == "javascript"
else ""
args = InstructionGeneratePayload.model_validate(console_ns.payload)
_, current_tenant_id = current_account_with_tenant()
providers: list[type[CodeNodeProvider]] = [Python3CodeProvider, JavascriptCodeProvider]
code_provider: type[CodeNodeProvider] | None = next(
(p for p in providers if p.is_accept_language(args.language)), None
)
code_template = code_provider.get_default_code() if code_provider else ""
try:
# Generate from nothing for a workflow node
if (args["current"] == code_template or args["current"] == "") and args["node_id"] != "":
app = db.session.query(App).where(App.id == args["flow_id"]).first()
if (args.current in (code_template, "")) and args.node_id != "":
app = db.session.query(App).where(App.id == args.flow_id).first()
if not app:
return {"error": f"app {args['flow_id']} not found"}, 400
return {"error": f"app {args.flow_id} not found"}, 400
workflow = WorkflowService().get_draft_workflow(app_model=app)
if not workflow:
return {"error": f"workflow {args['flow_id']} not found"}, 400
return {"error": f"workflow {args.flow_id} not found"}, 400
nodes: Sequence = workflow.graph_dict["nodes"]
node = [node for node in nodes if node["id"] == args["node_id"]]
node = [node for node in nodes if node["id"] == args.node_id]
if len(node) == 0:
return {"error": f"node {args['node_id']} not found"}, 400
return {"error": f"node {args.node_id} not found"}, 400
node_type = node[0]["data"]["type"]
match node_type:
case "llm":
return LLMGenerator.generate_rule_config(
current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
no_variable=True,
)
case "agent":
return LLMGenerator.generate_rule_config(
current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
no_variable=True,
)
case "code":
return LLMGenerator.generate_code(
tenant_id=current_tenant_id,
instruction=args.instruction,
model_config=args.model_config_data,
code_language=args.language,
)
case _:
return {"error": f"invalid node type: {node_type}"}
if args["node_id"] == "" and args["current"] != "": # For legacy app without a workflow
if args.node_id == "" and args.current != "": # For legacy app without a workflow
return LLMGenerator.instruction_modify_legacy(
tenant_id=current_tenant_id,
flow_id=args.flow_id,
current=args.current,
instruction=args.instruction,
model_config=args.model_config_data,
ideal_output=args.ideal_output,
)
if args["node_id"] != "" and args["current"] != "": # For workflow node
if args.node_id != "" and args.current != "": # For workflow node
return LLMGenerator.instruction_modify_workflow(
tenant_id=current_tenant_id,
flow_id=args.flow_id,
node_id=args.node_id,
current=args.current,
instruction=args.instruction,
model_config=args.model_config_data,
ideal_output=args.ideal_output,
workflow_service=WorkflowService(),
)
return {"error": "incompatible parameters"}, 400
@@ -276,27 +257,17 @@ class InstructionGenerateApi(Resource):
@console_ns.route("/instruction-generate/template")
class InstructionGenerationTemplateApi(Resource):
@api.doc("get_instruction_template")
@api.doc(description="Get instruction generation template")
@api.expect(
api.model(
"InstructionTemplateRequest",
{
"instruction": fields.String(required=True, description="Template instruction"),
"ideal_output": fields.String(description="Expected ideal output"),
},
)
)
@api.response(200, "Template retrieved successfully")
@api.response(400, "Invalid request parameters")
@console_ns.doc("get_instruction_template")
@console_ns.doc(description="Get instruction generation template")
@console_ns.expect(console_ns.models[InstructionTemplatePayload.__name__])
@console_ns.response(200, "Template retrieved successfully")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("type", type=str, required=True, default=False, location="json")
args = parser.parse_args()
match args["type"]:
args = InstructionTemplatePayload.model_validate(console_ns.payload)
match args.type:
case "prompt":
from core.llm_generator.prompts import INSTRUCTION_GENERATE_TEMPLATE_PROMPT
@@ -306,4 +277,4 @@ class InstructionGenerationTemplateApi(Resource):
return {"data": INSTRUCTION_GENERATE_TEMPLATE_CODE}
case _:
raise ValueError(f"Invalid type: {args['type']}")
raise ValueError(f"Invalid type: {args.type}")


@@ -1,18 +1,20 @@
import json
from enum import StrEnum
from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from extensions.ext_database import db
from fields.app_fields import app_server_fields
from libs.login import current_account_with_tenant, login_required
from models.model import AppMCPServer
# Register model for flask_restx to avoid dict type issues in Swagger
app_server_model = console_ns.model("AppServer", app_server_fields)
class AppMCPServerStatus(StrEnum):
ACTIVE = "active"
@@ -21,24 +23,24 @@ class AppMCPServerStatus(StrEnum):
@console_ns.route("/apps/<uuid:app_id>/server")
class AppMCPServerController(Resource):
@api.doc("get_app_mcp_server")
@api.doc(description="Get MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "MCP server configuration retrieved successfully", app_server_fields)
@setup_required
@console_ns.doc("get_app_mcp_server")
@console_ns.doc(description="Get MCP server configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "MCP server configuration retrieved successfully", app_server_model)
@login_required
@account_initialization_required
@setup_required
@get_app_model
@marshal_with(app_server_fields)
@marshal_with(app_server_model)
def get(self, app_model):
server = db.session.query(AppMCPServer).where(AppMCPServer.app_id == app_model.id).first()
return server
@api.doc("create_app_mcp_server")
@api.doc(description="Create MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("create_app_mcp_server")
@console_ns.doc(description="Create MCP server configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"MCPServerCreateRequest",
{
"description": fields.String(description="Server description"),
@@ -46,19 +48,21 @@ class AppMCPServerController(Resource):
},
)
)
@api.response(201, "MCP server configuration created successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@console_ns.response(201, "MCP server configuration created successfully", app_server_model)
@console_ns.response(403, "Insufficient permissions")
@account_initialization_required
@get_app_model
@marshal_with(app_server_fields)
@login_required
@setup_required
@marshal_with(app_server_model)
@edit_permission_required
def post(self, app_model):
if not current_user.is_editor:
raise NotFound()
parser = reqparse.RequestParser()
parser.add_argument("description", type=str, required=False, location="json")
parser.add_argument("parameters", type=dict, required=True, location="json")
_, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument("description", type=str, required=False, location="json")
.add_argument("parameters", type=dict, required=True, location="json")
)
args = parser.parse_args()
description = args.get("description")
@@ -71,18 +75,18 @@ class AppMCPServerController(Resource):
parameters=json.dumps(args["parameters"], ensure_ascii=False),
status=AppMCPServerStatus.ACTIVE,
app_id=app_model.id,
tenant_id=current_tenant_id,
server_code=AppMCPServer.generate_server_code(16),
)
db.session.add(server)
db.session.commit()
return server
@api.doc("update_app_mcp_server")
@api.doc(description="Update MCP server configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("update_app_mcp_server")
@console_ns.doc(description="Update MCP server configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"MCPServerUpdateRequest",
{
"id": fields.String(required=True, description="Server ID"),
@@ -92,22 +96,23 @@ class AppMCPServerController(Resource):
},
)
)
@api.response(200, "MCP server configuration updated successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@setup_required
@login_required
@account_initialization_required
@console_ns.response(200, "MCP server configuration updated successfully", app_server_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Server not found")
@get_app_model
@marshal_with(app_server_fields)
@login_required
@setup_required
@account_initialization_required
@marshal_with(app_server_model)
@edit_permission_required
def put(self, app_model):
if not current_user.is_editor:
raise NotFound()
parser = reqparse.RequestParser()
parser.add_argument("id", type=str, required=True, location="json")
parser.add_argument("description", type=str, required=False, location="json")
parser.add_argument("parameters", type=dict, required=True, location="json")
parser.add_argument("status", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("id", type=str, required=True, location="json")
.add_argument("description", type=str, required=False, location="json")
.add_argument("parameters", type=dict, required=True, location="json")
.add_argument("status", type=str, required=False, location="json")
)
args = parser.parse_args()
server = db.session.query(AppMCPServer).where(AppMCPServer.id == args["id"]).first()
if not server:
@@ -132,23 +137,23 @@ class AppMCPServerController(Resource):
@console_ns.route("/apps/<uuid:server_id>/server/refresh")
class AppMCPServerRefreshController(Resource):
@api.doc("refresh_app_mcp_server")
@api.doc(description="Refresh MCP server configuration and regenerate server code")
@api.doc(params={"server_id": "Server ID"})
@api.response(200, "MCP server refreshed successfully", app_server_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "Server not found")
@console_ns.doc("refresh_app_mcp_server")
@console_ns.doc(description="Refresh MCP server configuration and regenerate server code")
@console_ns.doc(params={"server_id": "Server ID"})
@console_ns.response(200, "MCP server refreshed successfully", app_server_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "Server not found")
@setup_required
@login_required
@account_initialization_required
@marshal_with(app_server_fields)
@marshal_with(app_server_model)
@edit_permission_required
def get(self, server_id):
if not current_user.is_editor:
raise NotFound()
_, current_tenant_id = current_account_with_tenant()
server = (
db.session.query(AppMCPServer)
.where(AppMCPServer.id == server_id)
.where(AppMCPServer.tenant_id == current_tenant_id)
.first()
)
if not server:


@@ -1,11 +1,13 @@
import logging
from typing import Literal
from flask import request
from flask_restx import Resource, fields, marshal_with
from pydantic import BaseModel, Field, field_validator
from sqlalchemy import exists, select
from werkzeug.exceptions import InternalServerError, NotFound
from controllers.console import console_ns
from controllers.console.app.error import (
CompletionRequestError,
ProviderModelCurrentlyNotSupportError,
@@ -16,74 +18,233 @@ from controllers.console.app.wraps import get_app_model
from controllers.console.explore.error import AppSuggestedQuestionsAfterAnswerDisabledError
from controllers.console.wraps import (
account_initialization_required,
cloud_edition_billing_resource_check,
edit_permission_required,
setup_required,
)
from core.app.entities.app_invoke_entities import InvokeFrom
from core.errors.error import ModelCurrentlyNotSupportError, ProviderTokenNotInitError, QuotaExceededError
from core.model_runtime.errors.invoke import InvokeError
from extensions.ext_database import db
from fields.raws import FilesContainedField
from libs.helper import TimestampField, uuid_value
from libs.infinite_scroll_pagination import InfiniteScrollPagination
from libs.login import current_account_with_tenant, login_required
from models.model import AppMode, Conversation, Message, MessageAnnotation, MessageFeedback
from services.annotation_service import AppAnnotationService
from services.errors.conversation import ConversationNotExistsError
from services.errors.message import MessageNotExistsError, SuggestedQuestionsAfterAnswerDisabledError
from services.message_service import MessageService
logger = logging.getLogger(__name__)
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class ChatMessagesQuery(BaseModel):
conversation_id: str = Field(..., description="Conversation ID")
first_id: str | None = Field(default=None, description="First message ID for pagination")
limit: int = Field(default=20, ge=1, le=100, description="Number of messages to return (1-100)")
@field_validator("first_id", mode="before")
@classmethod
def empty_to_none(cls, value: str | None) -> str | None:
if value == "":
return None
return value
@field_validator("conversation_id", "first_id")
@classmethod
def validate_uuid(cls, value: str | None) -> str | None:
if value is None:
return value
return uuid_value(value)
class MessageFeedbackPayload(BaseModel):
message_id: str = Field(..., description="Message ID")
rating: Literal["like", "dislike"] | None = Field(default=None, description="Feedback rating")
@field_validator("message_id")
@classmethod
def validate_message_id(cls, value: str) -> str:
return uuid_value(value)
class FeedbackExportQuery(BaseModel):
from_source: Literal["user", "admin"] | None = Field(default=None, description="Filter by feedback source")
rating: Literal["like", "dislike"] | None = Field(default=None, description="Filter by rating")
has_comment: bool | None = Field(default=None, description="Only include feedback with comments")
start_date: str | None = Field(default=None, description="Start date (YYYY-MM-DD)")
end_date: str | None = Field(default=None, description="End date (YYYY-MM-DD)")
format: Literal["csv", "json"] = Field(default="csv", description="Export format")
@field_validator("has_comment", mode="before")
@classmethod
def parse_bool(cls, value: bool | str | None) -> bool | None:
if isinstance(value, bool) or value is None:
return value
lowered = value.lower()
if lowered in {"true", "1", "yes", "on"}:
return True
if lowered in {"false", "0", "no", "off"}:
return False
raise ValueError("has_comment must be a boolean value")
def reg(cls: type[BaseModel]):
console_ns.schema_model(cls.__name__, cls.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
reg(ChatMessagesQuery)
reg(MessageFeedbackPayload)
reg(FeedbackExportQuery)
# Register models for flask_restx to avoid dict type issues in Swagger
# Register in dependency order: base models first, then dependent models
# Base models
simple_account_model = console_ns.model(
"SimpleAccount",
{
"id": fields.String,
"name": fields.String,
"email": fields.String,
},
)
message_file_model = console_ns.model(
"MessageFile",
{
"id": fields.String,
"filename": fields.String,
"type": fields.String,
"url": fields.String,
"mime_type": fields.String,
"size": fields.Integer,
"transfer_method": fields.String,
"belongs_to": fields.String(default="user"),
"upload_file_id": fields.String(default=None),
},
)
agent_thought_model = console_ns.model(
"AgentThought",
{
"id": fields.String,
"chain_id": fields.String,
"message_id": fields.String,
"position": fields.Integer,
"thought": fields.String,
"tool": fields.String,
"tool_labels": fields.Raw,
"tool_input": fields.String,
"created_at": TimestampField,
"observation": fields.String,
"files": fields.List(fields.String),
},
)
# Models that depend on simple_account_model
feedback_model = console_ns.model(
"Feedback",
{
"rating": fields.String,
"content": fields.String,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account": fields.Nested(simple_account_model, allow_null=True),
},
)
annotation_model = console_ns.model(
"Annotation",
{
"id": fields.String,
"question": fields.String,
"content": fields.String,
"account": fields.Nested(simple_account_model, allow_null=True),
"created_at": TimestampField,
},
)
annotation_hit_history_model = console_ns.model(
"AnnotationHitHistory",
{
"annotation_id": fields.String(attribute="id"),
"annotation_create_account": fields.Nested(simple_account_model, allow_null=True),
"created_at": TimestampField,
},
)
# Message detail model that depends on multiple models
message_detail_model = console_ns.model(
"MessageDetail",
{
"id": fields.String,
"conversation_id": fields.String,
"inputs": FilesContainedField,
"query": fields.String,
"message": fields.Raw,
"message_tokens": fields.Integer,
"answer": fields.String(attribute="re_sign_file_url_answer"),
"answer_tokens": fields.Integer,
"provider_response_latency": fields.Float,
"from_source": fields.String,
"from_end_user_id": fields.String,
"from_account_id": fields.String,
"feedbacks": fields.List(fields.Nested(feedback_model)),
"workflow_run_id": fields.String,
"annotation": fields.Nested(annotation_model, allow_null=True),
"annotation_hit_history": fields.Nested(annotation_hit_history_model, allow_null=True),
"created_at": TimestampField,
"agent_thoughts": fields.List(fields.Nested(agent_thought_model)),
"message_files": fields.List(fields.Nested(message_file_model)),
"metadata": fields.Raw(attribute="message_metadata_dict"),
"status": fields.String,
"error": fields.String,
"parent_message_id": fields.String,
},
)
# Message infinite scroll pagination model
message_infinite_scroll_pagination_model = console_ns.model(
"MessageInfiniteScrollPagination",
{
"limit": fields.Integer,
"has_more": fields.Boolean,
"data": fields.List(fields.Nested(message_detail_model)),
},
)
@console_ns.route("/apps/<uuid:app_id>/chat-messages")
class ChatMessageListApi(Resource):
@console_ns.doc("list_chat_messages")
@console_ns.doc(description="Get chat messages for a conversation with pagination")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[ChatMessagesQuery.__name__])
@console_ns.response(200, "Success", message_infinite_scroll_pagination_model)
@console_ns.response(404, "Conversation not found")
@login_required
@account_initialization_required
@setup_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
@marshal_with(message_infinite_scroll_pagination_model)
@edit_permission_required
def get(self, app_model):
args = ChatMessagesQuery.model_validate(request.args.to_dict(flat=True))  # type: ignore
conversation = (
db.session.query(Conversation)
.where(Conversation.id == args["conversation_id"], Conversation.app_id == app_model.id)
.where(Conversation.id == args.conversation_id, Conversation.app_id == app_model.id)
.first()
)
if not conversation:
raise NotFound("Conversation Not Exists.")
if args["first_id"]:
if args.first_id:
first_message = (
db.session.query(Message)
.where(Message.conversation_id == conversation.id, Message.id == args["first_id"])
.where(Message.conversation_id == conversation.id, Message.id == args.first_id)
.first()
)
@@ -98,7 +259,7 @@ class ChatMessageListApi(Resource):
Message.id != first_message.id,
)
.order_by(Message.created_at.desc())
.limit(args["limit"])
.limit(args.limit)
.all()
)
else:
@@ -106,12 +267,12 @@ class ChatMessageListApi(Resource):
db.session.query(Message)
.where(Message.conversation_id == conversation.id)
.order_by(Message.created_at.desc())
.limit(args["limit"])
.limit(args.limit)
.all()
)
# Initialize has_more based on whether we have a full page
if len(history_messages) == args["limit"]:
if len(history_messages) == args.limit:
current_page_first_message = history_messages[-1]
# Check if there are more messages before the current page
has_more = db.session.scalar(
@@ -129,40 +290,28 @@ class ChatMessageListApi(Resource):
history_messages = list(reversed(history_messages))
return InfiniteScrollPagination(data=history_messages, limit=args["limit"], has_more=has_more)
return InfiniteScrollPagination(data=history_messages, limit=args.limit, has_more=has_more)
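ChatMessagesQuery itself is not shown in this hunk; a plausible reconstruction of the pattern being adopted, with names and bounds assumed from the old reqparse arguments:

from pydantic import BaseModel, Field

class ChatMessagesQuery(BaseModel):  # hypothetical shape
    conversation_id: str
    first_id: str | None = None
    limit: int = Field(default=20, ge=1, le=100)

# request.args.to_dict(flat=True) yields strings; pydantic coerces "20" to int
# and enforces the 1-100 bound that int_range(1, 100) used to check.
args = ChatMessagesQuery.model_validate({"conversation_id": "c1", "limit": "20"})
assert args.limit == 20 and args.first_id is None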
@console_ns.route("/apps/<uuid:app_id>/feedbacks")
class MessageFeedbackApi(Resource):
@api.doc("create_message_feedback")
@api.doc(description="Create or update message feedback (like/dislike)")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageFeedbackRequest",
{
"message_id": fields.String(required=True, description="Message ID"),
"rating": fields.String(enum=["like", "dislike"], description="Feedback rating"),
},
)
)
@api.response(200, "Feedback updated successfully")
@api.response(404, "Message not found")
@api.response(403, "Insufficient permissions")
@console_ns.doc("create_message_feedback")
@console_ns.doc(description="Create or update message feedback (like/dislike)")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[MessageFeedbackPayload.__name__])
@console_ns.response(200, "Feedback updated successfully")
@console_ns.response(404, "Message not found")
@console_ns.response(403, "Insufficient permissions")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def post(self, app_model):
current_user, _ = current_account_with_tenant()
args = MessageFeedbackPayload.model_validate(console_ns.payload)
message_id = str(args["message_id"])
message_id = str(args.message_id)
message = db.session.query(Message).where(Message.id == message_id, Message.app_id == app_model.id).first()
@@ -171,18 +320,21 @@ class MessageFeedbackApi(Resource):
feedback = message.admin_feedback
if not args["rating"] and feedback:
if not args.rating and feedback:
db.session.delete(feedback)
elif args["rating"] and feedback:
feedback.rating = args["rating"]
elif not args["rating"] and not feedback:
elif args.rating and feedback:
feedback.rating = args.rating
elif not args.rating and not feedback:
raise ValueError("rating cannot be None when feedback not exists")
else:
rating_value = args.rating
if rating_value is None:
raise ValueError("rating is required to create feedback")
feedback = MessageFeedback(
app_id=app_model.id,
conversation_id=message.conversation_id,
message_id=message.id,
rating=args["rating"],
rating=rating_value,
from_source="admin",
from_account_id=current_user.id,
)
@@ -193,56 +345,15 @@ class MessageFeedbackApi(Resource):
return {"result": "success"}
@console_ns.route("/apps/<uuid:app_id>/annotations")
class MessageAnnotationApi(Resource):
@api.doc("create_message_annotation")
@api.doc(description="Create message annotation")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
"MessageAnnotationRequest",
{
"message_id": fields.String(description="Message ID"),
"question": fields.String(required=True, description="Question text"),
"answer": fields.String(required=True, description="Answer text"),
"annotation_reply": fields.Raw(description="Annotation reply"),
},
)
)
@api.response(200, "Annotation created successfully", annotation_fields)
@api.response(403, "Insufficient permissions")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_resource_check("annotation")
@get_app_model
@marshal_with(annotation_fields)
def post(self, app_model):
if not isinstance(current_user, Account):
raise Forbidden()
if not current_user.has_edit_permission:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("message_id", required=False, type=uuid_value, location="json")
parser.add_argument("question", required=True, type=str, location="json")
parser.add_argument("answer", required=True, type=str, location="json")
parser.add_argument("annotation_reply", required=False, type=dict, location="json")
args = parser.parse_args()
annotation = AppAnnotationService.up_insert_app_annotation_from_message(args, app_model.id)
return annotation
@console_ns.route("/apps/<uuid:app_id>/annotations/count")
class MessageAnnotationCountApi(Resource):
@api.doc("get_annotation_count")
@api.doc(description="Get count of message annotations for the app")
@api.doc(params={"app_id": "Application ID"})
@api.response(
@console_ns.doc("get_annotation_count")
@console_ns.doc(description="Get count of message annotations for the app")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(
200,
"Annotation count retrieved successfully",
api.model("AnnotationCountResponse", {"count": fields.Integer(description="Number of annotations")}),
console_ns.model("AnnotationCountResponse", {"count": fields.Integer(description="Number of annotations")}),
)
@get_app_model
@setup_required
@@ -256,20 +367,23 @@ class MessageAnnotationCountApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/chat-messages/<uuid:message_id>/suggested-questions")
class MessageSuggestedQuestionApi(Resource):
@api.doc("get_message_suggested_questions")
@api.doc(description="Get suggested questions for a message")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(
@console_ns.doc("get_message_suggested_questions")
@console_ns.doc(description="Get suggested questions for a message")
@console_ns.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@console_ns.response(
200,
"Suggested questions retrieved successfully",
api.model("SuggestedQuestionsResponse", {"data": fields.List(fields.String(description="Suggested question"))}),
console_ns.model(
"SuggestedQuestionsResponse", {"data": fields.List(fields.String(description="Suggested question"))}
),
)
@api.response(404, "Message or conversation not found")
@console_ns.response(404, "Message or conversation not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def get(self, app_model, message_id):
current_user, _ = current_account_with_tenant()
message_id = str(message_id)
try:
@@ -297,19 +411,59 @@ class MessageSuggestedQuestionApi(Resource):
return {"data": questions}
@console_ns.route("/apps/<uuid:app_id>/messages/<uuid:message_id>")
class MessageApi(Resource):
@api.doc("get_message")
@api.doc(description="Get message details by ID")
@api.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@api.response(200, "Message retrieved successfully", message_detail_fields)
@api.response(404, "Message not found")
@console_ns.route("/apps/<uuid:app_id>/feedbacks/export")
class MessageFeedbackExportApi(Resource):
@console_ns.doc("export_feedbacks")
@console_ns.doc(description="Export user feedback data for Google Sheets")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[FeedbackExportQuery.__name__])
@console_ns.response(200, "Feedback data exported successfully")
@console_ns.response(400, "Invalid parameters")
@console_ns.response(500, "Internal server error")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
args = FeedbackExportQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
# Import the service function
from services.feedback_service import FeedbackService
try:
export_data = FeedbackService.export_feedbacks(
app_id=app_model.id,
from_source=args.from_source,
rating=args.rating,
has_comment=args.has_comment,
start_date=args.start_date,
end_date=args.end_date,
format_type=args.format,
)
return export_data
except ValueError as e:
logger.exception("Parameter validation error in feedback export")
return {"error": f"Parameter validation error: {str(e)}"}, 400
except Exception as e:
logger.exception("Error exporting feedback data")
raise InternalServerError(str(e))
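A client-side sketch of calling the new export endpoint; the base URL, auth header, and accepted parameter values are assumptions, mirroring the FeedbackExportQuery fields passed to the service above:

import requests

resp = requests.get(
    "https://dify.example.com/console/api/apps/<app_id>/feedbacks/export",
    params={"from_source": "api", "rating": "like", "has_comment": "true", "format": "csv"},
    headers={"Authorization": "Bearer <console-access-token>"},
)
resp.raise_for_status()
print(resp.text)  # CSV body, or JSON depending on the requested format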
@console_ns.route("/apps/<uuid:app_id>/messages/<uuid:message_id>")
class MessageApi(Resource):
@console_ns.doc("get_message")
@console_ns.doc(description="Get message details by ID")
@console_ns.doc(params={"app_id": "Application ID", "message_id": "Message ID"})
@console_ns.response(200, "Message retrieved successfully", message_detail_model)
@console_ns.response(404, "Message not found")
@get_app_model
@setup_required
@login_required
@account_initialization_required
@marshal_with(message_detail_model)
def get(self, app_model, message_id: str):
message_id = str(message_id)
message = db.session.query(Message).where(Message.id == message_id, Message.app_id == app_model.id).first()

View File

@@ -2,31 +2,29 @@ import json
from typing import cast
from flask import request
from flask_restx import Resource, fields
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from core.agent.entities import AgentToolEntity
from core.tools.tool_manager import ToolManager
from core.tools.utils.configuration import ToolParameterConfigurationManager
from events.app_event import app_model_config_was_updated
from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
from libs.login import current_account_with_tenant, login_required
from models.model import AppMode, AppModelConfig
from services.app_model_config_service import AppModelConfigService
@console_ns.route("/apps/<uuid:app_id>/model-config")
class ModelConfigResource(Resource):
@api.doc("update_app_model_config")
@api.doc(description="Update application model configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("update_app_model_config")
@console_ns.doc(description="Update application model configuration")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"ModelConfigRequest",
{
"provider": fields.String(description="Model provider"),
@@ -44,25 +42,20 @@ class ModelConfigResource(Resource):
},
)
)
@api.response(200, "Model configuration updated successfully")
@api.response(400, "Invalid configuration")
@api.response(404, "App not found")
@console_ns.response(200, "Model configuration updated successfully")
@console_ns.response(400, "Invalid configuration")
@console_ns.response(404, "App not found")
@setup_required
@login_required
@edit_permission_required
@account_initialization_required
@get_app_model(mode=[AppMode.AGENT_CHAT, AppMode.CHAT, AppMode.COMPLETION])
def post(self, app_model):
"""Modify app model config"""
current_user, current_tenant_id = current_account_with_tenant()
# validate config
model_configuration = AppModelConfigService.validate_configuration(
tenant_id=current_tenant_id,
config=cast(dict, request.json),
app_mode=AppMode.value_of(app_model.mode),
)
@@ -90,16 +83,16 @@ class ModelConfigResource(Resource):
if not isinstance(tool, dict) or len(tool.keys()) <= 3:
continue
agent_tool_entity = AgentToolEntity.model_validate(tool)
# get tool
try:
tool_runtime = ToolManager.get_agent_tool_runtime(
tenant_id=current_tenant_id,
app_id=app_model.id,
agent_tool=agent_tool_entity,
)
manager = ToolParameterConfigurationManager(
tenant_id=current_tenant_id,
tool_runtime=tool_runtime,
provider_name=agent_tool_entity.provider_id,
provider_type=agent_tool_entity.provider_type,
@@ -124,7 +117,7 @@ class ModelConfigResource(Resource):
# encrypt agent tool parameters if it's secret-input
agent_mode = new_app_model_config.agent_mode_dict
for tool in agent_mode.get("tools") or []:
agent_tool_entity = AgentToolEntity.model_validate(tool)
# get tool
key = f"{agent_tool_entity.provider_id}.{agent_tool_entity.provider_type}.{agent_tool_entity.tool_name}"
@@ -133,7 +126,7 @@ class ModelConfigResource(Resource):
else:
try:
tool_runtime = ToolManager.get_agent_tool_runtime(
tenant_id=current_tenant_id,
app_id=app_model.id,
agent_tool=agent_tool_entity,
)
@@ -141,7 +134,7 @@ class ModelConfigResource(Resource):
continue
manager = ToolParameterConfigurationManager(
tenant_id=current_tenant_id,
tool_runtime=tool_runtime,
provider_name=agent_tool_entity.provider_id,
provider_type=agent_tool_entity.provider_type,
@@ -172,6 +165,8 @@ class ModelConfigResource(Resource):
db.session.flush()
app_model.app_model_config_id = new_app_model_config.id
app_model.updated_by = current_user.id
app_model.updated_at = naive_utc_now()
db.session.commit()
app_model_config_was_updated.send(app_model, app_model_config=new_app_model_config)
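On the AgentToolEntity(**tool) → AgentToolEntity.model_validate(tool) change: for a plain dict the two are equivalent in pydantic v2, but model_validate is the idiomatic entry point and also accepts arbitrary mappings (and objects, with from_attributes). A tiny illustration with a stand-in model:

from pydantic import BaseModel

class DemoEntity(BaseModel):  # stand-in, not the real AgentToolEntity
    provider_id: str
    tool_name: str

raw = {"provider_id": "p1", "tool_name": "search"}
# Both paths run full validation; model_validate just has the broader input contract.
assert DemoEntity(**raw) == DemoEntity.model_validate(raw)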

View File

@@ -1,7 +1,7 @@
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import BadRequest
from controllers.console import console_ns
from controllers.console.app.error import TracingConfigCheckError, TracingConfigIsExist, TracingConfigNotExist
from controllers.console.wraps import account_initialization_required, setup_required
from libs.login import login_required
@@ -14,24 +14,23 @@ class TraceAppConfigApi(Resource):
Manage trace app configurations
"""
@api.doc("get_trace_app_config")
@api.doc(description="Get tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
@console_ns.doc("get_trace_app_config")
@console_ns.doc(description="Get tracing configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@console_ns.response(
200, "Tracing configuration retrieved successfully", fields.Raw(description="Tracing configuration data")
)
@api.response(400, "Invalid request parameters")
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
def get(self, app_id):
parser = reqparse.RequestParser().add_argument("tracing_provider", type=str, required=True, location="args")
args = parser.parse_args()
try:
@@ -42,11 +41,11 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
@api.doc("create_trace_app_config")
@api.doc(description="Create a new tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("create_trace_app_config")
@console_ns.doc(description="Create a new tracing configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"TraceConfigCreateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
@@ -54,18 +53,20 @@ class TraceAppConfigApi(Resource):
},
)
)
@console_ns.response(
201, "Tracing configuration created successfully", fields.Raw(description="Created configuration data")
)
@api.response(400, "Invalid request parameters or configuration already exists")
@console_ns.response(400, "Invalid request parameters or configuration already exists")
@setup_required
@login_required
@account_initialization_required
def post(self, app_id):
"""Create a new trace app configuration"""
parser = (
reqparse.RequestParser()
.add_argument("tracing_provider", type=str, required=True, location="json")
.add_argument("tracing_config", type=dict, required=True, location="json")
)
args = parser.parse_args()
try:
@@ -80,11 +81,11 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
@api.doc("update_trace_app_config")
@api.doc(description="Update an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("update_trace_app_config")
@console_ns.doc(description="Update an existing tracing configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"TraceConfigUpdateRequest",
{
"tracing_provider": fields.String(required=True, description="Tracing provider name"),
@@ -92,16 +93,18 @@ class TraceAppConfigApi(Resource):
},
)
)
@api.response(200, "Tracing configuration updated successfully", fields.Raw(description="Success response"))
@api.response(400, "Invalid request parameters or configuration not found")
@console_ns.response(200, "Tracing configuration updated successfully", fields.Raw(description="Success response"))
@console_ns.response(400, "Invalid request parameters or configuration not found")
@setup_required
@login_required
@account_initialization_required
def patch(self, app_id):
"""Update an existing trace app configuration"""
parser = (
reqparse.RequestParser()
.add_argument("tracing_provider", type=str, required=True, location="json")
.add_argument("tracing_config", type=dict, required=True, location="json")
)
args = parser.parse_args()
try:
@@ -114,23 +117,22 @@ class TraceAppConfigApi(Resource):
except Exception as e:
raise BadRequest(str(e))
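The chained parser style adopted throughout this file works because RequestParser.add_argument returns the parser itself, so the fluent form is behaviorally identical to the older statement-per-argument form:

from flask_restx import reqparse

parser = (
    reqparse.RequestParser()
    .add_argument("tracing_provider", type=str, required=True, location="json")
    .add_argument("tracing_config", type=dict, required=True, location="json")
)
# Same registered arguments as building the parser imperatively; only the style differs.
print([arg.name for arg in parser.args])  # ['tracing_provider', 'tracing_config']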
@api.doc("delete_trace_app_config")
@api.doc(description="Delete an existing tracing configuration for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser().add_argument(
@console_ns.doc("delete_trace_app_config")
@console_ns.doc(description="Delete an existing tracing configuration for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.parser().add_argument(
"tracing_provider", type=str, required=True, location="args", help="Tracing provider name"
)
)
@api.response(204, "Tracing configuration deleted successfully")
@api.response(400, "Invalid request parameters or configuration not found")
@console_ns.response(204, "Tracing configuration deleted successfully")
@console_ns.response(400, "Invalid request parameters or configuration not found")
@setup_required
@login_required
@account_initialization_required
def delete(self, app_id):
"""Delete an existing trace app configuration"""
parser = reqparse.RequestParser().add_argument("tracing_provider", type=str, required=True, location="args")
args = parser.parse_args()
try:

View File

@@ -1,48 +1,61 @@
from flask_restx import Resource, fields, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from constants.languages import supported_language
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import (
account_initialization_required,
edit_permission_required,
is_admin_or_owner_required,
setup_required,
)
from extensions.ext_database import db
from fields.app_fields import app_site_fields
from libs.datetime_utils import naive_utc_now
from libs.login import current_account_with_tenant, login_required
from models import Site
# Register model for flask_restx to avoid dict type issues in Swagger
app_site_model = console_ns.model("AppSite", app_site_fields)
def parse_app_site_args():
parser = (
reqparse.RequestParser()
.add_argument("title", type=str, required=False, location="json")
.add_argument("icon_type", type=str, required=False, location="json")
.add_argument("icon", type=str, required=False, location="json")
.add_argument("icon_background", type=str, required=False, location="json")
.add_argument("description", type=str, required=False, location="json")
.add_argument("default_language", type=supported_language, required=False, location="json")
.add_argument("chat_color_theme", type=str, required=False, location="json")
.add_argument("chat_color_theme_inverted", type=bool, required=False, location="json")
.add_argument("customize_domain", type=str, required=False, location="json")
.add_argument("copyright", type=str, required=False, location="json")
.add_argument("privacy_policy", type=str, required=False, location="json")
.add_argument("custom_disclaimer", type=str, required=False, location="json")
.add_argument(
"customize_token_strategy",
type=str,
choices=["must", "allow", "not_allow"],
required=False,
location="json",
)
.add_argument("prompt_public", type=bool, required=False, location="json")
.add_argument("show_workflow_steps", type=bool, required=False, location="json")
.add_argument("use_icon_as_answer_icon", type=bool, required=False, location="json")
)
parser.add_argument("prompt_public", type=bool, required=False, location="json")
parser.add_argument("show_workflow_steps", type=bool, required=False, location="json")
parser.add_argument("use_icon_as_answer_icon", type=bool, required=False, location="json")
return parser.parse_args()
@console_ns.route("/apps/<uuid:app_id>/site")
class AppSite(Resource):
@api.doc("update_app_site")
@api.doc(description="Update application site configuration")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.model(
@console_ns.doc("update_app_site")
@console_ns.doc(description="Update application site configuration")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(
console_ns.model(
"AppSiteRequest",
{
"title": fields.String(description="Site title"),
@@ -66,21 +79,18 @@ class AppSite(Resource):
},
)
)
@api.response(200, "Site configuration updated successfully", app_site_fields)
@api.response(403, "Insufficient permissions")
@api.response(404, "App not found")
@console_ns.response(200, "Site configuration updated successfully", app_site_model)
@console_ns.response(403, "Insufficient permissions")
@console_ns.response(404, "App not found")
@setup_required
@login_required
@edit_permission_required
@account_initialization_required
@get_app_model
@marshal_with(app_site_model)
def post(self, app_model):
args = parse_app_site_args()
current_user, _ = current_account_with_tenant()
site = db.session.query(Site).where(Site.app_id == app_model.id).first()
if not site:
raise NotFound
@@ -107,8 +117,6 @@ class AppSite(Resource):
if value is not None:
setattr(site, attr_name, value)
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()
@@ -118,30 +126,26 @@ class AppSite(Resource):
@console_ns.route("/apps/<uuid:app_id>/site/access-token-reset")
class AppSiteAccessTokenReset(Resource):
@api.doc("reset_app_site_access_token")
@api.doc(description="Reset access token for application site")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Access token reset successfully", app_site_fields)
@api.response(403, "Insufficient permissions (admin/owner required)")
@api.response(404, "App or site not found")
@console_ns.doc("reset_app_site_access_token")
@console_ns.doc(description="Reset access token for application site")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Access token reset successfully", app_site_model)
@console_ns.response(403, "Insufficient permissions (admin/owner required)")
@console_ns.response(404, "App or site not found")
@setup_required
@login_required
@is_admin_or_owner_required
@account_initialization_required
@get_app_model
@marshal_with(app_site_model)
def post(self, app_model):
current_user, _ = current_account_with_tenant()
site = db.session.query(Site).where(Site.app_id == app_model.id).first()
if not site:
raise NotFound
site.code = Site.generate_code(16)
site.updated_by = current_user.id
site.updated_at = naive_utc_now()
db.session.commit()

View File

@@ -1,33 +1,48 @@
from datetime import datetime
from decimal import Decimal
import pytz
import sqlalchemy as sa
from flask import abort, jsonify, request
from flask_restx import Resource, fields
from pydantic import BaseModel, Field, field_validator
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.app.entities.app_invoke_entities import InvokeFrom
from extensions.ext_database import db
from libs.datetime_utils import parse_time_range
from libs.helper import convert_datetime_to_date
from libs.login import current_account_with_tenant, login_required
from models import AppMode
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class StatisticTimeRangeQuery(BaseModel):
start: str | None = Field(default=None, description="Start date (YYYY-MM-DD HH:MM)")
end: str | None = Field(default=None, description="End date (YYYY-MM-DD HH:MM)")
@field_validator("start", "end", mode="before")
@classmethod
def empty_string_to_none(cls, value: str | None) -> str | None:
if value == "":
return None
return value
console_ns.schema_model(
StatisticTimeRangeQuery.__name__,
StatisticTimeRangeQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
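A quick usage note: query strings routinely carry empty values (e.g. ?start=&end=2025-01-01+00:00), and the mode="before" validator above normalizes "" to None so both fields stay genuinely optional:

args = StatisticTimeRangeQuery.model_validate({"start": "", "end": "2025-01-01 00:00"})
assert args.start is None and args.end == "2025-01-01 00:00"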
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-messages")
class DailyMessageStatistic(Resource):
@api.doc("get_daily_message_statistics")
@api.doc(description="Get daily message statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_daily_message_statistics")
@console_ns.doc(description="Get daily message statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Daily message statistics retrieved successfully",
fields.List(fields.Raw(description="Daily message count data")),
@@ -37,43 +52,32 @@ class DailyMessageStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
COUNT(*) AS message_count
FROM
messages
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -91,15 +95,11 @@ WHERE
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-conversations")
class DailyConversationStatistic(Resource):
@api.doc("get_daily_conversation_statistics")
@api.doc(description="Get daily conversation statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_daily_conversation_statistics")
@console_ns.doc(description="Get daily conversation statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Daily conversation statistics retrieved successfully",
fields.List(fields.Raw(description="Daily conversation count data")),
@@ -109,63 +109,53 @@ class DailyConversationStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
COUNT(DISTINCT conversation_id) AS conversation_count
FROM
messages
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
stmt = stmt.where(Message.created_at >= start_datetime_utc)
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
stmt = stmt.where(Message.created_at < end_datetime_utc)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
stmt = stmt.group_by("date").order_by("date")
sql_query += " GROUP BY date ORDER BY date"
response_data = []
with db.engine.begin() as conn:
rs = conn.execute(stmt, {"tz": account.timezone})
for row in rs:
response_data.append({"date": str(row.date), "conversation_count": row.conversation_count})
rs = conn.execute(sa.text(sql_query), arg_dict)
for i in rs:
response_data.append({"date": str(i.date), "conversation_count": i.conversation_count})
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/statistics/daily-end-users")
class DailyTerminalsStatistic(Resource):
@api.doc("get_daily_terminals_statistics")
@api.doc(description="Get daily terminal/end-user statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_daily_terminals_statistics")
@console_ns.doc(description="Get daily terminal/end-user statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Daily terminal statistics retrieved successfully",
fields.List(fields.Raw(description="Daily terminal count data")),
@@ -175,43 +165,32 @@ class DailyTerminalsStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
COUNT(DISTINCT messages.from_end_user_id) AS terminal_count
FROM
messages
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -229,15 +208,11 @@ WHERE
@console_ns.route("/apps/<uuid:app_id>/statistics/token-costs")
class DailyTokenCostStatistic(Resource):
@api.doc("get_daily_token_cost_statistics")
@api.doc(description="Get daily token cost statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_daily_token_cost_statistics")
@console_ns.doc(description="Get daily token cost statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Daily token cost statistics retrieved successfully",
fields.List(fields.Raw(description="Daily token cost data")),
@@ -247,15 +222,13 @@ class DailyTokenCostStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
(SUM(messages.message_tokens) + SUM(messages.answer_tokens)) AS token_count,
SUM(total_price) AS total_price
FROM
@@ -263,28 +236,19 @@ FROM
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -304,15 +268,11 @@ WHERE
@console_ns.route("/apps/<uuid:app_id>/statistics/average-session-interactions")
class AverageSessionInteractionStatistic(Resource):
@api.doc("get_average_session_interaction_statistics")
@api.doc(description="Get average session interaction statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_average_session_interaction_statistics")
@console_ns.doc(description="Get average session interaction statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Average session interaction statistics retrieved successfully",
fields.List(fields.Raw(description="Average session interaction data")),
@@ -322,15 +282,13 @@ class AverageSessionInteractionStatistic(Resource):
@account_initialization_required
@get_app_model(mode=[AppMode.CHAT, AppMode.AGENT_CHAT, AppMode.ADVANCED_CHAT])
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("c.created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
AVG(subquery.message_count) AS interactions
FROM
(
@@ -345,28 +303,19 @@ FROM
WHERE
c.app_id = :app_id
AND m.invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND c.created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND c.created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -395,15 +344,11 @@ ORDER BY
@console_ns.route("/apps/<uuid:app_id>/statistics/user-satisfaction-rate")
class UserSatisfactionRateStatistic(Resource):
@api.doc("get_user_satisfaction_rate_statistics")
@api.doc(description="Get user satisfaction rate statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_user_satisfaction_rate_statistics")
@console_ns.doc(description="Get user satisfaction rate statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"User satisfaction rate statistics retrieved successfully",
fields.List(fields.Raw(description="User satisfaction rate data")),
@@ -413,15 +358,13 @@ class UserSatisfactionRateStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', m.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("m.created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
COUNT(m.id) AS message_count,
COUNT(mf.id) AS feedback_count
FROM
@@ -432,28 +375,19 @@ LEFT JOIN
WHERE
m.app_id = :app_id
AND m.invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND m.created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND m.created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -476,15 +410,11 @@ WHERE
@console_ns.route("/apps/<uuid:app_id>/statistics/average-response-time")
class AverageResponseTimeStatistic(Resource):
@api.doc("get_average_response_time_statistics")
@api.doc(description="Get average response time statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_average_response_time_statistics")
@console_ns.doc(description="Get average response time statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Average response time statistics retrieved successfully",
fields.List(fields.Raw(description="Average response time data")),
@@ -494,43 +424,32 @@ class AverageResponseTimeStatistic(Resource):
@account_initialization_required
@get_app_model(mode=AppMode.COMPLETION)
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
AVG(provider_response_latency) AS latency
FROM
messages
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
@@ -548,15 +467,11 @@ WHERE
@console_ns.route("/apps/<uuid:app_id>/statistics/tokens-per-second")
class TokensPerSecondStatistic(Resource):
@api.doc("get_tokens_per_second_statistics")
@api.doc(description="Get tokens per second statistics for an application")
@api.doc(params={"app_id": "Application ID"})
@api.expect(
api.parser()
.add_argument("start", type=str, location="args", help="Start date (YYYY-MM-DD HH:MM)")
.add_argument("end", type=str, location="args", help="End date (YYYY-MM-DD HH:MM)")
)
@api.response(
@console_ns.doc("get_tokens_per_second_statistics")
@console_ns.doc(description="Get tokens per second statistics for an application")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[StatisticTimeRangeQuery.__name__])
@console_ns.response(
200,
"Tokens per second statistics retrieved successfully",
fields.List(fields.Raw(description="Tokens per second data")),
@@ -566,15 +481,12 @@ class TokensPerSecondStatistic(Resource):
@login_required
@account_initialization_required
def get(self, app_model):
account, _ = current_account_with_tenant()
args = StatisticTimeRangeQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
converted_created_at = convert_datetime_to_date("created_at")
sql_query = f"""SELECT
{converted_created_at} AS date,
CASE
WHEN SUM(provider_response_latency) = 0 THEN 0
ELSE (SUM(answer_tokens) / SUM(provider_response_latency))
@@ -584,28 +496,19 @@ FROM
WHERE
app_id = :app_id
AND invoke_from != :invoke_from"""
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER.value}
arg_dict = {"tz": account.timezone, "app_id": app_model.id, "invoke_from": InvokeFrom.DEBUGGER}
assert account.timezone is not None
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
try:
start_datetime_utc, end_datetime_utc = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if start_datetime_utc:
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
if end_datetime_utc:
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc

File diff suppressed because it is too large

View File

@@ -1,82 +1,85 @@
from datetime import datetime
from dateutil.parser import isoparse
from flask import request
from flask_restx import Resource, marshal_with
from pydantic import BaseModel, Field, field_validator
from sqlalchemy.orm import Session
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from core.workflow.enums import WorkflowExecutionStatus
from extensions.ext_database import db
from fields.workflow_app_log_fields import build_workflow_app_log_pagination_model
from libs.login import login_required
from models import App
from models.model import AppMode
from services.workflow_app_service import WorkflowAppService
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class WorkflowAppLogQuery(BaseModel):
keyword: str | None = Field(default=None, description="Search keyword for filtering logs")
status: WorkflowExecutionStatus | None = Field(
default=None, description="Execution status filter (succeeded, failed, stopped, partial-succeeded)"
)
created_at__before: datetime | None = Field(default=None, description="Filter logs created before this timestamp")
created_at__after: datetime | None = Field(default=None, description="Filter logs created after this timestamp")
created_by_end_user_session_id: str | None = Field(default=None, description="Filter by end user session ID")
created_by_account: str | None = Field(default=None, description="Filter by account")
detail: bool = Field(default=False, description="Whether to return detailed logs")
page: int = Field(default=1, ge=1, le=99999, description="Page number (1-99999)")
limit: int = Field(default=20, ge=1, le=100, description="Number of items per page (1-100)")
@field_validator("created_at__before", "created_at__after", mode="before")
@classmethod
def parse_datetime(cls, value: str | None) -> datetime | None:
if value in (None, ""):
return None
return isoparse(value) # type: ignore
@field_validator("detail", mode="before")
@classmethod
def parse_bool(cls, value: bool | str | None) -> bool:
if isinstance(value, bool):
return value
if value is None:
return False
lowered = value.lower()
if lowered in {"1", "true", "yes", "on"}:
return True
if lowered in {"0", "false", "no", "off"}:
return False
raise ValueError("Invalid boolean value for detail")
console_ns.schema_model(
WorkflowAppLogQuery.__name__, WorkflowAppLogQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0)
)
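Pydantic v2 emits nested-model refs under #/$defs/{model} by default, while the Swagger 2.0 document flask_restx produces resolves refs under #/definitions/{model}; the ref_template override keeps registered schemas resolvable. A quick check of that behavior:

```python
from pydantic import BaseModel


class Inner(BaseModel):
    x: int


class Outer(BaseModel):
    inner: Inner


# Pydantic's default ref template targets JSON-Schema $defs...
assert Outer.model_json_schema()["properties"]["inner"]["$ref"] == "#/$defs/Inner"

# ...while Swagger 2.0 expects definitions, hence the override used above.
schema = Outer.model_json_schema(ref_template="#/definitions/{model}")
assert schema["properties"]["inner"]["$ref"] == "#/definitions/Inner"
```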
# Register model for flask_restx to avoid dict type issues in Swagger
workflow_app_log_pagination_model = build_workflow_app_log_pagination_model(console_ns)
@console_ns.route("/apps/<uuid:app_id>/workflow-app-logs")
class WorkflowAppLogApi(Resource):
@api.doc("get_workflow_app_logs")
@api.doc(description="Get workflow application execution logs")
@api.doc(params={"app_id": "Application ID"})
@api.doc(
params={
"keyword": "Search keyword for filtering logs",
"status": "Filter by execution status (succeeded, failed, stopped, partial-succeeded)",
"created_at__before": "Filter logs created before this timestamp",
"created_at__after": "Filter logs created after this timestamp",
"created_by_end_user_session_id": "Filter by end user session ID",
"created_by_account": "Filter by account",
"page": "Page number (1-99999)",
"limit": "Number of items per page (1-100)",
}
)
@api.response(200, "Workflow app logs retrieved successfully", workflow_app_log_pagination_fields)
@console_ns.doc("get_workflow_app_logs")
@console_ns.doc(description="Get workflow application execution logs")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[WorkflowAppLogQuery.__name__])
@console_ns.response(200, "Workflow app logs retrieved successfully", workflow_app_log_pagination_model)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
@marshal_with(workflow_app_log_pagination_fields)
@marshal_with(workflow_app_log_pagination_model)
def get(self, app_model: App):
"""
Get workflow app logs
"""
parser = reqparse.RequestParser()
parser.add_argument("keyword", type=str, location="args")
parser.add_argument(
"status", type=str, choices=["succeeded", "failed", "stopped", "partial-succeeded"], location="args"
)
parser.add_argument(
"created_at__before", type=str, location="args", help="Filter logs created before this timestamp"
)
parser.add_argument(
"created_at__after", type=str, location="args", help="Filter logs created after this timestamp"
)
parser.add_argument(
"created_by_end_user_session_id",
type=str,
location="args",
required=False,
default=None,
)
parser.add_argument(
"created_by_account",
type=str,
location="args",
required=False,
default=None,
)
parser.add_argument("page", type=int_range(1, 99999), default=1, location="args")
parser.add_argument("limit", type=int_range(1, 100), default=20, location="args")
args = parser.parse_args()
args.status = WorkflowExecutionStatus(args.status) if args.status else None
if args.created_at__before:
args.created_at__before = isoparse(args.created_at__before)
if args.created_at__after:
args.created_at__after = isoparse(args.created_at__after)
args = WorkflowAppLogQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
# get paginate workflow app logs
workflow_app_service = WorkflowAppService()
@@ -90,6 +93,7 @@ class WorkflowAppLogApi(Resource):
created_at_after=args.created_at__after,
page=args.page,
limit=args.limit,
detail=args.detail,
created_by_end_user_session_id=args.created_by_end_user_session_id,
created_by_account=args.created_by_account,
)
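request.args values arrive as strings, so the field validators above do the coercion. A small usage sketch of the model defined in this file:

```python
# Values mirror what request.args.to_dict(flat=True) would produce.
q = WorkflowAppLogQuery.model_validate(
    {"detail": "true", "page": "2", "created_at__after": "2025-12-01T00:00:00Z"}
)
assert q.detail is True                  # via the parse_bool validator
assert q.page == 2                       # pydantic coerces numeric strings
assert q.created_at__after is not None   # parsed by isoparse
assert q.keyword is None                 # unset fields keep their defaults
```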


@@ -1,17 +1,18 @@
import logging
from typing import NoReturn
from collections.abc import Callable
from functools import wraps
from typing import NoReturn, ParamSpec, TypeVar
from flask import Response
from flask_restx import Resource, fields, inputs, marshal, marshal_with, reqparse
from sqlalchemy.orm import Session
from werkzeug.exceptions import Forbidden
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.error import (
DraftWorkflowNotExist,
)
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from controllers.web.error import InvalidArgumentError, NotFoundError
from core.file import helpers as file_helpers
from core.variables.segment_group import SegmentGroup
@@ -21,9 +22,8 @@ from core.workflow.constants import CONVERSATION_VARIABLE_NODE_ID, SYSTEM_VARIAB
from extensions.ext_database import db
from factories.file_factory import build_from_mapping, build_from_mappings
from factories.variable_factory import build_segment_with_type
from libs.login import current_user, login_required
from libs.login import login_required
from models import App, AppMode
from models.account import Account
from models.workflow import WorkflowDraftVariable
from services.workflow_draft_variable_service import WorkflowDraftVariableList, WorkflowDraftVariableService
from services.workflow_service import WorkflowService
@@ -58,16 +58,18 @@ def _serialize_var_value(variable: WorkflowDraftVariable):
def _create_pagination_parser():
parser = reqparse.RequestParser()
parser.add_argument(
"page",
type=inputs.int_range(1, 100_000),
required=False,
default=1,
location="args",
help="the page of data requested",
parser = (
reqparse.RequestParser()
.add_argument(
"page",
type=inputs.int_range(1, 100_000),
required=False,
default=1,
location="args",
help="the page of data requested",
)
.add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
)
parser.add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
return parser
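The fluent form works because flask_restx's RequestParser.add_argument returns the parser itself, so the chain builds up a single parser:

```python
from flask_restx import reqparse

parser = reqparse.RequestParser()
# add_argument returns the same instance, which is what makes the chained
# style equivalent to the old statement-per-argument form.
assert parser.add_argument("q", type=str, location="args") is parser
```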
@@ -139,8 +141,42 @@ _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS = {
"items": fields.List(fields.Nested(_WORKFLOW_DRAFT_VARIABLE_FIELDS), attribute=_get_items),
}
# Register models for flask_restx to avoid dict type issues in Swagger
workflow_draft_variable_without_value_model = console_ns.model(
"WorkflowDraftVariableWithoutValue", _WORKFLOW_DRAFT_VARIABLE_WITHOUT_VALUE_FIELDS
)
def _api_prerequisite(f):
workflow_draft_variable_model = console_ns.model("WorkflowDraftVariable", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
workflow_draft_env_variable_model = console_ns.model("WorkflowDraftEnvVariable", _WORKFLOW_DRAFT_ENV_VARIABLE_FIELDS)
workflow_draft_env_variable_list_fields_copy = _WORKFLOW_DRAFT_ENV_VARIABLE_LIST_FIELDS.copy()
workflow_draft_env_variable_list_fields_copy["items"] = fields.List(fields.Nested(workflow_draft_env_variable_model))
workflow_draft_env_variable_list_model = console_ns.model(
"WorkflowDraftEnvVariableList", workflow_draft_env_variable_list_fields_copy
)
workflow_draft_variable_list_without_value_fields_copy = _WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS.copy()
workflow_draft_variable_list_without_value_fields_copy["items"] = fields.List(
fields.Nested(workflow_draft_variable_without_value_model), attribute=_get_items
)
workflow_draft_variable_list_without_value_model = console_ns.model(
"WorkflowDraftVariableListWithoutValue", workflow_draft_variable_list_without_value_fields_copy
)
workflow_draft_variable_list_fields_copy = _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS.copy()
workflow_draft_variable_list_fields_copy["items"] = fields.List(
fields.Nested(workflow_draft_variable_model), attribute=_get_items
)
workflow_draft_variable_list_model = console_ns.model(
"WorkflowDraftVariableList", workflow_draft_variable_list_fields_copy
)
P = ParamSpec("P")
R = TypeVar("R")
def _api_prerequisite(f: Callable[P, R]):
"""Common prerequisites for all draft workflow variable APIs.
It ensures the following conditions are satisfied:
@@ -154,11 +190,10 @@ def _api_prerequisite(f):
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
def wrapper(*args, **kwargs):
assert isinstance(current_user, Account)
if not current_user.has_edit_permission:
raise Forbidden()
@wraps(f)
def wrapper(*args: P.args, **kwargs: P.kwargs):
return f(*args, **kwargs)
return wrapper
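Annotating the decorator with ParamSpec lets type checkers see that the returned wrapper preserves the wrapped view's signature. A self-contained sketch of the same pattern:

```python
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")


def traced(f: Callable[P, R]) -> Callable[P, R]:
    @wraps(f)  # preserves __name__/__doc__, which flask_restx uses for docs
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        print(f"calling {f.__name__}")
        return f(*args, **kwargs)

    return wrapper


@traced
def add(a: int, b: int) -> int:
    return a + b


assert add(1, 2) == 3  # checkers still see (int, int) -> int
```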
@@ -166,13 +201,16 @@ def _api_prerequisite(f):
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables")
class WorkflowVariableCollectionApi(Resource):
@api.doc("get_workflow_variables")
@api.doc(description="Get draft workflow variables")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"page": "Page number (1-100000)", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
@console_ns.expect(_create_pagination_parser())
@console_ns.doc("get_workflow_variables")
@console_ns.doc(description="Get draft workflow variables")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.doc(params={"page": "Page number (1-100000)", "limit": "Number of items per page (1-100)"})
@console_ns.response(
200, "Workflow variables retrieved successfully", workflow_draft_variable_list_without_value_model
)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_WITHOUT_VALUE_FIELDS)
@marshal_with(workflow_draft_variable_list_without_value_model)
def get(self, app_model: App):
"""
Get draft workflow
@@ -199,9 +237,9 @@ class WorkflowVariableCollectionApi(Resource):
return workflow_vars
@api.doc("delete_workflow_variables")
@api.doc(description="Delete all draft workflow variables")
@api.response(204, "Workflow variables deleted successfully")
@console_ns.doc("delete_workflow_variables")
@console_ns.doc(description="Delete all draft workflow variables")
@console_ns.response(204, "Workflow variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App):
draft_var_srv = WorkflowDraftVariableService(
@@ -232,12 +270,12 @@ def validate_node_id(node_id: str) -> NoReturn | None:
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/nodes/<string:node_id>/variables")
class NodeVariableCollectionApi(Resource):
@api.doc("get_node_variables")
@api.doc(description="Get variables for a specific node")
@api.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@api.response(200, "Node variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@console_ns.doc("get_node_variables")
@console_ns.doc(description="Get variables for a specific node")
@console_ns.doc(params={"app_id": "Application ID", "node_id": "Node ID"})
@console_ns.response(200, "Node variables retrieved successfully", workflow_draft_variable_list_model)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@marshal_with(workflow_draft_variable_list_model)
def get(self, app_model: App, node_id: str):
validate_node_id(node_id)
with Session(bind=db.engine, expire_on_commit=False) as session:
@@ -248,9 +286,9 @@ class NodeVariableCollectionApi(Resource):
return node_vars
@api.doc("delete_node_variables")
@api.doc(description="Delete all variables for a specific node")
@api.response(204, "Node variables deleted successfully")
@console_ns.doc("delete_node_variables")
@console_ns.doc(description="Delete all variables for a specific node")
@console_ns.response(204, "Node variables deleted successfully")
@_api_prerequisite
def delete(self, app_model: App, node_id: str):
validate_node_id(node_id)
@@ -265,13 +303,13 @@ class VariableApi(Resource):
_PATCH_NAME_FIELD = "name"
_PATCH_VALUE_FIELD = "value"
@api.doc("get_variable")
@api.doc(description="Get a specific workflow variable")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@console_ns.doc("get_variable")
@console_ns.doc(description="Get a specific workflow variable")
@console_ns.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@console_ns.response(200, "Variable retrieved successfully", workflow_draft_variable_model)
@console_ns.response(404, "Variable not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
@marshal_with(workflow_draft_variable_model)
def get(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
session=db.session(),
@@ -283,10 +321,10 @@ class VariableApi(Resource):
raise NotFoundError(description=f"variable not found, id={variable_id}")
return variable
@api.doc("update_variable")
@api.doc(description="Update a workflow variable")
@api.expect(
api.model(
@console_ns.doc("update_variable")
@console_ns.doc(description="Update a workflow variable")
@console_ns.expect(
console_ns.model(
"UpdateVariableRequest",
{
"name": fields.String(description="Variable name"),
@@ -294,10 +332,10 @@ class VariableApi(Resource):
},
)
)
@api.response(200, "Variable updated successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(404, "Variable not found")
@console_ns.response(200, "Variable updated successfully", workflow_draft_variable_model)
@console_ns.response(404, "Variable not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_FIELDS)
@marshal_with(workflow_draft_variable_model)
def patch(self, app_model: App, variable_id: str):
# Request payload for file types:
#
@@ -320,10 +358,11 @@ class VariableApi(Resource):
# "upload_file_id": "1602650a-4fe4-423c-85a2-af76c083e3c4"
# }
parser = reqparse.RequestParser()
parser.add_argument(self._PATCH_NAME_FIELD, type=str, required=False, nullable=True, location="json")
# Parse 'value' field as-is to maintain its original data structure
parser.add_argument(self._PATCH_VALUE_FIELD, type=lambda x: x, required=False, nullable=True, location="json")
parser = (
reqparse.RequestParser()
.add_argument(self._PATCH_NAME_FIELD, type=str, required=False, nullable=True, location="json")
.add_argument(self._PATCH_VALUE_FIELD, type=lambda x: x, required=False, nullable=True, location="json")
)
draft_var_srv = WorkflowDraftVariableService(
session=db.session(),
@@ -358,10 +397,10 @@ class VariableApi(Resource):
db.session.commit()
return variable
@api.doc("delete_variable")
@api.doc(description="Delete a workflow variable")
@api.response(204, "Variable deleted successfully")
@api.response(404, "Variable not found")
@console_ns.doc("delete_variable")
@console_ns.doc(description="Delete a workflow variable")
@console_ns.response(204, "Variable deleted successfully")
@console_ns.response(404, "Variable not found")
@_api_prerequisite
def delete(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -379,12 +418,12 @@ class VariableApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/variables/<uuid:variable_id>/reset")
class VariableResetApi(Resource):
@api.doc("reset_variable")
@api.doc(description="Reset a workflow variable to its default value")
@api.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@api.response(200, "Variable reset successfully", _WORKFLOW_DRAFT_VARIABLE_FIELDS)
@api.response(204, "Variable reset (no content)")
@api.response(404, "Variable not found")
@console_ns.doc("reset_variable")
@console_ns.doc(description="Reset a workflow variable to its default value")
@console_ns.doc(params={"app_id": "Application ID", "variable_id": "Variable ID"})
@console_ns.response(200, "Variable reset successfully", workflow_draft_variable_model)
@console_ns.response(204, "Variable reset (no content)")
@console_ns.response(404, "Variable not found")
@_api_prerequisite
def put(self, app_model: App, variable_id: str):
draft_var_srv = WorkflowDraftVariableService(
@@ -408,7 +447,7 @@ class VariableResetApi(Resource):
if resetted is None:
return Response("", 204)
else:
return marshal(resetted, _WORKFLOW_DRAFT_VARIABLE_FIELDS)
return marshal(resetted, workflow_draft_variable_model)
def _get_variable_list(app_model: App, node_id) -> WorkflowDraftVariableList:
@@ -427,13 +466,13 @@ def _get_variable_list(app_model: App, node_id) -> WorkflowDraftVariableList:
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/conversation-variables")
class ConversationVariableCollectionApi(Resource):
@api.doc("get_conversation_variables")
@api.doc(description="Get conversation variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Conversation variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@api.response(404, "Draft workflow not found")
@console_ns.doc("get_conversation_variables")
@console_ns.doc(description="Get conversation variables for workflow")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Conversation variables retrieved successfully", workflow_draft_variable_list_model)
@console_ns.response(404, "Draft workflow not found")
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@marshal_with(workflow_draft_variable_list_model)
def get(self, app_model: App):
# NOTE(QuantumGhost): Prefill conversation variables into the draft variables table
# so their IDs can be returned to the caller.
@@ -449,23 +488,23 @@ class ConversationVariableCollectionApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/system-variables")
class SystemVariableCollectionApi(Resource):
@api.doc("get_system_variables")
@api.doc(description="Get system variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "System variables retrieved successfully", _WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@console_ns.doc("get_system_variables")
@console_ns.doc(description="Get system variables for workflow")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "System variables retrieved successfully", workflow_draft_variable_list_model)
@_api_prerequisite
@marshal_with(_WORKFLOW_DRAFT_VARIABLE_LIST_FIELDS)
@marshal_with(workflow_draft_variable_list_model)
def get(self, app_model: App):
return _get_variable_list(app_model, SYSTEM_VARIABLE_NODE_ID)
@console_ns.route("/apps/<uuid:app_id>/workflows/draft/environment-variables")
class EnvironmentVariableCollectionApi(Resource):
@api.doc("get_environment_variables")
@api.doc(description="Get environment variables for workflow")
@api.doc(params={"app_id": "Application ID"})
@api.response(200, "Environment variables retrieved successfully")
@api.response(404, "Draft workflow not found")
@console_ns.doc("get_environment_variables")
@console_ns.doc(description="Get environment variables for workflow")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.response(200, "Environment variables retrieved successfully")
@console_ns.response(404, "Draft workflow not found")
@_api_prerequisite
def get(self, app_model: App):
"""


@@ -1,90 +1,341 @@
from typing import cast
from typing import Literal, cast
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from flask_restx.inputs import int_range
from flask import request
from flask_restx import Resource, fields, marshal_with
from pydantic import BaseModel, Field, field_validator
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from fields.end_user_fields import simple_end_user_fields
from fields.member_fields import simple_account_fields
from fields.workflow_run_fields import (
advanced_chat_workflow_run_for_list_fields,
advanced_chat_workflow_run_pagination_fields,
workflow_run_count_fields,
workflow_run_detail_fields,
workflow_run_for_list_fields,
workflow_run_node_execution_fields,
workflow_run_node_execution_list_fields,
workflow_run_pagination_fields,
)
from libs.custom_inputs import time_duration
from libs.helper import uuid_value
from libs.login import login_required
from models import Account, App, AppMode, EndUser
from libs.login import current_user, login_required
from models import Account, App, AppMode, EndUser, WorkflowRunTriggeredFrom
from services.workflow_run_service import WorkflowRunService
# Workflow run status choices for filtering
WORKFLOW_RUN_STATUS_CHOICES = ["running", "succeeded", "failed", "stopped", "partial-succeeded"]
# Register models for flask_restx to avoid dict type issues in Swagger
# Register in dependency order: base models first, then dependent models
# Base models
simple_account_model = console_ns.model("SimpleAccount", simple_account_fields)
simple_end_user_model = console_ns.model("SimpleEndUser", simple_end_user_fields)
# Models that depend on simple_account_fields
workflow_run_for_list_fields_copy = workflow_run_for_list_fields.copy()
workflow_run_for_list_fields_copy["created_by_account"] = fields.Nested(
simple_account_model, attribute="created_by_account", allow_null=True
)
workflow_run_for_list_model = console_ns.model("WorkflowRunForList", workflow_run_for_list_fields_copy)
advanced_chat_workflow_run_for_list_fields_copy = advanced_chat_workflow_run_for_list_fields.copy()
advanced_chat_workflow_run_for_list_fields_copy["created_by_account"] = fields.Nested(
simple_account_model, attribute="created_by_account", allow_null=True
)
advanced_chat_workflow_run_for_list_model = console_ns.model(
"AdvancedChatWorkflowRunForList", advanced_chat_workflow_run_for_list_fields_copy
)
workflow_run_detail_fields_copy = workflow_run_detail_fields.copy()
workflow_run_detail_fields_copy["created_by_account"] = fields.Nested(
simple_account_model, attribute="created_by_account", allow_null=True
)
workflow_run_detail_fields_copy["created_by_end_user"] = fields.Nested(
simple_end_user_model, attribute="created_by_end_user", allow_null=True
)
workflow_run_detail_model = console_ns.model("WorkflowRunDetail", workflow_run_detail_fields_copy)
workflow_run_node_execution_fields_copy = workflow_run_node_execution_fields.copy()
workflow_run_node_execution_fields_copy["created_by_account"] = fields.Nested(
simple_account_model, attribute="created_by_account", allow_null=True
)
workflow_run_node_execution_fields_copy["created_by_end_user"] = fields.Nested(
simple_end_user_model, attribute="created_by_end_user", allow_null=True
)
workflow_run_node_execution_model = console_ns.model(
"WorkflowRunNodeExecution", workflow_run_node_execution_fields_copy
)
# Simple models without nested dependencies
workflow_run_count_model = console_ns.model("WorkflowRunCount", workflow_run_count_fields)
# Pagination models that depend on list models
advanced_chat_workflow_run_pagination_fields_copy = advanced_chat_workflow_run_pagination_fields.copy()
advanced_chat_workflow_run_pagination_fields_copy["data"] = fields.List(
fields.Nested(advanced_chat_workflow_run_for_list_model), attribute="data"
)
advanced_chat_workflow_run_pagination_model = console_ns.model(
"AdvancedChatWorkflowRunPagination", advanced_chat_workflow_run_pagination_fields_copy
)
workflow_run_pagination_fields_copy = workflow_run_pagination_fields.copy()
workflow_run_pagination_fields_copy["data"] = fields.List(fields.Nested(workflow_run_for_list_model), attribute="data")
workflow_run_pagination_model = console_ns.model("WorkflowRunPagination", workflow_run_pagination_fields_copy)
workflow_run_node_execution_list_fields_copy = workflow_run_node_execution_list_fields.copy()
workflow_run_node_execution_list_fields_copy["data"] = fields.List(fields.Nested(workflow_run_node_execution_model))
workflow_run_node_execution_list_model = console_ns.model(
"WorkflowRunNodeExecutionList", workflow_run_node_execution_list_fields_copy
)
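Registration order matters because fields.Nested needs an already-registered model for its Swagger ref to resolve; nesting a raw fields dict is presumably what the "dict type issues" comment refers to. A minimal illustration of the copy-and-override pattern used throughout this block:

```python
from flask_restx import Namespace, fields

ns = Namespace("demo")

# 1. Register the base model first.
account_model = ns.model("SimpleAccount", {"name": fields.String})

# 2. Copy the shared fields dict, replace the nested entry with the
#    registered model, then register the dependent model.
run_fields = {"id": fields.String, "created_by_account": fields.Raw}
run_fields_copy = run_fields.copy()
run_fields_copy["created_by_account"] = fields.Nested(account_model, allow_null=True)
run_model = ns.model("WorkflowRunForList", run_fields_copy)
```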
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class WorkflowRunListQuery(BaseModel):
last_id: str | None = Field(default=None, description="Last run ID for pagination")
limit: int = Field(default=20, ge=1, le=100, description="Number of items per page (1-100)")
status: Literal["running", "succeeded", "failed", "stopped", "partial-succeeded"] | None = Field(
default=None, description="Workflow run status filter"
)
triggered_from: Literal["debugging", "app-run"] | None = Field(
default=None, description="Filter by trigger source: debugging or app-run"
)
@field_validator("last_id")
@classmethod
def validate_last_id(cls, value: str | None) -> str | None:
if value is None:
return value
return uuid_value(value)
class WorkflowRunCountQuery(BaseModel):
status: Literal["running", "succeeded", "failed", "stopped", "partial-succeeded"] | None = Field(
default=None, description="Workflow run status filter"
)
time_range: str | None = Field(default=None, description="Time range filter (e.g., 7d, 4h, 30m, 30s)")
triggered_from: Literal["debugging", "app-run"] | None = Field(
default=None, description="Filter by trigger source: debugging or app-run"
)
@field_validator("time_range")
@classmethod
def validate_time_range(cls, value: str | None) -> str | None:
if value is None:
return value
return time_duration(value)
console_ns.schema_model(
WorkflowRunListQuery.__name__, WorkflowRunListQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0)
)
console_ns.schema_model(
WorkflowRunCountQuery.__name__,
WorkflowRunCountQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflow-runs")
class AdvancedChatAppWorkflowRunListApi(Resource):
@api.doc("get_advanced_chat_workflow_runs")
@api.doc(description="Get advanced chat workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", advanced_chat_workflow_run_pagination_fields)
@console_ns.doc("get_advanced_chat_workflow_runs")
@console_ns.doc(description="Get advanced chat workflow run list")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@console_ns.doc(
params={"status": "Filter by status (optional): running, succeeded, failed, stopped, partial-succeeded"}
)
@console_ns.doc(
params={"triggered_from": "Filter by trigger source (optional): debugging or app-run. Default: debugging"}
)
@console_ns.expect(console_ns.models[WorkflowRunListQuery.__name__])
@console_ns.response(200, "Workflow runs retrieved successfully", advanced_chat_workflow_run_pagination_model)
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT])
@marshal_with(advanced_chat_workflow_run_pagination_fields)
@marshal_with(advanced_chat_workflow_run_pagination_model)
def get(self, app_model: App):
"""
Get advanced chat app workflow run list
"""
parser = reqparse.RequestParser()
parser.add_argument("last_id", type=uuid_value, location="args")
parser.add_argument("limit", type=int_range(1, 100), required=False, default=20, location="args")
args = parser.parse_args()
args_model = WorkflowRunListQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
args = args_model.model_dump(exclude_none=True)
# Default to DEBUGGING if not specified
triggered_from = (
WorkflowRunTriggeredFrom(args_model.triggered_from)
if args_model.triggered_from
else WorkflowRunTriggeredFrom.DEBUGGING
)
workflow_run_service = WorkflowRunService()
result = workflow_run_service.get_paginate_advanced_chat_workflow_runs(app_model=app_model, args=args)
result = workflow_run_service.get_paginate_advanced_chat_workflow_runs(
app_model=app_model, args=args, triggered_from=triggered_from
)
return result
@console_ns.route("/apps/<uuid:app_id>/advanced-chat/workflow-runs/count")
class AdvancedChatAppWorkflowRunCountApi(Resource):
@console_ns.doc("get_advanced_chat_workflow_runs_count")
@console_ns.doc(description="Get advanced chat workflow runs count statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.doc(
params={"status": "Filter by status (optional): running, succeeded, failed, stopped, partial-succeeded"}
)
@console_ns.doc(
params={
"time_range": (
"Filter by time range (optional): e.g., 7d (7 days), 4h (4 hours), "
"30m (30 minutes), 30s (30 seconds). Filters by created_at field."
)
}
)
@console_ns.doc(
params={"triggered_from": "Filter by trigger source (optional): debugging or app-run. Default: debugging"}
)
@console_ns.response(200, "Workflow runs count retrieved successfully", workflow_run_count_model)
@console_ns.expect(console_ns.models[WorkflowRunCountQuery.__name__])
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT])
@marshal_with(workflow_run_count_model)
def get(self, app_model: App):
"""
Get advanced chat workflow runs count statistics
"""
args_model = WorkflowRunCountQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
args = args_model.model_dump(exclude_none=True)
# Default to DEBUGGING if not specified
triggered_from = (
WorkflowRunTriggeredFrom(args_model.triggered_from)
if args_model.triggered_from
else WorkflowRunTriggeredFrom.DEBUGGING
)
workflow_run_service = WorkflowRunService()
result = workflow_run_service.get_workflow_runs_count(
app_model=app_model,
status=args.get("status"),
time_range=args.get("time_range"),
triggered_from=triggered_from,
)
return result
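The default-to-DEBUGGING branch recurs in each handler; the coercion works because the enum is looked up by its string value. A stand-in sketch (the real enum lives in the models package):

```python
from enum import StrEnum


class WorkflowRunTriggeredFrom(StrEnum):  # stand-in mirroring the documented values
    DEBUGGING = "debugging"
    APP_RUN = "app-run"


raw: str | None = None  # e.g. triggered_from absent from the query string
triggered_from = WorkflowRunTriggeredFrom(raw) if raw else WorkflowRunTriggeredFrom.DEBUGGING
assert triggered_from is WorkflowRunTriggeredFrom.DEBUGGING
```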
@console_ns.route("/apps/<uuid:app_id>/workflow-runs")
class WorkflowRunListApi(Resource):
@api.doc("get_workflow_runs")
@api.doc(description="Get workflow run list")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@api.response(200, "Workflow runs retrieved successfully", workflow_run_pagination_fields)
@console_ns.doc("get_workflow_runs")
@console_ns.doc(description="Get workflow run list")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.doc(params={"last_id": "Last run ID for pagination", "limit": "Number of items per page (1-100)"})
@console_ns.doc(
params={"status": "Filter by status (optional): running, succeeded, failed, stopped, partial-succeeded"}
)
@console_ns.doc(
params={"triggered_from": "Filter by trigger source (optional): debugging or app-run. Default: debugging"}
)
@console_ns.response(200, "Workflow runs retrieved successfully", workflow_run_pagination_model)
@console_ns.expect(console_ns.models[WorkflowRunListQuery.__name__])
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_run_pagination_fields)
@marshal_with(workflow_run_pagination_model)
def get(self, app_model: App):
"""
Get workflow run list
"""
parser = reqparse.RequestParser()
parser.add_argument("last_id", type=uuid_value, location="args")
parser.add_argument("limit", type=int_range(1, 100), required=False, default=20, location="args")
args = parser.parse_args()
args_model = WorkflowRunListQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
args = args_model.model_dump(exclude_none=True)
# Default to DEBUGGING for workflow if not specified (backward compatibility)
triggered_from = (
WorkflowRunTriggeredFrom(args_model.triggered_from)
if args_model.triggered_from
else WorkflowRunTriggeredFrom.DEBUGGING
)
workflow_run_service = WorkflowRunService()
result = workflow_run_service.get_paginate_workflow_runs(app_model=app_model, args=args)
result = workflow_run_service.get_paginate_workflow_runs(
app_model=app_model, args=args, triggered_from=triggered_from
)
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/count")
class WorkflowRunCountApi(Resource):
@console_ns.doc("get_workflow_runs_count")
@console_ns.doc(description="Get workflow runs count statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.doc(
params={"status": "Filter by status (optional): running, succeeded, failed, stopped, partial-succeeded"}
)
@console_ns.doc(
params={
"time_range": (
"Filter by time range (optional): e.g., 7d (7 days), 4h (4 hours), "
"30m (30 minutes), 30s (30 seconds). Filters by created_at field."
)
}
)
@console_ns.doc(
params={"triggered_from": "Filter by trigger source (optional): debugging or app-run. Default: debugging"}
)
@console_ns.response(200, "Workflow runs count retrieved successfully", workflow_run_count_model)
@console_ns.expect(console_ns.models[WorkflowRunCountQuery.__name__])
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_run_count_model)
def get(self, app_model: App):
"""
Get workflow runs count statistics
"""
args_model = WorkflowRunCountQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
args = args_model.model_dump(exclude_none=True)
# Default to DEBUGGING for workflow if not specified (backward compatibility)
triggered_from = (
WorkflowRunTriggeredFrom(args_model.triggered_from)
if args_model.triggered_from
else WorkflowRunTriggeredFrom.DEBUGGING
)
workflow_run_service = WorkflowRunService()
result = workflow_run_service.get_workflow_runs_count(
app_model=app_model,
status=args.get("status"),
time_range=args.get("time_range"),
triggered_from=triggered_from,
)
return result
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>")
class WorkflowRunDetailApi(Resource):
@api.doc("get_workflow_run_detail")
@api.doc(description="Get workflow run detail")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Workflow run detail retrieved successfully", workflow_run_detail_fields)
@api.response(404, "Workflow run not found")
@console_ns.doc("get_workflow_run_detail")
@console_ns.doc(description="Get workflow run detail")
@console_ns.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@console_ns.response(200, "Workflow run detail retrieved successfully", workflow_run_detail_model)
@console_ns.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_run_detail_fields)
@marshal_with(workflow_run_detail_model)
def get(self, app_model: App, run_id):
"""
Get workflow run detail
@@ -99,16 +350,16 @@ class WorkflowRunDetailApi(Resource):
@console_ns.route("/apps/<uuid:app_id>/workflow-runs/<uuid:run_id>/node-executions")
class WorkflowRunNodeExecutionListApi(Resource):
@api.doc("get_workflow_run_node_executions")
@api.doc(description="Get workflow run node execution list")
@api.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@api.response(200, "Node executions retrieved successfully", workflow_run_node_execution_list_fields)
@api.response(404, "Workflow run not found")
@console_ns.doc("get_workflow_run_node_executions")
@console_ns.doc(description="Get workflow run node execution list")
@console_ns.doc(params={"app_id": "Application ID", "run_id": "Workflow run ID"})
@console_ns.response(200, "Node executions retrieved successfully", workflow_run_node_execution_list_model)
@console_ns.response(404, "Workflow run not found")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.ADVANCED_CHAT, AppMode.WORKFLOW])
@marshal_with(workflow_run_node_execution_list_fields)
@marshal_with(workflow_run_node_execution_list_model)
def get(self, app_model: App, run_id):
"""
Get workflow run node execution list


@@ -1,311 +1,194 @@
from datetime import datetime
from decimal import Decimal
from flask import abort, jsonify, request
from flask_restx import Resource
from pydantic import BaseModel, Field, field_validator
from sqlalchemy.orm import sessionmaker
import pytz
import sqlalchemy as sa
from flask import jsonify
from flask_login import current_user
from flask_restx import Resource, reqparse
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.wraps import get_app_model
from controllers.console.wraps import account_initialization_required, setup_required
from extensions.ext_database import db
from libs.helper import DatetimeString
from libs.login import login_required
from libs.datetime_utils import parse_time_range
from libs.login import current_account_with_tenant, login_required
from models.enums import WorkflowRunTriggeredFrom
from models.model import AppMode
from repositories.factory import DifyAPIRepositoryFactory
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class WorkflowStatisticQuery(BaseModel):
start: str | None = Field(default=None, description="Start date and time (YYYY-MM-DD HH:MM)")
end: str | None = Field(default=None, description="End date and time (YYYY-MM-DD HH:MM)")
@field_validator("start", "end", mode="before")
@classmethod
def blank_to_none(cls, value: str | None) -> str | None:
if value == "":
return None
return value
console_ns.schema_model(
WorkflowStatisticQuery.__name__,
WorkflowStatisticQuery.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0),
)
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-conversations")
class WorkflowDailyRunsStatistic(Resource):
@api.doc("get_workflow_daily_runs_statistic")
@api.doc(description="Get workflow daily runs statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily runs statistics retrieved successfully")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
session_maker = sessionmaker(bind=db.engine, expire_on_commit=False)
self._workflow_run_repo = DifyAPIRepositoryFactory.create_api_workflow_run_repository(session_maker)
@console_ns.doc("get_workflow_daily_runs_statistic")
@console_ns.doc(description="Get workflow daily runs statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[WorkflowStatisticQuery.__name__])
@console_ns.response(200, "Daily runs statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
account = current_user
account, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
args = WorkflowStatisticQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(id) AS runs
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
"triggered_from": WorkflowRunTriggeredFrom.APP_RUN.value,
}
assert account.timezone is not None
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
try:
start_date, end_date = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date ORDER BY date"
response_data = []
with db.engine.begin() as conn:
rs = conn.execute(sa.text(sql_query), arg_dict)
for i in rs:
response_data.append({"date": str(i.date), "runs": i.runs})
response_data = self._workflow_run_repo.get_daily_runs_statistics(
tenant_id=app_model.tenant_id,
app_id=app_model.id,
triggered_from=WorkflowRunTriggeredFrom.APP_RUN,
start_date=start_date,
end_date=end_date,
timezone=account.timezone,
)
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/daily-terminals")
class WorkflowDailyTerminalsStatistic(Resource):
@api.doc("get_workflow_daily_terminals_statistic")
@api.doc(description="Get workflow daily terminals statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily terminals statistics retrieved successfully")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
session_maker = sessionmaker(bind=db.engine, expire_on_commit=False)
self._workflow_run_repo = DifyAPIRepositoryFactory.create_api_workflow_run_repository(session_maker)
@console_ns.doc("get_workflow_daily_terminals_statistic")
@console_ns.doc(description="Get workflow daily terminals statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[WorkflowStatisticQuery.__name__])
@console_ns.response(200, "Daily terminals statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
account = current_user
account, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
args = WorkflowStatisticQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
COUNT(DISTINCT workflow_runs.created_by) AS terminal_count
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
"triggered_from": WorkflowRunTriggeredFrom.APP_RUN.value,
}
assert account.timezone is not None
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
try:
start_date, end_date = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date ORDER BY date"
response_data = []
with db.engine.begin() as conn:
rs = conn.execute(sa.text(sql_query), arg_dict)
for i in rs:
response_data.append({"date": str(i.date), "terminal_count": i.terminal_count})
response_data = self._workflow_run_repo.get_daily_terminals_statistics(
tenant_id=app_model.tenant_id,
app_id=app_model.id,
triggered_from=WorkflowRunTriggeredFrom.APP_RUN,
start_date=start_date,
end_date=end_date,
timezone=account.timezone,
)
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/token-costs")
class WorkflowDailyTokenCostStatistic(Resource):
@api.doc("get_workflow_daily_token_cost_statistic")
@api.doc(description="Get workflow daily token cost statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Daily token cost statistics retrieved successfully")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
session_maker = sessionmaker(bind=db.engine, expire_on_commit=False)
self._workflow_run_repo = DifyAPIRepositoryFactory.create_api_workflow_run_repository(session_maker)
@console_ns.doc("get_workflow_daily_token_cost_statistic")
@console_ns.doc(description="Get workflow daily token cost statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[WorkflowStatisticQuery.__name__])
@console_ns.response(200, "Daily token cost statistics retrieved successfully")
@get_app_model
@setup_required
@login_required
@account_initialization_required
def get(self, app_model):
account = current_user
account, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
args = WorkflowStatisticQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
SUM(workflow_runs.total_tokens) AS token_count
FROM
workflow_runs
WHERE
app_id = :app_id
AND triggered_from = :triggered_from"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
"triggered_from": WorkflowRunTriggeredFrom.APP_RUN.value,
}
assert account.timezone is not None
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
try:
start_date, end_date = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at >= :start"
arg_dict["start"] = start_datetime_utc
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query += " AND created_at < :end"
arg_dict["end"] = end_datetime_utc
sql_query += " GROUP BY date ORDER BY date"
response_data = []
with db.engine.begin() as conn:
rs = conn.execute(sa.text(sql_query), arg_dict)
for i in rs:
response_data.append(
{
"date": str(i.date),
"token_count": i.token_count,
}
)
response_data = self._workflow_run_repo.get_daily_token_cost_statistics(
tenant_id=app_model.tenant_id,
app_id=app_model.id,
triggered_from=WorkflowRunTriggeredFrom.APP_RUN,
start_date=start_date,
end_date=end_date,
timezone=account.timezone,
)
return jsonify({"data": response_data})
@console_ns.route("/apps/<uuid:app_id>/workflow/statistics/average-app-interactions")
class WorkflowAverageAppInteractionStatistic(Resource):
@api.doc("get_workflow_average_app_interaction_statistic")
@api.doc(description="Get workflow average app interaction statistics")
@api.doc(params={"app_id": "Application ID"})
@api.doc(params={"start": "Start date and time (YYYY-MM-DD HH:MM)", "end": "End date and time (YYYY-MM-DD HH:MM)"})
@api.response(200, "Average app interaction statistics retrieved successfully")
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
session_maker = sessionmaker(bind=db.engine, expire_on_commit=False)
self._workflow_run_repo = DifyAPIRepositoryFactory.create_api_workflow_run_repository(session_maker)
@console_ns.doc("get_workflow_average_app_interaction_statistic")
@console_ns.doc(description="Get workflow average app interaction statistics")
@console_ns.doc(params={"app_id": "Application ID"})
@console_ns.expect(console_ns.models[WorkflowStatisticQuery.__name__])
@console_ns.response(200, "Average app interaction statistics retrieved successfully")
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=[AppMode.WORKFLOW])
def get(self, app_model):
account = current_user
account, _ = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument("start", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
parser.add_argument("end", type=DatetimeString("%Y-%m-%d %H:%M"), location="args")
args = parser.parse_args()
args = WorkflowStatisticQuery.model_validate(request.args.to_dict(flat=True)) # type: ignore
sql_query = """SELECT
AVG(sub.interactions) AS interactions,
sub.date
FROM
(
SELECT
DATE(DATE_TRUNC('day', c.created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz )) AS date,
c.created_by,
COUNT(c.id) AS interactions
FROM
workflow_runs c
WHERE
c.app_id = :app_id
AND c.triggered_from = :triggered_from
{{start}}
{{end}}
GROUP BY
date, c.created_by
) sub
GROUP BY
sub.date"""
arg_dict = {
"tz": account.timezone,
"app_id": app_model.id,
"triggered_from": WorkflowRunTriggeredFrom.APP_RUN.value,
}
assert account.timezone is not None
timezone = pytz.timezone(account.timezone)
utc_timezone = pytz.utc
try:
start_date, end_date = parse_time_range(args.start, args.end, account.timezone)
except ValueError as e:
abort(400, description=str(e))
if args["start"]:
start_datetime = datetime.strptime(args["start"], "%Y-%m-%d %H:%M")
start_datetime = start_datetime.replace(second=0)
start_datetime_timezone = timezone.localize(start_datetime)
start_datetime_utc = start_datetime_timezone.astimezone(utc_timezone)
sql_query = sql_query.replace("{{start}}", " AND c.created_at >= :start")
arg_dict["start"] = start_datetime_utc
else:
sql_query = sql_query.replace("{{start}}", "")
if args["end"]:
end_datetime = datetime.strptime(args["end"], "%Y-%m-%d %H:%M")
end_datetime = end_datetime.replace(second=0)
end_datetime_timezone = timezone.localize(end_datetime)
end_datetime_utc = end_datetime_timezone.astimezone(utc_timezone)
sql_query = sql_query.replace("{{end}}", " AND c.created_at < :end")
arg_dict["end"] = end_datetime_utc
else:
sql_query = sql_query.replace("{{end}}", "")
response_data = []
with db.engine.begin() as conn:
rs = conn.execute(sa.text(sql_query), arg_dict)
for i in rs:
response_data.append(
{"date": str(i.date), "interactions": float(i.interactions.quantize(Decimal("0.01")))}
)
response_data = self._workflow_run_repo.get_average_app_interaction_statistics(
tenant_id=app_model.tenant_id,
app_id=app_model.id,
triggered_from=WorkflowRunTriggeredFrom.APP_RUN,
start_date=start_date,
end_date=end_date,
timezone=account.timezone,
)
return jsonify({"data": response_data})
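Each statistics handler now delegates to a repository built by DifyAPIRepositoryFactory instead of assembling raw SQL inline. The repository implementation is outside this diff; a sketch of what a method like get_daily_runs_statistics might encapsulate, assuming it mirrors the query it replaces:

```python
import sqlalchemy as sa


def get_daily_runs_statistics(session_maker, *, tenant_id, app_id, triggered_from,
                              start_date=None, end_date=None, timezone="UTC"):
    # tenant_id is accepted for parity with the new call sites; the SQL this
    # replaces filtered by app_id and triggered_from only.
    sql = """SELECT
        DATE(DATE_TRUNC('day', created_at AT TIME ZONE 'UTC' AT TIME ZONE :tz)) AS date,
        COUNT(id) AS runs
    FROM workflow_runs
    WHERE app_id = :app_id AND triggered_from = :triggered_from"""
    params = {"tz": timezone, "app_id": app_id, "triggered_from": triggered_from.value}
    if start_date:
        sql += " AND created_at >= :start"
        params["start"] = start_date
    if end_date:
        sql += " AND created_at < :end"
        params["end"] = end_date
    sql += " GROUP BY date ORDER BY date"
    with session_maker() as session:
        rows = session.execute(sa.text(sql), params)
        return [{"date": str(row.date), "runs": row.runs} for row in rows]
```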


@@ -0,0 +1,157 @@
import logging
from flask import request
from flask_restx import Resource, marshal_with
from pydantic import BaseModel
from sqlalchemy import select
from sqlalchemy.orm import Session
from werkzeug.exceptions import NotFound
from configs import dify_config
from extensions.ext_database import db
from fields.workflow_trigger_fields import trigger_fields, triggers_list_fields, webhook_trigger_fields
from libs.login import current_user, login_required
from models.enums import AppTriggerStatus
from models.model import Account, App, AppMode
from models.trigger import AppTrigger, WorkflowWebhookTrigger
from .. import console_ns
from ..app.wraps import get_app_model
from ..wraps import account_initialization_required, edit_permission_required, setup_required
logger = logging.getLogger(__name__)
DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
class Parser(BaseModel):
node_id: str
class ParserEnable(BaseModel):
trigger_id: str
enable_trigger: bool
console_ns.schema_model(Parser.__name__, Parser.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
console_ns.schema_model(
ParserEnable.__name__, ParserEnable.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0)
)
@console_ns.route("/apps/<uuid:app_id>/workflows/triggers/webhook")
class WebhookTriggerApi(Resource):
"""Webhook Trigger API"""
@console_ns.expect(console_ns.models[Parser.__name__])
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(webhook_trigger_fields)
def get(self, app_model: App):
"""Get webhook trigger for a node"""
args = Parser.model_validate(request.args.to_dict(flat=True)) # type: ignore
node_id = args.node_id
with Session(db.engine) as session:
# Get webhook trigger for this app and node
webhook_trigger = (
session.query(WorkflowWebhookTrigger)
.where(
WorkflowWebhookTrigger.app_id == app_model.id,
WorkflowWebhookTrigger.node_id == node_id,
)
.first()
)
if not webhook_trigger:
raise NotFound("Webhook trigger not found for this node")
return webhook_trigger
@console_ns.route("/apps/<uuid:app_id>/triggers")
class AppTriggersApi(Resource):
"""App Triggers list API"""
@setup_required
@login_required
@account_initialization_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(triggers_list_fields)
def get(self, app_model: App):
"""Get app triggers list"""
assert isinstance(current_user, Account)
assert current_user.current_tenant_id is not None
with Session(db.engine) as session:
# Get all triggers for this app using select API
triggers = (
session.execute(
select(AppTrigger)
.where(
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
.order_by(AppTrigger.created_at.desc(), AppTrigger.id.desc())
)
.scalars()
.all()
)
# Add computed icon field for each trigger
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
for trigger in triggers:
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return {"data": triggers}
@console_ns.route("/apps/<uuid:app_id>/trigger-enable")
class AppTriggerEnableApi(Resource):
@console_ns.expect(console_ns.models[ParserEnable.__name__], validate=True)
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
@get_app_model(mode=AppMode.WORKFLOW)
@marshal_with(trigger_fields)
def post(self, app_model: App):
"""Update app trigger (enable/disable)"""
args = ParserEnable.model_validate(console_ns.payload)
assert current_user.current_tenant_id is not None
trigger_id = args.trigger_id
with Session(db.engine) as session:
# Find the trigger using select
trigger = session.execute(
select(AppTrigger).where(
AppTrigger.id == trigger_id,
AppTrigger.tenant_id == current_user.current_tenant_id,
AppTrigger.app_id == app_model.id,
)
).scalar_one_or_none()
if not trigger:
raise NotFound("Trigger not found")
# Update status based on enable_trigger boolean
trigger.status = AppTriggerStatus.ENABLED if args.enable_trigger else AppTriggerStatus.DISABLED
session.commit()
session.refresh(trigger)
# Add computed icon field
url_prefix = dify_config.CONSOLE_API_URL + "/console/api/workspaces/current/tool-provider/builtin/"
if trigger.trigger_type == "trigger-plugin":
trigger.icon = url_prefix + trigger.provider_name + "/icon" # type: ignore
else:
trigger.icon = "" # type: ignore
return trigger
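A quick check of the enable payload model defined above, with hypothetical values:

```python
from pydantic import ValidationError

# A well-formed payload validates cleanly.
ParserEnable.model_validate({"trigger_id": "example-id", "enable_trigger": True})

# A missing required field raises; the route also rejects such payloads up
# front, since console_ns.expect(..., validate=True) checks the registered schema.
try:
    ParserEnable.model_validate({"trigger_id": "example-id"})
except ValidationError as exc:
    assert exc.error_count() == 1
```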


@@ -4,28 +4,29 @@ from typing import ParamSpec, TypeVar, Union
from controllers.console.app.error import AppNotFoundError
from extensions.ext_database import db
from libs.login import current_user
from libs.login import current_account_with_tenant
from models import App, AppMode
from models.account import Account
P = ParamSpec("P")
R = TypeVar("R")
P1 = ParamSpec("P1")
R1 = TypeVar("R1")
def _load_app_model(app_id: str) -> App | None:
assert isinstance(current_user, Account)
_, current_tenant_id = current_account_with_tenant()
app_model = (
db.session.query(App)
.where(App.id == app_id, App.tenant_id == current_user.current_tenant_id, App.status == "normal")
.where(App.id == app_id, App.tenant_id == current_tenant_id, App.status == "normal")
.first()
)
return app_model
def get_app_model(view: Callable[P, R] | None = None, *, mode: Union[AppMode, list[AppMode], None] = None):
def decorator(view_func: Callable[P, R]):
def decorator(view_func: Callable[P1, R1]):
@wraps(view_func)
def decorated_view(*args: P.args, **kwargs: P.kwargs):
def decorated_view(*args: P1.args, **kwargs: P1.kwargs):
if not kwargs.get("app_id"):
raise ValueError("missing app_id in path parameters")
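The fresh P1/R1 variables exist because get_app_model can be applied bare or with arguments: the outer P/R bind to the optional view parameter, so reusing them for the inner decorator would tie the two signatures together. A self-contained sketch of the dual-use shape, with simplified names:

```python
from collections.abc import Callable
from functools import wraps
from typing import ParamSpec, TypeVar

P = ParamSpec("P")    # binds to the view when used as a bare decorator
R = TypeVar("R")
P1 = ParamSpec("P1")  # fresh variables for whatever function the inner
R1 = TypeVar("R1")    # decorator eventually wraps


def with_mode(view: Callable[P, R] | None = None, *, mode: str | None = None):
    def decorator(view_func: Callable[P1, R1]) -> Callable[P1, R1]:
        @wraps(view_func)
        def decorated_view(*args: P1.args, **kwargs: P1.kwargs) -> R1:
            return view_func(*args, **kwargs)

        return decorated_view

    if view is None:
        return decorator    # parameterized use: @with_mode(mode="workflow")
    return decorator(view)  # bare use: @with_mode
```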


@@ -2,35 +2,31 @@ from flask import request
from flask_restx import Resource, fields, reqparse
from constants.languages import supported_language
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.error import AlreadyActivateError
from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
from libs.helper import StrLen, email, extract_remote_ip, timezone
from models.account import AccountStatus
from models import AccountStatus
from services.account_service import AccountService, RegisterService
active_check_parser = reqparse.RequestParser()
active_check_parser.add_argument(
"workspace_id", type=str, required=False, nullable=True, location="args", help="Workspace ID"
)
active_check_parser.add_argument(
"email", type=email, required=False, nullable=True, location="args", help="Email address"
)
active_check_parser.add_argument(
"token", type=str, required=True, nullable=False, location="args", help="Activation token"
active_check_parser = (
reqparse.RequestParser()
.add_argument("workspace_id", type=str, required=False, nullable=True, location="args", help="Workspace ID")
.add_argument("email", type=email, required=False, nullable=True, location="args", help="Email address")
.add_argument("token", type=str, required=True, nullable=False, location="args", help="Activation token")
)
@console_ns.route("/activate/check")
class ActivateCheckApi(Resource):
@api.doc("check_activation_token")
@api.doc(description="Check if activation token is valid")
@api.expect(active_check_parser)
@api.response(
@console_ns.doc("check_activation_token")
@console_ns.doc(description="Check if activation token is valid")
@console_ns.expect(active_check_parser)
@console_ns.response(
200,
"Success",
api.model(
console_ns.model(
"ActivationCheckResponse",
{
"is_valid": fields.Boolean(description="Whether token is valid"),
@@ -60,26 +56,26 @@ class ActivateCheckApi(Resource):
return {"is_valid": False}
active_parser = reqparse.RequestParser()
active_parser.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
active_parser.add_argument("email", type=email, required=False, nullable=True, location="json")
active_parser.add_argument("token", type=str, required=True, nullable=False, location="json")
active_parser.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
active_parser.add_argument(
"interface_language", type=supported_language, required=True, nullable=False, location="json"
active_parser = (
reqparse.RequestParser()
.add_argument("workspace_id", type=str, required=False, nullable=True, location="json")
.add_argument("email", type=email, required=False, nullable=True, location="json")
.add_argument("token", type=str, required=True, nullable=False, location="json")
.add_argument("name", type=StrLen(30), required=True, nullable=False, location="json")
.add_argument("interface_language", type=supported_language, required=True, nullable=False, location="json")
.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
)
active_parser.add_argument("timezone", type=timezone, required=True, nullable=False, location="json")
@console_ns.route("/activate")
class ActivateApi(Resource):
@api.doc("activate_account")
@api.doc(description="Activate account with invitation token")
@api.expect(active_parser)
@api.response(
@console_ns.doc("activate_account")
@console_ns.doc(description="Activate account with invitation token")
@console_ns.expect(active_parser)
@console_ns.response(
200,
"Account activated successfully",
api.model(
console_ns.model(
"ActivationResponse",
{
"result": fields.String(description="Operation result"),
@@ -87,7 +83,7 @@ class ActivateApi(Resource):
},
),
)
@api.response(400, "Already activated or invalid token")
@console_ns.response(400, "Already activated or invalid token")
def post(self):
args = active_parser.parse_args()
@@ -103,7 +99,7 @@ class ActivateApi(Resource):
account.interface_language = args["interface_language"]
account.timezone = args["timezone"]
account.interface_theme = "light"
account.status = AccountStatus.ACTIVE.value
account.status = AccountStatus.ACTIVE
account.initialized_at = naive_utc_now()
db.session.commit()
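
The parser rewrites in this file (and throughout the files below) work because flask_restx's RequestParser.add_argument returns the parser itself, which enables the fluent, chained form. A minimal standalone example:

# Chained parser construction; add_argument returns the parser.
from flask_restx import reqparse

parser = (
    reqparse.RequestParser()
    .add_argument("email", type=str, required=True, location="json")
    .add_argument("language", type=str, required=False, location="json")
)
# args = parser.parse_args()  # must be called inside a Flask request context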

View File

@@ -1,10 +1,9 @@
from flask_login import current_user
from flask_restx import Resource, reqparse
from werkzeug.exceptions import Forbidden
from controllers.console import console_ns
from controllers.console.auth.error import ApiKeyAuthFailedError
from libs.login import login_required
from controllers.console.wraps import is_admin_or_owner_required
from libs.login import current_account_with_tenant, login_required
from services.auth.api_key_auth_service import ApiKeyAuthService
from ..wraps import account_initialization_required, setup_required
@@ -16,7 +15,8 @@ class ApiKeyAuthDataSource(Resource):
@login_required
@account_initialization_required
def get(self):
data_source_api_key_bindings = ApiKeyAuthService.get_provider_auth_list(current_user.current_tenant_id)
_, current_tenant_id = current_account_with_tenant()
data_source_api_key_bindings = ApiKeyAuthService.get_provider_auth_list(current_tenant_id)
if data_source_api_key_bindings:
return {
"sources": [
@@ -39,18 +39,20 @@ class ApiKeyAuthDataSourceBinding(Resource):
@setup_required
@login_required
@account_initialization_required
@is_admin_or_owner_required
def post(self):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("category", type=str, required=True, nullable=False, location="json")
parser.add_argument("provider", type=str, required=True, nullable=False, location="json")
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
_, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument("category", type=str, required=True, nullable=False, location="json")
.add_argument("provider", type=str, required=True, nullable=False, location="json")
.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
)
args = parser.parse_args()
ApiKeyAuthService.validate_api_key_auth_args(args)
try:
ApiKeyAuthService.create_provider_auth(current_user.current_tenant_id, args)
ApiKeyAuthService.create_provider_auth(current_tenant_id, args)
except Exception as e:
raise ApiKeyAuthFailedError(str(e))
return {"result": "success"}, 200
@@ -61,11 +63,11 @@ class ApiKeyAuthDataSourceBindingDelete(Resource):
@setup_required
@login_required
@account_initialization_required
@is_admin_or_owner_required
def delete(self, binding_id):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
_, current_tenant_id = current_account_with_tenant()
ApiKeyAuthService.delete_provider_auth(current_user.current_tenant_id, binding_id)
ApiKeyAuthService.delete_provider_auth(current_tenant_id, binding_id)
return {"result": "success"}, 204

View File

@@ -2,12 +2,11 @@ import logging
import httpx
from flask import current_app, redirect, request
from flask_login import current_user
from flask_restx import Resource, fields
from werkzeug.exceptions import Forbidden
from configs import dify_config
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.wraps import is_admin_or_owner_required
from libs.login import login_required
from libs.oauth_data_source import NotionOAuth
@@ -30,23 +29,22 @@ def get_oauth_providers():
@console_ns.route("/oauth/data-source/<string:provider>")
class OAuthDataSource(Resource):
@api.doc("oauth_data_source")
@api.doc(description="Get OAuth authorization URL for data source provider")
@api.doc(params={"provider": "Data source provider name (notion)"})
@api.response(
@console_ns.doc("oauth_data_source")
@console_ns.doc(description="Get OAuth authorization URL for data source provider")
@console_ns.doc(params={"provider": "Data source provider name (notion)"})
@console_ns.response(
200,
"Authorization URL or internal setup success",
api.model(
console_ns.model(
"OAuthDataSourceResponse",
{"data": fields.Raw(description="Authorization URL or 'internal' for internal setup")},
),
)
@api.response(400, "Invalid provider")
@api.response(403, "Admin privileges required")
@console_ns.response(400, "Invalid provider")
@console_ns.response(403, "Admin privileges required")
@is_admin_or_owner_required
def get(self, provider: str):
# The role of the current user in the table must be admin or owner
if not current_user.is_admin_or_owner:
raise Forbidden()
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
oauth_provider = OAUTH_DATASOURCE_PROVIDERS.get(provider)
@@ -65,17 +63,17 @@ class OAuthDataSource(Resource):
@console_ns.route("/oauth/data-source/callback/<string:provider>")
class OAuthDataSourceCallback(Resource):
@api.doc("oauth_data_source_callback")
@api.doc(description="Handle OAuth callback from data source provider")
@api.doc(
@console_ns.doc("oauth_data_source_callback")
@console_ns.doc(description="Handle OAuth callback from data source provider")
@console_ns.doc(
params={
"provider": "Data source provider name (notion)",
"code": "Authorization code from OAuth provider",
"error": "Error message from OAuth provider",
}
)
@api.response(302, "Redirect to console with result")
@api.response(400, "Invalid provider")
@console_ns.response(302, "Redirect to console with result")
@console_ns.response(400, "Invalid provider")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -96,17 +94,17 @@ class OAuthDataSourceCallback(Resource):
@console_ns.route("/oauth/data-source/binding/<string:provider>")
class OAuthDataSourceBinding(Resource):
@api.doc("oauth_data_source_binding")
@api.doc(description="Bind OAuth data source with authorization code")
@api.doc(
@console_ns.doc("oauth_data_source_binding")
@console_ns.doc(description="Bind OAuth data source with authorization code")
@console_ns.doc(
params={"provider": "Data source provider name (notion)", "code": "Authorization code from OAuth provider"}
)
@api.response(
@console_ns.response(
200,
"Data source binding success",
api.model("OAuthDataSourceBindingResponse", {"result": fields.String(description="Operation result")}),
console_ns.model("OAuthDataSourceBindingResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or code")
@console_ns.response(400, "Invalid provider or code")
def get(self, provider: str):
OAUTH_DATASOURCE_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -130,15 +128,15 @@ class OAuthDataSourceBinding(Resource):
@console_ns.route("/oauth/data-source/<string:provider>/<uuid:binding_id>/sync")
class OAuthDataSourceSync(Resource):
@api.doc("oauth_data_source_sync")
@api.doc(description="Sync data from OAuth data source")
@api.doc(params={"provider": "Data source provider name (notion)", "binding_id": "Data source binding ID"})
@api.response(
@console_ns.doc("oauth_data_source_sync")
@console_ns.doc(description="Sync data from OAuth data source")
@console_ns.doc(params={"provider": "Data source provider name (notion)", "binding_id": "Data source binding ID"})
@console_ns.response(
200,
"Data source sync success",
api.model("OAuthDataSourceSyncResponse", {"result": fields.String(description="Operation result")}),
console_ns.model("OAuthDataSourceSyncResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid provider or sync failed")
@console_ns.response(400, "Invalid provider or sync failed")
@setup_required
@login_required
@account_initialization_required
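
The mechanical api.* to console_ns.* changes in this file (and several below) scope the Swagger metadata to the console namespace instead of the global API object. A minimal standalone example of the namespace-scoped pattern, with an illustrative route and model:

from flask_restx import Namespace, Resource, fields

console_ns = Namespace("console", path="/console/api")

@console_ns.route("/ping")
class Ping(Resource):
    @console_ns.doc("ping", description="Health check")
    @console_ns.response(
        200,
        "Success",
        console_ns.model("PingResponse", {"result": fields.String(description="Always 'pong'")}),
    )
    def get(self):
        return {"result": "pong"}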

View File

@@ -19,7 +19,7 @@ from controllers.console.wraps import email_password_login_enabled, email_regist
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
from libs.password import valid_password
from models.account import Account
from models import Account
from services.account_service import AccountService
from services.billing_service import BillingService
from services.errors.account import AccountNotFoundError, AccountRegisterError
@@ -31,9 +31,11 @@ class EmailRegisterSendEmailApi(Resource):
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=email, required=True, location="json")
.add_argument("language", type=str, required=False, location="json")
)
args = parser.parse_args()
ip_address = extract_remote_ip(request)
@@ -59,10 +61,12 @@ class EmailRegisterCheckApi(Resource):
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=str, required=True, location="json")
.add_argument("code", type=str, required=True, location="json")
.add_argument("token", type=str, required=True, nullable=False, location="json")
)
args = parser.parse_args()
user_email = args["email"]
@@ -100,10 +104,12 @@ class EmailRegisterResetApi(Resource):
@email_password_login_enabled
@email_register_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("new_password", type=valid_password, required=True, nullable=False, location="json")
parser.add_argument("password_confirm", type=valid_password, required=True, nullable=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("token", type=str, required=True, nullable=False, location="json")
.add_argument("new_password", type=valid_password, required=True, nullable=False, location="json")
.add_argument("password_confirm", type=valid_password, required=True, nullable=False, location="json")
)
args = parser.parse_args()
# Validate passwords match

View File

@@ -6,7 +6,7 @@ from flask_restx import Resource, fields, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.auth.error import (
EmailCodeError,
EmailPasswordResetLimitError,
@@ -20,17 +20,17 @@ from events.tenant_event import tenant_was_created
from extensions.ext_database import db
from libs.helper import email, extract_remote_ip
from libs.password import hash_password, valid_password
from models.account import Account
from models import Account
from services.account_service import AccountService, TenantService
from services.feature_service import FeatureService
@console_ns.route("/forgot-password")
class ForgotPasswordSendEmailApi(Resource):
@api.doc("send_forgot_password_email")
@api.doc(description="Send password reset email")
@api.expect(
api.model(
@console_ns.doc("send_forgot_password_email")
@console_ns.doc(description="Send password reset email")
@console_ns.expect(
console_ns.model(
"ForgotPasswordEmailRequest",
{
"email": fields.String(required=True, description="Email address"),
@@ -38,10 +38,10 @@ class ForgotPasswordSendEmailApi(Resource):
},
)
)
@api.response(
@console_ns.response(
200,
"Email sent successfully",
api.model(
console_ns.model(
"ForgotPasswordEmailResponse",
{
"result": fields.String(description="Operation result"),
@@ -50,13 +50,15 @@ class ForgotPasswordSendEmailApi(Resource):
},
),
)
@api.response(400, "Invalid email or rate limit exceeded")
@console_ns.response(400, "Invalid email or rate limit exceeded")
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=email, required=True, location="json")
.add_argument("language", type=str, required=False, location="json")
)
args = parser.parse_args()
ip_address = extract_remote_ip(request)
@@ -83,10 +85,10 @@ class ForgotPasswordSendEmailApi(Resource):
@console_ns.route("/forgot-password/validity")
class ForgotPasswordCheckApi(Resource):
@api.doc("check_forgot_password_code")
@api.doc(description="Verify password reset code")
@api.expect(
api.model(
@console_ns.doc("check_forgot_password_code")
@console_ns.doc(description="Verify password reset code")
@console_ns.expect(
console_ns.model(
"ForgotPasswordCheckRequest",
{
"email": fields.String(required=True, description="Email address"),
@@ -95,10 +97,10 @@ class ForgotPasswordCheckApi(Resource):
},
)
)
@api.response(
@console_ns.response(
200,
"Code verified successfully",
api.model(
console_ns.model(
"ForgotPasswordCheckResponse",
{
"is_valid": fields.Boolean(description="Whether code is valid"),
@@ -107,14 +109,16 @@ class ForgotPasswordCheckApi(Resource):
},
),
)
@api.response(400, "Invalid code or token")
@console_ns.response(400, "Invalid code or token")
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=str, required=True, location="json")
.add_argument("code", type=str, required=True, location="json")
.add_argument("token", type=str, required=True, nullable=False, location="json")
)
args = parser.parse_args()
user_email = args["email"]
@@ -148,10 +152,10 @@ class ForgotPasswordCheckApi(Resource):
@console_ns.route("/forgot-password/resets")
class ForgotPasswordResetApi(Resource):
@api.doc("reset_password")
@api.doc(description="Reset password with verification token")
@api.expect(
api.model(
@console_ns.doc("reset_password")
@console_ns.doc(description="Reset password with verification token")
@console_ns.expect(
console_ns.model(
"ForgotPasswordResetRequest",
{
"token": fields.String(required=True, description="Verification token"),
@@ -160,19 +164,21 @@ class ForgotPasswordResetApi(Resource):
},
)
)
@api.response(
@console_ns.response(
200,
"Password reset successfully",
api.model("ForgotPasswordResetResponse", {"result": fields.String(description="Operation result")}),
console_ns.model("ForgotPasswordResetResponse", {"result": fields.String(description="Operation result")}),
)
@api.response(400, "Invalid token or password mismatch")
@console_ns.response(400, "Invalid token or password mismatch")
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("token", type=str, required=True, nullable=False, location="json")
parser.add_argument("new_password", type=valid_password, required=True, nullable=False, location="json")
parser.add_argument("password_confirm", type=valid_password, required=True, nullable=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("token", type=str, required=True, nullable=False, location="json")
.add_argument("new_password", type=valid_password, required=True, nullable=False, location="json")
.add_argument("password_confirm", type=valid_password, required=True, nullable=False, location="json")
)
args = parser.parse_args()
# Validate passwords match
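
The passwords-match validation truncated by this hunk is a plain equality check; a minimal sketch (the exception type is an assumption):

# Hedged sketch of the check the comment above introduces.
def ensure_passwords_match(new_password: str, password_confirm: str) -> None:
    if new_password != password_confirm:
        raise ValueError("New password and confirmation do not match.")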

View File

@@ -1,12 +1,10 @@
from typing import cast
import flask_login
from flask import request
from flask import make_response, request
from flask_restx import Resource, reqparse
import services
from configs import dify_config
from constants.languages import languages
from constants.languages import get_valid_language
from controllers.console import console_ns
from controllers.console.auth.error import (
AuthenticationFailedError,
@@ -26,7 +24,16 @@ from controllers.console.error import (
from controllers.console.wraps import email_password_login_enabled, setup_required
from events.tenant_event import tenant_was_created
from libs.helper import email, extract_remote_ip
from models.account import Account
from libs.login import current_account_with_tenant
from libs.token import (
clear_access_token_from_cookie,
clear_csrf_token_from_cookie,
clear_refresh_token_from_cookie,
extract_refresh_token,
set_access_token_to_cookie,
set_csrf_token_to_cookie,
set_refresh_token_to_cookie,
)
from services.account_service import AccountService, RegisterService, TenantService
from services.billing_service import BillingService
from services.errors.account import AccountRegisterError
@@ -42,11 +49,13 @@ class LoginApi(Resource):
@email_password_login_enabled
def post(self):
"""Authenticate user and login."""
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("password", type=str, required=True, location="json")
parser.add_argument("remember_me", type=bool, required=False, default=False, location="json")
parser.add_argument("invite_token", type=str, required=False, default=None, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=email, required=True, location="json")
.add_argument("password", type=str, required=True, location="json")
.add_argument("remember_me", type=bool, required=False, default=False, location="json")
.add_argument("invite_token", type=str, required=False, default=None, location="json")
)
args = parser.parse_args()
if dify_config.BILLING_ENABLED and BillingService.is_email_in_freeze(args["email"]):
@@ -89,19 +98,36 @@ class LoginApi(Resource):
token_pair = AccountService.login(account=account, ip_address=extract_remote_ip(request))
AccountService.reset_login_error_rate_limit(args["email"])
return {"result": "success", "data": token_pair.model_dump()}
# Create response with cookies instead of returning tokens in body
response = make_response({"result": "success"})
set_access_token_to_cookie(request, response, token_pair.access_token)
set_refresh_token_to_cookie(request, response, token_pair.refresh_token)
set_csrf_token_to_cookie(request, response, token_pair.csrf_token)
return response
@console_ns.route("/logout")
class LogoutApi(Resource):
@setup_required
def get(self):
account = cast(Account, flask_login.current_user)
def post(self):
current_user, _ = current_account_with_tenant()
account = current_user
if isinstance(account, flask_login.AnonymousUserMixin):
return {"result": "success"}
AccountService.logout(account=account)
flask_login.logout_user()
return {"result": "success"}
response = make_response({"result": "success"})
else:
AccountService.logout(account=account)
flask_login.logout_user()
response = make_response({"result": "success"})
# Clear cookies on logout
clear_access_token_from_cookie(response)
clear_refresh_token_from_cookie(response)
clear_csrf_token_from_cookie(response)
return response
@console_ns.route("/reset-password")
@@ -109,9 +135,11 @@ class ResetPasswordSendEmailApi(Resource):
@setup_required
@email_password_login_enabled
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=email, required=True, location="json")
.add_argument("language", type=str, required=False, location="json")
)
args = parser.parse_args()
if args["language"] is not None and args["language"] == "zh-Hans":
@@ -137,9 +165,11 @@ class ResetPasswordSendEmailApi(Resource):
class EmailCodeLoginSendEmailApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=email, required=True, location="json")
parser.add_argument("language", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=email, required=True, location="json")
.add_argument("language", type=str, required=False, location="json")
)
args = parser.parse_args()
ip_address = extract_remote_ip(request)
@@ -170,13 +200,17 @@ class EmailCodeLoginSendEmailApi(Resource):
class EmailCodeLoginApi(Resource):
@setup_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("email", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=True, location="json")
parser.add_argument("token", type=str, required=True, location="json")
parser = (
reqparse.RequestParser()
.add_argument("email", type=str, required=True, location="json")
.add_argument("code", type=str, required=True, location="json")
.add_argument("token", type=str, required=True, location="json")
.add_argument("language", type=str, required=False, location="json")
)
args = parser.parse_args()
user_email = args["email"]
language = args["language"]
token_data = AccountService.get_email_code_login_data(args["token"])
if token_data is None:
@@ -210,7 +244,9 @@ class EmailCodeLoginApi(Resource):
if account is None:
try:
account = AccountService.create_account_and_tenant(
email=user_email, name=user_email, interface_language=languages[0]
email=user_email,
name=user_email,
interface_language=get_valid_language(language),
)
except WorkSpaceNotAllowedCreateError:
raise NotAllowedCreateWorkspace()
@@ -220,18 +256,36 @@ class EmailCodeLoginApi(Resource):
raise WorkspacesLimitExceeded()
token_pair = AccountService.login(account, ip_address=extract_remote_ip(request))
AccountService.reset_login_error_rate_limit(args["email"])
return {"result": "success", "data": token_pair.model_dump()}
# Create response with cookies instead of returning tokens in body
response = make_response({"result": "success"})
set_csrf_token_to_cookie(request, response, token_pair.csrf_token)
# Set HTTP-only secure cookies for tokens
set_access_token_to_cookie(request, response, token_pair.access_token)
set_refresh_token_to_cookie(request, response, token_pair.refresh_token)
return response
@console_ns.route("/refresh-token")
class RefreshTokenApi(Resource):
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("refresh_token", type=str, required=True, location="json")
args = parser.parse_args()
# Get refresh token from cookie instead of request body
refresh_token = extract_refresh_token(request)
if not refresh_token:
return {"result": "fail", "message": "No refresh token provided"}, 401
try:
new_token_pair = AccountService.refresh_token(args["refresh_token"])
return {"result": "success", "data": new_token_pair.model_dump()}
new_token_pair = AccountService.refresh_token(refresh_token)
# Create response with new cookies
response = make_response({"result": "success"})
# Update cookies with new tokens
set_csrf_token_to_cookie(request, response, new_token_pair.csrf_token)
set_access_token_to_cookie(request, response, new_token_pair.access_token)
set_refresh_token_to_cookie(request, response, new_token_pair.refresh_token)
return response
except Exception as e:
return {"result": "fail", "data": str(e)}, 401
return {"result": "fail", "message": str(e)}, 401

View File

@@ -14,15 +14,19 @@ from extensions.ext_database import db
from libs.datetime_utils import naive_utc_now
from libs.helper import extract_remote_ip
from libs.oauth import GitHubOAuth, GoogleOAuth, OAuthUserInfo
from models import Account
from models.account import AccountStatus
from libs.token import (
set_access_token_to_cookie,
set_csrf_token_to_cookie,
set_refresh_token_to_cookie,
)
from models import Account, AccountStatus
from services.account_service import AccountService, RegisterService, TenantService
from services.billing_service import BillingService
from services.errors.account import AccountNotFoundError, AccountRegisterError
from services.errors.workspace import WorkSpaceNotAllowedCreateError, WorkSpaceNotFoundError
from services.feature_service import FeatureService
from .. import api, console_ns
from .. import console_ns
logger = logging.getLogger(__name__)
@@ -52,11 +56,13 @@ def get_oauth_providers():
@console_ns.route("/oauth/login/<provider>")
class OAuthLogin(Resource):
@api.doc("oauth_login")
@api.doc(description="Initiate OAuth login process")
@api.doc(params={"provider": "OAuth provider name (github/google)", "invite_token": "Optional invitation token"})
@api.response(302, "Redirect to OAuth authorization URL")
@api.response(400, "Invalid provider")
@console_ns.doc("oauth_login")
@console_ns.doc(description="Initiate OAuth login process")
@console_ns.doc(
params={"provider": "OAuth provider name (github/google)", "invite_token": "Optional invitation token"}
)
@console_ns.response(302, "Redirect to OAuth authorization URL")
@console_ns.response(400, "Invalid provider")
def get(self, provider: str):
invite_token = request.args.get("invite_token") or None
OAUTH_PROVIDERS = get_oauth_providers()
@@ -71,17 +77,17 @@ class OAuthLogin(Resource):
@console_ns.route("/oauth/authorize/<provider>")
class OAuthCallback(Resource):
@api.doc("oauth_callback")
@api.doc(description="Handle OAuth callback and complete login process")
@api.doc(
@console_ns.doc("oauth_callback")
@console_ns.doc(description="Handle OAuth callback and complete login process")
@console_ns.doc(
params={
"provider": "OAuth provider name (github/google)",
"code": "Authorization code from OAuth provider",
"state": "Optional state parameter (used for invite token)",
}
)
@api.response(302, "Redirect to console with access token")
@api.response(400, "OAuth process failed")
@console_ns.response(302, "Redirect to console with access token")
@console_ns.response(400, "OAuth process failed")
def get(self, provider: str):
OAUTH_PROVIDERS = get_oauth_providers()
with current_app.app_context():
@@ -130,11 +136,11 @@ class OAuthCallback(Resource):
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message={e.description}")
# Check account status
if account.status == AccountStatus.BANNED.value:
if account.status == AccountStatus.BANNED:
return redirect(f"{dify_config.CONSOLE_WEB_URL}/signin?message=Account is banned.")
if account.status == AccountStatus.PENDING.value:
account.status = AccountStatus.ACTIVE.value
if account.status == AccountStatus.PENDING:
account.status = AccountStatus.ACTIVE
account.initialized_at = naive_utc_now()
db.session.commit()
@@ -153,9 +159,12 @@ class OAuthCallback(Resource):
ip_address=extract_remote_ip(request),
)
return redirect(
f"{dify_config.CONSOLE_WEB_URL}?access_token={token_pair.access_token}&refresh_token={token_pair.refresh_token}"
)
response = redirect(f"{dify_config.CONSOLE_WEB_URL}")
set_access_token_to_cookie(request, response, token_pair.access_token)
set_refresh_token_to_cookie(request, response, token_pair.refresh_token)
set_csrf_token_to_cookie(request, response, token_pair.csrf_token)
return response
def _get_account_by_openid_or_email(provider: str, user_info: OAuthUserInfo) -> Account | None:
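
The status checks in this callback drop .value (AccountStatus.BANNED instead of AccountStatus.BANNED.value), which is safe when the enum is string-backed. A minimal illustration of that property (the concrete values are assumptions):

# A str-backed enum member compares equal to its underlying value.
from enum import Enum

class AccountStatus(str, Enum):
    PENDING = "pending"
    ACTIVE = "active"
    BANNED = "banned"

assert AccountStatus.BANNED == "banned"
assert "banned" == AccountStatus.BANNED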

View File

@@ -1,16 +1,15 @@
from collections.abc import Callable
from functools import wraps
from typing import Concatenate, ParamSpec, TypeVar, cast
from typing import Concatenate, ParamSpec, TypeVar
import flask_login
from flask import jsonify, request
from flask_restx import Resource, reqparse
from werkzeug.exceptions import BadRequest, NotFound
from controllers.console.wraps import account_initialization_required, setup_required
from core.model_runtime.utils.encoders import jsonable_encoder
from libs.login import login_required
from models.account import Account
from libs.login import current_account_with_tenant, login_required
from models import Account
from models.model import OAuthProviderApp
from services.oauth_server import OAUTH_ACCESS_TOKEN_EXPIRES_IN, OAuthGrantType, OAuthServerService
@@ -24,8 +23,7 @@ T = TypeVar("T")
def oauth_server_client_id_required(view: Callable[Concatenate[T, OAuthProviderApp, P], R]):
@wraps(view)
def decorated(self: T, *args: P.args, **kwargs: P.kwargs):
parser = reqparse.RequestParser()
parser.add_argument("client_id", type=str, required=True, location="json")
parser = reqparse.RequestParser().add_argument("client_id", type=str, required=True, location="json")
parsed_args = parser.parse_args()
client_id = parsed_args.get("client_id")
if not client_id:
@@ -91,8 +89,7 @@ class OAuthServerAppApi(Resource):
@setup_required
@oauth_server_client_id_required
def post(self, oauth_provider_app: OAuthProviderApp):
parser = reqparse.RequestParser()
parser.add_argument("redirect_uri", type=str, required=True, location="json")
parser = reqparse.RequestParser().add_argument("redirect_uri", type=str, required=True, location="json")
parsed_args = parser.parse_args()
redirect_uri = parsed_args.get("redirect_uri")
@@ -116,7 +113,8 @@ class OAuthServerUserAuthorizeApi(Resource):
@account_initialization_required
@oauth_server_client_id_required
def post(self, oauth_provider_app: OAuthProviderApp):
account = cast(Account, flask_login.current_user)
current_user, _ = current_account_with_tenant()
account = current_user
user_account_id = account.id
code = OAuthServerService.sign_oauth_authorization_code(oauth_provider_app.client_id, user_account_id)
@@ -132,12 +130,14 @@ class OAuthServerUserTokenApi(Resource):
@setup_required
@oauth_server_client_id_required
def post(self, oauth_provider_app: OAuthProviderApp):
parser = reqparse.RequestParser()
parser.add_argument("grant_type", type=str, required=True, location="json")
parser.add_argument("code", type=str, required=False, location="json")
parser.add_argument("client_secret", type=str, required=False, location="json")
parser.add_argument("redirect_uri", type=str, required=False, location="json")
parser.add_argument("refresh_token", type=str, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("grant_type", type=str, required=True, location="json")
.add_argument("code", type=str, required=False, location="json")
.add_argument("client_secret", type=str, required=False, location="json")
.add_argument("redirect_uri", type=str, required=False, location="json")
.add_argument("refresh_token", type=str, required=False, location="json")
)
parsed_args = parser.parse_args()
try:
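
The decorator in this file is typed with Concatenate[T, OAuthProviderApp, P], meaning it injects an OAuthProviderApp argument between self and the view's remaining parameters. A standalone sketch of that typing pattern with a dummy injected value:

# Standalone sketch of the Concatenate/ParamSpec decorator pattern.
from collections.abc import Callable
from functools import wraps
from typing import Concatenate, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")
T = TypeVar("T")

def inject_value(view: Callable[Concatenate[T, str, P], R]) -> Callable[Concatenate[T, P], R]:
    @wraps(view)
    def decorated(self: T, *args: P.args, **kwargs: P.kwargs) -> R:
        injected = "stand-in"  # plays the role of the looked-up OAuthProviderApp
        return view(self, injected, *args, **kwargs)
    return decorated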

View File

@@ -1,9 +1,12 @@
from flask_restx import Resource, reqparse
import base64
from flask_restx import Resource, fields, reqparse
from werkzeug.exceptions import BadRequest
from controllers.console import console_ns
from controllers.console.wraps import account_initialization_required, only_edition_cloud, setup_required
from libs.login import current_user, login_required
from models.model import Account
from enums.cloud_plan import CloudPlan
from libs.login import current_account_with_tenant, login_required
from services.billing_service import BillingService
@@ -14,17 +17,21 @@ class Subscription(Resource):
@account_initialization_required
@only_edition_cloud
def get(self):
parser = reqparse.RequestParser()
parser.add_argument("plan", type=str, required=True, location="args", choices=["professional", "team"])
parser.add_argument("interval", type=str, required=True, location="args", choices=["month", "year"])
args = parser.parse_args()
assert isinstance(current_user, Account)
BillingService.is_tenant_owner_or_admin(current_user)
assert current_user.current_tenant_id is not None
return BillingService.get_subscription(
args["plan"], args["interval"], current_user.email, current_user.current_tenant_id
current_user, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument(
"plan",
type=str,
required=True,
location="args",
choices=[CloudPlan.PROFESSIONAL, CloudPlan.TEAM],
)
.add_argument("interval", type=str, required=True, location="args", choices=["month", "year"])
)
args = parser.parse_args()
BillingService.is_tenant_owner_or_admin(current_user)
return BillingService.get_subscription(args["plan"], args["interval"], current_user.email, current_tenant_id)
@console_ns.route("/billing/invoices")
@@ -34,7 +41,40 @@ class Invoices(Resource):
@account_initialization_required
@only_edition_cloud
def get(self):
assert isinstance(current_user, Account)
current_user, current_tenant_id = current_account_with_tenant()
BillingService.is_tenant_owner_or_admin(current_user)
assert current_user.current_tenant_id is not None
return BillingService.get_invoices(current_user.email, current_user.current_tenant_id)
return BillingService.get_invoices(current_user.email, current_tenant_id)
@console_ns.route("/billing/partners/<string:partner_key>/tenants")
class PartnerTenants(Resource):
@console_ns.doc("sync_partner_tenants_bindings")
@console_ns.doc(description="Sync partner tenants bindings")
@console_ns.doc(params={"partner_key": "Partner key"})
@console_ns.expect(
console_ns.model(
"SyncPartnerTenantsBindingsRequest",
{"click_id": fields.String(required=True, description="Click Id from partner referral link")},
)
)
@console_ns.response(200, "Tenants synced to partner successfully")
@console_ns.response(400, "Invalid partner information")
@setup_required
@login_required
@account_initialization_required
@only_edition_cloud
def put(self, partner_key: str):
current_user, _ = current_account_with_tenant()
parser = reqparse.RequestParser().add_argument("click_id", required=True, type=str, location="json")
args = parser.parse_args()
try:
click_id = args["click_id"]
decoded_partner_key = base64.b64decode(partner_key).decode("utf-8")
except Exception:
raise BadRequest("Invalid partner_key")
if not click_id or not decoded_partner_key or not current_user.id:
raise BadRequest("Invalid partner information")
return BillingService.sync_partner_tenants_bindings(current_user.id, decoded_partner_key, click_id)
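
The new PartnerTenants endpoint treats the path segment as a base64-encoded partner key and rejects anything that fails to decode. The decode-and-validate step in isolation:

import base64

# Mirrors the endpoint's validation: any decode failure becomes a bad request.
def decode_partner_key(partner_key: str) -> str:
    try:
        decoded = base64.b64decode(partner_key).decode("utf-8")
    except Exception:
        raise ValueError("Invalid partner_key")
    if not decoded:
        raise ValueError("Invalid partner information")
    return decoded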

View File

@@ -1,9 +1,8 @@
from flask import request
from flask_login import current_user
from flask_restx import Resource, reqparse
from libs.helper import extract_remote_ip
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from services.billing_service import BillingService
from .. import console_ns
@@ -17,17 +16,16 @@ class ComplianceApi(Resource):
@account_initialization_required
@only_edition_cloud
def get(self):
parser = reqparse.RequestParser()
parser.add_argument("doc_name", type=str, required=True, location="args")
current_user, current_tenant_id = current_account_with_tenant()
parser = reqparse.RequestParser().add_argument("doc_name", type=str, required=True, location="args")
args = parser.parse_args()
ip_address = extract_remote_ip(request)
device_info = request.headers.get("User-Agent", "Unknown device")
return BillingService.get_compliance_download_link(
doc_name=args.doc_name,
account_id=current_user.id,
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
ip=ip_address,
device_info=device_info,
)

View File

@@ -3,7 +3,6 @@ from collections.abc import Generator
from typing import cast
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from sqlalchemy import select
from sqlalchemy.orm import Session
@@ -15,12 +14,12 @@ from core.datasource.entities.datasource_entities import DatasourceProviderType,
from core.datasource.online_document.online_document_plugin import OnlineDocumentDatasourcePlugin
from core.indexing_runner import IndexingRunner
from core.rag.extractor.entity.datasource_type import DatasourceType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.extractor.entity.extract_setting import ExtractSetting, NotionInfo
from core.rag.extractor.notion_extractor import NotionExtractor
from extensions.ext_database import db
from fields.data_source_fields import integrate_list_fields, integrate_notion_info_list_fields
from libs.datetime_utils import naive_utc_now
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from models import DataSourceOauthBinding, Document
from services.dataset_service import DatasetService, DocumentService
from services.datasource_provider_service import DatasourceProviderService
@@ -37,10 +36,12 @@ class DataSourceApi(Resource):
@account_initialization_required
@marshal_with(integrate_list_fields)
def get(self):
_, current_tenant_id = current_account_with_tenant()
# get workspace data source integrates
data_source_integrates = db.session.scalars(
select(DataSourceOauthBinding).where(
DataSourceOauthBinding.tenant_id == current_user.current_tenant_id,
DataSourceOauthBinding.tenant_id == current_tenant_id,
DataSourceOauthBinding.disabled == False,
)
).all()
@@ -120,13 +121,15 @@ class DataSourceNotionListApi(Resource):
@account_initialization_required
@marshal_with(integrate_notion_info_list_fields)
def get(self):
current_user, current_tenant_id = current_account_with_tenant()
dataset_id = request.args.get("dataset_id", default=None, type=str)
credential_id = request.args.get("credential_id", default=None, type=str)
if not credential_id:
raise ValueError("Credential id is required.")
datasource_provider_service = DatasourceProviderService()
credential = datasource_provider_service.get_datasource_credentials(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
credential_id=credential_id,
provider="notion_datasource",
plugin_id="langgenius/notion_datasource",
@@ -146,7 +149,7 @@ class DataSourceNotionListApi(Resource):
documents = session.scalars(
select(Document).filter_by(
dataset_id=dataset_id,
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
data_source_type="notion_import",
enabled=True,
)
@@ -161,7 +164,7 @@ class DataSourceNotionListApi(Resource):
datasource_runtime = DatasourceManager.get_datasource_runtime(
provider_id="langgenius/notion_datasource/notion_datasource",
datasource_name="notion_datasource",
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
datasource_type=DatasourceProviderType.ONLINE_DOCUMENT,
)
datasource_provider_service = DatasourceProviderService()
@@ -210,12 +213,14 @@ class DataSourceNotionApi(Resource):
@login_required
@account_initialization_required
def get(self, workspace_id, page_id, page_type):
_, current_tenant_id = current_account_with_tenant()
credential_id = request.args.get("credential_id", default=None, type=str)
if not credential_id:
raise ValueError("Credential id is required.")
datasource_provider_service = DatasourceProviderService()
credential = datasource_provider_service.get_datasource_credentials(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
credential_id=credential_id,
provider="notion_datasource",
plugin_id="langgenius/notion_datasource",
@@ -229,7 +234,7 @@ class DataSourceNotionApi(Resource):
notion_obj_id=page_id,
notion_page_type=page_type,
notion_access_token=credential.get("integration_secret"),
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
)
text_docs = extractor.extract()
@@ -239,12 +244,14 @@ class DataSourceNotionApi(Resource):
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("notion_info_list", type=list, required=True, nullable=True, location="json")
parser.add_argument("process_rule", type=dict, required=True, nullable=True, location="json")
parser.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
parser.add_argument(
"doc_language", type=str, default="English", required=False, nullable=False, location="json"
_, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument("notion_info_list", type=list, required=True, nullable=True, location="json")
.add_argument("process_rule", type=dict, required=True, nullable=True, location="json")
.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
.add_argument("doc_language", type=str, default="English", required=False, nullable=False, location="json")
)
args = parser.parse_args()
# validate args
@@ -256,20 +263,22 @@ class DataSourceNotionApi(Resource):
credential_id = notion_info.get("credential_id")
for page in notion_info["pages"]:
extract_setting = ExtractSetting(
datasource_type=DatasourceType.NOTION.value,
notion_info={
"credential_id": credential_id,
"notion_workspace_id": workspace_id,
"notion_obj_id": page["page_id"],
"notion_page_type": page["type"],
"tenant_id": current_user.current_tenant_id,
},
datasource_type=DatasourceType.NOTION,
notion_info=NotionInfo.model_validate(
{
"credential_id": credential_id,
"notion_workspace_id": workspace_id,
"notion_obj_id": page["page_id"],
"notion_page_type": page["type"],
"tenant_id": current_tenant_id,
}
),
document_model=args["doc_form"],
)
extract_settings.append(extract_setting)
indexing_runner = IndexingRunner()
response = indexing_runner.indexing_estimate(
current_user.current_tenant_id,
current_tenant_id,
extract_settings,
args["process_rule"],
args["doc_form"],

File diff suppressed because it is too large

View File

@@ -6,13 +6,12 @@ from typing import Literal, cast
import sqlalchemy as sa
from flask import request
from flask_login import current_user
from flask_restx import Resource, fields, marshal, marshal_with, reqparse
from sqlalchemy import asc, desc, select
from werkzeug.exceptions import Forbidden, NotFound
import services
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.app.error import (
ProviderModelCurrentlyNotSupportError,
ProviderNotInitializeError,
@@ -44,18 +43,19 @@ from core.model_runtime.entities.model_entities import ModelType
from core.model_runtime.errors.invoke import InvokeAuthorizationError
from core.plugin.impl.exc import PluginDaemonClientSideError
from core.rag.extractor.entity.datasource_type import DatasourceType
from core.rag.extractor.entity.extract_setting import ExtractSetting
from core.rag.extractor.entity.extract_setting import ExtractSetting, NotionInfo, WebsiteInfo
from extensions.ext_database import db
from fields.dataset_fields import dataset_fields
from fields.document_fields import (
dataset_and_document_fields,
document_fields,
document_metadata_fields,
document_status_fields,
document_with_segments_fields,
)
from libs.datetime_utils import naive_utc_now
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from models import Dataset, DatasetProcessRule, Document, DocumentSegment, UploadFile
from models.account import Account
from models.dataset import DocumentPipelineExecutionLog
from services.dataset_service import DatasetService, DocumentService
from services.entities.knowledge_entities.knowledge_entities import KnowledgeConfig
@@ -63,8 +63,39 @@ from services.entities.knowledge_entities.knowledge_entities import KnowledgeCon
logger = logging.getLogger(__name__)
def _get_or_create_model(model_name: str, field_def):
existing = console_ns.models.get(model_name)
if existing is None:
existing = console_ns.model(model_name, field_def)
return existing
# Register models for flask_restx to avoid dict type issues in Swagger
dataset_model = _get_or_create_model("Dataset", dataset_fields)
document_metadata_model = _get_or_create_model("DocumentMetadata", document_metadata_fields)
document_fields_copy = document_fields.copy()
document_fields_copy["doc_metadata"] = fields.List(
fields.Nested(document_metadata_model), attribute="doc_metadata_details"
)
document_model = _get_or_create_model("Document", document_fields_copy)
document_with_segments_fields_copy = document_with_segments_fields.copy()
document_with_segments_fields_copy["doc_metadata"] = fields.List(
fields.Nested(document_metadata_model), attribute="doc_metadata_details"
)
document_with_segments_model = _get_or_create_model("DocumentWithSegments", document_with_segments_fields_copy)
dataset_and_document_fields_copy = dataset_and_document_fields.copy()
dataset_and_document_fields_copy["dataset"] = fields.Nested(dataset_model)
dataset_and_document_fields_copy["documents"] = fields.List(fields.Nested(document_model))
dataset_and_document_model = _get_or_create_model("DatasetAndDocument", dataset_and_document_fields_copy)
class DocumentResource(Resource):
def get_document(self, dataset_id: str, document_id: str) -> Document:
current_user, current_tenant_id = current_account_with_tenant()
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
raise NotFound("Dataset not found.")
@@ -79,12 +110,13 @@ class DocumentResource(Resource):
if not document:
raise NotFound("Document not found.")
if document.tenant_id != current_user.current_tenant_id:
if document.tenant_id != current_tenant_id:
raise Forbidden("No permission.")
return document
def get_batch_documents(self, dataset_id: str, batch: str) -> Sequence[Document]:
current_user, _ = current_account_with_tenant()
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
raise NotFound("Dataset not found.")
@@ -104,14 +136,15 @@ class DocumentResource(Resource):
@console_ns.route("/datasets/process-rule")
class GetProcessRuleApi(Resource):
@api.doc("get_process_rule")
@api.doc(description="Get dataset document processing rules")
@api.doc(params={"document_id": "Document ID (optional)"})
@api.response(200, "Process rules retrieved successfully")
@console_ns.doc("get_process_rule")
@console_ns.doc(description="Get dataset document processing rules")
@console_ns.doc(params={"document_id": "Document ID (optional)"})
@console_ns.response(200, "Process rules retrieved successfully")
@setup_required
@login_required
@account_initialization_required
def get(self):
current_user, _ = current_account_with_tenant()
req_data = request.args
document_id = req_data.get("document_id")
@@ -151,9 +184,9 @@ class GetProcessRuleApi(Resource):
@console_ns.route("/datasets/<uuid:dataset_id>/documents")
class DatasetDocumentListApi(Resource):
@api.doc("get_dataset_documents")
@api.doc(description="Get documents in a dataset")
@api.doc(
@console_ns.doc("get_dataset_documents")
@console_ns.doc(description="Get documents in a dataset")
@console_ns.doc(
params={
"dataset_id": "Dataset ID",
"page": "Page number (default: 1)",
@@ -161,18 +194,20 @@ class DatasetDocumentListApi(Resource):
"keyword": "Search keyword",
"sort": "Sort order (default: -created_at)",
"fetch": "Fetch full details (default: false)",
"status": "Filter documents by display status",
}
)
@api.response(200, "Documents retrieved successfully")
@console_ns.response(200, "Documents retrieved successfully")
@setup_required
@login_required
@account_initialization_required
def get(self, dataset_id):
dataset_id = str(dataset_id)
def get(self, dataset_id: str):
current_user, current_tenant_id = current_account_with_tenant()
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
search = request.args.get("keyword", default=None, type=str)
sort = request.args.get("sort", default="-created_at", type=str)
status = request.args.get("status", default=None, type=str)
# "yes", "true", "t", "y", "1" convert to True, while others convert to False.
try:
fetch_val = request.args.get("fetch", default="false")
@@ -199,7 +234,10 @@ class DatasetDocumentListApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
query = select(Document).filter_by(dataset_id=str(dataset_id), tenant_id=current_user.current_tenant_id)
query = select(Document).filter_by(dataset_id=str(dataset_id), tenant_id=current_tenant_id)
if status:
query = DocumentService.apply_display_status_filter(query, status)
if search:
search = f"%{search}%"
@@ -269,10 +307,11 @@ class DatasetDocumentListApi(Resource):
@setup_required
@login_required
@account_initialization_required
@marshal_with(dataset_and_document_fields)
@marshal_with(dataset_and_document_model)
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def post(self, dataset_id):
current_user, _ = current_account_with_tenant()
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -289,23 +328,23 @@ class DatasetDocumentListApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
parser = reqparse.RequestParser()
parser.add_argument(
"indexing_technique", type=str, choices=Dataset.INDEXING_TECHNIQUE_LIST, nullable=False, location="json"
)
parser.add_argument("data_source", type=dict, required=False, location="json")
parser.add_argument("process_rule", type=dict, required=False, location="json")
parser.add_argument("duplicate", type=bool, default=True, nullable=False, location="json")
parser.add_argument("original_document_id", type=str, required=False, location="json")
parser.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
parser.add_argument("retrieval_model", type=dict, required=False, nullable=False, location="json")
parser.add_argument("embedding_model", type=str, required=False, nullable=True, location="json")
parser.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
parser.add_argument(
"doc_language", type=str, default="English", required=False, nullable=False, location="json"
parser = (
reqparse.RequestParser()
.add_argument(
"indexing_technique", type=str, choices=Dataset.INDEXING_TECHNIQUE_LIST, nullable=False, location="json"
)
.add_argument("data_source", type=dict, required=False, location="json")
.add_argument("process_rule", type=dict, required=False, location="json")
.add_argument("duplicate", type=bool, default=True, nullable=False, location="json")
.add_argument("original_document_id", type=str, required=False, location="json")
.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
.add_argument("retrieval_model", type=dict, required=False, nullable=False, location="json")
.add_argument("embedding_model", type=str, required=False, nullable=True, location="json")
.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
.add_argument("doc_language", type=str, default="English", required=False, nullable=False, location="json")
)
args = parser.parse_args()
knowledge_config = KnowledgeConfig(**args)
knowledge_config = KnowledgeConfig.model_validate(args)
if not dataset.indexing_technique and not knowledge_config.indexing_technique:
raise ValueError("indexing_technique is required.")
@@ -349,10 +388,10 @@ class DatasetDocumentListApi(Resource):
@console_ns.route("/datasets/init")
class DatasetInitApi(Resource):
@api.doc("init_dataset")
@api.doc(description="Initialize dataset with documents")
@api.expect(
api.model(
@console_ns.doc("init_dataset")
@console_ns.doc(description="Initialize dataset with documents")
@console_ns.expect(
console_ns.model(
"DatasetInitRequest",
{
"upload_file_id": fields.String(required=True, description="Upload file ID"),
@@ -362,47 +401,48 @@ class DatasetInitApi(Resource):
},
)
)
@api.response(201, "Dataset initialized successfully", dataset_and_document_fields)
@api.response(400, "Invalid request parameters")
@console_ns.response(201, "Dataset initialized successfully", dataset_and_document_model)
@console_ns.response(400, "Invalid request parameters")
@setup_required
@login_required
@account_initialization_required
@marshal_with(dataset_and_document_fields)
@marshal_with(dataset_and_document_model)
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def post(self):
# The role of the current user in the tenant-account join table must be admin, owner, dataset_operator, or editor
current_user, current_tenant_id = current_account_with_tenant()
if not current_user.is_dataset_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument(
"indexing_technique",
type=str,
choices=Dataset.INDEXING_TECHNIQUE_LIST,
required=True,
nullable=False,
location="json",
parser = (
reqparse.RequestParser()
.add_argument(
"indexing_technique",
type=str,
choices=Dataset.INDEXING_TECHNIQUE_LIST,
required=True,
nullable=False,
location="json",
)
.add_argument("data_source", type=dict, required=True, nullable=True, location="json")
.add_argument("process_rule", type=dict, required=True, nullable=True, location="json")
.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
.add_argument("doc_language", type=str, default="English", required=False, nullable=False, location="json")
.add_argument("retrieval_model", type=dict, required=False, nullable=False, location="json")
.add_argument("embedding_model", type=str, required=False, nullable=True, location="json")
.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
)
parser.add_argument("data_source", type=dict, required=True, nullable=True, location="json")
parser.add_argument("process_rule", type=dict, required=True, nullable=True, location="json")
parser.add_argument("doc_form", type=str, default="text_model", required=False, nullable=False, location="json")
parser.add_argument(
"doc_language", type=str, default="English", required=False, nullable=False, location="json"
)
parser.add_argument("retrieval_model", type=dict, required=False, nullable=False, location="json")
parser.add_argument("embedding_model", type=str, required=False, nullable=True, location="json")
parser.add_argument("embedding_model_provider", type=str, required=False, nullable=True, location="json")
args = parser.parse_args()
knowledge_config = KnowledgeConfig(**args)
knowledge_config = KnowledgeConfig.model_validate(args)
if knowledge_config.indexing_technique == "high_quality":
if knowledge_config.embedding_model is None or knowledge_config.embedding_model_provider is None:
raise ValueError("embedding model and embedding model provider are required for high quality indexing.")
try:
model_manager = ModelManager()
model_manager.get_model_instance(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider=args["embedding_model_provider"],
model_type=ModelType.TEXT_EMBEDDING,
model=args["embedding_model"],
@@ -419,9 +459,9 @@ class DatasetInitApi(Resource):
try:
dataset, documents, batch = DocumentService.save_document_without_dataset_id(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
knowledge_config=knowledge_config,
account=cast(Account, current_user),
account=current_user,
)
except ProviderTokenNotInitError as ex:
raise ProviderNotInitializeError(ex.description)
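Throughout these hunks, direct reads of `flask_login`'s `current_user.current_tenant_id` give way to unpacking `current_account_with_tenant()` from `libs.login`. Judging by the call sites, the helper narrows the login proxy to a concrete `Account` and fails fast when no tenant is active, which is what lets the `cast(Account, current_user)` calls disappear. A rough sketch of that contract (an assumption inferred from usage, not the actual implementation):

```python
# Hypothetical sketch of current_account_with_tenant(), inferred from call sites.
from flask_login import current_user
from werkzeug.exceptions import Unauthorized

def current_account_with_tenant():
    """Return (account, tenant_id); raise instead of handing back None."""
    account = current_user._get_current_object()  # unwrap the proxy
    tenant_id = getattr(account, "current_tenant_id", None)
    if tenant_id is None:
        raise Unauthorized("No authenticated account with an active tenant.")
    return account, tenant_id

# Call sites unpack only what they need:
#   current_user, current_tenant_id = current_account_with_tenant()
#   _, current_tenant_id = current_account_with_tenant()   # account unused
```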
@@ -437,16 +477,17 @@ class DatasetInitApi(Resource):
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-estimate")
class DocumentIndexingEstimateApi(DocumentResource):
@api.doc("estimate_document_indexing")
@api.doc(description="Estimate document indexing cost")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.response(200, "Indexing estimate calculated successfully")
@api.response(404, "Document not found")
@api.response(400, "Document already finished")
@console_ns.doc("estimate_document_indexing")
@console_ns.doc(description="Estimate document indexing cost")
@console_ns.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@console_ns.response(200, "Indexing estimate calculated successfully")
@console_ns.response(404, "Document not found")
@console_ns.response(400, "Document already finished")
@setup_required
@login_required
@account_initialization_required
def get(self, dataset_id, document_id):
_, current_tenant_id = current_account_with_tenant()
dataset_id = str(dataset_id)
document_id = str(document_id)
document = self.get_document(dataset_id, document_id)
@@ -475,14 +516,14 @@ class DocumentIndexingEstimateApi(DocumentResource):
raise NotFound("File not found.")
extract_setting = ExtractSetting(
datasource_type=DatasourceType.FILE.value, upload_file=file, document_model=document.doc_form
datasource_type=DatasourceType.FILE, upload_file=file, document_model=document.doc_form
)
indexing_runner = IndexingRunner()
try:
estimate_response = indexing_runner.indexing_estimate(
current_user.current_tenant_id,
current_tenant_id,
[extract_setting],
data_process_rule_dict,
document.doc_form,
@@ -511,6 +552,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
@login_required
@account_initialization_required
def get(self, dataset_id, batch):
_, current_tenant_id = current_account_with_tenant()
dataset_id = str(dataset_id)
batch = str(batch)
documents = self.get_batch_documents(dataset_id, batch)
@@ -530,7 +572,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
file_id = data_source_info["upload_file_id"]
file_detail = (
db.session.query(UploadFile)
.where(UploadFile.tenant_id == current_user.current_tenant_id, UploadFile.id == file_id)
.where(UploadFile.tenant_id == current_tenant_id, UploadFile.id == file_id)
.first()
)
@@ -538,7 +580,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
raise NotFound("File not found.")
extract_setting = ExtractSetting(
datasource_type=DatasourceType.FILE.value, upload_file=file_detail, document_model=document.doc_form
datasource_type=DatasourceType.FILE, upload_file=file_detail, document_model=document.doc_form
)
extract_settings.append(extract_setting)
@@ -546,14 +588,16 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
if not data_source_info:
continue
extract_setting = ExtractSetting(
datasource_type=DatasourceType.NOTION.value,
notion_info={
"credential_id": data_source_info["credential_id"],
"notion_workspace_id": data_source_info["notion_workspace_id"],
"notion_obj_id": data_source_info["notion_page_id"],
"notion_page_type": data_source_info["type"],
"tenant_id": current_user.current_tenant_id,
},
datasource_type=DatasourceType.NOTION,
notion_info=NotionInfo.model_validate(
{
"credential_id": data_source_info["credential_id"],
"notion_workspace_id": data_source_info["notion_workspace_id"],
"notion_obj_id": data_source_info["notion_page_id"],
"notion_page_type": data_source_info["type"],
"tenant_id": current_tenant_id,
}
),
document_model=document.doc_form,
)
extract_settings.append(extract_setting)
@@ -561,15 +605,17 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
if not data_source_info:
continue
extract_setting = ExtractSetting(
datasource_type=DatasourceType.WEBSITE.value,
website_info={
"provider": data_source_info["provider"],
"job_id": data_source_info["job_id"],
"url": data_source_info["url"],
"tenant_id": current_user.current_tenant_id,
"mode": data_source_info["mode"],
"only_main_content": data_source_info["only_main_content"],
},
datasource_type=DatasourceType.WEBSITE,
website_info=WebsiteInfo.model_validate(
{
"provider": data_source_info["provider"],
"job_id": data_source_info["job_id"],
"url": data_source_info["url"],
"tenant_id": current_tenant_id,
"mode": data_source_info["mode"],
"only_main_content": data_source_info["only_main_content"],
}
),
document_model=document.doc_form,
)
extract_settings.append(extract_setting)
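Two related changes run through the `ExtractSetting` hunks above: the enum members are now passed directly (`DatasourceType.NOTION` instead of `DatasourceType.NOTION.value`), and the raw `notion_info`/`website_info` dicts are parsed into typed models up front, so a missing or misspelled key fails at the boundary rather than deep inside the indexing estimate. A hedged sketch of the pattern with stand-in models (the real `NotionInfo`/`WebsiteInfo` fields may differ):

```python
from enum import StrEnum  # Python 3.11+
from pydantic import BaseModel

class DatasourceType(StrEnum):
    # As a str subclass, DatasourceType.WEBSITE == "website_crawl",
    # so dropping .value keeps comparisons and serialization working.
    FILE = "upload_file"
    NOTION = "notion_import"
    WEBSITE = "website_crawl"

class WebsiteInfo(BaseModel):  # stand-in; mirrors the dict keys above
    provider: str
    job_id: str
    url: str
    tenant_id: str
    mode: str
    only_main_content: bool

info = WebsiteInfo.model_validate({
    "provider": "firecrawl",
    "job_id": "job-1",
    "url": "https://example.com",
    "tenant_id": "tenant-1",
    "mode": "crawl",
    "only_main_content": True,
})
print(DatasourceType.WEBSITE == "website_crawl")  # True
```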
@@ -579,7 +625,7 @@ class DocumentBatchIndexingEstimateApi(DocumentResource):
indexing_runner = IndexingRunner()
try:
response = indexing_runner.indexing_estimate(
current_user.current_tenant_id,
current_tenant_id,
extract_settings,
data_process_rule_dict,
document.doc_form,
@@ -646,11 +692,11 @@ class DocumentBatchIndexingStatusApi(DocumentResource):
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/indexing-status")
class DocumentIndexingStatusApi(DocumentResource):
@api.doc("get_document_indexing_status")
@api.doc(description="Get document indexing status")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.response(200, "Indexing status retrieved successfully")
@api.response(404, "Document not found")
@console_ns.doc("get_document_indexing_status")
@console_ns.doc(description="Get document indexing status")
@console_ns.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@console_ns.response(200, "Indexing status retrieved successfully")
@console_ns.response(404, "Document not found")
@setup_required
@login_required
@account_initialization_required
@@ -696,17 +742,17 @@ class DocumentIndexingStatusApi(DocumentResource):
class DocumentApi(DocumentResource):
METADATA_CHOICES = {"all", "only", "without"}
@api.doc("get_document")
@api.doc(description="Get document details")
@api.doc(
@console_ns.doc("get_document")
@console_ns.doc(description="Get document details")
@console_ns.doc(
params={
"dataset_id": "Dataset ID",
"document_id": "Document ID",
"metadata": "Metadata inclusion (all/only/without)",
}
)
@api.response(200, "Document retrieved successfully")
@api.response(404, "Document not found")
@console_ns.response(200, "Document retrieved successfully")
@console_ns.response(404, "Document not found")
@setup_required
@login_required
@account_initialization_required
@@ -736,7 +782,7 @@ class DocumentApi(DocumentResource):
"name": document.name,
"created_from": document.created_from,
"created_by": document.created_by,
"created_at": document.created_at.timestamp(),
"created_at": int(document.created_at.timestamp()),
"tokens": document.tokens,
"indexing_status": document.indexing_status,
"completed_at": int(document.completed_at.timestamp()) if document.completed_at else None,
@@ -769,7 +815,7 @@ class DocumentApi(DocumentResource):
"name": document.name,
"created_from": document.created_from,
"created_by": document.created_by,
"created_at": document.created_at.timestamp(),
"created_at": int(document.created_at.timestamp()),
"tokens": document.tokens,
"indexing_status": document.indexing_status,
"completed_at": int(document.completed_at.timestamp()) if document.completed_at else None,
@@ -817,19 +863,20 @@ class DocumentApi(DocumentResource):
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/processing/<string:action>")
class DocumentProcessingApi(DocumentResource):
@api.doc("update_document_processing")
@api.doc(description="Update document processing status (pause/resume)")
@api.doc(
@console_ns.doc("update_document_processing")
@console_ns.doc(description="Update document processing status (pause/resume)")
@console_ns.doc(
params={"dataset_id": "Dataset ID", "document_id": "Document ID", "action": "Action to perform (pause/resume)"}
)
@api.response(200, "Processing status updated successfully")
@api.response(404, "Document not found")
@api.response(400, "Invalid action")
@console_ns.response(200, "Processing status updated successfully")
@console_ns.response(404, "Document not found")
@console_ns.response(400, "Invalid action")
@setup_required
@login_required
@account_initialization_required
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, document_id, action: Literal["pause", "resume"]):
current_user, _ = current_account_with_tenant()
dataset_id = str(dataset_id)
document_id = str(document_id)
document = self.get_document(dataset_id, document_id)
@@ -861,11 +908,11 @@ class DocumentProcessingApi(DocumentResource):
@console_ns.route("/datasets/<uuid:dataset_id>/documents/<uuid:document_id>/metadata")
class DocumentMetadataApi(DocumentResource):
@api.doc("update_document_metadata")
@api.doc(description="Update document metadata")
@api.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@api.expect(
api.model(
@console_ns.doc("update_document_metadata")
@console_ns.doc(description="Update document metadata")
@console_ns.doc(params={"dataset_id": "Dataset ID", "document_id": "Document ID"})
@console_ns.expect(
console_ns.model(
"UpdateDocumentMetadataRequest",
{
"doc_type": fields.String(description="Document type"),
@@ -873,13 +920,14 @@ class DocumentMetadataApi(DocumentResource):
},
)
)
@api.response(200, "Document metadata updated successfully")
@api.response(404, "Document not found")
@api.response(403, "Permission denied")
@console_ns.response(200, "Document metadata updated successfully")
@console_ns.response(404, "Document not found")
@console_ns.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
def put(self, dataset_id, document_id):
current_user, _ = current_account_with_tenant()
dataset_id = str(dataset_id)
document_id = str(document_id)
document = self.get_document(dataset_id, document_id)
@@ -927,6 +975,7 @@ class DocumentStatusApi(DocumentResource):
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, action: Literal["enable", "disable", "archive", "un_archive"]):
current_user, _ = current_account_with_tenant()
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
if dataset is None:
@@ -1030,8 +1079,9 @@ class DocumentRetryApi(DocumentResource):
def post(self, dataset_id):
"""retry document."""
parser = reqparse.RequestParser()
parser.add_argument("document_ids", type=list, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"document_ids", type=list, required=True, nullable=False, location="json"
)
args = parser.parse_args()
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -1073,14 +1123,14 @@ class DocumentRenameApi(DocumentResource):
@marshal_with(document_fields)
def post(self, dataset_id, document_id):
# The role of the current user in the ta table must be admin, owner, editor, or dataset_operator
current_user, _ = current_account_with_tenant()
if not current_user.is_dataset_editor:
raise Forbidden()
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
raise NotFound("Dataset not found.")
DatasetService.check_dataset_operator_permission(cast(Account, current_user), dataset)
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, nullable=False, location="json")
DatasetService.check_dataset_operator_permission(current_user, dataset)
parser = reqparse.RequestParser().add_argument("name", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
try:
@@ -1098,6 +1148,7 @@ class WebsiteDocumentSyncApi(DocumentResource):
@account_initialization_required
def get(self, dataset_id, document_id):
"""sync website document."""
_, current_tenant_id = current_account_with_tenant()
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
@@ -1106,7 +1157,7 @@ class WebsiteDocumentSyncApi(DocumentResource):
document = DocumentService.get_document(dataset.id, document_id)
if not document:
raise NotFound("Document not found.")
if document.tenant_id != current_user.current_tenant_id:
if document.tenant_id != current_tenant_id:
raise Forbidden("No permission.")
if document.data_source_type != "website_crawl":
raise ValueError("Document is not a website document.")


@@ -1,7 +1,6 @@
import uuid
from flask import request
from flask_login import current_user
from flask_restx import Resource, marshal, reqparse
from sqlalchemy import select
from werkzeug.exceptions import Forbidden, NotFound
@@ -27,7 +26,7 @@ from core.model_runtime.entities.model_entities import ModelType
from extensions.ext_database import db
from extensions.ext_redis import redis_client
from fields.segment_fields import child_chunk_fields, segment_fields
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from models.dataset import ChildChunk, DocumentSegment
from models.model import UploadFile
from services.dataset_service import DatasetService, DocumentService, SegmentService
@@ -43,6 +42,8 @@ class DatasetDocumentSegmentListApi(Resource):
@login_required
@account_initialization_required
def get(self, dataset_id, document_id):
current_user, current_tenant_id = current_account_with_tenant()
dataset_id = str(dataset_id)
document_id = str(document_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -59,13 +60,15 @@ class DatasetDocumentSegmentListApi(Resource):
if not document:
raise NotFound("Document not found.")
parser = reqparse.RequestParser()
parser.add_argument("limit", type=int, default=20, location="args")
parser.add_argument("status", type=str, action="append", default=[], location="args")
parser.add_argument("hit_count_gte", type=int, default=None, location="args")
parser.add_argument("enabled", type=str, default="all", location="args")
parser.add_argument("keyword", type=str, default=None, location="args")
parser.add_argument("page", type=int, default=1, location="args")
parser = (
reqparse.RequestParser()
.add_argument("limit", type=int, default=20, location="args")
.add_argument("status", type=str, action="append", default=[], location="args")
.add_argument("hit_count_gte", type=int, default=None, location="args")
.add_argument("enabled", type=str, default="all", location="args")
.add_argument("keyword", type=str, default=None, location="args")
.add_argument("page", type=int, default=1, location="args")
)
args = parser.parse_args()
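The chained parser declarations in this file work because flask-restx's `RequestParser.add_argument` returns the parser itself, so the whole argument list collapses into one expression instead of repeated `parser.add_argument(...)` statements. A minimal standalone sketch:

```python
from flask_restx import reqparse

# add_argument returns self, so the declaration is a single expression.
parser = (
    reqparse.RequestParser()
    .add_argument("limit", type=int, default=20, location="args")
    .add_argument("page", type=int, default=1, location="args")
)
# Inside a request context: args = parser.parse_args()
```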
@@ -79,7 +82,7 @@ class DatasetDocumentSegmentListApi(Resource):
select(DocumentSegment)
.where(
DocumentSegment.document_id == str(document_id),
DocumentSegment.tenant_id == current_user.current_tenant_id,
DocumentSegment.tenant_id == current_tenant_id,
)
.order_by(DocumentSegment.position.asc())
)
@@ -115,6 +118,8 @@ class DatasetDocumentSegmentListApi(Resource):
@account_initialization_required
@cloud_edition_billing_rate_limit_check("knowledge")
def delete(self, dataset_id, document_id):
current_user, _ = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -148,6 +153,8 @@ class DatasetDocumentSegmentApi(Resource):
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, document_id, action):
current_user, current_tenant_id = current_account_with_tenant()
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
if not dataset:
@@ -171,7 +178,7 @@ class DatasetDocumentSegmentApi(Resource):
try:
model_manager = ModelManager()
model_manager.get_model_instance(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider=dataset.embedding_model_provider,
model_type=ModelType.TEXT_EMBEDDING,
model=dataset.embedding_model,
@@ -204,6 +211,8 @@ class DatasetDocumentSegmentAddApi(Resource):
@cloud_edition_billing_knowledge_limit_check("add_segment")
@cloud_edition_billing_rate_limit_check("knowledge")
def post(self, dataset_id, document_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -221,7 +230,7 @@ class DatasetDocumentSegmentAddApi(Resource):
try:
model_manager = ModelManager()
model_manager.get_model_instance(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider=dataset.embedding_model_provider,
model_type=ModelType.TEXT_EMBEDDING,
model=dataset.embedding_model,
@@ -237,10 +246,12 @@ class DatasetDocumentSegmentAddApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
# validate args
parser = reqparse.RequestParser()
parser.add_argument("content", type=str, required=True, nullable=False, location="json")
parser.add_argument("answer", type=str, required=False, nullable=True, location="json")
parser.add_argument("keywords", type=list, required=False, nullable=True, location="json")
parser = (
reqparse.RequestParser()
.add_argument("content", type=str, required=True, nullable=False, location="json")
.add_argument("answer", type=str, required=False, nullable=True, location="json")
.add_argument("keywords", type=list, required=False, nullable=True, location="json")
)
args = parser.parse_args()
SegmentService.segment_create_args_validate(args, document)
segment = SegmentService.create_segment(args, document, dataset)
@@ -255,6 +266,8 @@ class DatasetDocumentSegmentUpdateApi(Resource):
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, document_id, segment_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -272,7 +285,7 @@ class DatasetDocumentSegmentUpdateApi(Resource):
try:
model_manager = ModelManager()
model_manager.get_model_instance(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider=dataset.embedding_model_provider,
model_type=ModelType.TEXT_EMBEDDING,
model=dataset.embedding_model,
@@ -287,7 +300,7 @@ class DatasetDocumentSegmentUpdateApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -300,16 +313,18 @@ class DatasetDocumentSegmentUpdateApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
# validate args
parser = reqparse.RequestParser()
parser.add_argument("content", type=str, required=True, nullable=False, location="json")
parser.add_argument("answer", type=str, required=False, nullable=True, location="json")
parser.add_argument("keywords", type=list, required=False, nullable=True, location="json")
parser.add_argument(
"regenerate_child_chunks", type=bool, required=False, nullable=True, default=False, location="json"
parser = (
reqparse.RequestParser()
.add_argument("content", type=str, required=True, nullable=False, location="json")
.add_argument("answer", type=str, required=False, nullable=True, location="json")
.add_argument("keywords", type=list, required=False, nullable=True, location="json")
.add_argument(
"regenerate_child_chunks", type=bool, required=False, nullable=True, default=False, location="json"
)
)
args = parser.parse_args()
SegmentService.segment_create_args_validate(args, document)
segment = SegmentService.update_segment(SegmentUpdateArgs(**args), segment, document, dataset)
segment = SegmentService.update_segment(SegmentUpdateArgs.model_validate(args), segment, document, dataset)
return {"data": marshal(segment, segment_fields), "doc_form": document.doc_form}, 200
@setup_required
@@ -317,6 +332,8 @@ class DatasetDocumentSegmentUpdateApi(Resource):
@account_initialization_required
@cloud_edition_billing_rate_limit_check("knowledge")
def delete(self, dataset_id, document_id, segment_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -333,7 +350,7 @@ class DatasetDocumentSegmentUpdateApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -361,6 +378,8 @@ class DatasetDocumentSegmentBatchImportApi(Resource):
@cloud_edition_billing_knowledge_limit_check("add_segment")
@cloud_edition_billing_rate_limit_check("knowledge")
def post(self, dataset_id, document_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -372,8 +391,9 @@ class DatasetDocumentSegmentBatchImportApi(Resource):
if not document:
raise NotFound("Document not found.")
parser = reqparse.RequestParser()
parser.add_argument("upload_file_id", type=str, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"upload_file_id", type=str, required=True, nullable=False, location="json"
)
args = parser.parse_args()
upload_file_id = args["upload_file_id"]
@@ -396,7 +416,7 @@ class DatasetDocumentSegmentBatchImportApi(Resource):
upload_file_id,
dataset_id,
document_id,
current_user.current_tenant_id,
current_tenant_id,
current_user.id,
)
except Exception as e:
@@ -427,6 +447,8 @@ class ChildChunkAddApi(Resource):
@cloud_edition_billing_knowledge_limit_check("add_segment")
@cloud_edition_billing_rate_limit_check("knowledge")
def post(self, dataset_id, document_id, segment_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -441,7 +463,7 @@ class ChildChunkAddApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -453,7 +475,7 @@ class ChildChunkAddApi(Resource):
try:
model_manager = ModelManager()
model_manager.get_model_instance(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider=dataset.embedding_model_provider,
model_type=ModelType.TEXT_EMBEDDING,
model=dataset.embedding_model,
@@ -469,8 +491,9 @@ class ChildChunkAddApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
# validate args
parser = reqparse.RequestParser()
parser.add_argument("content", type=str, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"content", type=str, required=True, nullable=False, location="json"
)
args = parser.parse_args()
try:
content = args["content"]
@@ -483,6 +506,8 @@ class ChildChunkAddApi(Resource):
@login_required
@account_initialization_required
def get(self, dataset_id, document_id, segment_id):
_, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -499,15 +524,17 @@ class ChildChunkAddApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
raise NotFound("Segment not found.")
parser = reqparse.RequestParser()
parser.add_argument("limit", type=int, default=20, location="args")
parser.add_argument("keyword", type=str, default=None, location="args")
parser.add_argument("page", type=int, default=1, location="args")
parser = (
reqparse.RequestParser()
.add_argument("limit", type=int, default=20, location="args")
.add_argument("keyword", type=str, default=None, location="args")
.add_argument("page", type=int, default=1, location="args")
)
args = parser.parse_args()
@@ -530,6 +557,8 @@ class ChildChunkAddApi(Resource):
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, document_id, segment_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -546,7 +575,7 @@ class ChildChunkAddApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -559,12 +588,13 @@ class ChildChunkAddApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
# validate args
parser = reqparse.RequestParser()
parser.add_argument("chunks", type=list, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"chunks", type=list, required=True, nullable=False, location="json"
)
args = parser.parse_args()
try:
chunks_data = args["chunks"]
chunks = [ChildChunkUpdateArgs(**chunk) for chunk in chunks_data]
chunks = [ChildChunkUpdateArgs.model_validate(chunk) for chunk in chunks_data]
child_chunks = SegmentService.update_child_chunks(chunks, segment, document, dataset)
except ChildChunkIndexingServiceError as e:
raise ChildChunkIndexingError(str(e))
@@ -580,6 +610,8 @@ class ChildChunkUpdateApi(Resource):
@account_initialization_required
@cloud_edition_billing_rate_limit_check("knowledge")
def delete(self, dataset_id, document_id, segment_id, child_chunk_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -596,7 +628,7 @@ class ChildChunkUpdateApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -607,7 +639,7 @@ class ChildChunkUpdateApi(Resource):
db.session.query(ChildChunk)
.where(
ChildChunk.id == str(child_chunk_id),
ChildChunk.tenant_id == current_user.current_tenant_id,
ChildChunk.tenant_id == current_tenant_id,
ChildChunk.segment_id == segment.id,
ChildChunk.document_id == document_id,
)
@@ -634,6 +666,8 @@ class ChildChunkUpdateApi(Resource):
@cloud_edition_billing_resource_check("vector_space")
@cloud_edition_billing_rate_limit_check("knowledge")
def patch(self, dataset_id, document_id, segment_id, child_chunk_id):
current_user, current_tenant_id = current_account_with_tenant()
# check dataset
dataset_id = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id)
@@ -650,7 +684,7 @@ class ChildChunkUpdateApi(Resource):
segment_id = str(segment_id)
segment = (
db.session.query(DocumentSegment)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_user.current_tenant_id)
.where(DocumentSegment.id == str(segment_id), DocumentSegment.tenant_id == current_tenant_id)
.first()
)
if not segment:
@@ -661,7 +695,7 @@ class ChildChunkUpdateApi(Resource):
db.session.query(ChildChunk)
.where(
ChildChunk.id == str(child_chunk_id),
ChildChunk.tenant_id == current_user.current_tenant_id,
ChildChunk.tenant_id == current_tenant_id,
ChildChunk.segment_id == segment.id,
ChildChunk.document_id == document_id,
)
@@ -677,8 +711,9 @@ class ChildChunkUpdateApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
# validate args
parser = reqparse.RequestParser()
parser.add_argument("content", type=str, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"content", type=str, required=True, nullable=False, location="json"
)
args = parser.parse_args()
try:
content = args["content"]


@@ -1,23 +1,76 @@
from typing import cast
from flask import request
from flask_login import current_user
from flask_restx import Resource, fields, marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
import services
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.datasets.error import DatasetNameDuplicateError
from controllers.console.wraps import account_initialization_required, setup_required
from fields.dataset_fields import dataset_detail_fields
from libs.login import login_required
from models.account import Account
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from fields.dataset_fields import (
dataset_detail_fields,
dataset_retrieval_model_fields,
doc_metadata_fields,
external_knowledge_info_fields,
external_retrieval_model_fields,
icon_info_fields,
keyword_setting_fields,
reranking_model_fields,
tag_fields,
vector_setting_fields,
weighted_score_fields,
)
from libs.login import current_account_with_tenant, login_required
from services.dataset_service import DatasetService
from services.external_knowledge_service import ExternalDatasetService
from services.hit_testing_service import HitTestingService
from services.knowledge_service import ExternalDatasetTestService
def _get_or_create_model(model_name: str, field_def):
existing = console_ns.models.get(model_name)
if existing is None:
existing = console_ns.model(model_name, field_def)
return existing
def _build_dataset_detail_model():
keyword_setting_model = _get_or_create_model("DatasetKeywordSetting", keyword_setting_fields)
vector_setting_model = _get_or_create_model("DatasetVectorSetting", vector_setting_fields)
weighted_score_fields_copy = weighted_score_fields.copy()
weighted_score_fields_copy["keyword_setting"] = fields.Nested(keyword_setting_model)
weighted_score_fields_copy["vector_setting"] = fields.Nested(vector_setting_model)
weighted_score_model = _get_or_create_model("DatasetWeightedScore", weighted_score_fields_copy)
reranking_model = _get_or_create_model("DatasetRerankingModel", reranking_model_fields)
dataset_retrieval_model_fields_copy = dataset_retrieval_model_fields.copy()
dataset_retrieval_model_fields_copy["reranking_model"] = fields.Nested(reranking_model)
dataset_retrieval_model_fields_copy["weights"] = fields.Nested(weighted_score_model, allow_null=True)
dataset_retrieval_model = _get_or_create_model("DatasetRetrievalModel", dataset_retrieval_model_fields_copy)
tag_model = _get_or_create_model("Tag", tag_fields)
doc_metadata_model = _get_or_create_model("DatasetDocMetadata", doc_metadata_fields)
external_knowledge_info_model = _get_or_create_model("ExternalKnowledgeInfo", external_knowledge_info_fields)
external_retrieval_model = _get_or_create_model("ExternalRetrievalModel", external_retrieval_model_fields)
icon_info_model = _get_or_create_model("DatasetIconInfo", icon_info_fields)
dataset_detail_fields_copy = dataset_detail_fields.copy()
dataset_detail_fields_copy["retrieval_model_dict"] = fields.Nested(dataset_retrieval_model)
dataset_detail_fields_copy["tags"] = fields.List(fields.Nested(tag_model))
dataset_detail_fields_copy["external_knowledge_info"] = fields.Nested(external_knowledge_info_model)
dataset_detail_fields_copy["external_retrieval_model"] = fields.Nested(external_retrieval_model, allow_null=True)
dataset_detail_fields_copy["doc_metadata"] = fields.List(fields.Nested(doc_metadata_model))
dataset_detail_fields_copy["icon_info"] = fields.Nested(icon_info_model)
return _get_or_create_model("DatasetDetail", dataset_detail_fields_copy)
try:
dataset_detail_model = console_ns.models["DatasetDetail"]
except KeyError:
dataset_detail_model = _build_dataset_detail_model()
def _validate_name(name: str) -> str:
if not name or len(name) < 1 or len(name) > 100:
raise ValueError("Name must be between 1 to 100 characters.")
@@ -26,26 +79,27 @@ def _validate_name(name: str) -> str:
@console_ns.route("/datasets/external-knowledge-api")
class ExternalApiTemplateListApi(Resource):
@api.doc("get_external_api_templates")
@api.doc(description="Get external knowledge API templates")
@api.doc(
@console_ns.doc("get_external_api_templates")
@console_ns.doc(description="Get external knowledge API templates")
@console_ns.doc(
params={
"page": "Page number (default: 1)",
"limit": "Number of items per page (default: 20)",
"keyword": "Search keyword",
}
)
@api.response(200, "External API templates retrieved successfully")
@console_ns.response(200, "External API templates retrieved successfully")
@setup_required
@login_required
@account_initialization_required
def get(self):
_, current_tenant_id = current_account_with_tenant()
page = request.args.get("page", default=1, type=int)
limit = request.args.get("limit", default=20, type=int)
search = request.args.get("keyword", default=None, type=str)
external_knowledge_apis, total = ExternalDatasetService.get_external_knowledge_apis(
page, limit, current_user.current_tenant_id, search
page, limit, current_tenant_id, search
)
response = {
"data": [item.to_dict() for item in external_knowledge_apis],
@@ -60,20 +114,23 @@ class ExternalApiTemplateListApi(Resource):
@login_required
@account_initialization_required
def post(self):
parser = reqparse.RequestParser()
parser.add_argument(
"name",
nullable=False,
required=True,
help="Name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
parser.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
current_user, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument(
"name",
nullable=False,
required=True,
help="Name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
)
)
args = parser.parse_args()
@@ -85,7 +142,7 @@ class ExternalApiTemplateListApi(Resource):
try:
external_knowledge_api = ExternalDatasetService.create_external_knowledge_api(
tenant_id=current_user.current_tenant_id, user_id=current_user.id, args=args
tenant_id=current_tenant_id, user_id=current_user.id, args=args
)
except services.errors.dataset.DatasetNameDuplicateError:
raise DatasetNameDuplicateError()
@@ -95,11 +152,11 @@ class ExternalApiTemplateListApi(Resource):
@console_ns.route("/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>")
class ExternalApiTemplateApi(Resource):
@api.doc("get_external_api_template")
@api.doc(description="Get external knowledge API template details")
@api.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@api.response(200, "External API template retrieved successfully")
@api.response(404, "Template not found")
@console_ns.doc("get_external_api_template")
@console_ns.doc(description="Get external knowledge API template details")
@console_ns.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@console_ns.response(200, "External API template retrieved successfully")
@console_ns.response(404, "Template not found")
@setup_required
@login_required
@account_initialization_required
@@ -115,28 +172,31 @@ class ExternalApiTemplateApi(Resource):
@login_required
@account_initialization_required
def patch(self, external_knowledge_api_id):
current_user, current_tenant_id = current_account_with_tenant()
external_knowledge_api_id = str(external_knowledge_api_id)
parser = reqparse.RequestParser()
parser.add_argument(
"name",
nullable=False,
required=True,
help="type is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
parser.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
parser = (
reqparse.RequestParser()
.add_argument(
"name",
nullable=False,
required=True,
help="type is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
.add_argument(
"settings",
type=dict,
location="json",
nullable=False,
required=True,
)
)
args = parser.parse_args()
ExternalDatasetService.validate_api_list(args["settings"])
external_knowledge_api = ExternalDatasetService.update_external_knowledge_api(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
user_id=current_user.id,
external_knowledge_api_id=external_knowledge_api_id,
args=args,
@@ -148,22 +208,22 @@ class ExternalApiTemplateApi(Resource):
@login_required
@account_initialization_required
def delete(self, external_knowledge_api_id):
current_user, current_tenant_id = current_account_with_tenant()
external_knowledge_api_id = str(external_knowledge_api_id)
# The role of the current user in the ta table must be admin, owner, or editor
if not (current_user.is_editor or current_user.is_dataset_operator):
if not (current_user.has_edit_permission or current_user.is_dataset_operator):
raise Forbidden()
ExternalDatasetService.delete_external_knowledge_api(current_user.current_tenant_id, external_knowledge_api_id)
ExternalDatasetService.delete_external_knowledge_api(current_tenant_id, external_knowledge_api_id)
return {"result": "success"}, 204
@console_ns.route("/datasets/external-knowledge-api/<uuid:external_knowledge_api_id>/use-check")
class ExternalApiUseCheckApi(Resource):
@api.doc("check_external_api_usage")
@api.doc(description="Check if external knowledge API is being used")
@api.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@api.response(200, "Usage check completed successfully")
@console_ns.doc("check_external_api_usage")
@console_ns.doc(description="Check if external knowledge API is being used")
@console_ns.doc(params={"external_knowledge_api_id": "External knowledge API ID"})
@console_ns.response(200, "Usage check completed successfully")
@setup_required
@login_required
@account_initialization_required
@@ -178,10 +238,10 @@ class ExternalApiUseCheckApi(Resource):
@console_ns.route("/datasets/external")
class ExternalDatasetCreateApi(Resource):
@api.doc("create_external_dataset")
@api.doc(description="Create external knowledge dataset")
@api.expect(
api.model(
@console_ns.doc("create_external_dataset")
@console_ns.doc(description="Create external knowledge dataset")
@console_ns.expect(
console_ns.model(
"CreateExternalDatasetRequest",
{
"external_knowledge_api_id": fields.String(required=True, description="External knowledge API ID"),
@@ -191,29 +251,30 @@ class ExternalDatasetCreateApi(Resource):
},
)
)
@api.response(201, "External dataset created successfully", dataset_detail_fields)
@api.response(400, "Invalid parameters")
@api.response(403, "Permission denied")
@console_ns.response(201, "External dataset created successfully", dataset_detail_model)
@console_ns.response(400, "Invalid parameters")
@console_ns.response(403, "Permission denied")
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def post(self):
# The role of the current user in the ta table must be admin, owner, or editor
if not current_user.is_editor:
raise Forbidden()
parser = reqparse.RequestParser()
parser.add_argument("external_knowledge_api_id", type=str, required=True, nullable=False, location="json")
parser.add_argument("external_knowledge_id", type=str, required=True, nullable=False, location="json")
parser.add_argument(
"name",
nullable=False,
required=True,
help="name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
current_user, current_tenant_id = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument("external_knowledge_api_id", type=str, required=True, nullable=False, location="json")
.add_argument("external_knowledge_id", type=str, required=True, nullable=False, location="json")
.add_argument(
"name",
nullable=False,
required=True,
help="name is required. Name must be between 1 to 100 characters.",
type=_validate_name,
)
.add_argument("description", type=str, required=False, nullable=True, location="json")
.add_argument("external_retrieval_model", type=dict, required=False, location="json")
)
parser.add_argument("description", type=str, required=False, nullable=True, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
args = parser.parse_args()
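The new `@edit_permission_required` decorator (imported from `controllers.console.wraps` above) replaces the inline `if not current_user.is_editor: raise Forbidden()` checks that the old method bodies carried. A plausible shape for such a decorator, sketched under the assumption that it reuses the same account helper; this is not the actual wraps.py code:

```python
from functools import wraps

from werkzeug.exceptions import Forbidden

from libs.login import current_account_with_tenant  # import path as in the diff

def edit_permission_required(view):
    # Hypothetical sketch: reject the request before the view body runs.
    @wraps(view)
    def wrapper(*args, **kwargs):
        account, _ = current_account_with_tenant()
        if not getattr(account, "has_edit_permission", False):
            raise Forbidden()
        return view(*args, **kwargs)
    return wrapper
```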
@@ -223,7 +284,7 @@ class ExternalDatasetCreateApi(Resource):
try:
dataset = ExternalDatasetService.create_external_dataset(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
user_id=current_user.id,
args=args,
)
@@ -235,11 +296,11 @@ class ExternalDatasetCreateApi(Resource):
@console_ns.route("/datasets/<uuid:dataset_id>/external-hit-testing")
class ExternalKnowledgeHitTestingApi(Resource):
@api.doc("test_external_knowledge_retrieval")
@api.doc(description="Test external knowledge retrieval for dataset")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.expect(
api.model(
@console_ns.doc("test_external_knowledge_retrieval")
@console_ns.doc(description="Test external knowledge retrieval for dataset")
@console_ns.doc(params={"dataset_id": "Dataset ID"})
@console_ns.expect(
console_ns.model(
"ExternalHitTestingRequest",
{
"query": fields.String(required=True, description="Query text for testing"),
@@ -248,13 +309,14 @@ class ExternalKnowledgeHitTestingApi(Resource):
},
)
)
@api.response(200, "External hit testing completed successfully")
@api.response(404, "Dataset not found")
@api.response(400, "Invalid parameters")
@console_ns.response(200, "External hit testing completed successfully")
@console_ns.response(404, "Dataset not found")
@console_ns.response(400, "Invalid parameters")
@setup_required
@login_required
@account_initialization_required
def post(self, dataset_id):
current_user, _ = current_account_with_tenant()
dataset_id_str = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id_str)
if dataset is None:
@@ -265,10 +327,12 @@ class ExternalKnowledgeHitTestingApi(Resource):
except services.errors.account.NoPermissionError as e:
raise Forbidden(str(e))
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
parser.add_argument("metadata_filtering_conditions", type=dict, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("query", type=str, location="json")
.add_argument("external_retrieval_model", type=dict, required=False, location="json")
.add_argument("metadata_filtering_conditions", type=dict, required=False, location="json")
)
args = parser.parse_args()
HitTestingService.hit_testing_args_check(args)
@@ -277,7 +341,7 @@ class ExternalKnowledgeHitTestingApi(Resource):
response = HitTestingService.external_retrieve(
dataset=dataset,
query=args["query"],
account=cast(Account, current_user),
account=current_user,
external_retrieval_model=args["external_retrieval_model"],
metadata_filtering_conditions=args["metadata_filtering_conditions"],
)
@@ -290,10 +354,10 @@ class ExternalKnowledgeHitTestingApi(Resource):
@console_ns.route("/test/retrieval")
class BedrockRetrievalApi(Resource):
# this api is only for internal testing
@api.doc("bedrock_retrieval_test")
@api.doc(description="Bedrock retrieval test (internal use only)")
@api.expect(
api.model(
@console_ns.doc("bedrock_retrieval_test")
@console_ns.doc(description="Bedrock retrieval test (internal use only)")
@console_ns.expect(
console_ns.model(
"BedrockRetrievalTestRequest",
{
"retrieval_setting": fields.Raw(required=True, description="Retrieval settings"),
@@ -302,17 +366,19 @@ class BedrockRetrievalApi(Resource):
},
)
)
@api.response(200, "Bedrock retrieval test completed")
@console_ns.response(200, "Bedrock retrieval test completed")
def post(self):
parser = reqparse.RequestParser()
parser.add_argument("retrieval_setting", nullable=False, required=True, type=dict, location="json")
parser.add_argument(
"query",
nullable=False,
required=True,
type=str,
parser = (
reqparse.RequestParser()
.add_argument("retrieval_setting", nullable=False, required=True, type=dict, location="json")
.add_argument(
"query",
nullable=False,
required=True,
type=str,
)
.add_argument("knowledge_id", nullable=False, required=True, type=str)
)
parser.add_argument("knowledge_id", nullable=False, required=True, type=str)
args = parser.parse_args()
# Call the knowledge retrieval service


@@ -1,6 +1,6 @@
from flask_restx import Resource, fields
from controllers.console import api, console_ns
from controllers.console import console_ns
from controllers.console.datasets.hit_testing_base import DatasetsHitTestingBase
from controllers.console.wraps import (
account_initialization_required,
@@ -12,11 +12,11 @@ from libs.login import login_required
@console_ns.route("/datasets/<uuid:dataset_id>/hit-testing")
class HitTestingApi(Resource, DatasetsHitTestingBase):
@api.doc("test_dataset_retrieval")
@api.doc(description="Test dataset knowledge retrieval")
@api.doc(params={"dataset_id": "Dataset ID"})
@api.expect(
api.model(
@console_ns.doc("test_dataset_retrieval")
@console_ns.doc(description="Test dataset knowledge retrieval")
@console_ns.doc(params={"dataset_id": "Dataset ID"})
@console_ns.expect(
console_ns.model(
"HitTestingRequest",
{
"query": fields.String(required=True, description="Query text for testing"),
@@ -26,9 +26,9 @@ class HitTestingApi(Resource, DatasetsHitTestingBase):
},
)
)
@api.response(200, "Hit testing completed successfully")
@api.response(404, "Dataset not found")
@api.response(400, "Invalid parameters")
@console_ns.response(200, "Hit testing completed successfully")
@console_ns.response(404, "Dataset not found")
@console_ns.response(400, "Invalid parameters")
@setup_required
@login_required
@account_initialization_required


@@ -1,7 +1,5 @@
import logging
from typing import cast
from flask_login import current_user
from flask_restx import marshal, reqparse
from werkzeug.exceptions import Forbidden, InternalServerError, NotFound
@@ -21,6 +19,7 @@ from core.errors.error import (
)
from core.model_runtime.errors.invoke import InvokeError
from fields.hit_testing_fields import hit_testing_record_fields
from libs.login import current_user
from models.account import Account
from services.dataset_service import DatasetService
from services.hit_testing_service import HitTestingService
@@ -31,6 +30,7 @@ logger = logging.getLogger(__name__)
class DatasetsHitTestingBase:
@staticmethod
def get_and_validate_dataset(dataset_id: str):
assert isinstance(current_user, Account)
dataset = DatasetService.get_dataset(dataset_id)
if dataset is None:
raise NotFound("Dataset not found.")
@@ -48,20 +48,22 @@ class DatasetsHitTestingBase:
@staticmethod
def parse_args():
parser = reqparse.RequestParser()
parser.add_argument("query", type=str, location="json")
parser.add_argument("retrieval_model", type=dict, required=False, location="json")
parser.add_argument("external_retrieval_model", type=dict, required=False, location="json")
parser = (
reqparse.RequestParser()
.add_argument("query", type=str, location="json")
.add_argument("retrieval_model", type=dict, required=False, location="json")
.add_argument("external_retrieval_model", type=dict, required=False, location="json")
)
return parser.parse_args()
@staticmethod
def perform_hit_testing(dataset, args):
assert isinstance(current_user, Account)
try:
response = HitTestingService.retrieve(
dataset=dataset,
query=args["query"],
account=cast(Account, current_user),
account=current_user,
retrieval_model=args["retrieval_model"],
external_retrieval_model=args["external_retrieval_model"],
limit=10,


@@ -1,13 +1,12 @@
from typing import Literal
from flask_login import current_user
from flask_restx import Resource, marshal_with, reqparse
from werkzeug.exceptions import NotFound
from controllers.console import console_ns
from controllers.console.wraps import account_initialization_required, enterprise_license_required, setup_required
from fields.dataset_fields import dataset_metadata_fields
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from services.dataset_service import DatasetService
from services.entities.knowledge_entities.knowledge_entities import (
MetadataArgs,
@@ -24,11 +23,14 @@ class DatasetMetadataCreateApi(Resource):
@enterprise_license_required
@marshal_with(dataset_metadata_fields)
def post(self, dataset_id):
parser = reqparse.RequestParser()
parser.add_argument("type", type=str, required=True, nullable=False, location="json")
parser.add_argument("name", type=str, required=True, nullable=False, location="json")
current_user, _ = current_account_with_tenant()
parser = (
reqparse.RequestParser()
.add_argument("type", type=str, required=True, nullable=False, location="json")
.add_argument("name", type=str, required=True, nullable=False, location="json")
)
args = parser.parse_args()
metadata_args = MetadataArgs(**args)
metadata_args = MetadataArgs.model_validate(args)
dataset_id_str = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id_str)
@@ -59,8 +61,8 @@ class DatasetMetadataApi(Resource):
@enterprise_license_required
@marshal_with(dataset_metadata_fields)
def patch(self, dataset_id, metadata_id):
parser = reqparse.RequestParser()
parser.add_argument("name", type=str, required=True, nullable=False, location="json")
current_user, _ = current_account_with_tenant()
parser = reqparse.RequestParser().add_argument("name", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
name = args["name"]
@@ -79,6 +81,7 @@ class DatasetMetadataApi(Resource):
@account_initialization_required
@enterprise_license_required
def delete(self, dataset_id, metadata_id):
current_user, _ = current_account_with_tenant()
dataset_id_str = str(dataset_id)
metadata_id_str = str(metadata_id)
dataset = DatasetService.get_dataset(dataset_id_str)
@@ -108,6 +111,7 @@ class DatasetMetadataBuiltInFieldActionApi(Resource):
@account_initialization_required
@enterprise_license_required
def post(self, dataset_id, action: Literal["enable", "disable"]):
current_user, _ = current_account_with_tenant()
dataset_id_str = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id_str)
if dataset is None:
@@ -128,16 +132,18 @@ class DocumentMetadataEditApi(Resource):
@account_initialization_required
@enterprise_license_required
def post(self, dataset_id):
current_user, _ = current_account_with_tenant()
dataset_id_str = str(dataset_id)
dataset = DatasetService.get_dataset(dataset_id_str)
if dataset is None:
raise NotFound("Dataset not found.")
DatasetService.check_dataset_permission(dataset, current_user)
parser = reqparse.RequestParser()
parser.add_argument("operation_data", type=list, required=True, nullable=False, location="json")
parser = reqparse.RequestParser().add_argument(
"operation_data", type=list, required=True, nullable=False, location="json"
)
args = parser.parse_args()
metadata_args = MetadataOperationData(**args)
metadata_args = MetadataOperationData.model_validate(args)
MetadataService.update_documents_metadata(dataset, metadata_args)


@@ -1,19 +1,15 @@
from fastapi.encoders import jsonable_encoder
from flask import make_response, redirect, request
from flask_login import current_user
from flask_restx import Resource, reqparse
from werkzeug.exceptions import Forbidden, NotFound
from configs import dify_config
from controllers.console import console_ns
from controllers.console.wraps import (
account_initialization_required,
setup_required,
)
from controllers.console.wraps import account_initialization_required, edit_permission_required, setup_required
from core.model_runtime.errors.validate import CredentialsValidateFailedError
from core.model_runtime.utils.encoders import jsonable_encoder
from core.plugin.impl.oauth import OAuthHandler
from libs.helper import StrLen
from libs.login import login_required
from libs.login import current_account_with_tenant, login_required
from models.provider_ids import DatasourceProviderID
from services.datasource_provider_service import DatasourceProviderService
from services.plugin.oauth_service import OAuthProxyService
@@ -24,11 +20,11 @@ class DatasourcePluginOAuthAuthorizationUrl(Resource):
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def get(self, provider_id: str):
user = current_user
tenant_id = user.current_tenant_id
if not current_user.is_editor:
raise Forbidden()
current_user, current_tenant_id = current_account_with_tenant()
tenant_id = current_tenant_id
credential_id = request.args.get("credential_id")
datasource_provider_id = DatasourceProviderID(provider_id)
@@ -52,7 +48,7 @@ class DatasourcePluginOAuthAuthorizationUrl(Resource):
redirect_uri = f"{dify_config.CONSOLE_API_URL}/console/api/oauth/plugin/{provider_id}/datasource/callback"
authorization_url_response = oauth_handler.get_authorization_url(
tenant_id=tenant_id,
user_id=user.id,
user_id=current_user.id,
plugin_id=plugin_id,
provider=provider_name,
redirect_uri=redirect_uri,
@@ -125,27 +121,30 @@ class DatasourceOAuthCallback(Resource):
return redirect(f"{dify_config.CONSOLE_WEB_URL}/oauth-callback")
parser_datasource = (
reqparse.RequestParser()
.add_argument("name", type=StrLen(max_length=100), required=False, nullable=True, location="json", default=None)
.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
)
@console_ns.route("/auth/plugin/datasource/<path:provider_id>")
class DatasourceAuth(Resource):
@console_ns.expect(parser_datasource)
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def post(self, provider_id: str):
if not current_user.is_editor:
raise Forbidden()
_, current_tenant_id = current_account_with_tenant()
parser = reqparse.RequestParser()
parser.add_argument(
"name", type=StrLen(max_length=100), required=False, nullable=True, location="json", default=None
)
parser.add_argument("credentials", type=dict, required=True, nullable=False, location="json")
args = parser.parse_args()
args = parser_datasource.parse_args()
datasource_provider_id = DatasourceProviderID(provider_id)
datasource_provider_service = DatasourceProviderService()
try:
datasource_provider_service.add_datasource_api_key_provider(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
provider_id=datasource_provider_id,
credentials=args["credentials"],
name=args["name"],
@@ -160,31 +159,39 @@ class DatasourceAuth(Resource):
     def get(self, provider_id: str):
         datasource_provider_id = DatasourceProviderID(provider_id)
         datasource_provider_service = DatasourceProviderService()
+        _, current_tenant_id = current_account_with_tenant()
         datasources = datasource_provider_service.list_datasource_credentials(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             provider=datasource_provider_id.provider_name,
             plugin_id=datasource_provider_id.plugin_id,
         )
         return {"result": datasources}, 200
+parser_datasource_delete = reqparse.RequestParser().add_argument(
+    "credential_id", type=str, required=True, nullable=False, location="json"
+)
 @console_ns.route("/auth/plugin/datasource/<path:provider_id>/delete")
 class DatasourceAuthDeleteApi(Resource):
+    @console_ns.expect(parser_datasource_delete)
     @setup_required
     @login_required
     @account_initialization_required
+    @edit_permission_required
     def post(self, provider_id: str):
+        _, current_tenant_id = current_account_with_tenant()
         datasource_provider_id = DatasourceProviderID(provider_id)
         plugin_id = datasource_provider_id.plugin_id
         provider_name = datasource_provider_id.provider_name
-        if not current_user.is_editor:
-            raise Forbidden()
-        parser = reqparse.RequestParser()
-        parser.add_argument("credential_id", type=str, required=True, nullable=False, location="json")
-        args = parser.parse_args()
+        args = parser_datasource_delete.parse_args()
         datasource_provider_service = DatasourceProviderService()
         datasource_provider_service.remove_datasource_credentials(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             auth_id=args["credential_id"],
             provider=provider_name,
             plugin_id=plugin_id,
@@ -192,23 +199,30 @@ class DatasourceAuthDeleteApi(Resource):
return {"result": "success"}, 200
parser_datasource_update = (
reqparse.RequestParser()
.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
.add_argument("name", type=StrLen(max_length=100), required=False, nullable=True, location="json")
.add_argument("credential_id", type=str, required=True, nullable=False, location="json")
)
@console_ns.route("/auth/plugin/datasource/<path:provider_id>/update")
class DatasourceAuthUpdateApi(Resource):
@console_ns.expect(parser_datasource_update)
@setup_required
@login_required
@account_initialization_required
@edit_permission_required
def post(self, provider_id: str):
_, current_tenant_id = current_account_with_tenant()
datasource_provider_id = DatasourceProviderID(provider_id)
parser = reqparse.RequestParser()
parser.add_argument("credentials", type=dict, required=False, nullable=True, location="json")
parser.add_argument("name", type=StrLen(max_length=100), required=False, nullable=True, location="json")
parser.add_argument("credential_id", type=str, required=True, nullable=False, location="json")
args = parser.parse_args()
if not current_user.is_editor:
raise Forbidden()
args = parser_datasource_update.parse_args()
datasource_provider_service = DatasourceProviderService()
datasource_provider_service.update_datasource_credentials(
tenant_id=current_user.current_tenant_id,
tenant_id=current_tenant_id,
auth_id=args["credential_id"],
provider=datasource_provider_id.provider_name,
plugin_id=datasource_provider_id.plugin_id,
@@ -224,10 +238,10 @@ class DatasourceAuthListApi(Resource):
     @login_required
     @account_initialization_required
     def get(self):
+        _, current_tenant_id = current_account_with_tenant()
         datasource_provider_service = DatasourceProviderService()
-        datasources = datasource_provider_service.get_all_datasource_credentials(
-            tenant_id=current_user.current_tenant_id
-        )
+        datasources = datasource_provider_service.get_all_datasource_credentials(tenant_id=current_tenant_id)
         return {"result": jsonable_encoder(datasources)}, 200
@@ -237,29 +251,35 @@ class DatasourceHardCodeAuthListApi(Resource):
     @login_required
     @account_initialization_required
     def get(self):
+        _, current_tenant_id = current_account_with_tenant()
         datasource_provider_service = DatasourceProviderService()
-        datasources = datasource_provider_service.get_hard_code_datasource_credentials(
-            tenant_id=current_user.current_tenant_id
-        )
+        datasources = datasource_provider_service.get_hard_code_datasource_credentials(tenant_id=current_tenant_id)
         return {"result": jsonable_encoder(datasources)}, 200
+parser_datasource_custom = (
+    reqparse.RequestParser()
+    .add_argument("client_params", type=dict, required=False, nullable=True, location="json")
+    .add_argument("enable_oauth_custom_client", type=bool, required=False, nullable=True, location="json")
+)
 @console_ns.route("/auth/plugin/datasource/<path:provider_id>/custom-client")
 class DatasourceAuthOauthCustomClient(Resource):
+    @console_ns.expect(parser_datasource_custom)
     @setup_required
     @login_required
     @account_initialization_required
+    @edit_permission_required
     def post(self, provider_id: str):
-        if not current_user.is_editor:
-            raise Forbidden()
-        parser = reqparse.RequestParser()
-        parser.add_argument("client_params", type=dict, required=False, nullable=True, location="json")
-        parser.add_argument("enable_oauth_custom_client", type=bool, required=False, nullable=True, location="json")
-        args = parser.parse_args()
+        _, current_tenant_id = current_account_with_tenant()
+        args = parser_datasource_custom.parse_args()
         datasource_provider_id = DatasourceProviderID(provider_id)
         datasource_provider_service = DatasourceProviderService()
         datasource_provider_service.setup_oauth_custom_client_params(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             datasource_provider_id=datasource_provider_id,
             client_params=args.get("client_params", {}),
             enabled=args.get("enable_oauth_custom_client", False),
@@ -270,52 +290,63 @@ class DatasourceAuthOauthCustomClient(Resource):
     @login_required
     @account_initialization_required
     def delete(self, provider_id: str):
+        _, current_tenant_id = current_account_with_tenant()
         datasource_provider_id = DatasourceProviderID(provider_id)
         datasource_provider_service = DatasourceProviderService()
         datasource_provider_service.remove_oauth_custom_client_params(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             datasource_provider_id=datasource_provider_id,
         )
         return {"result": "success"}, 200
+parser_default = reqparse.RequestParser().add_argument("id", type=str, required=True, nullable=False, location="json")
 @console_ns.route("/auth/plugin/datasource/<path:provider_id>/default")
 class DatasourceAuthDefaultApi(Resource):
+    @console_ns.expect(parser_default)
     @setup_required
     @login_required
     @account_initialization_required
+    @edit_permission_required
     def post(self, provider_id: str):
-        if not current_user.is_editor:
-            raise Forbidden()
-        parser = reqparse.RequestParser()
-        parser.add_argument("id", type=str, required=True, nullable=False, location="json")
-        args = parser.parse_args()
+        _, current_tenant_id = current_account_with_tenant()
+        args = parser_default.parse_args()
         datasource_provider_id = DatasourceProviderID(provider_id)
         datasource_provider_service = DatasourceProviderService()
         datasource_provider_service.set_default_datasource_provider(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             datasource_provider_id=datasource_provider_id,
             credential_id=args["id"],
         )
         return {"result": "success"}, 200
+parser_update_name = (
+    reqparse.RequestParser()
+    .add_argument("name", type=StrLen(max_length=100), required=True, nullable=False, location="json")
+    .add_argument("credential_id", type=str, required=True, nullable=False, location="json")
+)
 @console_ns.route("/auth/plugin/datasource/<path:provider_id>/update-name")
 class DatasourceUpdateProviderNameApi(Resource):
+    @console_ns.expect(parser_update_name)
     @setup_required
     @login_required
     @account_initialization_required
+    @edit_permission_required
     def post(self, provider_id: str):
-        if not current_user.is_editor:
-            raise Forbidden()
-        parser = reqparse.RequestParser()
-        parser.add_argument("name", type=StrLen(max_length=100), required=True, nullable=False, location="json")
-        parser.add_argument("credential_id", type=str, required=True, nullable=False, location="json")
-        args = parser.parse_args()
+        _, current_tenant_id = current_account_with_tenant()
+        args = parser_update_name.parse_args()
         datasource_provider_id = DatasourceProviderID(provider_id)
         datasource_provider_service = DatasourceProviderService()
         datasource_provider_service.update_datasource_provider_name(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             datasource_provider_id=datasource_provider_id,
             name=args["name"],
             credential_id=args["credential_id"],
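The recurring refactor in this file: per-request RequestParser construction and inline current_user.is_editor checks give way to module-level parsers plus @edit_permission_required, and tenant access goes through current_account_with_tenant(), which, judging by the call sites, returns an (account, tenant_id) pair. Declaring the parser once also lets @console_ns.expect(...) document the request body in the generated Swagger spec. A minimal runnable sketch of the parser half of the pattern, with a hypothetical /rename endpoint and "demo" namespace standing in for Dify's routes:

# Sketch only, not Dify's code: a module-level parser built as a fluent chain
# (flask_restx's RequestParser.add_argument returns the parser itself), reused by
# the handler and by @ns.expect so the schema appears in the Swagger docs.
from flask import Flask
from flask_restx import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app)
ns = api.namespace("demo")

parser_rename = (
    reqparse.RequestParser()
    .add_argument("name", type=str, required=True, nullable=False, location="json")
    .add_argument("credential_id", type=str, required=True, nullable=False, location="json")
)

@ns.route("/rename")
class RenameApi(Resource):
    @ns.expect(parser_rename)  # documents the JSON body in Swagger
    def post(self):
        args = parser_rename.parse_args()  # the same parser validates at request time
        return {"result": "success", "name": args["name"]}, 200

Building the parser once at import time, instead of on every request, is what makes the @console_ns.expect reuse possible in the diffs above.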

View File

@@ -1,7 +1,7 @@
 from flask_restx import (  # type: ignore
     Resource,  # type: ignore
-    reqparse,
 )
+from pydantic import BaseModel
 from werkzeug.exceptions import Forbidden
 from controllers.console import console_ns
@@ -12,9 +12,21 @@ from models import Account
 from models.dataset import Pipeline
 from services.rag_pipeline.rag_pipeline import RagPipelineService
+DEFAULT_REF_TEMPLATE_SWAGGER_2_0 = "#/definitions/{model}"
+class Parser(BaseModel):
+    inputs: dict
+    datasource_type: str
+    credential_id: str | None = None
+console_ns.schema_model(Parser.__name__, Parser.model_json_schema(ref_template=DEFAULT_REF_TEMPLATE_SWAGGER_2_0))
 @console_ns.route("/rag/pipelines/<uuid:pipeline_id>/workflows/published/datasource/nodes/<string:node_id>/preview")
 class DataSourceContentPreviewApi(Resource):
+    @console_ns.expect(console_ns.models[Parser.__name__], validate=True)
     @setup_required
     @login_required
     @account_initialization_required
@@ -26,19 +38,10 @@ class DataSourceContentPreviewApi(Resource):
         if not isinstance(current_user, Account):
             raise Forbidden()
-        parser = reqparse.RequestParser()
-        parser.add_argument("inputs", type=dict, required=True, nullable=False, location="json")
-        parser.add_argument("datasource_type", type=str, required=True, location="json")
-        parser.add_argument("credential_id", type=str, required=False, location="json")
-        args = parser.parse_args()
-        inputs = args.get("inputs")
-        if inputs is None:
-            raise ValueError("missing inputs")
-        datasource_type = args.get("datasource_type")
-        if datasource_type is None:
-            raise ValueError("missing datasource_type")
+        args = Parser.model_validate(console_ns.payload)
+        inputs = args.inputs
+        datasource_type = args.datasource_type
         rag_pipeline_service = RagPipelineService()
         preview_content = rag_pipeline_service.run_datasource_node_preview(
             pipeline=pipeline,
@@ -47,6 +50,6 @@ class DataSourceContentPreviewApi(Resource):
             account=current_user,
             datasource_type=datasource_type,
             is_published=True,
-            credential_id=args.get("credential_id"),
+            credential_id=args.credential_id,
         )
         return preview_content, 200
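This file swaps reqparse for a Pydantic BaseModel: the model is registered with console_ns.schema_model(...) for Swagger, and Parser.model_validate(console_ns.payload) both parses and validates, which is why the manual "if inputs is None" guards could be deleted; missing or mistyped fields now raise a ValidationError before the handler logic runs. A small self-contained sketch of that behavior (field names copied from the Parser model above; the payload dicts are invented):

# Sketch of payload validation with a Pydantic model instead of reqparse, assuming
# the body arrives as a plain dict (flask_restx exposes it as ns.payload).
from pydantic import BaseModel, ValidationError

class PreviewBody(BaseModel):
    inputs: dict
    datasource_type: str
    credential_id: str | None = None

body = PreviewBody.model_validate({"inputs": {"q": 1}, "datasource_type": "online_document"})
print(body.credential_id)  # None: optional fields default cleanly

try:
    PreviewBody.model_validate({"inputs": {}})  # missing datasource_type
except ValidationError as exc:
    print(exc.error_count(), "validation error")  # raised before any handler logic would run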

View File

@@ -66,29 +66,31 @@ class CustomizedPipelineTemplateApi(Resource):
     @account_initialization_required
     @enterprise_license_required
     def patch(self, template_id: str):
-        parser = reqparse.RequestParser()
-        parser.add_argument(
-            "name",
-            nullable=False,
-            required=True,
-            help="Name must be between 1 to 40 characters.",
-            type=_validate_name,
-        )
-        parser.add_argument(
-            "description",
-            type=_validate_description_length,
-            nullable=True,
-            required=False,
-            default="",
-        )
-        parser.add_argument(
-            "icon_info",
-            type=dict,
-            location="json",
-            nullable=True,
+        parser = (
+            reqparse.RequestParser()
+            .add_argument(
+                "name",
+                nullable=False,
+                required=True,
+                help="Name must be between 1 to 40 characters.",
+                type=_validate_name,
+            )
+            .add_argument(
+                "description",
+                type=_validate_description_length,
+                nullable=True,
+                required=False,
+                default="",
+            )
+            .add_argument(
+                "icon_info",
+                type=dict,
+                location="json",
+                nullable=True,
+            )
+        )
         args = parser.parse_args()
-        pipeline_template_info = PipelineTemplateInfoEntity(**args)
+        pipeline_template_info = PipelineTemplateInfoEntity.model_validate(args)
         RagPipelineService.update_customized_pipeline_template(template_id, pipeline_template_info)
         return 200
@@ -123,26 +125,28 @@ class PublishCustomizedPipelineTemplateApi(Resource):
     @enterprise_license_required
     @knowledge_pipeline_publish_enabled
     def post(self, pipeline_id: str):
-        parser = reqparse.RequestParser()
-        parser.add_argument(
-            "name",
-            nullable=False,
-            required=True,
-            help="Name must be between 1 to 40 characters.",
-            type=_validate_name,
-        )
-        parser.add_argument(
-            "description",
-            type=_validate_description_length,
-            nullable=True,
-            required=False,
-            default="",
-        )
-        parser.add_argument(
-            "icon_info",
-            type=dict,
-            location="json",
-            nullable=True,
+        parser = (
+            reqparse.RequestParser()
+            .add_argument(
+                "name",
+                nullable=False,
+                required=True,
+                help="Name must be between 1 to 40 characters.",
+                type=_validate_name,
+            )
+            .add_argument(
+                "description",
+                type=_validate_description_length,
+                nullable=True,
+                required=False,
+                default="",
+            )
+            .add_argument(
+                "icon_info",
+                type=dict,
+                location="json",
+                nullable=True,
+            )
+        )
         args = parser.parse_args()
         rag_pipeline_service = RagPipelineService()
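The first handler above also switches from PipelineTemplateInfoEntity(**args) to PipelineTemplateInfoEntity.model_validate(args). The difference: model_validate accepts any mapping (including the ParseResult that parse_args() returns) and runs Pydantic's full validation in one step, while keyword unpacking depends on args being a plain dict whose keys exactly match the constructor's parameters. A sketch with an invented TemplateInfo model standing in for the entity:

# Sketch: validating a parsed-args mapping through a Pydantic model in one step.
from pydantic import BaseModel

class TemplateInfo(BaseModel):
    name: str
    description: str = ""
    icon_info: dict | None = None

args = {"name": "weekly-report", "description": "", "icon_info": None}
info = TemplateInfo.model_validate(args)  # checks keys and types; raises on bad input
print(info.name)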

View File

@@ -1,4 +1,3 @@
-from flask_login import current_user
 from flask_restx import Resource, marshal, reqparse
 from sqlalchemy.orm import Session
 from werkzeug.exceptions import Forbidden
@@ -13,7 +12,7 @@ from controllers.console.wraps import (
 )
 from extensions.ext_database import db
 from fields.dataset_fields import dataset_detail_fields
-from libs.login import login_required
+from libs.login import current_account_with_tenant, login_required
 from models.dataset import DatasetPermissionEnum
 from services.dataset_service import DatasetPermissionService, DatasetService
 from services.entities.knowledge_entities.rag_pipeline_entities import IconInfo, RagPipelineDatasetCreateEntity
@@ -27,9 +26,7 @@ class CreateRagPipelineDatasetApi(Resource):
     @account_initialization_required
     @cloud_edition_billing_rate_limit_check("knowledge")
     def post(self):
-        parser = reqparse.RequestParser()
-        parser.add_argument(
+        parser = reqparse.RequestParser().add_argument(
             "yaml_content",
             type=str,
             nullable=False,
@@ -38,7 +35,7 @@ class CreateRagPipelineDatasetApi(Resource):
         )
         args = parser.parse_args()
+        current_user, current_tenant_id = current_account_with_tenant()
         # The role of the current user in the ta table must be admin, owner, or editor, or dataset_operator
         if not current_user.is_dataset_editor:
             raise Forbidden()
@@ -58,12 +55,12 @@ class CreateRagPipelineDatasetApi(Resource):
         with Session(db.engine) as session:
             rag_pipeline_dsl_service = RagPipelineDslService(session)
             import_info = rag_pipeline_dsl_service.create_rag_pipeline_dataset(
-                tenant_id=current_user.current_tenant_id,
+                tenant_id=current_tenant_id,
                 rag_pipeline_dataset_create_entity=rag_pipeline_dataset_create_entity,
             )
             if rag_pipeline_dataset_create_entity.permission == "partial_members":
                 DatasetPermissionService.update_partial_member_list(
-                    current_user.current_tenant_id,
+                    current_tenant_id,
                     import_info["dataset_id"],
                     rag_pipeline_dataset_create_entity.partial_member_list,
                 )
@@ -81,10 +78,12 @@ class CreateEmptyRagPipelineDatasetApi(Resource):
     @cloud_edition_billing_rate_limit_check("knowledge")
     def post(self):
         # The role of the current user in the ta table must be admin, owner, or editor, or dataset_operator
+        current_user, current_tenant_id = current_account_with_tenant()
         if not current_user.is_dataset_editor:
             raise Forbidden()
         dataset = DatasetService.create_empty_rag_pipeline_dataset(
-            tenant_id=current_user.current_tenant_id,
+            tenant_id=current_tenant_id,
             rag_pipeline_dataset_create_entity=RagPipelineDatasetCreateEntity(
                 name="",
                 description="",

View File

@@ -23,7 +23,7 @@ from extensions.ext_database import db
 from factories.file_factory import build_from_mapping, build_from_mappings
 from factories.variable_factory import build_segment_with_type
 from libs.login import current_user, login_required
-from models.account import Account
+from models import Account
 from models.dataset import Pipeline
 from models.workflow import WorkflowDraftVariable
 from services.rag_pipeline.rag_pipeline import RagPipelineService
@@ -33,16 +33,18 @@ logger = logging.getLogger(__name__)
 def _create_pagination_parser():
-    parser = reqparse.RequestParser()
-    parser.add_argument(
-        "page",
-        type=inputs.int_range(1, 100_000),
-        required=False,
-        default=1,
-        location="args",
-        help="the page of data requested",
-    )
-    parser.add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
+    parser = (
+        reqparse.RequestParser()
+        .add_argument(
+            "page",
+            type=inputs.int_range(1, 100_000),
+            required=False,
+            default=1,
+            location="args",
+            help="the page of data requested",
+        )
+        .add_argument("limit", type=inputs.int_range(1, 100), required=False, default=20, location="args")
+    )
     return parser
@@ -206,10 +208,11 @@ class RagPipelineVariableApi(Resource):
# "upload_file_id": "1602650a-4fe4-423c-85a2-af76c083e3c4"
# }
parser = reqparse.RequestParser()
parser.add_argument(self._PATCH_NAME_FIELD, type=str, required=False, nullable=True, location="json")
# Parse 'value' field as-is to maintain its original data structure
parser.add_argument(self._PATCH_VALUE_FIELD, type=lambda x: x, required=False, nullable=True, location="json")
parser = (
reqparse.RequestParser()
.add_argument(self._PATCH_NAME_FIELD, type=str, required=False, nullable=True, location="json")
.add_argument(self._PATCH_VALUE_FIELD, type=lambda x: x, required=False, nullable=True, location="json")
)
draft_var_srv = WorkflowDraftVariableService(
session=db.session(),
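Note the detail the last hunk preserves: type=lambda x: x is a passthrough coercion. reqparse pipes every argument through its type callable, so an identity lambda keeps the raw JSON value (dict, list, or scalar) intact, which is what the old inline comment meant by parsing the value "as-is". A compact runnable sketch with an invented /vars endpoint:

# Sketch of the passthrough-type trick: "name" is coerced to str, while "value"
# passes through untouched, whatever JSON structure the client sent.
from flask import Flask
from flask_restx import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app)

parser = (
    reqparse.RequestParser()
    .add_argument("name", type=str, required=False, nullable=True, location="json")
    .add_argument("value", type=lambda x: x, required=False, nullable=True, location="json")
)

@api.route("/vars")
class Vars(Resource):
    def patch(self):
        args = parser.parse_args()
        return {"name": args["name"], "value": args["value"]}, 200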

Some files were not shown because too many files changed in this diff.