feat: add MiniMax as an LLM provider, merging PR#280 and adding the missing listener

Merge the MiniMax provider implementation from PR#280, resolving conflicts
with the main branch, and register MyChatModelListener in MinimaxServiceImpl
to stay consistent with the other providers.
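The listener wiring described above can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not the project's actual code: the `MinimaxService` class, its `chat` method, and the `MyChatModelListener` interface shape shown here are assumptions modeled on the commit message (the real `MinimaxServiceImpl` presumably registers the listener on the underlying chat model client).

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical listener interface, mirroring the request/response callback
// style common to chat-model listener APIs. The real MyChatModelListener
// may expose different methods.
interface MyChatModelListener {
    void onRequest(String prompt);
    void onResponse(String response);
}

// Simplified stand-in for MinimaxServiceImpl: it notifies all registered
// listeners before and after each model call, like the other providers.
class MinimaxService {
    private final List<MyChatModelListener> listeners = new ArrayList<>();

    void addListener(MyChatModelListener listener) {
        listeners.add(listener);
    }

    String chat(String prompt) {
        listeners.forEach(l -> l.onRequest(prompt));
        // Placeholder for the actual MiniMax API call.
        String response = "echo: " + prompt;
        listeners.forEach(l -> l.onResponse(response));
        return response;
    }
}

public class ListenerDemo {
    public static void main(String[] args) {
        MinimaxService service = new MinimaxService();
        List<String> events = new ArrayList<>();
        service.addListener(new MyChatModelListener() {
            public void onRequest(String p)  { events.add("request:" + p); }
            public void onResponse(String r) { events.add("response:" + r); }
        });
        service.chat("hello");
        System.out.println(events);
    }
}
```

Registering the listener inside the service keeps logging and metrics uniform across providers, which is the stated goal of this commit.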

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
wangle
2026-04-17 18:31:53 +08:00
parent 74eb5b2530
commit 081da6d18d
13 changed files with 347 additions and 214 deletions


@@ -34,7 +34,7 @@
| Module | Current Capabilities |
|:---:|---|
-| **Model Management** | Multi-model integration (OpenAI/DeepSeek/Tongyi/Zhipu), multi-modal understanding, Coze/DIFY/FastGPT platform integration |
+| **Model Management** | Multi-model integration (OpenAI/DeepSeek/Tongyi/Zhipu/MiniMax), multi-modal understanding, Coze/DIFY/FastGPT platform integration |
| **Knowledge Base** | Local RAG + Vector DB (Milvus/Weaviate/Qdrant) + Document parsing |
| **Tool Management** | MCP protocol integration, Skills capability + Extensible tool ecosystem |
| **Workflow Orchestration** | Visual workflow designer, drag-and-drop node orchestration, SSE streaming execution, currently supports model calls, email sending, manual review nodes |