feat: restructure the project layout and add core functionality

refactor: reorganize the code by feature into core/runtime/control and related modules
feat(core): add core traits and types such as Context and FlowNode
feat(runtime): implement FlowEngine and state management
feat(control): add sequence/parallel/conditional control-flow nodes
feat(nodes): implement HTTP/DB/MQ business nodes
docs: update the README with architecture notes and a quick-start example
test: add performance test scripts and example code
39  dsl-flow/Cargo.toml  Normal file
@@ -0,0 +1,39 @@
[package]
name = "dsl-flow"
version = "0.1.0"
edition = "2021"
license = "MIT"
description = "A Rust DSL-based workflow engine supporting stateful/stateless flows, async nodes, fork-join, and extensible expression engines (Rhai/JS)."
readme = "README.md"

[lib]
name = "dsl_flow"
path = "src/lib.rs"

[features]
default = ["rhai", "http"]
rhai = ["dep:rhai"]
js = ["dep:boa_engine"]
http = ["dep:reqwest"]

[dependencies]
tokio = { version = "1", features = ["rt-multi-thread", "macros", "sync", "time"] }
futures = "0.3"
serde = { version = "1.0", features = ["derive"] }
serde_json = { version = "1.0" }
thiserror = "1.0"
async-trait = "0.1"
uuid = { version = "1", features = ["v4"] }
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["fmt", "env-filter"] }

# Optional engines and nodes
rhai = { version = "1", optional = true, features = ["serde", "sync"] }
boa_engine = { version = "0.20", optional = true }
reqwest = { version = "0.12", optional = true, features = ["json", "rustls-tls"] }

[dev-dependencies]
tokio = { version = "1", features = ["rt-multi-thread", "macros", "sync", "time"] }
httpmock = "0.7"
anyhow = "1.0"
164  dsl-flow/README.md  Normal file
@@ -0,0 +1,164 @@
# DSL Flow

[crates.io](https://crates.io/crates/dsl-flow)
[docs.rs](https://docs.rs/dsl-flow)
[License: MIT](LICENSE)

**DSL Flow** is a high-performance, extensible workflow engine written in Rust. It orchestrates complex asynchronous tasks through a declarative DSL (domain-specific language) and supports state management, multi-language scripting (Rhai/JavaScript), and flexible control flow.

The project is designed for building middleware, API orchestration layers, and data-processing pipelines. Its clean, layered architecture makes it easy to extend and maintain.

## ✨ Core Features

* **Declarative DSL**: Define complex flow structures with Rust macros (`sequence!`, `fork_join!`, `group!`).
* **Multi-language support**:
    * **Expression evaluation**: Rhai and JS expressions for conditions and simple data transformations.
    * **Code execution**: Run full Rhai or JS code blocks for more complex business logic.
* **Powerful control flow**:
    * **Sequence**: Run tasks in order.
    * **Fork-Join**: Run branches in parallel, with flexible result-merge strategies (Array or Object).
    * **Conditional**: Expression-driven branching (`if-else`).
* **State management**: Stateless and stateful execution modes, with a built-in in-memory state store that can be extended to Redis or a database.
* **Built-in nodes**: HTTP requests, a database mock, a message-queue mock, lineage tracking, and more.
* **Async and fast**: Built on `Tokio` and `async-trait`; fully asynchronous and non-blocking end to end.
* **Modular architecture**: Core, Runtime, Control, and Nodes are separate layers, each with a single responsibility.

## 📦 Installation

Add the dependency to your `Cargo.toml`:

```toml
[dependencies]
dsl-flow = "0.1"
```

Enable optional features as needed:

```toml
[dependencies]
dsl-flow = { version = "0.1", features = ["js", "http"] }
```

* `rhai`: enables the Rhai script engine (on by default).
* `js`: enables the JavaScript (Boa) script engine.
* `http`: enables the HTTP request node.

## 🚀 Quick Start

The following example defines a simple flow: first compute `1 + 2`, then compute its double and triple in parallel. (The `js(...)` branch requires the `js` feature, which is not enabled by default.)

```rust
use dsl_flow::*;
use dsl_flow::dsl::*; // DSL macros and helper functions

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    // 1. Initialize the engine
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions {
        stateful: false,
        expr_engine: ExprEngineKind::Rhai
    });

    // 2. Define the flow
    let flow = Flow::new(sequence! {
        // Step 1: set the initial value
        expr_set(ExprEngineKind::Rhai, "1 + 2", "calc.sum"),

        // Step 2: run branches in parallel
        fork_join! {
            // Rhai code block
            rhai("let sum = ctx.calc.sum; sum * 2"),
            // JS code block
            js("const sum = ctx.calc.sum; sum * 3")
        }
    });

    // 3. Run the flow
    let ctx = Context::new();
    let output = engine.run_stateless(&flow, ctx).await?;

    println!("Result: {}", output.data);
    Ok(())
}
```
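
For stateful execution the engine persists the final `Context` under a session ID and reloads it on the next run. A minimal sketch, mirroring the stateful test in `tests/flow_tests.rs` (the `"session-1"` ID and the counter expression are illustrative):

```rust
use dsl_flow::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: true, expr_engine: ExprEngineKind::Rhai });

    // Increment a counter stored in the session context; `()` means "not set yet".
    let flow = Flow::new(sequence! {
        expr_set(ExprEngineKind::Rhai, "if ctx.counter == () { 0 } else { ctx.counter + 1 }", "counter")
    });

    // Both runs use the same session ID, so the second run sees the first run's context.
    let first = engine.run_stateful("session-1", &flow, Context::new()).await?;  // -> 0
    let second = engine.run_stateful("session-1", &flow, Context::new()).await?; // -> 1
    println!("{} then {}", first.data, second.data);
    Ok(())
}
```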

## 🏗️ Architecture Overview

DSL Flow uses a layered architecture with clearly separated responsibilities:

```text
dsl-flow/src/
├── core/          <-- [Core layer] pure abstractions and base types
│   ├── mod.rs
│   ├── traits.rs      (FlowNode, ExprEngine trait definitions)
│   ├── context.rs     (Context management)
│   ├── error.rs       (unified error handling)
│   ├── executor.rs    (TaskExecutor trait)
│   ├── node.rs        (generic Node wrapper)
│   └── types.rs       (common types such as NodeId, NodeOutput)
│
├── runtime/       <-- [Runtime layer] drives execution
│   ├── mod.rs
│   ├── engine.rs      (FlowEngine scheduler)
│   ├── state.rs       (FlowState storage)
│   └── expr.rs        (Rhai/JS script engine implementations)
│
├── control/       <-- [Control layer] flow-control logic
│   ├── mod.rs
│   ├── sequence.rs    (sequential execution)
│   ├── fork_join.rs   (parallel branches)
│   ├── group.rs       (parallel groups)
│   ├── conditional.rs (conditional branching)
│   └── merge_mode.rs  (result-merge strategies)
│
├── expression/    <-- [Expression layer] expression handling
│   ├── mod.rs
│   ├── expr_set.rs    (set a context variable)
│   └── expr_get.rs    (read a context variable)
│
├── nodes/         <-- [Business layer] concrete business nodes
│   ├── mod.rs
│   ├── code.rs        (generic code execution: Rhai/JS)
│   ├── http.rs        (HTTP requests)
│   ├── db.rs          (database mock)
│   ├── mq.rs          (message-queue mock)
│   └── lineage.rs     (lineage tracking)
│
├── dsl.rs         <-- [API layer] DSL macros and helper functions exposed to users
└── lib.rs         <-- [Entry point] re-exports the public API
```

### Layer Summary

* **Core**: Defines the `FlowNode` and `TaskExecutor` traits; the foundation of the framework.
* **Runtime**: Schedules flow execution, manages state, and instantiates the script engines.
* **Control**: Implements flow-control logic: sequential, parallel, and conditional execution.
* **Expression**: Focuses on simple expression evaluation and context variable access (example below).
* **Nodes**: Contains the concrete business nodes: code execution, network requests, database operations, and so on.
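
As an example of how the layers compose, the control layer can merge parallel results into the context and the expression layer can read them back out. This sketch mirrors `test_fork_join_merge_and_lineage` in `tests/flow_tests.rs`; the `"agg.fork"` path and the branch expressions are illustrative:

```rust
use dsl_flow::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let engine = FlowEngine::new(InMemoryStateStore::default(),
        FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });

    let flow = Flow::new(sequence! {
        // Run two branches in parallel and merge their outputs into ctx.agg.fork,
        // keyed by node ID (MergeMode::ObjectById).
        fork_join_merge! { "agg.fork", merge_mode_object_by_id(),
            expr_set(ExprEngineKind::Rhai, "10", "a.x"),
            expr_set(ExprEngineKind::Rhai, "20", "b.y")
        },
        // Read the merged object back out of the context.
        expr_get(ExprEngineKind::Rhai, "ctx.agg.fork")
    });

    let out = engine.run_stateless(&flow, Context::new()).await?;
    assert!(out.data.is_object());
    Ok(())
}
```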

## 🔧 Node Types

### Control Flow
* **Sequence**: `sequence! { node1, node2 }` - run children in order.
* **ForkJoin**: `fork_join! { node1, node2 }` - run children in parallel; results are collected into an array.
* **Conditional**: `ConditionalExecutor` - expression-driven branching (`if-else`); see the sketch below.
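
There is no dedicated macro for conditionals yet; a `ConditionalExecutor` is wrapped in a `Node` by hand. A minimal sketch based on the test suite (the `branch.result` path and the literal condition are illustrative):

```rust
use std::sync::Arc;
use dsl_flow::*;
use dsl_flow::core::types::node_id;

fn conditional_node() -> Node {
    // Both branches write to the same context path; the condition picks one.
    let then_branch = Arc::new(expr_set(ExprEngineKind::Rhai, "42", "branch.result")) as Arc<dyn FlowNode>;
    let else_branch = Arc::new(expr_set(ExprEngineKind::Rhai, "0", "branch.result")) as Arc<dyn FlowNode>;

    Node::new(
        node_id("cond"),
        Box::new(ConditionalExecutor {
            engine: ExprEngineKind::Rhai,
            // Any expression that evaluates to a bool; "false" selects the else branch here.
            condition: "false".into(),
            then_node: then_branch,
            else_node: Some(else_branch),
        }),
    )
}
```

The resulting node can then be dropped into `sequence!` or `fork_join!` just like the built-in helpers.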

### Code & Expression
* **ExprSet**: `expr_set(...)` - evaluate an expression and write the result to a context path (see below).
* **Rhai Code**: `rhai("code")` - run a Rhai code block.
* **JS Code**: `js("code")` - run a JavaScript code block.
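
These helpers all evaluate against the `ctx` variable, and a node's output is the value of its last expression. A small sketch, assuming the default `rhai` feature (the `calc.sum` path is illustrative):

```rust
use dsl_flow::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let engine = FlowEngine::new(InMemoryStateStore::default(), FlowOptions::default());

    let flow = Flow::new(sequence! {
        // Evaluate an expression and store it under ctx.calc.sum.
        expr_set(ExprEngineKind::Rhai, "1 + 2 + 3", "calc.sum"),
        // Run a multi-statement Rhai block; its last expression is the node output.
        rhai("let doubled = ctx.calc.sum * 2; doubled"),
    });

    let out = engine.run_stateless(&flow, Context::new()).await?;
    println!("{}", out.data); // a sequence returns the last child's output: 12
    Ok(())
}
```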

### IO & Side Effects
* **HttpNode**: `http_get/post(...)` - send an HTTP request (requires the `http` feature).
* **DbNode**: `db_node(...)` - simulate a database operation.
* **MqNode**: `mq_node(...)` - simulate publishing to a message queue; see the sketch below.
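
A minimal sketch chaining the built-in IO nodes (the operation name, topic, and payloads are illustrative; `db_node` and `mq_node` only simulate their backends and record their results under `db.last` / `mq.last` in the context):

```rust
use dsl_flow::*;
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    let engine = FlowEngine::new(InMemoryStateStore::default(), FlowOptions::default());

    let flow = Flow::new(sequence! {
        // Simulated insert; the node returns {"op": ..., "params": ..., "status": "ok"}.
        db_node("insert_user", json!({ "name": "Alice" })),
        // Simulated publish; the node returns {"topic": ..., "message": ..., "status": "sent"}.
        mq_node("user.events", json!({ "event": "created", "user": "Alice" })),
    });

    let out = engine.run_stateless(&flow, Context::new()).await?;
    assert_eq!(out.data["status"], json!("sent")); // the sequence returns the last node's output
    Ok(())
}
```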

## 🤝 Contributing

Issues and pull requests are welcome!

1. Fork the repository.
2. Create a feature branch (`git checkout -b feature/AmazingFeature`).
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`).
4. Push the branch (`git push origin feature/AmazingFeature`).
5. Open a Pull Request.

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
19  dsl-flow/examples/basic.rs  Normal file
@@ -0,0 +1,19 @@
use dsl_flow::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error + Send + Sync>> {
    tracing_subscriber::fmt().with_env_filter("info").init();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        expr_set(ExprEngineKind::Rhai, "1 + 2", "calc.sum"),
        fork_join! {
            expr_set(ExprEngineKind::Rhai, "ctx.calc.sum * 2", "calc.double"),
            expr_set(ExprEngineKind::Rhai, "ctx.calc.sum * 3", "calc.triple")
        }
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await?;
    println!("{}", serde_json::to_string_pretty(&out.data)?);
    Ok(())
}
67  dsl-flow/examples/report.rs  Normal file
@@ -0,0 +1,67 @@
use std::fs;
use std::path::PathBuf;
use serde_json::Value;

fn read_json_lines(path: &str) -> Vec<Value> {
    let p = PathBuf::from(path);
    if !p.exists() {
        return vec![];
    }
    let content = fs::read_to_string(p).unwrap_or_default();
    content
        .lines()
        .filter_map(|l| serde_json::from_str::<Value>(l).ok())
        .collect()
}

fn summarize(items: Vec<Value>) -> (usize, usize, f64, Vec<(String, f64)>) {
    let mut total = 0usize;
    let mut failed = 0usize;
    let mut duration = 0f64;
    let mut tests = Vec::new();
    for v in items {
        let t = v.get("type").and_then(|x| x.as_str()).unwrap_or("");
        if t == "test" {
            let name = v.get("name").and_then(|x| x.as_str()).unwrap_or("").to_string();
            let event = v.get("event").and_then(|x| x.as_str()).unwrap_or("");
            let time = v.get("exec_time").and_then(|x| x.as_f64()).unwrap_or(0.0);
            total += 1;
            if event == "failed" {
                failed += 1;
            }
            duration += time;
            tests.push((name, time));
        }
    }
    tests.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    (total, failed, duration, tests)
}

fn main() {
    let out_dir = PathBuf::from("target/test-reports");
    let _ = fs::create_dir_all(&out_dir);
    let default = read_json_lines("target/test-report-default.json");
    let js = read_json_lines("target/test-report-js.json");
    let (t1, f1, d1, s1) = summarize(default);
    let (t2, f2, d2, s2) = summarize(js);
    let mut md = String::new();
    md.push_str("# dsl-flow Test Report\n");
    md.push_str("\n");
    md.push_str("## Default features\n");
    md.push_str(&format!("- total: {}\n- failed: {}\n- duration: {:.3}s\n", t1, f1, d1));
    md.push_str("- top slow tests:\n");
    for (name, time) in s1.iter().take(5) {
        md.push_str(&format!(" - {}: {:.3}s\n", name, time));
    }
    md.push_str("\n");
    md.push_str("## JS feature\n");
    md.push_str(&format!("- total: {}\n- failed: {}\n- duration: {:.3}s\n", t2, f2, d2));
    md.push_str("- top slow tests:\n");
    for (name, time) in s2.iter().take(5) {
        md.push_str(&format!(" - {}: {:.3}s\n", name, time));
    }
    let out_path = out_dir.join("summary.md");
    let _ = fs::write(out_path, md);
    println!("report generated in target/test-reports/summary.md");
}

31  dsl-flow/src/control/conditional.rs  Normal file
@@ -0,0 +1,31 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use crate::core::types::NodeRef;
use serde_json::Value;

/// Conditional branch executor.
#[derive(Clone)]
pub struct ConditionalExecutor {
    #[allow(dead_code)]
    pub engine: crate::runtime::ExprEngineKind,
    pub condition: String,
    pub then_node: NodeRef,
    pub else_node: Option<NodeRef>,
}

#[async_trait::async_trait]
impl TaskExecutor for ConditionalExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let engine = expr.ok_or_else(|| NodeError::Exec("Expr engine not provided".into()))?;
        let val = engine.eval(&self.condition, ctx).await.map_err(|e| NodeError::Exec(e.to_string()))?;
        let cond = match val {
            Value::Bool(b) => b,
            _ => false,
        };
        let selected = if cond { &self.then_node } else { self.else_node.as_ref().unwrap_or(&self.then_node) };
        let out = selected.execute(ctx, Some(engine)).await?;
        Ok(out.data)
    }
}
49  dsl-flow/src/control/fork_join.rs  Normal file
@@ -0,0 +1,49 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use crate::core::types::NodeRef;
use crate::control::merge_mode::MergeMode;
use serde_json::Value;

/// Parallel branch executor (fork-join).
#[derive(Clone)]
pub struct ForkJoinExecutor {
    pub branches: Vec<NodeRef>,
    pub merge_to_ctx: Option<String>,
    pub merge_mode: MergeMode,
}

#[async_trait::async_trait]
impl TaskExecutor for ForkJoinExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let mut tasks = Vec::with_capacity(self.branches.len());
        for b in &self.branches {
            let mut subctx = Context::with_value(ctx.as_value().clone());
            let b = b.clone();
            tasks.push(Box::pin(async move { b.execute(&mut subctx, expr).await }));
        }
        let joined = futures::future::join_all(tasks).await;
        let mut results = Vec::new();
        for res in joined {
            let out = res?;
            results.push(serde_json::json!({ "id": out.id, "data": out.data }));
        }
        let data = match self.merge_mode {
            MergeMode::Array => Value::Array(results.clone()),
            MergeMode::ObjectById => {
                let mut map = serde_json::Map::new();
                for item in &results {
                    let id = item.get("id").and_then(|v| v.as_str()).unwrap_or_default().to_string();
                    let data = item.get("data").cloned().unwrap_or(Value::Null);
                    map.insert(id, data);
                }
                Value::Object(map)
            }
        };
        if let Some(path) = &self.merge_to_ctx {
            ctx.set(path, data.clone());
        }
        Ok(data)
    }
}
50  dsl-flow/src/control/group.rs  Normal file
@@ -0,0 +1,50 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use crate::core::types::NodeRef;
use crate::control::merge_mode::MergeMode;
use serde_json::Value;

/// Parallel group executor.
#[derive(Clone)]
pub struct GroupExecutor {
    pub parallel: Vec<NodeRef>,
    pub merge_to_ctx: Option<String>,
    pub merge_mode: MergeMode,
}

#[async_trait::async_trait]
impl TaskExecutor for GroupExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        // The logic is currently the same as ForkJoin.
        let mut joins = Vec::with_capacity(self.parallel.len());
        for n in &self.parallel {
            let mut subctx = Context::with_value(ctx.as_value().clone());
            let n = n.clone();
            joins.push(Box::pin(async move { n.execute(&mut subctx, expr).await }));
        }
        let joined = futures::future::join_all(joins).await;
        let mut results = Vec::new();
        for res in joined {
            let out = res?;
            results.push(serde_json::json!({ "id": out.id, "data": out.data }));
        }
        let data = match self.merge_mode {
            MergeMode::Array => Value::Array(results.clone()),
            MergeMode::ObjectById => {
                let mut map = serde_json::Map::new();
                for item in &results {
                    let id = item.get("id").and_then(|v| v.as_str()).unwrap_or_default().to_string();
                    let data = item.get("data").cloned().unwrap_or(Value::Null);
                    map.insert(id, data);
                }
                Value::Object(map)
            }
        };
        if let Some(path) = &self.merge_to_ctx {
            ctx.set(path, data.clone());
        }
        Ok(data)
    }
}
8  dsl-flow/src/control/merge_mode.rs  Normal file
@@ -0,0 +1,8 @@
use serde::{Deserialize, Serialize};

/// Result-merge mode for parallel executors.
#[derive(Debug, Clone, Copy, Serialize, Deserialize)]
pub enum MergeMode {
    Array,
    ObjectById,
}
11  dsl-flow/src/control/mod.rs  Normal file
@@ -0,0 +1,11 @@
pub mod sequence;
pub mod fork_join;
pub mod group;
pub mod conditional;
pub mod merge_mode;

pub use sequence::SequenceExecutor;
pub use fork_join::ForkJoinExecutor;
pub use group::GroupExecutor;
pub use conditional::ConditionalExecutor;
pub use merge_mode::MergeMode;
24  dsl-flow/src/control/sequence.rs  Normal file
@@ -0,0 +1,24 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use crate::core::types::NodeRef;
use serde_json::Value;

/// Sequential executor.
#[derive(Clone)]
pub struct SequenceExecutor {
    pub children: Vec<NodeRef>,
}

#[async_trait::async_trait]
impl TaskExecutor for SequenceExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let mut last = Value::Null;
        for child in &self.children {
            let out = child.execute(ctx, expr).await?;
            last = out.data;
        }
        Ok(last)
    }
}
135  dsl-flow/src/core/context.rs  Normal file
@@ -0,0 +1,135 @@
use serde::{Deserialize, Serialize};
use serde_json::{json, Value};
use std::time::{SystemTime, UNIX_EPOCH};

/// Execution context holding flow data and metadata.
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct Context {
    /// The actual data (a JSON object).
    data: Value,
    /// Metadata (e.g. lineage information).
    meta: ContextMeta,
}

impl Context {
    /// Creates a new, empty context.
    pub fn new() -> Self {
        Self { data: json!({}), meta: ContextMeta::default() }
    }

    /// Creates a context from the given value.
    pub fn with_value(value: Value) -> Self {
        Self { data: value, meta: ContextMeta::default() }
    }

    /// Gets the value at the given path.
    ///
    /// # Arguments
    /// * `path` - dot-separated path (e.g. "foo.bar")
    pub fn get(&self, path: impl AsRef<str>) -> Option<Value> {
        get_path(&self.data, path.as_ref())
    }

    /// Sets the value at the given path.
    ///
    /// # Arguments
    /// * `path` - dot-separated path (e.g. "foo.bar")
    /// * `value` - the value to store
    pub fn set(&mut self, path: impl AsRef<str>, value: Value) {
        set_path(&mut self.data, path.as_ref(), value);
    }

    /// Returns a reference to the underlying JSON value.
    pub fn as_value(&self) -> &Value {
        &self.data
    }

    /// Records a write operation (used for lineage tracking).
    pub fn record_write(&mut self, node_id: impl Into<String>, path: impl Into<String>) {
        let ts = SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_millis() as u64;
        self.meta.lineage.push(LineageEntry { node_id: node_id.into(), path: path.into(), ts });
    }

    /// Returns the recorded lineage entries.
    pub fn lineage(&self) -> Vec<LineageEntry> {
        self.meta.lineage.clone()
    }
}

/// Path helper marker type.
#[derive(Debug, Clone)]
pub struct ValuePath;

/// Context metadata.
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct ContextMeta {
    /// Recorded lineage entries.
    pub lineage: Vec<LineageEntry>,
}

/// A single lineage entry.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct LineageEntry {
    /// ID of the node that performed the write.
    pub node_id: String,
    /// Path that was written.
    pub path: String,
    /// Timestamp (milliseconds).
    pub ts: u64,
}

fn split_path(path: &str) -> Vec<&str> {
    path.split('.').filter(|s| !s.is_empty()).collect()
}

fn get_path(root: &Value, path: &str) -> Option<Value> {
    let mut cur = root;
    for seg in split_path(path) {
        match cur {
            Value::Object(map) => {
                cur = map.get(seg)?;
            }
            Value::Array(arr) => {
                if let Ok(idx) = seg.parse::<usize>() {
                    cur = arr.get(idx)?;
                } else {
                    return None;
                }
            }
            _ => return None,
        }
    }
    Some(cur.clone())
}

fn set_path(root: &mut Value, path: &str, value: Value) {
    let segs = split_path(path);
    if segs.is_empty() { return; }

    let mut cur = root;
    for (i, seg) in segs.iter().enumerate() {
        let is_last = i == segs.len() - 1;
        if is_last {
            match cur {
                Value::Object(map) => {
                    map.insert((*seg).to_string(), value);
                }
                _ => {
                    *cur = json!({ (*seg): value });
                }
            }
            return;
        }

        if !cur.is_object() {
            *cur = json!({});
        }

        if let Value::Object(map) = cur {
            if !map.contains_key(*seg) {
                map.insert((*seg).to_string(), json!({}));
            }
            cur = map.get_mut(*seg).unwrap();
        }
    }
}
19  dsl-flow/src/core/error.rs  Normal file
@@ -0,0 +1,19 @@
use thiserror::Error;

/// Node execution error.
#[derive(Debug, Error)]
pub enum NodeError {
    #[error("Execution error: {0}")]
    Exec(String),
}

/// Expression evaluation error.
#[derive(Debug, Error)]
pub enum ExprError {
    #[error("Rhai error: {0}")]
    Rhai(String),
    #[error("JS error: {0}")]
    Js(String),
    #[error("Unsupported engine")]
    Unsupported,
}
11  dsl-flow/src/core/executor.rs  Normal file
@@ -0,0 +1,11 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Task executor trait.
#[async_trait::async_trait]
pub trait TaskExecutor: Send + Sync {
    /// Executes the task logic.
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError>;
}
14  dsl-flow/src/core/mod.rs  Normal file
@@ -0,0 +1,14 @@
pub mod context;
pub mod types;
pub mod error;
pub mod traits;
pub mod executor;

pub mod node;

pub use context::{Context, LineageEntry};
pub use types::{NodeId, NodeOutput};
pub use error::{NodeError, ExprError};
pub use traits::{FlowNode, ExprEngine};
pub use executor::TaskExecutor;
pub use node::Node;
35  dsl-flow/src/core/node.rs  Normal file
@@ -0,0 +1,35 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::{ExprEngine, FlowNode};
use crate::core::types::{NodeId, NodeOutput};

/// Generic node: an ID plus a boxed executor.
pub struct Node {
    pub id: NodeId,
    pub name: Option<String>,
    pub executor: Box<dyn TaskExecutor>,
}

impl Node {
    pub fn new(id: NodeId, executor: Box<dyn TaskExecutor>) -> Self {
        Self { id, name: None, executor }
    }

    pub fn with_name(mut self, name: impl Into<String>) -> Self {
        self.name = Some(name.into());
        self
    }
}

#[async_trait::async_trait]
impl FlowNode for Node {
    fn id(&self) -> &str {
        &self.id
    }

    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<NodeOutput, NodeError> {
        let data = self.executor.execute(ctx, expr).await?;
        Ok(NodeOutput { id: self.id.clone(), data })
    }
}
28  dsl-flow/src/core/traits.rs  Normal file
@@ -0,0 +1,28 @@
use crate::core::context::Context;
use crate::core::error::{ExprError, NodeError};
use crate::core::types::NodeOutput;
use serde_json::Value;

/// Flow node trait.
#[async_trait::async_trait]
pub trait FlowNode: Send + Sync {
    /// Returns the node ID.
    fn id(&self) -> &str;
    /// Executes the node logic.
    ///
    /// # Arguments
    /// * `ctx` - execution context
    /// * `expr` - expression engine (optional)
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<NodeOutput, NodeError>;
}

/// Expression engine trait.
#[async_trait::async_trait]
pub trait ExprEngine: Send + Sync {
    /// Evaluates a script.
    ///
    /// # Arguments
    /// * `script` - script source
    /// * `ctx` - execution context
    async fn eval(&self, script: &str, ctx: &Context) -> Result<Value, ExprError>;
}
28  dsl-flow/src/core/types.rs  Normal file
@@ -0,0 +1,28 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;

use crate::core::traits::FlowNode;
use std::sync::Arc;

/// Node reference type (Arc<dyn FlowNode>).
pub type NodeRef = Arc<dyn FlowNode>;

/// Node ID type.
pub type NodeId = String;

/// Node output.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct NodeOutput {
    /// Node ID.
    pub id: NodeId,
    /// Node output data.
    pub data: Value,
}

/// Generates a unique node ID.
///
/// # Arguments
/// * `prefix` - ID prefix
pub fn node_id(prefix: &str) -> NodeId {
    format!("{}-{}", prefix, uuid::Uuid::new_v4())
}
237  dsl-flow/src/dsl.rs  Normal file
@@ -0,0 +1,237 @@
use crate::core::node::Node;
use crate::core::types::node_id;
use crate::control::MergeMode;
use crate::expression::{
    ExprSetExecutor, ExprGetExecutor
};
use crate::nodes::{
    DbExecutor, MqExecutor, LineageExecutor
};
#[cfg(feature = "http")]
use crate::nodes::HttpExecutor;
use crate::runtime::ExprEngineKind;

/// Creates a node that runs its children in order.
#[macro_export]
macro_rules! sequence {
    ( $($node:expr),* $(,)? ) => {{
        let mut nodes: Vec<std::sync::Arc<dyn $crate::core::traits::FlowNode>> = Vec::new();
        $(
            nodes.push(std::sync::Arc::new($node));
        )*
        $crate::core::node::Node::new(
            $crate::core::types::node_id("seq"),
            Box::new($crate::control::SequenceExecutor { children: nodes })
        )
    }};
}

/// Creates a node that runs its branches in parallel (fork-join).
#[macro_export]
macro_rules! fork_join {
    ( $($node:expr),* $(,)? ) => {{
        let mut nodes: Vec<std::sync::Arc<dyn $crate::core::traits::FlowNode>> = Vec::new();
        $(
            nodes.push(std::sync::Arc::new($node));
        )*
        $crate::core::node::Node::new(
            $crate::core::types::node_id("fork"),
            Box::new($crate::control::ForkJoinExecutor {
                branches: nodes,
                merge_to_ctx: None,
                merge_mode: $crate::control::MergeMode::Array
            })
        )
    }};
}

/// Creates a fork-join node that also merges the results into the context.
#[macro_export]
macro_rules! fork_join_merge {
    ( $merge_path:expr, $mode:expr, $( $node:expr ),* $(,)? ) => {{
        let mut nodes: Vec<std::sync::Arc<dyn $crate::core::traits::FlowNode>> = Vec::new();
        $(
            nodes.push(std::sync::Arc::new($node));
        )*
        $crate::core::node::Node::new(
            $crate::core::types::node_id("fork"),
            Box::new($crate::control::ForkJoinExecutor {
                branches: nodes,
                merge_to_ctx: Some($merge_path.into()),
                merge_mode: $mode
            })
        )
    }};
}

/// Creates a parallel group node.
#[macro_export]
macro_rules! group {
    ( $($node:expr),* $(,)? ) => {{
        let mut nodes: Vec<std::sync::Arc<dyn $crate::core::traits::FlowNode>> = Vec::new();
        $(
            nodes.push(std::sync::Arc::new($node));
        )*
        $crate::core::node::Node::new(
            $crate::core::types::node_id("group"),
            Box::new($crate::control::GroupExecutor {
                parallel: nodes,
                merge_to_ctx: None,
                merge_mode: $crate::control::MergeMode::Array
            })
        )
    }};
}

/// Creates a parallel group node that also merges the results into the context.
#[macro_export]
macro_rules! group_merge {
    ( $merge_path:expr, $mode:expr, $( $node:expr ),* $(,)? ) => {{
        let mut nodes: Vec<std::sync::Arc<dyn $crate::core::traits::FlowNode>> = Vec::new();
        $(
            nodes.push(std::sync::Arc::new($node));
        )*
        $crate::core::node::Node::new(
            $crate::core::types::node_id("group"),
            Box::new($crate::control::GroupExecutor {
                parallel: nodes,
                merge_to_ctx: Some($merge_path.into()),
                merge_mode: $mode
            })
        )
    }};
}

/// Creates an expression node that writes its result to a context path.
pub fn expr_set(engine: ExprEngineKind, script: &str, target_path: &str) -> Node {
    Node::new(
        node_id("expr_set"),
        Box::new(ExprSetExecutor {
            engine,
            script: script.into(),
            target_path: target_path.into(),
        }),
    )
}

/// Creates an expression node that returns its result without writing to the context.
pub fn expr_get(engine: ExprEngineKind, script: &str) -> Node {
    Node::new(
        node_id("expr_get"),
        Box::new(ExprGetExecutor {
            engine,
            script: script.into(),
        }),
    )
}

/// Creates a Rhai code-execution node.
pub fn rhai(script: &str) -> Node {
    Node::new(
        node_id("rhai"),
        Box::new(crate::nodes::CodeExecutor {
            engine: ExprEngineKind::Rhai,
            script: script.into(),
        }),
    )
}

/// Creates a JavaScript code-execution node.
pub fn js(script: &str) -> Node {
    Node::new(
        node_id("js"),
        Box::new(crate::nodes::CodeExecutor {
            engine: ExprEngineKind::Js,
            script: script.into(),
        }),
    )
}

/// Creates a generic code-execution node.
pub fn run_code(engine: ExprEngineKind, script: &str) -> Node {
    Node::new(
        node_id("code"),
        Box::new(crate::nodes::CodeExecutor {
            engine,
            script: script.into(),
        }),
    )
}

/// Creates an HTTP GET request node.
#[cfg(feature = "http")]
pub fn http_get(url: &str) -> Node {
    Node::new(
        node_id("http"),
        Box::new(HttpExecutor {
            method: "GET".into(),
            url: url.into(),
            body: None,
        }),
    )
}

/// Creates an HTTP POST request node.
#[cfg(feature = "http")]
pub fn http_post(url: &str, body: serde_json::Value) -> Node {
    Node::new(
        node_id("http"),
        Box::new(HttpExecutor {
            method: "POST".into(),
            url: url.into(),
            body: Some(body),
        }),
    )
}

/// Creates a database operation node.
pub fn db_node(operation: &str, params: serde_json::Value) -> Node {
    Node::new(
        node_id("db"),
        Box::new(DbExecutor {
            operation: operation.into(),
            params,
        }),
    )
}

/// Creates a message-queue node.
pub fn mq_node(topic: &str, message: serde_json::Value) -> Node {
    Node::new(
        node_id("mq"),
        Box::new(MqExecutor {
            topic: topic.into(),
            message,
        }),
    )
}

/// Creates a lineage-tracking node.
pub fn lineage_node() -> Node {
    Node::new(
        node_id("lineage"),
        Box::new(LineageExecutor {
            target_path: None,
        }),
    )
}

/// Creates a lineage-tracking node that writes the lineage to a context path.
pub fn lineage_node_to_path(path: &str) -> Node {
    Node::new(
        node_id("lineage"),
        Box::new(LineageExecutor {
            target_path: Some(path.into()),
        }),
    )
}

/// Merge mode: aggregate results into an object keyed by node ID.
pub fn merge_mode_object_by_id() -> MergeMode {
    MergeMode::ObjectById
}

/// Merge mode: aggregate results into an array.
pub fn merge_mode_array() -> MergeMode {
    MergeMode::Array
}
22  dsl-flow/src/expression/expr_get.rs  Normal file
@@ -0,0 +1,22 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Expression "get" executor: evaluates a script and returns the result.
#[derive(Clone)]
pub struct ExprGetExecutor {
    #[allow(dead_code)]
    pub engine: crate::runtime::ExprEngineKind,
    pub script: String,
}

#[async_trait::async_trait]
impl TaskExecutor for ExprGetExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let engine = expr.ok_or_else(|| NodeError::Exec("Expr engine not provided".into()))?;
        let val = engine.eval(&self.script, ctx).await.map_err(|e| NodeError::Exec(e.to_string()))?;
        Ok(val)
    }
}
24  dsl-flow/src/expression/expr_set.rs  Normal file
@@ -0,0 +1,24 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Expression "set" executor: evaluates a script and writes the result to a context path.
#[derive(Clone)]
pub struct ExprSetExecutor {
    #[allow(dead_code)]
    pub engine: crate::runtime::ExprEngineKind,
    pub script: String,
    pub target_path: String,
}

#[async_trait::async_trait]
impl TaskExecutor for ExprSetExecutor {
    async fn execute(&self, ctx: &mut Context, expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let engine = expr.ok_or_else(|| NodeError::Exec("Expr engine not provided".into()))?;
        let val = engine.eval(&self.script, ctx).await.map_err(|e| NodeError::Exec(e.to_string()))?;
        ctx.set(&self.target_path, val.clone());
        Ok(val)
    }
}
5  dsl-flow/src/expression/mod.rs  Normal file
@@ -0,0 +1,5 @@
pub mod expr_set;
pub mod expr_get;

pub use expr_set::ExprSetExecutor;
pub use expr_get::ExprGetExecutor;
58  dsl-flow/src/lib.rs  Normal file
@@ -0,0 +1,58 @@
//! dsl-flow library entry point.
//!
//! Re-exports the core modules and commonly used structs and traits.

pub mod core;
pub mod runtime;
pub mod nodes;
pub mod control;
pub mod expression;
pub mod dsl;

pub use dsl::*;

// Re-export core types (the `crate::` prefix avoids ambiguity with the built-in `core` crate)
pub use crate::core::{Context, FlowNode, NodeId, NodeOutput, NodeError, ExprError, ExprEngine, LineageEntry};

// Re-export runtime types
pub use runtime::{Flow, FlowEngine, FlowOptions, FlowResult, ExprEngineKind, RhaiEngine, StateStore, InMemoryStateStore, FlowState};
#[cfg(feature = "js")]
pub use runtime::JsEngine;

// Re-export control types
pub use control::{SequenceExecutor, ForkJoinExecutor, GroupExecutor, ConditionalExecutor, MergeMode};

// Re-export expression types
pub use expression::{ExprSetExecutor, ExprGetExecutor};

// Re-export node types
pub use nodes::{
    DbExecutor, MqExecutor, LineageExecutor, CodeExecutor
};
#[cfg(feature = "http")]
pub use nodes::HttpExecutor;

pub use crate::core::node::Node;
pub use crate::core::executor::TaskExecutor;

// Backward compatibility (deprecated, will be removed in future versions)
pub mod engine {
    pub use crate::runtime::engine::*;
}
pub mod context {
    pub use crate::core::context::*;
}
pub mod expr {
    pub use crate::runtime::expr::*;
    pub use crate::core::error::ExprError;
    pub use crate::core::traits::ExprEngine;
}
pub mod node {
    pub use crate::nodes::*;
    pub use crate::core::types::{NodeId, NodeOutput, node_id};
    pub use crate::core::error::NodeError;
    pub use crate::core::traits::FlowNode;
}
pub mod state {
    pub use crate::runtime::state::*;
}
47  dsl-flow/src/nodes/code.rs  Normal file
@@ -0,0 +1,47 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use crate::runtime::ExprEngineKind;
use serde_json::Value;

/// Generic code executor.
///
/// Runs a script in one of the supported languages (Rhai, JS, ...).
/// Unlike expression evaluation, a code node is typically used for a complete
/// piece of business logic, possibly spanning multiple statements, control
/// flow, and side effects.
#[derive(Clone)]
pub struct CodeExecutor {
    pub engine: ExprEngineKind,
    pub script: String,
}

#[async_trait::async_trait]
impl TaskExecutor for CodeExecutor {
    async fn execute(&self, ctx: &mut Context, _expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        match self.engine {
            ExprEngineKind::Rhai => {
                #[cfg(feature = "rhai")]
                {
                    let engine = crate::runtime::RhaiEngine::new();
                    engine.eval(&self.script, ctx).await.map_err(|e| NodeError::Exec(e.to_string()))
                }
                #[cfg(not(feature = "rhai"))]
                {
                    Err(NodeError::Exec("Rhai feature is not enabled".to_string()))
                }
            }
            ExprEngineKind::Js => {
                #[cfg(feature = "js")]
                {
                    let engine = crate::runtime::JsEngine::new();
                    engine.eval(&self.script, ctx).await.map_err(|e| NodeError::Exec(e.to_string()))
                }
                #[cfg(not(feature = "js"))]
                {
                    Err(NodeError::Exec("JS feature is not enabled".to_string()))
                }
            }
        }
    }
}
25  dsl-flow/src/nodes/db.rs  Normal file
@@ -0,0 +1,25 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Database operation executor (simulated).
#[derive(Clone)]
pub struct DbExecutor {
    pub operation: String,
    pub params: Value,
}

#[async_trait::async_trait]
impl TaskExecutor for DbExecutor {
    async fn execute(&self, ctx: &mut Context, _expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let result = serde_json::json!({
            "op": self.operation,
            "params": self.params,
            "status": "ok"
        });
        ctx.set("db.last", result.clone());
        Ok(result)
    }
}
31  dsl-flow/src/nodes/http.rs  Normal file
@@ -0,0 +1,31 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// HTTP request executor.
#[cfg(feature = "http")]
#[derive(Clone)]
pub struct HttpExecutor {
    pub method: String,
    pub url: String,
    pub body: Option<Value>,
}

#[cfg(feature = "http")]
#[async_trait::async_trait]
impl TaskExecutor for HttpExecutor {
    async fn execute(&self, _ctx: &mut Context, _expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let client = reqwest::Client::new();
        let resp = match self.method.as_str() {
            "GET" => client.get(&self.url).send().await,
            "POST" => client.post(&self.url).json(&self.body).send().await,
            _ => return Err(NodeError::Exec("Unsupported HTTP method".into())),
        }
        .map_err(|e| NodeError::Exec(format!("{e}")))?;
        let status = resp.status().as_u16();
        let json = resp.json::<Value>().await.map_err(|e| NodeError::Exec(format!("{e}")))?;
        Ok(serde_json::json!({ "status": status, "body": json }))
    }
}
33  dsl-flow/src/nodes/lineage.rs  Normal file
@@ -0,0 +1,33 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Lineage-tracking executor.
#[derive(Clone)]
pub struct LineageExecutor {
    pub target_path: Option<String>,
}

#[async_trait::async_trait]
impl TaskExecutor for LineageExecutor {
    async fn execute(&self, ctx: &mut Context, _expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let items = ctx.lineage();
        let data = serde_json::to_value(items).map_err(|e| NodeError::Exec(format!("{e}")))?;
        if let Some(p) = &self.target_path {
            ctx.set(p, data.clone());
            // NOTE: ctx.record_write needs a NodeId, but TaskExecutor::execute does not
            // receive one under the current signature, so the write to `target_path`
            // is deliberately not recorded in the lineage itself. Recording it would
            // require passing the NodeId into executors or using a fixed placeholder:
            // ctx.record_write("lineage-node", p.clone());
        }
        Ok(data)
    }
}
11  dsl-flow/src/nodes/mod.rs  Normal file
@@ -0,0 +1,11 @@
pub mod http;
pub mod db;
pub mod mq;
pub mod lineage;
pub mod code;

#[cfg(feature = "http")]
pub use http::HttpExecutor;
pub use db::DbExecutor;
pub use mq::MqExecutor;
pub use lineage::LineageExecutor;
pub use code::CodeExecutor;
25  dsl-flow/src/nodes/mq.rs  Normal file
@@ -0,0 +1,25 @@
use crate::core::context::Context;
use crate::core::error::NodeError;
use crate::core::executor::TaskExecutor;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Message-queue executor (simulated).
#[derive(Clone)]
pub struct MqExecutor {
    pub topic: String,
    pub message: Value,
}

#[async_trait::async_trait]
impl TaskExecutor for MqExecutor {
    async fn execute(&self, ctx: &mut Context, _expr: Option<&dyn ExprEngine>) -> Result<Value, NodeError> {
        let result = serde_json::json!({
            "topic": self.topic,
            "message": self.message,
            "status": "sent"
        });
        ctx.set("mq.last", result.clone());
        Ok(result)
    }
}
117  dsl-flow/src/runtime/engine.rs  Normal file
@@ -0,0 +1,117 @@
use crate::core::context::Context;
use crate::core::traits::{ExprEngine, FlowNode};
use crate::core::types::{NodeOutput, NodeRef, node_id};
use crate::control::SequenceExecutor;
use crate::core::node::Node;
use crate::runtime::expr::ExprEngineKind;
use crate::runtime::state::{FlowState, StateStore};
use std::sync::Arc;

/// Flow definition.
pub struct Flow {
    /// Entry node.
    pub entry: NodeRef,
}

impl Flow {
    /// Creates a new flow.
    ///
    /// # Arguments
    /// * `entry` - entry node
    pub fn new<N: FlowNode + 'static>(entry: N) -> Self {
        Self { entry: Arc::new(entry) }
    }

    /// Creates a flow that runs the given nodes in sequence.
    pub fn sequence(nodes: Vec<NodeRef>) -> Self {
        Self {
            entry: Arc::new(Node::new(
                node_id("seq"),
                Box::new(SequenceExecutor { children: nodes })
            ))
        }
    }
}

/// Flow engine options.
#[derive(Clone)]
pub struct FlowOptions {
    /// Whether the engine runs statefully.
    pub stateful: bool,
    /// Expression engine kind.
    pub expr_engine: ExprEngineKind,
}

impl Default for FlowOptions {
    fn default() -> Self {
        Self { stateful: false, expr_engine: ExprEngineKind::Rhai }
    }
}

/// Flow execution engine.
pub struct FlowEngine<S: StateStore> {
    store: S,
    expr: Arc<dyn ExprEngine>,
}

impl<S: StateStore> FlowEngine<S> {
    /// Creates a new flow engine.
    ///
    /// # Arguments
    /// * `store` - state store
    /// * `options` - engine options
    pub fn new(store: S, options: FlowOptions) -> Self {
        let expr: Arc<dyn ExprEngine> = match options.expr_engine {
            ExprEngineKind::Rhai => {
                #[cfg(feature = "rhai")]
                {
                    Arc::new(crate::runtime::expr::RhaiEngine::new())
                }
                #[cfg(not(feature = "rhai"))]
                {
                    panic!("Rhai feature not enabled")
                }
            }
            ExprEngineKind::Js => {
                #[cfg(feature = "js")]
                {
                    Arc::new(crate::runtime::expr::JsEngine::new())
                }
                #[cfg(not(feature = "js"))]
                {
                    panic!("JS feature not enabled")
                }
            }
        };
        Self { store, expr }
    }

    /// Runs a flow statelessly.
    ///
    /// # Arguments
    /// * `flow` - flow definition
    /// * `ctx` - initial context
    pub async fn run_stateless(&self, flow: &Flow, mut ctx: Context) -> FlowResult {
        let out = flow.entry.execute(&mut ctx, Some(self.expr.as_ref())).await?;
        Ok(out)
    }

    /// Runs a flow statefully.
    ///
    /// # Arguments
    /// * `session_id` - session ID
    /// * `flow` - flow definition
    /// * `ctx` - initial context (replaced by the stored context if the session already exists)
    pub async fn run_stateful(&self, session_id: &str, flow: &Flow, mut ctx: Context) -> FlowResult {
        if let Some(stored) = self.store.load(session_id).await {
            ctx = stored.context;
        }
        let out = flow.entry.execute(&mut ctx, Some(self.expr.as_ref())).await?;
        let state = FlowState { session_id: session_id.to_string(), context: ctx };
        self.store.save(state).await;
        Ok(out)
    }
}

/// Flow execution result.
pub type FlowResult = Result<NodeOutput, Box<dyn std::error::Error + Send + Sync>>;
101  dsl-flow/src/runtime/expr.rs  Normal file
@@ -0,0 +1,101 @@
use crate::core::context::Context;
use crate::core::error::ExprError;
use crate::core::traits::ExprEngine;
use serde_json::Value;

/// Expression engine kind.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum ExprEngineKind {
    /// Rhai script engine.
    Rhai,
    /// JavaScript engine (based on Boa).
    Js,
}

/// Rhai engine implementation.
#[cfg(feature = "rhai")]
pub struct RhaiEngine {
    engine: rhai::Engine,
}

#[cfg(feature = "rhai")]
impl RhaiEngine {
    /// Creates a new Rhai engine instance.
    pub fn new() -> Self {
        let engine = rhai::Engine::new();
        Self { engine }
    }
}

#[cfg(feature = "rhai")]
#[async_trait::async_trait]
impl ExprEngine for RhaiEngine {
    async fn eval(&self, script: &str, ctx: &Context) -> Result<Value, ExprError> {
        let mut scope = rhai::Scope::new();
        let ctx_dynamic = rhai::serde::to_dynamic(ctx.as_value()).map_err(|e| ExprError::Rhai(e.to_string()))?;
        scope.push("ctx", ctx_dynamic);
        self.engine
            .eval_with_scope::<rhai::Dynamic>(&mut scope, script)
            .map_err(|e| ExprError::Rhai(format!("{e:?}")))
            .and_then(|dynv| {
                let v: Result<serde_json::Value, _> = rhai::serde::from_dynamic(&dynv);
                v.map_err(|e| ExprError::Rhai(format!("{e:?}")))
            })
    }
}

/// JS engine implementation.
#[cfg(feature = "js")]
pub struct JsEngine;

#[cfg(feature = "js")]
impl JsEngine {
    /// Creates a new JS engine instance.
    pub fn new() -> Self {
        Self
    }
}

#[cfg(feature = "js")]
#[async_trait::async_trait]
impl ExprEngine for JsEngine {
    async fn eval(&self, script: &str, ctx: &Context) -> Result<Value, ExprError> {
        use boa_engine::{Context as JsContext, Source};
        let mut js_ctx = JsContext::default();
        let ctx_json = ctx.as_value().to_string();
        let wrapped = format!("const ctx = JSON.parse(`{}`);\n({})", escape_backticks(&ctx_json), script);
        let value = js_ctx
            .eval(Source::from_bytes(wrapped.as_bytes()))
            .map_err(|e| ExprError::Js(format!("{e:?}")))?;
        to_json(value, &mut js_ctx).map_err(|e| ExprError::Js(e))
    }
}

#[cfg(feature = "js")]
fn escape_backticks(s: &str) -> String {
    s.replace('`', "\\`")
}

#[cfg(feature = "js")]
fn to_json(v: boa_engine::JsValue, ctx: &mut boa_engine::Context) -> Result<Value, String> {
    if v.is_null() || v.is_undefined() {
        Ok(Value::Null)
    } else if let Some(b) = v.as_boolean() {
        Ok(Value::Bool(b))
    } else if let Some(n) = v.as_number() {
        Ok(serde_json::Number::from_f64(n)
            .map(Value::Number)
            .unwrap_or(Value::Null))
    } else if let Some(s) = v.as_string() {
        Ok(Value::String(s.to_std_string_escaped().to_string()))
    } else {
        // Fall back to JSON.stringify for objects and arrays.
        let script = format!("JSON.stringify({})", v.display().to_string());
        let res = ctx
            .eval(boa_engine::Source::from_bytes(script.as_bytes()))
            .map_err(|e| format!("{e:?}"))?;
        if let Some(s) = res.as_string() {
            serde_json::from_str::<Value>(&s.to_std_string_escaped()).map_err(|e| format!("{e}"))
        } else {
            Err("Unsupported JS value".to_string())
        }
    }
}
9  dsl-flow/src/runtime/mod.rs  Normal file
@@ -0,0 +1,9 @@
pub mod engine;
pub mod state;
pub mod expr;

pub use engine::{Flow, FlowEngine, FlowOptions, FlowResult};
pub use state::{StateStore, InMemoryStateStore, FlowState};
pub use expr::{ExprEngineKind, RhaiEngine};
#[cfg(feature = "js")]
pub use expr::JsEngine;
46  dsl-flow/src/runtime/state.rs  Normal file
@@ -0,0 +1,46 @@
use crate::core::context::Context;
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::sync::{Arc, Mutex};

/// Flow state persisted between stateful runs.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FlowState {
    /// Session ID.
    pub session_id: String,
    /// Execution context.
    pub context: Context,
}

/// State store trait.
#[async_trait::async_trait]
pub trait StateStore: Send + Sync {
    /// Saves a state.
    ///
    /// # Arguments
    /// * `state` - flow state
    async fn save(&self, state: FlowState);

    /// Loads a state.
    ///
    /// # Arguments
    /// * `session_id` - session ID
    async fn load(&self, session_id: &str) -> Option<FlowState>;
}

/// In-memory state store implementation.
#[derive(Clone, Default)]
pub struct InMemoryStateStore {
    inner: Arc<Mutex<HashMap<String, FlowState>>>,
}

#[async_trait::async_trait]
impl StateStore for InMemoryStateStore {
    async fn save(&self, state: FlowState) {
        self.inner.lock().unwrap().insert(state.session_id.clone(), state);
    }

    async fn load(&self, session_id: &str) -> Option<FlowState> {
        self.inner.lock().unwrap().get(session_id).cloned()
    }
}
243  dsl-flow/tests/flow_tests.rs  Normal file
@@ -0,0 +1,243 @@
use dsl_flow::*;
use serde_json::json;

fn write_report(name: &str, dur: std::time::Duration, ok: bool) {
    use std::fs::{create_dir_all, OpenOptions};
    use std::io::Write;
    let _ = create_dir_all("target/test-reports");
    if let Ok(mut f) = OpenOptions::new().create(true).append(true).open("target/test-reports/default.jsonl") {
        let line = serde_json::json!({
            "name": name,
            "ok": ok,
            "duration_sec": dur.as_secs_f64()
        }).to_string();
        let _ = writeln!(f, "{}", line);
    }
}

#[derive(Clone)]
struct SleepNode {
    ms: u64,
    id: String,
}

#[async_trait::async_trait]
impl dsl_flow::FlowNode for SleepNode {
    fn id(&self) -> &str {
        &self.id
    }
    async fn execute(&self, ctx: &mut dsl_flow::Context, _expr: Option<&dyn dsl_flow::ExprEngine>) -> Result<dsl_flow::NodeOutput, dsl_flow::node::NodeError> {
        tokio::time::sleep(std::time::Duration::from_millis(self.ms)).await;
        ctx.set("sleep.last", json!({"slept_ms": self.ms}));
        Ok(dsl_flow::NodeOutput { id: self.id.clone(), data: json!(self.ms) })
    }
}

#[tokio::test]
async fn test_rhai_expr_set_and_get() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        expr_set(ExprEngineKind::Rhai, "1 + 2 + 3", "calc.sum"),
        expr_set(ExprEngineKind::Rhai, "ctx.calc.sum * 2", "calc.double"),
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let arr = out.data;
    // Sequence returns last child's output
    assert_eq!(arr, json!(12));
    write_report("test_rhai_expr_set_and_get", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_conditional_node_then_else() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    use std::sync::Arc;
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    use dsl_flow::core::types::node_id;

    let then = Arc::new(expr_set(ExprEngineKind::Rhai, "42", "branch.result")) as Arc<dyn FlowNode>;
    let els = Arc::new(expr_set(ExprEngineKind::Rhai, "0", "branch.result")) as Arc<dyn FlowNode>;
    let cond = dsl_flow::Node::new(
        node_id("cond"),
        Box::new(dsl_flow::ConditionalExecutor {
            engine: ExprEngineKind::Rhai,
            condition: "false".into(),
            then_node: then.clone(),
            else_node: Some(els.clone()),
        })
    );
    let flow = Flow::new(sequence! { cond });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    assert_eq!(out.data, json!(0));
    write_report("test_conditional_node_then_else", start.elapsed(), true);
    Ok(())
}

#[cfg(feature = "js")]
#[tokio::test]
async fn test_js_run_code() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Js });
    let flow = Flow::new(sequence! {
        dsl_flow::js("const a = 1; const b = 2; a + b"),
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let data = out.data;
    assert_eq!(data, json!(3));
    write_report("test_js_run_code", start.elapsed(), true);
    Ok(())
}

#[cfg(feature = "rhai")]
#[tokio::test]
async fn test_rhai_run_code() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        dsl_flow::rhai("let a = 1; let b = 2; a + b"),
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let data = out.data;
    assert_eq!(data, json!(3));
    write_report("test_rhai_run_code", start.elapsed(), true);
    Ok(())
}

#[cfg(feature = "js")]
#[tokio::test]
async fn test_js_expr_and_fork_join() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Js });
    let flow = Flow::new(sequence! {
        expr_set(ExprEngineKind::Js, "({ a: 1, b: 2 })", "obj"),
        fork_join! {
            expr_set(ExprEngineKind::Js, "ctx.obj.a + ctx.obj.b", "sum"),
            expr_set(ExprEngineKind::Js, "ctx.obj.a * ctx.obj.b", "mul")
        }
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let data = out.data;
    assert!(data.is_array());
    write_report("test_js_expr_and_fork_join", start.elapsed(), true);
    Ok(())
}

#[cfg(feature = "http")]
#[tokio::test]
async fn test_http_node_with_mock() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    use httpmock::MockServer;
    use httpmock::Method::GET;

    let server = MockServer::start_async().await;
    let _m = server.mock_async(|when, then| {
        when.method(GET).path("/data");
        then.status(200)
            .header("content-type", "application/json")
            .json_body(json!({ "ok": true, "msg": "hello" }));
    }).await;

    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        dsl_flow::http_get(&format!("{}/data", server.base_url()))
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let body = out.data.get("body").unwrap().clone();
    assert_eq!(body.get("ok").unwrap(), &json!(true));
    write_report("test_http_node_with_mock", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_stateful_engine() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store.clone(), FlowOptions { stateful: true, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        expr_set(ExprEngineKind::Rhai, "if ctx.counter == () { 0 } else { ctx.counter + 1 }", "counter")
    });
    let ctx = Context::new();
    let s = "session-1";
    let _out1 = engine.run_stateful(s, &flow, ctx.clone()).await.map_err(|e| anyhow::anyhow!(e))?;
    let out2 = engine.run_stateful(s, &flow, ctx.clone()).await.map_err(|e| anyhow::anyhow!(e))?;
    assert_eq!(out2.data, json!(1));
    write_report("test_stateful_engine", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_db_and_mq_nodes() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        dsl_flow::db_node("insert_user", json!({"name": "Alice"})),
        dsl_flow::mq_node("user.events", json!({"event": "created", "user": "Alice"})),
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    assert_eq!(out.data.get("status").unwrap(), &json!("sent"));
    write_report("test_db_and_mq_nodes", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_group_parallel_sleep() -> anyhow::Result<()> {

    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let n1 = SleepNode { ms: 200, id: "sleep-200".into() };
    let n2 = SleepNode { ms: 200, id: "sleep-200b".into() };
    let group = group_merge! { "agg.group", merge_mode_array(), n1, n2 };
    let flow = Flow::new(sequence! { group });
    let ctx = Context::new();
    let start = std::time::Instant::now();
    let _ = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let elapsed = start.elapsed();
    assert!(elapsed.as_millis() < 380, "elapsed={}ms", elapsed.as_millis());
    write_report("test_group_parallel_sleep", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_expr_set_without_engine_error() -> anyhow::Result<()> {
    let start = std::time::Instant::now();
    let mut ctx = Context::new();
    let node = expr_set(ExprEngineKind::Rhai, "1+1", "x");
    let res = dsl_flow::FlowNode::execute(&node, &mut ctx, None).await;
    assert!(res.is_err());
    write_report("test_expr_set_without_engine_error", start.elapsed(), true);
    Ok(())
}

#[tokio::test]
async fn test_fork_join_merge_and_lineage() -> anyhow::Result<()> {
    let store = InMemoryStateStore::default();
    let engine = FlowEngine::new(store, FlowOptions { stateful: false, expr_engine: ExprEngineKind::Rhai });
    let flow = Flow::new(sequence! {
        fork_join_merge! { "agg.fork", merge_mode_object_by_id(),
            expr_set(ExprEngineKind::Rhai, "10", "a.x"),
            expr_set(ExprEngineKind::Rhai, "20", "b.y")
        },
        expr_get(ExprEngineKind::Rhai, "ctx.agg.fork")
    });
    let ctx = Context::new();
    let out = engine.run_stateless(&flow, ctx).await.map_err(|e| anyhow::anyhow!(e))?;
    let obj = out.data;
    assert!(obj.is_object());
    Ok(())
}