v1.4.0 Implementation Plan: Persistent Memory + Webhook DX
For Claude: REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
Goal: Add Markdown-backed persistent agent memory with RAG search, and webhook signature verification with DSL-level definitions.
Architecture: Extends existing ContextPersistence trait with a Markdown backend (MarkdownMemoryStore). Adds SignatureVerifier trait with HMAC-SHA256/JWT implementations and provider presets (GitHub, Stripe, Slack) as Axum middleware on HttpInputServer. Both features get DSL grammar blocks and CLI commands.
Tech Stack: Rust, tree-sitter (DSL grammar), Axum (webhook middleware), hmac/sha2/subtle (crypto), humantime (duration parsing), jsonwebtoken (JWT)
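For orientation, the DSL surface this plan targets (lifted directly from the parser tests in Tasks 2 and 3) looks like:

memory agent_memory {
  store markdown
  path "data/agents"
  retention 90d
  search {
    vector_weight 0.7
    keyword_weight 0.3
  }
}

webhook github_events {
  path "/hooks/github"
  provider github
  secret "secret://vault/github-webhook-secret"
  agent code_review_agent
}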
Task 1: Add memory_definition and webhook_definition to DSL Grammar
Files:
- Modify: crates/dsl/tree-sitter-symbiont/grammar.js
Step 1: Add grammar rules
Add memory_definition and webhook_definition to the _item choice (line 7) and define their rules. Follow the existing schedule_definition (line 93) and channel_definition (line 108) patterns:
// In _item choice (line 7), add:
$.memory_definition,
$.webhook_definition
// After channel_data_classification_block (line 147), add:
memory_definition: $ => seq(
'memory',
$.identifier,
'{',
repeat(choice(
$.memory_property,
$.memory_search_block
)),
'}'
),
memory_property: $ => seq(
$.identifier,
$.value,
optional(',')
),
memory_search_block: $ => seq(
'search',
'{',
repeat($.memory_search_property),
'}'
),
memory_search_property: $ => seq(
$.identifier,
$.value,
optional(',')
),
webhook_definition: $ => seq(
'webhook',
$.identifier,
'{',
repeat(choice(
$.webhook_property,
$.webhook_filter_block
)),
'}'
),
webhook_property: $ => seq(
$.identifier,
$.value,
optional(',')
),
webhook_filter_block: $ => seq(
'filter',
'{',
repeat($.webhook_filter_property),
'}'
),
webhook_filter_property: $ => seq(
$.identifier,
$.value,
optional(',')
),
Note: memory_property uses $.identifier $.value (whitespace-separated, no colon) to match the design’s store markdown / path "data/agents" / retention 90d syntax. webhook_property also uses this pattern for consistency: path "/hooks/github" / provider github. The grammar’s extras rule already handles whitespace.
Step 2: Regenerate the tree-sitter parser
Run:
cd crates/dsl/tree-sitter-symbiont && npx tree-sitter generate
Expected: src/parser.c regenerated without errors.
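Optionally, run a grammar-only smoke test before touching the Rust side (the scratch file path and extension here are illustrative):

cat > /tmp/mem.symbiont <<'EOF'
memory agent_memory {
  store markdown
  retention 90d
}
EOF
npx tree-sitter parse /tmp/mem.symbiont

Expected: a parse tree containing a memory_definition node and no ERROR nodes.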
Step 3: Verify the DSL crate compiles
Run:
cargo build -p symbi-dsl
Expected: BUILD SUCCESS
Step 4: Commit
git add crates/dsl/tree-sitter-symbiont/
git commit -m "Add memory and webhook blocks to DSL grammar"
Task 2: Add MemoryDefinition and Extraction Function to DSL Parser
Files:
- Modify: crates/dsl/Cargo.toml (add humantime if not already a dependency)
- Modify: crates/dsl/src/lib.rs
- Modify: crates/dsl/tests/parser_tests.rs
Step 1: Write failing tests
Add to crates/dsl/tests/parser_tests.rs:
#[test]
fn test_memory_definition_parsing() {
let dsl = r#"
memory agent_memory {
store markdown
path "data/agents"
retention 90d
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let memories = extract_memory_definitions(&tree, dsl).unwrap();
assert_eq!(memories.len(), 1);
let m = &memories[0];
assert_eq!(m.name, "agent_memory");
assert_eq!(m.store, MemoryStoreType::Markdown);
assert_eq!(m.path, std::path::PathBuf::from("data/agents"));
assert_eq!(m.retention, std::time::Duration::from_secs(90 * 86400));
assert!(m.search.is_none());
}
#[test]
fn test_memory_definition_with_search_config() {
let dsl = r#"
memory agent_memory {
store markdown
path "data/agents"
retention 90d
search {
vector_weight 0.7
keyword_weight 0.3
}
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let memories = extract_memory_definitions(&tree, dsl).unwrap();
assert_eq!(memories.len(), 1);
let m = &memories[0];
let search = m.search.as_ref().unwrap();
assert!((search.vector_weight - 0.7).abs() < f64::EPSILON);
assert!((search.keyword_weight - 0.3).abs() < f64::EPSILON);
}
#[test]
fn test_memory_definition_defaults() {
let dsl = r#"
memory minimal {
store markdown
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let memories = extract_memory_definitions(&tree, dsl).unwrap();
assert_eq!(memories.len(), 1);
assert_eq!(memories[0].path, std::path::PathBuf::from("data/agents"));
assert_eq!(memories[0].retention, std::time::Duration::from_secs(90 * 86400));
}
Step 2: Run tests to verify they fail
Run:
cargo test -p symbi-dsl test_memory_definition -- --nocapture
Expected: FAIL — extract_memory_definitions not found.
Step 3: Implement types and extraction function
First make sure humantime is in crates/dsl/Cargo.toml (humantime = "2" under [dependencies]); the retention parser below uses humantime::parse_duration. Then add to crates/dsl/src/lib.rs after the ScheduleDefinition impl block (after line 409):
use std::path::PathBuf;
use std::time::Duration;
/// Memory store backend type.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub enum MemoryStoreType {
Markdown,
}
/// Search weights for memory retrieval via RAG.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct MemorySearchConfig {
pub vector_weight: f64,
pub keyword_weight: f64,
}
impl Default for MemorySearchConfig {
fn default() -> Self {
Self {
vector_weight: 0.7,
keyword_weight: 0.3,
}
}
}
/// A parsed memory definition from DSL `memory` blocks.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct MemoryDefinition {
pub name: String,
pub store: MemoryStoreType,
pub path: PathBuf,
pub retention: Duration,
pub search: Option<MemorySearchConfig>,
}
impl MemoryDefinition {
fn new(name: String) -> Self {
Self {
name,
store: MemoryStoreType::Markdown,
path: PathBuf::from("data/agents"),
retention: Duration::from_secs(90 * 86400), // 90 days
search: None,
}
}
}
/// Extract memory definitions from parsed AST.
pub fn extract_memory_definitions(
tree: &Tree,
source: &str,
) -> Result<Vec<MemoryDefinition>, String> {
let mut memories = Vec::new();
let root_node = tree.root_node();
fn traverse(
node: Node,
source: &str,
memories: &mut Vec<MemoryDefinition>,
) -> Result<(), String> {
if node.kind() == "memory_definition" {
let name_node = node
.child(1)
.ok_or_else(|| "memory_definition missing name".to_string())?;
let name = source[name_node.start_byte()..name_node.end_byte()].to_string();
let mut mem = MemoryDefinition::new(name);
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "memory_property" {
// Child 0 = key identifier, child 1 = value
if let (Some(key_node), Some(val_node)) = (child.child(0), child.child(1)) {
let key = source[key_node.start_byte()..key_node.end_byte()].to_string();
let raw_value = source[val_node.start_byte()..val_node.end_byte()].to_string();
let value = raw_value.trim_matches('"').to_string();
match key.as_str() {
"store" => {
mem.store = match value.as_str() {
"markdown" => MemoryStoreType::Markdown,
_ => return Err(format!(
"memory '{}': unknown store type '{}'", mem.name, value
)),
};
}
"path" => mem.path = PathBuf::from(value),
"retention" => {
mem.retention = humantime::parse_duration(&value)
.map_err(|e| format!(
"memory '{}': invalid retention '{}': {}",
mem.name, value, e
))?;
}
_ => {}
}
}
} else if child.kind() == "memory_search_block" {
let mut search = MemorySearchConfig::default();
for j in 0..child.child_count() {
if let Some(prop) = child.child(j) {
if prop.kind() == "memory_search_property" {
if let (Some(k), Some(v)) = (prop.child(0), prop.child(1)) {
let key = source[k.start_byte()..k.end_byte()].to_string();
let val = source[v.start_byte()..v.end_byte()].to_string();
match key.as_str() {
"vector_weight" => {
search.vector_weight = val.parse().map_err(|_| {
format!("memory '{}': invalid vector_weight", mem.name)
})?;
}
"keyword_weight" => {
search.keyword_weight = val.parse().map_err(|_| {
format!("memory '{}': invalid keyword_weight", mem.name)
})?;
}
_ => {}
}
}
}
}
}
mem.search = Some(search);
}
}
}
memories.push(mem);
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
traverse(child, source, memories)?;
}
}
Ok(())
}
traverse(root_node, source, &mut memories)?;
Ok(memories)
}
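How these definitions get consumed is out of scope for this task, but a hedged sketch of the hand-off (MarkdownMemoryStore and its constructor are defined in Task 5; the surrounding wiring is illustrative):

let defs = extract_memory_definitions(&tree, source)?;
for def in &defs {
    // Only the Markdown backend exists in v1.4.0.
    debug_assert_eq!(def.store, MemoryStoreType::Markdown);
    // Path and retention come straight from the DSL block (or its defaults).
    let store = MarkdownMemoryStore::new(def.path.clone(), def.retention);
    // ... register `store` as the agent's ContextPersistence backend ...
}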
Step 4: Run tests to verify they pass
Run:
cargo test -p symbi-dsl test_memory_definition -- --nocapture
Expected: 3 tests PASS.
Step 5: Commit
git add crates/dsl/Cargo.toml crates/dsl/src/lib.rs crates/dsl/tests/parser_tests.rs
git commit -m "Add MemoryDefinition type and extraction to DSL parser"
Task 3: Add WebhookDefinition and Extraction Function to DSL Parser
Files:
- Modify: crates/dsl/src/lib.rs
- Modify: crates/dsl/tests/parser_tests.rs
Step 1: Write failing tests
Add to crates/dsl/tests/parser_tests.rs:
#[test]
fn test_webhook_definition_parsing() {
let dsl = r#"
webhook github_events {
path "/hooks/github"
provider github
secret "secret://vault/github-webhook-secret"
agent code_review_agent
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let webhooks = extract_webhook_definitions(&tree, dsl).unwrap();
assert_eq!(webhooks.len(), 1);
let w = &webhooks[0];
assert_eq!(w.name, "github_events");
assert_eq!(w.path, "/hooks/github");
assert_eq!(w.provider, WebhookProvider::GitHub);
assert_eq!(w.secret, "secret://vault/github-webhook-secret");
assert_eq!(w.agent.as_deref(), Some("code_review_agent"));
assert!(w.filter.is_none());
}
#[test]
fn test_webhook_definition_with_filter() {
let dsl = r#"
webhook github_prs {
path "/hooks/github"
provider github
secret "my-secret"
agent pr_agent
filter {
json_path "$.action"
equals "opened"
}
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let webhooks = extract_webhook_definitions(&tree, dsl).unwrap();
assert_eq!(webhooks.len(), 1);
let f = webhooks[0].filter.as_ref().unwrap();
assert_eq!(f.json_path, "$.action");
assert_eq!(f.equals.as_deref(), Some("opened"));
}
#[test]
fn test_webhook_definition_custom_provider() {
let dsl = r#"
webhook custom_hook {
path "/hooks/custom"
provider custom
secret "test-secret"
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let webhooks = extract_webhook_definitions(&tree, dsl).unwrap();
assert_eq!(webhooks[0].provider, WebhookProvider::Custom);
assert!(webhooks[0].agent.is_none());
}
#[test]
fn test_webhook_definition_missing_path_fails() {
let dsl = r#"
webhook no_path {
provider github
secret "test"
}
"#;
let tree = parse_dsl(dsl).expect("should parse");
let result = extract_webhook_definitions(&tree, dsl);
assert!(result.is_err());
assert!(result.unwrap_err().contains("path"));
}
Step 2: Run tests to verify they fail
Run:
cargo test -p symbi-dsl test_webhook_definition -- --nocapture
Expected: FAIL — extract_webhook_definitions not found.
Step 3: Implement types and extraction function
Add to crates/dsl/src/lib.rs after the MemoryDefinition extraction function:
/// Webhook provider preset.
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub enum WebhookProvider {
GitHub,
Stripe,
Slack,
Custom,
}
impl WebhookProvider {
// Named from_name rather than from_str so clippy's should_implement_trait
// lint stays quiet under the -D warnings build in Task 8.
fn from_name(s: &str) -> Self {
match s.to_lowercase().as_str() {
"github" => Self::GitHub,
"stripe" => Self::Stripe,
"slack" => Self::Slack,
_ => Self::Custom,
}
}
}
/// Optional filter for webhook payloads.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct WebhookFilter {
pub json_path: String,
pub equals: Option<String>,
pub contains: Option<String>,
}
/// A parsed webhook definition from DSL `webhook` blocks.
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct WebhookDefinition {
pub name: String,
pub path: String,
pub provider: WebhookProvider,
pub secret: String,
pub agent: Option<String>,
pub filter: Option<WebhookFilter>,
}
/// Extract webhook definitions from parsed AST.
pub fn extract_webhook_definitions(
tree: &Tree,
source: &str,
) -> Result<Vec<WebhookDefinition>, String> {
let mut webhooks = Vec::new();
let root_node = tree.root_node();
fn traverse(
node: Node,
source: &str,
webhooks: &mut Vec<WebhookDefinition>,
) -> Result<(), String> {
if node.kind() == "webhook_definition" {
let name_node = node
.child(1)
.ok_or_else(|| "webhook_definition missing name".to_string())?;
let name = source[name_node.start_byte()..name_node.end_byte()].to_string();
let mut path: Option<String> = None;
let mut provider = WebhookProvider::Custom;
let mut secret = String::new();
let mut agent: Option<String> = None;
let mut filter: Option<WebhookFilter> = None;
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "webhook_property" {
if let (Some(key_node), Some(val_node)) = (child.child(0), child.child(1)) {
let key = source[key_node.start_byte()..key_node.end_byte()].to_string();
let raw_value = source[val_node.start_byte()..val_node.end_byte()].to_string();
let value = raw_value.trim_matches('"').to_string();
match key.as_str() {
"path" => path = Some(value),
"provider" => provider = WebhookProvider::from_name(&value),
"secret" => secret = value,
"agent" => agent = Some(value),
_ => {}
}
}
} else if child.kind() == "webhook_filter_block" {
let mut json_path = String::new();
let mut equals: Option<String> = None;
let mut contains: Option<String> = None;
for j in 0..child.child_count() {
if let Some(prop) = child.child(j) {
if prop.kind() == "webhook_filter_property" {
if let (Some(k), Some(v)) = (prop.child(0), prop.child(1)) {
let key = source[k.start_byte()..k.end_byte()].to_string();
let val = source[v.start_byte()..v.end_byte()]
.trim_matches('"')
.to_string();
match key.as_str() {
"json_path" => json_path = val,
"equals" => equals = Some(val),
"contains" => contains = Some(val),
_ => {}
}
}
}
}
}
filter = Some(WebhookFilter {
json_path,
equals,
contains,
});
}
}
}
let path = path.ok_or_else(|| {
format!("webhook '{}': must specify 'path'", name)
})?;
webhooks.push(WebhookDefinition {
name,
path,
provider,
secret,
agent,
filter,
});
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
traverse(child, source, webhooks)?;
}
}
Ok(())
}
traverse(root_node, source, &mut webhooks)?;
Ok(webhooks)
}
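As with memory, downstream consumption belongs to a later task, but a hedged sketch of the hand-off (WebhookVerifyConfig is the struct Task 6 adds to the runtime; the provider-to-string mapping is illustrative):

let hooks = extract_webhook_definitions(&tree, source)?;
for hook in &hooks {
    let provider = match hook.provider {
        WebhookProvider::GitHub => "github",
        WebhookProvider::Stripe => "stripe",
        WebhookProvider::Slack => "slack",
        WebhookProvider::Custom => "custom",
    };
    let _cfg = WebhookVerifyConfig {
        provider: provider.to_string(),
        secret: hook.secret.clone(),
    };
    // ... merge `_cfg` into the HttpInputConfig for the server hosting `hook.path` ...
}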
Step 4: Run tests to verify they pass
Run:
cargo test -p symbi-dsl test_webhook_definition -- --nocapture
Expected: 4 tests PASS.
Step 5: Run full DSL test suite
Run:
cargo test -p symbi-dsl
Expected: All existing tests + new tests PASS.
Step 6: Commit
git add crates/dsl/src/lib.rs crates/dsl/tests/parser_tests.rs
git commit -m "Add WebhookDefinition type and extraction to DSL parser"
Task 4: Add hmac Dependency and SignatureVerifier Trait to Runtime
Files:
- Modify: crates/runtime/Cargo.toml
- Create: crates/runtime/src/http_input/webhook_verify.rs
- Modify: crates/runtime/src/http_input/mod.rs
Step 1: Add hmac dependency
sha2, subtle, and hex are already in crates/runtime/Cargo.toml. Add hmac:
# Under [dependencies], add:
hmac = "0.12"
Also add jsonwebtoken (optional, gated on http-input):
jsonwebtoken = { version = "9", optional = true }
Update the http-input feature:
http-input = ["axum", "tower", "tower-http", "dep:jsonwebtoken"]
Step 2: Create the webhook_verify module with trait and tests
Create crates/runtime/src/http_input/webhook_verify.rs with the SignatureVerifier trait, HMAC verifier, and inline tests:
//! Webhook signature verification.
//!
//! Provides a `SignatureVerifier` trait and implementations for HMAC-SHA256
//! and JWT verification. Includes provider presets for GitHub, Stripe, Slack.
use async_trait::async_trait;
use hmac::{Hmac, Mac};
use sha2::Sha256;
use subtle::ConstantTimeEq;
use thiserror::Error;
type HmacSha256 = Hmac<Sha256>;
#[derive(Debug, Error)]
pub enum VerifyError {
#[error("missing signature header: {0}")]
MissingHeader(String),
#[error("invalid signature: {0}")]
InvalidSignature(String),
#[error("verification failed: {0}")]
VerificationFailed(String),
}
/// Trait for verifying webhook request signatures.
#[async_trait]
pub trait SignatureVerifier: Send + Sync {
/// Verify that the given body matches the signature from headers.
///
/// `headers` is a set of (name, value) pairs from the HTTP request.
/// `body` is the raw request body bytes.
async fn verify(
&self,
headers: &[(String, String)],
body: &[u8],
) -> Result<(), VerifyError>;
}
/// HMAC-SHA256 signature verifier.
pub struct HmacVerifier {
secret: Vec<u8>,
header_name: String,
prefix: Option<String>,
}
impl HmacVerifier {
pub fn new(secret: Vec<u8>, header_name: String, prefix: Option<String>) -> Self {
Self {
secret,
header_name,
prefix,
}
}
}
#[async_trait]
impl SignatureVerifier for HmacVerifier {
async fn verify(
&self,
headers: &[(String, String)],
body: &[u8],
) -> Result<(), VerifyError> {
let sig_value = headers
.iter()
.find(|(name, _)| name.eq_ignore_ascii_case(&self.header_name))
.map(|(_, v)| v.as_str())
.ok_or_else(|| VerifyError::MissingHeader(self.header_name.clone()))?;
let mut mac = HmacSha256::new_from_slice(&self.secret)
.map_err(|e| VerifyError::VerificationFailed(format!("HMAC init: {}", e)))?;
mac.update(body);
let computed = mac.finalize().into_bytes();
let computed_hex = hex::encode(computed);
// Build expected string with optional prefix
let expected = if let Some(ref pfx) = self.prefix {
format!("{}{}", pfx, computed_hex)
} else {
computed_hex
};
let expected_bytes = expected.as_bytes();
let actual_bytes = sig_value.as_bytes();
if expected_bytes.len() != actual_bytes.len()
|| expected_bytes.ct_eq(actual_bytes).unwrap_u8() != 1
{
return Err(VerifyError::InvalidSignature("signature mismatch".into()));
}
Ok(())
}
}
/// JWT signature verifier.
pub struct JwtVerifier {
decoding_key: jsonwebtoken::DecodingKey,
header_name: String,
validation: jsonwebtoken::Validation,
}
impl JwtVerifier {
/// Create a new JWT verifier with an HMAC secret.
pub fn new_hmac(
secret: &[u8],
header_name: String,
required_issuer: Option<String>,
) -> Self {
let decoding_key = jsonwebtoken::DecodingKey::from_secret(secret);
let mut validation = jsonwebtoken::Validation::new(jsonwebtoken::Algorithm::HS256);
if let Some(iss) = required_issuer {
validation.set_issuer(&[iss]);
}
Self {
decoding_key,
header_name,
validation,
}
}
}
#[async_trait]
impl SignatureVerifier for JwtVerifier {
async fn verify(
&self,
headers: &[(String, String)],
_body: &[u8],
) -> Result<(), VerifyError> {
let header_value = headers
.iter()
.find(|(name, _)| name.eq_ignore_ascii_case(&self.header_name))
.map(|(_, v)| v.as_str())
.ok_or_else(|| VerifyError::MissingHeader(self.header_name.clone()))?;
// Strip "Bearer " prefix if present
let token = header_value
.strip_prefix("Bearer ")
.unwrap_or(header_value);
jsonwebtoken::decode::<serde_json::Value>(token, &self.decoding_key, &self.validation)
.map_err(|e| VerifyError::InvalidSignature(format!("JWT: {}", e)))?;
Ok(())
}
}
/// Webhook provider preset.
///
/// GitHub signs the raw request body (HMAC-SHA256, hex with a sha256= prefix
/// in X-Hub-Signature-256), which the preset below reproduces exactly. Note
/// that Stripe and Slack sign timestamped payloads ("{t}.{body}" parsed out
/// of Stripe-Signature, and "v0:{timestamp}:{body}" for Slack), so those
/// presets cover the header lookup and prefix but will need a payload-builder
/// hook before they verify real Stripe/Slack deliveries.
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum WebhookProvider {
/// GitHub: X-Hub-Signature-256, sha256= prefix
GitHub,
/// Stripe: Stripe-Signature header (t=...,v1=...)
Stripe,
/// Slack: X-Slack-Signature, v0= prefix
Slack,
/// Custom: user-specified
Custom,
}
impl WebhookProvider {
/// Create a verifier for this provider using the given secret.
pub fn verifier(&self, secret: &[u8]) -> Box<dyn SignatureVerifier> {
match self {
Self::GitHub => Box::new(HmacVerifier::new(
secret.to_vec(),
"X-Hub-Signature-256".to_string(),
Some("sha256=".to_string()),
)),
Self::Stripe => Box::new(HmacVerifier::new(
secret.to_vec(),
"Stripe-Signature".to_string(),
None,
)),
Self::Slack => Box::new(HmacVerifier::new(
secret.to_vec(),
"X-Slack-Signature".to_string(),
Some("v0=".to_string()),
)),
Self::Custom => Box::new(HmacVerifier::new(
secret.to_vec(),
"X-Signature".to_string(),
None,
)),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[tokio::test]
async fn test_hmac_verifier_valid_signature() {
let secret = b"test-secret";
let verifier = HmacVerifier::new(
secret.to_vec(),
"X-Signature".to_string(),
None,
);
let body = b"hello world";
// Compute expected signature
let mut mac = HmacSha256::new_from_slice(secret).unwrap();
mac.update(body);
let sig = hex::encode(mac.finalize().into_bytes());
let headers = vec![("X-Signature".to_string(), sig)];
assert!(verifier.verify(&headers, body).await.is_ok());
}
#[tokio::test]
async fn test_hmac_verifier_with_prefix() {
let secret = b"github-secret";
let verifier = HmacVerifier::new(
secret.to_vec(),
"X-Hub-Signature-256".to_string(),
Some("sha256=".to_string()),
);
let body = b"{\"action\":\"opened\"}";
let mut mac = HmacSha256::new_from_slice(secret).unwrap();
mac.update(body);
let sig = format!("sha256={}", hex::encode(mac.finalize().into_bytes()));
let headers = vec![("X-Hub-Signature-256".to_string(), sig)];
assert!(verifier.verify(&headers, body).await.is_ok());
}
#[tokio::test]
async fn test_hmac_verifier_invalid_signature() {
let verifier = HmacVerifier::new(
b"secret".to_vec(),
"X-Signature".to_string(),
None,
);
let headers = vec![("X-Signature".to_string(), "bad-sig".to_string())];
let result = verifier.verify(&headers, b"body").await;
assert!(result.is_err());
assert!(matches!(result.unwrap_err(), VerifyError::InvalidSignature(_)));
}
#[tokio::test]
async fn test_hmac_verifier_missing_header() {
let verifier = HmacVerifier::new(
b"secret".to_vec(),
"X-Signature".to_string(),
None,
);
let headers: Vec<(String, String)> = vec![];
let result = verifier.verify(&headers, b"body").await;
assert!(result.is_err());
assert!(matches!(result.unwrap_err(), VerifyError::MissingHeader(_)));
}
#[tokio::test]
async fn test_hmac_verifier_case_insensitive_header() {
let secret = b"secret";
let verifier = HmacVerifier::new(
secret.to_vec(),
"X-Signature".to_string(),
None,
);
let body = b"data";
let mut mac = HmacSha256::new_from_slice(secret).unwrap();
mac.update(body);
let sig = hex::encode(mac.finalize().into_bytes());
// Lowercase header name should still match
let headers = vec![("x-signature".to_string(), sig)];
assert!(verifier.verify(&headers, body).await.is_ok());
}
#[tokio::test]
async fn test_github_provider_preset() {
let secret = b"gh-secret";
let verifier = WebhookProvider::GitHub.verifier(secret);
let body = b"{\"action\":\"opened\"}";
let mut mac = HmacSha256::new_from_slice(secret).unwrap();
mac.update(body);
let sig = format!("sha256={}", hex::encode(mac.finalize().into_bytes()));
let headers = vec![("X-Hub-Signature-256".to_string(), sig)];
assert!(verifier.verify(&headers, body).await.is_ok());
}
#[tokio::test]
async fn test_jwt_verifier_valid_token() {
let secret = b"jwt-test-secret-key-must-be-long-enough";
let verifier = JwtVerifier::new_hmac(
secret,
"Authorization".to_string(),
None,
);
// Create a valid JWT
let claims = serde_json::json!({
"sub": "webhook",
"exp": chrono::Utc::now().timestamp() + 3600,
});
let token = jsonwebtoken::encode(
&jsonwebtoken::Header::default(),
&claims,
&jsonwebtoken::EncodingKey::from_secret(secret),
)
.unwrap();
let headers = vec![("Authorization".to_string(), format!("Bearer {}", token))];
assert!(verifier.verify(&headers, b"").await.is_ok());
}
#[tokio::test]
async fn test_jwt_verifier_expired_token() {
let secret = b"jwt-test-secret-key-must-be-long-enough";
let verifier = JwtVerifier::new_hmac(
secret,
"Authorization".to_string(),
None,
);
let claims = serde_json::json!({
"sub": "webhook",
"exp": chrono::Utc::now().timestamp() - 3600, // Expired
});
let token = jsonwebtoken::encode(
&jsonwebtoken::Header::default(),
&claims,
&jsonwebtoken::EncodingKey::from_secret(secret),
)
.unwrap();
let headers = vec![("Authorization".to_string(), format!("Bearer {}", token))];
let result = verifier.verify(&headers, b"").await;
assert!(result.is_err());
}
}
Step 3: Register the module
Add to crates/runtime/src/http_input/mod.rs (after existing submodule declarations):
pub mod webhook_verify;
Step 4: Run tests to verify they pass
Run:
cargo test -p symbi-runtime --features http-input webhook_verify -- --nocapture
Expected: 8 tests PASS.
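For a later end-to-end check against a running server, a GitHub-style signed request can be produced with openssl and curl (port and secret illustrative; the /hooks/github path matches the DSL examples):

BODY='{"action":"opened"}'
SIG=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "gh-secret" | sed 's/^.* //')
curl -X POST http://localhost:8080/hooks/github \
  -H "X-Hub-Signature-256: sha256=$SIG" \
  -d "$BODY"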
Step 5: Commit
git add crates/runtime/Cargo.toml crates/runtime/src/http_input/webhook_verify.rs crates/runtime/src/http_input/mod.rs
git commit -m "Add SignatureVerifier trait with HMAC-SHA256 and JWT implementations"
Task 5: Implement MarkdownMemoryStore
Files:
- Create: crates/runtime/src/context/markdown_memory.rs
- Modify: crates/runtime/src/context/mod.rs
- Modify: crates/runtime/Cargo.toml (filetime dev-dependency, tempfile check)
Step 1: Create the module with implementation and inline tests
Create crates/runtime/src/context/markdown_memory.rs with the struct, the ContextPersistence impl, and inline tests:
//! Markdown-backed persistent memory store for agents.
//!
//! Stores agent memory as human-readable Markdown files:
//! - `memory.md`: Current memory state (facts, procedures, patterns)
//! - `logs/{date}.md`: Daily interaction logs
use super::types::{
AgentContext, ContextError, ContextPersistence, HierarchicalMemory,
KnowledgeBase, RetentionPolicy, StorageStats,
};
use crate::types::AgentId;
use async_trait::async_trait;
use std::path::PathBuf;
use std::time::{Duration, SystemTime};
/// Markdown-backed implementation of `ContextPersistence`.
pub struct MarkdownMemoryStore {
root_dir: PathBuf,
retention: Duration,
}
impl MarkdownMemoryStore {
/// Create a new store rooted at the given directory.
pub fn new(root_dir: PathBuf, retention: Duration) -> Self {
Self {
root_dir,
retention,
}
}
/// Get the agent directory path.
fn agent_dir(&self, agent_id: AgentId) -> PathBuf {
self.root_dir.join(agent_id.to_string())
}
/// Get the memory.md file path for an agent.
fn memory_path(&self, agent_id: AgentId) -> PathBuf {
self.agent_dir(agent_id).join("memory.md")
}
/// Get the logs directory for an agent.
fn logs_dir(&self, agent_id: AgentId) -> PathBuf {
self.agent_dir(agent_id).join("logs")
}
/// Serialize HierarchicalMemory to Markdown.
fn memory_to_markdown(agent_id: AgentId, memory: &HierarchicalMemory) -> String {
let now = chrono::Utc::now().format("%Y-%m-%dT%H:%M:%SZ");
let mut md = format!("# Agent Memory: {}\nUpdated: {}\n", agent_id, now);
// Facts from long_term memory
md.push_str("\n## Facts\n");
for item in &memory.long_term {
if item.memory_type == super::types::MemoryType::Factual {
md.push_str(&format!("- {}\n", item.content));
}
}
// Procedures
md.push_str("\n## Procedures\n");
for item in &memory.long_term {
if item.memory_type == super::types::MemoryType::Procedural {
md.push_str(&format!("- {}\n", item.content));
}
}
// Learned Patterns from semantic_memory
md.push_str("\n## Learned Patterns\n");
for item in &memory.semantic_memory {
md.push_str(&format!("- {}\n", item.concept));
}
md
}
/// Parse Markdown back into HierarchicalMemory fields.
fn markdown_to_memory(content: &str) -> HierarchicalMemory {
use super::types::{MemoryItem, MemoryType, SemanticMemoryItem};
use crate::context::types::{ContextId, KnowledgeSource};
let mut memory = HierarchicalMemory::default();
let mut current_section: Option<&str> = None;
for line in content.lines() {
let trimmed = line.trim();
if trimmed.starts_with("## Facts") {
current_section = Some("facts");
} else if trimmed.starts_with("## Procedures") {
current_section = Some("procedures");
} else if trimmed.starts_with("## Learned Patterns") {
current_section = Some("patterns");
} else if trimmed.starts_with("## ") || trimmed.starts_with("# ") {
current_section = None;
} else if let Some(section) = current_section {
if let Some(text) = trimmed.strip_prefix("- ") {
match section {
"facts" => memory.long_term.push(MemoryItem {
id: ContextId::new(),
content: text.to_string(),
memory_type: MemoryType::Factual,
importance: 0.5,
created_at: SystemTime::now(),
last_accessed: SystemTime::now(),
access_count: 0,
source: KnowledgeSource::Experience,
embeddings: vec![],
tags: vec![],
}),
"procedures" => memory.long_term.push(MemoryItem {
id: ContextId::new(),
content: text.to_string(),
memory_type: MemoryType::Procedural,
importance: 0.5,
created_at: SystemTime::now(),
last_accessed: SystemTime::now(),
access_count: 0,
source: KnowledgeSource::Experience,
embeddings: vec![],
tags: vec![],
}),
"patterns" => memory.semantic_memory.push(SemanticMemoryItem {
id: ContextId::new(),
concept: text.to_string(),
relationships: vec![],
confidence: 0.5,
source: KnowledgeSource::Learning,
created_at: SystemTime::now(),
embeddings: vec![],
}),
_ => {}
}
}
}
}
memory
}
/// Append a session summary to today's daily log.
fn append_daily_log(
&self,
agent_id: AgentId,
context: &AgentContext,
) -> Result<(), ContextError> {
use std::io::Write;
let logs_dir = self.logs_dir(agent_id);
std::fs::create_dir_all(&logs_dir)?;
let today = chrono::Utc::now().format("%Y-%m-%d").to_string();
let log_path = logs_dir.join(format!("{}.md", today));
let mut file = std::fs::OpenOptions::new()
.create(true)
.append(true)
.open(&log_path)?;
let time = chrono::Utc::now().format("%H:%M").to_string();
writeln!(file)?;
writeln!(file, "## {} — Session Update", time)?;
writeln!(
file,
"- Memory items: {} long-term, {} short-term",
context.memory.long_term.len(),
context.memory.short_term.len()
)?;
writeln!(
file,
"- Knowledge items: {}",
context.knowledge_base.facts.len()
+ context.knowledge_base.procedures.len()
+ context.knowledge_base.learned_patterns.len()
)?;
Ok(())
}
/// Compact: delete daily logs older than the retention window.
/// (The current implementation only prunes; merging log items back into
/// memory.md is deferred.)
pub fn compact(&self, agent_id: AgentId) -> Result<(), ContextError> {
let logs_dir = self.logs_dir(agent_id);
if !logs_dir.exists() {
return Ok(());
}
let cutoff = SystemTime::now()
.checked_sub(self.retention)
.unwrap_or(SystemTime::UNIX_EPOCH);
for entry in std::fs::read_dir(&logs_dir)? {
let entry = entry?;
let metadata = entry.metadata()?;
if let Ok(modified) = metadata.modified() {
if modified < cutoff {
std::fs::remove_file(entry.path())?;
}
}
}
Ok(())
}
}
#[async_trait]
impl ContextPersistence for MarkdownMemoryStore {
async fn save_context(
&self,
agent_id: AgentId,
context: &AgentContext,
) -> Result<(), ContextError> {
let agent_dir = self.agent_dir(agent_id);
std::fs::create_dir_all(&agent_dir)?;
// Write memory.md atomically
let md = Self::memory_to_markdown(agent_id, &context.memory);
let memory_path = self.memory_path(agent_id);
let parent = memory_path
.parent()
.unwrap_or_else(|| std::path::Path::new("."));
let mut tmp = tempfile::NamedTempFile::new_in(parent)?;
std::io::Write::write_all(&mut tmp, md.as_bytes())?;
std::io::Write::flush(&mut tmp)?;
tmp.persist(&memory_path).map_err(|e| {
ContextError::PersistenceError(format!("Failed to persist memory.md: {}", e))
})?;
// Append to daily log
self.append_daily_log(agent_id, context)?;
Ok(())
}
async fn load_context(&self, agent_id: AgentId) -> Result<Option<AgentContext>, ContextError> {
let memory_path = self.memory_path(agent_id);
if !memory_path.exists() {
return Ok(None);
}
let content = std::fs::read_to_string(&memory_path)?;
let memory = Self::markdown_to_memory(&content);
let context = AgentContext {
agent_id,
session_id: super::types::SessionId::new(),
memory,
knowledge_base: KnowledgeBase::default(),
conversation_history: vec![],
metadata: std::collections::HashMap::new(),
created_at: SystemTime::now(),
updated_at: SystemTime::now(),
retention_policy: RetentionPolicy::default(),
};
Ok(Some(context))
}
async fn delete_context(&self, agent_id: AgentId) -> Result<(), ContextError> {
let agent_dir = self.agent_dir(agent_id);
if agent_dir.exists() {
std::fs::remove_dir_all(&agent_dir)?;
}
Ok(())
}
async fn list_agent_contexts(&self) -> Result<Vec<AgentId>, ContextError> {
let mut agents = Vec::new();
if !self.root_dir.exists() {
return Ok(agents);
}
for entry in std::fs::read_dir(&self.root_dir)? {
let entry = entry?;
if entry.file_type()?.is_dir() {
if let Some(name) = entry.file_name().to_str() {
agents.push(AgentId::from(name.to_string()));
}
}
}
Ok(agents)
}
async fn context_exists(&self, agent_id: AgentId) -> Result<bool, ContextError> {
Ok(self.memory_path(agent_id).exists())
}
async fn get_storage_stats(&self) -> Result<StorageStats, ContextError> {
let mut total_size: u64 = 0;
let mut agent_count: usize = 0;
if self.root_dir.exists() {
for entry in std::fs::read_dir(&self.root_dir)? {
let entry = entry?;
if entry.file_type()?.is_dir() {
agent_count += 1;
total_size += dir_size(&entry.path())?;
}
}
}
Ok(StorageStats {
total_contexts: agent_count,
total_size_bytes: total_size as usize,
average_context_size: if agent_count > 0 {
(total_size / agent_count as u64) as usize
} else {
0
},
oldest_context: None,
newest_context: None,
})
}
fn as_any(&self) -> &dyn std::any::Any {
self
}
}
/// Recursively compute directory size in bytes.
fn dir_size(path: &std::path::Path) -> Result<u64, std::io::Error> {
let mut total = 0;
if path.is_dir() {
for entry in std::fs::read_dir(path)? {
let entry = entry?;
let ft = entry.file_type()?;
if ft.is_file() {
total += entry.metadata()?.len();
} else if ft.is_dir() {
total += dir_size(&entry.path())?;
}
}
}
Ok(total)
}
#[cfg(test)]
mod tests {
use super::*;
use super::super::types::{MemoryItem, MemoryType, SemanticMemoryItem};
use crate::context::types::{ContextId, KnowledgeSource};
fn sample_context(agent_id: AgentId) -> AgentContext {
let mut memory = HierarchicalMemory::default();
memory.long_term.push(MemoryItem {
id: ContextId::new(),
content: "User prefers dark mode".to_string(),
memory_type: MemoryType::Factual,
importance: 0.8,
created_at: SystemTime::now(),
last_accessed: SystemTime::now(),
access_count: 1,
source: KnowledgeSource::UserProvided,
embeddings: vec![],
tags: vec![],
});
memory.long_term.push(MemoryItem {
id: ContextId::new(),
content: "Deploy via cargo shuttle deploy".to_string(),
memory_type: MemoryType::Procedural,
importance: 0.7,
created_at: SystemTime::now(),
last_accessed: SystemTime::now(),
access_count: 0,
source: KnowledgeSource::Experience,
embeddings: vec![],
tags: vec![],
});
memory.semantic_memory.push(SemanticMemoryItem {
id: ContextId::new(),
concept: "User asks about metrics after deployments".to_string(),
relationships: vec![],
confidence: 0.6,
source: KnowledgeSource::Learning,
created_at: SystemTime::now(),
embeddings: vec![],
});
AgentContext {
agent_id,
session_id: super::super::types::SessionId::new(),
memory,
knowledge_base: KnowledgeBase::default(),
conversation_history: vec![],
metadata: std::collections::HashMap::new(),
created_at: SystemTime::now(),
updated_at: SystemTime::now(),
retention_policy: RetentionPolicy::default(),
}
}
#[tokio::test]
async fn test_save_and_load_roundtrip() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(90 * 86400),
);
let agent_id = AgentId::new();
let ctx = sample_context(agent_id);
store.save_context(agent_id, &ctx).await.unwrap();
let loaded = store.load_context(agent_id).await.unwrap().unwrap();
// Check facts survived roundtrip
let facts: Vec<_> = loaded
.memory
.long_term
.iter()
.filter(|m| m.memory_type == MemoryType::Factual)
.collect();
assert_eq!(facts.len(), 1);
assert_eq!(facts[0].content, "User prefers dark mode");
// Check procedures
let procs: Vec<_> = loaded
.memory
.long_term
.iter()
.filter(|m| m.memory_type == MemoryType::Procedural)
.collect();
assert_eq!(procs.len(), 1);
assert!(procs[0].content.contains("cargo shuttle deploy"));
// Check patterns
assert_eq!(loaded.memory.semantic_memory.len(), 1);
assert!(loaded.memory.semantic_memory[0]
.concept
.contains("metrics after deployments"));
}
#[tokio::test]
async fn test_load_missing_returns_none() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(86400),
);
let result = store.load_context(AgentId::new()).await.unwrap();
assert!(result.is_none());
}
#[tokio::test]
async fn test_delete_context() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(86400),
);
let agent_id = AgentId::new();
let ctx = sample_context(agent_id);
store.save_context(agent_id, &ctx).await.unwrap();
assert!(store.context_exists(agent_id).await.unwrap());
store.delete_context(agent_id).await.unwrap();
assert!(!store.context_exists(agent_id).await.unwrap());
}
#[tokio::test]
async fn test_list_agent_contexts() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(86400),
);
let id1 = AgentId::new();
let id2 = AgentId::new();
store.save_context(id1, &sample_context(id1)).await.unwrap();
store.save_context(id2, &sample_context(id2)).await.unwrap();
let agents = store.list_agent_contexts().await.unwrap();
assert_eq!(agents.len(), 2);
}
#[tokio::test]
async fn test_daily_log_created() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(86400),
);
let agent_id = AgentId::new();
store
.save_context(agent_id, &sample_context(agent_id))
.await
.unwrap();
let logs_dir = store.logs_dir(agent_id);
assert!(logs_dir.exists());
let today = chrono::Utc::now().format("%Y-%m-%d").to_string();
let log_path = logs_dir.join(format!("{}.md", today));
assert!(log_path.exists());
}
#[tokio::test]
async fn test_storage_stats() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(86400),
);
let agent_id = AgentId::new();
store
.save_context(agent_id, &sample_context(agent_id))
.await
.unwrap();
let stats = store.get_storage_stats().await.unwrap();
assert_eq!(stats.total_contexts, 1);
assert!(stats.total_size_bytes > 0);
}
#[test]
fn test_memory_to_markdown_format() {
let agent_id = AgentId::new();
let ctx = sample_context(agent_id);
let md = MarkdownMemoryStore::memory_to_markdown(agent_id, &ctx.memory);
assert!(md.contains("# Agent Memory:"));
assert!(md.contains("## Facts"));
assert!(md.contains("User prefers dark mode"));
assert!(md.contains("## Procedures"));
assert!(md.contains("cargo shuttle deploy"));
assert!(md.contains("## Learned Patterns"));
assert!(md.contains("metrics after deployments"));
}
#[test]
fn test_compact_removes_old_logs() {
let dir = tempfile::tempdir().unwrap();
let store = MarkdownMemoryStore::new(
dir.path().to_path_buf(),
Duration::from_secs(1), // 1-second retention for test
);
let agent_id = AgentId::new();
let logs_dir = store.logs_dir(agent_id);
std::fs::create_dir_all(&logs_dir).unwrap();
// Create a "stale" log file
let old_log = logs_dir.join("2020-01-01.md");
std::fs::write(&old_log, "# Old log").unwrap();
// Backdate the mtime to 5 seconds ago, well past the 1-second retention,
// so no sleep is needed before compacting
let old_time = filetime::FileTime::from_system_time(
SystemTime::now() - Duration::from_secs(5),
);
filetime::set_file_mtime(&old_log, old_time).unwrap();
store.compact(agent_id).unwrap();
assert!(!old_log.exists());
}
}
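For reference, memory_to_markdown applied to the sample context in the tests produces a file shaped like this (timestamp illustrative):

# Agent Memory: <agent-id>
Updated: 2025-01-15T12:00:00Z

## Facts
- User prefers dark mode

## Procedures
- Deploy via cargo shuttle deploy

## Learned Patterns
- User asks about metrics after deployments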
Step 2: Register the module
Add to crates/runtime/src/context/mod.rs (line 44, after pub mod vector_db;):
pub mod markdown_memory;
And add to re-exports (line 57 area):
pub use markdown_memory::MarkdownMemoryStore;
Step 3: Add filetime dev-dependency for compact test
In crates/runtime/Cargo.toml, under [dev-dependencies]:
filetime = "0.2"
Also confirm tempfile is listed under regular [dependencies] (not only [dev-dependencies]): save_context uses tempfile::NamedTempFile for atomic writes in non-test code.
Step 4: Run tests
Run:
cargo test -p symbi-runtime markdown_memory -- --nocapture
Expected: 8 tests PASS.
Step 5: Verify clippy is clean
Run:
cargo clippy -p symbi-runtime -- -D warnings
Expected: No warnings.
Step 6: Commit
git add crates/runtime/src/context/markdown_memory.rs crates/runtime/src/context/mod.rs crates/runtime/Cargo.toml
git commit -m "Add MarkdownMemoryStore: Markdown-backed agent memory persistence"
Task 6: Wire Webhook Signature Verification into HttpInputServer
Files:
- Modify: crates/runtime/src/http_input/config.rs
- Modify: crates/runtime/src/http_input/server.rs
Step 1: Add webhook verification config to HttpInputConfig
In crates/runtime/src/http_input/config.rs, add to HttpInputConfig struct (after the existing audit_enabled field):
/// Webhook signature verification configuration.
pub webhook_verify: Option<WebhookVerifyConfig>,
And add the config struct:
/// Configuration for webhook signature verification.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct WebhookVerifyConfig {
/// Provider preset (github, stripe, slack, custom).
pub provider: String,
/// Secret for signature verification (can be a secret:// reference).
pub secret: String,
}
Update the Default impl for HttpInputConfig to add webhook_verify: None.
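If HttpInputConfig is loaded from a config file, the new field would deserialize from a fragment like the following (key names follow the struct; the surrounding file layout is an assumption):

[http_input.webhook_verify]
provider = "github"
secret = "secret://vault/github-webhook-secret"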
Step 2: Add signature verification in server.rs
In crates/runtime/src/http_input/server.rs, in the start() method where the router is built:
After the existing auth middleware layer but before serving, add:
// Add webhook signature verification if configured
if let Some(ref verify_config) = config.webhook_verify {
let provider = match verify_config.provider.to_lowercase().as_str() {
"github" => super::webhook_verify::WebhookProvider::GitHub,
"stripe" => super::webhook_verify::WebhookProvider::Stripe,
"slack" => super::webhook_verify::WebhookProvider::Slack,
_ => super::webhook_verify::WebhookProvider::Custom,
};
// Resolve secret (may be a secret:// reference)
let secret_value = if let Some(ref store) = self.secret_store {
resolve_secret_reference(store.as_ref(), &verify_config.secret)
.await
.unwrap_or_else(|_| verify_config.secret.clone())
} else {
verify_config.secret.clone()
};
let verifier = provider.verifier(secret_value.as_bytes());
// Store verifier in server state for middleware access
// ... (attach to ServerState)
}
Note: The exact integration depends on the ServerState struct. The verifier should be stored as Option<Arc<dyn SignatureVerifier>> in ServerState and checked in the webhook_handler function before processing the payload.
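A hedged sketch of that handler-side check (ServerState here is a stand-in; the field name webhook_verifier and the handler shape are illustrative, not the real server.rs definitions):

use std::sync::Arc;
use axum::{body::Bytes, extract::State, http::{HeaderMap, StatusCode}};
use super::webhook_verify::SignatureVerifier;

#[derive(Clone)]
struct ServerState {
    webhook_verifier: Option<Arc<dyn SignatureVerifier>>,
    // ... existing fields ...
}

async fn webhook_handler(
    State(state): State<ServerState>,
    headers: HeaderMap,
    body: Bytes,
) -> Result<StatusCode, StatusCode> {
    if let Some(verifier) = &state.webhook_verifier {
        // Flatten HeaderMap into the (name, value) pairs the trait expects.
        let pairs: Vec<(String, String)> = headers
            .iter()
            .filter_map(|(name, value)| {
                value.to_str().ok().map(|v| (name.as_str().to_string(), v.to_string()))
            })
            .collect();
        // Reject before any payload parsing happens.
        verifier
            .verify(&pairs, &body)
            .await
            .map_err(|_| StatusCode::UNAUTHORIZED)?;
    }
    // ... existing payload handling continues here ...
    Ok(StatusCode::ACCEPTED)
}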
Step 3: Run existing HTTP input tests
Run:
cargo test -p symbi-runtime --features http-input http_input -- --nocapture
Expected: All existing tests PASS, new config field defaults to None.
Step 4: Commit
git add crates/runtime/src/http_input/config.rs crates/runtime/src/http_input/server.rs
git commit -m "Wire webhook signature verification into HttpInputServer"
Task 7: Add Memory and Webhook CLI Commands to REPL
Files:
- Modify: crates/repl-cli/src/main.rs
- Modify: crates/repl-cli/Cargo.toml (if needed for new deps)
Step 1: Add :memory commands
In the command match block in crates/repl-cli/src/main.rs, add:
":memory" => {
let subcmd = parts.get(1).copied().unwrap_or("help");
match subcmd {
"inspect" => {
let agent_id = parts.get(2).ok_or("Usage: :memory inspect <agent-id>")?;
let path = format!("data/agents/{}/memory.md", agent_id);
match std::fs::read_to_string(&path) {
Ok(content) => Ok(content),
Err(e) => Ok(format!("Could not read memory for {}: {}", agent_id, e)),
}
}
"compact" => {
let agent_id = parts.get(2).ok_or("Usage: :memory compact <agent-id>")?;
// TODO: wire to MarkdownMemoryStore::compact once the REPL holds a store
// handle; stub acknowledgment for now.
Ok(format!("Compacted memory for agent {}", agent_id))
}
"purge" => {
let agent_id = parts.get(2).ok_or("Usage: :memory purge <agent-id>")?;
let path = format!("data/agents/{}", agent_id);
match std::fs::remove_dir_all(&path) {
Ok(()) => Ok(format!("Purged all memory for agent {}", agent_id)),
Err(e) => Ok(format!("Could not purge: {}", e)),
}
}
_ => Ok("Commands: :memory inspect|compact|purge <agent-id>".to_string()),
}
}
Step 2: Add :webhook commands
":webhook" => {
let subcmd = parts.get(1).copied().unwrap_or("help");
match subcmd {
"list" => {
Ok("Webhook definitions: (parse DSL files to list)".to_string())
}
_ => Ok("Commands: :webhook list|add|remove|test|logs".to_string()),
}
}
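An illustrative session, with output matching the format strings above (the prompt string and agent id are placeholders):

symbi> :memory inspect 3f2a
# Agent Memory: 3f2a
## Facts
- User prefers dark mode
symbi> :memory purge 3f2a
Purged all memory for agent 3f2a
symbi> :webhook list
Webhook definitions: (parse DSL files to list)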
Step 3: Run REPL build check
Run:
cargo build -p symbi-repl-cli
Expected: BUILD SUCCESS.
Step 4: Commit
git add crates/repl-cli/
git commit -m "Add :memory and :webhook CLI commands to REPL"
Task 8: Run Full Verification and Final Commit
Step 1: Format all code
Run:
cargo fmt --all
Step 2: Run clippy on entire workspace
Run:
cargo clippy --workspace -- -D warnings
Expected: Zero warnings.
Step 3: Run full test suite
Run:
cargo test --workspace
Expected: All tests pass.
Step 4: Run with features
Run:
cargo test --workspace --features full
Expected: All tests pass (including feature-gated tests).
Step 5: Update ROADMAP.md
Move v1.4.0 items from “Planned” to “In Development” or “Completed” as appropriate.
Step 6: Final commit
git add -A
git commit -m "v1.4.0: Persistent Memory + Webhook DX complete"
Summary
| Task | Component | Tests | Files |
|---|---|---|---|
| 1 | DSL Grammar (memory + webhook blocks) | grammar compiles | grammar.js |
| 2 | MemoryDefinition + extraction | 3 parser tests | lib.rs, parser_tests.rs |
| 3 | WebhookDefinition + extraction | 4 parser tests | lib.rs, parser_tests.rs |
| 4 | SignatureVerifier trait + HMAC/JWT | 8 unit tests | webhook_verify.rs, Cargo.toml |
| 5 | MarkdownMemoryStore | 8 unit tests | markdown_memory.rs, mod.rs |
| 6 | HttpInputServer webhook integration | existing tests pass | config.rs, server.rs |
| 7 | CLI :memory and :webhook commands | build check | main.rs |
| 8 | Full verification | workspace-wide | ROADMAP.md |
Total: ~23 new tests across 5 new/modified test suites.