The open protocol behind the Abel Platform
CAP defines the causal operations and graph-discovery primitives that builders use through the Abel Platform. It is the protocol layer, not the main onboarding destination.
CAP is to causal reasoning what SQL is to data querying.
Why a New Protocol?
MCP gives agents tools. CAP gives agents causal reasoning.
Existing protocols move tool calls around. CAP defines what causal computation means inside the platform.
| Feature | CAP | MCP | Function Calling |
|---|---|---|---|
| Causal semantics | ✓ Native — discover, intervene, counterfactual | ✗ Not defined | ✗ Not defined |
| Graph awareness | ✓ Schema API exposes causal topology | ✗ No graph concept | ✗ No graph concept |
| Layer 2/3 operations | ✓ do-calculus, SCM-based counterfactuals | ✗ Transport only | ✗ Signature only |
| Regime detection | ✓ Built-in primitive | ✗ | ✗ |
| Tool transport | Via MCP or REST | ✓ Native | ✓ Native |
| LLM routing | Schema-as-API — LLM self-routes | Tool descriptions | Function schemas |
| Open standard | ✓ MIT | ✓ Open | Vendor-specific |
Computation Primitives
8 causal operations — the building blocks
| Primitive | Layer | Semantics | Use case |
|---|---|---|---|
| `discover(data) → Graph` | Prerequisite | Learn causal structure G from observational data D | Agent has a dataset and needs to find causal relationships |
| `intervene(graph, X, Y, x) → Effect` | Layer 2 | Estimate P(Y \| do(X=x)) using do-calculus | "What happens if I change X?" (causal intervention) |
| `predict(graph, Y, horizon) → Value` | Layer 1+ | P(Y_{t+h} \| MB(Y), G), Markov-blanket prediction | "What will Y be in the future?" (causal forecasting) |
| `explain(graph, Y) → Parents` | Layer 1–2 | Extract the minimal causal explanation set for Y | "What is causally driving Y?" (driver analysis) |
| `counterfactual(graph, X', obs) → Y'` | Layer 3 | P(Y_{x'} \| X=x, Y=y), counterfactual inference | "What would have happened if…?" (alternative history) |
| `validate(graph, data) → Report` | Meta | Test whether G is consistent with new data | "Is this causal graph still accurate?" (graph validation) |
| `detect_regime(graph) → Changes` | Meta | Detect G_t ≠ G_{t-1}, structural change detection | "Has the causal structure itself changed?" (regime shift) |
| `check_reflexivity(graph, Y) → Report` | Meta | Test whether Abel's output has become a causal variable | "Are our predictions self-fulfilling?" (Soros reflexivity) |
Schema Discovery
4 primitives for graph navigation
Agents need to explore the causal world model before computing on it.
| Primitive | Purpose |
|---|---|
| `list_communities()` | List all semantic communities in the causal world model |
| `search_variables(query)` | Fuzzy search variables by name, type, or domain |
| `get_neighborhood(var, depth)` | Get causal parents, children, and cross-domain paths |
| `get_regime_status()` | Current regime state and recent structural changes |
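The exploration loop these primitives support can be sketched with an in-memory mock. The variable catalog, community names, and edge list below are invented examples; in practice each call would go to the platform over MCP or REST rather than local dictionaries.

```python
# Illustrative in-memory mock of three CAP schema-discovery primitives.
# Catalog contents and edges are invented; real calls hit the platform.

CATALOG = {
    "fed_funds_rate": {"community": "macro", "type": "rate"},
    "cpi_inflation":  {"community": "macro", "type": "index"},
    "retail_sales":   {"community": "consumer", "type": "flow"},
}

EDGES = [  # (cause, effect) pairs in the causal world model
    ("fed_funds_rate", "cpi_inflation"),
    ("cpi_inflation", "retail_sales"),
]

def list_communities():
    """All semantic communities present in the catalog."""
    return sorted({meta["community"] for meta in CATALOG.values()})

def search_variables(query):
    """Substring match against variable name, type, or community."""
    q = query.lower()
    return [name for name, meta in CATALOG.items()
            if q in name or q in meta["type"] or q in meta["community"]]

def get_neighborhood(var, depth=1):
    """Causal parents and children of var (depth=1 only in this mock)."""
    return {
        "parents":  [a for a, b in EDGES if b == var],
        "children": [b for a, b in EDGES if a == var],
    }

# Typical agent flow: find a variable, then inspect its local structure.
hits = search_variables("macro")            # ['fed_funds_rate', 'cpi_inflation']
print(get_neighborhood("cpi_inflation"))    # parents/children of the hit
```

The point of the search-then-neighborhood flow is that an agent never needs the full graph up front: it localizes a variable, then expands outward only as far as the question requires.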