Every AI reads the world. Abel computes it.

Abel is building a live causal world model — computing what drives what across 200,000+ financial and macroeconomic variables, with causal structure refreshed daily and predictions updated hourly.

Already built

The world's largest live causal graph
for financial markets.

200,000+ variables across 30 time steps. PCMCI at industrial scale. Structure refreshed daily. Running now.
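At its core, PCMCI asks whether a lagged variable stays linked to a target once other lagged drivers are conditioned on. Below is a minimal numpy sketch of that one primitive, a partial-correlation conditional-independence test on synthetic data; it is an illustration of the idea, not Abel's implementation and not the full PCMCI algorithm:

```python
import numpy as np

def partial_corr(x, y, z):
    """Partial correlation of x and y given conditioning variable(s) z,
    computed from OLS residuals."""
    zmat = np.column_stack([np.ones(len(x)), z])
    rx = x - zmat @ np.linalg.lstsq(zmat, x, rcond=None)[0]
    ry = y - zmat @ np.linalg.lstsq(zmat, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(0)
T = 5000
z = rng.normal(size=T)
x = np.roll(z, 1) + 0.1 * rng.normal(size=T)   # Z drives X with lag 1
y = np.roll(z, 2) + 0.1 * rng.normal(size=T)   # Z drives Y with lag 2

# Naive lagged correlation: X at t-1 looks strongly linked to Y at t
naive = np.corrcoef(np.roll(x, 1)[3:], y[3:])[0, 1]

# Conditioning on the true lagged driver Z removes the spurious link
cond = partial_corr(y[3:], np.roll(x, 1)[3:], np.roll(z, 2)[3:])

print(f"naive lagged corr: {naive:.2f}")   # large
print(f"partial corr | Z:  {cond:.2f}")    # near zero
```

PCMCI chains many such tests: a condition-selection stage prunes candidate drivers, then a momentary conditional-independence stage confirms each directed, lagged edge.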

200,000+
Financial variables tracked
30
Time steps per inference
Millions
Causal edges discovered
Live
Structure refreshed daily

Scale comparison

200x beyond published SOTA.

The only known system running true causal inference at 200K+ variables with daily structural refresh.

| System | Method | Variables | Frequency |
| --- | --- | --- | --- |
| Abel (now) | True causal (PCMCI) | 200K+ | Daily |
| Bloomberg | Correlation only | ~5,000 | Daily |
| Kensho (S&P Global) | Statistical association | ~500 | Event-driven |
| Causality Link | NLP-extracted "causal" | ~2,000 | Daily |
| Two Sigma / Citadel | Likely Granger | Unknown | Unknown |
| Academic PCMCI | True causal | ~500 | One-shot |

Mission

Three gaps no existing
system closes.

Every major AI system today is built on text. The world does not run on text. It runs on numerical reality: prices moving, rates shifting, flows redirecting, structures forming and dissolving.

The Structure Gap · Decode Reality
The problem

Correlation-based systems see that X and Y move together. They cannot tell you whether X drives Y, Y drives X, or both are driven by Z.
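The common-driver case is easy to demonstrate: a hidden Z driving both X and Y produces a strong correlation, yet forcing X to a new value leaves Y untouched. An illustrative numpy simulation with made-up coefficients:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical common driver: Z causes both X and Y; X does NOT cause Y
z = rng.normal(size=n)
x = 0.9 * z + 0.3 * rng.normal(size=n)
y = 0.9 * z + 0.3 * rng.normal(size=n)

observed = np.corrcoef(x, y)[0, 1]          # strong: both inherit Z

# Intervention do(X := 2): overwrite X, leave Y's mechanism alone
x_do = np.full(n, 2.0)
y_do = 0.9 * z + 0.3 * rng.normal(size=n)   # Y's mechanism ignores X

print(f"corr(X, Y)     = {observed:.2f}")   # ~0.90
print(f"E[Y | do(X=2)] = {y_do.mean():.2f}")  # ~0.00, unmoved
```

A correlation matrix reports the first number and is silent about the second; that silence is the structure gap.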

How Abel solves it

Abel discovers directed causal structure from data using constraint-based and score-based algorithms — then encodes it as a live DAG with 200K+ variables. Every edge has a direction, a β coefficient, a time lag τ, and a p-value. Not correlation matrices. Directed graphs.
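An edge carrying a direction, a β, a τ, and a p-value can be represented as a small typed record. A hypothetical sketch of such a structure (field names and the Fed_Rate → DXY coefficient are assumptions, not Abel's schema), built on the page's Fed_Rate → DXY → BTCUSD example:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CausalEdge:
    source: str      # driver variable
    target: str      # driven variable
    beta: float      # effect size per unit of source
    tau_hours: int   # time lag from cause to effect
    p_value: float   # significance of the edge

# Illustrative edges; per-edge values are not reported on this page
graph = [
    CausalEdge("Fed_Rate", "DXY", beta=0.12, tau_hours=5, p_value=0.001),
    CausalEdge("DXY", "BTCUSD", beta=-0.042, tau_hours=2, p_value=0.003),
]

def parents(graph, variable):
    """Direct causal drivers of a variable: its incoming directed edges."""
    return [e for e in graph if e.target == variable]

for e in parents(graph, "BTCUSD"):
    print(f"{e.source} → BTCUSD: β={e.beta}, τ={e.tau_hours}h, p<{e.p_value}")
```

The key property is asymmetry: `parents(graph, "BTCUSD")` and `parents(graph, "DXY")` return different edges, which no symmetric correlation matrix can express.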

Principle: Structure, Not Surface

The mathematical proof

Pearl's Causal Hierarchy

LLMs are stuck at Layer 1 — association. Abel operates at Layer 2 (intervention) and Layer 3 (counterfactual). Crossing that boundary with associations alone is a mathematical impossibility, not an engineering gap.

Layer 1 · Association · Seeing

What is the probability of Y given that we observe X?

Google
Retrieval

WHAT happened?

"BTC dropped 5% today"

Retrieves facts from indexed pages. No mechanism, no directionality. Pure observation.

ChatGPT
Pattern Matching

HOW does it work?

"BTC often drops when Fed raises rates due to historical patterns"

Synthesizes patterns from training data. Sounds causal — is not. Cannot distinguish correlation from causation.

Abel
Causal Inference

WHAT is connected?

"DXY → BTC: β=−0.042, τ=5h, p<0.003 — a directed edge in the live causal graph"

Discovers directed associations with edge weights, time lags, and statistical significance.
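The layers can be made concrete on a toy structural causal model. Layer 2 answers do()-questions by overriding a mechanism; Layer 3 answers counterfactuals by abduction (inferring the noise behind the observed world), action, and prediction. A sketch with made-up coefficients, not Abel's fitted graph:

```python
# Toy linear SCM: rate -> dxy -> btc (coefficients are illustrative)
B_RATE_DXY = 0.8
B_DXY_BTC = -0.5

def simulate(rate, u_dxy=0.0, u_btc=0.0):
    dxy = B_RATE_DXY * rate + u_dxy
    btc = B_DXY_BTC * dxy + u_btc
    return dxy, btc

# Layer 2 (intervention): effect on BTC of do(rate := rate + 0.5)
_, btc0 = simulate(rate=1.0)
_, btc1 = simulate(rate=1.5)
ate = btc1 - btc0
print(f"do(rate += 0.5) shifts BTC by {ate:.2f}")  # -0.20

# Layer 3 (counterfactual): abduction -> action -> prediction
# Observed world: rate=1.0, dxy=1.0, btc=-0.4
rate_obs, dxy_obs, btc_obs = 1.0, 1.0, -0.4
u_dxy = dxy_obs - B_RATE_DXY * rate_obs   # abduce DXY's noise
u_btc = btc_obs - B_DXY_BTC * dxy_obs     # abduce BTC's noise
_, btc_cf = simulate(rate=1.5, u_dxy=u_dxy, u_btc=u_btc)
print(f"counterfactual BTC had rate been 1.5: {btc_cf:.2f}")  # -0.60
```

A model trained only on observations of (rate, dxy, btc) tuples has no access to the `simulate` mechanism, which is exactly what the intervention and counterfactual steps manipulate.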

One engine, two surfaces

Same question. Same engine.
Two interfaces.

200,000+ variables. 6M causal spatiotemporal nodes. Structure refreshed daily, predictions updated hourly. Whether you're a human or an AI agent — same engine.

Abel App · For humans

Ask any decision question

“If the Fed raises rates 50bp, should I hold my crypto?”

HOLD
p = 0.003

Causal chain: Fed_Rate →(τ=5h) DXY →(τ=2h) BTCUSD
Effect: −4.2%
95% CI: [−6.8, −2.1]%
β coefficient: −0.042

Natural language in, structured causal analysis out. No code required.
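In a linear model, a chain's end-to-end effect composes as the product of its edge coefficients, and its total delay as the sum of its lags. A sketch with hypothetical per-edge βs chosen to multiply out to the card's end-to-end −0.042 (the page does not report the per-edge split):

```python
# (source, target, beta, tau_hours) — per-edge values are assumptions
chain = [
    ("Fed_Rate", "DXY", 0.60, 5),
    ("DXY", "BTCUSD", -0.07, 2),
]

path_beta = 1.0
total_lag = 0
for _, _, beta, tau in chain:
    path_beta *= beta   # linear effects compose multiplicatively
    total_lag += tau    # lags along a path add

print(f"path β = {path_beta:.3f}, total lag = {total_lag}h")
```

This is why each edge needs its own β and τ: the end-to-end answer and its timing fall out of the graph rather than being fitted as one opaque number.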

Try Abel App
Platform · For agents
agent.py

```python
import abel

client = abel.Client(api_key="sk-your-key")

# Forecast BTCUSD close 48 hours ahead
prediction = client.predict("BTCUSD_close", horizon=48)

# Direct and 2-hop causal drivers, across domains
drivers = client.explain("BTCUSD_close", depth=2,
                         cross_domain=True)

# Estimated effect on BTC of a 50bp Fed Funds move
effect = client.intervene("Fed_Funds_Rate", "BTCUSD_close",
                          treatment_value=0.5)

print(prediction)
print(drivers)
print(effect)
```

`pip install abel-cap` — three lines to your first causal query. Typed responses, async support, and built-in caching for high-throughput agent pipelines.

MCP gives agents tools. CAP gives agents causal reasoning.

View API Docs

Same question. Different universe.

What it looks like when
answers are computable.

Every number traceable to a graph edge. Every claim falsifiable with a timestamp.

Abel

Will AI replace designers?

AI adoption → design job postings — causal, not opinion

Execution roles compress. Strategic roles compound. Move up, not sideways.

Signal: Structural Shift
P(up): 93.4%
Speed: 1.2s
Data freshness: 2h ago

Analysis

The Platform → Toolmaker causal path (τ=84h, 2 hops) shows design-tool automation accelerating, but Toolmaker P(up) = 93.4% — toolmakers thrive while execution-only roles compress.

What to do

Upskill into systems thinking and brand strategy. Execution-only roles have a direct automation parent in the graph — strategic roles do not.

Evidence

AI_Adoption →[β=0.67, τ=84h, p<0.001] Design_Tool_Automation →[β=0.41, τ=60h, p<0.004] Junior_Designer_Demand

LLM would say

AI will augment designers rather than replace them. While AI tools can automate repetitive tasks, human creativity and empathy remain essential for great design.

For developers

Use the platform to give any LLM
a causal cortex.

Docs are for implementation. The platform is for understanding the model, integrations, and deployment path before you ship.

MCP gives agents tools. CAP gives agents causal reasoning. Orthogonal by design — Schema-as-API provides deterministic, zero-LLM-cost routing into Abel's 200K+ variable graph.
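Schema-as-API routing can be read as: a request carries a typed operation field, and a static dispatch table maps it straight to a graph endpoint, so no LLM call is spent deciding where a query goes. A hypothetical Python sketch of that pattern (names and shapes are assumptions, not Abel's actual schema):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CausalRequest:
    op: str        # "predict" | "explain" | "intervene"
    target: str
    params: dict

# Stub handlers standing in for real graph endpoints
def predict(req):
    return f"forecast({req.target}, h={req.params['horizon']})"

def explain(req):
    return f"drivers({req.target}, depth={req.params['depth']})"

def intervene(req):
    return f"do({req.params['treatment']}) -> {req.target}"

# Deterministic dispatch: schema field -> handler, no LLM in the loop
ROUTES: dict[str, Callable] = {
    "predict": predict,
    "explain": explain,
    "intervene": intervene,
}

def route(req: CausalRequest) -> str:
    return ROUTES[req.op](req)

print(route(CausalRequest("predict", "BTCUSD_close", {"horizon": 48})))
```

Because routing is a dictionary lookup on a declared field, it is exact, auditable, and costs microseconds — the LLM only enters when a free-text question must first be turned into a `CausalRequest`.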

agent.py

```python
import abel

client = abel.Client(api_key="sk-...")

# Predict using the causal Markov blanket — not correlation
prediction = client.predict("BTCUSD_close", horizon=48)
print(prediction.mean, prediction.ci_95)

# Explain: what drives BTC this week?
drivers = client.explain("BTCUSD_close", depth=2, cross_domain=True)
for d in drivers:
    print(f"{d.variable} → weight: {d.edge_weight}")

# Intervene: what if oil hits $120?
effect = client.intervene("WTI_Crude", "CPI", treatment_value=120)
```

Fully typed SDK with async support, response caching, and streaming for large graph queries. Designed for agent pipelines that need high-throughput causal inference.

Start making decisions with
live causal intelligence.

Interested in shaping the future of
causal intelligence? We're hiring.

Open Roles

Join Abelian Groups to stay on top of new
releases, features, and updates.