# LangGraph + Ardur quickstart
A LangGraph agent making tool calls through Ardur’s governance proxy. The agent runs under an Ardur-issued mission credential, calls a small set of tools (read, write, summarize), and Ardur:
- Issues a Mission Declaration signed by the local issuer key
- Verifies the credential on every tool call against the mission’s allowed tools, resource scope, and budget
- Emits an Execution Receipt per call (compliant / violation / insufficient_evidence)
- Produces a session-end attestation that’s offline-verifiable with the issuer’s public key
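The receipt/attestation lifecycle above can be sketched in miniature. Everything here is illustrative: the field names are assumptions rather than the actual Ardur schema, and an HMAC over a shared key stands in for the real asymmetric issuer signature, so that the offline-verification idea fits in a self-contained snippet:

```python
import hashlib
import hmac
import json
from dataclasses import dataclass, asdict

# Stand-in for the issuer key. The real Ardur issuer signs with an
# asymmetric key pair; HMAC is used here only to keep the sketch
# dependency-free.
ISSUER_KEY = b"local-issuer-demo-key"

@dataclass
class ExecutionReceipt:
    tool: str
    verdict: str  # "compliant" | "violation" | "insufficient_evidence"

def attest(receipts: list[ExecutionReceipt]) -> dict:
    """Produce a session-end attestation covering every receipt."""
    payload = json.dumps([asdict(r) for r in receipts], sort_keys=True)
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify(attestation: dict) -> bool:
    """Offline check: recompute the signature and compare."""
    expected = hmac.new(ISSUER_KEY, attestation["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["signature"])

receipts = [ExecutionReceipt("read", "compliant"),
            ExecutionReceipt("write", "violation")]
att = attest(receipts)
print(verify(att))  # True
```

Because verification only recomputes over the serialized receipts, any tampering with a verdict after the fact invalidates the attestation.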
LangGraph’s integration point is different from LangChain’s: the verifier hooks into node transitions on the StateGraph, not the tool wrapper directly. That means cycles and conditional edges need receipts that carry the source/target node so the attestation stays linkable to the graph topology — receipts emit in node order, not call order.
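A minimal sketch of what "receipts carry the source/target node" means in practice. The `NodeReceipt` shape and `emit_in_node_order` helper are hypothetical names for illustration, not the Ardur API; the point is that each receipt records the graph edge it crossed, so a cycle revisiting a node yields one receipt per traversal, in node order:

```python
from dataclasses import dataclass

# Hypothetical receipt shape; field names are assumptions, not the
# actual Ardur schema. Each receipt records the StateGraph edge it
# crossed, keeping the attestation linkable to the graph topology.
@dataclass(frozen=True)
class NodeReceipt:
    source_node: str
    target_node: str
    tool: str
    verdict: str

def emit_in_node_order(trace: list[tuple[str, str, str]]) -> list[NodeReceipt]:
    """One receipt per node transition (not per raw tool call), so
    cycles and conditional edges remain reconstructible afterwards."""
    return [NodeReceipt(src, dst, tool, "compliant")
            for src, dst, tool in trace]

# A graph with a cycle: plan -> act -> plan -> act
trace = [("plan", "act", "read"),
         ("act", "plan", "summarize"),
         ("plan", "act", "write")]
for r in emit_in_node_order(trace):
    print(f"{r.source_node} -> {r.target_node}: {r.tool}")
```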
## File layout
```
langgraph-quickstart/
├── README.md   # this file
└── demo.py     # the LangGraph agent + scenarios entrypoint
```
`demo.py` imports framework-agnostic helpers from `examples/_shared/demo_scenes.py`. The LangGraph quickstart shares the LangChain Docker image (`rahulnutakki/ardur-demo:lang`): both frameworks live in the same Python dependency stack, so a single image covers them.
## Dependencies
- Python 3.13+
- Editable install of this repo's Python package: `pip install -e ../../python[dev]`
- `langgraph ^0.2.0` plus the `langchain-*` family (already pulled by the `[dev]` extras for the LangChain demo)
- LLM access: local Ollama, an OpenAI-compatible gateway, or an Anthropic API key
- Optional: Docker via the LangChain image (`rahulnutakki/ardur-demo:lang` runs this demo too; pass `demo.py` as the entrypoint)
## Running locally

```bash
cd ../../python && pip install -e '.[dev]'
export ARDUR_PROVIDER=ollama
export OLLAMA_MODEL='<your local model tag>'
cd ../examples/langgraph-quickstart
PYTHONPATH=../_shared python demo.py
```
`ARDUR_PROVIDER` plus the matching `*_MODEL` env var are required. No model identifiers are hard-coded; see CONTRIBUTING.md.
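The provider/model contract above can be expressed as a small resolution step. This is a hedged sketch, not the demo's actual code: `resolve_model` is a hypothetical helper, and the `OPENAI_MODEL`/`ANTHROPIC_MODEL` variable names for the non-Ollama providers are assumptions:

```python
import os

# Assumed mapping from provider to its model env var. Only
# OLLAMA_MODEL appears in this README; the other two names are
# illustrative guesses.
MODEL_VARS = {
    "ollama": "OLLAMA_MODEL",
    "openai": "OPENAI_MODEL",
    "anthropic": "ANTHROPIC_MODEL",
}

def resolve_model() -> tuple[str, str]:
    """Fail fast if ARDUR_PROVIDER or its matching *_MODEL is unset,
    so no model identifier ever needs to be hard-coded."""
    provider = os.environ.get("ARDUR_PROVIDER")
    if provider not in MODEL_VARS:
        raise RuntimeError(f"set ARDUR_PROVIDER to one of {sorted(MODEL_VARS)}")
    model = os.environ.get(MODEL_VARS[provider])
    if not model:
        raise RuntimeError(f"set {MODEL_VARS[provider]} for provider {provider!r}")
    return provider, model
```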
## Out of scope for this example
- Real-cluster SPIRE deployment — the example uses local file-based identity.
- Live LLM provider failover — single provider per run.
- Multi-tenant key isolation — single issuer key.
- Persistent checkpointing across runs (LangGraph supports it, but the example resets state each run for reproducible receipts).
For pure protocol exercising without the framework on top, see `examples/missions/`.