The Intent Stack
A reference model for governing AI agent behavior within organizations
Four layers of governance context, from intent discovery to runtime alignment, applied at every interface where authority is delegated or coordination is required.
BPMN standardizes process. DMN standardizes decisions. CMMN standardizes case management. No standard exists for governing the alignment between agent behavior and organizational intent at runtime. The gap is not in policy guidance — frameworks like NIST AI RMF address that. The gap is not in model-level alignment — Constitutional AI addresses that at training time. The gap is in runtime organizational governance: the infrastructure that discovers what an organization actually intends, formalizes that intent in a form agents can operate against, monitors alignment in real time, and adjusts governance as the relationship matures.
The Intent Stack covers governance context — from intent discovery through runtime alignment. The companion BPM/Agent Stack specification covers execution governance — orchestration, integration, and execution of authorized work. Together, the two specifications address seven governance concerns across the full governance lifecycle.
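The four governance functions named above — discovering intent, formalizing it, monitoring alignment at runtime, and adjusting governance over time — can be sketched as a minimal pipeline. This is an illustrative sketch only: the names (`Intent`, `discover`, `formalize`, `monitor`, `adjust`) and the placeholder logic are assumptions for exposition, not part of the specification.

```python
from dataclasses import dataclass

# Illustrative sketch of the four runtime governance functions described in
# the text. All identifiers and logic are hypothetical, not normative.

@dataclass
class Intent:
    statement: str            # what the organization intends, as discovered
    constraints: list[str]    # formalized rules agents can operate against

def discover(raw_signals: list[str]) -> str:
    """Distill organizational signals into a stated intent (placeholder logic)."""
    return "; ".join(raw_signals)

def formalize(statement: str) -> Intent:
    """Turn a discovered statement into machine-checkable constraints."""
    return Intent(statement, [s.strip() for s in statement.split(";")])

def monitor(intent: Intent, action: str) -> bool:
    """Check a proposed agent action against formalized intent at runtime."""
    return any(rule in action for rule in intent.constraints)

def adjust(intent: Intent, new_rule: str) -> Intent:
    """Evolve governance as the organization/agent relationship matures."""
    return Intent(intent.statement, intent.constraints + [new_rule])

# Example flow: discovery feeds formalization; monitoring gates an action.
intent = formalize(discover(["log all decisions", "escalate refunds"]))
aligned = monitor(intent, "log all decisions to the audit trail")
```

The point of the sketch is the ordering of the four functions, not their internals: discovery produces raw intent, formalization makes it operable, monitoring consumes it per action, and adjustment closes the loop.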
Who this is for
- Standards bodies — OMG, ISO, IEEE members evaluating governance architecture for AI agent deployment
- BPM & governance practitioners — professionals extending process, decision, and case management governance to AI agent contexts
- AI governance researchers — studying runtime governance infrastructure for organizational AI deployment
- Enterprise architects — organizations deploying AI agents at scale who need standardized governance architecture
Public Draft Specification, Version 1.2 — April 1, 2026. Subject to revision through operational evidence. Licensed under CC BY 4.0.