AI Architecture Governance
Nobody knows what their AI systems are actually allowed to do.
ModelAIr is an architecture design workbench where the diagram is the interface and governance is the output. Design AI systems visually. Produce structured, queryable governance artefacts automatically.
The diagram is the mechanism. Governance is the product.
ModelAIr is not a diagramming tool. Lucidchart, Miro, draw.io — those produce pictures. ModelAIr produces a structured data model of your AI system that can be exported, queried, audited, and fed directly into compliance conversations.
The canvas is the interface. What it creates is not a picture. It is a typed, governed record of how your AI system is actually built — and what it is permitted to do.
Not this
A diagram you export to a PDF and attach to a compliance document that nobody updates.
This
A living, queryable governance artefact that is the design workspace — not downstream of it.
Core concepts
Components as structured objects
Every AI system component — model, data store, orchestrator, human checkpoint — is a typed object with domain-specific properties. Not a shape on a canvas.
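A component of that kind can be sketched as a small typed record. This is an illustrative sketch only — the names (`Component`, `ComponentKind`) are hypothetical, not ModelAIr's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class ComponentKind(Enum):
    MODEL = "model"
    DATA_STORE = "data_store"
    ORCHESTRATOR = "orchestrator"
    HUMAN_CHECKPOINT = "human_checkpoint"

@dataclass
class Component:
    name: str
    kind: ComponentKind
    # Domain-specific properties travel with the object,
    # so the record is queryable rather than decorative.
    properties: dict = field(default_factory=dict)

triage_model = Component(
    name="claims-triage-llm",
    kind=ComponentKind.MODEL,
    properties={"provider": "internal", "fine_tuned": True},
)
```

The point of the sketch: the object carries typed, machine-readable fields, where a canvas shape carries only coordinates and a label.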
Signal. Advisory. Authoritative.
Three levels define what an AI component is actually permitted to do. Not what it can technically do — what it is governed to do. A critical distinction.
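One way to make that distinction executable — a hedged sketch, with the enum and helper names assumed rather than taken from the product:

```python
from enum import Enum

class Authority(Enum):
    SIGNAL = "signal"                # output is raw input for something downstream
    ADVISORY = "advisory"            # output informs a decision a human still makes
    AUTHORITATIVE = "authoritative"  # output is acted on directly

def may_act_directly(level: Authority) -> bool:
    # Only an authoritative component is *governed* to act on its own output,
    # regardless of what it could technically do.
    return level is Authority.AUTHORITATIVE
```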
Autonomous versus human-in-the-loop
Explicit modelling of where AI has autonomous authority and where a human must approve before action is taken. Visible in the diagram. Exportable to the governance record.
Connections that carry meaning
Every connection carries typed properties: transport protocol, authentication method, data sensitivity classification, reliability characteristics. The edge is part of the governance model.
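An edge carrying those properties might look like this — field names are illustrative assumptions, not the product's schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Connection:
    source: str
    target: str
    transport: str    # e.g. "grpc", "https"
    auth: str         # e.g. "mTLS", "oauth2-client-credentials"
    sensitivity: str  # e.g. "public", "internal", "pii"
    reliability: str  # e.g. "at-least-once", "best-effort"

edge = Connection(
    source="claims-triage-llm", target="case-store",
    transport="grpc", auth="mTLS",
    sensitivity="pii", reliability="at-least-once",
)
```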
The diagram is the artefact
No separate export step. No documentation layer. The design workspace produces a structured, queryable governance record automatically. The diagram is the output.
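Because every element is structured, the record can be interrogated directly. A sketch of the kind of question it answers, over hypothetical edge data (the helper and fields are assumptions for illustration):

```python
# Which connections move sensitive data without mutual TLS?
connections = [
    {"source": "triage-llm", "target": "case-store", "sensitivity": "pii", "auth": "mTLS"},
    {"source": "triage-llm", "target": "analytics",  "sensitivity": "pii", "auth": "api-key"},
    {"source": "web-ui",     "target": "triage-llm", "sensitivity": "public", "auth": "oauth2"},
]

def unprotected_pii_edges(edges):
    # A compliance question asked of the model itself, not of a stale PDF.
    return [e for e in edges if e["sensitivity"] == "pii" and e["auth"] != "mTLS"]

flagged = unprotected_pii_edges(connections)
```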
Why now
The EU AI Act mandates documentation that most organisations don't have. Rapid AI deployment has created a documentation debt crisis.
The tooling gap is real. Nothing exists that models AI architecture with governance as a first-class output.
The full case →
No commitment. Early access only.