20 Architecture Diagrams in 20 Minutes: How AI Documents Enterprise Systems

Generate ERDs, network topologies, security models, CI/CD pipelines, and integration maps from code. The batch-generation approach that replaces weeks of Visio work.

Alex Pechenizkiy 10 min read

You join a new project. 350 employees, 15 systems, zero architecture documentation. The CTO - David Park, in this case - wants diagrams by Friday. He needs to present the modernization plan to the VA stakeholders and “a few PowerPoints with boxes” won’t cut it.

The traditional approach: 3 weeks in Visio, manually drawing each service, routing each arrow, color-coding each zone. You’ll get through maybe 5 diagrams before the deadline. They’ll be outdated before the ink dries.

My approach: 20 diagrams in 20 minutes. Generated from TypeScript. Version-controlled in git. Reproducible, consistent, and updated by changing a single line of code.

Enterprise integration map showing API Management as the central hub connecting Power Platform, backend services, legacy Oracle, and Dataverse

What Does This Enterprise Actually Look Like?

Cascade Dynamics is a fictional company I created to demonstrate the approach. But the architecture is real - it mirrors what I see in federal healthcare IT modernization projects every day.

The setup: a 350-person firm modernizing clinical systems for Veterans Affairs. They inherited an Oracle 19c database that’s been running for 15 years, and they’re building a hybrid platform on Azure + Power Platform while keeping the legacy system alive during the transition.

Their stack spans:

  • Legacy: Oracle 19c on-prem, SFTP batch feeds, Windows Server VMs
  • Azure: AKS, App Services, Cosmos DB, Azure SQL, API Management, Service Bus, Azure OpenAI
  • Power Platform: Dataverse for case management, Power Automate for approvals, Power BI for dashboards
  • Security: Entra ID with Conditional Access, Key Vault, Sentinel

This is the kind of environment where architecture documentation isn’t optional - it’s a compliance requirement. And it’s the kind of environment where nobody has time to draw diagrams by hand.

How Do You Generate 20 Diagrams from Code?

The approach is straightforward. A TypeScript script contains helper functions for common diagram elements (icons, rectangles, containers, edges) and a specification array where each diagram is a function that returns Draw.io XML.

// Helper: Azure icon with label
function iconBlock(x: number, y: number, iconPath: string, label: string): string {
  const id = addIcon(x, y, iconPath);
  addLabel(x - 20, y + 60, 90, 20, label);
  return id;
}

// One diagram = one function
{ name: 'cascade-cicd-pipeline', fn: () => {
  const ado = iconBlock(30, 70, 'devops/Azure_DevOps.svg', 'Azure DevOps');
  const build = step(140, 80, 140, 55, 'Build', 'Compile + unit tests', blue);
  const test = step(320, 80, 140, 55, 'Test', 'Integration + security', amber);
  addEdge(ado, build, 'push');
  addEdge(build, test);
  // ...
}}

Run the script. It generates 20 .drawio files, exports each to SVG (with Azure icons embedded, transparent backgrounds), and validates against the quality gate. The whole thing takes about 30 seconds.
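The driver loop behind that script can be sketched in a few lines. This is a minimal illustration, not the article's actual implementation: each spec's `fn()` is assumed to return the `<mxCell>` fragment for one diagram, and a wrapper adds the Draw.io file envelope.

```typescript
// Minimal sketch of the batch driver. Each spec's fn() returns the
// <mxCell> XML fragment for one diagram; names here are illustrative.
interface DiagramSpec {
  name: string;
  fn: () => string; // Draw.io cell XML for one diagram
}

// Wrap a cell fragment in the minimal Draw.io file envelope.
function wrapMxFile(cells: string): string {
  return (
    "<mxfile><diagram><mxGraphModel><root>" +
    '<mxCell id="0"/><mxCell id="1" parent="0"/>' +
    cells +
    "</root></mxGraphModel></diagram></mxfile>"
  );
}

// Generate every diagram; returns [filename, xml] pairs ready to write out.
function generateAll(specs: DiagramSpec[]): Array<[string, string]> {
  return specs.map((s) => [`${s.name}.drawio`, wrapMxFile(s.fn())]);
}
```

Keeping the loop this dumb is deliberate: all the interesting logic lives in the per-diagram functions, so adding diagram #21 is just appending one more entry to the array.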

The 5 Categories Every Enterprise Architecture Needs

Every complex system needs at least these five diagram categories. Skip any of them and you’ll have blind spots that cost you during incident response, audits, or onboarding.

| Category | Diagrams | Who Reads Them |
|---|---|---|
| Data Architecture | ERDs, data flows, event architecture, state machines | Developers, DBAs, compliance |
| Infrastructure + Security | Network topology, zero trust, identity, monitoring | Platform engineers, security, auditors |
| DevOps + Deployment | CI/CD, environment topology, IaC, container architecture | DevOps, release managers |
| Integration + AI | Integration map, AI pipelines, RAG architecture, migration path | Architects, data engineers, AI team |
| Power Platform + Governance | Platform footprint, approval flows, environments, governance | CoE team, admins, business owners |
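One way to make the five-category rule enforceable is a coverage check in the spec file itself. This is a sketch under my own naming (the category identifiers are mine, not the article's): tag each diagram spec with a category and flag any category with zero diagrams.

```typescript
// The five diagram categories; skip one and the check below flags it.
// Category names are illustrative, not from the article's actual script.
const CATEGORIES = [
  "data",
  "infra-security",
  "devops",
  "integration-ai",
  "power-platform",
] as const;
type Category = (typeof CATEGORIES)[number];

interface CategorizedSpec {
  name: string;
  category: Category;
}

// Return the categories with no diagram yet - the documentation blind spots.
function missingCategories(specs: CategorizedSpec[]): Category[] {
  const covered = new Set(specs.map((s) => s.category));
  return CATEGORIES.filter((c) => !covered.has(c));
}
```

Run this as part of the quality gate and "we forgot to document the network" becomes a build failure instead of an audit finding.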

Data Architecture: ERDs, Flows, Events, and State Machines

The Core ERD

This is always diagram #1. Before anyone writes a line of code, they need to see the data model. Cascade’s Dataverse schema has 6 core tables across 3 domains - clinical (blue), provider (green), and administrative (amber/purple/gray).

Entity Relationship Diagram showing 6 Dataverse tables: Patient, Clinical Case, Clinician, Appointment, Document, and Audit Log with relationship arrows
Core ERD for Cascade Dynamics. Color-coded by domain: clinical (blue), provider (green), scheduling (amber), documents (purple), audit (gray). All relationships labeled with cardinality.

What makes this ERD useful: column-level detail (not just table names), color coding by domain, and relationship cardinality on every edge. The ai_summary column on the Document table tells you immediately that AI processing is happening at the data layer.

For a deeper look at generating beautiful ERDs from Dataverse schemas, see Generate a Beautiful Dataverse ERD in 5 Minutes.

Data Ingestion Pipeline

The Oracle-to-Azure migration runs nightly batch feeds through SFTP. This diagram shows the flow from legacy to cloud, splitting structured data to Azure SQL and documents to Cosmos DB.

Data flow pipeline from Oracle 19c through SFTP and Blob Storage to Data Factory, splitting to Azure SQL for structured data and Cosmos DB for documents
Nightly data ingestion from legacy Oracle. SFTP batch upload to Blob Storage triggers Data Factory, which routes structured data to SQL and documents to Cosmos.

Event-Driven Architecture

Service Bus handles the async event distribution. Three topic categories (case events, document events, audit events) feed into Function App consumers. This is the backbone - every state change in the system flows through here.

Event-driven architecture with Service Bus distributing events to 4 Function App consumers through 3 topic categories
Service Bus topology with 3 topic categories and 4 Function consumers. Each consumer has a single responsibility: notification, AI processing, search indexing, or compliance logging.

Case Lifecycle State Machine

Clinical cases move through 7 states with a decision gateway. The rejection loop (Returned -> Draft) is the one that causes the most bugs - state transitions that go backward need careful handling in Power Automate.

State machine showing clinical case lifecycle: Draft to Submitted to In Review with decision gateway to Approved/Returned, then Active and Closed
Case lifecycle state machine. The 'Returned' state loops back to Draft - this backward transition is where most workflow bugs live.
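The lifecycle above can be encoded as an explicit transition map, which makes the backward Returned -> Draft edge visible in code and rejects everything else. A sketch, with state names taken from the diagram; the guard function is mine, not a Power Automate construct:

```typescript
type CaseState =
  | "Draft" | "Submitted" | "In Review"
  | "Approved" | "Returned" | "Active" | "Closed";

// Allowed transitions, including the backward Returned -> Draft loop.
const transitions: Record<CaseState, CaseState[]> = {
  Draft: ["Submitted"],
  Submitted: ["In Review"],
  "In Review": ["Approved", "Returned"], // decision gateway
  Returned: ["Draft"], // the backward edge where most workflow bugs live
  Approved: ["Active"],
  Active: ["Closed"],
  Closed: [], // terminal state
};

function canTransition(from: CaseState, to: CaseState): boolean {
  return transitions[from].includes(to);
}
```

With the map in one place, the diagram and the workflow guard can be generated from the same data, so they cannot drift apart.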

Infrastructure + Security: Network, Zero Trust, Identity, and Observability

Network Topology

Hub-spoke VNet design. The hub hosts the firewall, bastion, and DNS. App and data spokes are peered to the hub. On-premises Oracle connects through VPN/ExpressRoute. This is the diagram your network team and auditors ask for first.

Hub-spoke VNet topology with Hub VNet containing Firewall, Bastion, and DNS, connected to App Spoke with AKS and App Service, Data Spoke with SQL and Cosmos, and on-premises Oracle via VPN
Hub-spoke network topology. Hub VNet (10.0.0.0/16) peers to app and data spokes. On-premises connection via VPN/ExpressRoute to the legacy Oracle environment.

Zero Trust Security Model

Entra ID at the center, four pillars radiating out: Conditional Access (MFA), Key Vault (managed identities), Sentinel (SIEM), and NSG Rules (micro-segmentation). Each pillar maps to a concrete implementation below it.

Zero trust security model with Entra ID at center connecting to Conditional Access, Key Vault, Sentinel, and NSG Rules, each mapping to implementation details
Zero trust security model. Every component maps from a principle (Conditional Access) to an implementation (MFA required for all users, all apps).

Identity Architecture

The authentication flow from end to end. Users authenticate through Entra ID, pass Conditional Access (MFA + device compliance), receive a JWT, hit API Management, and get routed to the appropriate backend API. This is the diagram you hand to the penetration testing team.

Identity flow from Users through Entra ID and Conditional Access to API Management, routing to Clinical API and Document API backends
End-to-end identity flow. Every request passes through 4 checkpoints before reaching backend services.

Monitoring and Observability

Three telemetry sources (App Services, AKS, Functions) feed into Application Insights and Log Analytics, which converge on Azure Monitor. From there, security events go to Sentinel and operational metrics go to dashboards. This is how you answer “what broke at 3 AM.”

Monitoring stack showing App Services, AKS, and Functions feeding into Application Insights and Log Analytics, converging on Azure Monitor with outputs to Sentinel and dashboards
Observability stack. Three telemetry sources converge on Azure Monitor, which routes security alerts to Sentinel and operational metrics to Power BI dashboards.

DevOps + Deployment: CI/CD, Topology, IaC, and Containers

CI/CD Pipeline

Azure DevOps runs the pipeline. Code push triggers build and unit tests, then integration and security testing, then an approval gate, then staging with smoke tests, and finally production with blue-green deployment. The ACR (Container Registry) stores the images.

CI/CD pipeline from Azure DevOps through Build, Test, approval Gate, Staging, and Production stages with ACR for container images
CI/CD pipeline with 5 stages. The approval gate between testing and staging is where most deployments pause for human review.

Deployment Topology

Three environments, each with identical resource sets. Dev (blue), Test/UAT (amber), Production (green). The promote/approve edges show the one-way flow - nothing goes backward from prod to dev.

Three deployment environments (Development, Test/UAT, Production) each containing App Service, SQL Database, and Cosmos DB with promote and approve edges between them
Deployment topology. Each environment has identical resource types (App Service, SQL, Cosmos) with promotion gates between them.

Infrastructure as Code

Bicep templates live in git. An Azure DevOps pipeline runs ARM what-if validation on every change, then deploys to the resource groups, and Azure Policy validates compliance at every deployment. No manual portal clicks allowed.

IaC pipeline from Bicep Templates through Azure DevOps to ARM Deploy, splitting to Resource Groups and Azure Policy validation
Infrastructure as Code pipeline. Bicep templates are the single source of truth. Azure Policy validates every deployment against compliance rules.

Container Architecture

The AKS cluster hosts 4 workloads: Clinical API, Document API, Event Processor, and an Auth Sidecar (DaemonSet). ACR provides container images, Application Gateway handles ingress, and Key Vault provides secrets through the auth sidecar.

AKS cluster containing Clinical API, Document API, Event Processor, and Auth Sidecar deployments, with ACR, Application Gateway, and Key Vault connected externally
AKS cluster architecture. The Auth Sidecar (DaemonSet) handles secret rotation from Key Vault so application pods never touch secrets directly.

Integration + AI: APIs, Document Processing, RAG, and Migration

Enterprise Integration Map

API Management sits at the center. On the left: consumer systems (Power Platform, Power Pages portal, mobile app). On the right: backend services (App Services, Functions). Below: legacy Oracle and Dataverse. Every system-to-system call routes through APIM.

Enterprise integration map with API Management at center, connecting Power Platform, Power Pages, and Mobile App on the left to App Service and Functions on the right, with Oracle and Dataverse below
Enterprise integration map. APIM is the single gateway for all system-to-system communication. The legacy Oracle adapter (dashed line) handles the SFTP/CDC bridge.

AI Document Processing Pipeline

Clinical documents flow through 6 steps: upload, blob storage, AI Document Intelligence (form extraction), Azure OpenAI (summarization), Cosmos DB (storage), and Power App (display). A clinician uploads a scanned form and gets a structured summary in the case management app.

AI document pipeline from upload through Blob Storage, Doc Intelligence, Azure OpenAI, Cosmos DB to Power App display
AI document processing pipeline. Clinical notes go from PDF to structured summary in seconds. Doc Intelligence extracts fields, OpenAI generates the summary, Cosmos stores the results.

RAG Architecture

The Retrieval-Augmented Generation pattern. A clinician asks a question, Azure OpenAI sends a vector search to AI Search, which fetches relevant documents from Cosmos DB. The documents flow back to OpenAI as context for a grounded response with citations.

RAG pattern showing Clinician Query to Azure OpenAI, which sends vector search to AI Search, fetches documents from Cosmos DB, and generates a grounded response with citations
RAG architecture. The feedback loop (OpenAI -> Search -> Cosmos -> OpenAI) ensures every response is grounded in actual clinical documents, not hallucinated.
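The grounding step in that loop is essentially prompt assembly: the retrieved chunks are stitched into the context with citation markers the model is told to use. A minimal sketch of that step as a pure function; the shape of `RetrievedDoc` and the prompt wording are my assumptions, not the Azure SDK:

```typescript
interface RetrievedDoc {
  id: string;      // document id in Cosmos DB (assumed shape)
  excerpt: string; // chunk returned by the vector search
}

// Build a grounded prompt: every excerpt gets a [n] marker so the model's
// answer can cite its sources instead of hallucinating.
function buildGroundedPrompt(question: string, docs: RetrievedDoc[]): string {
  const context = docs
    .map((d, i) => `[${i + 1}] (${d.id}) ${d.excerpt}`)
    .join("\n");
  return (
    "Answer using ONLY the context below. Cite sources as [n].\n\n" +
    `Context:\n${context}\n\nQuestion: ${question}`
  );
}
```

Keeping this step pure makes the grounding behavior unit-testable without calling either AI Search or OpenAI.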

Legacy Modernization Path

The migration from Oracle to Azure is a 6-month project. Data Migration Service handles the schema and data transfer, with structured records going to Azure SQL and documents to Cosmos DB. A parallel run period validates data consistency before the Oracle cutover.

Legacy modernization path from Oracle 19c through Data Migration Service to Azure SQL and Cosmos DB, with a parallel run validation period before Oracle cutover
Legacy modernization path. The 6-month parallel run is non-negotiable for healthcare systems - you validate data consistency before decommissioning anything.

Power Platform + Governance: Landscape, Approvals, Environments, and CoE

Power Platform Footprint

The complete Power Platform map. Dataverse is the center of gravity. Power Apps drives the case management UI, Power Automate handles workflow, Power BI powers dashboards, and Copilot Studio provides the AI assistant. API Management bridges the Power Platform to Azure backend services.

Power Platform footprint showing Dataverse at center with Power Apps, Power Automate, Power BI connecting from top, Copilot Studio and API Management from bottom
Power Platform footprint. Dataverse is the center of gravity. Every platform component reads from and writes to the same data layer.

Approval Flow Architecture

Clinical cases need 3 approval steps: manager review, clinical review (Dr. Rahman’s team), and compliance check (HIPAA). Power Automate orchestrates the multi-step approval with parallel notifications at each stage.

Approval flow from Case Submit through Manager Review, Clinical Review, and Compliance check to Approved status
Multi-step approval flow. Three sequential gates, each with different reviewers and criteria. The compliance check at the end is the one that catches HIPAA violations.

Environment Strategy

The standard Dev -> Test/UAT -> Production promotion path. Solutions export as unmanaged from Dev, import as managed into Test, and deploy to Production only after approval. Connection references and environment variables handle the per-environment configuration.

Environment strategy from Power Platform through Dev (unmanaged), Test/UAT (managed import), to Production (managed only) with Dataverse
Environment promotion strategy. Dev is unmanaged (experiment freely), Test imports managed solutions, Production accepts managed solutions only.

Governance Model

The Center of Excellence toolkit sits at the middle. DLP policies block risky connectors, CoE Starter Kit provides inventory and compliance tracking, and Environment Groups handle routing rules. Below: Solution Checker gates code quality, Approval Gates control prod promotion, and Azure Policy enforces infrastructure compliance.

Governance model with Power Platform at top, three pillars (DLP Policies, CoE Starter Kit, Environment Groups) and their implementations below
Governance model. Three pillars (DLP, CoE, Environment Groups) each map to concrete enforcement mechanisms. Azure Policy extends governance to the infrastructure layer.

Why Not Use Visio, Lucidchart, or Miro?

| Capability | Manual Tools | Batch Generation |
|---|---|---|
| Time for 20 diagrams | 3+ weeks | 30 seconds |
| Version control | Binary files, no diffs | XML in git, meaningful diffs |
| Consistency | Manual - varies by author | Code-enforced palette and layout |
| Update cost | Redraw from scratch | Change one line, regenerate |
| Azure icons | Download, import, position | 648 icons referenced by path |
| Quality assurance | Eyeball it | Automated quality gate + visual QA |
| License cost | $$$/year per seat | $0 (open source) |

The real advantage isn’t speed. It’s the fact that these diagrams are code. When the architecture changes, I change the script and regenerate. The diagrams stay current because updating them costs nothing.

The Visual QA Loop That Catches What Code Reviews Miss

Automated quality gates check structure: grid alignment, color palette compliance, edge routing, icon usage. They catch the mechanical errors. But they miss the aesthetic problems that make a diagram confusing - tangled routing, clipped labels, poor spacing, crossed edges.

The fix: export each diagram to PNG, read it visually, critique, fix, and regenerate. The PNG review catches what the structural check cannot. Both checks together give you full coverage.
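The structural half of that gate is simple to write against the generated cells. A sketch of two of the checks, grid alignment and palette compliance; the cell shape and the palette values are illustrative, not the article's actual rules:

```typescript
interface Cell {
  x: number;
  y: number;
  fill?: string; // hex fill color, if the cell is styled
}

// Approved palette (illustrative hex values, not the article's actual colors).
const PALETTE = new Set(["#dae8fc", "#d5e8d4", "#ffe6cc"]);

// Structural quality gate: every cell on the 10px grid, every fill on-palette.
// Returns a list of violations; an empty array means the gate passes.
function qualityGate(cells: Cell[]): string[] {
  const errors: string[] = [];
  for (const c of cells) {
    if (c.x % 10 !== 0 || c.y % 10 !== 0) {
      errors.push(`off-grid cell at (${c.x}, ${c.y})`);
    }
    if (c.fill && !PALETTE.has(c.fill)) {
      errors.push(`off-palette fill ${c.fill}`);
    }
  }
  return errors;
}
```

Checks like these catch mechanical drift; the PNG review described next catches everything they cannot.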

For every diagram in this article, the pipeline was:

  1. Generate .drawio XML from the TypeScript spec
  2. Run the quality gate (grid, palette, edges, icons) - must pass
  3. Export to PNG at 2x scale
  4. Visually review the PNG for aesthetic issues
  5. Fix any coordinate issues in the script
  6. Re-generate and re-verify
  7. Export final SVG with transparent background and embedded icons

The result: 20 diagrams that are structurally correct AND visually clean. No tangled arrows. No clipped text. No crossed edges.

For more on the diagramming pipeline, see Architecture Diagrams with Draw.io MCP and Claude Code. For the broader argument about keeping documentation alive in git, see Living Documentation in Git.

What I’d Actually Recommend

Start with 5 diagrams, not 20. Every enterprise needs at minimum: an ERD, a network topology, an integration map, a CI/CD pipeline, and an environment strategy. Those five cover 80% of the questions stakeholders will ask.

Build the batch script incrementally. Add one diagram at a time, verify it looks right, commit. Don’t try to design all 20 at once.

Use the visual QA loop from day one. I shipped diagrams with broken icons, dark backgrounds, and tangled arrows six times in one session before learning this lesson. Export to PNG, look at it, fix it. Every time.

The goal isn’t perfect diagrams. The goal is diagrams that exist, that are accurate, and that update when the architecture changes. If your documentation strategy requires someone to manually update Visio files, the documentation will be wrong by next week.

Code-generated diagrams are living documentation. That’s the entire point.


Building architecture documentation that stays current? Check out how agentic development works in practice and the full diagramming pipeline with Draw.io MCP.
