The 10-Minute Build: How Specs and AI Produced 14 Power Automate Flows
Fourteen Power Automate notification flows built by AI in 10 minutes, but only because two years of governance made the specs machine-readable. The full architecture story.
The product owner submitted an ADO work item Monday morning. By lunch, 14 Power Automate notification flows were built, packaged into a solution ZIP, and imported into dev.
Not by a team. By one architect and an AI coding assistant.
14 cloud flows. 4 parallel AI agents. 10 minutes of generation time. One clean solution import.
But I need to be honest about that number. The 10 minutes was the execution. The architecture work that made those 10 minutes possible took the better part of two years. Every naming convention, every spec document, every governance decision made over those two years became a machine-readable instruction the AI could follow.
This is the full story.
The Starting Point: Not a Blank Canvas
The Meridian Performance Management system at Apex Federal Solutions was not a fresh environment. When the notification requirement arrived, we already had 10 cloud flows running in production, plus 2 classic workflows. All of them followed the same governance patterns.
Every flow had a tag prefix:
- REV (Review Cycle) - 2 flows handling cycle creation and access teams
- EVL (Evaluation) - 5 flows for evaluator assignment, signer resolution, access teams, and signing config
- STP (Signing Step) - 3 flows managing the signing chain, step advancement, and rejections
The flows were named consistently. Specs lived in git as markdown files, version-controlled and diffable. Each flow did exactly one thing. No multi-concern flows. No flows that both processed business logic and sent emails.
This foundation was not built for AI. It was built because I cared about maintainability. But it turned out to be machine-readable by accident.
When the AI assistant read the spec documents for the first time, it understood the tag convention immediately. It could see that REV01 creates evaluation records, that EVL04 stamps signing configuration, that STP01 advances the signing chain. The naming convention was the documentation, and the AI could read it without any additional explanation.
That accidental readability changed everything.
Deep-dive: Tag-Based Flow Architecture - how 3-letter prefixes make 24 flows manageable.
The Requirement: Sarah Chen’s Work Item
Sarah Chen, our Director of Talent Operations and product owner, submitted the work item on Monday morning. The requirement was clear: the signing workflow needed notification emails. People needed to know when forms were assigned, when signatures were due, when deadlines were approaching, and when rejections happened.
She described 14 separate email notifications:
- Form assignment alerts to authors when evaluations are assigned
- Signing step activation alerts when a signer’s turn arrives
- Self-sign acknowledgment notices with distinct wording from standard signature requests
- Reminder emails at 30, 14, and 7 days before due dates
- Past-due escalation to supervisors when authors or signers miss deadlines
- Rejection notifications to the author and to all previous signers in the chain
- Evaluation completion confirmations with full signer details
- Review cycle launch and close announcements to all participants
The work item also included answers to 6 design questions from a prior planning session. Those answers became binding product owner decisions.
The 5-Conflict Reconciliation
Here is where things got interesting. I already had a draft notification spec in the repo. Sarah’s ADO answers did not fully align with that spec. Five conflicts needed resolution before any flow could be built.
| # | Conflict | Original Spec | PO Decision | Resolution |
|---|---|---|---|---|
| 1 | Digest content | List all open items | New items + summary count of older pending | Updated to PO decision |
| 2 | Past-due frequency | Daily to author only | Weekly to author + daily supervisor escalation | Added supervisor escalation flows |
| 3 | Rejection scope | Notify author only | Notify author + all previous signers | Added rejection-to-signers flow |
| 4 | Completion notification | Not specified | Yes, with full signer chain details | Added evaluation-complete flow |
| 5 | Self-sign wording | Generic signing language | Distinct acknowledgment wording | Split into two separate flows |
Conflict #5 is worth calling out. Sarah wanted employees signing their own evaluations to see “Ready for your acknowledgment” instead of “Ready for your signature.” That single wording distinction meant splitting one flow into two: NTF-EMAIL-02 for standard signatures and NTF-EMAIL-03 for self-sign acknowledgments. Different recipient logic. Different subject lines. Different body text. One flow could not serve both purposes without becoming a branching mess.
The reconciliation expanded the original 10 notification types into 14 distinct flows. Every conflict resolution was documented in the spec before any JSON was generated. Documentation first. Always.
Deep-dive: Spec-First Development - why specs should exist before the Power Automate designer opens.
What Architecture Decisions Made the 10-Minute Build Possible?
Four decisions made weeks before the build determined its success: notifications never write to Dataverse, daily digests replace real-time triggers, each flow handles exactly one email template, and a tag convention groups flows by function. These constraints gave the AI clear boundaries and made the output production-ready without rework.
None of them were made during the 10-minute build. All four came out of spec review, weeks earlier.
1. Notifications Never Write to Dataverse
This is the single most important architectural decision. Every notification flow is read-only. They query Dataverse tables, build email bodies, send emails through a shared mailbox, and exit. They never update a record. They never set a status. They never modify the signing chain.
Why? Because a notification failure must never break the business process. If NTF-EMAIL-02 fails to send a “Ready for Signature” email, the signing step is still awaiting the signer. The signer can still open the form and sign. The business flow is unaffected.
The reverse is also true. The 10 business flows (REV, EVL, STP) were never modified when notifications were added. Zero changes to production logic. Zero regression risk.
2. Daily Digest, Not Real-Time
When REV01 opens a review cycle, it bulk-creates dozens of evaluation records. Real-time triggers on evaluation creation would flood a supervisor with 15+ emails in seconds. That is not a notification. That is spam.
The daily digest pattern solves this. 12 of the 14 flows run on a schedule. Once per day at 8:00 AM Eastern, weekdays only. Each flow queries for records that need attention, groups them by recipient, and sends one consolidated email per person.
The exceptions are the 2 rejection flows (NTF-EMAIL-05 and NTF-EMAIL-06). Rejections are real-time because they are rare, urgent, and result from deliberate human action. A signer rejected a form. The author needs to know now, not tomorrow morning.
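The digest logic each scheduled flow implements can be sketched in a few lines. This is an illustrative Python model of the pattern, not the actual flow definition; the record fields and recipient grouping are assumptions for the example.

```python
# Sketch of the daily-digest pattern: instead of one email per record,
# group the day's pending items by recipient and send one consolidated
# message each. Field names here are illustrative, not the real schema.
from collections import defaultdict

def build_digests(pending_records):
    """Group records needing attention into one digest per recipient."""
    digests = defaultdict(list)
    for record in pending_records:
        digests[record["recipient_email"]].append(record["subject_line"])
    return dict(digests)

records = [
    {"recipient_email": "supervisor@agency.gov", "subject_line": "Eval A"},
    {"recipient_email": "supervisor@agency.gov", "subject_line": "Eval B"},
    {"recipient_email": "author@agency.gov", "subject_line": "Eval C"},
]
# One email per person, no matter how many records were bulk-created.
print(build_digests(records))
```

The same grouping is what prevents the 15-emails-in-seconds flood: a bulk creation of dozens of records collapses into one message per recipient the next morning.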
3. One Flow Per Email Template
14 flows, not 1 flow with 14 branches. This was non-negotiable.
Each flow’s run history shows exactly one email type. When Sarah Chen asks “are reminder emails going out?”, I check NTF-EMAIL-10. When she wants to disable the “Signer Heads-Up” preview emails during a pilot, she turns off NTF-EMAIL-04 without touching anything else.
One flow per template also means one set of FetchXML queries per flow, one trigger configuration, one run history. Debugging is trivial. Inventory tracking is clean.
4. The NTF-EMAIL Tag Convention
The tag NTF-EMAIL was not arbitrary. It leaves room for NTF-INAPP (in-app notifications) and NTF-TEAMS (Teams adaptive cards) in the future. The tag also gave the AI a grouping signal. When I said “build all NTF-EMAIL flows,” the AI knew the scope, the naming pattern, and the architectural constraints without me repeating them.
Deep-dive: Notification Architecture - notifications that cannot break business logic.
The Build: 4 Agents, 14 Flows, 10 Minutes
With specs updated and architecture locked, the actual flow generation was the shortest part of the process.
I parallelized the work across 4 AI agent threads, each responsible for a functional group:
| Agent | Flows | Functional Group |
|---|---|---|
| Agent 1 | NTF-EMAIL-01, 10, 11 | Author-facing scheduled flows |
| Agent 2 | NTF-EMAIL-02, 03, 04 | Signer-facing scheduled flows |
| Agent 3 | NTF-EMAIL-05, 06, 07 | Event-driven and completion flows |
| Agent 4 | NTF-EMAIL-08, 09, 12, 13, 14 | Cycle broadcasts + escalation |
Each agent produced Power Automate Editor format JSON files. The patterns were consistent across all 4 agents because they were documented in the spec before parallel execution began. Every agent read the same spec. Every agent followed the same conventions.
The consistency came from the spec, not from the AI. The AI was the executor. The spec was the source of truth.
What Every Flow Shared
All 14 flows used identical patterns:
- FetchXML queries instead of OData `$filter`, because the temporal operators (`last-x-hours`, `olderthan-x-days`) and linked-entity joins cannot be expressed in OData
- Variable initialization in the top-level action sequence, as Power Automate requires
- The `SharedMailboxSendEmailV2` action sending from [email protected], not the flow owner's personal mailbox
- Sequential concurrency on all `Apply to each` loops (parallelism = 1) to prevent race conditions when building HTML email bodies
- Deep links in every email using the `appid` and `forceUCI=1` parameters for direct navigation to the record
- 3 environment variables (`mrd_EnvironmentURL`, `mrd_MeridianNotificationsMailbox`, `mrd_MeridianAppID`) and 2 connection references (Dataverse + Office 365 Outlook)
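To make two of those shared patterns concrete, here is a sketch of the kind of FetchXML the flows rely on and of the deep-link construction. The table and column names (`mrd_evaluation`, `createdon`) and the URL values are illustrative assumptions, not the actual Meridian schema.

```python
# The last-x-hours temporal operator has no OData $filter equivalent,
# which is why the flows use FetchXML for their 24-hour lookback queries.
LOOKBACK_HOURS = 24

fetch_xml = f"""
<fetch>
  <entity name="mrd_evaluation">
    <attribute name="mrd_name" />
    <filter>
      <condition attribute="createdon" operator="last-x-hours"
                 value="{LOOKBACK_HOURS}" />
    </filter>
  </entity>
</fetch>
""".strip()

def deep_link(env_url: str, app_id: str, table: str, record_id: str) -> str:
    """Model-driven app deep link using the appid and forceUCI=1 parameters."""
    return (f"{env_url}/main.aspx?appid={app_id}&forceUCI=1"
            f"&pagetype=entityrecord&etn={table}&id={record_id}")

url = deep_link("https://org.crm.dynamics.com", "00000000-0000-0000-0000-000000000000",
                "mrd_evaluation", "11111111-1111-1111-1111-111111111111")
```

In the real flows, the environment URL and app id come from the `mrd_EnvironmentURL` and `mrd_MeridianAppID` environment variables rather than literals.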
When I reviewed the output from all 4 agents, the flows were structurally identical where they should have been, and appropriately different where the business logic required it. That is what spec-driven development looks like.
Deep-dive: FetchXML in Power Automate - when OData $filter is not enough.
The Packaging: From JSON to Importable Solution
Having 14 flow JSON files is not the same as having a deployable solution. The packaging pipeline turned those files into something Dataverse could import.
1. Base solution export as template. I exported an existing solution as the starting point. This provided the correct XML schemas, content types, and publisher information.
2. Format conversion. PA Editor format JSON is not the same as solution export format. The conversion adds a properties wrapper, simplifies connection references, and sets the correct `schemaVersion`.
3. Deterministic GUIDs via UUID v5. Same input always produces the same GUID. Re-running the build updates existing flows rather than creating duplicates. This is critical for iterative development.
4. XML manifest updates. `customizations.xml` gets a Workflow entry for each flow. `solution.xml` gets a version bump and RootComponent entries.
5. ZIP creation with forward slashes. Node.js archiver with forward-slash path separators. PowerShell Compress-Archive creates backslashes, which causes silent import failures in Dataverse.
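Two of those steps are easy to show in miniature. The pipeline itself is Node.js, but the same ideas sketch cleanly in Python: name-based UUID v5 GUIDs for determinism, and a ZIP whose entry names use forward slashes (Python's zipfile does this by default). The namespace seed and file names are illustrative assumptions.

```python
# Deterministic flow GUIDs: the same flow name always yields the same
# GUID, so re-running the build updates flows instead of duplicating them.
import io
import uuid
import zipfile

# A fixed namespace per solution; the seed string here is illustrative.
NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "meridian.solution")

def flow_guid(flow_name: str) -> str:
    return str(uuid.uuid5(NAMESPACE, flow_name))

# ZIP entries must use forward slashes for Dataverse to import them.
# zipfile.writestr stores the entry name exactly as given.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("Workflows/NTF-EMAIL-01.json", "{}")
with zipfile.ZipFile(buf) as z:
    entries = z.namelist()  # forward-slash paths, never backslashes
```

The Node.js pipeline gets the same guarantees from a UUID v5 library and from archiver's explicit forward-slash entry names.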
The ZIP imported cleanly on the first attempt. All 14 flows appeared in the solution, linked to the correct connection references and environment variables. None of this would work with loose flows. Every flow was solution-aware from the start.
Deep-dive: Building Solution ZIPs - the undocumented packaging guide.
What the AI Got Wrong: 3 Mistakes in the First 10 Minutes
I would not trust this article if it only told the success story. Here is what the AI got wrong.
| # | AI Proposed | Why It Was Wrong | Correction |
|---|---|---|---|
| 1 | Real-time triggers for all 14 notifications | REV01 bulk-creates dozens of evaluations. Real-time triggers would flood recipients with 15+ emails in seconds. | Daily digest pattern with 24-hour lookback |
| 2 | Add email actions inside existing business flows | Mixing concerns: notification failure would risk breaking the signing chain | Completely separate NTF-EMAIL flows |
| 3 | Generic flow names like 'Send notification email' | Impossible to monitor, debug, or selectively disable individual notification types | NTF-EMAIL-## convention, one flow per template |
Every one of these mistakes was reasonable from a surface-level perspective. Real-time notifications sound better than daily digests. Adding an email action to an existing flow sounds simpler than building a new one. A single notification flow sounds more efficient than 14 separate flows.
But every one of these proposals would have created problems in production. The AI was optimizing for simplicity. I was optimizing for maintainability, isolation, and operational clarity.
Here is the part that matters. I explained the reasoning once for each correction. The AI did not just fix the one flow. It incorporated the correction into all subsequent work. When I told Agent 1 why daily digest was necessary, Agents 2 through 4 never proposed real-time triggers. When I explained separation of concerns to Agent 3, it never suggested embedding notifications in business flows.
The correction pattern was not a one-off fix. It was a permanent architectural adjustment. That is the difference between AI as autocomplete and AI as a development partner.
Deep-dive: What AI Gets Wrong - and why human correction is the point.
The Full Inventory: 24 Flows in One System
After the notification work, Meridian Performance Management runs 24 cloud flows organized into 4 functional groups:
| Tag Prefix | Domain | Flows | Trigger Type |
|---|---|---|---|
| REV | Review Cycle | 2 | Dataverse events |
| EVL | Evaluation | 5 | Dataverse events |
| STP | Signing Step | 3 | Dataverse events |
| NTF-EMAIL | Email Notifications | 14 | 12 scheduled + 2 event-driven |
The 10 business flows and 14 notification flows share zero logic. A notification flow failure never affects the signing chain. A business flow update never requires changes to notifications. Sarah Chen can disable any notification independently without coordinating with the development team.
The flows ship in priority tiers, not all at once:
- P1 (11 flows): Must-have for go-live. Core signing workflow notifications.
- P2 (1 flow): NTF-EMAIL-04 Signer Heads-Up. Informational, can follow shortly after go-live.
- P3 (2 flows): Cycle broadcasts (NTF-EMAIL-08 and NTF-EMAIL-09). Can be deferred until the first full review cycle.
This phased approach means we can ship the critical notifications immediately and validate the digest pattern with real users before rolling out the lower-priority flows.
Deep-dive: Flow Inventory - you can’t govern what you can’t see.
The Governance Payoff
Every governance practice I had invested in before the notification work became a force multiplier when AI entered the picture.
| Governance Practice | Pre-AI Value | Post-AI Value |
|---|---|---|
| Tag-based naming (REV/EVL/STP) | Clean inventory and monitoring | AI used tags to batch flows for parallel generation |
| Specs in git (markdown, version-controlled) | Human documentation and PR reviews | AI read the spec and generated 14 flows directly from it |
| One flow per logical operation | Clear run history and debugging | AI followed the same pattern without being told twice |
| Solution-aware flows | Clean ALM and deployment | Packaging pipeline generated deterministic GUIDs |
| Environment variables | Multi-environment deployment | All 14 flows used the same 3 env vars with zero hardcoded values |
| Source control for flow JSON | Diffable PRs and rollback | Every flow JSON tracked in git, reviewable before import |
Without tags, the AI had no grouping signal. Without specs, the AI had no instructions. Without separation of concerns, the AI would have modified the business flows. Without naming conventions, the AI would have produced 14 flows named “Send notification.”
The 10-minute build was not an AI achievement. It was a governance achievement that AI made visible.
What I Would Do Differently
Two things.
First, I would have written the notification spec earlier. The draft spec existed, but it had gaps that Sarah’s ADO answers exposed. If the spec had been complete before the work item arrived, the 5-conflict reconciliation would not have been necessary. The build time stays at 10 minutes, but the prep time drops significantly.
Second, I would have built the solution packaging pipeline before the notification work. I built it during the project because I needed it. If it had existed as a reusable tool, the entire process from spec to import would have been even faster.
Both of these are governance investments. Neither is about AI. The AI is the accelerator. The governance is the engine.
The Takeaway
14 Power Automate flows in 10 minutes sounds like an AI story. It is not. It is a governance story.
The AI did not know that real-time triggers would flood recipients. I told it. The AI did not know that notifications should be separated from business logic. I told it. The AI did not know that NTF-EMAIL-04 should be independently disableable. I told it.
What the AI did was execute. Fast. Consistently. Across 4 parallel threads. It took the patterns I had established over months of governance work and replicated them 14 times without deviation, without fatigue, and without the inconsistencies that creep in when a human builds flow #12 at 4 PM on a Friday.
The 10-minute build was the dividend. The governance was the investment.
If you are building on Power Platform and thinking about AI-assisted development, start with governance. Name your flows. Write specs. Put your flow JSON in source control. Build a flow inventory.
Then, when the AI arrives, it has something to work with.
Spec-Driven Power Platform Series
This article is part of a series on building Power Automate solutions with specs, governance, and AI:
- Tag-Based Flow Architecture - How 3-letter prefixes make 24 flows manageable
- Spec-First Development - Why specs should exist before the designer opens
- Notification Architecture - Notifications that cannot break business logic
- FetchXML in Power Automate - When OData $filter is not enough
- Building Solution ZIPs - The undocumented packaging guide
- What AI Gets Wrong - And why human correction is the point
- 14 Flows in 10 Minutes (this article) - The full story
AZ365.ai - Azure and AI insights for architects building on Microsoft. Follow Alex on LinkedIn for architecture deep dives.