
Power Platform Pipelines - Moving Flows from Dev to Prod with Approvals

Pipelines in Power Platform are Microsoft's built-in CI/CD. Most orgs don't know they exist. Setup guide with approval gates and deployment settings.

Alex Pechenizkiy · 8 min read

“How do you deploy flows to production?”

I ask this question in every governance workshop. The most common answer is still “export as managed, import manually, cross fingers.” Sometimes they use Azure DevOps pipelines. Sometimes it is a SharePoint list with “ready to deploy” statuses and a person who runs imports on Tuesday mornings.

Microsoft shipped a first-party ALM tool called Power Platform Pipelines. It has been generally available since 2023. It lives inside the platform itself. No Azure DevOps subscription needed. No YAML. No pipeline agents. And most organizations running Power Platform today do not know it exists.

[Figure: Pipeline architecture from Dev through approval gates to Test and Production]

What Pipelines in Power Platform Actually Are

Pipelines are Microsoft’s built-in CI/CD for Dataverse solutions. They are not Azure DevOps Pipelines. They are not GitHub Actions. They are a native feature inside Power Platform that handles solution export, validation, approval, and import across environments.

The core idea: you configure a pipeline once (Dev to Test to Prod), and makers deploy from inside the maker portal by clicking a button. The system handles the export, validation, approval gates, connection mapping, and import. The maker never touches a ZIP file.

The Architecture

Every pipeline setup has three parts:

  1. Pipeline Host Environment. A dedicated environment (usually production type) that stores pipeline configuration, deployment history, and solution artifacts. Microsoft provides a default platform host, or you can create a custom one.
  2. Development Environments. Where makers build. These are linked to the pipeline as source environments.
  3. Target Environments. Test, QA, Production. These are the stages in your pipeline, deployed to sequentially.

The key constraint: you cannot skip stages. If your pipeline goes Dev to Test to Prod, you must deploy to Test first. The same solution artifact that passed through Test gets deployed to Prod. No re-export, no tampering, no “let me just make one more change before prod.”

This is intentional. It prevents customizations from bypassing your QA process.

Setting Up Your First Pipeline

  1. Create or identify your host environment. Go to the Power Platform admin center > Deployments > Pipelines. You can use the default Platform host or create a custom host environment. For most organizations, the Platform host is fine to start.

  2. Open the Deployment Pipeline Configuration app. In the host environment, open the Deployment Pipeline Configuration model-driven app. This is where you define pipelines, stages, and link environments.

  3. Create a pipeline. Name it something meaningful like 'Core Business Apps Pipeline.' Add your stages in order: Dev > Test > Prod. Each stage links to a target environment.

  4. Link development environments. Associate your development environments with the pipeline. Makers in these environments will see the pipeline when they open Solutions.

  5. Configure target environments as Managed Environments. All target environments must be Managed Environments. You can enable this manually or set it to auto-convert in the admin center under Deployments > Settings.

  6. Test with a simple solution. Create a small solution in your dev environment, navigate to Solutions, and look for the Pipelines option. Select Deploy, choose your target stage, and watch it work.

Approval Gates with Delegated Deployments

Out of the box, any maker with access can deploy. That is fine for dev-to-test. For production, you want approval gates. Pipelines handle this through delegated deployments.

When you enable delegated deployments on a pipeline stage, deployments require approval from an authorized identity before they proceed. The deployment runs under a service principal or a designated pipeline stage owner instead of the maker. This means makers can request deployments to production without having System Administrator access in the target environment.

How It Works

The pipeline extensibility model gives you three gated extension points. Think of them as stops on a train where you control whether it continues or not.

  1. Pre-export step. Runs custom validation when a deployment request is submitted. The system will not export the solution from the dev environment until your logic marks this step complete. Use this for things like ensuring all flows are turned off or naming conventions are followed.

  2. Delegated deployment (approval). The main approval gate. When enabled, the deployment pauses after export and waits for approval. You wire up a Power Automate cloud flow using the OnApprovalStarted trigger. Inside that flow, you can add an approval action, send it to the right people, and then call the UpdateApprovalStatus Dataverse action (20 = approved, 30 = rejected).

  3. Pre-deployment step. An additional gate after approval but before the actual import. Useful for final sign-offs or automated checks.

All three can be used together or independently. For most organizations, the delegated deployment (approval gate) alone covers the core need.
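For illustration, resolving the approval gate boils down to a single Dataverse action call from your cloud flow. A hedged sketch of what that request looks like over the Dataverse Web API — the helper, URL version, and payload field names are assumptions for illustration; only the UpdateApprovalStatus action name and the 20/30 status codes come from the platform:

```python
# Sketch: the Dataverse call a cloud flow makes to resolve a pipelines
# approval gate. Hypothetical helper, not an official SDK; payload field
# names are assumed for illustration.
import json
import urllib.request

APPROVED, REJECTED = 20, 30  # status codes from the pipelines extensibility model

def build_approval_request(host_env_url: str, approve: bool, comments: str = "") -> urllib.request.Request:
    """Build the POST that marks a pending deployment approved or rejected."""
    payload = {
        "ApprovalStatus": APPROVED if approve else REJECTED,  # assumed field name
        "Comments": comments,                                 # assumed field name
    }
    return urllib.request.Request(
        url=f"{host_env_url}/api/data/v9.2/UpdateApprovalStatus",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Example: approve a pending deployment in the host environment
req = build_approval_request("https://myhost.crm.dynamics.com", approve=True, comments="QA sign-off")
```

In a real flow you would add this as a "Perform an unbound action" step after the approval action completes, branching on the approval outcome.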

Service Principal Setup

For production deployments, use a service principal as the deploying identity:

  1. Create an enterprise application (service principal) in Microsoft Entra ID
  2. Add it as a server-to-server user in the host and each target environment
  3. Assign Deployment Pipeline Administrator role in the host, System Administrator in targets
  4. On the pipeline stage, check “Is delegated deployment,” select “Service Principal,” and enter the Client ID
  5. Create a cloud flow in the host environment using the OnApprovalStarted trigger
  6. Add your approval logic, then call UpdateApprovalStatus using the service principal’s connection
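Under the hood, the service principal in the steps above authenticates to Dataverse with the OAuth 2.0 client-credentials grant. A minimal sketch of the token request it issues — every ID, secret, and URL below is a placeholder, and the helper itself is illustrative:

```python
# Sketch of the OAuth 2.0 client-credentials request a pipeline service
# principal uses to authenticate against Dataverse. All IDs, secrets, and
# URLs are placeholders.
import urllib.parse

def build_token_request(tenant_id: str, client_id: str, client_secret: str, env_url: str):
    """Return (token_endpoint, url_encoded_form_body) for the token call."""
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Dataverse expects the target environment URL as the resource scope
        "scope": f"{env_url}/.default",
    })
    return endpoint, body

endpoint, body = build_token_request(
    "00000000-0000-0000-0000-000000000000",  # tenant ID (placeholder)
    "11111111-1111-1111-1111-111111111111",  # app (client) ID from Entra
    "placeholder-secret",                    # client secret (placeholder)
    "https://myorg.crm.dynamics.com",
)
```

The access token that comes back is what the pipeline uses to import the solution under the service identity rather than the maker's account.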

The service principal approach means flows deployed to production are owned by a service identity, not by “Karen from accounting who left six months ago.” This solves the single biggest ALM headache in Power Platform.


Building a Power Platform governance practice? Follow Alex on LinkedIn for architecture patterns, ALM strategies, and governance deep dives every week.


Deployment Settings: Connection References and Environment Variables

Connection references and environment variables hold different values per environment — your dev SharePoint site isn’t the same as production. See the environment strategy article for the full explanation and a worked example.

Pipelines handle the mapping with deployment settings. During deployment, the system prompts for connection references and environment variables. For delegated deployments, admins can preconfigure these values using deployment settings files (JSON) generated with the Power Platform CLI.

pac solution create-settings --solution-zip MySolution.zip --settings-file deploymentSettings.json

This generates a JSON file with placeholders for connection IDs and environment variable values. Fill in the target environment specifics, and the pipeline uses them during import. No manual post-deployment configuration.
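The generated file has roughly this shape — the schema name, logical name, and connector ID below are illustrative, and the empty ConnectionId is the value you fill in per target environment:

```json
{
  "EnvironmentVariables": [
    {
      "SchemaName": "contoso_SharePointSiteUrl",
      "Value": "https://contoso.sharepoint.com/sites/prod"
    }
  ],
  "ConnectionReferences": [
    {
      "LogicalName": "contoso_sharedsharepointonline_ref",
      "ConnectionId": "",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_sharepointonline"
    }
  ]
}
```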

Admins can even preconfigure certain connections that get used automatically, so makers don’t need to provide connection details at all.

Solution Checker Validation

Before a solution reaches your target environment, it runs through the solution checker. In Managed Environments, you control this with two settings:

  • Warn: solution imports with a warning if there are critical issues, and an email goes to admins
  • Block: solution import is canceled if there are critical issues

The recommended configuration:

| Environment Type | Solution Checker | Send Emails |
| --- | --- | --- |
| Default | Block | Yes |
| Developer | Warn | No |
| Sandbox / Test | Warn | No |
| Production | Block | Yes |

This catches known anti-patterns: deprecated API usage, security issues, web resource problems, and accessibility violations. It does not, however, review the quality of your Power Automate flows. A flow with zero error handling, hardcoded URLs, and naming violations passes solution checker without a warning. That gap is real, and we will cover it in the AI-powered flow review article.

Pipelines vs Azure DevOps vs Manual

| Capability | Pipelines in Power Platform | Azure DevOps / GitHub Actions | Manual Export/Import |
| --- | --- | --- | --- |
| Setup complexity | Low - config in admin center | High - YAML, agents, service connections | None |
| Approval gates | Built-in delegated deployments | Release gates, environment approvals | Email or Teams message |
| Connection mapping | Prompted during deployment | Deployment settings JSON in pipeline | Manual after import |
| Solution checker | Automatic in Managed Environments | Pipeline task (PAC CLI) | Manual or skipped |
| Artifact storage | Automatic in host environment | Pipeline artifacts or repo | Hopefully someone saved the ZIP |
| Maker experience | Click Deploy in maker portal | Makers don't touch it (pro dev owned) | Export ZIP, email it, import |
| Extensibility | Power Automate triggers, Dataverse events | Full YAML customization | None |
| Cross-tenant | No | Yes | Yes (manual) |
| Cost | Managed Environment licensing | Azure DevOps subscription + licensing | Free (but expensive in mistakes) |

The sweet spot: use Pipelines for most deployments, and layer Azure DevOps or GitHub on top when you need source control integration, cross-tenant deployment, or complex branching strategies. Pipelines can be extended to integrate with both Azure DevOps and GitHub - they are not mutually exclusive.

Default Deployment Pipelines for Environment Groups

A newer capability: you can set a default deployment pipeline for environment groups. When a maker in a development environment tries to share their solution with users, the system prompts them to deploy through the pipeline first. This nudges makers toward the right process without blocking them outright.

Combined with the admin deployment hub in the Power Platform admin center, admins get visibility into all deployments across the tenant, can approve or reject requests, retry failures, and manage pipeline configuration from a single place.

Do Power Platform Pipelines Replace Azure DevOps?

Not entirely. Power Platform Pipelines cover the most common deployment scenarios without any external tooling, and they work well for teams that want maker-friendly CI/CD inside the platform. However, Azure DevOps and GitHub Actions still win for cross-tenant deployments, complex branching strategies, and full source control integration. Many enterprises use both together.

What Pipelines Do Not Cover

Pipelines move solutions reliably. They validate against solution checker rules. They handle approvals and connection mapping. But they have a blind spot.

Nobody reviews the actual content of the flows being promoted. The approval step is a rubber stamp if the approver does not open the solution and inspect every flow definition. And nobody does that for a solution with 15 flows.

Solution checker catches structural anti-patterns. It does not catch business logic problems: missing error handling, hardcoded values that should be environment variables, unapproved connectors, inconsistent naming, or scope nesting so deep that nobody can debug it.

That is the gap where AI-powered flow review fits in. We cover that in the third article in this series.


Power Automate Governance - The Enterprise Playbook

This article is part of a 10-part series:

  1. Naming Conventions That Scale
  2. Environment Strategy - Dev Test Prod
  3. Solution-Aware Flows
  4. Flow Inventory
  5. Pipelines - Dev to Prod
  6. CoE Starter Kit
  7. AI-Powered Flow Review
  8. Versioning and Source Control
  9. The Governance Repo
  10. Weekly Governance Digest

AZ365.ai - Azure and AI insights for architects building on Microsoft. Follow Alex on LinkedIn for architecture deep dives.
