Logic Apps as MCP Servers - The Architecture That Actually Works
Turn Azure Logic Apps into MCP servers for AI agents. Two approaches, auth gotchas, cost math, and the architecture diagram Microsoft didn't draw.
Microsoft just turned Azure Logic Apps into MCP servers. In preview, as of March 2026. This means your 1,400+ Logic Apps connectors - Dataverse, SharePoint, SQL, Outlook, SAP, ServiceNow - are now callable tools for AI agents in Claude Code, VS Code Copilot, or Cursor.
That’s the headline. Here’s what the docs don’t tell you.
What Is an MCP Server and Why Should You Care?
MCP - Model Context Protocol - is the open standard that lets AI agents call external tools. When you ask Claude Code to “send an email to the project team,” it needs a tool that actually sends email. The MCP server provides that tool.
Until now, building an MCP server meant writing TypeScript or Python that wraps an API. You’d build an HTTP endpoint, handle auth, parse requests, call the downstream service, return results. For every integration. Manually.
Logic Apps as MCP servers changes this. You pick a connector, select the actions you want to expose, and Azure builds the MCP endpoint. The AI agent discovers the tools and calls them. No custom code.
The architecture is straightforward:
- MCP Client (Claude Code, VS Code, Cursor) connects to your MCP server
- MCP Server (Logic App Standard) exposes workflows as callable tools
- Connectors (1,400+) do the actual work - querying databases, sending emails, creating records
- Enterprise Systems (Dataverse, SQL, SharePoint, SAP) are the final destinations
The protocol is JSON-RPC 2.0 over Server-Sent Events (SSE). The transport is HTTP. Your MCP server runs in Azure, your MCP client runs locally. Remote, not local.
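Under the hood, every tool interaction is a JSON-RPC 2.0 message. Here's a minimal sketch of the two core requests an MCP client issues - tool discovery, then a tool call - assuming a hypothetical `send_email` tool exposed by the Logic App (the tool name and arguments are placeholders, not the actual schema Azure generates):

```python
import json

# JSON-RPC 2.0 envelopes an MCP client sends to the server over HTTP.
# First request: ask the server what tools it exposes.
list_tools = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# Second request: invoke one of the discovered tools by name.
# "send_email" and its arguments are hypothetical placeholders.
call_tool = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "send_email",
        "arguments": {"to": "team@contoso.com", "subject": "Status update"},
    },
}

print(json.dumps(call_tool, indent=2))
```

The client never needs to know it's talking to a Logic App - it just sees tools with names and JSON schemas, which is the whole point of the protocol.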
How Do You Set Up Logic Apps as an MCP Server?
There are two approaches. Microsoft presents them as equivalent options. They’re not.
Approach 1: Direct from Logic App (Fast, No Governance)
Open your Logic App Standard in the Azure portal. Under Agents, you’ll see a new MCP servers blade. Create a server, select existing workflows or build new ones, choose auth (OAuth or API key), done.
Your AI agent connects directly to the Logic App’s MCP endpoint. No intermediary.
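As a sketch, a Claude Code project could register that endpoint in its `.mcp.json` roughly like this (the server name, URL path, and header name are placeholders - use the exact endpoint and auth scheme shown in your Logic App's MCP servers blade):

```json
{
  "mcpServers": {
    "logicapp-tools": {
      "type": "sse",
      "url": "https://my-logic-app.azurewebsites.net/api/mcp",
      "headers": { "x-api-key": "${LOGICAPP_MCP_KEY}" }
    }
  }
}
```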
When to use this: Dev/test environments. Personal productivity tools. Internal prototyping. Situations where you control both the client and the server.
What’s missing: No rate limiting. No usage analytics. No API key rotation policy. No centralized catalog for other teams to discover your server. If someone hammers your endpoint, there’s no throttle.
Approach 2: Via API Center (Enterprise, Full Governance)
The enterprise path adds two layers: API Management for security and rate limiting, and API Center for discovery and governance.
The runtime path becomes: MCP Client -> API Management -> Logic App Standard -> Connectors -> Enterprise Systems. API Center sits alongside that path as the catalog and governance layer - it's where servers are registered and discovered, not a proxy that requests flow through.
You create the MCP server through API Center’s portal, which registers it as a governed API. API Management handles auth, rate limiting, and usage tracking. Other teams can discover your MCP server through the API Center catalog.
When to use this: Production environments. Multi-team organizations. Anything touching customer data. Compliance-regulated industries.
What you get that Direct doesn’t: Centralized discovery, rate limiting, usage analytics, API key management through APIM, and audit trails.
The Auth Problem Nobody Talks About
The docs mention Easy Auth like it’s a checkbox. It’s not. Here’s the setup:
- Create an app registration in Entra ID
- Note the application (client) ID
- Set the issuer URL with your tenant ID
- Configure the allowed token audience (with trailing slash - yes, the trailing slash matters)
- Set identity requirements (specific identities or any identity)
- Set tenant requirements (allow or deny cross-tenant)
- Set App Service authentication to allow unauthenticated access (yes, this is correct - Easy Auth handles it at a different layer)
- Choose your auth method: OAuth 2.0 or API Key
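For reference, those settings land in the App Service `authSettingsV2` configuration, roughly shaped like this (the IDs are placeholders; verify the issuer format and the trailing-slash audience against your own tenant and app registration):

```json
{
  "properties": {
    "globalValidation": {
      "unauthenticatedClientAction": "AllowAnonymous"
    },
    "identityProviders": {
      "azureActiveDirectory": {
        "registration": {
          "clientId": "<app-registration-client-id>",
          "openIdIssuer": "https://sts.windows.net/<tenant-id>/"
        },
        "validation": {
          "allowedAudiences": ["api://<app-registration-client-id>/"]
        }
      }
    }
  }
}
```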
API keys support three fixed expiration durations: 24 hours, 7 days, or 30 days. There's no custom duration and no automatic rotation. You generate a key, it expires, you generate a new one.
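Since rotation is entirely manual, it's worth tracking expiry yourself. A minimal sketch - the three durations mirror the portal's fixed options, and the three-day lead time is an arbitrary choice, not anything Azure enforces:

```python
from datetime import datetime, timedelta, timezone

# The three fixed expiration options offered in the portal.
DURATIONS = {
    "24h": timedelta(hours=24),
    "7d": timedelta(days=7),
    "30d": timedelta(days=30),
}

def rotation_deadline(
    issued_at: datetime,
    duration: str,
    lead: timedelta = timedelta(days=3),
) -> datetime:
    """Date by which a replacement key should be generated,
    leaving `lead` time before the old key expires."""
    return issued_at + DURATIONS[duration] - lead

issued = datetime(2026, 3, 1, tzinfo=timezone.utc)
print(rotation_deadline(issued, "30d"))  # 2026-03-28 00:00:00+00:00
```

Wire something like this into whatever reminder system you already use; the failure mode otherwise is an AI agent silently losing access when the key lapses.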
The governance gap the docs skip: When your MCP server calls a connector - say, the Outlook connector to send email - whose credentials are used? The answer: the credentials stored in the Logic App’s API connection. This means the AI agent sends email AS whoever set up the Outlook connection. If that’s a shared mailbox, fine. If that’s your personal account, every AI agent user sends email as you.
This is a governance question you need to answer before exposing any connector that creates, updates, or sends data. Read/query connectors are less risky. Write connectors need a service account or shared connection.
What Does This Actually Cost?
This is the part that surprises people coming from Consumption Logic Apps.
| Component | Consumption | Standard (MCP Server) |
|---|---|---|
| Hosting model | Pay per execution | Always-on App Service Plan |
| Minimum cost | $0 (idle = free) | ~$160/month (WS1 plan) |
| Scaling | Automatic, per-trigger | Manual or autoscale rules |
| MCP support | No (event-driven, no persistent endpoint) | Yes (persistent HTTP endpoint) |
| Connectors | Same 1,400+ | Same 1,400+ |
Standard Logic Apps run on a Workflow Service Plan or App Service Environment v3. The cheapest option (WS1) runs about $0.22/vCPU/hour. That’s roughly $160/month for the minimum configuration, running 24/7.
You’re paying for an always-on compute instance because MCP servers need a persistent HTTP endpoint. Consumption Logic Apps are event-driven - they spin up on trigger, execute, and shut down. You can’t run an MCP server on something that shuts down between calls.
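The arithmetic behind that estimate, as a quick sanity check (the WS1 rate is the figure quoted above - confirm current pricing for your region):

```python
# Back-of-envelope cost of an always-on WS1 instance.
rate_per_vcpu_hour = 0.22        # approximate WS1 rate, USD
hours_per_month = 24 * 30        # always-on, ~30-day month
monthly = rate_per_vcpu_hour * hours_per_month

print(f"~${monthly:.0f}/month")  # ~$158/month
```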
For dev/test with light usage, this is a real cost consideration. For production with multiple teams calling the server, $160/month is noise.
Five Things I’d Build With This
Now that every Logic App connector is an MCP tool, the interesting question is: which connectors become the most useful AI agent tools?
1. Dataverse query tool. Let Claude Code query your Dataverse tables by describing what you want in natural language. “Find all active cases assigned to Dr. Rahman” translates to a FetchXML query through the Dataverse connector. Huge for Power Platform developers who live in the terminal.
2. SharePoint document search. Your AI agent can search SharePoint document libraries, read file metadata, and pull content. Combine this with Azure OpenAI for a RAG pattern that doesn’t require a custom indexer.
3. Automated incident response. Connect to Azure Monitor alerts, ServiceNow tickets, and Teams notifications. When an alert fires, the AI agent reads the alert, creates a ServiceNow incident, and posts to the on-call Teams channel. All through MCP tools backed by Logic App connectors.
4. Approval workflow trigger. Start a Power Automate approval from your AI agent. “Send this deployment for manager approval” becomes a tool call that triggers the approval flow, waits for the response, and returns the result.
5. Multi-system data aggregation. Pull data from SQL, Dataverse, and a REST API in one agent conversation. Each query is a separate MCP tool. The AI agent orchestrates the calls and synthesizes the results. No custom middleware needed.
My Honest Take
This is a genuinely useful capability. The 1,400+ connector library is Logic Apps’ superpower, and exposing it to AI agents through MCP is the right move. I’ve been building MCP integrations manually for months - this eliminates most of that work.
But the execution has gaps:
Gap 1: Cost model mismatch. Most developers experimenting with MCP servers want something cheap or free for prototyping. Standard Logic Apps have a minimum monthly cost. Microsoft should offer a Consumption-compatible MCP endpoint, even if it’s limited.
Gap 2: Auth complexity. Easy Auth setup is 8 steps with tenant configuration, app registration, and audience URIs. For something called “Easy” Auth, it’s not easy. A one-click “secure this endpoint” option would help adoption.
Gap 3: Connector credential governance. The docs don’t address whose credentials the AI agent uses when calling write connectors. This is the first question any security team will ask.
Despite the gaps, I’d start using this today for read-only Dataverse and SQL queries in dev environments. For production write operations, wait until Microsoft clarifies the credential governance story - or build your own service account pattern.
The architecture diagram at the top of this article? Generated in seconds using the same diagramming pipeline I use for all my Azure architecture documentation. When the architecture changes - and it will, this is preview - I regenerate. The diagram stays current because updating it costs nothing.
For more on building Azure architecture diagrams from code, see Architecture Diagrams with Draw.io MCP and Claude. For the full enterprise diagramming approach, see 20 Architecture Diagrams in 20 Minutes.
Related articles
20 Architecture Diagrams in 20 Minutes: How AI Documents Enterprise Systems
Generate ERDs, network topologies, security models, CI/CD pipelines, and integration maps from code. The batch-generation approach that replaces weeks of Visio work.
Architecture Diagrams with Draw.io MCP Server and Claude Code
Generate swimlanes, ERDs, and integration maps from text using Claude Code and the Draw.io MCP server. Free, git-friendly, no Visio needed.
15 Rules for Perfect Architecture Diagram Arrows
Zero crossings, zero diagonals, 20px clearance, perfect fan-out symmetry. The 15 rules that separate professional diagrams from auto-generated mess.