If you work with SQL in Fabric, you already know the pattern. Reporting data lands in Fabric SQL or Lakehouse tables, semantic models sit on top, and Power BI turns that into something business users understand. Planning has always been the odd one out, usually living in a separate tool that IT feeds with exports and integrations. Not anymore…
Announced today at FabCon (March 18th, 2026).
If you haven’t already, check out Arun Ulag’s hero blog “FabCon and SQLCon 2026: Unifying databases and Fabric on a single, complete platform” for a complete look at all of our FabCon and SQLCon announcements across both Fabric and our database offerings.
With Planning in Microsoft Fabric IQ, that separation disappears. Planning now sits directly on top of the same Fabric SQL and semantic models that developers and data engineers already maintain.
For SQL professionals, this unlocks a much cleaner architecture, less integration work and more predictable data flows.
Planning Uses Your Existing Semantic Models
Planning in Fabric IQ reads business logic directly from Power BI semantic models. SQL developers no longer have to replicate definitions or maintain special “planning exports.”
If your measures, reference tables and dimensions are modeled correctly, Planning uses them as its foundation. This eliminates drift between planning logic and reporting logic, something that has always been painful in disconnected systems.
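As a concrete sketch of what "modeled correctly" means here, consider a conformed dimension and a fact table that already back a semantic model measure such as a total-sales sum. The table and column names below are illustrative, not a prescribed schema:

```sql
-- Hypothetical star-schema objects; names are illustrative.
-- A semantic model measure like [Total Sales] = SUM(ActualSales)
-- defined over these tables is what Planning reads as its foundation.
CREATE TABLE dbo.DimCustomer
(
    CustomerKey  INT           NOT NULL PRIMARY KEY,
    CustomerName NVARCHAR(200) NOT NULL,
    Region       NVARCHAR(50)  NOT NULL
);

CREATE TABLE dbo.SalesActuals
(
    CustomerKey INT           NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
    OrderDate   DATE          NOT NULL,
    ActualSales DECIMAL(18,2) NOT NULL
);
```

Because Planning reuses these definitions directly, there is nothing extra to model: the same dimension keys and measures serve both reporting and planning.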
Writeback Lands in Fabric SQL
The biggest operational change for SQL developers is this: planning writeback does not land in a proprietary planning database. It lands in Fabric SQL tables that you control and can query.
This means:
- Forecasts and budgets are stored as regular tables
- Versioning and governance policies work the same way as other data
- You can join planning data with operational data without a round trip to another tool
- Downstream BI reports update automatically because the data lives in the same environment
A typical structure might look like:
```sql
SELECT
    d.CustomerKey,
    f.ForecastAmount,
    f.Version,
    a.ActualSales
FROM dbo.FinanceForecast f
JOIN dbo.DimCustomer d ON d.CustomerKey = f.CustomerKey
JOIN dbo.SalesActuals a ON a.CustomerKey = f.CustomerKey
WHERE f.Version = '2026-Base'
```
No more external APIs, sync jobs or file drops.
OneLake Makes Planning Data Instantly Available
Shortcuts and mirroring remove a major pain point for SQL developers. Data used by planners does not need a dedicated pipeline or internal copy. If the data is already in OneLake, Planning can use it immediately.
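For example, once a source lands in OneLake, the SQL analytics endpoint surfaces it like any other table, so planners' inputs can be queried with no load step. The table below is a hypothetical example of actuals exposed this way:

```sql
-- Hypothetical example: actuals surfaced via a OneLake shortcut.
-- No pipeline or copy; the shortcut table is queried like a local table.
SELECT TOP (10)
    CustomerKey,
    SUM(ActualSales) AS TotalActuals
FROM dbo.SalesActuals   -- reached through a shortcut, not loaded by ETL
GROUP BY CustomerKey
ORDER BY TotalActuals DESC;
```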
This avoids:
- Daily ETL loads
- Redundant staging areas
- Manual reconciliation work
For SQL developers, this is more predictable, more consistent and easier to operate.
A Single Environment From Actuals to Forecasts
Traditionally, SQL teams have had to maintain two parallel worlds.
One world holds actuals and historical performance.
Another world holds budgets and planning data from a separate system.
Planning in Fabric IQ merges these worlds:
- Actuals remain in Lakehouse and Fabric SQL
- Plans and scenarios write back into Fabric SQL
- Semantic models unify both
- Power BI reports read everything from the same data estate
This reduces the number of moving parts and makes lineage, governance and validation much easier.
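With actuals and plans side by side in the same Fabric SQL environment, a plan-versus-actuals variance check becomes a single query rather than a cross-system reconciliation. A sketch, with illustrative table and column names:

```sql
-- Variance between forecast and actuals, all in one environment;
-- table and column names are illustrative.
SELECT
    f.CustomerKey,
    f.ForecastAmount,
    SUM(a.ActualSales)                    AS ActualSales,
    SUM(a.ActualSales) - f.ForecastAmount AS Variance
FROM dbo.FinanceForecast f
JOIN dbo.SalesActuals a
    ON a.CustomerKey = f.CustomerKey
WHERE f.Version = '2026-Base'
GROUP BY f.CustomerKey, f.ForecastAmount;
```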
Planning as a New Input to AI and Automation
Planning introduces a new type of data for SQL developers to work with: intent data.
Targets, constraints, scenario assumptions and expected outcomes become tables that intelligent agents can read.
This shifts planning from an isolated workflow to a central part of automated decision support. For SQL developers, this means new opportunities to model features, feed scoring pipelines and support decision logic with richer context than just historical data.
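One hypothetical shape such intent data could take: targets, scenario labels and assumptions stored as ordinary columns, readable by agents and scoring pipelines like any other table. The schema below is an illustrative sketch, not a product-defined structure:

```sql
-- Hypothetical shape for "intent" data: targets, scenarios and
-- assumptions stored as plain rows that downstream logic can read.
CREATE TABLE dbo.PlanningTargets
(
    TargetKey      INT           NOT NULL PRIMARY KEY,
    Scenario       NVARCHAR(50)  NOT NULL,  -- e.g. '2026-Base'
    MetricName     NVARCHAR(100) NOT NULL,  -- e.g. 'Revenue'
    TargetValue    DECIMAL(18,2) NOT NULL,
    AssumptionNote NVARCHAR(400) NULL       -- free-text constraint or assumption
);
```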
Why SQL Developers Should Care
Planning in Fabric IQ is worth paying attention to because it simplifies several long-standing operational challenges:
- No more pipeline maintenance to feed external planning tools
- No more reconciliation work between planning and reporting
- No more duplicated metrics in separate systems
- No separate planning database that IT cannot fully control
- Writeback is now a standard Fabric SQL operation
- Planning logic aligns with semantic model logic
This is a major improvement for anyone who has had to support planning workflows while also maintaining clean, governed SQL environments.
Getting Started
Planning in Fabric IQ is available in preview. To experiment:
- Open a Fabric workspace
- Connect Planning to an existing semantic model
- Observe how writeback lands in Fabric SQL
- Integrate planning data with your existing T-SQL workloads
From there, it becomes clear that planning is no longer an external dependency. It is part of the platform, and SQL developers can finally treat it like any other governed dataset.