Microsoft Fabric Project Advice: Getting into the Thick of It

Advice and lessons learned from working with consulting clients in the nearly one year since Fabric was released: how to approach architecting solutions, choose among Fabric assets and tools, get your organization ready for Fabric adoption, and develop, manage, and deploy successful Fabric solutions.

Microsoft Fabric Roadmap & Feature Status Report

I created this report to summarize the release status of all Fabric features documented in the Microsoft Fabric Roadmap. The information is collected and updated frequently from the Fabric Roadmap hosted by Microsoft and presented in a convenient, single-page Power BI report. Each feature area on the report links to the detailed documentation on the official roadmap site. For convenience, I’ve shortened the report path to TinyURL.com/FabricRoadmapReport.

Power BI Direct Lake and DirectQuery in the Age of Fabric

I just returned from the Microsoft Fabric Community Conference in Las Vegas. Over 4,000 attendees saw a lot of demos showing how to effortlessly build a modern data platform with petabytes of data in OneLake, and then ask Copilot to generate beautiful Power BI reports from semantic models that magically appear from data in a Fabric lakehouse. Is Direct Lake the silver-bullet solution that will finally deliver incredibly fast analytic reporting over huge volumes of data in any form, in real time? Will Direct Lake models replace Import models and solve the dreaded DirectQuery performance problems of the past? The answer is no, but Direct Lake can break some barriers. This post is a continuation of my previous post, “Moving from Power BI to Microsoft Fabric”.

Direct Lake is a new semantic model storage mode introduced in Microsoft Fabric, available to enterprise customers using Power BI Premium and Fabric capacities. It is an extension of the Analysis Services VertiPaq in-memory analytic engine that reads data directly from the Delta Parquet storage files in a Fabric lakehouse or warehouse.
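
To make that concrete, here is a minimal PySpark sketch of landing a Delta table in a Fabric lakehouse; the table and column names are hypothetical. In a Fabric notebook the spark session is predefined, and saving a managed table writes Delta/Parquet files under the lakehouse Tables node, which is exactly the storage a Direct Lake semantic model reads.

```python
# Fabric notebook (PySpark): the `spark` session is predefined.
# Hypothetical example: land a small sales fact table as Delta.
from pyspark.sql import Row

sales = spark.createDataFrame([
    Row(OrderID=1001, ProductKey=15, OrderDate="2024-03-01", SalesAmount=129.95),
    Row(OrderID=1002, ProductKey=22, OrderDate="2024-03-02", SalesAmount=89.50),
])

# Saving as a managed table writes Delta/Parquet files under the lakehouse
# Tables node; a Direct Lake model reads these files without importing them.
sales.write.mode("overwrite").format("delta").saveAsTable("fact_sales")
```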

Moving from Power BI to Microsoft Fabric

Fabric is here, but what does that mean if you are using Power BI? What do you need to know, and what, if anything, will you need to change if you are a Power BI report designer, developer, or BI solution architect? What parts of Fabric should you use now, and how do you plan for the near-term future? As I write this in March of 2024, I’m at the Microsoft MVP Summit on the Microsoft campus in Redmond, Washington, learning about what the product teams will be working on over the next year or so. Fabric is center stage in every conversation and session. To say that Fabric has moved my cheese would be a gross understatement. I’ve been working with data and reporting solutions for about 30 years and have seen many products come and go. Everything I knew about working with databases, data warehouses, and transforming and reporting on data has changed recently, but that doesn’t mean everyone using Power BI must stop what they are doing and adapt to these changes. The core product is unchanged. Power BI still works as it always has.

The introduction of Microsoft Fabric in various preview releases over the past two years has immersed me in the world of Spark, Python, Delta Parquet storage, lakehouses, and medallion data warehouse architectures. These technologies, significantly different from the SQL Server suite of products I’ve known and loved for the past twenty years, represent a major shift in direction, forming the backbone of OneLake, Microsoft’s universal integrated data platform that hosts all the components comprising Fabric. Microsoft built all of Fabric on top of the existing Power BI service, so all of the data workloads live inside familiar workspaces, accessible through the Power BI web-based portal (now called the Fabric portal).
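
To give a taste of those notebook workloads, here is a minimal medallion-style hop from a raw “bronze” table to a cleansed “silver” table in PySpark. The table and column names are illustrative assumptions, not from the post.

```python
# Fabric notebook (PySpark): a minimal bronze -> silver medallion hop.
# Table and column names are hypothetical.
from pyspark.sql import functions as F

bronze = spark.read.table("bronze_orders")            # raw landed data

silver = (
    bronze
    .dropDuplicates(["OrderID"])                      # basic cleansing
    .withColumn("OrderDate", F.to_date("OrderDate"))  # enforce types
    .filter(F.col("SalesAmount").isNotNull())         # drop unusable rows
)

silver.write.mode("overwrite").format("delta").saveAsTable("silver_orders")
```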

CI/CD & DevOps for Power BI… Are We There Yet?

In my view, projects and teams of different sizes have different needs. I described DevOps maturity as a pyramid, where most projects don’t require a sophisticated DevOps implementation and the most complex solutions do. DevOps maturity is a progression, but only for projects of a certain scale; one of the following options might simply be the best fit for a particular project.
Unless you are throwing together a simple Power BI report that you don’t plan to maintain or add features to, even the most basic managed project should start with a PBIX file or Power BI Project folder stored in a shared, cloud-backed storage location.
DevOps isn’t a requirement for every project, but version control and shared file storage definitely are.

Power BI for the Enterprise Precon: Atlanta Feb 9th

Please come to the Atlanta SQL Saturday BI Edition in February and join me for a full-day preconference event: Designing and Developing Enterprise-scale Solutions with Power BI.

Conference Sessions and Presentations in Late 2023 & 2024

Are you in the California Bay Area? Do you want to learn to do Power BI and Fabric the right…

Introducing the New Power BI Project, and Why It Is Important

The enterprise business intelligence and IT communities have been anxiously awaiting this capability for many years, and it has finally arrived – at least in preview form for testing and prototyping real Power BI CI/CD and DevOps solutions.

How Does Microsoft Fabric Change the Enterprise BI Game?

Microsoft Fabric (#MicrosoftFabric), the new unified analytics platform, was announced this morning, literally moments before the Build conference keynote. Satya Nadella made the official announcement in the keynote address. Why is this big news, and why is it important? In a few words, this is the most significant expansion to Power BI and the Microsoft analytics platform since the product was introduced. Satya said that Fabric is the most significant data platform innovation since SQL Server.

How Lakehouse Architecture is Revolutionizing Business Intelligence

The lakehouse is the evolution of the earlier cloud data platform that came in many pieces with “some assembly required”. All of those components are modern, mature, and capable, but they are complicated and require specialized skills. Imagine that the new version is easier to assemble, with instructions only a few pages long, illustrated with stick figures, and it comes with an Allen wrench.

We’re seeing consulting customers put lakehouse and BI solutions online in just a few weeks, and then iterate to scale up their modern data warehouse/BI platform while they train their Center of Excellence champions and data governance organizations.

DevOps & CI/CD for Power BI: Real-World Example

In a recent blog post, Paul discussed how DevOps principles, and specifically CI/CD, are essential for modern software development, and Power BI is no exception. Power BI CI/CD for enterprise-class projects can be achieved using Azure DevOps, with steps including source control, automated testing, and build and release pipelines.
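
As one illustration of a release step, here is a hedged Python sketch that promotes content between deployment pipeline stages using the Power BI REST API’s deploy-all operation. The pipeline ID and access token are placeholders, and the request body is a simplified assumption; consult the REST API documentation for the full schema.

```python
# Sketch of a release step: promote content from one deployment pipeline
# stage to the next via the Power BI REST API (Pipelines - Deploy All).
import requests

PIPELINE_ID = "<your-pipeline-id>"   # placeholder
ACCESS_TOKEN = "<aad-access-token>"  # placeholder; e.g., from a service principal

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        "sourceStageOrder": 0,  # assumed: 0 = Development stage
        "options": {"allowCreateArtifact": True, "allowOverwriteArtifact": True},
    },
)
resp.raise_for_status()
print("Deployment request accepted:", resp.status_code)
```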

Creating and Displaying a Last Refresh Date Measure

Business users often need to know how fresh the data is that they see on a Power BI report. This requirement can be interpreted to mean either “when was the last date & time that the source data was updated?” or “when was the last date & time that the data model was refreshed?”
I’ve used a few different approaches to record the actual date and time, but all of them generally use the same technique.
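
For a lakehouse-fed model, one possible implementation of that technique (an assumption on my part, not necessarily the exact approach from the post) is to write the load timestamp to a one-row Delta table during each data load, then surface it in the report with a simple DAX measure such as a MAX over the column. The names below are hypothetical.

```python
# Fabric notebook (PySpark): record the load time in a one-row Delta table.
# A DAX measure over LastRefreshUTC can then display data freshness.
from datetime import datetime, timezone

refresh_log = spark.createDataFrame(
    [(datetime.now(timezone.utc),)], ["LastRefreshUTC"]
)
refresh_log.write.mode("overwrite").format("delta").saveAsTable("refresh_log")
```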

Power BI and the Dimensional Dilemma

There is no secret about this. If you do any legitimate research about Power BI (reading blogs, books, or training from reliable sources), you will quickly learn that a lot of basic functionality requires a dimensional model, aka a “star schema”. This is a hard fact that every expert promotes, and one that self-taught data analysts either have learned or will learn through experience. So, if everyone agrees on this point, why do so many resist this advice?

Perspective is everything. I didn’t understand why getting to the star schema was so often out of reach until I was able to see it from another perspective. There are a few common scenarios that pull source data in directions different from an ideal dimensional model.

DevOps & CI/CD for Power BI

DevOps isn’t difficult to implement for small and medium-scale projects, and simple things like managing version control in a code repository can save hours of lost time. Organizations that are accustomed to managing large application development initiatives might expect a fully automated build and deployment process in concert with an Agile delivery process, managed with specialized tools like Jira, GitHub, and Azure DevOps.

Developing Large Power BI Datasets – Part 3 – Detail Tables

Power BI is architected to consume data in a dimensional model, with narrow fact tables and related dimensions. Introducing a big, wide table into a tabular model is extremely inefficient: it takes up space and memory resources, impacts performance, and complicates measure coding. Flattening records into one wide table is one of the worst things you can do in Power BI, and a common mistake made by novice users.
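
To illustrate the alternative, here is a hypothetical PySpark sketch that reshapes one wide, flat export into a narrow fact table plus a product dimension; all table and column names are illustrative.

```python
# Reshape a wide, flat table into a star schema: one dimension, one fact.
# Table and column names are hypothetical.
from pyspark.sql import functions as F

flat = spark.read.table("flat_sales_export")  # the big, wide table

# Dimension: one row per distinct product, with a generated surrogate key.
dim_product = (
    flat.select("ProductName", "Category", "Subcategory")
        .dropDuplicates()
        .withColumn("ProductKey", F.monotonically_increasing_id())
)

# Fact: keep only the key and numeric measures; drop wide descriptive columns.
fact_sales = (
    flat.join(dim_product, ["ProductName", "Category", "Subcategory"])
        .select("OrderID", "OrderDate", "ProductKey", "Quantity", "SalesAmount")
)

dim_product.write.mode("overwrite").format("delta").saveAsTable("dim_product")
fact_sales.write.mode("overwrite").format("delta").saveAsTable("fact_sales")
```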