Keynote Speakers
Explore speakers' bios and topics.
Why DAX?
Details for Alberto’s keynote session are nearly complete. Get ready for his inspiring presentation.
How to grow as a data professional in the era of AI?
Details for Pawel’s keynote session are nearly complete. Get ready for his inspiring presentation.
Speakers
Explore speakers' bios and topics.
Power BI
Best practices for developing Power BI reports
In this session, we will look at the most important best practices when developing a Power BI report, and how these best practices can improve our lives as developers.
We will differentiate best practices in the following three areas:
- Power Query / M
- Data modeling & DAX
- Visualization
Within these areas, we will see which best practices have an impact on performance and which ones help you to better organize your artifacts and improve usability of the report and readability of your code.
Remember that best practices are mostly not for you, the developer who built the report. The main benefit comes when you open a report from someone else or inherit one from a colleague; that is when you start appreciating people who follow best practices.
Won’t you appreciate being handed a nicely organized report that fulfils the most important best practices? Make it your mission to stand by them when you build Power BI reports.
Power BI
Star Schema ALL the things! But why?
Perhaps you’ve seen “Star Schema ALL the things!”, “Never use Calculated Columns”, or “Bi-Directional relationships suck” before when thinking about design considerations for your data model, but you’ve never really stopped to think about the specifics behind them and why exactly they could benefit or hurt your model. Who knows, maybe that specific advice doesn’t even work out for the scenario at hand and you might not be aware because you’ve skipped a few steps in the process.
Come along in this journey from source to model to report using a practical mindset, thinking about the design decisions and ramifications along the way. At the core of the session lies the message to think about best practices, with the added step to test, assess, and benchmark what exactly they do for you.
Whether it is the decision of where your transformations need to be done, how exactly the data and tables need to be modelled, or what you allow the end users to do with your model, these are all important steps to take, preferably without shortcuts. We’ll take these steps on a moderately complex data model, and measure as we move along.
By the end of the session, we’ll have discussed why star schemas can help you, and how you can assess for yourselves whether they are beneficial for your use case.
Fabric
Efficient Data Partitioning with Microsoft Fabric: Best Practices and Implementation Guide
Imagine having a vast encyclopedia, and your task is to find a specific page containing a particular word. Instead of searching through the entire encyclopedia, you opt to navigate directly to the relevant section that might contain the desired information. This approach allows you to narrow down your search scope and quickly find the page you need. This concept of narrowing down the search space by dividing data into manageable sections is precisely what partitioning entails.
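The encyclopedia analogy can be sketched in a few lines of plain Python. This is an illustrative toy, not a Fabric API: it simply shows that when data is written out grouped by a partition key, a query only has to scan the matching section instead of everything.

```python
# Toy illustration of partitioning (plain Python, not a Fabric API).
rows = [
    {"year": 2022, "product": "A", "amount": 10},
    {"year": 2023, "product": "B", "amount": 20},
    {"year": 2023, "product": "C", "amount": 30},
    {"year": 2024, "product": "A", "amount": 40},
]

# "Write" the data partitioned by year, like folders year=2022/, year=2023/, ...
partitions = {}
for row in rows:
    partitions.setdefault(row["year"], []).append(row)

def query_amount(year):
    """Partition pruning: only the matching partition is scanned."""
    scanned = partitions.get(year, [])
    return sum(r["amount"] for r in scanned), len(scanned)

total, rows_scanned = query_amount(2023)
print(total, rows_scanned)  # 50 total, from scanning only 2 of the 4 rows
```

The same idea is what makes a partitioned lake table cheap to query: a filter on the partition column lets the engine skip whole sections of the data entirely.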
Power BI
A delightful Power Concert
In this session I will show you different ways of building a Power concert, where we make the different instruments in the Power Platform band play together in perfect harmony.
I will demo how we can make Power Apps and Power Automate drive actions directly from within Power BI, how we can connect Power Apps to Power BI reports, and how to get data from Power BI into Power Automate.
Combining these instruments will enable you to play power-full tunes that will make your users sing along.
It will be a demo-heavy session that should give you inspiration to compose your own power tunes 🙂
Power BI
From XL to S – Reduce your Power BI model size by 90%!
Have you ever wondered what makes Power BI so fast and powerful when it comes to performance? So powerful that it performs complex calculations over millions of rows in the blink of an eye.
In this session, we will dig deep to discover what is “under the hood” of Power BI, how your data is being stored, compressed, queried, and finally, brought back to your report. After the session, you will get a better understanding of the hard work happening in the background and appreciate the importance of creating an optimal data model in order to get maximum performance from the Power BI engine.
Finally, you will see a real use-case demo showing how the Power BI data model was reduced by 90%!
Power BI
C# Scripts for Tabular Editor are not that tough: Bringing Visual Studio into the party
If you are not a C# Jedi, trying to write C# scripts in Tabular Editor 2 is quite a challenge. At least it was for me. Over time I found out how to bring development over to Visual Studio, and how to bring the code back easily into Tabular Editor. In this session I demo how to set up the environment and start coding away in a much more comfortable way.
Power BI
Citizen data analyst hitchhiker’s guide to Power BI
You are passionate about data analysis and ended up installing Power BI, but what now? Where do you go, and what are all those terms?
Don’t worry, this session will guide you through the Power BI language.
What’s the difference between the Direct Query and Live Connect? Then…Data…Dataset – Dataflow – Datamart…What are all these?! “I’ve created a wonderful dashboard”. You mean, report? No…yes…I don’t know the difference ☹
Can I use DAX to create calculated columns in Power Query? And, how to create measures in Power Query? (No, you can’t do that, it’s a different language). I’ve installed the data gateway, now I need to choose the proper mode – I should go with personal, right?
In this session, we will demystify the most common sources of confusion among Power BI practitioners and explain differences and similarities between various Power BI terms and concepts. You’ll walk away with a clear understanding of the terminology and feel more confident talking Power BI!
Fabric, Power BI
Exploring Fabric Semantic Link for Power BI folks!
If you’re coming from a Power BI world, the whole Fabric thing might scare you a bit. Then suddenly, there is something called Semantic Link, which allows connections from Fabric Notebooks to read both data and metadata from your Power BI Semantic Model (dataset).
You might wonder: what is all this? How does it work, and how can you benefit from it? Especially when you’re not yet familiar with notebooks at all.
In this session we will explore what Semantic Link is, how you as a Power BI developer or engineer can make use of it, and how it will strengthen your solutions in the end. Together, we will explore various aspects like:
- Warming up your data in Direct Lake datasets
- Querying Dynamic Management Views
- Reading semantic model metadata to generate documentation
- Extending or even forecasting your data using Semantic Link
By the end of this session, you will not only grasp the basics of Semantic Link but also gain practical insights into its application, so you can put it into practice right away!
Fabric
Make Fabric the Gold layer of your multi-cloud medallion architecture!
While Microsoft Fabric is a full-blown data platform in its own right, you might already have invested in a modern data platform.
You still use Power BI of course – because that is the best tool out there.
That might come with a cost – such as slower reports or egress costs. Or you might miss out on some new functionality in your current solution that Fabric promises to deliver.
Microsoft has got you covered!
Come join me to see how Fabric can be positioned as your gold layer – or part of the gold layer in your data platform.
We will look at data mirroring, shortcuts, and all the nice little tricks available to get the best of both worlds: Your existing data platform and Microsoft Fabric!
Key take-aways:
Learn how Fabric can integrate with solutions such as Databricks, SQL data warehouses, or Snowflake to give you the best of both worlds.
Learn how data mirroring works in the different systems.
Understand how shortcuts and OneLake APIs can help you.
Fabric
Prepare Azure DevOps for your Microsoft Fabric needs
Join me for a session where I cover how you can prepare Azure DevOps for your Microsoft Fabric needs.
Topics covered include:
- Configuring Azure Repos for Microsoft Fabric Git integration
- Preparing Azure Pipelines to perform CI/CD for Data Warehouses
- Considerations to perform CI for Power BI reports within Azure DevOps
Demos are shown throughout the session.
Even though this session focuses on Microsoft Fabric, you can use a lot of the concepts in this session for other Microsoft Data Platform services. For example, you can apply some items covered in the CI/CD for Data Warehouses section to SQL Server or Azure Synapse Analytics SQL Pools.
At the end of this session you will walk away with a better understanding of how to prepare Azure DevOps for your Microsoft Fabric needs.
Power BI, Fabric
The Data Dojo: A Power BI Community of Practice
One of the best ways to improve data literacy and foster an active, passionate data culture within an organization is to establish a “Community of Practice” around the technologies the organization uses to process, store, analyze, and consume data.
In this session, we’ll learn about “The Data Dojo: A Power BI Community of Practice,” which was established at Des Moines University to facilitate the sharing of knowledge and experience among faculty and staff interested in analyzing the data generated by their teams and departments.
We’ll also talk about the Data Dojo’s egalitarian founding principles and unconventional structure, some of the topics we’ve covered at our workshops, what has gone right, what we could have done better, and what we’re planning for the future.
And finally, we’ll discuss how you can encourage data literacy and foster a vibrant data culture within your organization by establishing a Data Dojo of your own!
Fabric
Using Azure Event Grid with Fabric Notebooks
In this session we’ll cover how Azure Event Grid can be used to unify and automate your data loading workflows using Fabric Notebooks, including how messages can be utilized to communicate changes in state in the flow of data. Azure Event Grid can for instance be used to monitor changes in the layers of a data lake and trigger downstream processing tasks, handle logging, telemetry and much more. By utilizing messages to communicate actions within the workflows in the data lake, Azure Event Grid enables a more efficient and streamlined data processing pipeline. Data loading workflows can be automated and triggered in real-time, reducing manual intervention and improving overall efficiency.
In the context of a Fabric Notebook, we will cover the steps needed to configure and set up the “backend Azure stuff”, as well as configuring the workspace to enable the link to Azure Event Grid. Once that is configured, we will explore some of your options given this capability.
Specifically, we will look at how to use Azure Event Grid for:
- Logging data processing events
- Logging telemetry
- Logging sample data, data statistics, etc. (leveraging features from Spark)
Attending this session will leave you with an introduction to Azure Event Grid and its message structure. You will also learn how to utilize this to create a framework for automating data processing in Fabric Notebooks, as well as reporting statistics on top of the flows of data in your workspace.
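To make the message structure concrete, here is a hedged sketch of an Event Grid event as it might arrive when a file lands in a lake layer. The field names follow the standard Event Grid event schema; the subject, URL, placeholder values, and routing logic are hypothetical examples, not material from the session.

```python
# Hedged sketch of an Azure Event Grid event (Event Grid schema).
# Placeholder values like <sub-id> and the subject path are hypothetical.
event = {
    "id": "9f0b1c2d-0000-0000-0000-000000000000",
    "topic": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
    "subject": "/blobServices/default/containers/bronze/blobs/sales/2024/orders.parquet",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2024-05-01T12:00:00Z",
    "dataVersion": "2",
    "data": {
        "contentLength": 1048576,
        "url": "https://<account>.blob.core.windows.net/bronze/sales/2024/orders.parquet",
    },
}

def route(evt):
    """Hypothetical handler: route on eventType and subject to pick the
    downstream processing step for the affected lake layer."""
    if (evt["eventType"] == "Microsoft.Storage.BlobCreated"
            and "/containers/bronze/" in evt["subject"]):
        return "process-bronze-to-silver"
    return "ignore"

print(route(event))  # -> process-bronze-to-silver
```

A notebook consuming these events can use the `subject` path to decide which layer changed and which transformation to trigger next, which is the core of the event-driven workflow described above.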
Join me to learn about scalable automation in Microsoft Fabric using Azure Event Grid.
Power BI
Deep Dive into Power BI Source Control
2023 brought significant new capabilities to Power BI for source control and deployment automation – Developer Mode, TMDL, and Fabric Git Integration, to name a few. These are features many Pro users have been demanding for years.
With further extensions to those technologies planned for early 2024, this session is your opportunity to learn about the current state of source control for Power BI in a deep-dive workshop.
The session will focus on professional development patterns in a collaborative environment. It will explain and show how all the latest Pro Dev features in Power BI can be utilized to set up practical team development workflows.
Furthermore, attendees will understand where Microsoft’s tooling still has gaps, and how external tools can be brought in to help.
Fabric
Synapse vs. Fabric: An Azure Fight
You have data. Microsoft has technologies to process your data. But which ones are better for you?
In this fight hub, you’ll witness the major strengths and weaknesses of these technologies, as well as other technologies in the Microsoft ecosystem.
Disclaimer: Don’t forget the first rule of fight hub, you always talk about the fight hub.
Fabric
Supercharge Your Data Analysis with Azure Databricks and Microsoft Fabric Integration
In this session, we’ll explore the powerful combination of Azure Databricks and Microsoft Fabric. Many organizations are already using Azure Databricks for their data needs, and with Fabric, they can access and analyze that data even more efficiently. Specifically, we’ll cover how to:
- Access Azure Databricks Delta Tables in Fabric: With this, you can quickly create shortcuts to read the data and analyze it in Power BI via Direct Lake.
- Analyse OneLake data in Azure Databricks: This seamless process lets you access Fabric Delta tables in Azure Databricks, where they can be easily read and modified.
- Execute Azure Databricks notebooks from Fabric pipelines: Just like in ADF & Synapse, you can execute Azure Databricks notebooks from a Fabric pipeline.
- Use Dataflow Gen2 to ingest data from Azure Databricks: With this approach, you can ingest and prepare data managed by Azure Databricks.
Once you have your data or shortcut in OneLake, you can easily build insightful dashboards that help you make better business decisions. Join us to learn more about this exciting integration between Azure Databricks and Microsoft Fabric!
Fabric
End-to-End DnA Pipeline Project on Microsoft Fabric
In this session I would like to showcase the approach that we, the Bizztreat team, took while building an end-to-end Data and Analytics project for a Microsoft FastTrack partner, Zeelandia. The project was built with the intention to replace an older solution and provide a single source of truth for the company, using Microsoft Fabric as the heart of the solution.
I aim to showcase the hurdles that we encountered along the way, from designing the solution architecture all the way to implementation, and how we managed to leverage our custom ETL solution, which helped us overcome some of the Microsoft Fabric shortcomings that we ran into.
Fabric, Power BI
Real-Time Reporting in Fabric
In the fast-evolving landscape of data analytics, staying ahead with data insights is crucial for businesses to make informed decisions swiftly. Microsoft Fabric offers new methods of analyzing data in real time.
In this session, we will explore the various options for designing real-time datasets and reports—mainly focusing on the new tools at our disposal. We will cover topics like: DirectQuery for KQL, Eventstream, KQL Databases, Direct Lake, and more.
Attend this session to upgrade your toolbox with knowledge of what Real-Time Analytics in Fabric has to offer. After attending this session, you should be able to utilize and evaluate the different real-time possibilities in Microsoft Fabric and Power BI.
Topics:
- Real-time Analytics in Fabric
- Real-time reporting in Power BI
- Real-time storage modes
- Event-based architecture