In Part 1 of this blog, we extracted Audit Log data from Office 365 using PowerShell and stored it in SharePoint for the purposes of quantifying Power BI adoption. In Part 2 we will transform and visualize the Audit Log data using Power BI.
We’ll focus on connecting to the Audit Log data, transforming it from JSON into a more usable shape, and creating a couple of measures that help illustrate Power BI adoption.
Part 2 – SharePoint and Power BI
The following steps are executed using Power BI Desktop.
- Click “Get Data”. Select “SharePoint folder”. Click “Connect”.
- Enter your SharePoint site URL (from Part 1). Click “OK”.
- Click “Edit”. We’ll select the Audit Log csv files in the next step.
- Click the “Extension” column menu option. Click “Load More” if “.csv” is not visible.
- Filter “Extension” to only include “.csv” files. Click “OK”.
- Filter “Name” to only include “Audit_Log” named files. Click “OK”.
- Click the “Content” column. Click “Combine Files”.
- Click “OK” to the “Combine Files” screen.
- Remove all columns except “Column8”. This is the JSON Audit Log data.
- Click “Remove Rows”. Click “Remove Blank Rows”.
- Click the “Transform” tab. Click “Use First Row as Headers”.
- Select the “AuditData” column. From the “Transform” tab, click the “Parse” button and then click “JSON”.
- Click the “Expand Record” button (the icon with two arrows pointing in opposite directions).
- Click “OK”.
- Remove the following columns:
- AuditData.Id
- AuditData.RecordType
- AuditData.OrganizationId
- AuditData.UserType
- AuditData.UserKey
- AuditData.Workload
- AuditData.ClientIP
- AuditData.UserAgent
- AuditData.WorkspaceId
- AuditData.DatasetId
- AuditData.ReportId
- Rename all columns to remove the “AuditData.” prefix.
- Duplicate the “CreationTime” column and rename it to “Activity Date”. From the “Transform” tab, click “Detect Data Type”, then “Date”, then “Date Only”.
- Click “Remove Rows” and then “Remove Errors”.
- Click “Remove Rows” and then “Remove Duplicates”.
- Click “Close and Apply”.
Your “Advanced Editor” entries should look similar to those listed below.
let
    Source = SharePoint.Files("Your SharePoint site URL here", [ApiVersion = 15]),
    #"Filtered Rows" = Table.SelectRows(Source, each ([Extension] = ".csv") and ([Name] = "Audit_Log-2017-10-25.csv" or [Name] = "Audit_Log-2017-11-02.csv")),
    #"Invoke Custom Function1" = Table.AddColumn(#"Filtered Rows", "Transform File from Query1", each #"Transform File from Query1"([Content])),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "Transform File from Query1"}),
    #"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from Query1", Table.ColumnNames(#"Transform File from Query1"(#"Sample File"))),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1", {{"Source.Name", type text}, {"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}, {"Column5", type text}, {"Column6", type text}, {"Column7", type text}, {"Column8", type text}, {"Column9", type text}, {"Column10", type text}, {"Column11", type text}, {"Column12", type text}, {"Column13", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type", {"Source.Name", "Column1", "Column2", "Column3", "Column4", "Column5", "Column6", "Column7", "Column9", "Column10", "Column11", "Column12", "Column13"}),
    #"Removed Blank Rows" = Table.SelectRows(#"Removed Columns", each not List.IsEmpty(List.RemoveMatchingItems(Record.FieldValues(_), {"", null}))),
    #"Promoted Headers" = Table.PromoteHeaders(#"Removed Blank Rows", [PromoteAllScalars = true]),
    #"Changed Type1" = Table.TransformColumnTypes(#"Promoted Headers", {{"AuditData", type text}}),
    #"Parsed JSON" = Table.TransformColumns(#"Changed Type1", {}, Json.Document),
    #"Expanded AuditData" = Table.ExpandRecordColumn(#"Parsed JSON", "AuditData", {"Id", "RecordType", "CreationTime", "Operation", "OrganizationId", "UserType", "UserKey", "Workload", "UserId", "ClientIP", "UserAgent", "Activity", "ItemName", "WorkSpaceName", "DatasetName", "ReportName", "WorkspaceId", "ObjectId", "DatasetId", "ReportId"}, {"AuditData.Id", "AuditData.RecordType", "AuditData.CreationTime", "AuditData.Operation", "AuditData.OrganizationId", "AuditData.UserType", "AuditData.UserKey", "AuditData.Workload", "AuditData.UserId", "AuditData.ClientIP", "AuditData.UserAgent", "AuditData.Activity", "AuditData.ItemName", "AuditData.WorkSpaceName", "AuditData.DatasetName", "AuditData.ReportName", "AuditData.WorkspaceId", "AuditData.ObjectId", "AuditData.DatasetId", "AuditData.ReportId"}),
    #"Removed Columns1" = Table.RemoveColumns(#"Expanded AuditData", {"AuditData.Id", "AuditData.RecordType", "AuditData.OrganizationId", "AuditData.UserType", "AuditData.UserKey", "AuditData.Workload", "AuditData.ClientIP", "AuditData.UserAgent", "AuditData.WorkspaceId", "AuditData.DatasetId", "AuditData.ReportId"}),
    #"Renamed Columns" = Table.RenameColumns(#"Removed Columns1", {{"AuditData.CreationTime", "CreationTime"}, {"AuditData.Operation", "Operation"}, {"AuditData.UserId", "UserId"}, {"AuditData.Activity", "Activity"}, {"AuditData.ItemName", "ItemName"}, {"AuditData.WorkSpaceName", "WorkSpaceName"}, {"AuditData.DatasetName", "DatasetName"}, {"AuditData.ReportName", "ReportName"}, {"AuditData.ObjectId", "ObjectId"}}),
    #"Duplicated Column" = Table.DuplicateColumn(#"Renamed Columns", "CreationTime", "CreationTime - Copy"),
    #"Renamed Columns2" = Table.RenameColumns(#"Duplicated Column", {{"CreationTime - Copy", "Activity Date"}}),
    #"Changed Type2" = Table.TransformColumnTypes(#"Renamed Columns2", {{"Activity Date", type datetime}}),
    #"Extracted Date" = Table.TransformColumns(#"Changed Type2", {{"Activity Date", DateTime.Date, type date}}),
    #"Removed Errors" = Table.RemoveRowsWithErrors(#"Extracted Date", {"CreationTime"}),
    #"Removed Duplicates" = Table.Distinct(#"Removed Errors", {"CreationTime"})
in
    #"Removed Duplicates"
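For reference, each cell of the “AuditData” column holds a JSON record, and the “Parse JSON” and expand steps turn its fields into the “AuditData.”-prefixed columns. A quick Python sketch of that parse-and-expand, using an entirely made-up record (field names come from the expand list above; the values are hypothetical):

```python
import json

# A made-up example of one "AuditData" cell. Only a few of the
# fields are shown; the values are illustrative, not real audit data.
raw = '''{
  "CreationTime": "2017-11-01T14:23:07",
  "Operation": "ViewReport",
  "UserId": "user@contoso.com",
  "Activity": "ViewReport",
  "ItemName": "Sales Dashboard",
  "WorkSpaceName": "My Workspace",
  "DatasetName": "Sales",
  "ReportName": "Sales Dashboard"
}'''

# json.loads is the analogue of Json.Document: text in, record out.
record = json.loads(raw)

# Expanding the record turns each field into its own column,
# prefixed with "AuditData." just like Table.ExpandRecordColumn.
columns = {f"AuditData.{key}": value for key, value in record.items()}
print(columns["AuditData.UserId"])
```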
My Power BI Adoption report utilizes:
- KPIs for activity and user activity, week over week.
- Top 5 reports and users by activity.
- Activity and unique users by month.
- Granular report activity breakdown by month.
I’ll show the DAX measures required for unique users and the week-over-week KPIs. Everything else in the report is straightforward, and your imagination can guide you on how best to lay out the report.
Measure: Unique Users
- Create a new measure with the following code:
- Unique Users = DISTINCTCOUNT(AuditData[User Id])
- Note: I renamed “UserId” to “User Id” to make the report more user-friendly.
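DISTINCTCOUNT simply counts each distinct value once within the current filter context. A rough Python sketch of the same idea, using made-up activity rows:

```python
# Hypothetical activity rows (the user ids are made up).
activity = [
    {"UserId": "alice@contoso.com", "Activity": "ViewReport"},
    {"UserId": "bob@contoso.com",   "Activity": "ViewReport"},
    {"UserId": "alice@contoso.com", "Activity": "ViewDashboard"},
]

# Like DISTINCTCOUNT(AuditData[User Id]): each user counts once,
# no matter how many activities they logged.
unique_users = len({row["UserId"] for row in activity})
print(unique_users)  # → 2
```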
Measure: Unique Users Previous Week
- Create a new measure with the following code:
- UniqueUsers-WoW = CALCULATE(DISTINCTCOUNT(AuditData[User Id]), DATEADD(AuditData[Activity Date], -7, DAY))
- Note: I renamed “ActivityDate” to “Activity Date” to make the report more user-friendly.
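DATEADD shifts every date in the current filter context back seven days before the distinct count runs, so the measure returns the unique users from one week earlier. A rough Python analogy of that shift, with made-up rows and dates:

```python
from datetime import date, timedelta

# Made-up activity log: (activity date, user id).
rows = [
    (date(2017, 11, 6),  "alice@contoso.com"),
    (date(2017, 11, 6),  "bob@contoso.com"),
    (date(2017, 10, 30), "alice@contoso.com"),
]

def unique_users(rows, dates):
    """Distinct users whose activity falls on the given dates."""
    dates = set(dates)
    return len({user for day, user in rows if day in dates})

# Current filter context: the week starting Nov 6.
this_week = [date(2017, 11, 6) + timedelta(days=n) for n in range(7)]

# DATEADD(..., -7, DAY): shift every date back a week, then count.
last_week = [day - timedelta(days=7) for day in this_week]

current = unique_users(rows, this_week)   # → 2
previous = unique_users(rows, last_week)  # → 1
```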
Let’s put it all together with a KPI that compares unique-user performance against the previous week’s performance.
KPI: Users Week Over Week
- Add a new “KPI” data visualization.
- Set “Indicator” to “Unique Users”.
- Set “Trend axis” to “Activity Date”.
- Set “Target goals” to “UniqueUsers-WoW”.
My finished report is pictured below. I’ll revisit the report and the Audit Log data after more data has accumulated, with an eye toward seasonal trends, average visits per user per week, and “free” vs. “pro” license usage, and I’ll reassess my KPIs to ensure they continue to quantify adoption in a way that makes sense for my organization.
Thanks for stopping by.
NY