Using PowerBI with Gearset's Reporting API

A guide showcasing how the Reporting API endpoints can feed into Microsoft PowerBI and how visualizations can be created to report on key areas

Written by Samuel Crossland
Updated over 2 weeks ago

When we think about collecting and visualizing data in a coherent manner, there are a few well-known tools out there to help, and one of those is Microsoft's PowerBI.

It can be used across multiple devices, ingest many different data types, and visualize that data in many different ways depending on who the audience is.

Gearset has some really powerful options available to retrieve key information on your Gearset usage and performance, such as the Reporting and Audit APIs. However, sometimes the raw data in JSON format won't be enough to convey the message you need to certain stakeholders or to support auditing, and that's where a transformation and visualization tool can come in really useful.

This article uses the Windows PowerBI Desktop application to query, transform and visualize the data from Gearset's Reporting API to create simple graphs for key metrics.

It's very likely you'll want to tweak the data transformation and resulting graphs to your needs, so treat this as a starting point to evolve for your own use cases.

Important notes:

  • This document references the Reporting API V1 to quickly showcase how you can construct queries and transform the data returned - as per this advice, all queries going forward should utilise Reporting API V2 to ensure protection against timeouts and scalability in future.

  • Be aware of the various Gearset API limits in place when constructing and executing these queries, namely the '10 requests per hour per endpoint' limit for the Reporting/Audit APIs.

Initial setup

  1. Open up a new Report in PowerBI.

  2. Click on 'Get data' in the 'Data' section of the top configuration panel and select 'Blank query'.

  3. You'll now be taken to the 'Power Query Editor' and have the opportunity to create multiple queries to build up this report. Click 'Advanced editor' in the 'Query' section of the top panel to allow us to customize the API call into Gearset and the resulting data.

    1. Using this editor allows us to utilize the Power Query M Language to structure our request, transformation, and resulting data output, in the form of multiple queries.
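Before diving into the full queries below, it helps to see the general shape every query in this article follows: build a URL, call it with Web.Contents and an Authorization header, parse the JSON response with Json.Document, and transform the result into a table. The snippet below is just an illustrative sketch of that pattern - the endpoint path is a placeholder rather than a real Reporting API route, and 'XXXXX' stands for your Gearset API token:

      let
          // Placeholder endpoint - the real Reporting API endpoints are shown in the sections below
          url = "https://api.gearset.com/public/reporting/<endpoint>",
          // Authenticate with your Gearset API token (replace XXXXX)
          headers = [ #"Authorization" = "token XXXXX" ],
          // Call the API and parse the JSON response
          result = Json.Document(Web.Contents(url, [Headers = headers])),
          // Shape the parsed record into a table ready for further transformation steps
          asTable = Record.ToTable(result)
      in
          asTable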

Reporting API - 'Deployment Frequency Aggregate' - Query

  1. The screenshot and code below showcase a query to the Deployment Frequency Aggregate endpoint for the last 30 days, where 'XXXXX' (after 'token') needs to be replaced with your Gearset API token (a short sketch at the end of these steps shows one way to keep the token in a single shared query):

    1. let
         // Allow easy switching of the date dynamically to get the last 30 days
         startDate = Date.ToText(Date.AddDays(Date.From(DateTime.LocalNow()), -30), "yyyy-MM-dd"),
         endDate = Date.ToText(Date.From(DateTime.LocalNow()), "yyyy-MM-dd"),
         urlWithoutDates = "https://api.gearset.com/public/reporting/deployment-frequency/aggregate" &
             "?StartDate={StartDate}T00%3A00%3A00.0000000Z" &
             "&EndDate={EndDate}T00%3A00%3A00.0000000Z" &
             "&Interval=Daily&GroupBy=TotalDeploymentCount",
         url = Text.Replace(Text.Replace(urlWithoutDates, "{StartDate}", startDate), "{EndDate}", endDate),
         headers = [ #"Authorization" = "token XXXXX"],
         result = Json.Document(Web.Contents(url, [Headers = headers])),
         Items = result[Items],
         Items1 = Items{0},
         #"Converted to Table" = Record.ToTable(Items1),
         #"Filtered Rows" = Table.SelectRows(#"Converted to Table", each ([Value] <> "count")),
         #"Expanded Value" = Table.ExpandListColumn(#"Filtered Rows", "Value"),
         #"Expanded Value1" = Table.ExpandRecordColumn(#"Expanded Value", "Value", {"Date", "Value"}, {"Date", "Value.1"}),
         #"Renamed Columns" = Table.RenameColumns(#"Expanded Value1",{{"Value.1", "Deployments"}}),
         #"Removed Columns" = Table.RemoveColumns(#"Renamed Columns",{"Name"}),
         #"Changed Type" = Table.TransformColumnTypes(#"Removed Columns",{{"Date", type datetime}, {"Deployments", Int64.Type}})
       in
         #"Changed Type"

  2. Once you hit Done and the query runs, you'll be presented with the following formatted table (the 'Applied steps' on the right panel correspond to the lines in our query above that start with #, indicating we're performing some transformation). This ensures we've got data that can nicely fit into the built-in table types.

  3. Exit the Query Editor panel to head back to the Report page, where the Data panel on the right-hand side will showcase the Date/Deployments information we've just retrieved and transformed.

    You now have the data available to visualize as required using the various graph/filtering capabilities of PowerBI, as outlined here!

  4. The example below shows a 'Stacked Column Chart' using the Month and Day attributes of the Date response, and Deployments associated with each one.

    1. You can then modify this as required with the available data and filtering/visualization options, and refresh the data (in accordance with the API limits) for updated information.
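One further note before the next example: both queries in this article hardcode the API token in the headers line. If you'd prefer to define it once and reuse it, one option (a sketch, assuming you create a separate blank query named GearsetApiToken - that name is just an example) is to have that query return only the token text and then reference it when building the headers:

      // Contents of a separate blank query named "GearsetApiToken" (example name), returning just the token text:
      "XXXXX"

      // Each reporting query can then build its headers from that shared value:
      let
          headers = [ #"Authorization" = "token " & GearsetApiToken ]
      in
          headers

This keeps the token in a single place if you end up with several Gearset queries in one report.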

Reporting API - 'Lead Time for Changes Aggregate' - Query

  1. Another example may be that you're looking for a 6 month view of your Lead Time for Changes for a particular Pipeline, aggregated per week and specifically focusing on the 'Mean' lead time. The query and screenshots below show how to achieve this. (Note: this query is slightly more complicated because it needs to convert the returned TimeStamp format into DecimalDays so PowerBI charts can recognise it appropriately; a short worked example of that conversion appears after these steps.)

    1. Make sure to replace the following:

      1. PipelineID - grabbed from the URL when viewing your Gearset Pipeline

      2. The API token placeholder (after 'token') with your specific API token

      3. You can also change the startDate and endDate to suit your needs

    2. let
         startDate = "2024-01-20",
         endDate = "2024-07-20",
         pipelineId = "<YourPipelineIDHere>", // REPLACE THIS

         // URL template with placeholders for StartDate, EndDate, and PipelineId. Aggregating Weekly
         urlWithoutDates = "https://api.gearset.com/public/reporting/lead-time/{PipelineId}/aggregate" &
             "?StartDate={StartDate}T00%3A00%3A00.0000000Z" &
             "&EndDate={EndDate}T00%3A00%3A00.0000000Z" &
             "&Interval=Weekly",

         // Replace placeholders with actual values in one line
         url = Text.Replace(Text.Replace(Text.Replace(urlWithoutDates, "{PipelineId}", pipelineId), "{StartDate}", startDate), "{EndDate}", endDate),
         // Configure the relevant API Token - REPLACE THIS
         headers = [ #"Authorization" = "token XXXXX"],
         result = Json.Document(Web.Contents(url, [Headers = headers])),
         Environments = result[Environments],
         Environments1 = Environments{0},
         MeanTimeLeadTimeForChanges = Environments1[MeanTimeLeadTimeForChanges],
         #"Converted to Table" = Table.FromList(MeanTimeLeadTimeForChanges, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
         #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"Date", "Value"}, {"Column1.Date", "Column1.Value"}),
         #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"Column1.Date", type datetime}, {"Column1.Value", type text}}),
         #"Renamed Columns" = Table.RenameColumns(#"Changed Type", {{"Column1.Date", "Date"}, {"Column1.Value", "MeanLeadTime"}}),

         // Define the function to convert DD.HH:MM:SS.MS or HH:MM:SS format to decimal days
         convertToDecimalDays = (timeSpan as text) as number =>
             let
                 // Check whether a day component precedes the time (e.g. "3.05:30:00") by looking for a "." before the first ":"
                 hasDayPart = Text.Contains(Text.BeforeDelimiter(timeSpan, ":"), "."),

                 // Split based on the format
                 parts = if hasDayPart then Text.Split(timeSpan, ".") else Text.Split(timeSpan, ":"),

                 // If there is no day component, set days to 0 and keep the whole value as the time
                 days = if hasDayPart then Number.FromText(parts{0}) else 0,
                 remainingTime = if hasDayPart then parts{1} else timeSpan,

                 // Split remaining time into hours, minutes, and seconds
                 timeParts = Text.Split(remainingTime, ":"),
                 hours = Number.FromText(timeParts{0}),
                 minutes = Number.FromText(timeParts{1}),

                 // Calculate decimal days
                 decimalDays = days + (hours / 24) + (minutes / 1440),

                 // Round to 2 decimal places
                 roundedDecimalDays = Number.Round(decimalDays, 2)
             in
                 roundedDecimalDays,

         // Add the DecimalDays column by applying the custom function and setting the data type to a decimal number for charting
         #"Added Decimal Days" = Table.AddColumn(#"Renamed Columns", "DecimalDays", each convertToDecimalDays([MeanLeadTime]), type number),
         #"Converted to Fixed Decimal" = Table.TransformColumnTypes(#"Added Decimal Days", {{"DecimalDays", type number}})
       in
         #"Converted to Fixed Decimal"
  2. Once you hit Done and the query runs, you'll be presented with the following formatted table including the applied steps transforming the results.

    1. As above, this ensures we've got data that can nicely fit into the built-in table types. In this particular example we've done some extra processing to extract 'DecimalDays' from the returned DD.HH:MM:SS.MS format for use in charting, but of course you could modify this depending on the scale you require.

  3. Exit the Query Editor panel to head back to the Report page, where you can see the new query results in the 'Data' panel on the right, with selectable options for each column we transformed. Select your chart and customisations for the Date/DecimalDays columns we've retrieved.

  4. The example below shows a Line Chart summarising the Month/Day across the 6 month period specified, with the Mean Lead Time (in DecimalDays) plotted on the Y-axis.

  5. We can see it really spiked in mid-June, which gives us a clear prompt to go and investigate why that occurred, and shows how powerful these visualizations can be!
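As a quick sanity check on the DecimalDays conversion used in the query above, here's the same arithmetic applied to a couple of illustrative sample values (these are made-up timestamps, not real API output):

      let
          // "2.05:30:00" -> 2 days, 5 hours, 30 minutes
          example1 = Number.Round(2 + (5 / 24) + (30 / 1440), 2),  // 2.23
          // "00:45:00" -> 0 days, 0 hours, 45 minutes
          example2 = Number.Round(0 + (0 / 24) + (45 / 1440), 2)   // 0.03
      in
          {example1, example2}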

Summary

As you can see, PowerBI has the capability to retrieve various data points from the Gearset APIs, and visualize them in different ways to satisfy your requirements for both performance-based tracking and auditing.

While we haven't covered every possible endpoint and transformation/visualization opportunity in this document, it should serve as a foundation for you to build up the right set of dashboards to meet your needs as a team.
