Important notes:
The scripts in the linked repository are provided 'as-is' on an open source basis. Please see the README and Apache 2.0 Licence here for further details.
These scripts are ancillary to (and not a part of) the Services provided by Gearset, so you assume full responsibility for using and building on these foundational scripts to fit your business requirements, and for testing/maintaining/troubleshooting the code running in your specific circumstances.
When it comes to collecting and visualising data in a coherent manner, there are a few well-known tools to help, both paid and free, but one that stands out in the cloud-based world is Google Sheets.
As well as being a classic spreadsheet application, it works across multiple devices, ingests many different data types, and can visualise that data in many different ways depending on who the audience needs to be.
Gearset has some really powerful options for retrieving key information on your Gearset usage and performance, such as the Reporting and Audit APIs, and includes the API definitions to help you understand the structure and build automation against it. However, sometimes raw data in JSON format won't be enough to convey the message you need to certain stakeholders or for auditing, and that's where a transformation and visualisation tool comes in really useful.
This article uses Google Sheets to query, scaffold and visualize the data from Gearset's Reporting API to create simple graphs for key metrics, across multiple sheets and API endpoints. It leverages Google Apps Script (a version of JavaScript) to conduct the queries and place the data in the sheets, and 'triggers' to set this up on a recurring schedule.
It's very likely you'll want to tweak the data transformation and resulting graphs to your needs, so treat this as a starting point to evolve for your own use cases!
Important notes:
This document references the Reporting API V2 because, per this advice, all queries going forward should use Reporting API V2 to ensure protection against timeouts and scalability in future. This means the scripting includes more steps, due to the asynchronous query/poll/retrieve pattern.
Be aware of the various Gearset API limits in place when constructing and executing these queries, namely the '10 requests per hour per endpoint' to query the Reporting/Audit API as you test/implement these regular retrievals.
The scripts provided as part of this document (JavaScript) will likely need modifying for your specific use cases and implementation details, along with decisions on how often to query the endpoints and recalculate the data for your organizational needs. Also review the specific columns/rows/cells referenced in certain parts of the scripts, which may need changing depending on your formatting requirements.
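The asynchronous query/poll/retrieve pattern mentioned above looks roughly like the sketch below. The endpoint paths, response fields and polling flow here are illustrative assumptions rather than the exact Reporting API contract, and the HTTP function is injected so the same logic can be exercised anywhere (in Apps Script you'd pass a wrapper around UrlFetchApp.fetch, and use Utilities.sleep for the delay):

```javascript
// Sketch of the Reporting API V2 async pattern: start a query, poll its
// status, then fetch the results. Endpoint paths and response shapes are
// illustrative assumptions - check the Reporting API documentation for
// the real ones. `fetchJson(method, url)` and `sleepMs(ms)` are injected
// so this works with UrlFetchApp in Apps Script or any other HTTP client.
function runV2Query(fetchJson, startUrl, sleepMs) {
  // 1. Kick off the aggregation query; assume the API returns an operation id.
  const started = fetchJson('POST', startUrl);
  const statusUrl = startUrl + '/' + started.operationId;

  // 2. Poll until the server reports the query has finished.
  let status = fetchJson('GET', statusUrl);
  while (status.state === 'Running') {
    sleepMs(5000); // pause between polls to stay well inside rate limits
    status = fetchJson('GET', statusUrl);
  }

  // 3. Retrieve the finished result set.
  return fetchJson('GET', statusUrl + '/results');
}
```

This is why the scripts in the repository contain more moving parts than a single synchronous GET would need.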
Initial setup:
Create a new Google Sheet (basic guide here), named appropriately and with relevant sharing permissions
Consider which different sheets you may want across the bottom (in this document we'll be using one sheet per endpoint to store and visualise the data, e.g. the 1st sheet for Deployment Frequency, the 2nd sheet for Lead Time for Changes, etc.)
Ensure you have the relevant permissions to access 'Apps Script' in the Extensions menu, as this is where the scripts/triggers will be configured to automatically retrieve information.
Ensure you have a Gearset API Token for the APIs required, and have proven a query functions as expected with the dynamic documentation for the Reporting API.
Ensure you have access to relevant parts of the Gearset Platform, such as Pipelines, as you'll require unique identifiers such as the Pipeline ID and Environment IDs.
Reporting API - 'Deployment Frequency Aggregate (V2)' - Automated
Select the relevant sheet for 'Deployment Frequency', which in the examples going forward is the 1st sheet
Set a relevant title in row 1, an area to show the last updated time on row 2, and 'Date'/'Number of Deployments' headers on row 4, ready for data to be retrieved and leveraged in row 5 onwards.
Ensure the 'Last updated' cell (B2 in our examples) is formatted to 'Number > Date time'
Click on 'Extensions' and 'Apps Script' from the top menu bar, which will open a new page to configure the script/trigger. Add a new file on the left hand side named appropriately (I've kept these separated per endpoint call and target sheet, so in this case it's DeploymentFrequency(V2).gs)
Remove the existing placeholder code generated, download the following file, and copy in the code block:
Repository here - file - 'google-sheets/DeploymentFrequency(V2)-GSheetsIntegration.js'
Key code areas to be aware of or that must be modified:
Line 9 - this currently uses the first sheet in the list to insert the data into, so if you've used a different sheet, you'll need to modify this to match its position (the first sheet is index 0, the second is 1, and so on, following the usual zero-based array indexing in most programming languages)
Line 11/14/18 - these use specific columns/rows/cells to clear and insert relevant data, so may need modifying depending on your setup in step 2.
Line 23 - uses '180' as a default number for how many days back we want to start the query from. You can modify this number to shorten the timeframe appropriately.
Line 30 (Needs modifying) - this is the unique Authorization header you've created to access the Gearset API; the 'XXXXXX' will need replacing with your token.
Line 36 - we aggregate weekly here, although this can be modified if you wish to change the aggregation timeframes (Daily or Monthly are other options)
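To see how those configurable values fit together, here is a rough sketch of how the request for this endpoint could be assembled. The URL, header scheme and property names are assumptions for illustration only; mirror whatever the repository script actually uses:

```javascript
// Illustrative sketch of assembling the Deployment Frequency request from
// the configurable values called out above. The endpoint URL, the 'token'
// Authorization scheme and the query property names are assumptions - the
// repository script is the source of truth.
function buildDeploymentFrequencyRequest(apiToken, daysBack, interval) {
  const endDate = new Date();
  // Line 23 equivalent: look back a configurable number of days (180 default)
  const startDate = new Date(endDate.getTime() - daysBack * 24 * 60 * 60 * 1000);
  return {
    url: 'https://api.gearset.com/public/reporting/deploy-frequency/aggregate', // illustrative path
    headers: { Authorization: 'token ' + apiToken }, // line 30 equivalent: your real token
    query: {
      StartDate: startDate.toISOString(),
      EndDate: endDate.toISOString(),
      Interval: interval // line 36 equivalent: 'Weekly', or 'Daily'/'Monthly'
    }
  };
}
```

In Apps Script the returned object would then be turned into a UrlFetchApp call, but keeping the configuration in one place like this makes the values easier to tweak.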
Save the file (you may need to allow permissions here depending on your Google settings), and ensure the 'function to run' at the top is selected to be 'callDeploymentFrequencyAPIV2AndPoll'.
Click Run to fire off the request and monitor the Execution log for any issues.
If you encounter any issues, be sure to check the dynamic documentation for any response codes that could indicate an error in your query, and reach out to the in-app chat for anything further.
If the query has been successful, checking back in your Google Sheet, rows 5 onwards for columns A/B should be populated with dates and numbers of deployments. Highlight the relevant columns/rows, then click 'Insert' > 'Chart'.
Sheets may then generate a recommended chart for this data (usually a line chart) with relevant axes, labelling and colours. If not, you can select the chart type that suits your needs. All of these components can be edited to make the chart look as required.
Once that is confirmed as working, if you wish to regularly retrieve this data, you can consider adding in a 'Trigger' to run the function regularly.
Go back to your scripting tab, and click the clock icon on the left hand menu to get to Triggers.
Click 'Add Trigger' and configure the relevant settings shown below:
Function to run - callDeploymentFrequencyAPIV2AndPoll
Event source - Time-driven so we can schedule
Type of time based trigger - this could be minutes/hours/days, but hours in my example
Select <Time> interval - in accordance with the API Limits, configure how often you want this call to run. The screenshot shows this trigger will run every hour.
Click Save and you'll see the trigger appear in the list. Depending on your schedule setup, it will run at the next valid interval (e.g. if hourly, it will run 1 hour from the time you save).
Ensure the data has refreshed in your Google Sheet (you should be able to tell from the 'Last updated:' date/time field; if the query hasn't changed, the returned data should be the same), and modify the charts to your needs. As new data is inserted each time the trigger runs, the chart will update accordingly.
Reporting API - 'Lead Time for Changes Aggregate (V2)' - Automated
We'll be following a lot of similar steps to the section above 'Reporting API - 'Deployment Frequency Aggregate (V2)' - Automated', but with some key changes as follows:
Make sure you create a new sheet for Lead time - in my example it's the 2nd sheet
Scaffold the sheet as follows (title, last updated time, and relevant headers for the information we need to make the chart, which includes columns for converting into decimal days)
Head into Extensions > Apps Script, make a new file called LeadTime(V2).gs.
Remove the existing placeholder code generated, download the following file, and copy in the code block:
Repository here - file - 'google-sheets/LeadTime(V2)-GSheetsIntegration.js'
Key code areas to be aware of or that must be modified:
Line 9 - this currently uses the second sheet in the list to insert the data to, so if you've used a different sheet, you'll need to modify this to align with the order (first sheet would be 0, second would be 1 etc)
Line 11 (Needs modifying) - this '<CIJobName>' variable needs replacing with the CI Job Name of your Final environment in the Pipeline to filter the results accordingly.
Lines 30-38 - these use specific columns/rows/cells to clear and insert the relevant data for the multiple elements we'll retrieve from this endpoint, so may need modifying depending on your setup in step 2.
Line 43 - uses '180' as a default number for how many days back we want to start the query from. You can modify this number to shorten the timeframe appropriately.
Line 50 (Needs modifying) - this is the unique Authorization header you've created to access the Gearset API; the 'XXXXXX' will need replacing with your token.
Line 55 (Needs modifying) - grab the Pipeline ID of the pipeline whose lead time you want to track, and replace 'YYYYYY' with that ID.
Line 58 - we aggregate weekly here, although this can be modified if you wish to change the aggregation timeframes (Daily or Monthly are other options)
Line 69 - currently the polling is set to be every 5 seconds as per the advice here on rate limiting
Save the file and ensure the 'function to run' at the top is selected to be 'callLeadTimeAPIV2AndPoll'.
Click Run to fire off the request and monitor the Execution log for any issues. You will see some automatic conversions taking place to format the durations, but as long as you see 'Execution completed' without any errors, the data should be in your sheet.
If you encounter any issues, be sure to check the dynamic documentation for any response codes that could indicate an error in your query, and reach out to the in-app chat for anything further.
If the query has been successful, checking back in your Google Sheet, rows 5 onwards should be populated with dates and the mean/max/min lead times in the relevant columns.
The information should already be in date order, and the provided code outputs lead times in DD:HH:MM format, but we still have to do some transformation so Google Sheets and its charting can understand it.
As mentioned above, the lead times aren't in a format Sheets can chart directly, so the easiest way to resolve that is to convert everything into 'decimal days' (e.g. 1 day and 12 hours becomes 1.5 days)
Note: Depending on your velocity of delivering features, you may wish to round to the nearest minute instead, so modify the conversion to suit your needs.
Use the following function to achieve this in cell E5 (depending on scaffolding above).
=ROUND(VALUE(LEFT(B5, FIND(":", B5) - 1)) + VALUE(MID(B5, FIND(":", B5) + 1, 2)) / 24 + VALUE(RIGHT(B5, 2)) / 1440, 2)
This converts the DD:HH:MM value into decimal days with 2 decimal places; you can then hover over the bottom-right corner of the cell and drag the fill handle to replicate it across the other columns/rows.
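If you'd rather do this conversion inside the Apps Script before writing values to the sheet, the same logic as the formula above looks like this in JavaScript (a hypothetical helper, not part of the repository script):

```javascript
// Hypothetical helper: convert a 'DD:HH:MM' duration string into decimal
// days, mirroring the sheet formula (hours / 24, minutes / 1440, rounded
// to 2 decimal places).
function toDecimalDays(ddhhmm) {
  const [days, hours, minutes] = ddhhmm.split(':').map(Number);
  return Math.round((days + hours / 24 + minutes / 1440) * 100) / 100;
}
```

For example, '1:12:00' (1 day 12 hours) converts to 1.5, matching the decimal-days example earlier.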
Now we have some usable numbers, use multi-select (Ctrl/Cmd-click) to highlight the date column and the three decimal-days columns, then insert a chart.
Make sure the chart axes are configured correctly (e.g. date on the x-axis, lead time on the y-axis) and complete any formatting to showcase the data how you wish.
Once that is confirmed as working, if you wish to regularly retrieve this data, you can consider adding in a 'Trigger' to run the function regularly.
Go back to your scripting tab, and click the clock icon on the left hand menu to get to Triggers.
Click 'Add Trigger' and configure the relevant settings shown below:
Function to run - callLeadTimeAPIV2AndPoll (Note: depending on other scripts, you may get a duplicate function warning here which can safely be ignored as we're using individual files for each call in this instance)
Event source - Time-driven so we can schedule
Type of time based trigger - this could be minutes/hours/days, but days in this example
Select <Time> interval - in accordance with the API Limits, configure how often you want this call to run. The screenshot shows this trigger will run every day between 12-1am.
Click Save and you'll see the trigger appear in the row. Depending on your schedule setup, it will run at the next valid step (e.g if hourly, it will run 1 hour from the time you save).
Ensure the data has refreshed in your Google Sheet (you should be able to tell from the 'Last updated:' date/time field; if the query hasn't changed, the returned data should be the same), and modify the charts to your needs.
As new data is inserted when the regular trigger runs, the chart will update accordingly, but depending on how much data comes out, you may need to redo steps 10-12 to order/transform it.
What do I need to retrieve and how could I visualize/analyze it?
We've talked through some examples of endpoints above that we can query from Gearset's Reporting API and possible ways to visualize them, but of course the exact data you want and why will be dependent on internal organizational requirements. Some general considerations on the key DORA Metrics are as follows:
Deployment Frequency -
This measures how often deployments occur; a high frequency can indicate efficient processes and agile throughput to production.
Importance: Frequent deployments help teams quickly respond to customer needs and ensure smaller, more manageable changes.
Potential Chart Type: Line Chart showing deployment frequency over time (e.g., daily, weekly, or monthly) to identify trends over specific periods.
Lead Time for Changes - Time from code committed to code deployed. Shorter lead times indicate an efficient process from development to production.
Importance: Lower lead times improve the speed at which teams can deliver value and address issues.
Potential Chart Type: Line or Area Chart displaying average lead time per deployment or time period, highlighting areas where improvements can be made.
Change Failure Rate - Percentage of deployments causing failures (e.g., bugs, rollbacks). Lower rates mean more stable deployments.
Importance: A lower failure rate indicates reliable deployments and fewer disruptions for users.
Potential Chart Type: Stacked Bar Chart to show the total deployments versus failed ones over time, providing insight into stability trends.
Time to restore - Average time to restore service after an incident. A faster MTTR (Mean Time to Recovery, an alternative term) reflects a team’s resilience and recovery efficiency.
Importance: A low MTTR means faster issue resolution, minimizing downtime and user impact.
Potential Chart Type: Line or Area Chart showing MTTR over time, allowing trends to be easily identified and improvements tracked.
There is currently one other endpoint of the Reporting API that doesn't fall under the DORA metrics but should be considered too:
Continuous Integration Runs - this can give really useful information on success/failures, performance, and static code violations (if configured) for particular CI jobs you have configured in Gearset. Note: you'll need to grab the jobID from the Continuous Integration dashboard to use this endpoint.
Importance - having insight into a particular CI Job in your workflow can be really useful for a variety of reasons, especially if it's a key integration point for multiple pieces of work and heavily in use on the flow to production.
Proposed Chart Type - Stacked Bar Chart to show the total runs and comparison between success/failure over time, or Line Chart to highlight the 'NumberOfStaticCodeAnalysisIssues' detected over time to highlight improvement/degradation in code quality.
Summary
As you can see, Google Sheets has the capability to retrieve various data points from the Gearset APIs and visualize them in different ways to satisfy your needs for both performance-based tracking and auditing. You also have the flexibility to call the APIs manually or on a scheduled trigger, meaning (with further configuration) you could automate regular updates to key metrics and showcase them to relevant stakeholders.
While we haven't covered every possible endpoint and transformation/visualization opportunity in this document, it should serve as a foundation for you to build up the right set of dashboards to meet your needs as a team. If you have any questions/feedback please reach out to our In-App chat team!