Low-code adoption of the Gearset APIs
Written by Andy Barrick

Just like any engineering process, software delivery should be monitored. At the simplest level, there's the immediate feedback of whether a deployment succeeded or failed, and adopting a platform like Gearset should remove much of the doubt around that. You might well still encounter failed validations, but they'll be more visible and occur earlier in the process - this is the core of DevOps.

Once you're comfortable that you're deploying more regularly and with greater success, there's still probably scope for improvement. High performing teams across all engineering disciplines, not just software, aim to continually improve their processes and their outputs. If your organization implements anything from timeboxed OKRs to sprint retrospectives, having empirical data to back up targets or observations makes things much easier.

Gearset provides a range of APIs covering many aspects of reporting and auditing, to give you greater insight into your engineering process. We already have great blog posts and documentation about these APIs and the specifics of each of the endpoints, so this won't duplicate any of that detail. Instead, we're going to look at a different angle.

Many teams have no doubt about the value of tracking metrics for their software engineering process, but don't necessarily have the resources to implement code-driven consumption of the Gearset APIs. Here, we'll look at some alternatives that give you access to the data without needing dedicated development resource, replacing it with a repeatable manual process.

First of all, it's important to note that any mention of a third-party service here doesn't constitute a formal recommendation from Gearset. You shouldn't take any commercial decisions on such services based on a reference to them in this article. All options mentioned were as described at the time of writing; subsequent changes to commercial terms for any of these products may mean they become unsuitable. Always research such offerings independently to ensure they comply with all your organization's policies.

Token creation

The first stage in adopting the APIs is to create an access token. This is done in your Gearset account, under My Account -> Access token management. Again, there's existing documentation on that process, linked from the page itself.

However, it's really important to reiterate that your token should be stored securely. When you're using the token programmatically, that's still a concern, but typically the credential would be held in a vault-style store within the development platform, and from then on the actual token itself wouldn't need to be known - just the reference to it in the vault.

Of course, in a low/no-code setup, you're going to need to provide the token manually when you call the APIs. It's definitely worth checking with your organization's IT or Security teams whether there's a credentials/secrets manager application that could be used, or what they'd otherwise recommend for handling such tokens. Keeping the token in a file somewhere, or writing it down, isn't an option.
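If a small amount of scripting is available to you, one common pattern is to keep the token out of the script entirely and read it from an environment variable set by your secrets manager. A minimal sketch, assuming the variable is called GEARSET_TOKEN (the name is just an example):

```python
import os

# Read the Gearset access token from the environment rather than
# hard-coding it; GEARSET_TOKEN is an illustrative variable name.
token = os.environ.get("GEARSET_TOKEN")
if not token:
    raise SystemExit("Set the GEARSET_TOKEN environment variable first.")
```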

Manually calling the endpoints with Postman

There's a wide range of offerings that will help you call API endpoints, but one of the most popular is Postman. Various tiers of functionality are currently offered, but occasional calling of APIs can be done for free. You can even create a team of up to three people to share the load and responsibility for retrieving the Gearset data without having to worry about credential sharing.

Once within Postman, you can create a Workspace, and from there, call the Gearset APIs. As all the endpoints are for retrieving data from Gearset, all calls will use the HTTP GET method. The base URL is https://api.gearset.com followed by the path shown in the Gearset documentation. For example, the Deployment frequency data is accessed at https://api.gearset.com/public/reporting/deployment-frequency. This will be the URL to which the request will be sent.

The next stage is to add the access token to the request. To do this, go to the Headers tab and, on the empty line provided at the end, add a header called Authorization with a value of token <your token>.

It's vital that token is spelt entirely in lower case, otherwise you'll get an authorization failure. Also, the angle brackets above are simply for clarity - the token shouldn't have those around it.
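If you'd like to sanity-check the header format outside Postman, here's a minimal sketch using Python's requests library. It calls the Deployment frequency endpoint from earlier, and assumes the token is in an environment variable as in the previous sketch:

```python
import os
import requests  # third-party library: pip install requests

token = os.environ["GEARSET_TOKEN"]  # illustrative variable name

# "token" must be lower case, then a space, then the raw token -
# no angle brackets around it, and not "Bearer".
headers = {"Authorization": f"token {token}"}

response = requests.get(
    "https://api.gearset.com/public/reporting/deployment-frequency",
    headers=headers,
)
print(response.status_code)
```

If the endpoint has mandatory parameters, expect an error response until they're supplied - parameters are covered next.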

Most of the endpoints have parameters - some mandatory, some optional - which you may need to supply in order to get the relevant data. These can be supplied via the dedicated Params tab, and Postman will automatically update the URL to reflect what you've added.

Note too that the date format is ISO 8601, but the supplied dates must be in UTC and therefore end with a Z. Time zone offsets, e.g. 2023-12-04T00:00:00+0500, are not currently supported; this would have to be expressed as 2023-12-03T19:00:00Z instead.
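If you're generating these timestamps by script, converting to UTC first avoids the problem entirely. A small sketch of that conversion; the StartDate/EndDate parameter names here are illustrative, so check the documentation for the exact names your endpoint expects:

```python
from datetime import datetime, timedelta, timezone

def to_utc_z(dt):
    """Convert an aware datetime to the UTC 'Z' form the API expects."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# The +05:00 example above becomes 2023-12-03T19:00:00Z:
local = datetime(2023, 12, 4, 0, 0, tzinfo=timezone(timedelta(hours=5)))
print(to_utc_z(local))  # 2023-12-03T19:00:00Z

# Query parameters can then be passed to requests.get via params=;
# the names StartDate/EndDate are assumptions, not confirmed names.
params = {"StartDate": to_utc_z(local), "EndDate": "2023-12-10T00:00:00Z"}
```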

Getting the data

At this point, you should have a valid request, as long as all mandatory parameters are satisfied. Go ahead and click Send to check. The API docs and the response itself should help you work out the next steps if you get an error, but let's assume you receive a successful status of 200 and a response back.

The Gearset APIs return data in JSON format. Even if you're familiar with request headers such as Accept that allow the caller to ask for a particular response format, they'll have no effect here - the response is always JSON.

Up in the top right of the response pane, Postman will allow you to download the response to a file - a sensible naming convention will be very useful here!

What you do from this point obviously depends on your requirements. Whilst JSON is human-readable, and can be formatted or prettified by IDEs such as Visual Studio Code or sites such as https://jsonviewer.stack.hu/ to make the contents easier to view, it's ultimately a format for transfer between applications and usually won't be your end state.

Therefore, some sort of conversion will probably be needed, and possibly some manual editing as well. If you're looking to create something like a CSV file, JSON may not convert easily: it isn't simply a flat list of key-value pairs, and can contain structures such as repeated nested objects which don't map neatly to CSV's rows and columns.

However, some sites, like JSON to CSV, have designed solutions that not only handle this problem but also don't require server-side processing - your data isn't uploaded to their servers; all the transformation is done within the browser. From there, you can download the resulting CSV file and interrogate it as you wish.
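If you'd rather not pass the data through a third-party site at all, the same flattening can be done with a short script. Here's a sketch using only Python's standard library; the file names and the top-level "Items" key are assumptions, as the response shape varies by endpoint:

```python
import csv
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts into dotted column names."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, f"{name}."))
        else:
            flat[name] = value
    return flat

# Hypothetical file saved from Postman earlier.
with open("deployment-frequency.json") as f:
    payload = json.load(f)

rows = [flatten(item) for item in payload.get("Items", [])]  # assumed key

if rows:
    # Union of all column names, so rows with missing fields still fit.
    fieldnames = sorted({key for row in rows for key in row})
    with open("deployment-frequency.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```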

Alternatively, if you have access to Microsoft Excel, it's able to import a JSON file and convert the data to a table. From there, of course, you can reformat and/or export the data as you please. Currently, Google Sheets does not offer equivalent functionality as standard, although there are add-ons which do.
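For a scripted equivalent of that table import, the pandas library's json_normalize function flattens nested JSON into a table in one call - again, the file names and the "Items" key are illustrative:

```python
import json
import pandas as pd  # third-party library: pip install pandas

with open("deployment-frequency.json") as f:
    payload = json.load(f)

# json_normalize flattens nested objects into dotted column names;
# "Items" is an assumed top-level key - adjust for your endpoint.
table = pd.json_normalize(payload.get("Items", []))
table.to_csv("deployment-frequency.csv", index=False)
```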

OpenAPI specification

Rather than manual entry, you can import the OpenAPI specification for the API into Postman. You'll find a link to the specification on the Gearset API home page.

Clicking the link will take you to the raw spec, which you can save locally as a JSON file. Having done that, you can then use the Import functionality within a Postman workspace to let Postman do the scaffolding for you.

You'll have a collection of all the available services on the left-hand side, and selecting a specific one will show all the known data to be populated. You'll need to provide values for {{baseUrl}}, {{apiKey}} and any relevant query string parameters before being able to connect successfully, of course.

All in one solutions

A number of platforms exist dedicated to enabling ETL - Extract, Transform, Load - of data between applications or formats. Mulesoft, owned by Salesforce, is one such platform.

Mulesoft's free tier is a 30-day trial rather than being based on low usage like Postman's, so this is probably only an option if your organization already has Mulesoft licenses. Of course, the overall aim is the same - to extract the data from the Gearset APIs and transform it into the desired output.

You'll get more declarative functionality and more control with Mulesoft than with something like Postman plus a separate JSON transformation service. As the Gearset APIs conform to the OpenAPI specification, Mulesoft, just like Postman, can understand the services - what they require and what they return - and do a large amount of the scaffolding for you, rather than the manual entry of values needed in Postman.

Just by ingesting the configuration file, Mulesoft is able to expose the whole Reporting API, ready to be used within any process you wish to configure within Anypoint Studio.

This can be as simple as transforming the format from JSON to something else, or as complex as extracting the data, processing it and storing it in another system, which may be needed if the reporting mechanism is Power BI, Tableau or some other similar application.

Ongoing management

This won't be a one-off exercise. Even if you only assess the metrics annually - which itself risks undesirable trends going undetected for long periods, or being masked under the sheer weight of data - extracting that data isn't going to be a single operation, due to the volume.

The ideal here is to extract smaller chunks of the data regularly, then store and analyse those in your format of choice. Avoiding duplicate data therefore becomes important: it should be clear which time periods have already been reported on, and which values need to be supplied on each invocation.
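One simple way to keep track is to store the end of the last extraction window and derive the next one from it. A sketch of that idea - the state file name and window length are arbitrary choices:

```python
from datetime import datetime, timedelta, timezone

STATE_FILE = "last-extraction.txt"  # illustrative name

def next_window(days=7):
    """Return the (start, end) of the next extraction window."""
    try:
        with open(STATE_FILE) as f:
            start = datetime.fromisoformat(f.read().strip())
    except FileNotFoundError:
        # First run: pick whatever point in history you want to begin from.
        start = datetime(2024, 1, 1, tzinfo=timezone.utc)
    end = start + timedelta(days=days)
    with open(STATE_FILE, "w") as f:
        f.write(end.isoformat())
    return start, end
```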

The companion document detailing pro-code consumption of the APIs contains a section on this, as it's just as much of a concern there. Bear in mind too how you'll store the details of the time periods covered by previous requests, so as to avoid that duplication of data.

To help with this, Gearset gives each deployment a Globally Unique Identifier (GUID), so with some outputs you can use this to determine whether you've previously retrieved a record. Be aware, though, that in some cases you will have retrieved it before, and that's expected: when a deployment is later marked as having contained a failed change, it will appear in the Change failure rate output with its original DeploymentId so that you can match the two up.
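As a final sketch, here's one way to use those GUIDs to filter out records you've already stored - the file names and the "Items" key are assumptions, as before:

```python
import json

SEEN_FILE = "seen-deployment-ids.txt"  # illustrative name

def load_seen_ids():
    try:
        with open(SEEN_FILE) as f:
            return {line.strip() for line in f}
    except FileNotFoundError:
        return set()

def new_records(response_path):
    """Return only the records whose DeploymentId we haven't seen before."""
    seen = load_seen_ids()
    with open(response_path) as f:
        items = json.load(f).get("Items", [])  # assumed top-level key
    fresh = [i for i in items if i.get("DeploymentId") not in seen]
    # Remember: a deployment later marked as a failed change will
    # legitimately reappear in the Change failure rate output with its
    # original DeploymentId - that duplication is expected.
    with open(SEEN_FILE, "a") as f:
        for item in fresh:
            f.write(str(item.get("DeploymentId")) + "\n")
    return fresh
```

However you choose to consume the APIs, a little bookkeeping like this keeps the extracted metrics trustworthy over time.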
