Why Gtmhub insights
Gtmhub insights are a great way to track and act on your company's key metrics. They can connect to dozens of external systems and gather meaningful stats that enable your company to focus and align on the most important objectives.
Insight metrics as a function of time
A very common need when viewing insights is to define a time range for which the metrics should be calculated. Let's say that you are using the built-in Jira velocity insight that calculates the number of completed stories per sprint. Without the ability to specify a time period, you would need a separate insight for each period you want statistics for, such as "last week velocity" and "last month velocity". This adds maintenance overhead and clutters your insight dashboards, which is why we integrated the ability to specify a time period on top of the insight boards page:
How to add support for date ranges in your custom insight
The date range parameter works out of the box for most of the built-in insights, but to make your custom insights configurable by time interval you need to add a few filtering statements.
SQL insights filtered by date
Let's say that you have a very simple SQL insight that counts all leads from an external system like Salesforce:
WHERE isdeleted = FALSE
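For context, a complete version of such an insight query might look like the following sketch. The entity name salesforce_lead and the alias leads_count are assumptions for illustration; use the actual entity name from your Salesforce connection:

```sql
-- Count all non-deleted Salesforce leads (entity name assumed for illustration)
SELECT COUNT(*) AS leads_count
FROM salesforce_lead
WHERE isdeleted = FALSE
```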
In order to count the leads for a specific period of time, you can hardcode the WHERE clause as in the following example and rename the insight to something like "Leads generated last week":
isdeleted = FALSE
AND createddate > now() - INTERVAL '1 week'
The downside is that you can't easily reuse the insight algorithm for an insight that counts the leads generated in another time period, such as this quarter. This is where the insight parameter syntax %%param_name%% and the custom gtmhub.DURING function come into play, producing a "leads generated for a given period of time" insight:
isdeleted = FALSE
AND gtmhub.DURING(createddate, %%date_range%%)
There are two things to note here:
1) %%date_range%% is the Gtmhub SQL placeholder syntax that will be replaced with the value selected from the drop-down menu in the insight boards section shown in the picture above. This is the key of the custom insight parameter automatically added to your account under Setup - Configuration - Insightboard parameters:
2) gtmhub.DURING([column_name], [time_period_definition_string]) is a custom SQL function that evaluates to TRUE or FALSE depending on the specified parameters. In combination with the WHERE statement in the example above, it filters the result set to the leads created in the specified time range.
The function accepts any of the following parameters:
- 'all time' or 'all_time', 'alltime'
- 'today' or 'this_day', 'day', 'this day'
- 'this quarter' or 'this_quarter', 'quarter', 'q', 'this_q', 'this q'
- 'this week' or 'this_week', 'week'
- 'this month' or 'this_month', 'month'
- 'this year' or 'this_year', 'year'
- 'previous day' or 'last_day', 'yesterday', 'previous_day', 'last day'
- 'last quarter' or 'last_quarter', 'last_q', 'last q'
- 'last week' or 'last_week'
- 'last month' or 'last_month'
- 'last year' or 'last_year'
- 'tomorrow' or 'next_day', 'next day'
- 'next quarter' or 'next_quarter', 'next_q', 'next q'
- 'next week' or 'next_week'
- 'next month' or 'next_month'
- 'next year' or 'next_year'
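Putting the placeholder and the function together, a fully parameterized version of the leads insight might look like this sketch (the salesforce_lead entity name is an assumption for illustration):

```sql
-- Count leads created in the period selected on the insight board.
-- %%date_range%% is replaced at run time with a value such as 'this quarter'.
SELECT COUNT(*) AS leads_count
FROM salesforce_lead
WHERE isdeleted = FALSE
  AND gtmhub.DURING(createddate, %%date_range%%)
```

The same insight now serves every time period in the list above, driven entirely by the drop-down selection.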
R insights filtered by date
The same custom gtmhub.during() function can be used in the R language, inside the dplyr-style filter() function.
Here is an example that collects all Jira issues created in the time period selected from the drop-down menu (see the picture above), such as 'this quarter':
issues <- entity("jira_issues")
issues <- filter(issues, gtmhub.during(created, settings.date_range))
issues <- collect(issues) # collects the filtered entities in memory
out(issues = issues) # result to be bound in the insight HTML
Again, there are two things to note here:
1) settings.date_range is how you use the selected value of an insight board parameter in R insights:
2) The gtmhub.during([column_name], [date_range_period_string]) function has the same signature as its SQL counterpart (see above). This is because the function execution is relayed to the SQL server by design: the dplyr-style filter() expression is translated to SQL, so the data set is filtered on the SQL server before being loaded into the memory of the R process.
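Because the filter is relayed to the backend, the R expression above corresponds roughly to a SQL query like the following sketch (assuming 'this quarter' was selected from the drop-down menu):

```sql
-- Approximate SQL the R filter() call is translated into:
SELECT *
FROM jira_issues
WHERE gtmhub.DURING(created, 'this quarter')
```

Only the rows matching the time range ever leave the SQL server, which keeps memory usage in the R process low even for large issue sets.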