Using Kibana for your Amazon Connect Reporting

In this post, we look at how you can expand on the built-in reporting within Amazon Connect and use the wider AWS ecosystem to bring real-time Connect data into your existing Kibana dashboards. Amazon Connect provides out-of-the-box metrics and reporting that can generate real-time and historical metric reports to monitor efficiency and utilisation, agent performance, and other information about your contact centre. Sometimes, however, more advanced reporting needs to be designed and implemented to meet specific requirements.


What reporting is there?

When it comes to reporting in Amazon Connect, the standard reports can be broken down as follows:


  • Real-time metrics reports show real-time or near-real-time metrics information about activity in your contact centre.
  • Historical metrics reports include data about past, completed activity and performance in your contact centre.
  • Contact Trace Record reports are detailed views of individual calls that you can review by searching for particular sets of calls.


What if we have more requirements?

Sometimes customers require more visual or sophisticated reporting than the tabular display that Amazon Connect provides, for example a graphical real-time dashboard displaying a “single pane of glass” snapshot of what is important to you as it is happening in your call centre. VoiceFoundry recently developed a real-time dashboard for a client to provide quick insights into the availability of their call centre agents:
  1. Total Agents
  2. Agents in After Call Work
  3. Agents at Lunch or other State
  4. Total Agents available


The built-in reporting provides this information; however, it was not in a graphical format that made it easy for the client to see key contact centre metrics. There are currently three options for extracting near-real-time data from Amazon Connect for use by other applications:

  1. The recently added getCurrentMetricData method of the Connect API, which you can use to query a limited amount of data within the Connect instance directly. You can then use that data in your real-time application, for example in a Lambda function called from your Connect contact flow, or on a web page displaying real-time data.
  2. Pushing the Connect data into Amazon Elasticsearch Service and using Kibana to visualise it in the way the client had requested. This is the option we decided to use in this case.
  3. CloudWatch, which Amazon Connect pushes data into automatically, but more slowly. The update time on this data can be up to 5 minutes, so it did not meet our client's requirements.
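To make the API option a little more concrete, here is a minimal sketch of working with a getCurrentMetricData response. The instance ID and queue ARN are placeholders, and the summariseMetrics helper is our own illustration, not part of the AWS SDK:

```javascript
// Flatten a GetCurrentMetricData response into a simple { METRIC_NAME: value } map.
function summariseMetrics(response) {
  var summary = {};
  (response.MetricResults || []).forEach(function (result) {
    (result.Collections || []).forEach(function (collection) {
      summary[collection.Metric.Name] = collection.Value;
    });
  });
  return summary;
}

// Illustrative call shape (instance ID and queue ARN are placeholders):
// var AWS = require('aws-sdk');
// var connect = new AWS.Connect({ region: 'ap-southeast-2' });
// connect.getCurrentMetricData({
//   InstanceId: 'your-instance-id',
//   Filters: { Queues: ['your-queue-arn'], Channels: ['VOICE'] },
//   CurrentMetrics: [
//     { Name: 'AGENTS_AVAILABLE', Unit: 'COUNT' },
//     { Name: 'AGENTS_AFTER_CONTACT_WORK', Unit: 'COUNT' }
//   ]
// }, function (err, data) {
//   if (!err) console.log(summariseMetrics(data));
// });
```

A web page or Lambda function can call this on a short timer to drive a simple "agents available right now" display.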


Solutions Implementation

Amazon Connect can publish contact centre events (including agent events) to a Kinesis stream. Agent events are single log entries written when an agent changes state. For example, if Jimmy transitions from a status of “Lunch” to “Available”, Amazon Connect generates an event for that change and makes it available by putting it on the Kinesis stream. The stream can then be consumed by services such as Elasticsearch, Lambda, and EC2, which are alerted when a relevant event is made available. This enables you to monitor only the things your dashboard is interested in and to create dashboards in external systems.
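To illustrate, the snippet below sketches what a consumer sees: each Kinesis record carries the agent event as base64-encoded JSON. The EventType, AgentARN, and CurrentAgentSnapshot field names follow the agent event stream format, but the agent ARN and status values here are made up for the example:

```javascript
// Decode one Kinesis record back into the agent event JSON that Connect produced.
function decodeAgentEvent(record) {
  return JSON.parse(Buffer.from(record.kinesis.data, 'base64').toString('utf8'));
}

// A dashboard consumer typically only cares about state changes.
function isStateChange(agentEvent) {
  return agentEvent.EventType === 'STATE_CHANGE';
}

// A made-up record shaped like what a consumer receives from Kinesis:
var sampleRecord = {
  kinesis: {
    data: Buffer.from(JSON.stringify({
      EventType: 'STATE_CHANGE',
      AgentARN: 'arn:aws:connect:ap-southeast-2:123456789012:instance/example/agent/jimmy',
      CurrentAgentSnapshot: { AgentStatus: { Name: 'Available' } }
    })).toString('base64')
  }
};

var agentEvent = decodeAgentEvent(sampleRecord);
console.log(isStateChange(agentEvent), agentEvent.CurrentAgentSnapshot.AgentStatus.Name);
// → true Available
```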



While every Connect instance has the ability to stream these events, this is not enabled by default. You can configure streaming in the Data streaming section of your instance settings in the console:


Once you enable this feature, AWS will walk you through creating a Kinesis stream. Kinesis allows a “producer” (Connect in this instance) to put data events on the stream. We can then have multiple “consumers” (in our example, a Lambda function) collect that data and do whatever they need with it.

As our agent events are now being sent to Kinesis, we next need to create something to “consume” these events and do something useful with them: put the events from the Kinesis stream into Elasticsearch. While Elasticsearch and Connect can do some of this by default, it still needs some customisation to make sure we watch for the right events and put them into the Elasticsearch database in the right way (called “indexing”). One of the things Elasticsearch does not do by default is index agent events, so we need to write some Lambda code that does this work for us:

/* == Imports == */
var AWS = require('aws-sdk');
var path = require('path');

/* == Globals == */
var esDomain = {
    region: 'ap-southeast-2',
    endpoint: '', // your Elasticsearch domain endpoint goes here
    index: 'agentevents-index',
    doctype: 'agentevents'
};
var endpoint = new AWS.Endpoint(esDomain.endpoint);

/*
 * The AWS credentials are picked up from the environment.
 * They belong to the IAM role assigned to the Lambda function.
 * Since the ES requests are signed using these credentials,
 * make sure to apply a policy that allows ES domain operations
 * to the role.
 */
var creds = new AWS.EnvironmentCredentials('AWS');

/* Lambda: Execution begins here */
exports.handler = function(event, context) {
    console.log(JSON.stringify(event, null, ' '));
    event.Records.forEach(function(record) {
        // Kinesis delivers the agent event as base64-encoded JSON
        var jsonDoc = JSON.parse(new Buffer(record.kinesis.data, 'base64'));
        postToES(jsonDoc, context);
    });
};

/* Post the given document to Elasticsearch */
function postToES(doc, context) {

    console.log('this is the doc:', doc);

    var req = new AWS.HttpRequest(endpoint);

    req.method = 'POST';
    // index based on the agent ARN
    req.path = path.join('/', esDomain.index, esDomain.doctype, doc.AgentARN.split('/').join(':'));
    req.region = esDomain.region;
    req.headers['presigned-expires'] = false;
    req.headers['Host'] = endpoint.host;
    req.headers['content-type'] = 'application/json';
    req.body = JSON.stringify(doc);

    var signer = new AWS.Signers.V4(req, 'es'); // es: service code
    signer.addAuthorization(creds, new Date());

    var send = new AWS.NodeHttpClient();
    send.handleRequest(req, null, function(httpResp) {
        var respBody = '';
        httpResp.on('data', function(chunk) {
            respBody += chunk;
        });
        httpResp.on('end', function(chunk) {
            console.log('Response: ' + respBody);
            context.succeed('Lambda added document ' + doc);
        });
    }, function(err) {
        console.log('Error: ' + err);
        context.fail('Lambda failed with error ' + err);
    });
}

With the Lambda function above filtering for only the agent statistics that we are interested in, it is simply a matter of using the Kibana visualisation tools to display the agent statistics in an easy-to-understand format.
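Under the hood, a Kibana visualisation like “agents by state” runs a terms aggregation against Elasticsearch. As a rough sketch (the CurrentAgentSnapshot.AgentStatus.Name.keyword field path is an assumption that depends on how your agent event documents are mapped), the request body it sends looks like this:

```javascript
// Build the body of a "terms" aggregation that counts agents per state.
// The field path below is an assumption based on the agent event document
// shape indexed by the Lambda above; adjust it to match your own mapping.
function agentsByStateQuery(maxStates) {
  return {
    size: 0, // we only want the aggregation buckets, not the raw documents
    aggs: {
      agents_by_state: {
        terms: {
          field: 'CurrentAgentSnapshot.AgentStatus.Name.keyword',
          size: maxStates || 10
        }
      }
    }
  };
}

console.log(JSON.stringify(agentsByStateQuery(5), null, 2));
```

Posting this body to the index's _search endpoint returns one bucket per agent state (Available, Lunch, After Call Work, and so on), which is exactly what a pie or bar chart visualises.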

Example Agent Dashboard in Kibana


And to complete the picture for you, here is the architecture that was used.

Agent Events into Kibana Architecture


As you can see, there are many ways in which you can get important data to the people who need it, within the tools that they are used to. With Connect able both to be queried by API and to stream agent events, your choices are vast. One of the great benefits is that you can test and learn with no-regret decisions. Want to have another consumer of the Kinesis data? No problem! Sometimes the challenge is knowing what you want and what's possible, which is where we can help you. We can support you with design, implementation, support, or a mixture of the three. Contact us to find out more.
