Real Time Graph using AWS Services (Part 2)

Rushabh Sudame 21st January 2021 - 5 mins read


In this part of the tutorial, we are going to set up the server-side logic. We will cover all the steps, including AWS services setup, the backend, and the frontend. So, without wasting a minute, let's start.

Setting up data flow on AWS

In part 1 of this tutorial, we got data flowing into AWS IoT Core. It’s time to do something with that data. Since the main aim of this tutorial is to show a real-time graph, we are not going to persist the data; instead, we need something to process it in real time. This is where Amazon Kinesis comes into play.

Amazon Kinesis Data Streams

Amazon Kinesis Data Streams (KDS) is a massively scalable and durable real-time data streaming service. KDS can continuously capture gigabytes of data per second from hundreds of thousands of sources such as website clickstreams, database event streams, financial transactions, social media feeds, IT logs, and location-tracking events. The data collected is available in milliseconds to enable real-time analytics use cases such as real-time dashboards, real-time anomaly detection, dynamic pricing, and more.

We are going to forward our data to Kinesis Data Streams. IoT Core will act as the producer for the Kinesis data stream.

Follow the steps below to create and configure a Kinesis data stream.

  • Go to the AWS IoT Core console.
  • Go to Act and click on Create.
  • Name the rule: IoTCoreToKinesisDataDemo.
  • Write the SQL query as follows:
  • Now we need to set an action that will forward the data to Kinesis Data Streams. Click on Add Action.
  • Click on ‘Send Message to Amazon Kinesis Streams’ and click ‘Next’.
  • Click on Create new resource, or choose an existing one if you have already created a data stream.
  • Create a new data stream as follows:
  • Go back to the IoT Core page, click the refresh button beside ‘Create New Resource’, and select the created stream.
  • Click on ‘Add Action’, then click on ‘Create Rule’.
  • Data is now configured to flow into Kinesis Data Streams. It's time to write the consumer code, which will push data over WebSockets for display in real time.
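For reference, a minimal rule query that forwards every message to the action looks like the following. The topic name device/data is an assumption; use whatever topic your device publishes to.

```sql
SELECT * FROM 'device/data'
```

The IoT rule engine evaluates this query against each incoming MQTT message on the topic and hands matching messages to the configured Kinesis action.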

Setting Up Backend

It's time to code the Kinesis consumer, which will consume data from the Kinesis stream. We are using the KCL (Kinesis Client Library) provided by AWS.

There are two more parts to the backend:

  • KCL (Kinesis Client Library) to Redis
  • Redis to Websockets

KCL to Redis

  • Clone the following repository:
  • Set up the AWS CLI with your access key and secret key, and set the region in which you created the stream.
  • Replace the following lines with your Kinesis stream name and application name:
  • public static final String SAMPLE_APPLICATION_STREAM_NAME = "{YOUR_STREAM_NAME}";
    private static final String SAMPLE_APPLICATION_NAME = "{APPLICATION_NAME}";
  • Run the code in Eclipse or any Java IDE.
  • The above code consumes data from the Kinesis data stream and forwards it in real time to a Redis topic. We are using Redis Pub/Sub for message forwarding.
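As a sketch of what happens inside the consumer: the KCL hands each record's payload to your record processor as a ByteBuffer, which must be decoded to a string before it can be published to Redis. The channel name "iot-data" and the jedis.publish call mentioned in the comments are assumptions for illustration, not necessarily what the repository uses.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Sketch of the payload handling inside a KCL record processor's
// processRecords() callback: decode the raw bytes to a UTF-8 string,
// then hand the string to Redis Pub/Sub.
public class PayloadDecoder {

    public static String decodePayload(ByteBuffer data) {
        // Kinesis may hand back a read-only buffer; decode via a duplicate
        // so we neither assume a backing array nor disturb the position.
        return StandardCharsets.UTF_8.decode(data.duplicate()).toString();
    }

    public static void main(String[] args) {
        ByteBuffer record = ByteBuffer.wrap(
                "{\"temperature\": 23.5}".getBytes(StandardCharsets.UTF_8));
        String json = decodePayload(record);
        System.out.println(json); // prints {"temperature": 23.5}
        // In the real consumer this is where something like
        // jedis.publish("iot-data", json) would run (Jedis client and
        // channel name are assumptions).
    }
}
```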

Redis to Websockets

  • Clone the following repository:
  • The code is written in Spring Boot, so run it in STS (Spring Tool Suite) or another Spring-aware IDE.
  • Run the code.
  • The above code subscribes to the Redis topic to which our KCL consumer publishes data.
  • It then forwards the data over WebSockets, which the frontend can listen to in order to update the graph in real time.
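The forwarding pattern the two bullets above describe can be sketched in plain Java, leaving out the Spring wiring. In the actual service, the subscriber side would be a Spring Data Redis message listener and the "sessions" would be WebSocket connections; the names below are illustrative only.

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// Plain-Java sketch of the fan-out the Spring Boot service performs:
// every message received from the Redis subscription is pushed to each
// connected WebSocket session.
public class GraphBroadcaster {

    // Connected "sessions" are modeled here as plain string consumers;
    // CopyOnWriteArrayList tolerates sessions joining mid-broadcast.
    private final List<Consumer<String>> sessions = new CopyOnWriteArrayList<>();

    public void register(Consumer<String> session) {
        sessions.add(session);
    }

    // Called by the Redis subscriber whenever the KCL consumer publishes
    // a new reading.
    public void onRedisMessage(String json) {
        for (Consumer<String> session : sessions) {
            session.accept(json); // push to the browser over its socket
        }
    }

    public static void main(String[] args) {
        GraphBroadcaster broadcaster = new GraphBroadcaster();
        StringBuilder received = new StringBuilder();
        broadcaster.register(received::append);
        broadcaster.onRedisMessage("{\"temperature\": 23.5}");
        System.out.println(received); // prints {"temperature": 23.5}
    }
}
```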

WebSockets to Frontend
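Before wiring up a browser, the WebSocket endpoint can be probed from plain Java 11+ with no extra dependencies, which is handy for checking that data is really flowing end to end. The endpoint URL ws://localhost:8080/graph is a placeholder, not necessarily the path the Spring Boot service exposes.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.WebSocket;
import java.util.concurrent.CompletionStage;

// Minimal WebSocket client that prints each graph update it receives.
// Text frames may arrive fragmented, so fragments are buffered until
// the final one (last == true) completes the message.
public class GraphSocketProbe implements WebSocket.Listener {

    private final StringBuilder frame = new StringBuilder();

    // Appends a (possibly partial) text frame; returns the full message
    // once the final fragment arrives, or null while still accumulating.
    static String accumulate(StringBuilder buf, CharSequence data, boolean last) {
        buf.append(data);
        if (!last) {
            return null;
        }
        String message = buf.toString();
        buf.setLength(0);
        return message;
    }

    @Override
    public CompletionStage<?> onText(WebSocket ws, CharSequence data, boolean last) {
        String message = accumulate(frame, data, last);
        if (message != null) {
            System.out.println("graph update: " + message);
        }
        ws.request(1); // ask for the next frame
        return null;
    }

    public static void main(String[] args) {
        try {
            // Placeholder endpoint; replace with your service's actual path.
            HttpClient.newHttpClient()
                    .newWebSocketBuilder()
                    .buildAsync(URI.create("ws://localhost:8080/graph"),
                            new GraphSocketProbe())
                    .join();
        } catch (Exception e) {
            System.out.println("Backend not reachable: " + e.getMessage());
        }
    }
}
```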


In this tutorial, we have seen how to set up the backend to process data immediately for showing real-time graphs. The frontend can then update the graph in real time by listening to the WebSockets.

I hope this helps you solve your problem. If you find any issue executing the above code, you can raise an issue on GitLab. Our team will try to resolve it as soon as possible.
