Krzysztof Smogór

Published
November 5, 2024

Handling IoT use cases with Oxla

IoT
Grafana
Sparkplug

Moving away from a world of a few large energy producers to a vast, mixed population of consumers and producers of electrical energy creates new challenges for energy distributors. The learning curve is getting steeper, and they have to monitor the energy flows in their power networks more often than ever before. To handle this, they need a reliable data pipeline that can gather all their data while making the energy readings easy to analyze.

Here at Oxla, we cover such use cases with ease. That is why we decided to build a data pipeline that tackles this situation and shows how effective Oxla is in the time-series analysis scenarios typical for IoT systems.

Use Case Definition

What we want to achieve here is pretty straightforward:

  • Gather real-time data from multiple energy meters
  • Store the data in an organized manner, so it is ready for analysis
  • Update the data on the fly

Sounds difficult? Here’s a real-life example of such a data processing pipeline!
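
The storage side of these requirements can be sketched as a single readings table. The table and column names below are our own assumptions, and sqlite3 stands in for a PostgreSQL-compatible connection to Oxla:

```python
import sqlite3

# Illustrative schema for meter readings; all table and column names
# are assumptions, and sqlite3 stands in for a PostgreSQL-compatible
# connection to Oxla.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE meter_readings (
        meter_id         TEXT,
        ts               TEXT,  -- reading timestamp (UTC)
        active_power_w   REAL,  -- instantaneous active power
        energy_import_wh REAL   -- monotonically increasing import counter
    )
""")

# Readings are appended on the fly as they arrive from the meters.
conn.execute(
    "INSERT INTO meter_readings VALUES (?, ?, ?, ?)",
    ("meter-01", "2024-11-05T12:00:00Z", 450.0, 123456.0),
)

count = conn.execute("SELECT COUNT(*) FROM meter_readings").fetchone()[0]
print(count)  # → 1
```

Keeping one row per reading, keyed by meter and timestamp, leaves the data ready for the time-series aggregations that come later.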

Energy Meters

As a starting point, we’ve deployed an energy meter in a real household to gather real-life data. There are plenty of protocols that can be used to read data from energy meters, such as:

  • Impulse interface
  • RS-485
  • IEC 62056-21
  • DLMS

The first obstacle, however, is to unify the data so that it adheres to a single standard. We decided to use Sparkplug, an open industrial MQTT extension designed for building reliable IoT data pipelines. Sparkplug simplifies scaling the pipeline, so we can easily leap from a single energy meter to many. Additionally, it draws a clear distinction between the Data Layer and the Application Layer, defined by an explicit, self-descriptive interface.
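
Part of that self-descriptive interface is the topic namespace the Sparkplug specification defines for every message. The sketch below builds such topics; the group, node, and device identifiers are made-up examples:

```python
from typing import Optional

# The Sparkplug B spec defines the topic namespace:
#   spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
SPARKPLUG_NAMESPACE = "spBv1.0"

def sparkplug_topic(group_id: str, message_type: str,
                    edge_node_id: str, device_id: Optional[str] = None) -> str:
    """Build a Sparkplug B topic string; device_id is optional because
    node-level messages (e.g. NBIRTH) have no device segment."""
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

# A DDATA message carrying readings from one energy meter (example names):
topic = sparkplug_topic("household", "DDATA", "eon-node-1", "energy-meter-1")
print(topic)  # → spBv1.0/household/DDATA/eon-node-1/energy-meter-1
```

Because every publisher follows the same namespace, adding another meter is just another device segment under the same edge node — no pipeline changes required.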

What happens with data published to Sparkplug

At this point, the data flow from multiple energy meters is already homogeneous and encrypted, so it can only be read with the proper keys. We developed an official Oxla integration for HiveMQ and other MQTT brokers, which intercepts all Sparkplug data and forwards it to Oxla — essentially turning Oxla into a data source for any data analysis tool out there.

Our database is efficient enough to ingest a stream of data from MQTT brokers and run analytical queries at the same time!

Moving to the next stage, it’s time to connect what we’ve already built with an analytical application: Grafana.

How much energy is consumed

The deployed energy meter allowed us to easily measure active and reactive power. Additionally, we could monitor both imported and exported energy counters.
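
The import and export counters are monotonically increasing, so the energy moved in an interval is simply a counter delta. A small worked example with hypothetical counter values:

```python
# Hypothetical counter samples (in Wh) taken an hour apart; real meters
# expose separate, monotonically increasing import and export counters.
start = {"energy_import_wh": 123_450.0, "energy_export_wh": 8_200.0}
end   = {"energy_import_wh": 124_050.0, "energy_export_wh": 8_450.0}

# Energy over an interval is the counter delta.
imported = end["energy_import_wh"] - start["energy_import_wh"]  # 600.0 Wh
exported = end["energy_export_wh"] - start["energy_export_wh"]  # 250.0 Wh
net_wh = imported - exported  # 350.0 Wh drawn from the grid on balance
print(imported, exported, net_wh)
```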

The data is gathered once a minute to produce a high-resolution report. Grafana is a great open-source tool for building dashboards and analyzing time-series data. What’s even better is that Oxla is compatible with it, as well as with any other tool that integrates through PostgreSQL-compatible connectors. Within Grafana, from the data collected in Oxla, you can define whatever dashboard your use case requires.
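
Under the hood, a Grafana time-series panel issues an aggregation over that PostgreSQL-compatible connection. The sketch below shows the shape of such a query with an illustrative schema; sqlite3 stands in for Oxla, so `strftime` does the minute bucketing that `date_trunc('minute', ts)` would do in PostgreSQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, active_power_w REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [
        ("2024-11-05 12:00:10", 400.0),
        ("2024-11-05 12:00:40", 420.0),
        ("2024-11-05 12:01:05", 500.0),
    ],
)

# Average power per minute -- the kind of series a Grafana panel plots.
series = conn.execute("""
    SELECT strftime('%Y-%m-%d %H:%M', ts) AS minute,
           AVG(active_power_w)            AS avg_power_w
    FROM readings
    GROUP BY minute
    ORDER BY minute
""").fetchall()
print(series)  # → [('2024-11-05 12:00', 410.0), ('2024-11-05 12:01', 500.0)]
```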

The Data Pipeline

All in all, here is a schema of the data processing pipeline, starting from the energy meter and ending with the analytical tooling.

According to the Sparkplug specification, the energy meter is a Device, which has to be connected to an EoN (Edge of Network) node capable of sending Sparkplug messages to an MQTT server. In our case, that was HiveMQ with a custom extension. In this schema, Oxla is an online database that can be queried from the analytics tool. In our case, that was Grafana, which allowed us to show the power consumption of a single household.
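
The message flow the specification mandates can be sketched as follows: the EoN node announces itself (NBIRTH), announces its device (DBIRTH), then streams readings (DDATA). Real Sparkplug B payloads are Protobuf-encoded; the JSON below is only a readable stand-in, and all identifiers are examples:

```python
import json
import time

def payload(metrics, seq):
    """Simplified stand-in for a Sparkplug B payload (really Protobuf)."""
    return json.dumps({
        "timestamp": int(time.time() * 1000),
        "metrics": metrics,
        "seq": seq,  # Sparkplug sequence number
    })

# Birth messages declare the metrics; data messages carry their updates.
messages = [
    ("spBv1.0/household/NBIRTH/eon-node-1",
     payload([{"name": "bdSeq", "value": 0}], seq=0)),
    ("spBv1.0/household/DBIRTH/eon-node-1/energy-meter-1",
     payload([{"name": "active_power_w", "value": 0.0}], seq=1)),
    ("spBv1.0/household/DDATA/eon-node-1/energy-meter-1",
     payload([{"name": "active_power_w", "value": 450.0}], seq=2)),
]

for topic, body in messages:
    print(topic, "->", body)
```

On the other side of the broker, the extension only has to subscribe to the `spBv1.0/#` namespace and map each metric to a row bound for Oxla.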

Oxla delivers great performance when it comes to time-series data analysis. This example pipeline can easily be scaled out with multiple HiveMQ servers connected to multiple Oxla nodes forming a cluster. Querying data and inserting new rows has never been this easy and efficient!
