In LinkSmart IoT Hackathon 2018 - Part 2, we stored and aggregated our room ambience data in the LinkSmart® Historical Datastore and visualized the data using Grafana and the Grafana Datasource Plugin for Historical Datastore. In this third part of our Hackathon, with the goal of performing data aggregation and analysis as well as implementing stimulus-response behaviour, you will
- Set up LinkSmart® Service Catalog.
- Set up LinkSmart® IoT Agents.
- Work with the room ambience data by feeding statements to an IoT data processing agent using EsperTech's Event Processing Language (EPL).
We will assume that your room ambience data is being published to a Mosquitto MQTT broker at
1) Set up Service Catalog
Create a docker-compose.yml file and pull the image docker.linksmart.eu/sc:latest. Expose port 8082, mount a volume, choose a name <your_sc_container_name> for the container, configure the file service-catalog.json according to the Service Catalog documentation, and run the container.
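As a minimal sketch only (the container-side paths and the internal port are assumptions to be checked against the Service Catalog documentation), the compose file could look like:

```yaml
version: "3"
services:
  service-catalog:
    image: docker.linksmart.eu/sc:latest
    container_name: <your_sc_container_name>
    ports:
      - "8082:8082"           # host:container - internal port assumed
    volumes:
      - ./sc-data:/data                                    # assumed data path
      - ./service-catalog.json:/conf/service-catalog.json  # assumed config path
```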
Hint for Windows 7 users
If you use Docker Toolbox with Windows 7, you may run into trouble running LevelDB as a backend for Service Catalog. In this case, you can work without mounting a volume and set up Service Catalog to store its data in memory using the following environment variables:
2) Set up Data Processing Agent
Extend your docker-compose file and pull the image docker.linksmart.eu/agent:dpa-snapshot. Expose port 8319, set the following environment variables, and run the container:
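A sketch of the corresponding compose service (the container name is illustrative and the internal port is an assumption; the environment variables referred to above go under `environment`):

```yaml
  data-processing-agent:
    image: docker.linksmart.eu/agent:dpa-snapshot
    container_name: my-dpa      # name is illustrative
    ports:
      - "8319:8319"
    environment: []             # add the environment variables from this tutorial here
```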
The agent will be able to work with all SenML data published below
The agent should now be registered in the Service Catalog's index (refer to the API documentation to check this). Note that the agent correctly retrieved the broker address from the catalog.
Example: Create your first statement
Consult the documentation of the agent's statement REST API. Create a first, simple statement that does nothing but duplicate the data from a specific device (in this example:
linksmart-black) into another topic:
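As a rough sketch only (the stream name `SenMLStream` and the `bn` filter are assumptions for illustration, not the agent's documented event model), such a pass-through statement in EPL could be as simple as:

```sql
-- Illustrative EPL only: 'SenMLStream' and the 'bn' filter are assumed names.
-- Selects every event whose SenML basename matches the device and re-emits it.
SELECT * FROM SenMLStream(bn = 'linksmart-black')
```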
Make sure not to publish anywhere below
LS/v2/DGW/#! That would create a loop.
Subscribe to topic
LS/v2/LA/linksmart-black/senml with your favourite MQTT client (like MQTT.fx or simply mosquitto_sub) and check the data. A similar statement could be used to forward the data to additional MQTT brokers.
Example: Select a single field
Select only the temperature value from the SenML record in a simple JSON format:
If you do not define a
resultType, the agent will publish the value in the OGC format by default ("
1 as id" specifies the Datastream id):
Subscribe to topic
GOST/Datastreams(1)/Observations to check the data. You could use a statement like this if you want to employ the agent to transfer SenML data into a GOST server.
Exercise: Simple aggregations
3) Outlier Detection
Some sensors produce wildly inaccurate measurements from time to time. We can use the Data Processing Agent and Esper to filter these out. We want a statement similar to our very first one, but forwarding data to the adapted topic only when a measurement does not differ from the previous measurement by more than a certain threshold. This can be done using the Match Recognize functionality.
Exercise: Simple outlier detection
- Try to understand the syntax example from the Esper documentation. Create a similar statement for our SenML data that checks the difference between two consecutive measurements. The statement should produce an event only if both the temperature and humidity values do not differ from the previous values by more than
x. Hint: A very simple condition could look like this:
(Math.abs(cast(B.e[0].v,double) - cast(A.e[0].v,double)) < 1) and (Math.abs(cast(B.e[1].v,double) - cast(A.e[1].v,double)) < 1)
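The same check can be sketched in plain Python to make the logic concrete (the function names and the threshold of 1.0 are illustrative, not part of the agent or of Esper):

```python
# Plain-Python sketch of the hinted condition; 'within_threshold',
# 'accept' and the threshold of 1.0 are illustrative names/values.

def within_threshold(prev: float, curr: float, threshold: float = 1.0) -> bool:
    """True if the new measurement differs from the previous one by
    less than `threshold`, i.e. it is not treated as an outlier."""
    return abs(curr - prev) < threshold

def accept(prev_temp, curr_temp, prev_hum, curr_hum, threshold=1.0):
    # Mirror the Esper condition: both temperature and humidity must
    # stay within the threshold for the event to be forwarded.
    return (within_threshold(prev_temp, curr_temp, threshold)
            and within_threshold(prev_hum, curr_hum, threshold))

print(accept(21.3, 21.5, 45.0, 45.2))  # small drift -> True (forwarded)
print(accept(21.3, 85.0, 45.0, 45.2))  # temperature spike -> False (dropped)
```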
In this hackathon, implementing actuation means performing stimulus-response: for example, as soon as the temperature exceeds a certain value, the IoT engineer is notified of the event.
Example: Send e-mails
The agent is also able to send e-mails. Create a statement like this (ask Jannis Warnat for the password via e-mail or Rocket.Chat):
Exercise: Send warning and notification e-mails
- Create a statement that sends an e-mail when your room temperature exceeds German health & safety regulations.
- Create a statement that sends an e-mail once a day with the maximum and average temperature of the day in your room.
In LinkSmart IoT Hackathon 2018 - Part 1, we set up the LinkSmart® Device Gateway to collect temperature and humidity data from a DHT22 sensor on a Raspberry Pi. The collected data was exposed over a RESTful API and published periodically to an MQTT broker.
In the second part of the hackathon, you will perform the following on your personal computer (x86 architecture):
- Set up LinkSmart® Historical Datastore and configure it to store and aggregate the data being published to the MQTT broker
- Set up Grafana and the Grafana Datasource Plugin for Historical Datastore to visualize the temperature and humidity data
A) Historical Datastore (HDS) - deployment and usage
- Follow the documentation to deploy Historical Datastore.
- You should now be able to access the registry at http://localhost:8085/registry
- Recall that in LinkSmart IoT Hackathon 2018 - Part 1, we configured the Device Gateway to:
- Publish SenML object with entries named
- with topic
- to Broker
You should register two Datasources, one for Temperature and another for Humidity. For each request, you should get a Datasource ID in the response header.
POST to http://localhost:8085/registry
Query the registry and make sure everything is correct.
The URL for the created Datasource: http://localhost:8085/registry/99e60acc-0683-40fc-9b83-408d89727b0c
Get the stored data
We should query the Data API: http://localhost:8085/data/99e60acc-0683-40fc-9b83-408d89727b0c
The data field follows the SenML model. If the array is empty, you may need to wait a while until new data arrives. Remember that the measurements are published every 2 minutes. If you still receive nothing after 2 minutes, check the topic and broker configurations.
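As a sketch of how such a Data API response could be read (the field values here are invented; the `data` wrapper and the old-draft SenML `e` entries follow the description above):

```python
import json

# Hypothetical Data API response; values are invented for illustration.
response = json.loads("""
{
  "data": {
    "bn": "linksmart-black/",
    "e": [
      {"n": "temperature", "u": "Cel", "v": 22.1, "t": 1520000000},
      {"n": "temperature", "u": "Cel", "v": 22.3, "t": 1520000120}
    ]
  }
}
""")

entries = response["data"]["e"]
if not entries:
    # An empty array usually just means no measurement has arrived yet.
    print("No data yet - wait for the next 2-minute publish cycle.")
else:
    latest = max(entries, key=lambda entry: entry["t"])
    print(f"{latest['n']} = {latest['v']} {latest['u']}")  # -> temperature = 22.3 Cel
```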
Now, configure a retention period so the measurements are removed after 1h. By default only retention periods 1h and 1w are supported.
We'll work on the response from step 5 and add a retention period of 1h.
We'll continue with the JSON from step 7 and this time add an aggregation object for mean and max kept for 1w:
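Sketched fragment of the registration body with retention and aggregation added; the field names follow the general shape of the HDS Registry API but should be checked against its documentation:

```json
{
  "retention": "1h",
  "aggregation": [
    {
      "aggregates": ["mean", "max"],
      "retention": "1w"
    }
  ]
}
```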
See the stored aggregated data
First, retrieve the datasource registration:
The aggregation has an id and a data URL.
B) Grafana - deployment, plugin configuration, usage
- Follow the documentation: Grafana Datasource Plugin for Historical Datastore
- Plugin Configuration
- Dashboard / Graph
- Show Temperature data in Table, Gauge (Singlestat), and Graph
Visualization is a key component of IoT systems. Choosing the right visualization for development and production requires assessing the available tools with respect to their flexibility, ease of use, and supported features. This page provides a comparison of two IoT visualization tools: Thingsboard and Grafana.
Available Tools for IoT Visualization
There are a number of IoT visualization tools, including DeviceHive, Grafana, Thinger, and Thingsboard. So far, the comparison covers Thingsboard and Grafana with regard to ease of use, flexibility, and availability of features.
Regarding database requirements, Grafana needs a time-series database (TSDB) such as Graphite, Elasticsearch, CloudWatch, InfluxDB, OpenTSDB, KairosDB, or Prometheus between it and the IoT device. Thingsboard, in contrast, can use its built-in database (i.e., HSQLDB) for development purposes. This frees us from putting a separate database in place for the sole purpose of setting up the visualization system.
Regarding ease of use, with Grafana: i) if the device we want to monitor isn't already attached to a network, we might need a gateway device (e.g., a Raspberry Pi); ii) we need to write a software interface that takes the data from our device via a serial port, MODBUS, CAN bus, or whatever is available; iii) we then send that data over the network to our TSDB; iv) once data is flowing into the database, we start up Grafana, navigate to the hosted site, and connect it to the database via the web interface. In summary, Grafana is challenging to set up at first, but once it is up and running it is a very powerful tool that can give very detailed insights into the data, with a clean and easy-to-use front-end.
On the other hand, in Thingsboard, device management, data collection, processing, and visualization are readily available. It has a built-in database for development purposes and also supports external databases, with Cassandra and PostgreSQL being the recommended ones.
While Thingsboard allows creating rich IoT dashboards for real-time data visualization and remote device control, and has several customizable widgets for building end-user dashboards for most IoT use cases, it does not match the flexibility and extensibility of Grafana.
Demonstration of Thingsboard
Thingsboard is installed and tested with SQL (HSQLDB) and NoSQL (Cassandra), available at . It can be accessed from the local network with username: email@example.com, password: tenant, or username: firstname.lastname@example.org, password: sysadmin.
While this page summarizes the comparison of Thingsboard and Grafana, more visualization tools are yet to be included in order to simplify the choice of a tool for a particular purpose.
Here is a list of some other tools (incomplete; anyone can add to it).
|Tool|Description|Positive points|What is missing|Reference|
|---|---|---|---|---|
|Infozoom|A startup partially located on Fraunhofer premises.|Can load very large data and do basic analytics|Stream data visualization| |
|Kibana|By Elasticsearch| | | |
| |Collaborative visualization tool over the web (asynchronous, distributed)| | |Isenberg P, Elmqvist N, Scholtz J, Cernea D, Ma K-L and Hagen H. Collaborative visualization: definition, challenges, and research agenda. Information Visualization 2011. https://doi.org/10.1177/1473871611412817|
|Web-based Collaboratory|Data warehouse| | |Subramanian S, Malan GR, Shim HS, Lee JH, Knoop P, Weymouth TE, Jahanian F and Prakash A. Software architecture for the UARC web-based collaboratory. IEEE Internet Comput 1999; 3(2): 46–54.|
|TeleMed| | | |Kilman DG and Forslund DW. An international collaboratory based on virtual patient records. Commun ACM 1997; 40(8): 110–117.|
|Particle Physics Data Grid Collaboratory Pilot| | | |US Department of Energy Collaboratories. Particle physics data grid collaboratory pilot. Available from: http://www.doecollaboratory.org/research2/ppdg/homepage.html (last accessed June 2011).|
|Earth System Grid| | | |Bernholdt D, Bharathi S, Brown D, Chanchio K, Chen M, Chervenak A, Cinquini L, Drach B, Foster I, Fox P, Garcia J, Kesselman C, Markel R, Middleton D, Nefedova V, Pouchard L, Shoshani A, Sim A, Strand G and Williams D. The Earth System Grid: supporting the next generation of climate modeling research. Proc IEEE 2005; 93(3): 485–495.|
|National Fusion Collaboratory| | | |Schissel DP, Burruss JR, Finkelstein A, Flanagan SM, Foster IT, Fredian TW, Greenwald MJ, Johnson CR, Keahey K, Klasky SA, Li K, McCune DC, Papka M, Peng Q, Randerson L, Sanderson A, Stillerman J, Stevens R, Thompson MR and Wallace G. Building the US national fusion grid: results from the National Fusion Collaboratory project. Fusion Eng Des 2004; 71(1–4): 245–250.|
|Collaboratory for Multi-scale Chemical Science| | | |Sandia National Laboratories. Collaboratory for multi-scale chemical science. Available from: http://cmcs.org (last accessed 2010).|
|TimeSearcher 1–3|Good candidate for time-series searching|Allows users to specify regions of interest (motif discovery) from a query time series rather than feeding the entire query for matching; user-selected patterns are automatically grouped together. Provides an extended version of timeboxes (variable-time timeboxes), which can identify items whose values stay in a given range over a number of consecutive measurements. Can define a query that is a "reciprocal" of a previously defined query. Supports not only conjunctive queries (value range for all) but also disjunctive ("any-of") queries: timeboxes that find items with a value in the range for at least one time point during the interval. Supports two normalization strategies: a) extreme-normalized, b) deviation-normalized.|Users still need to specify the query regions in order to find similar patterns (motif discovery); users may need prior knowledge about the datasets and a general idea of what is interesting; its limited scalability restricts its utility to smaller datasets and is impractical for the task at hand.| |
The LinkSmart® team is proud to announce Broetchen API. Major releases are maintained to ensure that the involved services are interoperable, well documented and ready to use. Services of a major release work standalone but can also be integrated into a consistent LinkSmart® platform deployment.
Broetchen API is the first major release of next generation LinkSmart®. It includes the basic services for setting up device abstraction, service provisioning, time-series data storage and stream mining/learning functionalities. Broetchen API works seamlessly with the SenML data model.
Figure 1. LinkSmart Platform components. Green ones are released.
Device Gateway (≥1.1.0)
- Pluggable devices support
- Natively compiled for major platforms and architectures, including ARM Linux
- No modification/re-compilation/re-deployment of the DGW for a new device
- Exposure of device capabilities as network services by declaration/configuration
- Payload agnostic device exposure
Service Catalog (≥2.2.6)
- Exposes RESTful API for:
- Registering and updating Services (also MQTT support)
- Browsing the Service Catalog entries
- Retrieving information about specific Services
- Filtering Services
Historical Datastore (≥0.4.2)
- All three APIs fully implemented:
- Registry API: Registry of sensor meta-data and details regarding sensor data storage and aggregation
- Data API: Submission and retrieval of raw sensor measurements
- Aggregation API: Retrieval of aggregated sensor measurements
- Grafana Plugin
IoT Agents (≥1.7.0)
- Learning Agent & Data Processing Agent
- Three APIs fully implemented
- Stream Mining API (Statement API)
- Learning API (CEML API)
- IO API.
- The Statement and CEML APIs are CRUD- and JSON-based, while the IO APIs are write-only (for Input) or read-only (for Output). The APIs are implemented over HTTPS REST and MQTT.
Here we compare two major OGC SensorThings implementations:
- GOST: Release V0.5 (docker release on 05-Mar-2018)
- FROST: Release V1.6 (docker release on 05-Mar-2018)
| |GOST|FROST|
|---|---|---|
|Compliance to OGC| |FROST is fully compliant|
|Docker image size|582MB|14.2MB|
|Memory usage (idle)|463.1MiB|13.7MB|
|Storage|Plans to support storage providers such as MongoDB| |
|Clustering (scalability etc.)|Not supported|Not supported|
The goal of this Hackathon is to become familiar with several LinkSmart components and, at the same time, create something useful.
In the following steps we will set up the Adafruit DHT library and the LinkSmart Device Gateway on a Raspberry Pi 3 in order to read measurements from a DHT22 sensor and publish them in SenML format to an MQTT broker.
1) Set up the DHT Library
- Install the latest stable Docker for debian-stretch (armhf).
- Follow the instructions here (Tip: stretch > jessie > wheezy). Too lazy to follow instructions? Skip to step c.
Optional: Post-installation (manage Docker as a non-root user), so that docker commands run without sudo.
If you didn't go through steps a and b, run the following commands:
Log out and log back in. Then, to verify that Docker CE is installed correctly, run the hello-world image. This command downloads a test image and runs it in a container; when the container runs, it prints an informational message and exits.
Create and enter the directory structure
Download Adafruit_DHT_SenML.py. This script reads temperature and humidity values from a DHT sensor and prints the measurements in SenML format. (You need appropriate drivers to run it)
Use the following command to download an image with DHT drivers and run it once. The command tries to read from a DHT22 sensor with its data pin connected to GPIO 4, using the SenML basename bn/. Remove these arguments to see the usage instructions.
The SenML output should be similar to:
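For illustration only (the values are invented; the entry layout follows the old SenML draft with an `e` array, which the rest of this tutorial also uses, and the basename matches the `bn/` argument above):

```json
{
  "bn": "bn/",
  "e": [
    { "n": "temperature", "u": "Cel", "v": 22.1 },
    { "n": "humidity", "u": "%RH", "v": 45.3 }
  ]
}
```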
If you didn't get an output similar to that, go back and figure out what went wrong.
2) Deploy Device Gateway
We'll use the DHT library container (from previous step) to run Device Gateway. The goal is to execute the Python script that reads the measurements and expose the SenML output over networking protocols.
Download Device Gateway (
device-gateway-linux-arm) and make it executable. Deployment instructions are on the wiki: https://docs.linksmart.eu/display/DGW.
Configure Device Gateway:
Configure the DGW service. Modify it (replace <device-name> with the device name, e.g. linksmart-cyan, and <mqtt-broker-uri> with the broker endpoint) and place it in
Configure the device agent. Modify it (replace each <device-name> with the device hostname, e.g. linksmart-cyan) and place it in
With the following configuration, Device Gateway executes the Python script every 120 seconds and exposes the resulting data over two protocols:
- MQTT: Publishes the sensor data to the given topic. The MQTT broker was configured in step a.
- REST: Exposes a REST endpoint to GET the latest collected data. The HTTP server was configured in the step a. E.g for getting data:
Run the container:
It should run in privileged mode in order to access the Raspberry Pi GPIO.
If there were no errors, create a container that starts on boot and runs in detached mode (in the background):
Refer to docker run reference to understand the given arguments.
3) Try it out
- Subscribe to the correct topic at the broker with the endpoint configured in DGW configuration.
- Get the latest measurement from the REST endpoint. The path comes from the names in the device agent configuration, e.g.