InfluxDB as a Mechanism for Storing Sensor Readings

Storing Temperature in the Database

It never hurts to have a database, so let’s try storing the temperature measurements in one. Let’s see whether InfluxDB can be used on the Raspberry Pi.

Install

At the time of writing, the dev-lang/go package was at version 1.10.1, which did not work on ARM, so I had to edit the configuration files [1] and install go as:


emerge -1 '=dev-lang/go-9999'
emerge dev-db/influxdb
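
The edit from [1] amounts to accepting the keyword of the live ebuild; a sketch of the line in /etc/portage/package.accept_keywords (the exact atom may differ on your system):


=dev-lang/go-9999 **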

Let’s check: (screenshot: checking the InfluxDB installation)

There is a temptation to quickly sketch out the simplest possible database, integrate it with Kotlin, and then with some system for drawing beautiful charts … Stop. We are dealing with a database, even if only for educational purposes, so let’s practice protecting it from unauthorized access.

Create a superuser (username and password changed):


> create user spade with password 'super-password' with all privileges;

In /etc/influxdb/influxdb.conf we set auth-enabled = true and restart the daemon. After that we verify that authorization actually works: (screenshot: authorization check)
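
For reference, the relevant fragment of influxdb.conf; in stock 1.x configs the option lives in the [http] section:


[http]
  # require authentication for the HTTP API
  auth-enabled = true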

HTTPS

For my purposes a self-signed certificate is enough (do not forget to set CN to localhost):


ROOT@pi64 ~# openssl req -x509 -nodes -newkey rsa:2048 \
  -keyout /etc/ssl/influxdb-selfsigned.key \
  -out /etc/ssl/influxdb-selfsigned.crt -days 360

We add this certificate to the Java keystore (the password is changeit, unless you have changed it):


cd /etc/ssl
openssl x509 -in influxdb-selfsigned.crt -outform der -out influxdb-selfsigned.der
cd /usr/lib64/icedtea8/bin
./keytool -import -alias mykeyroot -keystore /usr/lib64/icedtea8/jre/lib/security/cacerts -file /etc/ssl/influxdb-selfsigned.der                      
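
To make sure the import worked, the alias can be listed back (keytool will ask for the same keystore password):


./keytool -list -alias mykeyroot -keystore /usr/lib64/icedtea8/jre/lib/security/cacerts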

Edit /etc/influxdb/influxdb.conf:


  https-enabled = true
  https-certificate = "/etc/ssl/influxdb-selfsigned.crt"
  https-private-key = "/etc/ssl/influxdb-selfsigned.key"

After restarting the daemon, we check from a normal user account: (screenshot: testing HTTPS)
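
Since the certificate is self-signed, both the CLI and curl have to be told to skip verification; the check is something along these lines:


influx -ssl -unsafeSsl -host localhost
curl -k https://localhost:8086/ping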

We need to add the influxdb-java dependency to build.gradle:


dependencies {
    compile "org.jetbrains.kotlin:kotlin-stdlib-jdk8:$kotlin_version"
    compile 'org.jetbrains.kotlinx:kotlinx-coroutines-core:0.22.5'
    compile 'org.influxdb:influxdb-java:2.9'
}

To begin with, a tiny program to test the connection to the database:


import org.influxdb.InfluxDBFactory
import kotlin.system.exitProcess

fun main(args: Array<String>) {
    println("*** Raspberry Pi Influxdb ***")

    val influxDB = InfluxDBFactory.connect(DB_SERVER, DB_USER, DB_PASS)
    influxDB.run {
        println("Connected to db.")
        close()
    }

    exitProcess(0)
}

const val DB_SERVER = "https://localhost:8086"
const val DB_USER = "spade"
const val DB_PASS = "*******"

Result: (screenshot: Kotlin and InfluxDB)

We return to the console and create our database, plus dedicated users: one that can only write to it (for the sensors) and one that can only read it (for Grafana):


rabbit@pi64 ~/src/yrabbit-java/thermdb % influx -ssl -unsafeSsl -host localhost
Connected to https://localhost:8086 version unknown
InfluxDB shell version: unknown
> auth
username: spade
password: 
> create database thermdb
> create user sensor with password '********'
> grant write on thermdb to sensor
> create user grafana with password '********'
> grant read on thermdb to grafana
> show users
user   admin
----   -----
spade  true
sensor false
grafana false
> exit

DB structure

The temps measurement has a very simple structure:

  Field/tag   Type
  sensor_id   string (tag)
  temp        float (field)
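
For the record, a single point in InfluxDB line protocol looks roughly like this (sensor-01 is a made-up tag value, the trailing number is a nanosecond timestamp):


temps,sensor_id=sensor-01 temp=21.5 1523351479217000000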

We make the following policy the default: raw data is kept for 2 hours; data downsampled to 15-minute intervals is kept for a month; data downsampled to hourly intervals is kept for 4 years and then removed completely:


> create retention policy two_hours on thermdb duration 2h replication 1 default
> create retention policy one_month on thermdb duration 4w replication 1
> create retention policy four_years on thermdb duration 208w replication 1
> create continuous query cq_15m on thermdb begin select mean(temp) as mean_temp into one_month.downsampled_temps from temps group by time(15m),* end
> create continuous query cq_4w on thermdb begin select mean(mean_temp) as mean_temp into four_years.year_temps from one_month.downsampled_temps group by time(1h),* end
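
It does not hurt to check what was actually created:


> show retention policies on thermdb
> show continuous queries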

As a result we have the following measurements:

  Measurement        Data interval  Stored for how long
  temps              raw data       2 hours
  downsampled_temps  15 minutes     1 month
  year_temps         1 hour         4 years

For clarity, here are the same downsampling queries in a more readable form:


create continuous query cq_15m on thermdb
  begin 
    select mean(temp) as mean_temp 
    into one_month.downsampled_temps 
    from temps 
    group by time(15m),* 
  end

create continuous query cq_4w on thermdb 
  begin 
    select mean(mean_temp) as mean_temp
    into four_years.year_temps
    from one_month.downsampled_temps
    group by time(1h),*
  end

Test data recording

We return to Kotlin and try to add a couple of records to the database under the new user.


package io.github.yrabbit.kotlin

import kotlinx.coroutines.experimental.delay
import kotlinx.coroutines.experimental.runBlocking
import org.influxdb.BatchOptions
import org.influxdb.InfluxDBFactory
import org.influxdb.dto.Point
import java.util.concurrent.TimeUnit
import kotlin.system.exitProcess

fun main(args: Array<String>) {
    println("*** Raspberry Pi Influxdb ***")

    val influxDB = InfluxDBFactory.connect(DB_SERVER, DB_USER, DB_PASS)
    influxDB.run {
        println("Connected to db")
        setDatabase(DB_NAME)
        setRetentionPolicy(DEFAULT_RETENTION)
        enableBatch(BatchOptions.DEFAULTS.flushDuration(FLUSH_INTERVAL))

        runBlocking {
            influxDB.write(Point.measurement(RAW_TEMP_MEASUREMENT)
                    .tag("sensor_id", "test sensor")
                    .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                    .addField("temp", 0.123)
                    .build())
            delay(1000)
            influxDB.write(Point.measurement(RAW_TEMP_MEASUREMENT)
                    .tag("sensor_id", "test sensor")
                    .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                    .addField("temp", 0.723)
                    .build())
        }
        close()
    }

    exitProcess(0)
}

const val DB_SERVER = "https://localhost:8086"
const val DB_USER = "sensor"
const val DB_PASS = "************"
const val DB_NAME = "thermdb"
const val RAW_TEMP_MEASUREMENT = "temps"
const val DEFAULT_RETENTION = "two_hours"

const val FLUSH_INTERVAL = 10 * 60 * 1000 // 10m

After a few runs and leaving the Raspberry Pi to work for a while, we get:


rabbit@pi64 ~ % influx -precision rfc3339 -ssl -unsafeSsl -host localhost                       
Connected to https://localhost:8086 version unknown
InfluxDB shell version: unknown
> auth
username: spade
password: 
> use thermdb
Using database thermdb
> select * from temps
name: temps
time                     sensor_id   temp
----                     ---------   ----
2018-04-10T09:11:19.217Z test sensor 0.123
2018-04-10T09:11:20.265Z test sensor 0.723
2018-04-10T09:11:31.814Z test sensor 0.123
2018-04-10T09:11:32.856Z test sensor 0.723
2018-04-10T09:11:38.395Z test sensor 0.123
2018-04-10T09:11:39.439Z test sensor 0.723
2018-04-10T09:11:54.984Z test sensor 0.123
2018-04-10T09:11:56.026Z test sensor 0.723
...
2018-04-10T10:59:37.558Z test sensor 0.123
2018-04-10T10:59:38.609Z test sensor 0.723
2018-04-10T11:00:14.761Z test sensor 0.123
2018-04-10T11:00:15.804Z test sensor 0.723
> select * from one_month.downsampled_temps
name: downsampled_temps
time                 mean_temp           sensor_id
----                 ---------           ---------
2018-04-10T09:00:00Z 0.42299999999999993 test sensor
2018-04-10T09:15:00Z 0.42300000000000004 test sensor
2018-04-10T09:30:00Z 0.42300000000000004 test sensor
2018-04-10T09:45:00Z 0.423               test sensor
2018-04-10T10:00:00Z 0.42300000000000004 test sensor
2018-04-10T10:15:00Z 0.42299999999999993 test sensor
2018-04-10T10:30:00Z 0.42300000000000004 test sensor
2018-04-10T10:45:00Z 0.423               test sensor
> select * from four_years.year_temps
name: year_temps
time                 mean_temp           sensor_id
----                 ---------           ---------
2018-04-10T10:00:00Z 0.42300000000000004 test sensor
>

Excellent! The raw data and both levels of downsampling can be seen. The following output was taken the next morning: the raw data has already expired, while the 15-minute and hourly readings have been generated.


> select * from temps
> select * from one_month.downsampled_temps
name: downsampled_temps
time                 mean_temp           sensor_id
----                 ---------           ---------
2018-04-10T09:00:00Z 0.42299999999999993 test sensor
2018-04-10T09:15:00Z 0.42300000000000004 test sensor
2018-04-10T09:30:00Z 0.42300000000000004 test sensor
2018-04-10T09:45:00Z 0.423               test sensor
2018-04-10T10:00:00Z 0.42300000000000004 test sensor
2018-04-10T10:15:00Z 0.42299999999999993 test sensor
2018-04-10T10:30:00Z 0.42300000000000004 test sensor
2018-04-10T10:45:00Z 0.423               test sensor
2018-04-10T11:00:00Z 0.423               test sensor
2018-04-10T11:15:00Z 0.423               test sensor
2018-04-10T12:00:00Z 0.423               test sensor
2018-04-10T12:15:00Z 0.423               test sensor
> select * from four_years.year_temps
name: year_temps
time                 mean_temp           sensor_id
----                 ---------           ---------
2018-04-10T10:00:00Z 0.42300000000000004 test sensor
2018-04-10T11:00:00Z 0.423               test sensor
2018-04-10T12:00:00Z 0.423               test sensor
> 

Let’s add sensor polling to the program, plus a clean shutdown of the database connection, and leave it running until tomorrow :smiley:


package io.github.yrabbit.kotlin

import kotlinx.coroutines.experimental.delay
import kotlinx.coroutines.experimental.runBlocking
import org.influxdb.BatchOptions
import org.influxdb.InfluxDBFactory
import org.influxdb.dto.Point
import java.io.File
import java.util.concurrent.TimeUnit
import kotlin.system.exitProcess

fun main(args: Array<String>) {
    println("*** Raspberry Pi Influxdb ***")

    val influxDB = InfluxDBFactory.connect(DB_SERVER, DB_USER, DB_PASS)
    influxDB.run {
        println("Connected to db")
        // exit correctly
        Runtime.getRuntime().addShutdownHook(Thread {
            run {
                println("Finish.")
                influxDB.close()
            }
        })

        setDatabase(DB_NAME)
        setRetentionPolicy(DEFAULT_RETENTION)
        enableBatch(BatchOptions.DEFAULTS.flushDuration(FLUSH_INTERVAL))

        runBlocking {
            while (true) {
                val sensors = findSensors()
                sensors.forEach { sensor_id ->
                    val (therm, status) = readSensor(sensor_id)
                    if (status) {
                        influxDB.write(Point.measurement(RAW_TEMP_MEASUREMENT)
                                .tag("sensor_id", sensor_id)
                                .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                                .addField("temp", therm)
                                .build())
                    }
                }
                delay(SENSOR_READ_INTERVAL)
            }
        }
    }

    exitProcess(0)
}

/*
 * Find all sensors in /sys/bus/w1/devices/
 */
fun findSensors(): List<String> {
    val dir = File(SENSORS_PATH)
    val fileNames = dir.list().filter { name -> name.startsWith("28-")}
    return(fileNames)
}

/*
 * Read sensor value
 * status = true -> data Ok
 * status = false -> error
 */
data class ThermResult(val therm: Double, val status: Boolean)

fun readSensor(path: String): ThermResult {
    var status = false
    var therm = 0.0
    try {
        val sensorData = File("$SENSORS_PATH/$path/w1_slave").readLines()
        if (sensorData.size == 2) {
            // the first line ends in "YES" when the CRC check passed
            if (sensorData[0].endsWith("YES")) {
                // the second line ends in "t=<temperature in thousandths of a degree>"
                therm = sensorData[1].substringAfter("t=").toDouble() * 0.001
                status = true
            }
        }
    } catch (e: Exception) {
        // unreadable or malformed sensor file -> status stays false
    }

    return ThermResult(therm, status)
}

const val DB_SERVER = "https://localhost:8086"
const val DB_USER = "sensor"
const val DB_PASS = "look"
const val DB_NAME = "thermdb"
const val RAW_TEMP_MEASUREMENT = "temps"
const val DEFAULT_RETENTION = "two_hours"

const val FLUSH_INTERVAL = 10 * 60 * 1000 // 10m
const val SENSOR_READ_INTERVAL = 10 * 1000 // 10 seconds

const val SENSORS_PATH = "/sys/bus/w1/devices"
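
For reference, the w1_slave file of a DS18B20 sensor contains two lines roughly like the following (values made up): a CRC status line ending in YES and a line with the temperature in thousandths of a degree, which is exactly what readSensor parses:


72 01 4b 46 7f ff 0e 10 57 : crc=57 YES
72 01 4b 46 7f ff 0e 10 57 t=23125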

Grafana

Let’s visualize the data. The easiest way is to use Grafana:


~# emerge grafana-bin 

And then the usual black magic follows: although we install this package, we do not actually use the server binary from it, because it does not work and I am too lazy to figure out why. So we build the server from source and replace the executable. Note that the whole build is done as a normal user:


~% go get github.com/grafana/grafana
~% cd ~/go/src/github.com/grafana/grafana
~% go run build.go setup
~% go run build.go build
~% cp ~/go/bin/grafana-server /tmp 
~% sudo mv /tmp/grafana-server /usr/bin/
~% sudo rc-service grafana start

In the browser, open the page http://localhost:3000. The username and password are admin/admin (or whatever is set in /etc/grafana/grafana.ini [2]). (screenshot: start screen)
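
The defaults referenced in [2] live in the [security] section of /etc/grafana/grafana.ini:


[security]
admin_user = admin
admin_password = admin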

Add the data source: (screenshot: adding a data source)

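For comparison, the same data source can be described in a Grafana provisioning file, a sketch assuming a Grafana version with provisioning support (the file path, e.g. /etc/grafana/provisioning/datasources/thermdb.yaml, depends on your setup):


apiVersion: 1

datasources:
  - name: thermdb
    type: influxdb
    url: https://localhost:8086
    database: thermdb
    user: grafana
    secureJsonData:
      password: "********"
    jsonData:
      # the certificate is self-signed, so skip TLS verification
      tlsSkipVerify: true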

Next, add the simplest possible panel: (screenshots: creating the panel)

And how do you like this beauty? :smiley: (screenshot: the finished panel)

  1. Elementary changes to /etc/portage/package.accept_keywords and the like (well, Gentoo veterans know :wink:).

  2. It’s a good idea to change the superuser name and password.