I have been asked to document my home sensor network. Being married to a person with a background in web security sets boundary conditions:
- No cloud. We are running all services locally.
- No control, only metrics.
I am collecting data from a number of plugs with power meters over Wifi, using the MQTT protocol. I am also collecting data from a number of temperature sensors over Zigbee, which is converted to MQTT. The MQTT data is ingested into Influx, and then read and plotted in Grafana. All of this is dockered and runs locally on an Ubuntu server.
What happened so far
- Plugs with Wifi: In which I am asking what kind of power plug to use to collect usage data.
- Gosund and Tasmota: In which I describe how to convert Gosund SP 111 plugs to Tasmota.
- Air Quality Sensors: In which I ask for Air Quality sensors, specifically with CO2 level metrics.
I have a rather old server machine at home that acts as a file server and docker host.
This machine is hosting a ConBee from Dresden Elektronik:
The instructions say to connect the ConBee to the machine using a USB cable, to keep it away from the machine's HF interference. Supposedly this improves the reach of the antenna a lot.
Dresden Elektronik ConBee attached to the Ubuntu fileserver using a USB cable.
In my case, the ConBee can see the sensors on the top floor and the floor below, but cannot reach the shed or the ground floor.
This is fixed using any Zigbee device with mains power, because they all act as Zigbee signal repeaters, forming a mesh network. The exception here is Philips Hue equipment, which in general does not do this.
What I did was purchase a few TRÅDFRI signal repeaters. Each is a USB-to-USB adapter that acts as the actual signal repeater, plus a USB power plug to drive it. You can plug the entire device chain into a wall socket, or run the bare USB-to-USB adapter from any USB socket that happens to be available.
The repeaters need to be paired with the ConBee. That is done by putting the ConBee into open mode, and then pressing the recessed button inside the small hole of the TRÅDFRI signal repeater with a SIM tool or a paper clip. The single LED in the repeater will dim and blink, and 30s later the gateway has picked up the new device.
The repeaters have substantially better antennas than the ConBee. I purchased three, estimating I would need them to chain from the top floor across the first floor and the ground floor into the shed, but actually one repeater on the first floor would have been sufficient. On the other hand these devices are 10 Euro each, so I do not really care.
I am using Aqara sensors which report temperature, humidity and pressure. The pressure sensors are impressive - they are long term stable, and they register when I take down a sensor from the second floor to the ground floor. The devices are extremely small - they are powered by a CR2032 cell (eg IKEA PLATTBOJ), and can run off it for around a year. The device is about twice as thick as the CR2032, and only slightly larger than the battery.
They come with two double sided adhesive rings, have a pairing button on one side and a tiny blue indicator LED that only signals pairing and is otherwise unused.
They report measurements asynchronously as the data changes, which makes MQTT a much better suited protocol than HTTP.
They are available on Amazon from a number of makers, at vastly different prices and delivery times. I assume that a 100-pack of these directly from Shenzhen comes in at 2.50 Euro/pc or so - individual sensors incl. shipping come in at $6.50 at AliExpress. Here, here and here are some sources, but I have seen them as low as 10-12 Euro/pc.
A wireless mouse was delivered in a container made from these plastic shells. I used one to protect the Aqara sensor against the weather.
The Aqara sensor is not really an outdoor sensor. It works well on the east side of the house, but the west side is the local weather side, and needs more water protection. I know this because the first sensor on this side of the house drowned and shorted out after a rainstorm. I have mounted the replacement sensor in an upside-down, open plastic container, and then glued the container to the top bar of a window. This seems to work fine.
Aqara sensor mounted on the top bar of a window. The plastic container is open to the bottom, but acts as a shield against rain and weather.
I bought a very large number of Gosund SP111 Wall Plugs and converted them to Tasmota using Tuya-Convert. The plugs are attractive because they can do the full 16A and are small enough to fit next to each other on a regular plug extender, if the slots are angled at 45deg (the device has a button, and on small extenders the Gosunds touch each other, pressing each other's buttons).
This is a solder-free conversion, using the WiFi in a Raspberry Pi 4. Newer versions of this plug no longer convert this way, and require soldered wires for the initial reflash.
On the other hand, DeLOCK WLAN power sockets are the same device and come pre-tasmotified.
In any case you are looking at around 20 Euro/plug.
The plugs connect to the local 2.4 GHz Wifi, and after tasmotification speak HTTP(S) and MQTT(S).
I need a database for time series that collects measurements from the sensors and the wall plugs. In my case that is Influx. Visualization is from Influx to Grafana, because that is known to work well.
The MQTT transport is implemented using Mosquitto, again, because that is known to work well.
Data transport from Mosquitto to Influx can be done with Telegraf, or a small Python script - I started out with the Python, but only later learned that it could have been done completely in Telegraf. My setup still uses the Python.
Conversion and transport from Zigbee to MQTT is done using zigbee2mqtt.
All data and config resides in /export/iot in my setup.
I am providing sufficient storage for about one year of data. I am using XFS as a file system, because while it has higher commit latency than ext4, it has close to no jitter and consequently much more predictable performance.
This is an LVM2 partition on the data volume group, which contains two Samsung EVO 860 4 TB drives.
Created this way:
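The original commands are not preserved here; a hedged sketch, with device names, the logical volume name, and the size being my assumptions (only the data volume group and the XFS file system are named in the text):

```
pvcreate /dev/sdb /dev/sdc
vgcreate data /dev/sdb /dev/sdc
lvcreate --name iot --size 4T data
mkfs.xfs /dev/data/iot
mkdir -p /export/iot
mount /dev/data/iot /export/iot
```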
I am using docker-compose to set this up and run it. This works remarkably well, and there is no need to run K8s on a single box, anyway.
A deployment is specified in a file docker-compose.yml, which lists a number of containers and optionally a local virtual network segment. The deployment can make use of environment variables, which can be put into a file named .env (dotenv) in the same directory.
We make use of this:
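The actual values are not preserved here; a sketch of such a .env file - only TZ and MQTT_PORT are named in the text, the other variables and all values are assumptions:

```
TZ=Europe/Berlin
MQTT_PORT=1883
INFLUX_PORT=8086
GRAFANA_PORT=3000
```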
These are being used in the docker-compose.yml in the same directory. We specify the version, and enumerate the services we want.
Here is our first service:
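A hedged reconstruction of the Mosquitto service entry - the image name and the data/log volume paths are my assumptions; the ports, UID, hostname, and config path follow the text:

```yaml
version: "3"
services:
  mosquitto:
    image: eclipse-mosquitto        # image name assumed
    container_name: mosquitto
    hostname: mosquitto
    user: "1000"
    ports:
      - "${MQTT_PORT}:1883"
    volumes:
      - /export/iot/mosquitto/mosquitto.conf:/mosquitto/config/mosquitto.conf
      - /export/iot/mosquitto/data:/mosquitto/data   # path assumed
      - /export/iot/mosquitto/log:/mosquitto/log     # path assumed
    restart: unless-stopped
```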
We are defining a container named “mosquitto”, which uses the docker internal hostname “mosquitto”. Our other services will be able to connect to this container using this hostname. We try to run this as the user with the UID 1000, but unfortunately this is mostly a vain effort using Docker - a lot of stuff needs to run privileged. The server inside the container is running on port 1883, and we make it available on the outside on $MQTT_PORT from the dotenv.
We overwrite the internal config file /mosquitto/config/mosquitto.conf with the file /export/iot/mosquitto/mosquitto.conf, and so on.
We also define a restart rule.
Next up: we want an Influx instance.
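Sketched along the same lines as the Mosquitto service - the image tag, the variable name INFLUX_PORT, and the volume path are assumptions:

```yaml
  influxdb:
    image: influxdb:1.8             # version assumed
    container_name: influxdb
    hostname: influxdb
    ports:
      - "${INFLUX_PORT}:8086"
    volumes:
      - /export/iot/influxdb:/var/lib/influxdb   # path assumed
    restart: unless-stopped
```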
Again, we map the internal port (8086) to what is defined in the dotenv, and we overwrite the internal data directory with one on the host.
We can prove this works: Enter the docker container, create a file, leave the docker container, see the file.
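For instance (container name and volume paths are assumptions):

```
docker exec -it influxdb /bin/sh
# inside the container:
touch /var/lib/influxdb/proof
exit
# back on the host:
ls -l /export/iot/influxdb/proof
```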
This gets us an instance of Influx.
Next then, the Grafana instance:
We can only run Grafana when Influx is up and running, so we provide a depends_on attribute to the service. Again, we map the port, and the various directories of interest inside the container are made available to the outside.
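A hedged sketch of such a Grafana entry - image name, variable name, and volume path are assumptions; the depends_on attribute follows the text:

```yaml
  grafana:
    image: grafana/grafana          # image name assumed
    container_name: grafana
    hostname: grafana
    depends_on:
      - influxdb
    ports:
      - "${GRAFANA_PORT}:3000"
    volumes:
      - /export/iot/grafana:/var/lib/grafana   # path assumed
    restart: unless-stopped
```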
The bridge from Zigbee to MQTT is defined as before:
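A hedged sketch of the zigbee2mqtt service - image name, device path, and data path are my assumptions; privileged mode, the ro udev mount, and the TZ variable follow the text:

```yaml
  z2m:
    image: koenkk/zigbee2mqtt       # image name assumed
    container_name: z2m
    hostname: z2m
    privileged: true
    devices:
      - /dev/ttyACM0:/dev/ttyACM0   # ConBee device file, path assumed
    volumes:
      - /export/iot/z2m:/app/data   # data path assumed
      - /run/udev:/run/udev:ro
    environment:
      - TZ=${TZ}
    restart: unless-stopped
```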
This one is special, because it needs access to the device file of the ConBee inside the container. So the container is privileged, and we import the device file and a ro instance of udev. It also needs access to the TZ environment variable from the dotenv.
Our Zigbee2MQTT data is in /export/iot/z2m.
Our Python project, mqttbridge
The final component is our mqttbridge, which is a dockerized Python script we provide. Alternatively we could have used telegraf for this, but I realized that only later.
Our script resides in /export/iot/bridge, and runs with the hostname and container name mqttbridge, as UID 1000. It makes sense to run it only when it can read data from Mosquitto and write to Influx, so we give it a depends_on attribute for both. In the ./bridge directory, we provide a Dockerfile and the script:
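A hedged reconstruction of such a Dockerfile, matching the description that follows (the WORKDIR and requirements file name are assumptions):

```dockerfile
FROM python:3.8-alpine
WORKDIR /app
COPY requirements.txt /app/
RUN pip install -r requirements.txt
COPY bridge.py /app/
CMD ["python", "-u", "bridge.py"]
```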
This makes use of the Python 3.8 Alpine base image. We copy our requirements file into the container, run pip to install the requirements, copy the bridge.py file into the /app directory and then start that script with the appropriate options (-u: run Python unbuffered).
The actual script understands the channels I use internally for Aqara, Mijia, and Gosund data.
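As an illustration, a minimal sketch (not the actual script) of the core step such a bridge performs: turning an MQTT topic and its JSON payload into an InfluxDB line-protocol string. The topic layout, the measurement name, and the tag name are assumptions:

```python
import json


def mqtt_to_line(topic: str, payload: bytes) -> str:
    """Convert an MQTT message into InfluxDB line protocol.

    E.g. topic 'zigbee2mqtt/bathroom' with payload
    {"temperature": 21.5, "humidity": 40.2} becomes
    'sensors,location=bathroom temperature=21.5,humidity=40.2'.
    Measurement ('sensors') and tag ('location') are assumed names.
    """
    # Use the last topic segment as the location tag.
    location = topic.split("/")[-1]
    data = json.loads(payload)
    # Keep only numeric fields; booleans are ints in Python, so exclude them.
    fields = ",".join(
        f"{key}={value}"
        for key, value in data.items()
        if isinstance(value, (int, float)) and not isinstance(value, bool)
    )
    return f"sensors,location={location} {fields}"
```

The real script would subscribe via an MQTT client, call something like this per message, and write the result to InfluxDB's write endpoint.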
The various components need configuration.
We also need a /export/iot/mosquitto/users file; we create it and a first user with mosquitto_passwd -c /export/iot/mosquitto/users mqttuser.
The logfile in /export/iot/mosquitto/log/mosquitto.log will need expiration.
Influx will need a bit more configuration, but this is interactive, and a bit longer. It will be provided in another article.
Grafana comes up empty, and needs an admin password and manual configuration after initial startup. To make things work, we need to configure our InfluxDB as a server-side data source (Grafana connects to InfluxDB; the browser does not).
We can then use the hostnames we defined to access the database:
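For example, in the data source form - assuming the Influx container's hostname is influxdb, and with the database name being an assumption:

```
URL:      http://influxdb:8086
Access:   Server (default)
Database: sensors
```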
As soon as we have data in InfluxDB, we can start to define dashboards.
And this is where we start configuration for real, the entry point for our data: once everything is running, we should see a configuration.yaml in the Zigbee2MQTT data directory.
It will look somewhat like this:
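A hedged sketch of that configuration.yaml - credentials and the serial port path are assumptions; the other settings match the points discussed below:

```yaml
homeassistant: false
permit_join: true
mqtt:
  base_topic: zigbee2mqtt
  server: mqtt://mosquitto
  user: mqttuser
  password: secret          # assumed
serial:
  port: /dev/ttyACM0        # ConBee device file, path assumed
```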
- We run without home assistant, so we disable this.
- We need to open up z2m to allow devices to join, so permit_join needs to be true.
- Our mosquitto is at the host of that name.
- We post data to zigbee2mqtt.
- We need to tell z2m where our ConBee is located, as a device.
When we have done that, and brought the entire apparatus up, we will be able to have devices join, and they are being added to the configuration.yaml.
After that we can edit the file to give them proper names.
We cd /export/iot and docker-compose build, then docker-compose up the entire thing.
Among all the other things we also get our bridge process and the node process from z2m:
We can switch to the z2m log directory and tail the log:
We could now try to have a device join the setup, by holding it close to the antenna and pressing the button. For the IKEA repeaters, we need to push a paper clip into the small hole at the front; for the Aqara devices, we press the button.
We should see a log message for each of the devices. After everything has joined, we can docker-compose stop z2m; service networking restart, edit the configuration.yaml back to the closed state (permit_join: false), and name the devices.
z2m will now post messages to the MQTT bus, building the topic from the base topic and the device name. The payload here is JSON, which we parse and push into InfluxDB. Or let telegraf do that, automatically.
Why service networking restart? For some reason, when downing or stopping services from docker-compose, docker-compose will unconfigure the routes to my Ubuntu box. I then need to connect a keyboard, log in, and manually run service networking restart to fix this. I have not yet found out why this happens. Chaining the restart onto the stop command keeps the connection, so I do not need to fix the Ubuntu box by hand.
Listening to the bus
Using the mosquitto_sub command, we can listen to multiple topics on the bus. For that we need to provide the -t option one or more times. Topics can be literal like zigbee2mqtt/bathroom/SENSOR, or can contain single-level wildcards (+) or a multilevel wildcard (#). We run mosquitto_sub in one window, and tail -f /export/iot/z2m/log/*/log.txt in a second window. When log messages appear in log.txt, we should also see JSON payloads being dumped by mosquitto_sub.
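A hedged example of such a subscription - host and credentials are assumptions; -v prints the topic along with each payload:

```
mosquitto_sub -h localhost -u mqttuser -P secret -v -t 'zigbee2mqtt/#'
```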
We now know that zigbee2mqtt is up and running, has a configuration and is posting messages to the MQTT, using the proper topic names.
Debugging the topology
Zigbee is a mesh network. Our IKEA signal repeaters will act as intermediate relays, forwarding the messages to the coordinator.
We can dump the current topology, as seen by the zigbee2mqtt gateway, by running the following command in one window:
and running the matching pub command in a second window:
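The original commands are not preserved here; a hedged sketch using zigbee2mqtt's (older) networkmap topics, which are my assumption:

```
# window 1: wait for one rendered map and save it
mosquitto_sub -t 'zigbee2mqtt/bridge/networkmap/graphviz' -C 1 > map.dot

# window 2: request the map in graphviz format
mosquitto_pub -t 'zigbee2mqtt/bridge/networkmap' -m 'graphviz'

# then render the map
sfdp -Tpng map.dot > map.png
```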
The sfdp command is part of the Graphviz package.
The result is a network map. The map is always only a current snapshot, and the actual configuration may vary depending on network conditions.
The MQTT network map shows the devices and how they connect to each other. Devices without an intermediate link connect directly to the coordinator.
To be continued with a section on Influx configuration.
I could not have done this alone
There are many people who have helped me to set this up. The one that stands out most is Marianne Spiller. She wrote Smart Home mit openHAB 2, the book on openHAB and home automation, and while I did not go down that route, she inspired me to research the topic. She also motivated me to look into docker-compose and she is running the fantastic #tabsvongesternnacht Stammtisch every Friday. Thank you, @sys_adm_ama.
Another useful resource was https://github.com/Nilhcem/home-monitoring-grafana and the accompanying article Home sensor data monitoring with MQTT, InfluxDB and Grafana, which I took as a blueprint for my setup.