Edge-to-Cloud computing

For some time now, I've used my Pi just to collect local sensor data and forward it via MQTT to a 'free' Oracle cloud server, which does all the heavy lifting: storing it in a database, visualising it in Grafana, etc. So last month's FlowForge webinar 'Build an Edge-to-Cloud Solution with the MING Stack' was of interest to me (unfortunately it hasn't been saved on the FlowForge website, but here is a link from Influx).

One issue I have encountered is that if the internet temporarily fails, the edge-to-cloud connection drops and I lose data, which appears as gaps in charts etc. So I reached out to Jay Clifford (Developer Advocate, InfluxData), who kindly came up with a great solution.

Jay suggested passing the sensor data into the node-red-contrib-influxdb node, configured to output its data via a local IP/port to Telegraf, which is also installed on the Pi.
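
For anyone wanting to replicate it, the Telegraf input side looks something like the sketch below. The plugin name and port here are illustrative: I'm assuming the 2.x write API via the influxdb_v2_listener plugin (there's an equivalent influxdb_listener plugin for the 1.x API), and the port just needs to match whatever you enter in the node-red-contrib-influxdb server config:

```toml
# telegraf.conf (input side) - a sketch, adjust plugin/port to your Node-RED setup
# Accepts InfluxDB 2.x write requests from the node-red-contrib-influxdb node
[[inputs.influxdb_v2_listener]]
  ## Must match the address/port configured in the Node-RED influxdb node
  service_address = ":8086"
```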

Telegraf in turn then forwards the data via HTTP directly to InfluxDB, which is hosted in the cloud alongside Grafana.
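
The output side is the standard InfluxDB output plugin (I'm assuming InfluxDB 2.x here); the URL, organisation, bucket and token below are placeholders for your own cloud instance:

```toml
# telegraf.conf (output side) - placeholder values, not real ones
[[outputs.influxdb_v2]]
  urls = ["https://influxdb.example.com:8086"]  # placeholder cloud URL
  token = "${INFLUX_TOKEN}"  # API token generated in InfluxDB
  organization = "my-org"    # placeholder
  bucket = "sensors"         # placeholder
```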

Telegraf has a configurable internal buffer, which stores data that has not yet been successfully acknowledged by InfluxDB. The default setting is 10,000 metrics, which in my case is sufficient to cover most network downtime.
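
The buffer size is set in the agent section of telegraf.conf. A sketch, with an illustrative figure: at ~8,000 metrics per hour, the default 10,000 covers roughly 75 minutes of downtime, so raise it if you expect longer outages:

```toml
[agent]
  ## Metrics held in memory while the output is unreachable.
  ## Default is 10000 (~75 min at 8,000 metrics/hour);
  ## increase it to ride out longer outages, at the cost of some RAM.
  metric_buffer_limit = 25000  # illustrative value
```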

To test, I set up two feeds, one using MQTT and the other using Telegraf as above.
I fed approx 8,000 metrics per hour to both, then disabled the internet for approx 1 hour before re-enabling it.

Looking at just one of the feeds, the MQTT data is represented by white circles and the Telegraf data by a blue line. As can be seen, the buffer quickly back-filled all of the metrics from that period as soon as the network resumed.


Oops! Just corrected errors in my post above.

Hi Paul,

Thanks for sharing this idea! Very interesting! As you say, this solves quite a lot of the problems caused by cloud solutions not being available all the time. I'd never heard of the Telegraf agent for InfluxDB...

Although there is a little voice in my head that says: instead of installing the Telegraf agent for InfluxDB, why not simply install a local InfluxDB? Of course, you will then need to do more of the management yourself.

I have to admit that your setup makes it more difficult to choose between a local and a cloud InfluxDB.
Damn you Paul :yum:

Because my Pi uses an SD card, not an HDD, and writing that amount of data to the SD card would surely shorten its life.
Also, Grafana is installed on the same cloud server as InfluxDB, so loading Grafana dashboards is extremely fast... much faster than from a Pi.

[screenshot: Grafana dashboard]

Hmm ...

And what is your retention period? I assume it will become costly if you keep all your data. Or can you aggregate the data after some time, e.g. roll minute values up into hour or day values?

It varies depending on whether it's energy, weather, etc. But for example, I've currently set the energy data retention to 12 months (although I'll probably make it shorter).

The 'free' Oracle server comes as standard with:

Arm-based Ampere A1 cores and 24 GB of memory
2 Block Volumes Storage, 200 GB total
10 GB Object Storage – Standard
10 GB Object Storage – Infrequent Access
10 GB Archive Storage

...so I don't think I'll be running out of free storage any time soon!

Yes, although I'm not doing that at the moment as it's still a work in progress; I intend to when I get some time & motivation :wink:


Do we know anything about the security Telegraf provides? I could not find any information at the link above. It sounds great to use, but I would like to be sure the security is sufficient for the data sent to the DB (plain HTTP does not sound sufficient). I would like to see support for HTTPS, certificates, TPM2, etc.

HTTP(S) is one of many output plugins available in Telegraf.
HTTPS is available, as are many other protocols.
Have a look at Plugin directory | Telegraf 1.27 Documentation to view the rather impressive list.

I'm using the (Telegraf) InfluxDB output plugin to send data to the cloud.
The authentication token referred to in its config is the token created by InfluxDB, and of course TLS can additionally be configured.
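
For example, the standard Telegraf client TLS options can be added to the output section (the URL and CA path below are placeholders for your own instance and certificate):

```toml
[[outputs.influxdb_v2]]
  urls = ["https://influxdb.example.com:8086"]  # placeholder URL
  token = "${INFLUX_TOKEN}"
  ## Standard Telegraf TLS options
  tls_ca = "/etc/telegraf/ca.pem"  # placeholder path to the server's CA cert
  insecure_skip_verify = false     # keep false so the certificate is verified
```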

Thanks for your answer.
I think for "home usage" a token is maybe OK for a typical home automation system, but when I have discussed this with network security professionals, they say that using tokens in a production/industrial solution is not recommended. Microsoft say the same when you connect to Azure. From what I have understood, the (currently) most secure option is to use a device with a TPM2 chip, but that is not (yet) available in an RPi; you find it in modern computers and in some IoT gateways. If you use certificates, you would normally prefer to change/update those from time to time, but I do not know how easy that would be in a distributed solution, or whether it could be automated.
