Storage for MQTT

Hello, I want to make a storage system for my MQTT; the plan is to save messages from a topic to a database.
(I'm new to this, so if you could show what kind of setup I need, like a screenshot, I would appreciate it.)

This is what I have right now.

Hello and welcome to the forum.
This category is for people to publish (working) projects they have completed.
It seems you are asking for help on how to start a project, not sharing a project.

What have you done so far? Can you share your flow?

Do some searches on the forum as you should find material that will help you get started.

You'll probably have to share a bit more about your specific objectives so people can guide you.

Finally, what you're asking is a classic application where Node-RED can be used.
Capture some data using MQTT, process it and store it in a database.
You might want to think about how you are going to 'display' the results.
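The capture-process-store pipeline can be sketched in plain Python (the table layout and topic name here are invented for illustration; in Node-RED the subscribing side would be an mqtt-in node wired to a database node):

```python
import json
import sqlite3

# Minimal sketch: persist incoming MQTT messages (topic + payload) to SQLite.
# Table and column names are illustrative, not from this thread.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE IF NOT EXISTS mqtt_log (
           id      INTEGER PRIMARY KEY AUTOINCREMENT,
           ts      TEXT DEFAULT CURRENT_TIMESTAMP,
           topic   TEXT NOT NULL,
           payload TEXT NOT NULL
       )"""
)

def store_message(topic: str, payload: str) -> None:
    """Called for every message received on the subscribed topic."""
    conn.execute("INSERT INTO mqtt_log (topic, payload) VALUES (?, ?)",
                 (topic, payload))
    conn.commit()

# Example: a sensor publishing JSON on a hypothetical topic
store_message("sensors/livingroom", json.dumps({"temp": 21.5, "hum": 48}))
rows = conn.execute("SELECT topic, payload FROM mqtt_log").fetchall()
```

The same shape works against MariaDB or MSSQL with the appropriate client; only the connection and the autoincrement syntax change.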

I've updated the category.

You need to share with us what you are wanting to do with the data. For example, many of us use InfluxDB - which is a timeseries database - to record sensor, control and other data over time. We use MQTT for the current values and InfluxDB for historic values, great for charting.

But depending on the data and its purpose, you might want a more traditional relational database. We can't really tell at the moment.
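For the InfluxDB route, points are written using its line protocol. A rough sketch of building such a line (measurement, tag, and field names are made up; string escaping and integer type suffixes are omitted for brevity):

```python
def to_line_protocol(measurement, tags, fields, timestamp_ns=None):
    """Build an InfluxDB line-protocol string: measurement,tags fields [ts]."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    line = f"{measurement},{tag_str} {field_str}"
    if timestamp_ns is not None:
        line += f" {timestamp_ns}"
    return line

# e.g. a reading arriving on an MQTT topic such as sensors/livingroom
line = to_line_protocol("environment",
                        {"room": "livingroom"},
                        {"humidity": 48, "temperature": 21.5})
# line == "environment,room=livingroom humidity=48,temperature=21.5"
```

In practice you would hand this to an InfluxDB client or the node-red-contrib-influxdb node rather than build it by hand, but it shows the tags-for-metadata, fields-for-values split that makes time-range charting easy.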

I used to use InfluxDB for storing sensor data, but I gave up on it because I couldn't work out the complex queries needed for analysis.

Now I use MariaDB, a proper relational database.
It runs very well on a Raspberry Pi and happily handles hundreds of thousands of records.

It takes a bit of thinking to get the databases right - though I published some guidance to the forum on that some while back that should make things simpler.
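One common shape for such tables, sketched here with sqlite3 standing in for MariaDB (the table and column names are my own, not the published guidance; MariaDB's syntax is nearly identical):

```python
import sqlite3

# A common two-table layout for sensor logging: one row per sensor,
# one row per reading, joined by a foreign key.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE sensor (
        sensor_id INTEGER PRIMARY KEY,
        name      TEXT UNIQUE NOT NULL,
        location  TEXT
    );
    CREATE TABLE reading (
        reading_id INTEGER PRIMARY KEY,
        sensor_id  INTEGER NOT NULL REFERENCES sensor(sensor_id),
        ts         TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP,
        value      REAL NOT NULL
    );
    CREATE INDEX idx_reading_sensor_ts ON reading(sensor_id, ts);
""")

db.execute("INSERT INTO sensor (name, location) VALUES (?, ?)",
           ("temp1", "living room"))
db.execute("INSERT INTO reading (sensor_id, value) VALUES (1, 21.5)")

# Typical query: the most recent reading, with its sensor name
latest = db.execute("""
    SELECT s.name, r.value
      FROM reading r JOIN sensor s USING (sensor_id)
     ORDER BY r.ts DESC LIMIT 1
""").fetchone()
```

The index on (sensor_id, ts) is what keeps per-sensor time-range queries fast once the reading table grows into the hundreds of thousands of rows.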

But the queries are easy - IF you use Grafana :grinning: That's because it is point-and-click; there is no need to actually learn much of the InfluxDB query language. You can build a query in Grafana and click a button to see the actual query text, in case you want to use it in Node-RED, for example.

I'm sure that's right, Julian, but I can just about work out the SQL for things like
"what is this value as a percentage of the value when it last increased by more than 50% in 30 minutes?"

Don't want to spend the time learning that in Grafana, pretty though it is.

Also, I found the combination of Node-RED, InfluxDB and Grafana too much for a Raspberry Pi. Grafana was especially resource-heavy.

Fair enough :slight_smile: I do struggle with queries if I have to write one manually. They are similar to SQL but not quite the same, which can certainly be confusing.

But then, queries don't need to be created often and most of them are pretty trivial. I did just have to look up how to find out how many data points exist in a measurement (equivalent of a relational table). For me, 56.5k in my weekly table (1 minute sensor data) and 86k in my 5yr table (hourly with max/min/avg). There is a much bigger table that Telegraf populates with system data which mostly updates every 10s or so though I only keep that data for 15d. There are dozens of tables with many values (columns).

Odd, I never found that. I ran that combination and more on a Pi2 and a Pi3 for years. The only problems I had were when I let InfluxDB get too big. Thankfully it has great features for automatically summarising data and trimming it. I had tried various databases prior to that but it is really hard to beat the date/time range processing built into InfluxDB.

Incidentally, my Mosquitto broker is currently running at around 430 msgs per minute (30kbpm) inbound and 500 or so (55kbpm) outbound.

Take a look at this link: MQTT Broker to build enterprise IoT Applications with AI/ML

Depending on your needs: we found that Node-RED works well as a system that logs all MQTT messages to our remote MSSQL database, with customised table storage for the particular messages you want to keep separately. The broker is installed on the MSSQL database server. Using MQTT, this easily keeps up with asynchronous systems reporting simultaneously, without the blocking issues you get from direct SQL calls to the remote SQL Server, i.e. no missed logs.

It will write all your broker messages to a database "right out of the box".

They have a range of sized solutions for your needs.

EDIT: They appear to have changed the free model to cloud hosting only.

All of my MQTT messages contain JSON. For ease of future manipulation, I simply store the raw JSON into a text field. (I use a text field because sometimes the incoming JSON may be invalid, and if so I get errors.) I then use generated fields in the database to pull the information that I want out of the saved MQTT JSON. By doing it this way, the JSON coming from MQTT can mature and yet the data structure can be maintained and enhanced.
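A sketch of that pattern, with sqlite3 standing in for the real database and read-time extraction standing in for generated columns (the field names are invented):

```python
import json
import sqlite3

# Store the raw MQTT payload as text, even when it is not valid JSON,
# and pull fields out afterwards. In MariaDB a generated column can do
# the extraction inside the table; here it is done at read time.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE msg (id INTEGER PRIMARY KEY, raw TEXT NOT NULL)")

for payload in ['{"temp": 21.5}', 'not json at all']:
    db.execute("INSERT INTO msg (raw) VALUES (?)", (payload,))

def extract_temp(raw):
    """Return the temp field, or None if the stored text was not valid JSON."""
    try:
        return json.loads(raw).get("temp")
    except (json.JSONDecodeError, AttributeError):
        return None

temps = [extract_temp(raw) for (raw,) in db.execute("SELECT raw FROM msg")]
# temps == [21.5, None]: the bad payload is kept, but yields no value
```

The point of the design survives the translation: the raw message is never lost, so new JSON fields can be adopted later without re-ingesting anything.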

I have used databases for 40 years. But sometimes, when the dataset is small, a database might be overkill and a simple array might be sufficient.

I use an array to store sensor states that do not need to be summarised (in a UI). (I store more important sensor states in objects that I can navigate around in real time in the NR debugger.) Note that none of these sensor states are persistent across reboots of the processor or NR.

I have a handler to add a sensor state to the array and another to query the sensor state. It's all very simple (and I am happy to share it), and I only have about 50 sensor states stored in the array.

In my case (with processor only 3% busy and less than 500 array elements) it makes no sense to me to store this transient (non persistent) info in a database.
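A sketch of that kind of store, with invented handler names (not the poster's actual code; in a Node-RED function node the same thing would live in flow or global context):

```python
# A tiny in-memory sensor-state store: one handler to record a state and
# one to query it. Deliberately not persistent; everything is lost on
# restart, which is the trade-off described above.
sensor_states = []

def set_state(name, value):
    """Add a sensor's state to the array, or update it in place."""
    for entry in sensor_states:
        if entry["name"] == name:
            entry["value"] = value
            return
    sensor_states.append({"name": name, "value": value})

def get_state(name):
    """Return the stored value, or None if the sensor is unknown."""
    for entry in sensor_states:
        if entry["name"] == name:
            return entry["value"]
    return None

set_state("front_door", "closed")
set_state("front_door", "open")   # updates in place, no duplicate entry
```

At 50-odd entries the linear scan is irrelevant; a dict keyed by sensor name would be the natural next step only if the count grew a lot.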


I would be interested in seeing how you have implemented it as I reckon it might be the answer to storing data from some of my home sensors (that aren't critical). At the moment I use MySQL on remote servers.

It is hard to beat InfluxDB for storing sensor data. Bit of a faff to set up initially but then it doesn't need any maintenance at all.

Hi Richard,
Just as an aside... I've been investigating low-cost EEPROMs to connect to my Raspberry Pi Pico.
For low-update rates and small data-sets these devices might be worth considering.
The chip I've been looking at is the 24LC512 from Microchip: 512 Kb (64K × 8 bits).
The price here in the UK is GBP 1.50, and there are even cheaper prices for smaller memory sizes.
You can perform single byte read/write or page writes up to 128-bytes.
I've ordered a few in order to do some experiments with the Pico.

Quick update. I've been experimenting with a 24LC32 (a 32 Kb EEPROM organised as 4K × 8 bits) and can write and read back JSON strings, so that means I could store and recover Python dictionaries like these...
(Screenshot: Screen Shot 12-30-21 at 10.05 AM)
Here's a snapshot of what it looks like inside the EEPROM.
(Screenshot: Screen Shot 12-30-21 at 09.56 AM)
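The JSON side of that round trip can be sketched in plain Python; the I2C transfer itself is device-specific and omitted here, and the 32-byte page size is an assumption for the smaller 24LCxx parts:

```python
import json

PAGE_SIZE = 32  # assumed bytes per page write for a small 24LCxx EEPROM

def dict_to_pages(d):
    """Serialise a dict to JSON bytes and split into EEPROM-sized pages."""
    data = json.dumps(d).encode("utf-8")
    return [data[i:i + PAGE_SIZE] for i in range(0, len(data), PAGE_SIZE)]

def pages_to_dict(pages):
    """Rejoin the pages read back from the EEPROM and parse the JSON."""
    return json.loads(b"".join(pages).decode("utf-8"))

# Round-trip: what would be written page-by-page over I2C, then read back
settings = {"ssid": "mynet", "interval_s": 60, "sensors": ["temp", "hum"]}
pages = dict_to_pages(settings)
restored = pages_to_dict(pages)
```

Each element of `pages` fits in one page-write transaction, which is what keeps the write count (and wear) down compared with single-byte writes.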

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.