Generate a CSV data file every 30 minutes

Dear Node-RED colleague,
I am not an expert in Node-RED or other programming languages. Regarding the attached flow, I would like someone to enlighten me: is it possible to create a folder every 30 minutes? I just need a clue.

Auto Logger: log data in daily CSV files

That's a pretty big ask looking at the flow.
(but I'm guilty of asking similar, so don't worry)

But I want to cut what you are doing into smaller pieces because I think you are looking too hard at the bigger picture and so are getting confused.

First off:
"Every 30 minutes" needs to be qualified - do you mean on the hour and half hour, or 30 minutes after the flow starts?
I only mention it because I have been bitten by this myself.

What structure do you want for the folder name?
Personally I would say something like this:

So what you could do is set what is called a flow context variable to the file name.

Then when you send a payload to be written to a log file, you put it through a node that gets the name and uses it to qualify the path.

And the file name?

So to step back a bit, here's what you would do:

Every 30 minutes, generate a path/directory name.
Save this to flow context, and also create a folder with that name.

Then when you want to write to the file, you take the message and pass it through a change node to get the path, then send it to a write file node.
So the path is set in part of the message.

(from the docs for that node)

filename string
    If not configured in the node, this optional property sets the name of the file to be updated.
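Putting those steps together, here is a minimal sketch of the "every 30 minutes" branch, assuming an Inject node set to repeat every 30 minutes feeds a Function node. The `/data/logs` base path and the `logfile` context key are illustrative names, not anything built into Node-RED:

```javascript
// Function-node sketch: build a dated directory and a time-stamped
// file name, and remember the full path in flow context.
// (Written as a plain function here so it can run outside Node-RED.)

function pad(n) {
  return String(n).padStart(2, "0");
}

// e.g. new Date(2022, 0, 30, 8, 30) -> "/data/logs/2022-01-30/0830.csv"
function makeLogPath(now) {
  const dir = `/data/logs/${now.getFullYear()}-${pad(now.getMonth() + 1)}-${pad(now.getDate())}`;
  return `${dir}/${pad(now.getHours())}${pad(now.getMinutes())}.csv`;
}

// In the real Function node you would then do something like:
//   flow.set("logfile", makeLogPath(new Date()));
// and create the directory, e.g. by sending the path to an exec node
// running mkdir -p, or letting the write file node create it for you.
```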

But I don't know how good you are with understanding what I mean.

Of course others may have other ideas.

Hi. Thank you for hopping in. Actually I want to generate a file every 30 minutes, not a folder. Sorry for the confusion.
For my final-year project, the plan is to save the chiller's operational data to a database as CSV files, then feed Grafana from the database.
If anything happens to the chiller, we need to be able to see the data from 1 hour before in Grafana.
Ideally I would like one file per 10 minutes, so the gap between Grafana and real time is not so big.

Again, that isn't much different from what I said in my first reply.

No problems.

So you take that message which is generated every 30 minutes - you will have to work on the exact name - and use it as the filename for the write file node.
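In Function-node terms, the logging branch could look something like this sketch. The `logfile` flow-context key is an assumption (set elsewhere by the 30-minute branch), and `flow` is passed in as a parameter here only so the sketch runs standalone - in a real Function node it is provided for you:

```javascript
// Sketch of the Function node in front of the write file node: fetch
// the current file name from flow context and attach it to the message.

function addFilename(msg, flow) {
  // Leave the write file node's Filename field blank so msg.filename wins.
  msg.filename = flow.get("logfile") || "/data/logs/fallback.csv";
  return msg;
}
```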

Very little change.

Do you understand what I mean?

I could put together a basic example, but it would still leave a lot for you to do.


There are many nodes that will trigger an event on a schedule. The easiest being the inject node which can certainly be set to fire a msg every 30min.

As Andrew says, break the problem down into its logical steps and work each one out. There are nodes for saving/updating files and nodes for updating data, ...

Not wishing to be rude but unless your project is about internet forums, you are probably expected to work this out yourself and explain how you did it in your dissertation?


Why not push the data directly to database?

Seems arcane to write to CSV just to then import it into a database.


As Steve suggests, you should be looking at using something like Influxdb, not csv files.


According to my supervisor, he would like the CSV file sent into the database to save memory. The raw data might be bigger than CSV.

It is much more likely the other way round.


Do you actually have a database (MySQL, Influxdb, MSSQL etc)?
Or does your supervisor consider a directory full of csv files to be a database?

So you will create a file every 10 minutes?
And when your supervisor says he wants the gap just 2 minutes? 30 seconds?

Use a proper database.
You could write CSV files as well to keep your supervisor happy, just don't do anything with them.

Certainly CSV files are good for data exchange but poor for managing data. :slight_smile:

Until they add something in the middle or need 2 types of data & hack the format to suit etc :laughing:

IMHO, CSV is a terrible interchange format. I always recommend something extensible like JSON or XML before CSV.

There are always reasons for other formats of course, not suggesting otherwise but CSV provides a pretty universal, simple, relatively robust exchange format that is easily incorporated into other systems. But only for tabular data in a character format.

XML and JSON are for more structured data. So actually I would recommend the right format for the right data :slight_smile:

I do use phpPgAdmin. Currently I pump the raw data into the database, but my supervisor said the raw data is big. He asked me to pump CSV files into the database to save some memory.
A file will be generated every 30 minutes.

For example, file 1 is generated at 202201300800 = 08:00am 30/01/2022.
The second file will be generated at 202201300830 = 08:30am 30/01/2022.
When the second file is generated, I will send the first file into the database, and that file will be deleted after 15 minutes. The process repeats.

Right now I am only able to generate a file at a 1-hour interval. I wish to make the interval shorter, 30 minutes.
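The yyyymmddHHMM naming scheme described above could be produced with a sketch like this (flooring to the nearest half hour is an assumption, added so that names land exactly on :00 and :30):

```javascript
function pad(n) {
  return String(n).padStart(2, "0");
}

// Build a name like 202201300800 from a Date, floored to the
// nearest 30-minute boundary.
function stampName(now) {
  const mins = now.getMinutes() < 30 ? "00" : "30";
  return (
    `${now.getFullYear()}${pad(now.getMonth() + 1)}` +
    `${pad(now.getDate())}${pad(now.getHours())}${mins}`
  );
}
```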

I see. That means the raw data is smaller than CSV? Even when the raw data is pulled using Modbus? If so, I need to find a source on this matter. :wink:

The file names would not be the same; each file name will be based on the date and time it is generated. I hope you can share more to help me understand what I am missing.

In order to get the data from your CSV file into the database you have to read it into memory, and it's 30 minutes of data instead of a single record.
And every record has to be processed twice. Once by Node-red to CSV, once from CSV to Postgres
I suppose the database batch upload might be written in C for maximum speed.

Whether you send data direct to a database or write it to CSV, the end result will be the same. In fact, as others have mentioned, the CSV data WILL be bigger (due to padding, separators etc.)

So reading this makes me think...

Does your supervisor mean "Pushing data from ModBus direct to database is generating too many entries", and has he therefore suggested that you write 1 sample every 30 minutes?

If I am right, then using CSV is not the answer. Simply store the modbus value (in memory context for example) then use an inject node to send the last stored value at xx:00 and xx:30
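As a rough sketch of that idea - the `lastReading` key is illustrative, and a plain Map stands in for Node-RED's flow context so the sketch runs standalone:

```javascript
// Two-part sketch: one Function node stores each incoming Modbus
// reading; an Inject node firing at xx:00 and xx:30 triggers another
// Function node that emits the latest stored value.

const context = new Map(); // stand-in for flow context

// Wired after the Modbus read node: keep only the newest reading.
function storeReading(msg) {
  context.set("lastReading", msg.payload);
  return null; // swallow the message; nothing is written yet
}

// Fed by the half-hourly Inject node: pass the latest reading on
// to the database node.
function emitReading(msg) {
  msg.payload = context.get("lastReading");
  return msg;
}
```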
