I would like to develop something but I don't have an idea of the right way to do it.
I am collecting data on a local network and writing it to a CSV file.
Now I would like to send that data to MySQL on a VPS at the same time, BUT if I lose the Internet connection I lose the data for that period.
The "proper" professional way to do it is with a message-queue system (not lightweight MQTT but a full enterprise version), such as the one from MuleSoft. But that is certainly heavyweight. It would run on your local network.
Otherwise you have to build your own queueing system. How up to date does the DB need to be? Could you write the data periodically? If so, you could rotate the local CSV files regularly and have a separate flow merge the data into MySQL and then move the CSV file to a "done" folder, tidying up old files as needed.
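The rotation step can be as simple as a timestamped rename into an "outbox" folder that the merge flow watches. A minimal sketch; the function name and folder layout are my own invention, so adjust to suit:

```shell
#!/bin/sh
# Rotate the CSV that Node-RED is currently writing into an "outbox"
# folder of files waiting to be merged into MySQL.
# Usage: rotate_csv LIVE_FILE OUTBOX_DIR  (both paths are hypothetical).
rotate_csv() {
    live=$1
    outbox=$2
    mkdir -p "$outbox"
    # Only rotate if there is data; the timestamped name keeps rotated
    # files unique and sortable for the merge flow.
    if [ -s "$live" ]; then
        mv "$live" "$outbox/data-$(date +%Y%m%d-%H%M%S).csv"
    fi
}

# Example: rotate_csv /data/current.csv /data/outbox
```

Run it from an exec node or from cron at whatever interval matches how up to date the DB needs to be.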
+1 for node-red-contrib-queue-gate. The "Retain until processed" example in the node's readme shows exactly how to handle the situation you describe. It can even retain the queue over a Node-RED restart if that is required.
First thing to think about - does it really matter if you lose some data? What difference will it actually make to you? If it is commercial data, OK. But if it is home automation data, so what? Also, how often do you lose your Internet connection?
Connecting to a remote DB every 30 seconds is quite frequent. To do it with a file upload you will need a batch script, and you will need to know how to bulk-load a CSV into MySQL; it's a long time since I did that, so I can't tell you how, I'm afraid. If doing it that way, I would personally only use Node-RED to create the files. I would run the batch job from cron and make it do the following:
1. Check for a file in the folder you are using to output from Node-RED.
2. If one is present, check for an Internet connection.
3. Connect to the remote DB.
4. Upload the CSV to MySQL.
5. If the command succeeds, move the processed file to an archive folder.
6. Loop back to step 1 in case more than one file is waiting.
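The steps above can be sketched as a small shell script run from cron. Everything here is illustrative, not a tested setup: the connection variables, the table name `readings`, the folder layout, and the use of `LOAD DATA LOCAL INFILE` for the bulk load are all my assumptions, so adapt the mysql invocation to your own schema.

```shell
#!/bin/sh
# Merge queued CSV files into MySQL and archive them (steps 1-6 above).
# DB_HOST/DB_USER/DB_PASS/DB_NAME and table `readings` are assumed names.

check_net() {
    # Step 2: crude reachability test against the DB host.
    ping -c 1 -W 2 "$DB_HOST" >/dev/null 2>&1
}

upload_csv() {
    # Steps 3-4: one common way to bulk-load a CSV into MySQL.
    mysql --host="$DB_HOST" --user="$DB_USER" --password="$DB_PASS" \
        --local-infile=1 "$DB_NAME" \
        -e "LOAD DATA LOCAL INFILE '$1' INTO TABLE readings
            FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'"
}

merge_outbox() {
    outbox=$1
    archive=$2
    mkdir -p "$archive"
    for f in "$outbox"/*.csv; do            # step 6: loop over every waiting file
        [ -e "$f" ] || return 0             # step 1: nothing queued
        "${CHECK_CMD:-check_net}" || return 1   # step 2: offline, retry next run
        if "${UPLOAD_CMD:-upload_csv}" "$f"; then
            mv "$f" "$archive/"             # step 5: archive on success
        else
            return 1                        # leave the file queued for the next run
        fi
    done
}
```

The `CHECK_CMD`/`UPLOAD_CMD` hooks just let you swap in a different connectivity test or transfer method without touching the loop. Schedule it from crontab at whatever interval suits, e.g. every five minutes.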
You could enhance that by making the batch job call out to a Node-RED endpoint (use http-in/http-out nodes for that) so that you can trigger other flows depending on success or failure.
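Such a callout is just an HTTP request. Assuming an http-in node listening on a path like `/upload-status` (the endpoint and parameter names are invented here), the batch job could notify Node-RED with curl; the `DRY_RUN` switch prints the command instead of sending it, so you can check it without a running Node-RED:

```shell
#!/bin/sh
# Report batch-job results to a Node-RED http-in endpoint.
# NR_ENDPOINT and the status/file parameters are illustrative names.
NR_ENDPOINT="${NR_ENDPOINT:-http://127.0.0.1:1880/upload-status}"

notify_nodered() {
    # $1 = "success" or "failure", $2 = file that was processed.
    if [ "${DRY_RUN:-0}" = "1" ]; then
        # Print what would be sent instead of sending it.
        echo "curl -s -X POST $NR_ENDPOINT -d status=$1&file=$2"
    else
        curl -s -X POST "$NR_ENDPOINT" -d "status=$1&file=$2"
    fi
}

# Example: notify_nodered success data-20240101-120000.csv
```

The flow behind the http-in node can then switch on `msg.payload.status` and do whatever follow-up you need.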
As you can see, doing everything within Node-RED using the gate and mysql nodes is likely to be easier to set up. Though once you have built the batch job, you will have a lot of very reusable knowledge beyond just Node-RED.
If you do use a Node-RED mysql node, I suggest you look up how to do "prepared statements" in MySQL; they will make your updates a lot faster when doing them individually or in small batches.
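As an illustration of the prepared-statement pattern on the MySQL side (the table `readings` and its ts/value columns are invented; inside the Node-RED mysql node you would instead put placeholders in the query and bind values from the message, so check that node's readme for its exact binding syntax), here is a sketch that prepares once and executes once per CSV row, which is where the speed gain comes from:

```shell
#!/bin/sh
# Emit prepared-statement SQL for every row of a CSV; pipe the output to
# the mysql client. This is a sketch: real data would need proper quoting
# and escaping of the values before interpolating them into SQL.
sql_for_csv() {
    # Prepare once, then EXECUTE once per line -- that reuse is what makes
    # prepared statements faster than re-parsing an INSERT per row.
    echo "PREPARE ins FROM 'INSERT INTO readings (ts, value) VALUES (?, ?)';"
    while IFS=, read -r ts value; do
        echo "SET @ts = '$ts', @value = '$value';"
        echo "EXECUTE ins USING @ts, @value;"
    done < "$1"
    echo "DEALLOCATE PREPARE ins;"
}

# Example: sql_for_csv readings.csv | mysql sensordb
```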