Hello, I'm writing from Argentina; apologies in advance, my English is not very good.
Now to the important part: in this web directory, http://ionos.ingv.it/Tucuman/data/, a new text file (.txt) appears every 20 minutes with data that I would like to store in a MySQL database and then work with.
Could you help me figure out how Node-RED can fetch, every 20 minutes, the contents of the most recent file hosted there?
To read the data from a txt file and send it to a MySQL database, I built the following flow:
[{"id":"d7326c9.647d79","type":"http request","z":"a33f0c24.bb1ef","name":"","method":"GET","ret":"obj","paytoqs":false,"url":"http://ionos.ingv.it/Tucuman/data/TUJ2O_20192822100.txt","tls":"","persist":false,"proxy":"","authType":"basic","x":300,"y":246,"wires":[["9cea0664.0c58b8"]]}]
But what I need is to automate this so that the flow always downloads the most recent file available at that URL and loads its data into the database I set up.
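What I'm imagining (and please correct me if this is the wrong approach) is: an inject node set to repeat every 20 minutes, an http request node that downloads the directory listing at http://ionos.ingv.it/Tucuman/data/ as a plain string, a function node that picks the newest .txt file name out of that listing and puts the full URL in msg.url, and a second http request node with an empty URL field (so it uses msg.url) that downloads the file itself and passes it on to the MySQL part of the flow. A rough sketch of that middle function node, assuming the directory is a normal HTML index page and that the timestamp in the file names sorts correctly as text:

    // Function node: msg.payload is the HTML directory listing returned by an
    // http request node (set to return a UTF-8 string) pointed at
    // http://ionos.ingv.it/Tucuman/data/

    // Collect every .txt file name that appears as a link in the listing.
    const matches = msg.payload.match(/href="([^"]+\.txt)"/gi) || [];
    const files = matches.map(m => m.replace(/^href="/i, "").replace(/"$/, ""));

    if (files.length === 0) {
        node.warn("No .txt files found in the directory listing");
        return null;
    }

    // Assumption: the timestamp embedded in the file name is fixed-width,
    // so a plain text sort puts the newest file last.
    files.sort();
    const latest = files[files.length - 1];

    // Hand the full URL to the next http request node (URL field left empty
    // so it takes msg.url).
    msg.url = "http://ionos.ingv.it/Tucuman/data/" + latest;
    return msg;

Does that sound like a reasonable way to do it, or is there a simpler one?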
I hope I've explained myself clearly.
Thank you