I'm working on a flow to build a bot that retrieves a lot of data from a website: it loops an HTTP request via the nbrowser node, and on each iteration it retrieves the data through a few steps. For reasons I don't understand, the call sometimes freezes at some point: sometimes a click doesn't take me to the right page but just refreshes the current one, and sometimes the bot can't find the right selector even though it's on the correct page. I have no idea why.
That wouldn't be a big deal, because I built the flow to work around this problem by inserting a catch node: whenever an error occurs, whatever it might be, the node simply restarts the step.
And that's good.
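A simplified sketch of the idea in a function node wired after the catch, with a retry cap added so one bad address can't loop forever (`msg.retries` and `msg.vatNumber` are illustrative names, not anything special in Node-RED):

```javascript
// Function node wired after the catch node (illustrative sketch).
// Count how many times this message has already failed so that a
// single bad address is skipped instead of retrying forever.
msg.retries = (msg.retries || 0) + 1;

if (msg.retries > 5) {
    // too many failures: log it and skip this VAT number
    node.warn("giving up on " + msg.vatNumber);
    return [null, msg];   // output 2: skipped items
}

return [msg, null];       // output 1: back to the request step
```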
The problem is that I have to repeat this operation thousands of times (about 3000-4000 addresses), so after only a few calls the flow stops working and gives me this error:
"Message exceeded maximum number of catches"
So here's the question: can I remove the maximum number of catches allowed by the catch node?
Thanks for any help or suggestion!
I have a list of VAT numbers (I think that's the correct translation? English is not my first language), and I need to get the addresses of the corresponding companies.
What do you mean by "why so quickly"?
It does not make all the calls at once: I put all the VAT numbers in an array and then loop over it, making one call at a time, about 3000 times. I thought about simply re-injecting the flow when it crashes, but it crashes after 20-30 calls, so that's not good enough: I need the bot to be usable once a month or so, with similar numbers. That's why I need a way to get around the maximum number of catches.
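Something along these lines, simplified (the `vatQueue` context key and the "load" topic are illustrative names): the list sits in flow context, and each completed call triggers the next one.

```javascript
// Illustrative sketch of the feeder node: the VAT list lives in
// flow context and one number is emitted per trigger, so only one
// request is ever in flight at a time.
let queue = flow.get("vatQueue") || [];

if (msg.topic === "load") {
    // initial injection: msg.payload is the full array of VAT numbers
    queue = msg.payload.slice();
}

const next = queue.shift();
flow.set("vatQueue", queue);

if (next === undefined) return null;  // list exhausted, stop

msg.payload = next;
return msg;
```

The end of the scraping steps is wired back to this node's input, so the next number is only requested once the previous one has finished (or failed for good).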
No - you cannot alter this. It is hard-coded at 10, and it's there to stop Node-RED from entering an infinite error loop and causing further instability. You have an error loop that needs to be fixed.
Personally, and I mean this in the nicest way, you should really fix the fault at its source rather than trying to work around Node-RED's error handling.
You may want to use a queue that only fires off the next request once the previous one has completed successfully, to avoid effectively DDoSing the serving end... maybe something like this: Node red flow queuing - #3 by Colin. That flow is self-throttling and goes as fast as it can, but one request at a time.
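If you don't want to import that flow, the core idea is a simple gate in a function node, roughly like this sketch (the `queue`/`busy` context keys and the "done" topic are just illustrative):

```javascript
// Minimal gate sketch (not the exact flow linked above): hold
// messages in a queue and release the next one only when a "done"
// message arrives from the end of the scraping steps.
const queue = context.get("queue") || [];
const busy = context.get("busy") || false;

if (msg.topic === "done") {
    // previous request finished: release the next queued message
    const next = queue.shift();
    context.set("queue", queue);
    if (next === undefined) {
        context.set("busy", false);  // nothing left, go idle
        return null;
    }
    return next;
}

if (busy) {
    queue.push(msg);                 // a request is in flight: wait
    context.set("queue", queue);
    return null;
}

context.set("busy", true);           // idle: pass this one straight through
return msg;
```

Messages arriving while a request is in flight get queued; a "done" message from the end of the flow releases the next one.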
I did try to fix the errors, of course.
I really don't understand why those errors occur. It seems to depend on the website, because they look completely random.
I found lots of API services that do what you suggest, but they are not free. I was looking for a way to do it with Node-RED, but it seems it's not the best tool for web scraping at this scale...