FTP error - too many connections

Hi,
I'd like to upload a generated CSV file to my FTP server to make it available to another project.
The CSV file is generated every 5 minutes, so it also has to be uploaded every 5 minutes.

It works a few times, but then I receive the error "Sorry, the maximum number of clients are already connected". I tried 3 FTP plugins, but I always face the same error. Do I have to close the connection after the upload, and if yes, how?
Right now I use node-red-contrib-advanced-ftp.

I appreciate your help!

A quick look at the code makes it look OK, as it does a conn.end() in seemingly the right places. You might need to raise an issue on GitHub.
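
For reference, the basic pattern the node should be following looks roughly like this (a sketch using the plain "ftp" npm package that node-red-contrib-advanced-ftp appears to wrap; host, credentials and paths are placeholders):

```javascript
// Upload a file and close the control connection afterwards,
// using the "ftp" npm package. All connection details are placeholders.
const Client = require('ftp');

function uploadCsv(localPath, remotePath, done) {
    const conn = new Client();

    conn.on('ready', () => {
        conn.put(localPath, remotePath, (err) => {
            conn.end();      // close the connection whether the put worked or not
            done(err);
        });
    });

    conn.on('error', (err) => {
        conn.end();
        done(err);
    });

    conn.connect({
        host: 'ftp.example.com',
        user: 'myuser',
        password: 'mypassword'
    });
}

// e.g. called from a flow that fires every 5 minutes
uploadCsv('/data/output.csv', '/public/output.csv', (err) => {
    if (err) console.error('FTP upload failed:', err);
});
```

If an upload like that is not followed by an end(), every 5-minute cycle leaves another session open until the server's connection limit is hit.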

Okay, thanks. I get the same issue with other FTP plugins, but I assume they all use the same underlying FTP code.

I used to see this all the time with FTP, not related to Node-RED; it depends on the quality of the FTP back-end. Not all FTP servers are made equal. Even with a session timeout explicitly set on the back-end, we would see FTP sessions linger for no logical reason. If you increase the number of concurrent sessions allowed on the FTP server, does the problem take longer to appear? Do any of the FTP clients allow definition of a session id? Are you always using the same id, or anonymous? Using anonymous can be a factor as well. I seem to recall that one of the old security risks was one anonymous session disconnecting and a following anonymous session being able to reconnect to the previous session that was believed to be retired.

Hi,
I am not allowed to modify the FTP settings on the server side, since it is a simple web hosting package. I tried another server, a pro one, but that is also managed by the hoster. It worked for a few injections, but then my IP got banned by the server. Even my FTP program was not able to connect, so I had to reconnect to get a new IP address.
I use credentials for FTP authentication, not anonymous.

I assume I have to look for something else. I only want a CSV file to be available for download, but it has to be available via a GET request. I wanted to use my Nextcloud server or Google Drive, but then it is not possible to use the data directly with GET.

It looks like there are very strict rules on the FTP server side. If you can't change them, then you may have to use another option. Could you use SFTP instead? You would need an SSH server on the server side with SFTP allowed. Should be doable nowadays...
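
If the hosting allows SSH, a minimal SFTP upload could look like this (a sketch assuming the ssh2-sftp-client npm package; host, credentials and paths are placeholders):

```javascript
// Minimal SFTP upload sketch using the ssh2-sftp-client npm package.
// Connection details and paths are placeholders.
const SftpClient = require('ssh2-sftp-client');

async function uploadCsv() {
    const sftp = new SftpClient();
    try {
        await sftp.connect({
            host: 'sftp.example.com',
            port: 22,
            username: 'myuser',
            password: 'mypassword'
        });
        await sftp.put('/data/output.csv', '/upload/output.csv');
    } finally {
        await sftp.end();   // always release the session, even on failure
    }
}

uploadCsv().catch(err => console.error('SFTP upload failed:', err));
```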

That tells you that you are probably doing something bad.

Firstly, if you are doing authenticated FTP without TLS (e.g. not SFTP or FTPS), it wouldn't be surprising to get a ban as this is sending your credentials in the clear over the Internet. Not a good thing to do in this age.
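
If the server supports explicit FTPS (FTP over TLS), the session can at least be encrypted. A quick connectivity test might look like this (a sketch, again assuming the "ftp" npm package; whether it works depends on the server accepting AUTH TLS):

```javascript
// Quick FTPS (explicit TLS) connectivity test with the "ftp" npm package.
// Host and credentials are placeholders.
const Client = require('ftp');
const conn = new Client();

conn.on('ready', () => {
    conn.list((err, listing) => {
        if (err) console.error(err);
        else console.log('TLS session OK,', listing.length, 'entries');
        conn.end();
    });
});

conn.on('error', (err) => console.error('FTPS connect failed:', err));

conn.connect({
    host: 'ftp.example.com',
    user: 'myuser',
    password: 'mypassword',
    secure: true,                                // request explicit TLS (AUTH TLS)
    secureOptions: { rejectUnauthorized: true }  // verify the server certificate
});
```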

As well as the other suggestions you've been given, you could look at a file-drop service such as Google Drive or Dropbox. Indeed, if using something like that, you could do the actual copy outside of Node-RED, or perhaps as a delayed copy triggered from within Node-RED, and then use the standard file-out tools in Node-RED. If your target system allows SSH or SCP, you could also use rsync or similar to do the copy, and that could also be triggered from Node-RED, or you could possibly do the copy on a schedule (and so use native cron) if that fits in with your requirements.
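
As a sketch of the rsync route from inside Node-RED (user, host and paths are placeholders, and it assumes key-based SSH so no password prompt is needed), a function node could build the command for an exec node set to run msg.payload:

```javascript
// Node-RED function node: build an rsync-over-SSH command for a
// following exec node (configured to execute msg.payload).
// Paths, user and host are illustrative; assumes SSH keys are in place.
const localFile = '/data/output.csv';
const remoteTarget = 'deploy@www.example.com:/var/www/data/output.csv';

msg.payload = `rsync -az ${localFile} ${remoteTarget}`;
return msg;
```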

I think it is one of two things: either the sessions really are hanging around, or banning is a real possibility as well. You should be able to prove the banning issue by testing with 3 ids, test1, test2 and test3: in turn let each get blocked by the connection count, then try test1 again. If it works again within, say, 15 minutes, that points to a session issue; if they are all blocked for a longer period, ask the administration/support what their policy is on banning, concurrent attempts, and duration for repeated attempts. Be sure to ask if they have a successful-login policy as well as a failed-login policy!

Many FTP admins set a frequency-per-time limit even for invalid logins. This is done to avoid various DoS attack scenarios. As for valid logins, they could have all kinds of limits in place.

I worked for a company that did a lot of bulk data moves via FTP. It was statistical market data that was useless to any hacker unless they knew the context, so we used FTP for moves at night, and some of the receivers kept rejecting data. It turned out they not only throttled connections from the same source, but also limited the total volume of logins per time segment. Someone had the great idea that no one should be moving bulk data at night! Geez.

Thank you very much for your explanation. I am amazed by your detailed feedback.
I have been using FTP for more than 20 years now in different private projects, but I never questioned the security of FTP. So I will definitely use SFTP from now on.
Regarding the issue here, I have to say that there was an issue in my nodes. Sometimes I receive a large array from my InfluxDB query node and then split it into separate messages, and each of these messages triggers the file upload - aaaah! What a terrible mistake; of course the server bans me when it faces hundreds of connections within seconds.
So I rearranged my nodes and connected the upload node to the very first injection. It has been working now for 4 days, even with FTP.
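
For anyone hitting the same thing: instead of rewiring, a small guard in a function node in front of the upload would also have stopped every split message from triggering its own connection (just a sketch; it relies on the msg.parts property that the core split node sets):

```javascript
// Node-RED function node: let only the first message of a split batch
// through to the FTP upload node, drop the rest.
if (msg.parts && msg.parts.index !== 0) {
    return null;   // not the first part of the batch: drop it
}
return msg;        // first part (or an unsplit message): trigger the upload
```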

Thank you very much for your help!


Run FTP and Wireshark at the same time. The password is sent in the clear in one IP packet. It is so easy to get the user's password this way...
Same for telnet and authentication over HTTP.

This is why these days, ssh, sftp and https are becoming the preferred choice.


The only choice - even for "internal" use. Plenty of successful persistent hacks started with an FTP compromise. It isn't allowed anywhere in our world.

There are many better protocols than even SFTP now, so it is not the best choice, but because it is an old standard it is often used.

Not sure that is particularly true. SFTP/FTPS/SCP are all similar in use really. They work, are reasonably secure (if the server is set up correctly), reasonably performant and reasonably easy to set up and use.

RSYNC over SSH is another option of course. Slightly different way of working but gives you other options.

File uploads over HTTPS are another option but harder to set up, I think. And you'd have to layer on the authentication/authorisation part, which you get anyway with the others.

I was referring, without itemizing, to various custom protocols that are often embedded in solutions for secure transport; the financial industry uses various types of these. Of course there is also secure transport within VPNs, and others.

The ones you outlined are typical of open-source or qualified-source solutions. Nothing wrong with them of course, if they meet your need or address your risk-profile requirements. And of course there are other solutions that use mesh or newer unique data models to do data transport. Even the various Bitcoin solutions are in fact data transport strategies designed for a unique data set.

Ah yes, well there are, as you say, custom and open messaging standards that are more efficient for exchanging structured and semi-structured data rather than files.

Exchange within a VPN is a different beast altogether, as that is network level, and VPNs are not the panacea that many people seem to think, since an attacker who has reached your network would still have access to FTP passwords. It really only protects transport across untrusted network segments. Bitcoin isn't really a secure transport mechanism and has a number of security issues; it is designed for proof above all else. But maybe we are going a bit too far down the security rabbit hole now. :sunglasses:

Right, Bitcoin is unique because of how it protects the 'context' of the data more than the data itself. Meaning that unless everyone trusts the 'transaction', what the transaction is or was becomes worth less. That said, the intention of Bitcoin was to be a form of secured transport with extensive validation of context. Because of poor implementations of Bitcoin, IMHO, it has suffered. Moreover, because 'software' is never 'secure', anything built with or on 'software' is never secure.

Another real test is self-driving vehicles for long-haul (truck) freight, which is coming, and in due course self-driving aircraft, for example. But I think even more important before these is self-directed drones, say for Amazon, for example.

If I was a hacker? WHICH I AM NOT... I would spike/override Amazon drones so everything is delivered to a temporary location... just imagine what loot you might get? LOL. Just spoof GPS, and you get it done.
