I'm uploading files with Node-RED. It works fine, but it blows chunks when the files go above a certain size, e.g. 30MB. I'm using the HTTP In node. Is this normal, and is there an easy workaround? I'm running on an AWS instance, so memory costs money. I could recode that section in PHP, but I really want to keep the moving pieces to a minimum.
Any ideas why this happens? upload_ui is not an option for various reasons.
Node-RED says this when it goes boom:
nodered.service: Main process exited, code=killed, status=9/KILL
nodered.service: Failed with result 'signal'.
It works just fine with small files; with big files, the HTTP In node goes boom.
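The only workaround I can think of so far, assuming the upload is a multipart POST: settings.js has an httpNodeMiddleware hook that runs in front of every HTTP In node, so something like multer (a third-party package, `npm install multer`) could stream each file straight to disk instead of letting the whole body pile up in the heap. A rough sketch only; I haven't verified how this interacts with the node's own body parsing, and the paths and options are just illustrative:

```javascript
// settings.js -- stream multipart uploads to disk instead of buffering
// them in the Node.js heap. Requires `npm install multer` in the
// Node-RED user directory (~/.node-red).
const multer = require("multer");

// `dest` makes multer write each incoming file to /tmp as it arrives,
// so a 30MB+ body never has to sit in memory in one piece.
const upload = multer({ dest: "/tmp" });

module.exports = {
    // ... existing settings ...

    // Runs for every HTTP In node. Downstream, msg.req.files should
    // carry metadata (path, size, original name) for the saved files
    // rather than the file contents themselves.
    httpNodeMiddleware: upload.any()
};
```

The flow would then read the file back from msg.req.files[0].path (e.g. with a File In node) and clean it up when done.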
Using that error as a Google search term turns up a lot of out-of-memory issues. The fact that it works for smaller files, combined with you indicating that you're memory-constrained, would seem to point in that direction.
I'm not an AWS user, but if you can temporarily upgrade the memory, do you still have the issue?
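For what it's worth, status=9/KILL is a SIGKILL, which on Linux usually means the kernel OOM killer stepped in rather than Node-RED exiting on its own, and the kernel log should say so explicitly. A quick check, assuming you can shell into the instance:

```bash
# Kernel log entries from the OOM killer, if that's what fired
dmesg | grep -i "out of memory"

# Same thing via the journal on a systemd box
journalctl -k | grep -i -E "oom|out of memory"
```

If an OOM-killer line names the node-red process, it's total instance RAM you're short of, and the temporary-upgrade test above should confirm it.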