Exec in spawn mode using curl with a post request?

I am using an API that streams results in JSON. As there is no "good" way to handle streaming, I thought I'd use the exec node in spawn mode and perform a curl request.

The response starts with a JSON result that contains a backslash, and somehow the exec node errors out with code 3.

Start of the curl response (this is the CLI result):



{"error":"invalid character '\\'' looking for beginning of value"}

Is there a way for the exec node to avoid parsing? I cannot convert the result in real time as it is streaming.

Digging a bit further, it seems like the spawn command is parsing or replacing characters in the input command.

The curl command I am using:

curl -d '{"model": "llama2","prompt":"Why is the sky blue?"}'

Looks like the single quotes are getting dropped (?) and everything gets sliced up to be used as arguments, which fails at the API end.

The interface between Node.js and exec/spawn can be rather a pain as you've discovered.

I think there are a couple of potentially better approaches.

One would be to use a BASH script to do the curl and to simplify the parameter passing. You could also, instead of getting the data back from std-out, redirect the output to a Node-RED http endpoint (I'll try to dig out an example). This might help?

The other approach would be to use a custom http request in a function node, where the callback would spit out a new Node-RED msg each time it gets a record back from the request.

Here is an example bash script that sends its output to a Node-RED endpoint:

#! /usr/bin/env bash
# Fast scan the local network for live devices and record
# to /tmp/nmap.xml which can be used in Node-RED
# To run manually:
#   sudo /home/home/nrmain/system/nmap_scan.sh
# To run via cron:
#   sudo crontab -e
#       01,16,31,46 * * * * /home/home/nrmain/system/nmap_scan.sh

# Run the scan (note: nmap's XML output flag is -oX, and a target is
# required - adjust 192.168.1.0/24 to your own subnet)
nmap -sn -oX /tmp/nmap.xml --privileged -R --system-dns --webxml 192.168.1.0/24
# Make sure ownership & ACLs on the output are secure
chown root:home /tmp/nmap.xml
chmod --silent 640 /tmp/nmap.xml
# Trigger the Node-RED update
curl -I 'http://localhost:1880/localnetscan'


Does your example output a single message once it is completed?

I actually want to have a streaming output. I am using a local LLM and want to create my own interface à la ChatGPT, but I don't want to wait for the full response (which can be rather slow if the response is lengthy).

I like the idea of calling an endpoint - this could work - but I don't seem to be able to POST each response from stdout (i.e. it outputs a single message when it finishes).

That one does, yes. I expect there would be a way to output a series, but my BASH-foo is a bit rusty at the mo.

I must admit that I would use a function node over this approach for what you are trying to do. I don't have a llama2 account I'm afraid, so I can't easily test something.

You've got me thinking - always dangerous!

I've found an example for setting up an HTTP node.js streaming data server. So I think I'll create that and then I can look at how to get data from it into Node-RED.

I found that with curl it is not going to work. I tried another way: having a "watch" node watch a file for changes and exec a tail -n1 while the curl outputs into that file. It does trigger, but tail -n1 is not cutting it (as it would skip lines). I could make this work, since the response has an absolute timestamp that I could compare, but it might remain somewhat of a "guessing" game, I think.

Yes, watch can be somewhat fragile itself since you can get multiple triggers for a file update or, as you saw, missed updates.

I still think that a custom request in a function node is your best bet.

Unless you are using Node.js v21+ (which has native fetch), you will need a suitable package:

I can't recommend got because the rather obnoxious sindresorhus has tried to force everyone using his libraries to instantly move to ESM. It really wouldn't hurt him to provide CommonJS versions, but he thinks he knows best (annoyance mode off!). But node-fetch or one of the other similar packages should work fine.
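To illustrate with native fetch (so Node.js 18+, no extra package needed): streamCompletion below is a hypothetical helper name, and the request body shape is an assumption modelled on the llama2 example above. The point is that response.body is async-iterable, so each network chunk can be forwarded the moment it arrives.

```javascript
// Hypothetical helper: POST a prompt and invoke onChunk for every
// piece of the streamed response as it arrives.
const decoder = new TextDecoder();

async function streamCompletion(url, prompt, onChunk) {
    const res = await fetch(url, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ model: 'llama2', prompt })
    });
    // response.body is a web ReadableStream, async-iterable in Node 18+.
    for await (const chunk of res.body) {
        onChunk(decoder.decode(chunk, { stream: true }));
    }
}
```

In a Node-RED function node, onChunk would wrap node.send() so each piece goes out as its own msg, roughly: await streamCompletion(apiUrl, msg.payload, text => node.send({ payload: text })).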