Niclas,
Just guessing based on this:
This is chunked transfer encoding, which is the default encoding. The http-request node (see the code) waits until all chunks have arrived (via the "request" npm library), and only then sends the entire response body as a single output message. But could it be that your system keeps sending chunks, so the response never ends? That would mean the http request node never sends an output message at all... Could it work like that? I mean that, in your case, every chunk is in fact a complete piece of data on its own.
If so, the second example here could perhaps help you. Suppose you rewrite it a bit like this (off the top of my head, so not tested!):
const request = require('request');

request(
    { method: 'GET'
    , uri: 'http://192.168.80.121/arx/eventexport?end_date=keep'
    , gzip: true
    }
    , function (error, response, body) {
        // Only called when the response ends, which may never happen here
        console.log('Most probably the stream has been ended ...');
    }
)
.on('data', function (data) {
    // Decompressed data, chunk by chunk, as it is received
    console.log('decompressed chunk arrived: ' + data);
    node.send({ payload: data });
});
The .on('data', ...) handler should be called every time a chunk arrives.
When you use this in a function node, don't forget to add the request
library to the settings.js file (as described here). See also this discussion...
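For reference, the settings.js entry would look something like this sketch (assuming you have installed the request library into your Node-RED user directory, typically ~/.node-red, via npm):

```javascript
// In settings.js: expose the request library to function nodes
functionGlobalContext: {
    request: require('request')
}
```

Then inside the function node you can pick it up with: const request = global.get('request');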