Writing CSV files to Azure Data Lake

Hi there!

I'm trying to write a basic CSV to our Azure Data Lake from Node-RED. I do a SQL read, transform the data into a CSV, and then use a function node to put it into the correct format for the Azure node.

So my flow runs:

Inject -> SQL Read (works fine) -> CSV conversion (works) -> Function -> Azure File Append

Here is the code in my function:

// Keep the CSV text produced by the CSV node
const data = msg.payload;

// Build a DD_MM_YYYY stamp for the file name
const today = new Date();
const dd = String(today.getDate()).padStart(2, '0');
const mm = String(today.getMonth() + 1).padStart(2, '0'); // January is 0!
const yyyy = today.getFullYear();

const file = dd + '_' + mm + '_' + yyyy;

// Hand the Azure node the target path plus the CSV content
msg.payload = { filePath: file + '_InventoryMovements', content: data };

return msg;
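
After that runs, the message going into the Azure node is shaped like this (the date and the CSV content here are just illustrative):

{
    "filePath": "04_06_2021_InventoryMovements",
    "content": "Item,Quantity,Warehouse\n10001,25,Main\n10002,-3,Main"
}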

I am, however, getting the following error and I'm not sure what it means:

RestError: The uploaded data is not contiguous or the position query parameter value is not equal to the length of the file after appending the uploaded data.

Can someone point me in the right direction?

Feed the function output into a debug node and check that it looks correct. Assuming it does, then presumably the problem is in sending it to Azure, but you haven't told us how you are doing that, or where you are seeing that error.
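
For what it's worth, that error comes from the Data Lake Gen2 REST API itself: a file is written by appending chunks at a byte offset and then flushing with a position equal to the total number of bytes appended, and the service rejects the request when those numbers don't line up. One way to rule out your data is to reproduce the same sequence outside Node-RED with the official @azure/storage-file-datalake SDK. This is only a sketch, and the connection string variable, filesystem name and file name are placeholders, not anything the contrib node uses internally:

// Sketch: create + append + flush with matching offsets
const { DataLakeServiceClient } = require("@azure/storage-file-datalake");

async function writeCsv(csvText) {
    // Placeholder connection string taken from an environment variable
    const service = DataLakeServiceClient.fromConnectionString(process.env.ADLS_CONNECTION_STRING);
    const fileSystem = service.getFileSystemClient("my-filesystem");          // placeholder filesystem
    const file = fileSystem.getFileClient("04_06_2021_InventoryMovements");   // placeholder path

    const length = Buffer.byteLength(csvText);

    await file.create();                    // start with a zero-length file
    await file.append(csvText, 0, length);  // offset must equal the current file length (0 here)
    await file.flush(length);               // position must equal the total bytes appended
}

If that sequence succeeds with the same CSV string your function produces, the problem is almost certainly in how the contrib node drives the append, not in your flow.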

PS Welcome to the forum.


Thanks for the welcome! I'm using the node-red-contrib-azure-data-lake-storage module, and to me it does look right in the debug node. Here is an example:

[screenshot of the debug output]

Also, here is an image of my flow:

I think you had better submit an issue on the node's github page and see if the author can help.

If I can ask a possibly stupid question?

I'm fairly sure Azure SQL can export directly to CSV, so wouldn't it be simpler to use Azure's own features to write that straight to the data lake?
