Change a value within a CSV File

So my input csv file that I am reading from currently looks like this:
e0040150c6cd280d,bolt,5
e0040150c6cd14c3,screw,0
e0040150c6cd2a81,nut,3
e0040150c6cd245b,washer,0

I am struggling to figure out how to write a function that replaces the '0' values with 'null' in place, rather than just appending new lines and leaving the '0' rows in the file.
I want the csv file to look like this when it is written back:

e0040150c6cd280d,bolt,5
e0040150c6cd14c3,screw,null
e0040150c6cd2a81,nut,3
e0040150c6cd245b,washer,null
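
For reference, a csv node reading this file parses each line into an object (my columns are hex_output, item_name and quantity), so the screw row comes through as something like:

{"hex_output":"e0040150c6cd14c3","item_name":"screw","quantity":0}

and it is that quantity of 0 that I want to end up as null when the file is written back.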

Maybe this example will help

[{"id":"0cc20c51c1ace4cc","type":"inject","z":"b9860b4b9de8c8da","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":280.00000762939453,"y":3243.000096321106,"wires":[["8ad3c8761acf4695"]]},{"id":"8ad3c8761acf4695","type":"template","z":"b9860b4b9de8c8da","name":"simulate csv file read","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"e0040150c6cd280d,bolt,5\ne0040150c6cd14c3,screw,0\ne0040150c6cd2a81,nut,3\ne0040150c6cd245b,washer,0\n","output":"str","x":500.00000762939453,"y":3243.000096321106,"wires":[["d1d232d57e43aa95"]]},{"id":"d1d232d57e43aa95","type":"csv","z":"b9860b4b9de8c8da","name":"","sep":",","hdrin":"","hdrout":"none","multi":"mult","ret":"\\n","temp":"id,name,value","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":310.00000762939453,"y":3303.000096321106,"wires":[["90548984a7c5c51c"]]},{"id":"90548984a7c5c51c","type":"change","z":"b9860b4b9de8c8da","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"$$.payload ~> |$|$.value = 0 ? {\"value\":null} : {}|","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":460.00000762939453,"y":3303.000096321106,"wires":[["229cfe3c397c732a"]]},{"id":"229cfe3c397c732a","type":"csv","z":"b9860b4b9de8c8da","name":"","sep":",","hdrin":"","hdrout":"none","multi":"one","ret":"\\n","temp":"","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":630.0000076293945,"y":3303.000096321106,"wires":[["66f3a8506ea55dae"]]},{"id":"66f3a8506ea55dae","type":"debug","z":"b9860b4b9de8c8da","name":"debug 323","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":490.00000762939453,"y":3363.000096321106,"wires":[]}]

The expression looks for a value of 0 and replaces it with null, using the JSONata transform operator syntax.

$$.payload ~> |$|$.value = 0 ? {"value":null} : {}|
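
For example, fed with the four rows from the simulated file read above (parsed by the csv node into objects with id, name and value), the change node should output something along these lines:

[
  {"id":"e0040150c6cd280d","name":"bolt","value":5},
  {"id":"e0040150c6cd14c3","name":"screw","value":null},
  {"id":"e0040150c6cd2a81","name":"nut","value":3},
  {"id":"e0040150c6cd245b","name":"washer","value":null}
]

The second csv node then converts that array back to csv text.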

Sadly that returned this inside the csv file:

e0040150c6cd280d,bolt,5
e0040150c6cd14c3,screw,0
e0040150c6cd2a81,nut,3
e0040150c6cd245b,washer,0
e0040150c6cd14c3,screw,"{""hex_output"":""e0040150c6cd14c3"",""item_name"":""screw"",""quantity"":0}"
e0040150c6cd245b,washer,"{""hex_output"":""e0040150c6cd245b"",""item_name"":""washer"",""quantity"":0}"

Sadly you must be using it wrong then, see the image.

Ah yes, I added a .quantity to the set msg.payload, but now it's just outputting the whole csv again:

e0040150c6cd280d,bolt,5
e0040150c6cd14c3,screw,0
e0040150c6cd2a81,nut,3
e0040150c6cd245b,washer,0
e0040150c6cd280d,bolt,5
e0040150c6cd14c3,screw,0
e0040150c6cd2a81,nut,3
e0040150c6cd245b,washer,0
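
In other words the expression now tests $.quantity rather than $.value, since my columns are hex_output, item_name and quantity, so the set msg.payload rule is roughly:

$$.payload ~> |$|$.quantity = 0 ? {"quantity":null} : {}|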

I'm guessing here, as you have not supplied an example flow.

If the csv node is outputting a message per row, you would do something like this:

[{"id":"0cc20c51c1ace4cc","type":"inject","z":"b9860b4b9de8c8da","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":280.00000762939453,"y":3243.000096321106,"wires":[["8ad3c8761acf4695"]]},{"id":"8ad3c8761acf4695","type":"template","z":"b9860b4b9de8c8da","name":"simulate csv file read","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"e0040150c6cd280d,bolt,5\ne0040150c6cd14c3,screw,0\ne0040150c6cd2a81,nut,3\ne0040150c6cd245b,washer,0\n","output":"str","x":500.00000762939453,"y":3243.000096321106,"wires":[["d1d232d57e43aa95","66f3a8506ea55dae"]]},{"id":"d1d232d57e43aa95","type":"csv","z":"b9860b4b9de8c8da","name":"","sep":",","hdrin":"","hdrout":"none","multi":"one","ret":"\\n","temp":"id,name,value","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":310.00000762939453,"y":3303.000096321106,"wires":[["0eeac0795b12e354"]]},{"id":"66f3a8506ea55dae","type":"debug","z":"b9860b4b9de8c8da","name":"debug 323","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":610,"y":3360,"wires":[]},{"id":"0eeac0795b12e354","type":"switch","z":"b9860b4b9de8c8da","name":"","property":"payload.value","propertyType":"msg","rules":[{"t":"eq","v":"0","vt":"num"},{"t":"else"}],"checkall":"true","repair":false,"outputs":2,"x":450,"y":3300,"wires":[["99b22e4c09d0aae9"],["2750bb653d171682"]]},{"id":"b47a7d55bb39eb9d","type":"csv","z":"b9860b4b9de8c8da","name":"","sep":",","hdrin":"","hdrout":"none","multi":"one","ret":"\\n","temp":"","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":450,"y":3360,"wires":[["66f3a8506ea55dae"]]},{"id":"99b22e4c09d0aae9","type":"change","z":"b9860b4b9de8c8da","name":"","rules":[{"t":"set","p":"payload.value","pt":"msg","to":"null","tot":"json"}],"action":"","property":"","from":"","to":"","reg":false,"x":620,"y":3300,"wires":[["2750bb653d171682"]]},{"id":"2750bb653d171682","type":"join","z":"b9860b4b9de8c8da","name":"","mode":"auto","build":"object","property":"payload","propertyType":"msg","key":"topic","joiner":"\\n","joinerType":"str","accumulate":"false","timeout":"","count":"","reduceRight":false,"x":290,"y":3360,"wires":[["b47a7d55bb39eb9d"]]}]

That worked perfectly. Sorry about not being able to send the flow, the file is massive. Thank you so much for the help.

No need to send the whole flow, just the relevant nodes and a simulation of the file read, similar to my example.

Got it, will do that for further questions/problems.

