Optimize my first "Flow"

Hi all,

With your help I was able to create my first Node-RED flow. Now I want to understand what could be done in a better way.

I think there is a way to reduce the number of nodes used for the data transformations after the insert into the database.

Here is my first flow:

[{"id":"566c79f1.69cd18","type":"tab","label":"Flow 1","disabled":false,"info":""},{"id":"58bd3ead.0e604","type":"http request","z":"566c79f1.69cd18","name":"","method":"GET","ret":"txt","url":"http://192.168.178.90:8080/last/1","tls":"","x":294.5,"y":213,"wires":[["2a87241d.fb2a2c"]]},{"id":"61769823.4a6978","type":"inject","z":"566c79f1.69cd18","name":"","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"*/1 0-23 * * *","once":false,"onceDelay":"0","x":294,"y":157,"wires":[["58bd3ead.0e604"]]},{"id":"2a87241d.fb2a2c","type":"json","z":"566c79f1.69cd18","name":"","property":"payload","action":"obj","pretty":false,"x":463.5,"y":214,"wires":[["942129dd.a9ab18"]]},{"id":"cca1fc32.48a92","type":"mysql","z":"566c79f1.69cd18","mydb":"edd790d3.2eb0e","name":"","x":502.5,"y":380,"wires":[["e5dfc632.8d0bd8"]]},{"id":"942129dd.a9ab18","type":"template","z":"566c79f1.69cd18","name":"Data insert","field":"topic","fieldType":"msg","format":"text","syntax":"mustache","template":"INSERT INTO  gosdm630.input_data (\n   UniqueDeviceId,\n   Timestamp,\n   Unix,\n   ModbusDeviceId,\n   Power_L1,\n   Power_L2,\n   Power_L3,\n   Voltage_L1,\n   Voltage_L2,\n   Voltage_L3,\n   Current_L1,\n   Current_L2,\n   Current_L3,\n   Cosphi_L1,\n   Cosphi_L2,\n   Cosphi_L3,\n   Import_L1,\n   Import_L2,\n   Import_L3,\n   TotalImport,\n   Export_L1,\n   Export_L2,\n   Export_L3,\n   TotalExport,\n   THD_L1,\n   THD_L2,\n   THD_L3,\n   THD_AVG,\n   Freq\n)\nVALUES( \n   '{{payload.UniqueId}}',\n   '{{payload.Timestamp}}',\n   {{payload.Unix}},\n   {{payload.ModbusDeviceId}},\n   {{payload.Power.L1}},\n   {{payload.Power.L2}},\n   {{payload.Power.L3}},\n   {{payload.Voltage.L1}},\n   {{payload.Voltage.L2}},\n   {{payload.Voltage.L3}},\n   {{payload.Current.L1}},\n   {{payload.Current.L2}},\n   {{payload.Current.L3}},\n   {{payload.Cosphi.L1}},\n   {{payload.Cosphi.L2}},\n   {{payload.Cosphi.L3}},\n   {{payload.Import.L1}},\n   {{payload.Import.L2}},\n   {{payload.Import.L3}},\n   
{{payload.TotalImport}},\n   {{payload.Export.L1}},\n   {{payload.Export.L2}},\n   {{payload.Export.L3}},\n   {{payload.TotalExport}},\n   {{payload.THD.VoltageNeutral.L1}},\n   {{payload.THD.VoltageNeutral.L2}},\n   {{payload.THD.VoltageNeutral.L3}},\n   {{payload.THD.AvgVoltageNeutral}},\n   {{payload.Frequency}}   \n   \n)","output":"str","x":291.5,"y":380,"wires":[["cca1fc32.48a92"]]},{"id":"adca0623.064f38","type":"mysql","z":"566c79f1.69cd18","mydb":"edd790d3.2eb0e","name":"","x":499,"y":457,"wires":[["fc03fd2c.c581d"]]},{"id":"fc03fd2c.c581d","type":"template","z":"566c79f1.69cd18","name":"Data preparation","field":"topic","fieldType":"msg","format":"text","syntax":"plain","template":"\nINSERT INTO `gosdm630`.`data`\n(`ID`,\n`UniqueDeviceId`,\n`Timestamp_raw`,\n`Timestamp`,\n`Unix_raw`,\n`Unix_vz`,\n`ModbusDeviceId`,\n`Power_L1`,\n`Power_L2`,\n`Power_L3`,\n`TotalPower`,\n`Voltage_L1`,\n`Voltage_L2`,\n`Voltage_L3`,\n`Current_L1`,\n`Current_L2`,\n`Current_L3`,\n`Cosphi_L1`,\n`Cosphi_L2`,\n`Cosphi_L3`,\n`Import_L1`,\n`Import_L2`,\n`Import_L3`,\n`TotalImport`,\n`Export_L1`,\n`Export_L2`,\n`Export_L3`,\n`TotalExport`,\n`THD_L1`,\n`THD_L2`,\n`THD_L3`,\n`THD_AVG`,\n`Freq`\n\n)\n\nSELECT `input_data`.`ID`,\n    `input_data`.`UniqueDeviceId`,\n    `input_data`.`Timestamp`,\n    (str_to_date(substring_index(`input_data`.`Timestamp`,'.', 1),'%Y-%m-%dT%H:%i:%s')) AS timestamp,\n    #convert(`input_data`.`Timestamp`,DATETIME),\n    `input_data`.`Unix`,\n    (input_data.Unix*1000),\n    `input_data`.`ModbusDeviceId`,\n    `input_data`.`Power_L1`,\n    `input_data`.`Power_L2`,\n    `input_data`.`Power_L3`,\n    (`input_data`.`Power_L1`+`input_data`.`Power_L2`+`input_data`.`Power_L3`) AS TotalPower,\n    `input_data`.`Voltage_L1`,\n    `input_data`.`Voltage_L2`,\n    `input_data`.`Voltage_L3`,\n    `input_data`.`Current_L1`,\n    `input_data`.`Current_L2`,\n    `input_data`.`Current_L3`,\n    `input_data`.`Cosphi_L1`,\n    `input_data`.`Cosphi_L2`,\n    
`input_data`.`Cosphi_L3`,\n    `input_data`.`Import_L1`,\n    `input_data`.`Import_L2`,\n    `input_data`.`Import_L3`,\n    `input_data`.`TotalImport`,\n    `input_data`.`Export_L1`,\n    `input_data`.`Export_L2`,\n    `input_data`.`Export_L3`,\n    `input_data`.`TotalExport`,\n    `input_data`.`THD_L1`,\n    `input_data`.`THD_L2`,\n    `input_data`.`THD_L3`,\n    `input_data`.`THD_AVG`,\n    `input_data`.`Freq`\nFROM `gosdm630`.`input_data`;\n;","output":"str","x":334,"y":526,"wires":[["42597d4f.22d4f4"]]},{"id":"7c6a8bce.6ec284","type":"debug","z":"566c79f1.69cd18","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","x":495,"y":651,"wires":[]},{"id":"e5dfc632.8d0bd8","type":"template","z":"566c79f1.69cd18","name":"Data cleanup","field":"topic","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"delete from `gosdm630`.`input_data` \nwhere `input_data`.`ID` <= \t(select max(`data`.`id`) \n\t\t\t\t\t\t\tfrom `gosdm630`.`data`);\n","output":"str","x":292,"y":458,"wires":[["adca0623.064f38"]]},{"id":"42597d4f.22d4f4","type":"mysql","z":"566c79f1.69cd18","mydb":"edd790d3.2eb0e","name":"","x":557.5,"y":528,"wires":[["7c6a8bce.6ec284"]]},{"id":"edd790d3.2eb0e","type":"MySQLdatabase","z":"","host":"192.168.178.107","port":"3306","db":"gosdm630","tz":""}]

Thx and KR
Itchy2

Hi Holger,
First thing, when you post a flow, tell people the names of any extra nodes you have installed, so that if they decide to help you and install your flow they won't have to go digging to figure out what node(s) you used and might need to install.

Second, if you want someone to help you analyse your flow, it would be a good idea to explain what you are doing (or trying to do) with it.

It is also good to provide the platform, OS release, and the versions of NR and Node.js.

It is nice to see you provided your flow in a form that it is importable.:star2:

Hi Zenofmud,

I'm sorry. Here is the requested information:

Extra nodes: the mysql node. The rest was "standard".

The flow gets JSON data from a web API and inserts it into a MySQL DB (staging table). Afterwards the data is transformed by SQL scripts (template node) and inserted into another table in the same MySQL DB. The staging table is cleaned up (truncated) on the next iteration.

Platform is Lubuntu 18 LTS, with NR 0.19.5.

The node.js file is available in /node_modules and /.node-red, and each time there is a substructure /until-deprecate; /debug/src; /debug

Which one are you interested in?

Thx and KR
Itchy2

The node.js file is <...snip...> Which one are you interested in?
You should see the version in the NR log at startup.

As for the node, it is best to provide the full name, like node-red-contrib-sqldbs or node-red-node-mysql, because both of them can be used with MySQL. Now my 'guess' is that you meant node-red-node-mysql.

So looking at the flow, you get some data and insert it into the input_data table (and it looks like the column ID is auto-incremented).

Next it looks like you delete everything from the input_data table that has an ID less than or equal to the greatest ID in the data table.

Then you insert into the data table everything that is left in input_data.

So you get a piece of data, insert it into input_data, delete things from input_data, then insert the rows left in input_data into data.

Why not insert them directly?
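For example, a single function node could do the derived-value work and emit one INSERT, so the staging table and the extra mysql calls go away. This is just a rough sketch (not tested against your DB, and only a few of your columns shown); the payload object is a stand-in for what your json node outputs:

```javascript
// Rough sketch of the "insert directly" idea: compute the derived values in
// JavaScript and build a single INSERT statement in one function node.
// The payload below is a hypothetical stand-in for the json node's output.
const msg = {
    payload: {
        UniqueId: "Meter#1",
        Timestamp: "2019-03-17T23:10:00.029606437+01:00",
        Unix: 1552860600,
        Power: { L1: 0, L2: 42.2, L3: 0 }
    }
};

const p = msg.payload;
// drop fractional seconds/offset, make the string DATETIME-friendly
const ts = p.Timestamp.split(".")[0].replace("T", " ");
// the sum the SQL currently computes as TotalPower
const totalPower = p.Power.L1 + p.Power.L2 + p.Power.L3;

// In a real function node: set msg.topic and `return msg`, and the mysql
// node will execute the query (it takes the statement from msg.topic).
msg.topic = `INSERT INTO gosdm630.data
    (UniqueDeviceId, Timestamp, Unix_vz, TotalPower)
VALUES ('${p.UniqueId}', '${ts}', ${p.Unix * 1000}, ${totalPower});`;
```

That would collapse inject → http request → json → function → mysql into one chain with no cleanup pass.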

Hi zenofmud,

node-red-node-mysql is right. Thanks for the hint with the full name.

Your description of the flow is correct. First I tried to do everything in one step, but I could not get the transformations working when accessing the JSON elements. I don't know what I did wrong.

KR
Itchy2

How do you ever get more than one row in the input_data table?

Never. There is always only one row in the input_data table. It's just there to store the data before the transformations are executed.

Why? Why store a row and then do transformations on it? Why not do the transformations first and then insert the row, eliminating two mysql calls?

Hi Zenofmud,

I tried to, but I could not get the transformations running within the template when accessing the JSON elements. So I went for a solution that works, to see some results, and asked for your help afterwards :slight_smile:

KR
Itchy2

What are these data transformations you speak of?

Hi Zenofmud,

I need to modify the date format:
(...) str_to_date(substring_index(input_data.Timestamp, '.', 1), '%Y-%m-%dT%H:%i:%s')

extend the unix timestamp (to milliseconds):
input_data.Unix * 1000

and get a sum:
input_data.Power_L1 + input_data.Power_L2 + input_data.Power_L3

That's it. I tried to do it all in one step but the transformations did not work on the JSON paths.
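For reference, all three of these can also be done in plain JavaScript in a function node, without any SQL at all. A minimal sketch, using sample values from the meter payload (names follow the flow above):

```javascript
// The same three transformations, done in JavaScript instead of SQL.
// The payload object is a sample of the meter data, not live input.
const payload = {
    Timestamp: "2019-03-17T23:10:00.029606437+01:00",
    Unix: 1552860600,
    Power: { L1: 0, L2: 42.22355270385742, L3: 0 }
};

// 1) date format: cut off fractional seconds/offset, swap the T for a space
const mysqlTimestamp = payload.Timestamp.split(".")[0].replace("T", " ");
// → "2019-03-17 23:10:00"

// 2) extend the unix timestamp to milliseconds
const unixMs = payload.Unix * 1000;  // → 1552860600000

// 3) sum the three phase powers
const totalPower = payload.Power.L1 + payload.Power.L2 + payload.Power.L3;
```

With the values computed first, the template only has to splice in finished numbers and strings, so no functions need to run inside the SQL VALUES list.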

KR
Itchy2

Put a debug node (set to display the complete msg object) on the output of the json node. Then look at the debug output; on the right-hand side a tiny copy icon will appear when the mouse enters the data area of the debug display. Click the icon and paste the result in a response so we can see what the input data looks like.

Hi Zenofmud,

here is the requested output:

INSERT INTO gosdm630.data_test (
	`UniqueDeviceId`, 
	`Timestamp_raw`, 
	`Timestamp`, 
	`Unix_raw`, 
	`Unix_vz`, 
	`ModbusDeviceId`, 
	`Power_L1`, 
	`Power_L2`, 
	`Power_L3`, 
	`TotalPower`, 
	`Voltage_L1`, 
	`Voltage_L2`, 
	`Voltage_L3`, 
	`Current_L1`, 
	`Current_L2`, 
	`Current_L3`, 
	`Cosphi_L1`, 
	`Cosphi_L2`, 
	`Cosphi_L3`, 
	`Import_L1`, 
	`Import_L2`, 
	`Import_L3`, 
	`TotalImport`, 
	`Export_L1`, 
	`Export_L2`, 
	`Export_L3`, 
	`TotalExport`, 
	`THD_L1`, 
	`THD_L2`, 
	`THD_L3`, 
	`THD_AVG`, 
	`Freq` ) 
VALUES( 
	'Meter#1', 
	'2019-03-13T22:21:00.05645879+01:00', 
	(str_to_date(substring_index('2019-03-13T22:21:00.05645879+01:00','.', 1),'%Y-%m-%dT%H:%i:%s')) AS timestamp, 
	1552512060, 
	(1552512060*1000), 
	1, 
	0, 
	23.929502487182617, 
	82.95561218261719, 
	(0+23.929502487182617+82.95561218261719) AS TotalPower, 
	235.19424438476562, 
	236.9258270263672, 
	234.22239685058594, 
	0.10701286047697067, 
	0.29791122674942017, 
	0.5926104187965393, 
	0.07007616758346558, 
	0.3658907115459442, 
	0.6020198464393616, 
	34.974998474121094, 
	64.97099304199219, 
	70.9699935913086, 
	170.91598510742188, 
	0, 
	0, 
	0.001999999862164259, 
	0.001999999862164259, 
	5.079699993133545, 
	4.485335350036621, 
	4.87091064453125, 
	4.888765335083008, 
	49.909854888916016 
	)
;

Looks like it is not performing (executing, or whatever) the functions within the VALUES list.

Thx and KR
Itchy2

Are you telling me that that is what shows up in the debug node attached to the output of the json node? It looks like the output of one of your function nodes.

I don't believe AS aliasname is valid SQL syntax when used within the list of VALUES (...).
Try copying that whole INSERT string into an SQL command-line utility and see what warnings pop up.

Hi all,

@Zenofmud: yes, that was the output of the function node with the SQL statement. I got you wrong, sorry.

Here is the output of the debug, attached to the json node:

{"UniqueId":"Meter#1","Timestamp":"2019-03-17T23:10:00.029606437+01:00","Unix":1552860600,"ModbusDeviceId":1,"Power":{"L1":0,"L2":42.22355270385742,"L3":0},"Voltage":{"L1":234.20062255859375,"L2":234.6476593017578,"L3":234.2668914794922},"Current":{"L1":0.07976268231868744,"L2":0.3602589964866638,"L3":0.06430672854185104},"Cosphi":{"L1":0.08792629092931747,"L2":0.5800871849060059,"L3":0.28384318947792053},"Import":{"L1":36.288997650146484,"L2":67.29199981689453,"L3":73.39199829101562},"TotalImport":176.97299194335938,"Export":{"L1":0,"L2":0,"L3":0.001999999862164259},"TotalExport":0.001999999862164259,"THD":{"VoltageNeutral":{"L1":4.305647850036621,"L2":2.613537073135376,"L3":4.187195301055908},"AvgVoltageNeutral":3.923384666442871},"Frequency":49.939571380615234}

@shrickus: That was a really good remark. For sure this was not valid, and it was the reason for building so many steps. But overnight an idea came to my mind: changing the statement from VALUES to a kind of sub-select. I'm going to try it and will compare the results.

INSERT INTO  gosdm630.data_test (
	`UniqueDeviceId`,
	`Timestamp_raw`,
	`Timestamp`,
	`Unix_raw`,
	`Unix_vz`,
	`ModbusDeviceId`,
	`Power_L1`,
	`Power_L2`,
	`Power_L3`,
	`TotalPower`,
	`Voltage_L1`,
	`Voltage_L2`,
	`Voltage_L3`,
	`Current_L1`,
	`Current_L2`,
	`Current_L3`,
	`Cosphi_L1`,
	`Cosphi_L2`,
	`Cosphi_L3`,
	`Import_L1`,
	`Import_L2`,
	`Import_L3`,
	`TotalImport`,
	`Export_L1`,
	`Export_L2`,
	`Export_L3`,
	`TotalExport`,
	`THD_L1`,
	`THD_L2`,
	`THD_L3`,
	`THD_AVG`,
	`Freq`
)
SELECT 
   '{{payload.UniqueId}}',
   '{{payload.Timestamp}}',
   (str_to_date(substring_index('{{payload.Timestamp}}','.', 1),'%Y-%m-%dT%H:%i:%s')) AS timestamp,
   {{payload.Unix}},
   ({{payload.Unix}}*1000),
   {{payload.ModbusDeviceId}},
   {{payload.Power.L1}},
   {{payload.Power.L2}},
   {{payload.Power.L3}},
   ({{payload.Power.L1}}+{{payload.Power.L2}}+{{payload.Power.L3}}) AS TotalPower,
   {{payload.Voltage.L1}},
   {{payload.Voltage.L2}},
   {{payload.Voltage.L3}},
   {{payload.Current.L1}},
   {{payload.Current.L2}},
   {{payload.Current.L3}},
   {{payload.Cosphi.L1}},
   {{payload.Cosphi.L2}},
   {{payload.Cosphi.L3}},
   {{payload.Import.L1}},
   {{payload.Import.L2}},
   {{payload.Import.L3}},
   {{payload.TotalImport}},
   {{payload.Export.L1}},
   {{payload.Export.L2}},
   {{payload.Export.L3}},
   {{payload.TotalExport}},
   {{payload.THD.VoltageNeutral.L1}},
   {{payload.THD.VoltageNeutral.L2}},
   {{payload.THD.VoltageNeutral.L3}},
   {{payload.THD.AvgVoltageNeutral}},
   {{payload.Frequency}}   
   
;

Thx and KR
Itchy2