I started with smarthome automation a few weeks ago.
That's why I'm still more of a newbie in almost all respects.
First, I used a Tasmota sensor to transfer the data from my electricity meter via the Mosquitto broker and Node-RED to InfluxDB 2, and then visualized it using Grafana.
Everything runs on a Raspberry Pi 5. All programs are on the current version.
Last week, I then used a bash script to read the data from the inverter of my balcony power plant and transferred it the same way (MQTT broker, Node-RED) to another bucket in InfluxDB 2, then visualized it with Grafana.
Everything was ready by Friday evening.
On Saturday and Sunday, the data was transferred from when the inverter went online until it went offline.
Without any problems. All the data arrived in the right bucket in InfluxDB 2.
This morning there was nothing. While searching for the error, I found it in Node-RED:
"HttpError: failure writing points to database: partial write: field type conflict: input field "current_power" on measurement "grid" is type float, already exists as type string dropped=1"
If I understand this correctly, the value cannot be inserted because it is of type float, but the field already exists as type string in the database.
I don't understand that!
I haven't changed anything since Friday evening. At no point. Everything worked until Sunday evening and then the error on Monday morning.
The values in the database are clearly float values.
So where does the error come from?
And how do I get it to work again?
What information can/should I provide to find a solution?
Add a debug node showing what you are sending to the database and paste the result here please. Make sure that the debug output relates to a write that generates the error.
Also show us how you can see what the existing datatype is for that field in the database.
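For example, in the InfluxDB Data Explorer you can run a raw Flux query like this and look at the type of the _value column in the result (the bucket name is a placeholder — use the bucket the error refers to; the measurement and field names are taken from your error message):

```flux
// Placeholder bucket name — replace with your own bucket.
from(bucket: "your-bucket")
  |> range(start: -7d)
  |> filter(fn: (r) => r._measurement == "grid" and r._field == "current_power")
  |> limit(n: 1)
// The data type of the _value column shows the stored field type.
```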
First screenshot from two debug nodes. Debug 9 shows the original message coming from the MQTT broker. Debug 8 shows the output that is sent to the InfluxDB interface.
This is, unfortunately, not an uncommon issue with InfluxDB. Once you write a value to a measurement, that field will always have to be of the same data type.
You can find references to this on the InfluxDB forum. I've personally tended to recreate the table rather than trying to mess around but your mileage may vary.
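The root cause is usually visible in the line protocol the client generates: numbers are written unquoted, strings are quoted, and the first point written to a field fixes its type. A minimal sketch (not the actual influxdb-out node code — the function name here is illustrative):

```javascript
// Sketch of how InfluxDB line protocol encodes field types:
// numeric values are written unquoted (float), everything else is quoted (string).
function toLineProtocol(measurement, fields, timestamp) {
    const fieldStr = Object.entries(fields)
        .map(([k, v]) => (typeof v === 'number' ? `${k}=${v}` : `${k}="${v}"`))
        .join(',')
    return `${measurement} ${fieldStr} ${timestamp}`
}

// A numeric reading creates a float field:
toLineProtocol('grid', { current_power: 238.5 }, 1700000000)
// If the same reading ever arrives as a string (e.g. straight from MQTT,
// where payloads are text), the field is written as type string instead,
// and the two types then conflict.
toLineProtocol('grid', { current_power: '238.5' }, 1700000000)
```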
To avoid the problem after hitting it a few times, I have a standard flow that all my InfluxDB outputs go via. That flow enforces that tags are always strings and values are always numbers, anything that doesn't meet that is not sent to the db.
OK. Sounds interesting. At the moment I've only got data from two days, so it's no problem to restart things completely.
Can you tell me a bit more about how you ensure that?
/**
* Validate input suitable for the influxdb-out node.
* We will ALWAYS use the "array containing two objects" payload
* OR the single Object payload (if no tags being used).
* See the Description tab for more details.
*/
// check measurement field is set - if not exit with error
if ( ! msg.measurement ) {
node.error('msg.measurement is missing')
node.send([null,msg])
return
}
let fields,tags
// if payload is an object, assume that it contains fieldName:fieldValue pairs
if ( msg.payload !== null && msg.payload.constructor.name === 'Object' ) {
fields = msg.payload
} else if ( msg.payload !== null && msg.payload.constructor.name === 'Array' ) {
node.error('msg.payload cannot be an array. It must be an object containing fieldName:fieldValue pairs or a single value (which would be written to the `value` field name).')
node.send([null,msg])
return
} else {
// Otherwise, we always use 'value' as the default field name
fields = {'value': msg.payload}
}
const lstFields = Object.keys(fields)
// check to make sure that there is a value field name - if not, continue but with a warning
if ( ! lstFields.includes('value') ) {
// Lets us turn off the warning if we know what we are doing :-)
if ( msg.noValueField !== true )
node.warn('Default field name "value" not present, was that deliberate? Set msg.noValueField=true or use the `value` field name to avoid this msg')
}
// check to make sure that all field values are numeric - if not, exit with a warning
let allNumeric = true
lstFields.forEach( key => {
// I use On/Off for simple switch values in MQTT but these are not useful
// in InfluxDB, so translate them to ON=1 and OFF=0 (ignoring case).
try {
if ( fields[key].toLowerCase() === 'on' ) fields[key] = 1
if ( fields[key].toLowerCase() === 'off' ) fields[key] = 0
} catch (e) {}
// then check to make sure the field is actually a number
if ( parseFloat(fields[key]) !== fields[key] ) {
node.error(`Field msg.payload.${key} is not numeric. Only use numbers for field values, text should go in tags. Value=${fields[key]}`)
allNumeric = false
node.send([null,msg])
return
}
})
// @ts-ignore
if ( allNumeric === false ) {
return
}
// check to make sure that if msg.tags is present, it is an object - if not, exit with a warning
if ( msg.tags ) {
if ( !(msg.tags !== null && msg.tags.constructor.name === 'Object') ) {
node.error('msg.tags is not an object - it must contain tagName:tagValue pairs')
node.send([null,msg])
return
}
tags = msg.tags
}
// Format the output to go to the InfluxDB out node
if ( msg.tags ) {
msg.payload = [
fields,
tags,
]
} else {
msg.payload = fields
}
return msg;
You would probably need to adjust for your own requirements.
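To see what the node actually enforces, the two core rules can be pulled out and tried outside Node-RED (the function names here are illustrative, not part of the node):

```javascript
// Rule 1: translate On/Off switch payloads to 1/0, ignoring case,
// since text values are not useful as InfluxDB field values.
function normaliseOnOff(v) {
    if (typeof v === 'string') {
        if (v.toLowerCase() === 'on') return 1
        if (v.toLowerCase() === 'off') return 0
    }
    return v
}

// Rule 2: accept only actual numbers; numeric strings like "42.5"
// are rejected rather than silently coerced, so a stray string can
// never create a string-typed field.
function isNumericField(v) {
    return parseFloat(v) === v
}

normaliseOnOff('OFF')   // 0
isNumericField(42.5)    // true
isNumericField('42.5')  // false
```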
So I've reduced the information the bash script sends to the MQTT broker, so that only the important values remain. I've also reduced the flow and send the incoming MQTT message directly to the database.
Just the same.
What I don't understand is the last part of the error message: "...already exists as type string".
I could not find anywhere in the database that the value for current_power is defined as a string.
I've adapted the function node. In the end I get the same object for inserting into the database, and the same result.
The function node may prevent the problem in the future, but it doesn't solve the current one.
In the influxdb node you have specified the bucket test, but in the screenshot you posted earlier to check the field types you appear to be looking at a different bucket.
So I've solved the problem.
I used a bash script to dump my data from the original bucket into a temporary one, forcing numeric values during the dump.
Then I deleted the data in the original bucket and dumped the temporary bucket back into the original one.
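The "force numeric" part of the dump can be sketched roughly like this (a simplified assumption, not the actual script — it rewrites one line of exported line protocol so a quoted numeric string becomes an unquoted float field):

```javascript
// Turn fields like current_power="238.5" into current_power=238.5
// so the re-imported points are created with a float type.
// Non-numeric strings (e.g. state="on") are left quoted.
function forceNumeric(line) {
    return line.replace(/([A-Za-z_]\w*)="(-?\d+(?:\.\d+)?)"/g, '$1=$2')
}

forceNumeric('grid current_power="238.5" 1700000000')
// 'grid current_power=238.5 1700000000'
```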
I've implemented the function node from @TotallyInformation to ensure numeric-only values in the future.
Thanks for the code @TotallyInformation.
I hope it will keep working for a longer time now.