Incorrect DB lookup issue

I'm having a weird problem with one of our apps, as follows:

  1. The app receives an email; it is parsed by a function node, and the from address is logged using node.warn.
  2. The same function node then creates a message with a topic/payload combination (query/parameters) to pass to a mysql db node. The only parameter in the query is the from email address that was logged.
  3. Some other data is placed in the object msg.data.
  4. The message is passed to the mysql db node.
  5. A second function node parses the result of the db select query.
  6. Assuming the from email address was found (if not, it stops and returns null), the second function node again writes to the log, printing the email address along with the matched accountid/userid pair.
  7. The from address comes from msg.data object and the accountid/userid come from the msg.payload[0] object (i.e. the data returned from the db query).
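The second function node's logic (steps 5-7 above) amounts to something like the following minimal sketch. The row field names accountid/userid, the msg shape, and the stubs standing in for the Node-RED runtime are assumptions based on the description, not the actual code:

```javascript
// Stubs so this sketch runs outside Node-RED; in a Function node,
// `msg` and `node` are supplied by the runtime.
const node = { warn: (m) => console.log(m) };
const msg = {
    payload: [{ accountid: 4, userid: 46 }],   // rows returned by the mysql node
    data: { from: "someone@example.com" }      // fields carried through the query
};

// Parse the select result; stop if the address wasn't found.
const row = msg.payload[0];
let logLine = null;
if (row) {
    logLine = `inbound email: ${msg.data.from} matched ${row.accountid}/${row.userid}`;
    node.warn(logLine);
}
```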

During a routine check of the logs, I noticed occurrences where the second function node printed incorrect account/user details for the email address. Note this is all written to the log in one command.

For example:

Jul  1 07:42:06 prod01 Node-RED[1370158]: 1 Jul 07:42:06 - [warn] [function:Parse] inbound email: <email 1> matched 4/46     <-- correct match for email 1
Jul  1 07:42:34 prod01 Node-RED[1370158]: 1 Jul 07:42:34 - [warn] [function:Parse] inbound email: <email 2> matched 4/46     <-- incorrect match for email 2 (matches previously returned result for email 1)
Jul  1 07:45:21 prod01 Node-RED[1370158]: 1 Jul 07:45:21 - [warn] [function:Parse] inbound email: <email 1> matched 5/52     <-- incorrect match for email 1 (but does match email 2)
Jul  1 07:45:22 prod01 Node-RED[1370158]: 1 Jul 07:45:22 - [warn] [function:Parse] inbound email: <email 1> matched 4/46     <-- correct match for email 1

After an occurrence like this, it goes back to normal and returns the correct results.

Clearly the data is getting mixed up somehow, and the mix-up always transposes two query results that occur around the same time. However, I'm at a loss as to how this could happen. All I can think of is that the db query result is somehow getting out of sync with the message flow, but I'm not sure how that could occur.

Obviously I could fix it by forcibly queuing the inbound emails and releasing after the db lookup but this would throttle everything and slow the app down.

Any advice on how the mysql db node works, whether the query responses are supposed to be synchronous, or any other suggestions as to a potential cause/fix would be appreciated.

I should have added that I'm running node-red 2.2.2 and node-red-node-mysql 1.0.1.

Are you using any context variables to store data related to a specific message, as a flag maybe?

Currently I'm storing the data in an object on the same message. But on the subject of context, would a better solution be to store that data in a context variable and link it back that way?

Would linking it using _msgid be the correct method to bring it back together?
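For clarity, the idea being asked about would look roughly like this. This is a hedged sketch of the question, not a recommendation; the key format and the flow-context stub are assumptions, since flow context is normally provided by the Node-RED runtime:

```javascript
// Stub for flow context so this runs outside Node-RED; in a Function node,
// `flow` is supplied by the runtime.
const store = new Map();
const flow = { set: (k, v) => store.set(k, v), get: (k) => store.get(k) };

// First function node: stash the parsed email data keyed by _msgid...
const msg = { _msgid: "abc123", payload: ["someone@example.com"] };
flow.set("email:" + msg._msgid, { from: "someone@example.com" });

// ...function node after the db query: read it back by the same _msgid.
// (This only works if every node in between preserves msg._msgid.)
const saved = flow.get("email:" + msg._msgid);
```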

would a better solution be to store that data in a context variable and link it back that way?

I think probably not.

It's hard to imagine how data can get switched between messages if it's all contained in its specific message.

I just can't work out what could cause this message sent from the first of 3 nodes (shown below)...

var res = {
    topic: "select <query> where user.email = ?",
    payload: [ from ],
    filename: path,
    data: {
        to: to,
        from: from,
        subject: msg.payload.subject,
        body: body[0],
        attachment: attach
    }
}

[screenshot of the three-node flow]

...could result in incorrect data matches at the third node.

The only thing I can think of is that the db lookup is a promise and the messages are getting muddled somehow.

#perplexing

Can you select and export those three nodes please? I would like to understand what parts of the data might be references - so I can suggest where cloning might solve the issue.

Also, could you add logging from each of those three nodes to separate log files? I would recommend the node-red-contrib-flogger node for this. Once you find another instance of this problem then look at these separate log files.

Note: adding logging to the output of these three nodes could actually fix your problem, due to the mechanics of how Node-RED sends messages. What I mean by that is, when there are two wires connected to a node output, the outgoing message will be cloned.
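To illustrate why cloning matters: when two messages end up sharing the same nested object by reference (which is what cloning on a second wire prevents), mutating one silently changes the other. A minimal standalone illustration:

```javascript
// Two messages sharing one nested object by reference, not by copy.
const shared = { accountid: 4, userid: 46 };
const msgA = { payload: shared };
const msgB = { payload: shared };   // same object, no clone

// Mutating through msgB also changes what msgA sees.
msgB.payload.accountid = 5;
console.log(msgA.payload.accountid);   // 5, not 4
```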


Here is the export.

flows.json (6.7 KB)

Since I originally posted, I have modified the third node to detect the issue, stop and log a specific error. The error now goes to a second output which emails me but otherwise everything else is the same.

I have also added a flogger node to the output of each of the 3 nodes and will post if/when the issue occurs again.

OK, so I see very little "wrong" (some undeclared variables, and lots of places where object and array references are assigned to other properties/variables), but nothing inherently wrong.

I guess you are NOT using the monaco editor (since it would have warned you about the undeclared vars and the usage of env, which is a Node-RED internal/global name). I recommend editing your settings.js file, setting the code editor choice to "monaco", and fixing up your undeclared and overlapping variable names.

Additionally, I would propose you start each function with a clone of the msg object so that you can be sure data in the function cannot change due to "by reference" issues from the outside. Add const msgClone = RED.util.cloneMessage(msg) to the top of your functions and replace all occurrences of msg. in your functions with msgClone.
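The suggested pattern looks like this in a runnable form. RED.util.cloneMessage is the real Node-RED API; the stub below stands in for it (using structuredClone, Node.js 17+) only so the snippet runs outside the editor:

```javascript
// Stand-in for the Node-RED runtime's RED.util.cloneMessage (deep copy).
const RED = { util: { cloneMessage: (m) => structuredClone(m) } };

const msg = { payload: [{ accountid: 4 }], data: { from: "a@example.com" } };

// Top of the function: work on a clone, never the shared original.
const msgClone = RED.util.cloneMessage(msg);
msgClone.data.from = "b@example.com";

console.log(msg.data.from);   // "a@example.com", the original is untouched
```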

One other thing I would recommend: switch from using var to const (and, where const is not possible, use let) - this will catch issues where a variable is re-declared.

Other issues i mentioned - highlighted by monaco...
[screenshot: monaco editor warnings]


Yeah, I'm still pretty new at this... and learning by trial and error to some extent. I'll make those changes and see what happens with the db lookup mismatch.

Thanks a lot for your help.