I have a Python script that retrieves data from a smart plug (to monitor the energy consumption of devices). My end goal is to analyze this data with Stream Analytics, and for that I need to send it to a database in Microsoft Azure.
When I press inject, the Node-RED log shows:
4 Apr 11:57:51 - [info] [Save Blob:Azure Save Blob Storage] Uploading blob...
4 Apr 11:57:51 - [info] [Save Blob:Azure Save Blob Storage] Container 'container5' ready for blob creation
4 Apr 11:57:52 - [info] [Save Blob:Azure Save Blob Storage] Error: ENOENT: no such file or directory, stat 'Thu Apr 4 11:57:51 2019 21 mA 229867 mV 0 mW 556 Wh
Those values (21 mA, 229867 mV, 0 mW, 556 Wh) are what I want to save in Blob Storage. I know the node can connect to my Blob Storage account, because container5 is created every time I press inject, but no data ever arrives in it. Can anyone offer some help?
UPDATE:
I have been googling a lot, and the only possible problem I could find is that PUT and POST requests need authentication in order to work (for GET I could just set the container to "public" access and it worked). This means I need some kind of authentication to use a PUT request, which is what I need. The problem now is that I cannot find any username or password in the Azure Blob Storage service, only the access keys, shared access signatures (SAS), and not much more. I am not experienced, so I don't know what to put in the HTTP node to do this authentication. Any help?
Hello. This question is more specific and relevant than my previous ones. I have already flagged my other questions so that they get deleted (I cannot delete them myself), so please ignore the other two. I need help with this one.
Hi @xoani ... it looks like the root issue is in the Azure Blob node. If you look at the source for the node, azureblobstorage.js, it calls createContainer followed by createBlockBlobFromLocalFile.
It would be convenient if the node supported both sending the payload directly and creating the blob from a local file, but according to the README file on the project:
Azure Blob Storage
Node-Red node to connect to Azure Blob Storage
Ex: 'msg.payload' -> filename that you need to upload. Ex: filename.txt
Use msg.payload to send a file to save on Azure Blob Storage.
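So msg.payload has to be the path of a file that already exists on disk; the node just makes sure the container exists and then uploads that file. Purely as an illustration of that flow (not the node's own code), here is a rough Python sketch using the azure-storage-blob SDK; the connection string, container name, and file path are placeholders:

```python
# Rough Python equivalent of what the node does: make sure the container
# exists, then upload a local file as a block blob.
# CONNECTION_STRING, the container name, and the file path are placeholders.
from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("container5")

try:
    container.create_container()          # same idea as the node's createContainer step
except ResourceExistsError:
    pass                                  # container is already there

local_path = "/home/pi/plug_reading.txt"  # what the node expects to find in msg.payload
with open(local_path, "rb") as f:
    # same idea as the node's createBlockBlobFromLocalFile step
    container.upload_blob(name="plug_reading.txt", data=f, overwrite=True)
```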
Hi @dustinw, thanks for your reply. I am not a programmer, so sorry if I ask dumb stuff. Looking at the code, I see that it only uploads local files, right? So you mean I cannot upload msg.payload directly; I would probably have to modify my Python script to store the values in a file, so that this file can be uploaded. Right?
If this is the solution, I have a problem with it, because my Python script outputs new values every second (with a timestamp). So how could I do this? Do I rewrite the same file every second and trigger the Blob node every second too?
Hi @dustinw, and thank you! I think option 2 or 3 would be OK for me, because to be honest I don't really understand option 1 (it's probably the most advanced, right?). In the meantime I have found out what my problem is: I can't even access the Blob Storage because I need authentication (please see the update I wrote on my question). Thanks. If you have any idea how to do this, please let me know. I have been stuck for many days.
The error you added earlier looks like you have successfully authenticated via the API. Have you tried creating a temporary file and putting the path to that temporary file in the payload of the Azure Storage node?
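For the "new values every second" part of your earlier reply, one simple approach is to have the Python script overwrite the same file with the latest reading, and point the inject/Blob node at that fixed path. A rough sketch, where the path, the read_smart_plug() helper, and the one-second interval are just assumptions about your setup:

```python
# Rough sketch: overwrite one file with the latest reading each second,
# so the Azure Blob node can upload it from a fixed, known path.
# OUTPUT_PATH and read_smart_plug() are placeholders for your own setup.
import time
from datetime import datetime

OUTPUT_PATH = "/home/pi/plug_reading.txt"   # the path you would put in msg.payload

def read_smart_plug():
    # placeholder for your existing smart-plug query;
    # returns (current_mA, voltage_mV, power_mW, energy_Wh)
    return 21, 229867, 0, 556

while True:
    ma, mv, mw, wh = read_smart_plug()
    line = f"{datetime.now():%a %b %d %H:%M:%S %Y} {ma} mA {mv} mV {mw} mW {wh} Wh\n"
    # rewrite the whole file so it always holds only the newest reading
    with open(OUTPUT_PATH, "w") as f:
        f.write(line)
    time.sleep(1)
```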
Switching from the node-red-contrib-azure-blob-storage node to the built-in HTTP node would add a lot of work for authentication, etc.
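If you ever do go that route, the least painful option is usually a Shared Access Signature: you generate a SAS token in the portal and append it to the blob URL, so no username or password is involved. A rough sketch of such a PUT in Python, where the account, container, blob name, and token are all placeholders:

```python
# Rough sketch of an authenticated Put Blob request using a SAS token.
# ACCOUNT, CONTAINER, SAS_TOKEN, and the blob name are placeholders.
import requests

ACCOUNT = "mystorageaccount"      # placeholder storage account name
CONTAINER = "container5"
SAS_TOKEN = "sv=...&sig=..."      # generated in the Azure portal (Shared access signature)

blob_name = "reading.txt"
url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}?{SAS_TOKEN}"

data = "Thu Apr 4 11:57:51 2019 21 mA 229867 mV 0 mW 556 Wh"

resp = requests.put(
    url,
    data=data.encode("utf-8"),
    headers={"x-ms-blob-type": "BlockBlob"},  # required by the Put Blob REST operation
)
resp.raise_for_status()  # 201 Created on success
```

The x-ms-blob-type header is required by the Put Blob REST operation; without it the request is rejected.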
Hello, you are right. I tried switching to the HTTP node, but the authentication process is very hard, so now I am back to this node. It works now! The problem was that I had to create the file manually in /home/pi so that it could be found and uploaded. Now the remaining problem is that I want this file to be created automatically, but I will create another question for that. Thanks!