Extract numbers from image

I'm not sure what the terminology is, but I'd like to set up a flow to extract the hydro meter readings from a live-view camera. I already know how to pull in still images from the camera, but the meter cycles through the three attached images with a blank display in between. It's a bidirectional meter, so I want to read both values and differentiate between them. Can Node-RED be set up to do this?

Hi @m_elias,
Can node-red-contrib-tesseract be used for this?
This node uses tesseract.js.
I have not used this node myself!!
Bart

The process is called OCR; it stands for optical character recognition.

If you use node-red-contrib-image-tools, you can crop the image to the LCD part (making the OCR do less work and focusing it only on the region of interest); this should be fairly reliable.
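If you want to sanity-check the crop-then-OCR idea outside Node-RED first, a few lines of Python with Pillow and pytesseract will do. This is only a sketch: the file name, the crop box and the tesseract options are placeholders to adjust for your own image.

```python
# Minimal sketch: crop the LCD region out of a camera still, then OCR it.
# File name, crop box and tesseract options are placeholders to adjust.
from PIL import Image
import pytesseract

img = Image.open("meter.jpg")                  # full camera frame
lcd = img.crop((50, 310, 360, 400))            # (left, upper, right, lower) around the LCD
lcd = lcd.convert("L")                         # greyscale usually helps OCR

# Restrict tesseract to digits; --psm 7 treats the image as a single text line
text = pytesseract.image_to_string(
    lcd, config="--psm 7 -c tessedit_char_whitelist=0123456789")
print(text.strip())
```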


I tested it today with my sample images. I cropped them with the image tools node, then fed that into the tesseract node, but it sees no text.

(attached: cropped sample images "2 cropped" and "12 cropped")

{"text":"","confidence":0,"lines":[]}

Are you passing a buffer to the tesseract node?

Share your flow & I'll give it a go later.

Yes. At first I had the crop node set to output an image, and the tesseract node never output any results. Then I changed the crop node output to binary and the tesseract node outputs data, but it doesn't appear to identify any text. Could it be the LCD segments are a problem?

I've set up my flow to pull the images from my local web server, in preparation to pull them from an IP camera later. I'm not sure how to share the flow so that you can test it, but here it is the way I'm using it, and I guess you can use the images I've shared in this thread.

[{"id":"41f142ae.569b1c","type":"jimp-image","z":"4c195d90.05acd4","name":"","data":"payload","dataType":"msg","ret":"buf","parameter1":"x","parameter1Type":"msg","parameter2":"y","parameter2Type":"msg","parameter3":"w","parameter3Type":"msg","parameter4":"h","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","sendProperty":"payload","sendPropertyType":"msg","parameterCount":4,"jimpFunction":"crop","selectedJimpFunction":{"name":"crop","fn":"crop","description":"crop to the given region","parameters":[{"name":"x","type":"num","required":true,"hint":"the x coordinate to crop form"},{"name":"y","type":"num","required":true,"hint":"the y coordinate to crop form"},{"name":"w","type":"num","required":true,"hint":"the width of the crop region"},{"name":"h","type":"num","required":true,"hint":"the height of the crop region"}]},"x":430,"y":120,"wires":[["18814ede.a4c5f1"]]},{"id":"9a6e37f4.668ad8","type":"http request","z":"4c195d90.05acd4","name":"get image","method":"GET","ret":"bin","paytoqs":"ignore","url":"","tls":"","persist":false,"proxy":"","authType":"","x":300,"y":120,"wires":[["41f142ae.569b1c"]]},{"id":"a23b5433.cd9f88","type":"inject","z":"4c195d90.05acd4","name":"test.jpg","props":[{"p":"x","v":"55","vt":"num"},{"p":"y","v":"316","vt":"num"},{"p":"w","v":"253","vt":"num"},{"p":"h","v":"56","vt":"num"},{"p":"url","v":"http://192.168.12.251/misc/test.jpg","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":130,"y":80,"wires":[["9a6e37f4.668ad8"]]},{"id":"4f83f2c0.a20034","type":"debug","z":"4c195d90.05acd4","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":810,"y":120,"wires":[]},{"id":"18814ede.a4c5f1","type":"image viewer","z":"4c195d90.05acd4","name":"","width":160,"data":"payload","dataType":"msg","x":550,"y":120,"wires":[["b3f9b3f2.c4a768"]]},{"id":"55acb0b4.b0aea","type":"inject","z":"4c195d90.05acd4","name":"2.jpg","props":[{"p":"x","v":"47","vt":"num"},{"p":"y","v":"311","vt":"num"},{"p":"w","v":"309","vt":"num"},{"p":"h","v":"89","vt":"num"},{"p":"url","v":"http://192.168.12.251/misc/2.jpg","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":130,"y":120,"wires":[["9a6e37f4.668ad8"]]},{"id":"d8e5a77c.6ec4f8","type":"inject","z":"4c195d90.05acd4","name":"12.jpg","props":[{"p":"x","v":"86","vt":"num"},{"p":"y","v":"339","vt":"num"},{"p":"w","v":"252","vt":"num"},{"p":"h","v":"53","vt":"num"},{"p":"url","v":"http://192.168.12.251/misc/12.jpg","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":130,"y":160,"wires":[["9a6e37f4.668ad8"]]},{"id":"b3f9b3f2.c4a768","type":"tesseract","z":"4c195d90.05acd4","name":"","language":"eng","x":680,"y":120,"wires":[["4f83f2c0.a20034"]]}]

I tried a test image using an LCD-type font and a more normal font (Calibri). The Calibri numbers work but the LCD font does not, so I think it's the LCD segments causing problems.

Could you try converting the image to black & white? It may need some more contrast.

Indeed it seems that tesseract has difficulty recognizing 7-segment displays. It also seems you can train tesseract to make this possible, but I don't know how that works... And some others apply image preprocessing, like @bakman2 suggests, e.g. as in the digital_display_ocr.py file of this project.

Or you can drop the tesseract node entirely and implement some kind of recognition yourself, e.g. with this trick based on pixel counting. Perhaps you can find a JavaScript variant somewhere...
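For what it's worth, that pixel-counting trick boils down to checking whether each of the seven segment positions of a digit cell is lit and looking the on/off pattern up in a table. A rough Python sketch of the idea (not the linked script; the cell coordinates and sample points are made up and would have to be measured on your own cropped image):

```python
# Sketch of seven-segment decoding by sampling pixels - just the idea.
# Assumes a thresholded greyscale image where lit segments are white (255).
from PIL import Image

# Segment pattern -> digit (order: top, top-left, top-right, middle, bottom-left, bottom-right, bottom)
DIGITS = {
    (1,1,1,0,1,1,1): 0, (0,0,1,0,0,1,0): 1, (1,0,1,1,1,0,1): 2,
    (1,0,1,1,0,1,1): 3, (0,1,1,1,0,1,0): 4, (1,1,0,1,0,1,1): 5,
    (1,1,0,1,1,1,1): 6, (1,0,1,0,0,1,0): 7, (1,1,1,1,1,1,1): 8,
    (1,1,1,1,0,1,1): 9,
}

def read_digit(img, x, y, w, h, threshold=128):
    """Sample one point per segment inside the digit cell at (x, y) with size (w, h)."""
    # Relative sample points for the seven segments (fractions of the cell size)
    points = [(0.5, 0.1), (0.2, 0.3), (0.8, 0.3), (0.5, 0.5),
              (0.2, 0.7), (0.8, 0.7), (0.5, 0.9)]
    pattern = tuple(1 if img.getpixel((int(x + fx * w), int(y + fy * h))) > threshold else 0
                    for fx, fy in points)
    return DIGITS.get(pattern, "?")

img = Image.open("lcd_thresholded.png").convert("L")
# Hypothetical digit cells: one x offset per digit, same y, width and height
print([read_digit(img, x, 0, 30, 50) for x in (0, 35, 70, 105)])
```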

Thanks for the suggestions, I see there are some good examples/data on doing this but I'm unsure how to implement it in NR and it looks like it'll exceed my abilities/time for this project. Thanks again.

Hi Matthew,
I found here some preprocessing steps that can be used to fill the gaps between the segment lines, in case you would ever need it. It is based on OpenCV, but I assume this kind of image processing might also be possible with the node-red-contrib-image-tools nodes from @Steve-Mcl.
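That kind of preprocessing is essentially a threshold followed by a morphological "close", which merges the gaps between the individual segments into solid strokes that OCR copes with better. A hedged OpenCV sketch of just that step (the file names, threshold value and kernel size are guesses to tune on your own crops):

```python
# Sketch of the gap-filling preprocessing idea with OpenCV.
# Threshold and kernel size are guesses; tune them on your own crops.
import cv2

img = cv2.imread("lcd_crop.jpg", cv2.IMREAD_GRAYSCALE)

# Binarise: lit LCD segments become white, background black
_, binary = cv2.threshold(img, 70, 255, cv2.THRESH_BINARY)

# Morphological close fills the small gaps between individual segments
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)

cv2.imwrite("lcd_closed.png", closed)
```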

Or, if you ever want to continue with your project in the future, perhaps you could use this trick to create a simple decoder, e.g. count the number of pixels at some coordinates on your camera image...

Probably...



Why not install tesseract standalone? You will have more flexibility for testing things out in tesseract (training it, fixing bugs) before hooking it into NR.
Then you hook it into NR through the command line. I used this method for an external tool without writing a new node.

Or make it even more flexible and useful as a service, via MQTT
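As a rough illustration of that service idea, a small Python script could subscribe to an MQTT topic, run the standalone tesseract binary on each incoming image buffer, and publish the recognised text back. The broker address, topic names and temp file path below are assumptions, and it is written against the paho-mqtt 1.x client API:

```python
# Sketch: wrap the standalone tesseract CLI behind MQTT so Node-RED can call it.
# Broker address, topic names and the temp file path are assumptions.
import subprocess
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style API

BROKER = "127.0.0.1"
IN_TOPIC = "ocr/in"     # Node-RED publishes the image buffer here
OUT_TOPIC = "ocr/out"   # recognised text is published back here

def on_message(client, userdata, msg):
    with open("/tmp/ocr_input.png", "wb") as f:
        f.write(msg.payload)                      # payload is the raw image buffer
    # "stdout" makes tesseract print the text instead of writing a file
    result = subprocess.run(
        ["tesseract", "/tmp/ocr_input.png", "stdout", "--psm", "7"],
        capture_output=True, text=True)
    client.publish(OUT_TOPIC, result.stdout.strip())

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(IN_TOPIC)
client.loop_forever()
```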

As a matter of fact, you could also take the "lazy" way and use an online service. In this case I played a bit with AWS Rekognition, doing text analysis (Rekognition automatically detects and extracts text in your images).

In the flow below I inject the sample images as provided by @m_elias and do some image manipulation using the great nodes provided by @Steve-Mcl, basically setting a threshold and inverting them. I am not certain these are the best methods to use, but so far they work. Finally I feed the images to a Python script via MQTT.

The Python script basically does two main things:

  • it sends the image buffer to AWS for analysis
  • it returns the result back to Node-RED via MQTT (a sketch along these lines follows below)
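Before opening the attachment, here is a hedged sketch of what a script along those lines can look like. It is not the attached awsReadText script and the AWS region is an assumption, while the aws_in/aws_out topics and the awsReadText_close stop message match the flow further down:

```python
# Hedged sketch of the Rekognition round-trip - not the attached awsReadText script,
# just the idea: image buffer in via MQTT, detected text back out via MQTT.
import json
import boto3
import paho.mqtt.client as mqtt  # paho-mqtt 1.x style API

rekognition = boto3.client("rekognition", region_name="eu-west-1")  # region is an assumption

def on_message(client, userdata, msg):
    if msg.payload == b"awsReadText_close":
        client.disconnect()          # the Node-RED inject publishes this to stop the script
        return
    response = rekognition.detect_text(Image={"Bytes": msg.payload})
    lines = [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]
    client.publish("aws_out", json.dumps(lines))

client = mqtt.Client()
client.on_message = on_message
client.connect("127.0.0.1", 1883)
client.subscribe("aws_in")
client.loop_forever()
```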

To make this wonder happen, you need to register an account with AWS. If I remember correctly, it is not that expensive to use, but it depends on how frequently you plan to do the meter readings :wink:

To set this up, in this case on an RPi, you need to install a few things:

  • install the Mosquitto MQTT broker unless you have it installed already

For Node-RED:

  • install node-red-contrib-image-tools and then import the flow below. If you have the MQTT broker on the same RPi as I have, it's fine; otherwise you have to change the configuration of the MQTT nodes

For Python:

  • paho MQTT client, run sudo pip3 install paho-mqtt, read here: paho-mqtt · PyPI
  • boto3, run sudo pip3 install boto3 and configure the credentials for AWS
    Read "how to" here: boto3 · PyPI
    (I skipped the virtual environment, I installed boto3 directly in a "normal" way)

Editing the script:

  • change the file extension from .txt to .py and save the file in your /home/pi directory
  • if your MQTT broker is not installed on the same RPi, then you have to change the IP address for it inside the script

To start the script, type python3 awsReadText.py on the command line in a terminal

To terminate the script, use the button in the Node-RED flow

Hope I covered everything.

EDIT: I forgot to tell you that you also have to have the images available for testing. What I did was simply copy them and save them under the chosen names in my /home/pi directory. You will have to do this in addition, until you have the interface for capturing the images connected instead.


[{"id":"b73aca11.4cdb98","type":"inject","z":"4e2296ba.635af8","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"/home/pi/meter2.jpg","payloadType":"str","x":170,"y":280,"wires":[["bb7d4b0.4eee0b8"]]},{"id":"bb7d4b0.4eee0b8","type":"jimp-image","z":"4e2296ba.635af8","name":"","data":"payload","dataType":"msg","ret":"img","parameter1":"70","parameter1Type":"num","parameter2":"255","parameter2Type":"num","parameter3":"true","parameter3Type":"bool","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","sendProperty":"payload","sendPropertyType":"msg","parameterCount":3,"jimpFunction":"threshold","selectedJimpFunction":{"name":"threshold","fn":"threshold","description":"apply one or more functions","parameters":[{"name":"max","group":"options","type":"num","required":true,"hint":"max value of byte 0 ~ 255"},{"name":"replace","group":"options","type":"num","required":false,"hint":"replace with byte 0 ~ 255. Default is 255"},{"name":"autoGreyscale","group":"options","type":"bool","required":false,"hint":"default is true"}]},"x":370,"y":260,"wires":[["b3b7ec90.139e9"]]},{"id":"35ab66e0.14586a","type":"image viewer","z":"4e2296ba.635af8","name":"","width":160,"data":"payload","dataType":"msg","x":590,"y":320,"wires":[[]]},{"id":"d953ac2.3e83e5","type":"inject","z":"4e2296ba.635af8","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"awsReadText_close","payloadType":"str","x":170,"y":380,"wires":[["d08bdc79.2c53b"]]},{"id":"d08bdc79.2c53b","type":"mqtt out","z":"4e2296ba.635af8","name":"","topic":"aws_in","qos":"","retain":"","broker":"75eba16c.094f9","x":370,"y":380,"wires":[]},{"id":"b3b7ec90.139e9","type":"jimp-image","z":"4e2296ba.635af8","name":"","data":"payload","dataType":"msg","ret":"buf","parameter1":"","parameter1Type":"msg","parameter2":"","parameter2Type":"msg","parameter3":"","parameter3Type":"msg","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","sendProperty":"payload","sendPropertyType":"msg","parameterCount":0,"jimpFunction":"invert","selectedJimpFunction":{"name":"invert","fn":"invert","description":"invert the image colours","parameters":[]},"x":370,"y":320,"wires":[["35ab66e0.14586a","d08bdc79.2c53b"]]},{"id":"e17a0353.f60b9","type":"inject","z":"4e2296ba.635af8","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"/home/pi/meter.jpg","payloadType":"str","x":170,"y":230,"wires":[["bb7d4b0.4eee0b8"]]},{"id":"3617c0cb.5eede","type":"mqtt 
in","z":"4e2296ba.635af8","name":"","topic":"aws_out","qos":"2","datatype":"auto","broker":"75eba16c.094f9","x":140,"y":490,"wires":[["1cd9dea.a554d21"]]},{"id":"1cd9dea.a554d21","type":"debug","z":"4e2296ba.635af8","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"false","statusVal":"","statusType":"auto","x":370,"y":490,"wires":[]},{"id":"75eba16c.094f9","type":"mqtt-broker","name":"","broker":"127.0.0.1","port":"1883","clientid":"","usetls":false,"compatmode":true,"keepalive":"60","cleansession":true,"birthTopic":"online","birthQos":"0","birthPayload":"BULB-1/LWT","closeTopic":"","closeQos":"0","closePayload":"","willTopic":"offline","willQos":"0","willPayload":"BULB-1/LWT"}]

awsReadText.txt (1.8 KB)


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.