Effortless JSON creation from Linux command line output

Edit: corrected install directory
I have been struggling to get data from Linux commands into Node-red as JSON format and stumbled upon this utility.
Maybe other people will find it useful.

jc is a Python-based command line utility which understands the output of many Linux commands and reformats it as a JSON string.
To install it, run sudo pip3 install jc. It's installed to /usr/local/bin
More info at https://blog.kellybrazil.com/2019/11/26/bringing-the-unix-philosophy-to-the-21st-century/
and https://github.com/kellyjonbrazil/jc

Three ways to get Linux CLI data into Node-red:

  1. You can use the exec node to run a command
    free -m | grep -v total
    This returns a string which you have to parse

  2. Use a shell script to reformat the output as a json string, e.g. using awk

$ cat myawkwrapper.sh
#!/bin/bash
# Format each data row of free's output as a JSON object,
# strip the trailing comma, then wrap the rows in an array.
mem=$(free -m | grep -v total | awk '
{printf "{\"type\": \"%s\", \"total\": %d, \"used\": %d, \"free\": %d, \"available\": %d },\n",
$1, $2, $3, $4, $7 }')
mem="[${mem:0:-1}]"

echo "$mem"

This returns a string which the json node accepts and converts into the object

[
{"type":"Mem:","total":1871,"used":223,"free":1082,"available":1539},
{"type":"Swap:","total":1871,"used":0,"free":1871,"available":0}
]
  3. A much easier option (for commands it supports): jc
    free -m | jc --free

This returns a very similar string which converts to
[
{"type":"Mem","total":1871,"used":221,"free":1084,"shared":24,"buff_cache":565,"available":1541},
{"type":"Swap","total":1871,"used":0,"free":1871}
]
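Before wiring any of these into a flow, it can be worth confirming on the command line that the string really is valid JSON. A minimal sketch, using the sample jc output above and python3's standard json.tool module (jc itself isn't needed for the check):

```shell
# Validate a JSON string before feeding it to the json node.
# The sample below is the jc --free output shown above.
json='[{"type":"Mem","total":1871,"used":221,"free":1084,"shared":24,"buff_cache":565,"available":1541},{"type":"Swap","total":1871,"used":0,"free":1871}]'
echo "$json" | python3 -m json.tool >/dev/null && echo "valid JSON"
```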

[{"id":"81232669e7d1ad84","type":"tab","label":"Flow 6","disabled":false,"info":"","env":[]},{"id":"4e1464b54c2160e1","type":"inject","z":"81232669e7d1ad84","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"free -m","payloadType":"str","x":390,"y":160,"wires":[["e48fe23e797d5aa1"]]},{"id":"d417f6bdaeab1aaf","type":"inject","z":"81232669e7d1ad84","name":"My awk wrapper","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"/home/pi/myawkwrapper.sh","payloadType":"str","x":360,"y":240,"wires":[["8a32405c33195801"]]},{"id":"8a32405c33195801","type":"exec","z":"81232669e7d1ad84","command":"","addpay":"payload","append":"","useSpawn":"false","timer":"","winHide":false,"oldrc":false,"name":"","x":530,"y":240,"wires":[["bffaa6b12ad23ade"],[],[]]},{"id":"bffaa6b12ad23ade","type":"json","z":"81232669e7d1ad84","name":"Convert to js object","property":"payload","action":"obj","pretty":false,"x":710,"y":240,"wires":[["484d562f92e3e176"]]},{"id":"484d562f92e3e176","type":"debug","z":"81232669e7d1ad84","name":"Version 2","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":900,"y":240,"wires":[]},{"id":"e48fe23e797d5aa1","type":"exec","z":"81232669e7d1ad84","command":"","addpay":"payload","append":"","useSpawn":"false","timer":"","winHide":false,"oldrc":false,"name":"","x":530,"y":160,"wires":[["8d6406fb26c4dc7e"],[],[]]},{"id":"8d6406fb26c4dc7e","type":"debug","z":"81232669e7d1ad84","name":"Version 1","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":900,"y":160,"wires":[]},{"id":"4825cb8dcd1cf607","type":"inject","z":"81232669e7d1ad84","name":"free -m | jc --free","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"free -m | jc --free","payloadType":"str","x":360,"y":320,"wires":[["ba24cafbc46e1c65"]]},{"id":"ba24cafbc46e1c65","type":"exec","z":"81232669e7d1ad84","command":"","addpay":"payload","append":"","useSpawn":"false","timer":"","winHide":false,"oldrc":false,"name":"","x":530,"y":320,"wires":[["b2f1e4ba696a2a3a"],[],[]]},{"id":"b2f1e4ba696a2a3a","type":"json","z":"81232669e7d1ad84","name":"Convert to js object","property":"payload","action":"obj","pretty":false,"x":710,"y":320,"wires":[["bdf50e199fd7a19c"]]},{"id":"bdf50e199fd7a19c","type":"debug","z":"81232669e7d1ad84","name":"Version 3","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":900,"y":320,"wires":[]},{"id":"c7a5e8a8c7a4a7c2","type":"comment","z":"81232669e7d1ad84","name":"Exec free -m directly","info":"","x":350,"y":120,"wires":[]},{"id":"59fdd6f5e291d1ea","type":"comment","z":"81232669e7d1ad84","name":"Shell script to reformat it","info":"","x":330,"y":200,"wires":[]},{"id":"929e31bbf52e5f73","type":"comment","z":"81232669e7d1ad84","name":"Reformat it with jc","info":"","x":350,"y":280,"wires":[]}]

Here is another way to get JSON or XML data back into Node-RED - using a web endpoint that you set up with the http-in/-out nodes. This example is from one of the commands I run on a CRON schedule but you could, of course, trigger manually from an exec node.

#! /usr/bin/env bash
# Fast scan the local network for live devices and record
# to /tmp/nmap.xml which can be used in Node-RED
#

# Run the scan
nmap -sn -oX /tmp/nmap.xml --privileged -R --system-dns --webxml 192.168.1.0/24
# Make sure ownership & ACLs on the output are secure
chown root:home /tmp/nmap.xml
chmod --silent 640 /tmp/nmap.xml
# Trigger the Node-RED update
curl -I 'http://localhost:1880/localnetscan'

You will see that the result of the script is sent back to Node-RED via a curl command at the end of the script. In this case, I use that endpoint to trigger reading and processing the XML file but you could easily do a POST and pass in JSON/XML data.

I think @jbudd's case was to have commands produce usable json output in general and most don't.

That jc python script looks very useful, but I can also imagine it is very maintenance-heavy with all the various distros and their individual command versions; it looks like quite some work has gone into it already.


I use a couple of scripts.
To send a file to Node-RED:

#!/bin/bash

file_path=$1
# Check if the file exists
if [ -f "$file_path" ]; then
  # Send the POST request with the file as form-data
  curl -X POST -F "file=@$file_path" -F "topic=File received." "http://localhost:1880/curl"
else
  echo "File not found: $file_path"
fi

To send a buffer to Node-RED:

#!/bin/bash

file_path=$1
# Check if the file exists
if [ -f "$file_path" ]; then
  # Send the POST request with the file as form-data
  curl -X POST -F "file=@$file_path" -F "topic=Buffer received." "http://localhost:1880/curl"
else
  echo "File not found: $file_path"
fi

And the flow to receive them:

[
    {
        "id": "eb4c7d4de948d356",
        "type": "tab",
        "label": "unix to red",
        "disabled": false,
        "info": "",
        "env": []
    },
    {
        "id": "28495b0c3d3e8d65",
        "type": "http in",
        "z": "eb4c7d4de948d356",
        "name": "",
        "url": "/curl",
        "method": "post",
        "upload": true,
        "swaggerDoc": "",
        "x": 80,
        "y": 120,
        "wires": [
            [
                "94c62f56dbf19829",
                "b2a4edfd6f561f42",
                "49f172ca86f009fb"
            ]
        ]
    },
    {
        "id": "94c62f56dbf19829",
        "type": "http response",
        "z": "eb4c7d4de948d356",
        "name": "",
        "statusCode": "",
        "headers": {
            "content-type": "application/json"
        },
        "x": 390,
        "y": 80,
        "wires": []
    },
    {
        "id": "b2a4edfd6f561f42",
        "type": "function",
        "z": "eb4c7d4de948d356",
        "name": "file handler",
        "func": "if (msg.payload.topic == \"File received.\") {\n    return { \"payload\": msg.payload.topic, \"file\": msg.req.files[0].buffer.toString(), \"filename\": msg.req.files[0].originalname};\n} else if (msg.payload.topic == \"Buffer received.\") {\n    return { \"payload\": msg.payload.topic, \"file\": msg.req.files[0].buffer, \"filename\": msg.req.files[0].originalname };\n} else {\n    return msg;\n};",
        "outputs": 1,
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 250,
        "y": 160,
        "wires": [
            [
                "b4ecb48dbf9eba9c"
            ]
        ]
    },
    {
        "id": "b4ecb48dbf9eba9c",
        "type": "debug",
        "z": "eb4c7d4de948d356",
        "name": "debug 142",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 410,
        "y": 160,
        "wires": []
    },
    {
        "id": "49f172ca86f009fb",
        "type": "debug",
        "z": "eb4c7d4de948d356",
        "name": "debug 143",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 410,
        "y": 120,
        "wires": []
    }
]

Any further processing I'd just do in Node-RED most of the time.


@TotallyInformation, @HaroldPetersInskipp perhaps I didn't explain well enough.

It's not about how to get data into Node-red, it's about getting that data in json format without writing any code to parse it.

I changed the title.


I see. My intent with these scripts was actually to create the same functionality as well but I didn't quite finish that one yet. I am also going to be using "jc" if possible. Here is the current state:

#!/bin/bash
echo "Script still needs to be fixed to process and send json data"
args="$*"
# Create the JSON object (note: this breaks if the arguments contain quotes)
json_object="{\"payload\":\"$args\"}"

# Send the POST request
curl -X POST -H "Content-Type: application/json" -d "$json_object" "http://localhost:1880/curl"
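One reason the hand-built JSON above still needs fixing: any double quote in the arguments breaks the string. A hedged sketch of a pure-bash escape helper (json_escape is a made-up name, not part of any tool):

```shell
# Escape backslashes and double quotes before embedding text in JSON.
json_escape() {
  local s=$1
  s=${s//\\/\\\\}   # escape backslashes first
  s=${s//\"/\\\"}   # then double quotes
  printf '%s' "$s"
}

args='say "hello"'
json_object="{\"payload\":\"$(json_escape "$args")\"}"
echo "$json_object"
```

This prints {"payload":"say \"hello\""}, which is valid JSON, where the original interpolation would not be.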

I only just discovered jc and its use for formatting the output of single commands; I'm hoping that it will simplify my more complex scripts too.

Regarding sending script output to Node-red, I use MQTT because it "just feels" more elegant than HTTP.
For sure the invocation is no less cryptic:
mosquitto_pub -h $BROKER -t $TOPIC -m "$output" -u $USER -P $PWD
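For completeness, a sketch of that MQTT route end to end, assuming mosquitto-clients is installed. The broker details are placeholders, and I've used PASS rather than PWD, since bash already uses $PWD for the working directory:

```shell
# Capture a command's output and publish it to an MQTT topic.
BROKER=localhost          # placeholder broker
TOPIC="stats/memory"      # placeholder topic
USER=nodered              # placeholder credentials
PASS=secret

output=$(free -m)
# Guard so the sketch degrades gracefully where mosquitto isn't installed
if command -v mosquitto_pub >/dev/null 2>&1; then
  mosquitto_pub -h "$BROKER" -t "$TOPIC" -m "$output" -u "$USER" -P "$PASS"
fi
```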

An update:

I'm finding jc (and jq) less useful than I hoped because it's slooooww.

An example on my Raspberry Pi Zero

time lscpu | jc --kv | jq '{processor: ."Model name",   BogoMIPS, CPUs: ."CPU(s)"}' 
{
  "processor": "Cortex-A53",
  "BogoMIPS": "38.40",
  "CPUs": "4"
}

real    0m0.557s
user    0m0.892s
sys     0m0.061s

versus this much, much less intuitive command loosely based on a suggestion from OpenAI

time lscpu | egrep "Model name|BogoMIPS|^CPU\(s\)" | sed -r 's/^(.*):(\s*)(.*)/"\1": "\3",/; $ s/,$/\n}/; s/Model name/processor/; s/CPU(s)/CPUs/; 1i{'
{
"CPU(s)": "4",
"processor": "Cortex-A53",
"BogoMIPS": "38.40"
}

real    0m0.025s
user    0m0.002s
sys     0m0.051s

OK, but I'm not sure I see the point - it is a lot of work for little gain.

For example, if the returned output looks something like x=23,foo="bah", you could get that into Node-RED with a simple GET rather than a post since it works as a set of URL parameters.
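For instance, with a hypothetical /stats endpoint behind an http-in node (the endpoint name here is made up), that key=value output maps straight onto query parameters:

```shell
# Pass simple key=value data as GET query parameters.
params="x=23&foo=bah"
# "|| true" keeps the script alive if Node-RED isn't listening
curl -s "http://localhost:1880/stats?${params}" || true
```

In the flow, the values then arrive ready-parsed as msg.req.query.x and msg.req.query.foo with no parsing step at all.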

For more complex data, I'd use a bash script to format the text output into something reasonably logical that is easily parsed and do the rest in Node-RED where it is easier.

I see the desire for a tool that does it all for you but as you say, it is likely to be slow due to having to handle so many different formats.


I suppose it depends what you are used to.
I have never knowingly constructed a "simple GET" so it seems ridiculously complicated to invoke a browser to obtain the output of a command line tool (if that is what GET means).
Clearly you don't share this point of view. One of the joys of Linux is lots of ways to do the simplest thing.

Hmm. I think that's exactly what I'm doing!

When I first played with Node-red I had no idea what Javascript was. On the other hand I had decades of experience in Unix shells.
For people with Javascript but no Linux knowledge I agree, do everything in Node-red where it's easier.

ps The lscpu command in my example above is simple enough but its output is not trivial to parse in any language. The jc command does a good, if slow, job of cutting it down to something reasonably logical.


Architecture:        armv7l
Byte Order:          Little Endian
CPU(s):              4
On-line CPU(s) list: 0-3
Thread(s) per core:  1
Core(s) per socket:  4
Socket(s):           1
Vendor ID:           ARM
Model:               3
Model name:          Cortex-A72
Stepping:            r0p3
CPU max MHz:         2000.0000
CPU min MHz:         600.0000
BogoMIPS:            108.00
Flags:               half thumb fastmult vfp edsp neon vfpv3 tls vfpv4 idiva idivt vfpd32 lpae evtstrm crc32

pps
I see that your HTTP endpoint and curl are just the delivery method.
I use MQTT and mosquitto_pub.
Not much reason to prefer one over the other.

Not quite - an http-in/-out pair of nodes in node-red creates the simplest of endpoints that can be used to trigger a flow. So a curl/wget from a command line is quick and easy.

Urm, so is its built-in json formatting not good enough?

home@home:~$ lscpu --json
{
   "lscpu": [
      {"field":"Architecture:", "data":"x86_64"},
      {"field":"CPU op-mode(s):", "data":"32-bit, 64-bit"},
      {"field":"Byte Order:", "data":"Little Endian"},
      {"field":"Address sizes:", "data":"39 bits physical, 48 bits virtual"},
      {"field":"CPU(s):", "data":"4"},
      {"field":"On-line CPU(s) list:", "data":"0-3"},
      {"field":"Thread(s) per core:", "data":"2"},
      {"field":"Core(s) per socket:", "data":"2"},
      {"field":"Socket(s):", "data":"1"},
      {"field":"NUMA node(s):", "data":"1"},
      {"field":"Vendor ID:", "data":"GenuineIntel"},
      {"field":"CPU family:", "data":"6"},
      {"field":"Model:", "data":"69"},
      {"field":"Model name:", "data":"Intel(R) Core(TM) i5-4300U CPU @ 1.90GHz"},
      {"field":"Stepping:", "data":"1"},
      {"field":"CPU MHz:", "data":"1589.783"},
      {"field":"CPU max MHz:", "data":"2900.0000"},
      {"field":"CPU min MHz:", "data":"800.0000"},
      {"field":"BogoMIPS:", "data":"4988.22"},
      {"field":"Virtualization:", "data":"VT-x"},
      {"field":"L1d cache:", "data":"32K"},
      {"field":"L1i cache:", "data":"32K"},
      {"field":"L2 cache:", "data":"256K"},
      {"field":"L3 cache:", "data":"3072K"},
      {"field":"NUMA node0 CPU(s):", "data":"0-3"},
      {"field":"Flags:", "data":"fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid sse4_1 sse4_2 x2apic movbe popcnt aes xsave avx f16c rdrand lahf_lm abm cpuid_fault epb invpcid_single pti tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm xsaveopt dtherm ida arat pln pts"}
   ]
}
home@home:~$

Also, call me old-fashioned, but the standard output is a fixed-width column layout which is pretty easy to parse, since you can simply split the output at a fixed column and then trim the resulting columns.
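A minimal sketch of that approach: the default output is really "Field: value" pairs, so splitting each line at the first colon (letting awk eat the padding) gets you clean key/value pairs without jc. The sample lines are taken from the lscpu output quoted earlier in the thread:

```shell
# Split "Field:   value" lines at the colon and trim the padding.
lscpu_sample='Architecture: armv7l
CPU(s): 4
Model name: Cortex-A72'
echo "$lscpu_sample" | awk -F': *' '{printf "\"%s\": \"%s\"\n", $1, $2}'
```

This emits one quoted "key": "value" pair per line, e.g. "CPU(s)": "4"; wrapping the result in braces and adding commas is then a small extra step.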

Also, it has lscpu -p which specifically produces easily parsable output (a CSV format in fact).

home@home:~$ lscpu -p
# The following is the parsable format, which can be fed to other
# programs. Each different item in every column has an unique ID
# starting from zero.
# CPU,Core,Socket,Node,,L1d,L1i,L2,L3
0,0,0,0,,0,0,0,0
1,0,0,0,,0,0,0,0
2,1,0,0,,1,1,1,0
3,1,0,0,,1,1,1,0
home@home:~$

Not trying to be difficult here. I don't think I've ever used that command before, but many modern Linux commands deliberately produce text-parsable output and/or have json output switches.

Yes, they do, and are all the better for it.

Amen to that brother! :grin:
