Using NodeGen: generated node results in a very large flow

Hi,
I generated a node from a Swagger file using NodeGen.
When I use that node in a flow, the flow JSON contains a very large set of properties, one for each REST API call plus some related values.
Is there any way to shrink that data? I tried removing all the entries and the flow continued to work, but they get re-created. If I use several nodes of that size, the flow's JSON file blows up.

I'm not sure if this is related to NodeGen or to Node-RED.

Example (everything starting with postAhapi...):

{
        "id": "7f94b4a1.a54abc",
        "type": "AhapiEmail",
        "z": "d044536a.f72f",
        "service": "a2084bc4.eb4ff8",
        "postAhapiCaseByCaseidEmailexportByEmail_caseid": "",
        "postAhapiCaseByCaseidEmailexportByEmail_caseidType": "str",
        "postAhapiCaseByCaseidEmailexportByEmail_email": "",
        "postAhapiCaseByCaseidEmailexportByEmail_emailType": "str",
...}

Is this a Node-RED issue or a NodeGen issue?
I would be happy even just to know whether it is unrelated to Node-RED core.

It'll be NodeGen - it creates all the possible endpoints for you to use - so if the Swagger doc is large, the node will be large.

Thank you @dceejay. Yes, that's what I was expecting. But when I use this NodeGen-generated node in Node-RED, there is an attribute for each endpoint (set to "" or "str"). In the node I select only one endpoint, yet all the others still appear as attributes at the node level. I can't understand the reason for that.

Well yes. Someone would need to dig into the source to find out what is prompting that (probably both the source OpenAPI document and possibly the converter).
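
For anyone hitting this later: the entries come back because the Node-RED editor serializes every property a node declares in the `defaults` section of its editor definition each time the flow is deployed or exported, whether or not the property is used. Below is a minimal sketch of what the generated editor definition presumably looks like, reusing the property names from the flow JSON above; the surrounding structure (category, the "ahapi-service" config-node type, labels) is an assumption for illustration, not NodeGen's actual template.

// Sketch of the editor-side definition in the generated node's .html file.
// Every key listed in "defaults" is written back into the flow JSON on each
// deploy/export, so hand-deleted entries reappear with their default values.
// The four endpoint property names are taken from the flow JSON above;
// everything else here is illustrative.
RED.nodes.registerType("AhapiEmail", {
    category: "function",
    defaults: {
        service: { value: "", type: "ahapi-service" }, // config node reference (type name assumed)
        postAhapiCaseByCaseidEmailexportByEmail_caseid:     { value: "" },
        postAhapiCaseByCaseidEmailexportByEmail_caseidType: { value: "str" },
        postAhapiCaseByCaseidEmailexportByEmail_email:      { value: "" },
        postAhapiCaseByCaseidEmailexportByEmail_emailType:  { value: "str" }
        // ...two entries like these for every parameter of every endpoint
        // in the Swagger document, which is where the size comes from.
    },
    inputs: 1,
    outputs: 1,
    label: function () { return "AhapiEmail"; }
});

If the generated definition works this way, the size is inherent to how the node declares its configuration, and trimming the flow file by hand will not stick because the editor restores the full set of defaults on the next deploy.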

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.