Getting to a 3D world with node-red-contrib-web-babylonjs

This is a very early but usable draft of my idea for 3D visualization (maybe AR in later phases, not now) using Node-RED, based on WebGL and the Babylon.js library.



FEATURES:
Draw basic meshes: sphere, box, plane
Apply spatial transformations via WebSocket (see the rough sketch below)

LIMITATIONS:
Only works on localhost; the WebSocket URL is still hardcoded to localhost
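A rough sketch of the kind of transformation message a flow could send from a function node (the property names here are purely hypothetical; the node's real message format may differ):

    // Hypothetical function node payload for a spatial transformation;
    // the actual property names used by node-red-contrib-web-babylonjs may differ.
    msg.payload = {
        position: { x: 2, y: 0, z: 1 },
        rotation: { x: 0, y: Math.PI / 4, z: 0 },   // radians
        scaling:  { x: 1, y: 1, z: 1 }
    };
    return msg;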

3 Likes

Morning Ahmed,

Looks like a VERY useful contribution!

Question 1

I had never heard of babylon.js until now. Are there any special IoT-related use cases you already had in mind for your node?

Our friend @dceejay had a proposal some time ago, based on this article:

That article uses three.js and has a nice 3D use case to control lights in a house:

I see that babylon.js also supports lights. So this kind of fancy stuff might be possible with your node?

Question 2

I haven't had time yet to play with your node, so perhaps this is already possible: is there any kind of web-based babylon.js drawing editor that could be embedded with your node, so we can go from your node directly to an editor, like we integrated DrawSvg into our node-red-contrib-ui-svg node?

Thanks for your time and effort!!!
Bart

1 Like

Hi Bart,

To be honest, I was working on it and hesitated to share it until I had a valid use case, but I shared it to explore the use cases with the community. I was exploring AR as well.

But what you propose is a really interesting use case as well.

As for the editor: yes, there is an editor. I'm not sure yet about its capabilities; I'm still exploring it as well.

1 Like

@BartButenaers I am exploring DrawSvg and it is very impressive, especially how a click on the screen is reflected in Node-RED.
It is full of useful ideas. Thank you very much for referencing it.
Maybe I can think about adding similar features.

Damn, I need more time to explore all this nice stuff :roll_eyes:
P.S. I also have a node-red-contrib-drawsvg node, which allows hosting a DrawSvg service for offline installations. Perhaps (in a far future) we could implement a similar node to host an offline BabylonJs editor...

I did not yet have a chance to look at your code. Am I correct that you only use the BabylonJs viewer to render the result? I saw this morning that there is also a BabylonJs editor available. I don't immediately see how you could integrate this into your jQuery config screen, although I assume it should be possible somehow, because (if I'm not mistaken) they show here how to integrate it within a React web app?

@ahmadsayed,
Finally I had time to have a look at your repository. Now I see that it is a "...-web-..." node instead of a dashboard "...-ui-..." node, so you show the rendering in a separate window. And you have created a separate node for each shape (sphere, plane, ...):

[screenshot: a separate palette node for each shape]

It is VERY user friendly this way, since you can just drop new shapes in the flow. But - based on the feedback and feature requests we got from the UI SVG node users - I 'think' this approach will also have some disadvantages:

  • Not possible to create extra shapes dynamically.
  • Due to the large number of available shapes, the palette sidebar on the left will become very large.
  • Is it possible to create multiple (completely separate) scenes, or do all the shapes belong to a single scene?
  • It might be a bit more difficult if you want to add user interaction with a scene. Suppose a user wants to make a sphere (e.g. a light bulb) clickable: then you need to add a wire to the output of that 'sphere' node, which will result in lots of wires.

But don't hesitate to correct me if I'm wrong!!!!!!

This morning (after a quick look at it) I thought it was a bit more like the UI SVG node, i.e. that there was a single BabylonJs node, which:

  • can dynamically be controlled via input messages:

    {
       payload: {
          command: "create_shape",
          shape: "sphere",
          id: "someUniqueShapeId",
          radius: 25,
          ...
       }
    }
    

    And further you could have "delete_shape", "update_shape", ...

  • sends output messages when a clickable shape is clicked (see the sketch after this list).

  • contains a large edit box on the config screen, where you can manually enter the JSON scene definition.

  • (Optionally) has some way to open the BabylonJs online/offline scene editor via a button, to allow the JSON scene to be edited via that tool.
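Just to make that idea a bit more concrete, a follow-up command could look something like this (again a purely imaginary message format, nothing that exists today):

    {
       payload: {
          command: "update_shape",
          id: "someUniqueShapeId",
          position: { x: 0, y: 2, z: 0 }
       }
    }

And the output message for a clicked shape could be as simple as:

    {
       payload: {
          event: "click",
          id: "someUniqueShapeId"
       }
    }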

Keep up the good work! I'm sure your node has a lot of potential...
Bart

Hi Bart,

You are absolutely right; having a node that builds the scene dynamically, in a declarative way, is on my TODO list.
Let me explain my thinking process. Most 3D software gives the user basic 3D meshes, a.k.a. primitive meshes; some tools such as Blender even have easter eggs like the monkey head as a primitive. As basic drag-and-drop components they give the user a jump start for rapid prototyping, and I believe Node-RED is the best, maybe the only, tool that offers this kind of intuitive rapid prototyping with almost no documentation reading.

Of course the 3D tools offer some capabilities to morph those primitives later on, but my thinking is that even if I add such morphing it will be procedural morphing by code, not artistic (honestly, it is still not on my TODO list as of now).

Providing those basic meshes gives the user an easy jump start. A declarative way to provide advanced meshes can still be added, but in that case it will use a standard file format such as glTF, which can be created with professional software tools and then rendered and controlled by Node-RED. This is on my TODO list, but such a complex mesh will be a single unit, with a single input and output.
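Purely as an illustration of the glTF idea (not how the node works today), importing such a file into a Babylon.js scene could look roughly like this, assuming the babylonjs-loaders plugin is available and the file is served from a made-up /models/ URL:

    // Rough sketch: import an external glTF asset into an existing Babylon.js scene.
    // Requires the babylonjs-loaders plugin; the root URL and file name are made up.
    BABYLON.SceneLoader.ImportMesh(
        "",                 // empty string = import all meshes in the file
        "/models/",         // hypothetical root URL served by Node-RED
        "machine.gltf",     // file authored in a professional 3D tool
        scene,
        function (meshes) {
            // Treat the whole import as one unit that messages can move around
            meshes[0].position = new BABYLON.Vector3(0, 1, 0);
        }
    );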

To be honest, the idea of making the 3D objects clickable only came to me from your initial comment, and it has become almost the first item on my TODO list; I have not put much thought into it yet. I am using my 8-year-old son to test it: whatever he finds convenient, I do :slight_smile:

As of now, no. But why does the scene node exist at all instead of being added implicitly? Because the scene itself has some default presets for camera and light. Again on my TODO list :blush: is to add some properties to control the global lighting as well as the camera behaviour; not as advanced as professional 3D software, rather use-case-opinionated presets. That is my thinking.
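To sketch what such an opinionated camera/light preset could look like in Babylon.js (the specific values here are just guesses, not what the scene node actually uses):

    // Possible default preset for the scene node (values are illustrative only).
    var camera = new BABYLON.ArcRotateCamera(
        "presetCamera",
        -Math.PI / 2,            // alpha: horizontal angle around the target
        Math.PI / 2.5,           // beta: vertical angle
        10,                      // radius: distance from the target
        BABYLON.Vector3.Zero(),  // look at the origin
        scene
    );
    camera.attachControl(canvas, true);

    var light = new BABYLON.HemisphericLight(
        "presetLight",
        new BABYLON.Vector3(0, 1, 0),  // soft light from straight above
        scene
    );
    light.intensity = 0.8;             // the "global lighting" knob the node could expose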

Thank you very much for the support.

That is indeed true! If you have the time to offer both solutions, you could support all kinds of users. Just be aware that lots of users don't like their palette being stuffed with tons of nodes, especially nodes they don't use. So perhaps you should (somehow?) try to split it into two modules, i.e. a separate node-red-xxx suite with all the shape nodes?

In the SVG node we have a dual way communication with the shapes:

  • By input messages you can change shapes (e.g. change the color of a sphere, let a light source shine, ...).
  • By output messages you know which event has happened, so when a light source is clicked you could really activate that light bulb in your house.

BTW I see that BabylonJs supports mouse/key/touch events. We will soon (in the svg node version 2.0) release touch events, because lots of users have a Node-RED dashboard running on e.g. a wall-mounted touch screen...
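For what it's worth, a minimal sketch of how a tap/click on a Babylon.js mesh could be turned into such an output message (the sendToNodeRed helper is imaginary; it just stands for whatever websocket call relays the event back to the flow):

    // Sketch: forward pointer taps on meshes as output events (illustrative only).
    scene.onPointerObservable.add(function (pointerInfo) {
        if (pointerInfo.type === BABYLON.PointerEventTypes.POINTERTAP) {
            var pick = pointerInfo.pickInfo;
            if (pick && pick.hit && pick.pickedMesh) {
                // sendToNodeRed is a made-up helper standing in for the websocket relay
                sendToNodeRed({ payload: { event: "click", shapeId: pick.pickedMesh.name } });
            }
        }
    });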

But of course our SVG node is a dashboard node, which means the drawing is rendered in the dashboard. Yours is a normal node, so I assume you have based it on the worldmap node from @dceejay? And that can also be displayed in the dashboard (I thought)?

If you find a female test person, that would be even better to determine the WAF factor :joy:

If you ever build that declarative node, perhaps you could support a scene per node?

You are welcome. Don't hesitate to ask for feedback!

1 Like

Just to join in the fun I did get this far using ar.js

It should install manually, but I've not added it to the flows site (as it's still a bit clunky).
Being me, it is indeed based on the same concepts as worldmap - the idea being that some of the same messages would "just work"... (for simple blob icons at a location etc.), but that you could specify various other primitives and colours etc...
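For anyone who hasn't used worldmap: the point messages it accepts look roughly like this (property names recalled from the worldmap README, so double-check them there before relying on them):

    // Worldmap-style point message; verify the exact property names against the
    // node-red-contrib-web-worldmap README.
    msg.payload = {
        name: "frontDoor",        // unique name, acts as the marker id
        lat: 51.05,
        lon: -1.35,
        icon: "fa-lightbulb-o",   // hypothetical icon
        layer: "sensors"          // hypothetical layer name
    };
    return msg;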

The main problem is that AR.js support is not great on non-phone devices - and then only on some phones - and exposing Node-RED to the internet so phones can render it is a pain (though ngrok helps for development). Secondly, due to scaling, anything an interesting distance away tended to be very small (invisible)... But hey, it works - sort of.

Of course that was 3 months ago - it may be much better now :slight_smile:

2 Likes

AR was my initial motivation to make those nodes, and I faced the same issues with AR. I do not know whether ar.js implements AR natively, but in my case I tried to enable AR: the library I am using depends on WebXR (https://www.w3.org/TR/webxr/), so it expects ARCore to be installed on the phone, and on Windows it expects an AR headset to be connected in order to work via Windows Mixed Reality. I tried the emulator, with no luck.
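For reference, the quickest way to see whether a given browser/device can do WebXR AR at all is the standard WebXR feature check (plain WebXR API, nothing specific to these nodes):

    // Standard WebXR capability check for immersive AR (e.g. ARCore-backed on Android).
    if (navigator.xr) {
        navigator.xr.isSessionSupported("immersive-ar").then(function (supported) {
            console.log("immersive-ar supported:", supported);
        });
    } else {
        console.log("WebXR is not available in this browser");
    }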

This topic was automatically closed after 60 days. New replies are no longer allowed.