Building a Node-RED image via GitHub Actions with flows provided by Projects

Right now our organization uses Portainer to pull the latest container from our DOCR (DigitalOcean Container Registry) repository, edit it, and push it back.
The problem is that when we need to change something quickly, we just edit it on the deployed instance, which means that if Node-RED crashes all of those changes are gone.

So I looked into Projects and how we can use one Node-RED instance (running Projects) to edit all of the deployed instances, and figured out I can use GitHub Actions to package our flows and other files into a Docker image and send it off to DOCR.

This is the deploy.yaml file I came up with:

name: Deploy Node-RED

on:
  push:
    branches:
      - main

env:
  DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
  DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Login to Docker registry
        run: echo $DOCKER_PASSWORD | docker login registry.digitalocean.com -u $DOCKER_USERNAME --password-stdin

      - name: Get Docker image
        run: |
          docker pull nodered/node-red:latest

      - name: Build and push Docker image
        run: |
          # Start a temporary container from the base image
          docker run -d --name temp-container nodered/node-red:latest
          docker cp package.json temp-container:/data/package.json 
          docker cp flows.json temp-container:/data/flows.json
          docker cp flows_cred.json temp-container:/data/flows_cred.json
          docker cp settings.js temp-container:/data/settings.js

          # Build the new image using the modified Node-RED data directory
          docker commit temp-container registry.digitalocean.com/lsolutions/node-red-ec-scraping:latest
          docker images -a 
          # Push the image to the registry
          
          docker push registry.digitalocean.com/lsolutions/node-red-ec-scraping:latest

          # Clean up the temporary container
          docker stop temp-container
          docker rm temp-container
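
(For comparison, here is a rough sketch of a Dockerfile-based alternative — same file names and image tag as above, untested. The npm install step is what would bake any community nodes listed in package.json into the image; ownership/permission details may need adjusting for the base image's node-red user.)

FROM nodered/node-red:latest

# Copy the project files into the Node-RED user directory
COPY package.json flows.json flows_cred.json settings.js /data/

# Install any community nodes listed in package.json
RUN cd /data && npm install --only=production

With that, the "Build and push Docker image" step would reduce to a docker build followed by a docker push of the same tag.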

Is this a good way of doing this? I just finished the file and haven't had time to test it out yet, including with any community nodes.
I wanted to use the GitHub Packages registry, but DigitalOcean doesn't let me use any registries other than its own and Docker Hub.
Also please suggest a new name for this topic so anyone who needs this can find it easily. Thank you!

I am uncertain of your goal, but it reads like you want to run Node-RED, containerised, with persistent flows? On DigitalOcean? And you are trying to achieve this by dynamically recreating the image?

I can't help you with that, but I can offer a ready-made solution. FlowForge was built for this kind of thing, and you can run the open source version on DigitalOcean (or locally, or in Docker, or on Kubernetes): Docker on Digital Ocean • FlowForge Docs

Of course, you could also simply take advantage of one of the cloud providers and have it all done for you, hassle-free. FlowForge offers that too. (Disclaimer: I work for FlowForge, but it is open source so you can do this for free if you wish.)


Thanks, I had a look at FlowForge last month, and it is too expensive for our usage.

The idea was:
To have one Node-RED instance to edit flows and push changes to GitHub.

From GitHub we package it and send it off to DOCR, where it gets auto-deployed.
All of our 'projects' have their own settings.js, which prevents users from changing flows on the deployed instances.
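
For reference, a minimal sketch of what such a settings.js fragment could look like (not our exact file — disabling the admin root is one documented way to lock the editor out on deployed instances):

// settings.js fragment for deployed instances (sketch, not the exact file)
module.exports = {
    // Disabling the admin root removes the editor and the admin HTTP API,
    // so nobody can change flows on the running instance
    httpAdminRoot: false,

    // ...the rest of the usual settings (ports, logging, credentialSecret, etc.)
};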
