SQLite large file import - best practice?

I have a large (700MB+) tab-separated file and want to insert it into an SQLite db.

On the command line I could use the sqlite3 dot commands and perform an import with .mode tabs and .import (or a .read script), which is extremely fast.
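For reference, a minimal sketch of that CLI approach, assuming a hypothetical data.tsv file and an already-created table named mytable:

```
sqlite3 mydb.sqlite
sqlite> .mode tabs
sqlite> .import data.tsv mytable
```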

Let's assume I only have node-red + the sqlite node available.

What would be the best (most performant) approach to get the data into the db?

The SQLite documentation states that it can perform 50K inserts per second when the inserts are wrapped in BEGIN/COMMIT transactions. Could node-red split the file into batches of 50K lines/array elements (which also need some sanitizing while looping over them)? Are there bottlenecks I need to take into account somewhere?
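A minimal sketch of how a Function node could build those batches, assuming the whole file has already been read into msg.payload as a single string (e.g. by a File In node), that the sqlite node is configured to take its query from msg.topic, and a hypothetical two-column table mytable(col1, col2):

```javascript
// Split the tab-separated payload into batches of INSERTs wrapped in a transaction.
const BATCH_SIZE = 50000;                 // rows per BEGIN/COMMIT block
const lines = msg.payload.split('\n');
const messages = [];

for (let i = 0; i < lines.length; i += BATCH_SIZE) {
    const batch = lines.slice(i, i + BATCH_SIZE);
    const statements = ['BEGIN TRANSACTION;'];

    for (const line of batch) {
        if (!line.trim()) continue;       // skip empty lines
        // basic sanitizing: split on tabs and escape single quotes
        const cols = line.split('\t').map(c => c.replace(/'/g, "''"));
        statements.push(`INSERT INTO mytable (col1, col2) VALUES ('${cols[0]}', '${cols[1]}');`);
    }

    statements.push('COMMIT;');
    // one message per batch; the sqlite node executes the SQL found in msg.topic
    messages.push({ topic: statements.join('\n') });
}

return [messages];   // send the batches out of output 1 in sequence
```

Holding a 700MB file in a single string is memory-hungry, though; the file node can also emit one message per line, in which case you would accumulate lines in flow context and flush a batch every 50K rows instead.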

Does anyone have ideas?

Currently I am running a bash script, but I would like more node-red control over the processing.

What about using an exec node?
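That is probably the fastest route while still keeping the flow in charge: let the exec node call the sqlite3 CLI and have it do the bulk import. A sketch of the command the exec node could run, assuming sqlite3 is installed on the Node-RED host and using the same hypothetical file/table names as above:

```
printf '.mode tabs\n.import /data/data.tsv mytable\n' | sqlite3 /data/mydb.sqlite
```

The exec node's return code and output can then drive the rest of the flow (for example a switch node on a non-zero exit code), which gives the node-red control you are after without paying the per-insert overhead in JavaScript.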
