I have a large (700MB+) tab-separated file and want to insert it into an SQLite DB.
On the command line I could use the dot commands (`.mode tabs` and `.import`, or a `.read` script), which is extremely fast.
Let's assume I only have Node-RED plus the sqlite node available.
What would be the best (most performant) approach to get the data into the DB?
The SQLite documentation states that it can perform around 50K inserts per second when the inserts are wrapped in BEGIN/COMMIT transactions. Could Node-RED split the file into batches of 50K lines/array elements (which in turn also need some sanitizing while looping over them)? Are there bottlenecks I need to take into account somewhere?
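To illustrate what I mean, here is a rough sketch of the batching idea as it might look inside a Node-RED function node. The table name `data`, the column names, and the batch size are purely illustrative, and the quote escaping stands in for whatever sanitizing the real data needs:

```javascript
// Escape single quotes so a value can be embedded in a SQL string literal.
// (Illustrative only; prepared statements would be safer if available.)
function sanitize(value) {
  return value.replace(/'/g, "''");
}

// Turn one batch of tab-separated lines into a single BEGIN/COMMIT block,
// so all inserts in the batch run inside one transaction.
function buildTransaction(lines) {
  const inserts = lines.map((line) => {
    const [col1, col2] = line.split("\t").map(sanitize);
    return `INSERT INTO data (col1, col2) VALUES ('${col1}', '${col2}');`;
  });
  return ["BEGIN;", ...inserts, "COMMIT;"].join("\n");
}

// Split all lines into batches of `batchSize` (e.g. 50000).
function toBatches(lines, batchSize) {
  const batches = [];
  for (let i = 0; i < lines.length; i += batchSize) {
    batches.push(lines.slice(i, i + batchSize));
  }
  return batches;
}

// Example: each batch becomes one SQL string that could be sent to
// the sqlite node as msg.topic.
const sample = ["a\tb", "it's\tc"];
const sql = buildTransaction(toBatches(sample, 50000)[0]);
```

One thing I suspect already: loading the whole 700MB file into memory before batching is probably a bottleneck in itself, so some streaming/line-by-line approach (the file-in node can emit a message per line, as far as I know) may be needed upstream of this.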