
How to handle bulk data insertion with SQLite + Python

Published May 30, 2020

Efficient Handling of Data

When it comes to handling huge amounts of data, the most common thing developers do is insert records one at a time, so each SQL statement starts its own transaction. This is very expensive, since it requires reopening, writing to, and closing the journal file for every statement, even though all of those inserts could be done in a single bulk transaction. So how do we do this? I'll show you.

Let's say you have 20,000 candidate records to insert into your database. It really makes sense to consider a bulk transaction, right? Sure, why not.

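The original code screenshot isn't available here, but the idea looks roughly like this minimal sketch using Python's built-in sqlite3 module (the database name, table, and columns are just placeholder assumptions): instead of calling execute() once per row, pass the whole list to executemany() and commit once at the end.

```python
import sqlite3

# Hypothetical example data: 20,000 candidate records (name, email).
candidates = [(f"Candidate {i}", f"candidate{i}@example.com") for i in range(20000)]

conn = sqlite3.connect("candidates.db")
cur = conn.cursor()

cur.execute(
    "CREATE TABLE IF NOT EXISTS candidates (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
)

# executemany() runs all 20,000 inserts inside a single transaction,
# so the journal file is written and the commit happens only once.
cur.executemany("INSERT INTO candidates (name, email) VALUES (?, ?)", candidates)

conn.commit()
conn.close()
```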
I really recommend this method for bulk transactions of at least 20K records, and all I can say is that it is very fast and efficient. Great news, right?

ANOTHER TRICK

There is also another efficient way to improve the speed of SQLite transactions. If you have multiple database write statements, put them inside a single transaction. Instead of writing to (and locking) the file every time a write query is executed, the write only happens once, when the transaction completes.

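Again, the original screenshot is missing, so here is a hedged sketch of that pattern with sqlite3 (the statements and table are illustrative assumptions): using the connection as a context manager wraps all of the writes in one transaction, committing on success and rolling back on error.

```python
import sqlite3

conn = sqlite3.connect("candidates.db")

# All writes inside this block belong to a single transaction:
# the file is locked and written once at commit time, not per statement.
# On an exception the transaction is rolled back automatically.
with conn:
    conn.execute("INSERT INTO candidates (name, email) VALUES (?, ?)",
                 ("Jane Doe", "jane@example.com"))
    conn.execute("UPDATE candidates SET email = ? WHERE name = ?",
                 ("new@example.com", "Candidate 1"))
    conn.execute("DELETE FROM candidates WHERE name = ?", ("Candidate 2",))

conn.close()
```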
