Thanks Bernardo for your reply.
Regarding the first paragraph, my question can be broken down like this:
- Say you have a DB with 50 tables, some of which are very large.
- You want to rebuild 1 of those large tables (new structure, new data, same name) in the primary database
- So you do the following at the primary device/database:
- - drop the table
- - then recreate the table structure
- - then load the new data:
- - - either using the application to construct large and optimized insert transactions
- - or using the SQLite .import command to load the data from a local CSV file (both approaches are sketched just below this list).
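For the first loading option, here is a minimal sketch of what I have in mind, using Python's sqlite3 module. The table name big_table, its column layout, and the file names are all placeholders, not anything specific to LiteSync; the point is just to batch the rows into a few large transactions instead of one insert per row:

```python
import csv
import sqlite3

BATCH_SIZE = 10_000  # rows per transaction; tune for your hardware

conn = sqlite3.connect("primary.db")  # placeholder file name

# rebuild the table: new structure, same name
conn.execute("DROP TABLE IF EXISTS big_table")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, name TEXT, value REAL)")

with open("new_data.csv", newline="") as f:
    batch = []
    for row in csv.reader(f):  # assumes 3 columns matching the table
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            with conn:  # one large transaction per batch
                conn.executemany("INSERT INTO big_table VALUES (?, ?, ?)", batch)
            batch.clear()
    if batch:  # flush the final partial batch
        with conn:
            conn.executemany("INSERT INTO big_table VALUES (?, ?, ?)", batch)

conn.close()
```

The second option would be done from the shell with the SQLite CLI, with something like `sqlite3 primary.db ".mode csv" ".import new_data.csv big_table"`.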
Loading that large amount of data into the primary is fine using constructed commands or the built-in import feature, but it can turn into a lot of data and commands to send to the secondaries across the internet, with a real chance of a problem occurring mid-load.
So my thought is: perhaps it is better to make sure the secondary doesn't try to update itself over the network when the data for a table is very large, and instead have a way to either 1) send the same new data as a file to be imported into the secondary database locally, or 2) have the primary send a .sqlite file containing just that table and perform the sync locally at the secondary. Both are attempts to speed the process up and also to reduce the chance of errors during a long stream of update commands across the internet.
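As an illustration of option 2, the local copy at the secondary could be done with plain SQLite's ATTACH, assuming the primary ships a snapshot file (here called table_snapshot.db, a name I've made up) containing only that one table, and that the secondary's copy of big_table already has the new structure:

```python
import sqlite3

# run locally at the secondary, against its own database file
conn = sqlite3.connect("secondary.db")  # placeholder file names throughout

# table_snapshot.db is the single-table .sqlite file shipped from the primary
conn.execute("ATTACH DATABASE 'table_snapshot.db' AS snap")

with conn:  # one transaction: clear the old rows, bulk-copy the new ones
    conn.execute("DELETE FROM main.big_table")
    conn.execute("INSERT INTO main.big_table SELECT * FROM snap.big_table")

conn.execute("DETACH DATABASE snap")
conn.close()
```

All the I/O here is local disk, so it should be both faster and less fragile than replaying the same volume of changes over the internet.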
If this could be handled within LiteSync that would be great. But assuming for a moment that this ability is not part of LiteSync, those steps could be taken in the application instead; we would then need a way to make sure LiteSync doesn't attempt the long update for that particular table while we handle it ourselves. Can you have settings for which tables should be synced vs. not, so that a particular table can be disabled until the external update is complete and then re-enabled for syncing?
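To be clear about what I'm asking for, something along these lines is the idea. The pragma names below are made up by me purely to illustrate the request; I'm not suggesting this is real LiteSync syntax:

```python
import sqlite3

conn = sqlite3.connect("primary.db")

# NOT real LiteSync syntax -- a hypothetical pragma to illustrate the request:
conn.execute("PRAGMA sync_exclude = 'big_table'")  # stop syncing this table

# ... drop / recreate / bulk-load big_table here, handled by the application ...

conn.execute("PRAGMA sync_include = 'big_table'")  # resume syncing afterwards
conn.close()
```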