The scenario is shown below: I have many processes doing CPU-bound work, all reading from the same database. I know the cache and uri keywords can be used to share SQLite's database cache between threads, but what about between processes? Ideally the solution would work on both Linux and Windows. Thank you!
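For reference, this is the thread-level sharing I mean — a minimal sketch using Python's built-in sqlite3 module (the database name `shareddb` is just an illustrative placeholder):

```python
import sqlite3

# Open a named in-memory database with shared cache. Any other connection
# in the SAME process that uses the same URI sees the same data.
conn_a = sqlite3.connect("file:shareddb?mode=memory&cache=shared", uri=True)
conn_a.execute("CREATE TABLE t(x INTEGER)")
conn_a.execute("INSERT INTO t VALUES (42)")
conn_a.commit()

# A second connection in the same process attaches to the same shared cache.
conn_b = sqlite3.connect("file:shareddb?mode=memory&cache=shared", uri=True)
print(conn_b.execute("SELECT x FROM t").fetchone()[0])  # 42
```

This works between threads because both connections live in one address space; my question is whether anything comparable exists between processes.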
def run(self):
    self.phyconn = apsw.Connection(self.fileName)
    self.memconn = apsw.Connection(":memory:")
    # backup.__exit__() only ensures the copy is finished; it does not close
    # memconn, so the in-memory database is still usable after the with block.
    with self.memconn.backup("main", self.phyconn, "main") as backup:
        # Call step(0) first so that pagecount reports the total page count.
        backup.step(0)
        total = backup.pagecount
        stepped = 0
        one_percent = total if total < 100 else total // 100
        last_percentage = 0
        while stepped <= total:
            if self.cancel:
                # self.progressCanceled.emit()
                self.memconn = None
                return
            backup.step(one_percent)
            stepped = stepped + one_percent
            stepped_percentage = stepped * 100 // total
            if stepped_percentage != last_percentage:
                last_percentage = stepped_percentage
                # self.progressChanged.emit(stepped_percentage)
                websocket.UpdateLoadDBProgress(stepped_percentage, self.sid)
This is not possible. The whole point of processes is to isolate their memory from each other. (Most OSes allow shared memory, but there is no portable mechanism for it.)
Even if it were possible, it would not be any faster, because both SQLite and the OS already cache the data.