I have a TinyDB JSON file, but at a certain point it refuses to accept any more items and throws an error while parsing, because an earlier write was cut off in the middle of an item.
This is the traceback. It shows the JSON parser failing because the writer stopped partway through an item:
Traceback (most recent call last):
  File "C:\Program Files (x86)\Python38-32\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "main.py", line 79, in run
    message.process()
  File "C:\Users\Administrator\Downloads\dbtest\Login_Message.py", line 45, in process
    DataBase.loadAccount(self)
  File "C:\Users\Administrator\Downloads\dbtest\DataBase.py", line 9, in loadAccount
    user_data = db.search(query.token == str(user.token))
  File "C:\Program Files (x86)\Python38-32\lib\site-packages\tinydb\table.py", line 234, in search
    docs = [doc for doc in self if cond(doc)]
  File "C:\Program Files (x86)\Python38-32\lib\site-packages\tinydb\table.py", line 234, in <listcomp>
    docs = [doc for doc in self if cond(doc)]
  File "C:\Program Files (x86)\Python38-32\lib\site-packages\tinydb\table.py", line 588, in __iter__
    for doc_id, doc in self._read_table().items():
  File "C:\Program Files (x86)\Python38-32\lib\site-packages\tinydb\table.py", line 638, in _read_table
    tables = self._storage.read()
  File "C:\Program Files (x86)\Python38-32\lib\site-packages\tinydb\storages.py", line 125, in read
    return json.load(self._handle)
  File "C:\Program Files (x86)\Python38-32\lib\json\__init__.py", line 293, in load
    return loads(fp.read(),
  File "C:\Program Files (x86)\Python38-32\lib\json\__init__.py", line 357, in loads
    return _default_decoder.decode(s)
  File "C:\Program Files (x86)\Python38-32\lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "C:\Program Files (x86)\Python38-32\lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 234357 (char 234356)
The reason this happened was that multiple threads were writing to the db file at the same time, so the file could be read (or overwritten) while another write was only partially flushed. I ended up manually writing to multiple files, one for every entry; a sketch of that workaround is below.
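A minimal sketch of the per-entry workaround, assuming each account is a plain dict that carries its token; the accounts directory and field names here are my own, not part of any TinyDB API:

import json
import os

DATA_DIR = "accounts"  # hypothetical directory, one JSON file per entry
os.makedirs(DATA_DIR, exist_ok=True)

def save_account(account: dict) -> None:
    # Each entry gets its own file, so threads writing different
    # accounts never touch the same file at the same time.
    path = os.path.join(DATA_DIR, f"{account['token']}.json")
    with open(path, "w") as f:
        json.dump(account, f)

def load_account(token: str) -> dict:
    with open(os.path.join(DATA_DIR, f"{token}.json")) as f:
        return json.load(f)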
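For completeness, keeping the single TinyDB file and serializing every access through one shared lock would presumably also avoid the partial write. A sketch, with names mirroring my DataBase module but otherwise just an assumption of how the shared instance is set up:

import threading
from tinydb import TinyDB, Query

db = TinyDB("db.json")       # the one shared database file
db_lock = threading.Lock()   # serializes all reads and writes
query = Query()

def loadAccount(token: str):
    # Only one thread is inside TinyDB at a time, so the JSON file
    # is never read while another thread is mid-write.
    with db_lock:
        return db.search(query.token == str(token))

def saveAccount(account: dict) -> None:
    with db_lock:
        db.insert(account)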