I store a game world (a 2D array of 5,000 x 5,000 ints, with values between -100 and 900) in an SQLite database:
CREATE TABLE map (x SMALLINT, y SMALLINT, h SMALLINT);
Even after a VACUUM following the INSERTs, the file takes 500 MB. I can compress it (lzma) down to 35-40 MB, but then I have to decompress it before I can use it. How should I store this so the game can still run on a Raspberry Pi? As a 2D array of ints, or as a list of 3-int tuples? Speed is not important, but RAM and file size are.
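For scale, the 2D-array option would only be about 50 MB raw, since 5,000 x 5,000 x 2 bytes fits in int16. A minimal sketch of what I mean, assuming NumPy is available (the zeros array is just a stand-in for the real height data):

import numpy as np

# Stand-in for the real height data; in the game this array
# would be filled from the world generator.
heights = np.zeros((5000, 5000), dtype=np.int16)

# 5,000 x 5,000 x 2 bytes = 50 MB raw; savez_compressed stores it zlib-compressed.
np.savez_compressed("map.npz", h=heights)

# Loading pulls the full ~50 MB array back into RAM.
heights = np.load("map.npz")["h"]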
I finally used the HDF5 file format, with PyTables. The result is a ~20 MB file for exactly the same data, directly usable by the application. Here is how I create it:
import tables

MAP_WIDTH = 5000
MAP_HEIGHT = 5000

# One row per map cell; all three columns fit in 16-bit ints.
db_struct = {
    'x': tables.Int16Col(),
    'y': tables.Int16Col(),
    'h': tables.Int16Col(),
}

h5file = tables.open_file("my_file.h5", mode="w", title='Map')
# complevel=9 with LZO gives the small file; LZO must be available
# in your PyTables build (zlib always is).
filters = tables.Filters(complevel=9, complib='lzo')
group = h5file.create_group('/', 'group', 'Group')
table = h5file.create_table(group, 'map', db_struct, filters=filters)

heights = table.row
for y in range(MAP_HEIGHT):
    for x in range(MAP_WIDTH):
        heights['x'] = x
        heights['y'] = y
        heights['h'] = h  # h: the height of cell (x, y), from the game world
        heights.append()
table.flush()

h5file.close()
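To use it afterwards, I open the file read-only and query the table in place; PyTables decompresses chunk by chunk, so the whole table never has to sit in RAM. A short sketch, reusing the names from the creation code above (42 and 17 are just example coordinates):

import tables

h5file = tables.open_file("my_file.h5", mode="r")
table = h5file.root.group.map

# In-kernel query: only rows matching the condition are materialized.
for row in table.where('(x == 42) & (y == 17)'):
    print(row['h'])

h5file.close()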