
How to deploy initial data to a Flutter app


I am stuck with the following problem: we are writing a library app which contains a bunch of internal documents (roughly 900). The total size of all docs is around 2.5+ GB. I have to find a way to initialise the app with these 2.5 GB of docs on first start. My idea was to create a zip file of the initial load, download it from the server, and unpack it during setup on first start. However, I cannot find a way to unzip such a big file, as all the solutions I found read the entire zip into memory before writing it to storage. I also do not want to make 900+ calls to our web server to download the documents on first start.

Deployment targets are iOS and Android, possibly Windows and macOS later on.

Any ideas?


Solution

  • I tested it on Flutter Linux, where Uri.base points to the root of the project (the folder with pubspec.yaml, README.md, etc.). If you run it on Android / iOS, check where Uri.base points to and change baseUri if it is not a good location:

    import 'dart:io';

    import 'package:archive/archive.dart'; // InputStream, ZipDirectory

    final debug = true;
    final Set<Uri> foldersCreated = {};
    final baseUri = Uri.base;
    final baseUriLength = baseUri.toString().length;
    final zipFileUri = baseUri.resolve('inputFolder/backup.zip');
    
    final outputFolderUri = baseUri.resolve('outputFolder/');
    print('0. files will be stored in [$outputFolderUri]');
    final list = FileList(zipFileUri, debug: debug);
    print('1. reading ZipDirectory...');
    final directory = ZipDirectory.read(InputStream(list));
    print('2. iterating over ZipDirectory file headers...');
    for (final zfh in directory.fileHeaders) {
      final zf = zfh.file;
      final content = zf.content;
    
      // writing file
      final uri = outputFolderUri.resolve(zf.filename);
      final folderUri = uri.resolve('.');
      if (foldersCreated.add(folderUri)) {
        if (debug) print(' #### creating folder [${folderUri.toString().substring(baseUriLength)}] #### ');
        Directory.fromUri(folderUri).createSync(recursive: true);
      }
      File.fromUri(uri).writeAsBytesSync(content);
    
      print("file: [${zf.filename}], compressed: ${zf.compressedSize}, uncompressed: ${zf.uncompressedSize}, length: ${content.length}");
    }
    list.close();
    print('3. all done!');
    

    And here is a List backed by an LruMap (from the quiver package) that reads data in chunks from your huge zip file:

    import 'dart:collection';
    import 'dart:io';

    import 'package:quiver/collection.dart'; // LruMap

    class FileList with ListMixin<int> {
      late final RandomAccessFile _file;
      late final LruMap<int, List<int>> _cache;
      final int maximumPages;
      final int pageSize;
      final bool debug;
    
      FileList(Uri uri, {
        this.pageSize = 1024, // 1024 is just for tests: make it bigger (1024 * 1024 for example) for normal use
        this.maximumPages = 4, // maybe even 2 is good enough?
        this.debug = false,
      }) {
        _file = File.fromUri(uri).openSync();
        length = _file.lengthSync();
        _cache = LruMap(maximumSize: maximumPages);
      }
    
      void close() => _file.closeSync();
    
      @override
      int length = 0;
    
      int minIndex = -1;
      int maxIndex = -1;
      late List<int> page;
      @override
      int operator [](int index) {
        // print(index);
    
        // 1st cache level
        if (index >= minIndex && index < maxIndex) {
          return page[index - minIndex];
        }
    
        // 2nd cache level
        int key = index ~/ pageSize;
        final pagePosition = key * pageSize;
        page = _cache.putIfAbsent(key, () {
          if (debug) print(' #### reading page #$key (position $pagePosition) #### ');
          _file.setPositionSync(pagePosition);
          return _file.readSync(pageSize);
        });
        minIndex = pagePosition;
        maxIndex = pagePosition + pageSize;
        return page[index - pagePosition];
      }
    
      @override
      void operator []=(int index, int value) =>
          throw UnsupportedError('FileList is read-only');
    }
    

    You can play with pageSize and maximumPages to find the optimal configuration - I think you can start with pageSize: 1024 * 1024 and maximumPages: 4, but you have to check it yourself.
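    As a rough sanity check on those numbers (my own arithmetic, not part of the answer above): the cache holds at most maximumPages pages of pageSize bytes each, so the memory footprint is bounded regardless of how big the zip is.

```dart
void main() {
  const pageSize = 1024 * 1024; // 1 MiB per page
  const maximumPages = 4;
  // Upper bound on zip data held in memory at any moment,
  // independent of the archive size on disk.
  const maxCacheBytes = pageSize * maximumPages;
  print(maxCacheBytes); // prints 4194304 (4 MiB)
}
```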

    Of course, all of that code should be run in an Isolate, since unzipping a couple of GB takes a lot of time and would otherwise freeze your UI - but first run it as it is and check the logs.
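    A minimal sketch of that, assuming Dart 2.19+ (Isolate.run; on older Flutter versions compute() from package:flutter/foundation.dart does the same job). The summation loop is just a stand-in for the blocking extraction code:

```dart
import 'dart:isolate';

Future<void> main() async {
  // Isolate.run spawns a short-lived worker isolate, runs the closure there,
  // and completes with its result - the calling (UI) isolate never blocks.
  final sum = await Isolate.run(() {
    // Stand-in for the blocking unzip loop shown above.
    var s = 0;
    for (var i = 0; i < 1000000; i++) {
      s += i;
    }
    return s;
  });
  print(sum); // prints 499999500000
}
```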

    EDIT

    It seems that ZipFile.content has some memory leaks, so an alternative could be a "tar file" based solution. It uses the tar package, and since it reads a Stream as input you can use compressed *.tar.gz files (your Documents.tar had 17408 bytes, while Documents.tar.gz has 993 bytes). Notice that you can even read your data directly from the socket's stream, so there is no need for any intermediate .tar.gz file:

    import 'dart:io';

    import 'package:tar/tar.dart';

    final baseUri = Uri.base;
    final tarFileUri = baseUri.resolve('inputFolder/Documents.tar.gz');
    
    final outputFolderUri = baseUri.resolve('outputFolder/');
    print('0. files will be stored in [$outputFolderUri]');
    final stream = File.fromUri(tarFileUri)
      .openRead()
      .transform(gzip.decoder);
    final reader = TarReader(stream);
    print('1. iterating over tar stream...');
    
    while (await reader.moveNext()) {
      final entry = reader.current;
      if (entry.type == TypeFlag.dir) {
        print("dir: [${entry.name}]");
        final folderUri = outputFolderUri.resolve(entry.name);
        await Directory.fromUri(folderUri).create(recursive: true);
      }
      if (entry.type == TypeFlag.reg) {
        print("file: [${entry.name}], size: ${entry.size}");
        final uri = outputFolderUri.resolve(entry.name);
        await entry.contents.pipe(File.fromUri(uri).openWrite());
      }
    }
    print('2. all done!');
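    The remark about reading directly from the socket's stream can be sketched like this - a hedged example, not tested against a real server; unpackFromServer and the endpoint URL are my own names, not part of the answer above. TarReader only needs a Stream<List<int>>, and HttpClientResponse is exactly that:

```dart
import 'dart:io';

import 'package:tar/tar.dart';

// Hypothetical: stream the archive straight from the server into the tar
// reader, so no intermediate .tar.gz file ever touches the disk.
Future<void> unpackFromServer(Uri url, Uri outputFolderUri) async {
  final client = HttpClient();
  try {
    final request = await client.getUrl(url);
    final response = await request.close(); // a Stream<List<int>>
    final reader = TarReader(response.transform(gzip.decoder));
    while (await reader.moveNext()) {
      final entry = reader.current;
      if (entry.type != TypeFlag.reg) continue;
      final file = File.fromUri(outputFolderUri.resolve(entry.name));
      await file.create(recursive: true); // also creates missing parent folders
      await entry.contents.pipe(file.openWrite());
    }
  } finally {
    client.close();
  }
}
```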