Tags: python, redis, tornado

Storing gzipped data in Redis


I have a huge Python dictionary that I want to save to the Redis cache, and then have an API handler return this dictionary straight from the cache.

I'm using gzip to compress the stringified dict before storing it in the cache:

    import gzip
    import json
    from io import BytesIO

    transformed_object = {...big dictionary}

    # Serialize the dict to JSON, then gzip it into an in-memory buffer
    byte_object = BytesIO()
    data = json.dumps(transformed_object)
    with gzip.GzipFile(fileobj=byte_object, mode="w") as f:
        f.write(data.encode())

    final_data = byte_object.getvalue()

I write this to the Redis cache:

    context.redis.set(COMPLETE_GZIPPED_CACHE, final_data)
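
For reference, reading the value back and restoring the dict in Python looks like this (a minimal sketch, assuming `context.redis` is a standard redis-py client whose `get()` returns raw bytes):

    # Sketch: fetch the raw gzipped bytes and restore the dict
    raw = context.redis.get(COMPLETE_GZIPPED_CACHE)
    restored = json.loads(gzip.decompress(raw).decode("utf-8"))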

I have an API handler where I want to return the gzipped data:

    cache_list = redis.get(COMPLETE_GZIPPED_CACHE)  # returns raw bytes
    self.finish(
        {
            "status": True,
            "cache_list": cache_list,
            "updated_at": datetime.datetime.now(),
        }
    )

The problem is that I'm getting the error below:

    TypeError: Object of type 'bytes' is not JSON serializable

Do I need to decode the bytes back to a string before returning them to the frontend? Ideally, I would like the frontend to handle the decoding.

Is there a better way to do this?


Solution

  • Figured it out from other posts. I wrote a function like this and opted to use zlib:

    import base64
    import json
    import zlib

    def convert_to_gzip_format(data):
        # Serialize to JSON, compress with zlib, then base64-encode
        # so the result is a plain ASCII string
        stringified_object = json.dumps(data).encode("utf-8")
        compressed_file = zlib.compress(stringified_object)

        base64_string = base64.b64encode(compressed_file).decode("ascii")
        return base64_string


    This saves it to Redis as an ASCII string. I then use pako.js on the frontend to decompress it back into a readable string.
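
    For completeness, here is the reverse transformation in Python (a minimal sketch; `convert_from_gzip_format` is a hypothetical helper, and pako's `inflate` performs the equivalent steps in the browser):

        # Sketch: undo convert_to_gzip_format
        # (pako.inflate does the same in the browser)
        def convert_from_gzip_format(base64_string):
            compressed_file = base64.b64decode(base64_string)
            return json.loads(zlib.decompress(compressed_file).decode("utf-8"))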