Tags: c++, ipc, abi

C++ data serialization and ABI compatibility


I am writing code for an embedded platform. I have a client library that communicates with a service process over IPC.

I am developing both this client library and the service process in C++. I am reading about ABI since I am new to the C++ world.

When I send my data over IPC, I already serialize it to a uint8_t byte array.

But I also want to persist this data in a local store and redrive it later if the IPC fails. I could just write the byte buffer to SQLite, but I don't quite understand the pros and cons of that decision. My question is not about the use of SQLite or about handling failures, but specifically about what format to persist.

If the binary interface changes, even if the serialization and deserialization routines are the same in the client and the receiving process, would there be a problem with the storage format? In other words, from an ABI perspective, is it safe to locally persist and re-read the byte payload within the same application, or should I serialize to a more common format (like JSON)? Is there a risk that, if the compiler changes, deserialization of the existing data in the DB will fail?


Solution

  • There is no problem with saving binary data, so long as the format stays stable across the different platforms involved:

    1. Use fixed-size types, e.g. int64_t instead of long, because the size of long is not portable.
    2. Use compiler pragmas to remove padding, since padding is not portable (or, better, write your own serializer/deserializer and don't dump raw memory).
    3. Byte order (big-endian vs. little-endian) can be a portability problem; use network byte order, or at least make sure you always use the same byte order everywhere.

    If you get those three things right, you can safely save binary data, and it will be more efficient than something like JSON in terms of storage, transfer, and processing. See the sketch below for one way to put all three rules into practice.
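    Here is a minimal sketch of a hand-written serializer that follows those rules. The RequestV1 record and its fields are hypothetical, just for illustration; the point is that fields are written one at a time in an explicit byte order (little-endian here), so the output never depends on the compiler's padding, the host's endianness, or the platform's integer sizes:

```cpp
#include <cstddef>
#include <cstdint>
#include <type_traits>
#include <vector>

// Hypothetical record; note the fixed-size types (rule 1).
struct RequestV1 {
    int64_t  timestamp_us;  // int64_t instead of long
    uint32_t sensor_id;
    uint32_t reading;
};

// Append an integer to the buffer byte by byte in little-endian order,
// so padding (rule 2) and host endianness (rule 3) never leak into the format.
template <typename T>
void put_le(std::vector<uint8_t>& out, T value) {
    static_assert(std::is_integral_v<T>, "integral types only");
    using U = std::make_unsigned_t<T>;
    U u = static_cast<U>(value);
    for (std::size_t i = 0; i < sizeof(T); ++i)
        out.push_back(static_cast<uint8_t>(u >> (8 * i)));
}

// Read an integer back from a little-endian byte sequence.
template <typename T>
T get_le(const uint8_t* p) {
    using U = std::make_unsigned_t<T>;
    U u = 0;
    for (std::size_t i = 0; i < sizeof(T); ++i)
        u |= static_cast<U>(p[i]) << (8 * i);
    return static_cast<T>(u);
}

// Serialize field by field -- never memcpy the whole struct.
std::vector<uint8_t> serialize(const RequestV1& r) {
    std::vector<uint8_t> buf;
    put_le(buf, r.timestamp_us);
    put_le(buf, r.sensor_id);
    put_le(buf, r.reading);
    return buf;
}

RequestV1 deserialize(const uint8_t* p) {
    RequestV1 r;
    r.timestamp_us = get_le<int64_t>(p);   p += sizeof(int64_t);
    r.sensor_id    = get_le<uint32_t>(p);  p += sizeof(uint32_t);
    r.reading      = get_le<uint32_t>(p);
    return r;
}
```

    Because the byte layout is defined by the code rather than by the compiler, the same blob can be written to SQLite, read back by a build from a different compiler, and still deserialize correctly.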

    Make sure you tag your structures with a version (e.g. V1) when saving to disk, to make it possible to modify the format in the future without breaking backwards compatibility.
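    One simple way to do that, building on the sketch above, is to prefix every stored blob with a version byte and dispatch on it when reading. The layout (a single leading version byte) is just an illustrative choice, not the only option:

```cpp
#include <cstdint>
#include <stdexcept>
#include <vector>

// On-disk layout (illustrative): [ version byte | payload ... ]
constexpr uint8_t kFormatV1 = 1;

// Prepend the version tag before writing the serialized payload to SQLite.
std::vector<uint8_t> wrap_with_version(const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> out;
    out.reserve(payload.size() + 1);
    out.push_back(kFormatV1);
    out.insert(out.end(), payload.begin(), payload.end());
    return out;
}

// Dispatch on the version so old rows stay readable after the format evolves.
RequestV1 read_record(const std::vector<uint8_t>& blob) {
    if (blob.empty()) throw std::runtime_error("empty record");
    switch (blob[0]) {
        case kFormatV1:
            return deserialize(blob.data() + 1);  // V1 decoder from the sketch above
        // case kFormatV2: decode the newer layout here, or upgrade V1 rows lazily
        default:
            throw std::runtime_error("unknown format version");
    }
}
```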