I have an API that is auto-generated from Swagger, where the response object has a binary field `MyBinary`. `MyBinary` is defined in Swagger as:
```yaml
MyResponse:
  type: object
  properties:
    MyBinary:
      type: string
      format: binary
```
This generates a C# object:

```csharp
public class MyResponse
{
    public byte[] MyBinary { get; set; }
}
```
When I call this endpoint in Postman, I see that `MyBinary` is returned as a base64-encoded string:

```json
{
    "MyBinary": "....ydCB1bmQgUXVhcms="
}
```
Inside there is an image. I would expect the field to contain raw bytes, something like `�PNG ���� IHDR�������.....`
Why and where is it base64 encoded? I couldn't find a decent explanation in either the Swagger or the .NET documentation.
The thing is, we have three clients of the API: iOS, Android, and Web. iOS successfully generates a `Data` type from Swagger and somehow manages to load the base64 into it automatically, but Android (Java) and Web (TypeScript) throw exceptions saying they expect binary data but receive a string.
Can somebody please explain to me why `byte[]` is converted to base64 in .NET, and, if you have faced such situations, what the expected way to deal with this in Swagger is?
> Why and where is it base64 encoded?
Because JSON doesn't support binary data, basically. A JSON document is UTF-8-encoded text, so it can't represent arbitrary binary data without using some sort of binary-to-text encoding such as base64 - and base64 is by far the most common choice for this.

In other words, this is entirely reasonable: it's what is needed to produce valid JSON.
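To see why the encoding is needed, consider the 8-byte PNG file signature, which contains non-printable octets that cannot appear literally in a JSON string. Base64 maps them to plain ASCII and back losslessly. A standalone sketch using `java.util.Base64` (mirroring what the Android client receives on the wire):

```java
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        // The first bytes of any PNG file: a signature containing
        // non-printable octets that are not valid inside a JSON string.
        byte[] pngHeader = { (byte) 0x89, 'P', 'N', 'G', '\r', '\n', 0x1A, '\n' };

        // Base64 turns the arbitrary octets into plain ASCII,
        // which is safe to embed in a JSON document.
        String encoded = Base64.getEncoder().encodeToString(pngHeader);
        System.out.println(encoded); // iVBORw0KGgo=

        // Decoding restores the exact original bytes.
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(decoded.length); // 8
    }
}
```

This is also why base64-encoded images always start with the recognizable prefix `iVBOR`: it is the PNG signature after encoding.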
I would suggest using the `byte` format instead of the `binary` format - `binary` is actually intended to represent arbitrary octets, but embedding those directly in JSON produces invalid JSON (and raises the obvious question of how the end of the data would be marked). The `byte` format is explicitly documented as base64-encoded, so I'd expect that to be the portable solution.
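With that change, the Swagger definition would differ only in the `format` value. A sketch of the adjusted schema (not a drop-in file):

```yaml
MyResponse:
  type: object
  properties:
    MyBinary:
      type: string
      format: byte   # base64-encoded octets, per the OpenAPI data type table
```

Generators that understand `byte` should then emit a base64-aware type for the field (such as `byte[]` in Java or a decoded buffer in TypeScript) instead of rejecting the string at deserialization time.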