I have a solution composed of a server and a client application.
The problem is that most users are connected over a slow GPRS connection, so syncing data with the server is quite slow, and I'm looking for ways to reduce the size of the exchanged data.
Currently the application uses an HTTP connection with the text message encoding.
I saw that I could switch to the binary message encoding, which should save some space, and I also just saw that there is a gzip encoder: http://msdn.microsoft.com/en-us/library/ms751458.aspx .
In the sample, the gzip encoder wraps a text message inner encoder, but I was wondering: is there any reason not to use a binary inner encoding instead?
Is it counterproductive to apply a binary encoding before gzipping?
You would have to profile the various options with your specific data to answer that. For example, I do a lot of work with protobuf-net data (an unrelated binary serializer), and whether that compresses depends on the exact data. If the data is heavily text based, or has a lot of repeats, then maybe it will compress quite well, even after binary serialization. However, in many cases attempting to gzip the data will cause it to increase in size.
So: it is very data-specific.
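The effect described above is easy to demonstrate. The sketch below (Python rather than WCF, purely to illustrate the principle) gzips two payloads of equal size: one repetitive and text-like, and one high-entropy, standing in for data that a compact binary serializer has already stripped of redundancy. The payload contents are made up for the example.

```python
import gzip
import os

# Repetitive, text-like payload: lots of repeated substrings for gzip to exploit.
text_like = b"username=alice;status=active;" * 200

# High-entropy payload of the same size, standing in for already-compact
# binary-serialized data: gzip finds almost nothing to compress, and the
# gzip container itself adds a small fixed overhead.
random_like = os.urandom(len(text_like))

gz_text = gzip.compress(text_like)
gz_random = gzip.compress(random_like)

print(f"text-like:  {len(text_like)} -> {len(gz_text)} bytes")
print(f"high-entropy: {len(random_like)} -> {len(gz_random)} bytes")
```

On a typical run the repetitive payload shrinks to a tiny fraction of its size, while the high-entropy payload comes out slightly *larger* than the input, which is exactly why gzipping an already-dense binary encoding can backfire.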
If bandwidth is your main issue, I would probably say: check whether WCF and things like DataContractSerializer are actually helping you. Personally I'd be looking at something smaller (JSON or protobuf-net, maybe) over raw sockets or basic HTTP bodies. Some bandwidth comparisons here.