Actually, using ZLIB might not be the best idea; something faster would be better.
I'd go with LZ4. There's also lzturbo, but LZ4 has a format that might be nicer for "small devices" (Raspberry Pis & Co.).
And ChuckOne is right: a binary protocol means problems.
If we go with compression, it should be optional. To me, a Raspberry Pi isn't a small device.
A JavaCard with 80k working memory is a small device.
I agree, that's why I like the idea of an additional layer. Simply turn it on or off at your convenience.
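For illustration, here's a minimal sketch of what such an optional layer could look like in C, using LZ4's actual API (`LZ4_compress_default` / `LZ4_decompress_safe`); the one-byte frame flag is just something I made up for the example:

```c
#include <string.h>
#include <lz4.h>  /* https://github.com/lz4/lz4 */

/* Illustrative frame flag: 0 = raw payload, 1 = LZ4-compressed payload. */
enum { FRAME_RAW = 0, FRAME_LZ4 = 1 };

/* Encode src into dst (dst must hold 1 + LZ4_compressBound(len) bytes).
 * Falls back to raw if compression is off or doesn't actually shrink
 * the payload. Returns total bytes written. */
static int frame_encode(const char *src, int len, char *dst, int use_lz4)
{
    if (use_lz4) {
        int n = LZ4_compress_default(src, dst + 1, len, LZ4_compressBound(len));
        if (n > 0 && n < len) {            /* keep it only if it helped */
            dst[0] = FRAME_LZ4;
            return 1 + n;
        }
    }
    dst[0] = FRAME_RAW;                    /* compression off or not worthwhile */
    memcpy(dst + 1, src, len);
    return 1 + len;
}

/* Decode a frame into dst (capacity cap); returns payload size or -1. */
static int frame_decode(const char *frame, int flen, char *dst, int cap)
{
    if (flen < 1) return -1;
    if (frame[0] == FRAME_RAW) {
        if (flen - 1 > cap) return -1;
        memcpy(dst, frame + 1, flen - 1);
        return flen - 1;
    }
    return LZ4_decompress_safe(frame + 1, dst, flen - 1, cap);
}
```

The nice property of the flag byte is that a small device that can't afford LZ4 can simply always send raw frames and reject compressed ones.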
I'd like to reiterate my support for a simple endian-neutral binary format. Encode integers using the simple variable-length scheme for numerical constants outlined in section 2.1 of The Case for Universal Symbol Files by Michael Franz [PDF here]:
Our symbol files encode constant data in a machine-independent and space-economical way. The two basic formats that are used for representing numbers and strings require a variable number of bytes on the file, and use a stop bit for denoting the last byte of a sequence. In order to keep matters simple, we purposefully represent each data value by an integral number of bytes, disregarding the potential further savings in size that might be possible if byte boundaries were transcended.
The scheme we employ for coding integers, suggested by Odersky [9], can be applied to values of any magnitude and is independent of the word length and the byte ordering of the machine used. Not only is the resulting file representation portable, but so is the algorithm, which can be used on any machine that offers 2's complement arithmetic.

The encoding and decoding are very simple, and there is an example encoder given in the paper.
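Here's a rough C sketch of that scheme as I read it (7 value bits per byte, least-significant group first, the high bit as the stop bit, the sign carried in the final byte); the function names are mine, not the paper's:

```c
#include <stdint.h>

/* Encode x into out (max 10 bytes for a 64-bit value); returns bytes written. */
static int write_num(int64_t x, uint8_t *out)
{
    int n = 0;
    while (x < -64 || x > 63) {               /* more than 7 signed bits left */
        out[n++] = (uint8_t)(x & 0x7F);       /* stop bit clear: continue */
        x >>= 7;                              /* arithmetic shift keeps the sign */
    }
    out[n++] = (uint8_t)((x & 0x7F) | 0x80);  /* stop bit set: last byte */
    return n;
}

/* Decode one number from in into *x; returns bytes consumed. */
static int read_num(const uint8_t *in, int64_t *x)
{
    uint64_t v = 0;
    int n = 0, shift = 0;
    uint8_t b;
    do {
        b = in[n++];
        v |= (uint64_t)(b & 0x7F) << shift;
        shift += 7;
    } while (!(b & 0x80) && shift < 64);      /* guard: 64-bit values max */
    if ((b & 0x40) && shift < 64)             /* sign-extend from the last byte */
        v |= ~(uint64_t)0 << shift;
    *x = (int64_t)v;
    return n;
}
```

Small values cost one byte, and the format never depends on the machine's word size or byte order, which is exactly the point.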
Don't use the string format outlined there, however, as it assumes a 7-bit character set. Instead, encode strings and arbitrary buffers using a format similar to Pascal string storage: a size integer, encoded with the variable-length scheme above, followed by the raw bytes. This would allow strings/buffers of any length.
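Sticking with the same sketch, a buffer would then just be the varint length followed by the bytes verbatim, so any byte sequence (UTF-8 text or binary) round-trips; `write_num` is the illustrative encoder from above:

```c
#include <string.h>

/* Length-prefixed buffer: varint size (scheme above) + raw bytes.
 * out must be large enough; returns total bytes written. */
static int write_buf(const uint8_t *data, int64_t len, uint8_t *out)
{
    int n = write_num(len, out);        /* size first, stop-bit encoded */
    memcpy(out + n, data, (size_t)len); /* then the payload, untouched  */
    return n + (int)len;
}
```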
--- off topic ---
Beyond the real problems with binary formats, their most psychologically destructive property is that everybody has the self-proclaimed best encoding, the fastest encoding, the most space-efficient encoding, the most flexible encoding, and so on, which differ only in one or two bits somewhere and are therefore 100% incompatible. Thank you.
Why? Because binary is the way to go if you want fast, efficient, direct, buzzword, buzzword, ....
Look at text-based formats. Do all these requirements matter there? No. Their implementors focused primarily on the content, in conjunction with readability, maintainability, flexibility, expandability, and independence. All properties that are required for maintaining long-term applications.