On the server I'm forcing compression with a custom message handler. The handler checks the Accept-Encoding header and, if the requested encoding is supported (e.g. gzip), swaps out the HttpResponseMessage.Content with an instance of CompressedContent, which simply compresses the original content like so:
protected override async Task SerializeToStreamAsync(Stream stream, TransportContext context)
{
    Ensure.Argument.NotNull(stream, "stream");

    using (content) // dispose the original (wrapped) content when done
    {
        // wrap the response stream in a new GZip/Deflate compression stream;
        // the using block guarantees it is flushed and disposed even if the copy throws
        using (var compressed = GetStream(CompressionMode.Compress, stream))
        {
            await content.CopyToAsync(compressed);
        }
    }
}
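For context, the GetStream helper is not shown in the snippets above; a minimal sketch of what it might look like, shared by both content types (the encodingType field and its values are assumptions, not the author's exact code):

```csharp
// Possible shape of the GetStream helper used by both content types.
// encodingType would be set in the constructor from the request's
// Accept-Encoding (server) or the response's Content-Encoding (client).
private Stream GetStream(CompressionMode mode, Stream stream)
{
    switch (encodingType)
    {
        case "gzip":
            // leaveOpen: true so disposing the wrapper does not close
            // the underlying request/response stream
            return new GZipStream(stream, mode, leaveOpen: true);
        case "deflate":
            return new DeflateStream(stream, mode, leaveOpen: true);
        default:
            throw new NotSupportedException("Unsupported encoding: " + encodingType);
    }
}
```

Passing leaveOpen: true matters here: GZipStream and DeflateStream close the wrapped stream on dispose by default, which would close the response stream out from under the host.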
On the client, we can achieve the decompression by checking the Content-Encoding header and using another HttpContent type to perform the decompression:
protected override async Task SerializeToStreamAsync(Stream stream, TransportContext context)
{
    Ensure.Argument.NotNull(stream, "stream");

    using (content) // dispose the original compressed content when done
    {
        var compressed = await content.ReadAsStreamAsync();

        // wrap the compressed stream in a GZip/Deflate decompression stream;
        // the using block guarantees disposal even if the copy throws
        using (var decompressed = GetStream(CompressionMode.Decompress, compressed))
        {
            await decompressed.CopyToAsync(stream);
        }
    }
}
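On the client, this content type would typically be swapped in by a DelegatingHandler that inspects the Content-Encoding header of each response; a minimal sketch, assuming the custom content type above is named DecompressedContent (the handler name and constructor signature are illustrative, not the author's code):

```csharp
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public class ClientCompressionHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);

        // If the server compressed the response, wrap the content so it
        // is decompressed transparently when read.
        var encoding = response.Content == null
            ? null
            : response.Content.Headers.ContentEncoding.FirstOrDefault();

        if (encoding == "gzip" || encoding == "deflate")
        {
            response.Content = new DecompressedContent(response.Content, encoding);
        }

        return response;
    }
}
```

The handler would be passed to the HttpClient constructor (or registered in its handler pipeline) so callers never see the compressed payload.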
The part I am unsure of is whether we should be using a custom HttpContent type to do the decompression. On the server it makes sense, since we don't really have any other way of touching the response stream. On the client, however, the decompression could be done by reading the standard StreamContent directly, or even with a custom HttpClient implementation.
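It's also worth noting that for standard gzip/deflate the custom client-side content type can be avoided entirely: HttpClientHandler supports automatic decompression out of the box.

```csharp
using System.Net;
using System.Net.Http;

var handler = new HttpClientHandler
{
    // Adds the matching Accept-Encoding header to requests and
    // transparently decompresses gzip/deflate responses.
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

var client = new HttpClient(handler);
```

This covers the common case; the custom DecompressedContent approach is only needed for encodings the handler doesn't support.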