Varnish FetchError: Gunzip+ESI Failed at the very end



Mattias Geniar, November 26, 2014


This error can occur in a Varnish setup in multiple forms, but the log lines commonly include a Gunzip + ESI error like the ones below.

FetchError   c Gunzip+ESI Failed at the very end
...
FetchError   c TestGunzip error at the very end

The most common cause appears to be a backend response that combines gzip encoding with a Content-Length of 0. To troubleshoot these, start a varnishlog that filters on the 503 status Varnish sends back to the client.

$ varnishlog -m TxStatus:503

Your output should be similar to the one below.

$ varnishlog -m TxStatus:503
...
   12 RxRequest    c GET
   12 RxURL        c /your/url
   12 RxProtocol   c HTTP/1.1
...
   12 VCL_call     c recv lookup
   12 VCL_call     c hash
...
   12 ObjProtocol  c HTTP/1.1
   12 ObjResponse  c OK
   12 ObjHeader    c Date: Wed, 26 Nov 2014 21:44:45 GMT
   12 ObjHeader    c Server: Apache
   12 ObjHeader    c Content-Encoding: gzip
   12 ObjHeader    c Content-Type: application/json
   12 Gzip         c U F E 0 0 0 0 0
   12 FetchError   c Gunzip+ESI Failed at the very end
   12 Gzip         c G F E 0 26 80 128 138
   12 VCL_call     c error deliver
   12 VCL_call     c deliver deliver
   12 TxProtocol   c HTTP/1.1
   12 TxStatus     c 503
   12 TxResponse   c Service Unavailable
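
As an alternative, you can filter on the FetchError tag itself instead of the 503 status, to catch only these gunzip-related failures. Adjust the regex to the exact error you're seeing.

$ varnishlog -m "FetchError:Gunzip"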

The backend in my case sent an HTTP 200 response, but without a Content-Length header, as it was using Transfer-Encoding: chunked. The Apache logs also did not show a Content-Length, because of the chunked encoding (where the client receives the bits and bytes as soon as they're available, instead of waiting for the entire response at once). The response was, in fact, completely empty – a zero byte body. Here's the Apache log.

10.0.1.5 - - [26/Nov/2014:21:44:45 +0100] "GET /your/url HTTP/1.1" 200 - "-" "curl/7.37.1"

After the HTTP 200 code, the access log would normally show the size of the response in bytes. Here it was empty (again: a chunked response). It's in this scenario that Varnish runs into trouble: it tries to gunzip a zero-byte body in order to do the ESI processing, which ultimately fails.
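
You can reproduce the underlying problem on the command line: an empty body simply isn't a valid gzip stream (a gzip member starts with a fixed header), so any attempt to decompress it errors out. The exact error message depends on your gzip implementation.

$ printf '' | gunzip
gzip: stdin: unexpected end of file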

What actually happened

In my case, the backend responded properly with an HTTP 200 code, with a gzip’d empty response. Here’s the response coming directly from the backend (without Varnish in the middle of it).

$ curl http://...../your/url -H "Accept-Encoding: gzip"
HTTP/1.1 200 OK
Date: Wed, 26 Nov 2014 21:48:25 GMT
Server: Apache
Cache-Control: public, max-age=300
Last-Modified: Wed, 26 Nov 2014 21:48:25 +0000
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Vary: Cookie,Accept-Encoding
Content-Encoding: gzip
Transfer-Encoding: chunked
Content-Type: application/json

As PHK says:

If the backend sends "Content-Length: 0" and "Content-Encoding: gzip" it is buggy, work around it. If the backend sends chunked encoding we don't catch it and fail the transfer.

Poul-Henning Kamp

This poses a problem: if your backend is misbehaving, how do you make Varnish work around it?

Disable compression in the backend

One solution I found to work was to disable all compression done by the backend (either in the Apache webserver or, in the case of PHP, in any custom output handlers that gzip the response) and let Varnish handle the gzip'ing.

If you can disable gzip support in your webserver (e.g. by disabling mod_deflate/mod_gzip in Apache), that's the easiest fix. Alternatively, you should be able to modify vcl_recv and strip the Accept-Encoding header from the request, making the backend think the client doesn't support any kind of compression (see the sketch below).
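
Here's a minimal sketch of that second option, assuming a Varnish 3.x VCL. Keep in mind that when Varnish's own gzip support (the http_gzip_support parameter) is enabled, Varnish can still ask the backend for gzip on the fetch itself, so disabling compression in the webserver or application remains the more reliable fix.

sub vcl_recv {
    # Make the backend believe the client can't handle compressed
    # responses, so it never sends a gzip'd body to Varnish.
    unset req.http.Accept-Encoding;
}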

Let Varnish handle the gzip compression

Since your clients would still want gzip'd responses, you can let Varnish do the compression. To enable Varnish gzip on the responses, add the following to your vcl_fetch (Varnish 3.x and upwards only).

...
    if (beresp.http.content-type ~ "text/plain"
          || beresp.http.content-type ~ "text/xml"
          || beresp.http.content-type ~ "text/css"
          || beresp.http.content-type ~ "text/html"
          || beresp.http.content-type ~ "application/(x-)?javascript"
          || beresp.http.content-type ~ "application/(x-)?font-ttf"
          || beresp.http.content-type ~ "application/(x-)?font-opentype"
          || beresp.http.content-type ~ "application/font-woff"
          || beresp.http.content-type ~ "application/vnd\.ms-fontobject"
          || beresp.http.content-type ~ "image/svg\+xml"
       ) {
        set beresp.do_gzip = true;
    }
...

This will make Varnish compress the response more intelligently, but only for the few Content-Types that actually benefit from it. There is very little gain in gzip'ing a JPEG or PNG image, so we filter those Content-Types out.
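
Remember that this error only occurs on objects that go through ESI processing in the first place (that's the Gunzip+ESI code path). If you enable ESI selectively, the do_esi flag lives in the same vcl_fetch; the URL pattern below is purely a hypothetical example.

sub vcl_fetch {
    # Hypothetical URL pattern: only parse ESI where you actually use it.
    if (req.url ~ "^/pages/") {
        set beresp.do_esi = true;
    }
    # ... the Content-Type based do_gzip logic from above goes here ...
}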

For the record, this was caused by a Drupal site that did gzip compression in PHP (Settings > Configuration > Development > Performance > Compress cached pages). Drupal's compression implementation appears to send a gzip'd version of the page even for responses that cannot usefully be gzip'd (or: where gzip has no benefit, since there is nothing to compress in a 0 byte response).

Disabling gzip in Drupal and letting Varnish handle it more intelligently seems to resolve this particular problem. I've not found a Drupal bug report for this; perhaps it's even intended behaviour from their point of view – but it sure caused some issues with Varnish.


