#1443 Absence of optional Content-Range crashes my script


The addition of Content-Range in HTTP 416 responses is optional per RFC 2616, but curl misbehaves when it is not present.

#306MB file
curl -L -C - 'http://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/data/ice_surface/grid_registered/netcdf/readme_etopo1_netcdf.txt' -o ./data/countries.zip
#1.1kB file
curl -L -C - 'http://www.ngdc.noaa.gov/mgg/global/relief/ETOPO1/data/ice_surface/grid_registered/netcdf/readme_etopo1_netcdf.txt' -o ./data/texte.txt

The target server definitely supports resuming unfinished downloads. However, when
the download is already finished and the curl request is sent again, curl aborts my script with the error message:

“curl: (33) HTTP server doesn't seem to support byte ranges. Cannot resume.”

It should instead notice that the file is already complete and report success.

The bug occurs with the ngdc.noaa.gov website.

With Wireshark I checked what is going on in the HTTP protocol. Basically, when curl makes the request to resume the completed file, the server sends back an HTTP 416 error ("Requested Range Not Satisfiable"). In the case of naturalearthdata.com, the CDN they use adds a Content-Range header specifying the exact length of the file. ngdc.noaa.gov does not add this header. Note that the addition of Content-Range in HTTP 416 responses is optional per RFC 2616.
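For illustration, the difference between the two servers' 416 responses looks roughly like this (header values are invented for the example, not captured from either server):

```
HTTP/1.1 416 Requested Range Not Satisfiable
Content-Range: bytes */1130

HTTP/1.1 416 Requested Range Not Satisfiable
Content-Length: 0
```

In the first form, "bytes */1130" gives the complete length of the resource, so the client can tell that a resume offset equal to 1130 means the file is already fully downloaded; in the second form, that information is simply absent.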

curl uses Content-Range to determine if the download is complete. If the header is missing, curl assumes that the server doesn't support range downloads and spits out that error message.


  • Daniel Stenberg

    Daniel Stenberg - 2014-10-29

    I agree. We should do better than this.

  • Daniel Stenberg

    Daniel Stenberg - 2014-10-29
    • status: open --> open-confirmed
    • assigned_to: Daniel Stenberg
  • hugo

    hugo - 2014-10-29

    The user who helped make this diagnosis suggested using

       curl -I <URL> | grep Content-Length | cut -d' ' -f 2

    to obtain the length of the file and checking that against the downloaded file size before running curl. (http://stackoverflow.com/a/25434725/1974961)
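The grep/cut pipeline is a bit fragile: HTTP header lines end in CRLF, so the extracted number carries a trailing carriage return, and header names are case-insensitive. A slightly more robust sketch of the same pre-check (the header block below is hand-written sample data, not output from the server in question):

```shell
#!/bin/sh
# Extract the Content-Length value from a raw header block:
# strip carriage returns and match the header name case-insensitively.
content_length() {
  printf '%s\n' "$1" | tr -d '\r' \
    | awk 'tolower($1) == "content-length:" { print $2; exit }'
}

# Sample header block (invented values):
headers='HTTP/1.1 200 OK
Content-Length: 1130
Content-Type: text/plain'

content_length "$headers"    # prints 1130
```

In practice one would feed it the output of `curl -sI "$url"` and compare the result against the local file size (`stat -c%s` on GNU, `stat -f%z` on BSD/macOS) before deciding whether to run curl with `-C -` at all.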

    Last edit: hugo 2014-10-30
  • Daniel Stenberg

    Daniel Stenberg - 2014-10-30

    So what about a patch like this, does it make your case better?

    diff --git a/lib/transfer.c b/lib/transfer.c
    index b5ba86e..6d4ad43 100644
    --- a/lib/transfer.c
    +++ b/lib/transfer.c
    @@ -545,10 +545,22 @@ static CURLcode readwrite_data(struct SessionHandle *data,
                 infof(data, "Ignoring the response-body\n");
               if(data->state.resume_from && !k->content_range &&
                  (data->set.httpreq==HTTPREQ_GET) &&
                  !k->ignorebody) {
    +            if(k->size == data->state.resume_from) {
    +              /* The resume point is at the end of file, consider this fine
    +                 even if it doesn't allow resume from here. */
    +              infof(data, "The entire document is already downloaded");
    +              connclose(conn, "already downloaded");
    +              /* Abort download */
    +              k->keepon &= ~KEEP_RECV;
    +              *done = TRUE;
    +              return CURLE_OK;
    +            }
                 /* we wanted to resume a download, although the server doesn't
                  * seem to support this and we did this with a GET (if it
                  * wasn't a GET we did a POST or PUT resume) */
  • hugo

    hugo - 2014-11-01

    Seems good. It would let curl continue cleanly when the file has already been downloaded completely. Let's roll with this :)

  • Daniel Stenberg

    Daniel Stenberg - 2014-11-01

    Thanks, fix pushed now together with two test cases.

  • Daniel Stenberg

    Daniel Stenberg - 2014-11-01
    • status: open-confirmed --> closed-fixed
  • hugo

    hugo - 2014-11-07

    Daniel, when will this bug-fix be published to the Ubuntu packages?

    • Daniel Stenberg

      Daniel Stenberg - 2014-11-10

      You need to ask the Ubuntu people about that. We release curl, they decide on their own what releases from us to use and when.

  • hugo

    hugo - 2014-11-10

    Noted. I will check back in 3 months. Thanks for everything!

