
#20 content-encoding gzip not uncompressed

Status: open
Owner: nobody
Labels: None
Priority: 5
Updated: 2002-03-06
Created: 2002-03-06
Creator: Anonymous
Private: No

Perl version 5.6.1
Linux (and Solaris, Perl version 5.005_03)
libwww version 5.53 and 5.63

GETting gzip content from an IIS server. The headers from the server
include "Content-Encoding: gzip", and I do have Compress::Zlib
installed and in my path. However, the body does not get uncompressed.

I also suspect that libwww is not checking for Compress::Zlib during
"make test".
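
A rough sketch of the kind of request involved; the URL is a stand-in
for the real IIS server, and the Accept-Encoding header just asks the
server for a gzipped body:

    use strict;
    use LWP::UserAgent;
    use HTTP::Request;

    my $ua  = LWP::UserAgent->new;
    my $req = HTTP::Request->new(GET => 'http://iis.example.com/page.html');
    $req->header('Accept-Encoding' => 'gzip');

    my $res = $ua->request($req);
    print "Content-Type:     ", scalar $res->content_type, "\n";
    print "Content-Encoding: ", $res->header('Content-Encoding') || 'none', "\n";

    # The start of the body is still a raw gzip stream, not readable HTML.
    print substr($res->content, 0, 40), "\n";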

Yours
Faye

fayegibbins@if.com

Unix Script Architect
Technical Architect Team

Discussion

  • Gisle Aas

    Gisle Aas - 2002-03-17

    Logged In: YES
    user_id=59084

    Can you explain a bit more why this would be a good thing? If you
    download a big tarball, you don't want it uncompressed for you, I
    would guess.

     
  • Nobody/Anonymous

    Logged In: NO

    When downloading a page which, for the sake of argument, is of
    the text/html MIME type, with headers showing that the content is
    gzipped, the routines [in LWP] that are meant to see that it is
    gzipped and to uncompress the body (as in HTTP/1.1) are not
    activated. I checked by adding little print statements to them.
    If you're downloading a big tarball it would have a different
    MIME type and should be caught as an exception.

    What is strange is that LWP looks like it should handle HTML when
    compressed; it has all the routines in place. Even the docs say
    that it should deal with gzipped HTML.

    For example, in Net/HTTP.pm:

    "Get/set a value indicating if the request will be sent with a
    "TE" header to indicate the transfer encodings that the server
    can choose to use. If the C<Compress::Zlib> module is installed
    then this will announce that this client accepts both the
    I<deflate> and I<gzip> encodings."

    and in "Net/HTTP/methods.pm"

    "elsif ($self->send_te && zlib_ok()) {
    # gzip is less wanted since the Compress::Zlib
    interface for
    # it does not really allow chunked decoding to take
    place easily.
    push(@h2, "TE: deflate,gzip;q=0.3");
    push(@connection, "TE");
    }
    "

    and here is the block for decompressing gzipped HTML, but it is
    never reached even if the server and client agree to transmit
    compressed HTML:

    "for (@te) {
    if ($_ eq "deflate" && zlib_ok()) {
    #require Compress::Zlib;
    my $i = Compress::Zlib::inflateInit();
    die "Can't make inflator" unless $i;
    $_ = sub { scalar($i->inflate($_[0])) }
    }
    elsif ($_ eq "gzip" && zlib_ok()) {
    #require Compress::Zlib;
    my @buf;
    $_ = sub {
    push(@buf, $_[0]);
    return Compress::Zlib::memGunzip
    (join("", @buf)) if $_[1
    ];
    return "";
    };
    }
    "

    I checked up on the HTTP headers the server was sending out and
    made sure that the compression flags complied with the standard,
    which they do.

    Also, the docs say that if you have Compress::Zlib installed when
    installing LWP, extra checks will be made; however, this is not
    the case.
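
    A sketch of the sort of manual workaround this forces, assuming
    Compress::Zlib is installed and the server really does answer
    with "Content-Encoding: gzip" (the URL is a placeholder):

        use strict;
        use LWP::UserAgent;
        use HTTP::Request;
        use Compress::Zlib ();

        my $ua  = LWP::UserAgent->new;
        my $req = HTTP::Request->new(GET => 'http://iis.example.com/page.html');
        $req->header('Accept-Encoding' => 'gzip');

        my $res  = $ua->request($req);
        my $body = $res->content;

        # LWP hands the body back still compressed, so gunzip it ourselves.
        if (($res->header('Content-Encoding') || '') =~ /\bgzip\b/i) {
            my $html = Compress::Zlib::memGunzip($body);
            die "memGunzip failed" unless defined $html;
            $body = $html;
        }
        print $body;

    memGunzip works on the whole buffer at once, which is fine here
    because $res->content already holds the complete body.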

    Yours
    Faye

     
  • Terje Bless

    Terje Bless - 2002-10-17

    Logged In: YES
    user_id=8470

    "Content-Transfer-Encoding: gzip/inflate" is _not_ the same
    as "Content-Type: application/gzip". When you download a
    .tgz you get the latter. AFAICT this bug report is that a
    "text/html" document returned with
    "Content-Transer-Encoding: gzip" is not unpacked before
    being returned. Given that the documentation suggests that
    this should be supported, this rates as a bug.

    But the Changes file also suggests this may be fixed by now
    so the bug can perhaps be closed?
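
    A quick way to see the distinction being drawn (both URLs are
    placeholders): a compressed HTML page still reports a text/html
    content type plus a gzip content coding, while a .tgz download is
    just a gzip-typed body with no content coding at all.

        use strict;
        use LWP::UserAgent;
        use HTTP::Request;

        my $ua = LWP::UserAgent->new;

        for my $url ('http://example.com/page.html',      # gzip as a content *coding*
                     'http://example.com/file.tar.gz') {  # gzip as the content *type*
            my $req = HTTP::Request->new(GET => $url);
            $req->header('Accept-Encoding' => 'gzip');
            my $res = $ua->request($req);
            printf "%-35s type=%s coding=%s\n",
                $url, scalar $res->content_type,
                $res->header('Content-Encoding') || '-';
        }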

     
