Re: Potential GZIP Bug
From: Hans-Bernhard Broeker
Subject: Re: Potential GZIP Bug
Date: 10 Apr 2001 16:03:20 GMT
Hrin, Mike <address@hidden> wrote:
> Hello,
> I am trying to use gunzip (or gzip -d) to decompress a TAR file. The
> compressed file is 494 MB, and the directory to which I'm unzipping the file
> contains 22 GB of free space. I'm estimating the decompressed file will be
> 8-10 GB.
Why estimate? Ask gzip for the actual uncompressed size:

    gzip -lvN inputfile

will print out something like this:

    method  crc      date  time   compressed  uncompr.  ratio  uncompressed_name
    defla   8361aa8d Sep 21 20:47      10124     48128  79.0%  [...]

(One caveat: the gzip format stores the uncompressed size modulo 2^32, so for
a file that decompresses to more than 4 GB the reported figure will be too
small.)
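The listing above can be tried end to end on a small file; "demo.txt" below
is just a placeholder name for illustration:

```shell
# Create a small file, compress it, and list the stored sizes.
# "demo.txt" is a hypothetical file name, not from the original report.
printf 'hello gzip large file question\n' > demo.txt
gzip -f demo.txt            # produces demo.txt.gz, removing demo.txt
gzip -lvN demo.txt.gz       # prints method, crc, sizes, ratio, stored name
```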
> However, when the decompressed file reaches approximately 2GB, the
> app fails with the error of "file size too large." Any ideas on what could
> be causing this?
Obviously, your 'gzip' wasn't built with large file support: with 32-bit
file offsets, the output file cannot grow past 2 GB, which matches the
failure you're seeing. You may need an alpha release of gzip to get large
file support.
--
Hans-Bernhard Broeker (address@hidden)
Even if all the snow were burnt, ashes would remain.