Re: cut fails with "cut: memory exhausted" when taking a large slice
From: Mordy Ovits
Subject: Re: cut fails with "cut: memory exhausted" when taking a large slice
Date: Wed, 21 Apr 2004 15:43:49 -0400
User-agent: KMail/1.5.3
On Wednesday 21 April 2004 03:35 pm, Jim Meyering wrote:
> If you want to continue using cut, you'll have better
> luck with the latest:
>
> ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.gz
> ftp://ftp.gnu.org/gnu/coreutils/coreutils-5.2.1.tar.bz2
Excellent. Will do.
> Or, just use head and tail with their --bytes=N options.
>
> e.g., head --bytes=N < FILE | tail --bytes=412569600
>
> where N is chosen so that the head command outputs everything
> in the file up to and including the desired range of bytes.
Sure, but that reads the whole file up to the end of the range and pushes it
all through the pipe. That's far more I/O than is strictly necessary,
especially with a 9GB file.
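As a concrete sketch of the head|tail approach Jim describes, and of one way to sidestep the extra pipe traffic: GNU tail's `--bytes=+K` form starts output at byte K, and on a regular file it seeks there rather than reading the prefix. (The `+K` variant is my addition here, not something proposed in the thread; byte offsets below are illustrative.)

```shell
# Demo file: 10 known bytes.
printf 'abcdefghij' > /tmp/slice_demo.bin

# Goal: bytes 3..7 (1-indexed), i.e. "cdefg" -- 5 bytes ending at offset 7.

# head|tail pipeline from the thread: head emits the first 7 bytes,
# tail keeps the last 5 of those. The whole prefix flows through the pipe.
head --bytes=7 < /tmp/slice_demo.bin | tail --bytes=5

# Seek-based alternative: tail --bytes=+3 starts at byte 3, and on a
# regular file GNU tail seeks there instead of reading the prefix.
tail --bytes=+3 /tmp/slice_demo.bin | head --bytes=5
```

Both commands print the same 5-byte slice; the second avoids pumping the leading bytes through the pipe, which matters for a multi-gigabyte file.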
Thanks,
Mordy
--
Mordy Ovits
Network Security
Bloomberg L.P.