
Re: Import large field-delimited file with strings and numbers

From: Joao Rodrigues
Subject: Re: Import large field-delimited file with strings and numbers
Date: Mon, 08 Sep 2014 20:44:55 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:24.0) Gecko/20100101 Thunderbird/24.6.0

On 09/08/2014 08:27 PM, Markus Bergholz wrote:

Bottom line: I think it has to do with the way Octave allocates memory to cells, which is not very efficient (as opposed to dense or sparse numerical data, which it handles very well).

I managed to solve the problem I had, thanks to the help of you guys.

However, I think it would probably be nice if in future versions of Octave there was something akin to ulimit installed by default to prevent a process from eating up all available memory.
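In the meantime, a per-shell cap can already be set by hand before launching Octave. A minimal sketch (the 4 GiB figure is just an illustrative choice, not a recommendation):

```shell
# Cap the virtual address space of every process started from this
# shell (the value is in KiB; 4 GiB here is an illustrative limit).
ulimit -v 4194304

# Confirm the cap. An Octave session started from this shell will now
# get an out-of-memory error from a runaway allocation instead of
# dragging the whole machine into swap.
ulimit -v
```

The limit applies only to the current shell and its children, so it is easy to scope it to a single Octave run.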

If someone wants to check this issue the data I am working with is public:*/csv/*

where * = 1990:2013

> nvm, got it.
> which columns do you need?
Hi. As I said above, I already solved the problem (with your help).

I just put the link so that someone interested can check the memory overload problem.

(But the data I need to extract is in columns 1-3 and 8-11.)
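Since only a handful of columns are needed, one way to sidestep the cell-memory overhead entirely is to strip the file down in the shell before Octave ever sees it. A minimal sketch, assuming a comma-delimited file (`data.csv` is a hypothetical name, and the 11-column sample line is made up for illustration):

```shell
# Build a tiny sample file with 11 comma-separated columns
# (stand-in for the real CSV, which is much larger).
printf 'a,b,c,d,e,f,g,h,i,j,k\n1,2,3,4,5,6,7,8,9,10,11\n' > data.csv

# Keep only columns 1-3 and 8-11, writing a much smaller file
# that Octave can then import without loading the unused fields.
cut -d, -f1-3,8-11 data.csv > slim.csv

cat slim.csv
```

The slimmed file can then be read in Octave as usual, with far less memory pressure than parsing every field of the original into cells.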

Many thanks
