Jordi Gutiérrez Hermoso
Thu, 1 Nov 2012 09:04:19 -0400
On 1 November 2012 04:37, <address@hidden> wrote:
> I am trying to read a csv file with about 25M data points. (250,000 rows,
> 100 columns). Using MacBook Pro OSX 10.8 with 8GB RAM, 2.3Ghz i7. I used
> csvread('file'). The process has been running for 1.5 days. It's currently
> using 2.5GB ram. The file is 230MB.
> This seems too slow. I didn't make a matrix of zeros before running the
> process. Also, now I have about 1GB of RAM left.
This sounds like a performance bug; please report it.
> Can someone give me insight into what's happening?
The csvread implementation probably needs to be improved. If your data
is all numerical, you could simply try "load" instead of "csvread".
Indeed, I don't even know what the purpose of csvread is, since it can
*only* read numerical data, which makes it almost useless, and it
probably has to remain that way for Matlab compatibility.
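For what it's worth, a minimal sketch of the load-based approach (assuming the file is named "file.csv" and contains only numbers; note that load expects whitespace-separated columns, so a comma-separated file may need dlmread with an explicit separator instead):

```octave
% Sketch: read an all-numeric file into a matrix.
data = load ("file.csv");           % works if columns are space/tab separated

% If the file really uses commas, dlmread takes the separator explicitly:
data = dlmread ("file.csv", ",");
```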
> If I interrupt the process will it keep the information that is
> already loaded? Or will I lose everything? Should I start quitting
> other processes to free up ram?
You will lose everything. Quitting other processes to free up RAM may
help, but it sounds undesirable.
Again, try load instead of csvread. It may fare better.
- Jordi G. H.