On 19.04.11 12:05, Peter Norlindh wrote:
Thank you Marco! I have now tried to read the data in bigger
chunks, and that seems to have sped things up a bit :)
Currently, I'm waiting for dlmread(FILE) to read the entire
file. It's been running for about one hour now and I expect it
to continue for quite some time. Are there ways to decrease the
reading time even further?
On Tue, Apr 19, 2011 at 10:09 AM, marco wrote:
Surely not by reading one element at a time, as you are doing with
[i, col, i, col].
On Tue, Apr 19, 2011 at 9:41 AM, Peter wrote:
> I am analyzing data from a 400MB ascii-file. There are about 30 million
> data points in the file and the execution takes a very long time.
> Currently the program reads and processes the file element-by-element ( val
> = dlmread(fileName, "emptyvalue", [i, col, i, col]) ) and I suspect that
> this is a very inefficient way to do it. Could you suggest a better
> approach?
The RANGE parameter may be a 4-element vector containing the upper
left and lower right corner `[R0,C0,R1,C1]', where the lowest index
value is zero.
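For example, a single RANGE call can pull in a whole block of rows at
once instead of one element per call (the file name, separator, and
sizes below are made up for illustration):

```octave
% Read rows 0..99999 of columns 0..2 (zero-based, per the RANGE form
% quoted above) in one call; "data.txt" is a hypothetical file.
block = dlmread ("data.txt", ",", [0, 0, 99999, 2]);
```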
If the data is already in matrix form, you should read all the data
in one pass:

DATA = dlmread (FILE)
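To make the contrast concrete, here is a sketch of the per-element
loop versus the single-pass read; the file name, column index, and row
count are assumptions for illustration, not from the original thread:

```octave
% Hypothetical setup, assumed known for this sketch.
fileName = "data.txt";
col = 0;            % zero-based column index, as dlmread's RANGE uses
nrows = 1000;

% Slow: one dlmread call -- and one full parse of the file -- per element.
for i = 0:nrows-1
  val = dlmread (fileName, "emptyvalue", [i, col, i, col]);
  % ... process val ...
endfor

% Fast: parse the file once, then index the in-memory matrix.
DATA = dlmread (fileName);
vals = DATA(:, col + 1);   % note: matrix indexing is one-based
```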
Help-octave mailing list
If you have to read the file more than once, it might be worth saving
it in binary format after having read it the first time. All
subsequent reads should be very much faster.
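A minimal sketch of that caching pattern, using Octave's binary save
format (the file names here are assumptions):

```octave
% First run: pay the ASCII parsing cost once, then cache in binary.
DATA = dlmread ("data.txt");
save ("-binary", "data.bin", "DATA");

% Later runs: loading the binary file skips ASCII parsing entirely.
load ("data.bin");    % restores the variable DATA into the workspace
```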