
Re: Optimal way to handle big data table ?


From: Alexander Barth
Subject: Re: Optimal way to handle big data table ?
Date: Wed, 20 Mar 2013 13:39:21 +0100

Hi Pascal,

You get the best input/output performance by using a binary file format (such as Octave's mat format).
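
For example, a minimal sketch of saving the table once in Octave's binary format and reloading it later (the file name "cie1931_table.mat" is just a placeholder, and CIE31Table is the variable from your message):

  % save once, in Octave's binary format (much faster to reload than text)
  save ("-binary", "cie1931_table.mat", "CIE31Table");

  % later, in the analysis function:
  load ("cie1931_table.mat");   % restores CIE31Table into the workspace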

Cheers,
Alex


On Thu, Mar 7, 2013 at 11:07 AM, CdeMills <address@hidden> wrote:
Hello,

I was recently doing lamp spectrum analysis to extract photometric
properties. This requires computing the integral of the spectrum weighted by
the CIE 1931 sensitivity functions; they are tabulated at 400 wavelengths,
with 4 values each. What is the best way to use those data inside a function?
1) encode them inside the function body? The function is compiled only once.
CIE31Table = [360 0.000130 0.000004 0.000606
              361 0.000146 0.000004 0.000681
              362 0.000164 0.000005 0.000765 ... ];
2) read them from a text file?
3) read them from a binary file?
4) other?

The point is to minimise the computational cost of refilling this matrix with
its 1600 entries each time the function is called.
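
As an illustration of option 1, here is a minimal sketch using a persistent variable, so the matrix is filled only on the first call and reused afterwards (the function name is a placeholder and the table is truncated to the rows shown above):

  function tbl = cie1931_table ()
    persistent CIE31Table;            % kept between calls, filled only once
    if (isempty (CIE31Table))
      CIE31Table = [360 0.000130 0.000004 0.000606
                    361 0.000146 0.000004 0.000681
                    362 0.000164 0.000005 0.000765];   % ... remaining rows elided
    endif
    tbl = CIE31Table;
  endfunction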

Regards

Pascal


