
Re: ATLAS and octave


From: Al Goldstein
Subject: Re: ATLAS and octave
Date: Mon, 14 Feb 2000 18:46:23 -0800 (PST)

Thanks, Clint.  I tried it with my FreeBSD octave 2.0.16.  Wow.  It's souped up!

On Sat, 5 Feb 2000, R Clint Whaley wrote:

> Guys,
> 
> As I was attempting to sleep, I realized there might be a relatively
> easy way to add ATLAS to octave.
> 
> All ATLAS internal routines begin with ATL_ (to avoid naming conflicts
> with other libraries), and the rest of the defined routines are 
> BLAS and LAPACK API routines.  This means, I think, that we can
> put ATLAS into libcruft without too much problem.
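> 
> You can sanity-check the naming on an installed copy with nm (the path
> below just follows the ATLAS build-tree layout used further down):
> 
>    nm ATLAS/lib/<arch>/libatlas.a | grep ' T ' | head
> 
> Everything internal should show up prefixed with ATL_; only the interface
> libraries export the standard BLAS/LAPACK names.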
> 
> The most direct method is, with a previously installed ATLAS, to build
> octave's libcruft as normal and then ar into it the contents of the
> following archives (a rough recipe follows the list):
>    ATLAS/lib/<arch>/liblapack.a (overwriting some LAPACK routines)
>    ATLAS/lib/<arch>/libatlas.a  (ATLAS internals)
>    ATLAS/lib/<arch>/libf77blas.a (F77 BLAS interface -- overwrite BLAS)
>    ATLAS/lib/<arch>/libcblas.a (C BLAS interface)
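> 
> The merge might look like this (untested sketch; libcruft.a's actual
> name and location depend on your octave build tree):
> 
>    mkdir tmp && cd tmp
>    for lib in liblapack libf77blas libcblas libatlas; do
>        ar x /path/to/ATLAS/lib/<arch>/$lib.a
>    done
>    ar r /path/to/octave/libcruft/libcruft.a *.o
>    ranlib /path/to/octave/libcruft/libcruft.a
> 
> Note that ar r only replaces members whose names match exactly; reference
> BLAS/LAPACK objects under different member names will still be in the
> archive, and the linker takes whichever definition it sees first.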
> 
> I have hacked up a quick version of this on my 600 MHz Athlon.  I'm sure
> everything is not 100% right, but already I can see this:
> 
> Octave au naturel:
> ===========================================================================
> octave:1>  A = rand(1000,1000); 
> octave:2> t1 = time(); lu(A) ; t2 = time() - t1
> t2 = 30.460
> octave:4> B = rand(1000,1000);
> octave:5> t1 = time() ; A * B ; t2 = time() - t1
> t2 = 41.757
> 
> 
> My hacked up octave/atlas:
> ===========================================================================
> octave:1> A = rand(1000,1000); 
> octave:2> t1 = time(); lu(A) ; t2 = time() - t1
> t2 = 5.1056
> octave:7> B = rand(1000,1000);
> octave:8> t1 = time() ; A * B ; t2 = time() - t1
> t2 = 3.0831
> 
> So with trivial effort, we've got something running quite a bit faster.
> All is not right with my hacked up octave, 'cause I'm not sure I got things
> done in the right order.  I rather suspect I'm not using ATLAS's lapack
> (I just hacked the libcruft/blas makefile and had it add all of atlas's
> files).  I'm also not sure why I'm getting 324 Mflop DGEMM on a platform
> where `barebones' ATLAS DGEMM gets almost 700 . . .
> However, the fact that the build succeeded tells me that ATLAS/octave are
> playing together nicely . . .
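> 
> (For what it's worth, the usual DGEMM operation count is 2*n^3 flops, so
> the 3.0831 second run above comes out to
> 
>    octave:1> n = 1000; t2 = 3.0831;
>    octave:2> 2*n^3 / t2 / 1e6
>    ans = 648.70
> 
> which is much closer to the barebones 700; the 324 number may just be an
> n^3-only counting convention.)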
> 
> Anyway, one easy approach is to formalize a process like this within the
> octave framework, perhaps as some kind of configure option.
> If you are interested, I would be happy to supply the required
> makefile / shell file / help needed to shove the atlas stuff into
> libcruft . . .
> 
> The cool thing is that ATLAS would then not have to be incorporated into
> octave, saving you from having to forward ATLAS queries on to us.  The
> process for a normal octave remains the same.  For the high-octane version,
> the user installs ATLAS first (sending mail to us if this doesn't work).
> With a good ATLAS install on the machine, you throw a configure option
> in octave, and the faster stuff is built . . .
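> 
> (Just to sketch it, the user-visible side might be nothing more than the
> following; the --with-atlas flag and path here are hypothetical, not an
> existing octave option:
> 
>    cd octave-2.0.16
>    ./configure --with-atlas=/usr/local/atlas
>    make
> 
> with configure doing the ar merge into libcruft whenever the flag is
> given.)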
> 
> Anyway, the sun is about to come up, so it must be time for bed.  Let me
> know what you think.
> 
> Cheers,
> Clint
> 



-----------------------------------------------------------------------
Octave is freely available under the terms of the GNU GPL.

Octave's home on the web:  http://www.che.wisc.edu/octave/octave.html
How to fund new projects:  http://www.che.wisc.edu/octave/funding.html
Subscription information:  http://www.che.wisc.edu/octave/archive.html
-----------------------------------------------------------------------


