octave-maintainers

Re: mpi 1.1.1 released


From: Michael Creel
Subject: Re: mpi 1.1.1 released
Date: Thu, 2 Jan 2014 10:22:56 +0100

Hi Carlo,
Sure, no problem. I'm not much of a real hacker when it comes to the internal workings, but I'm an experienced user.
M.


On Thu, Jan 2, 2014 at 10:19 AM, c. <address@hidden> wrote:

On 2 Jan 2014, at 08:56, Michael Creel <address@hidden> wrote:

> Hi Sukanta and others,
> I haven't been following this issue. I have been using the mpi package with Open MPI, currently v1.6.5, and Octave v3.6.4, on Debian. I use it on a daily basis, on up to 32 nodes, for runs that can go overnight. So far, I have not noticed a problem, but perhaps I'm not using whatever part might have the leak. Have you posted the code that shows the problem somewhere? If not, could you send it to me, please?
> Thanks,
> Michael
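For reference, a minimal send/receive loop of the kind that could be used to watch for memory growth might look roughly like the sketch below. The calls follow the send/receive examples shipped with the mpi package as I recall them (MPI_Comm_Load, MPI_Send, MPI_Recv, and so on); the exact signatures and the file name leak_probe.m are assumptions and should be checked against the installed release:

    % leak_probe.m -- hypothetical probe: repeatedly exchange a matrix
    % between rank 0 and the other ranks and watch resident memory.
    pkg load mpi;
    MPI_Init ();
    CW      = MPI_Comm_Load ("NEWORLD");     % world communicator
    my_rank = MPI_Comm_rank (CW);
    nproc   = MPI_Comm_size (CW);
    tag = 42;
    for i = 1:1000
      if (my_rank == 0)
        msg = rand (500);                     % ~2 MB of doubles per pass
        for dest = 1:nproc-1
          MPI_Send (msg, dest, tag, CW);      % rank 0 sends to every rank
        endfor
      else
        [msg, info] = MPI_Recv (0, tag, CW);  % other ranks receive from 0
      endif
    endfor
    MPI_Finalize ();

Something like "mpirun -np 4 octave -q leak_probe.m" would run it, and watching the octave processes in top while the loop runs should show whether memory keeps climbing.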

On 2 Jan 2014, c. <address@hidden> wrote:

> I am considering proposing a GSoC project about improvements to the MPI package; in particular, I'd like to add the
> ability for users to start parallel jobs and collect the output in an interactive Octave CLI/GUI session.
>
> If you have a non-trivial application built on Octave MPI, it would be great to use it for testing.
> Would it be possible to use your MATLES application for this purpose? Is it Free Software?
>
> In addition to solving the memory leak issue, do you have any other improvements that could be part of the project?
> Would you like to be a mentor for the project?

Michael,

Would you also like to be listed as a possible mentor for this project?
Your help would be greatly appreciated.

c.

