
Re: [Octave Forge] Octave 4.0 call for packages


From: Carnë Draug
Subject: Re: [Octave Forge] Octave 4.0 call for packages
Date: Mon, 13 Apr 2015 13:21:29 +0100

On 11 April 2015 at 13:07, Oliver Heimlich <address@hidden> wrote:
> On 11.04.2015 11:56, Oliver Heimlich wrote:
>>
>> On 11.04.2015 03:25, Mike Miller wrote:
>>>
>>> Would you mind sharing the test failures you are seeing? Either as bug
>>> reports against each package or a summary log file sent to the list or
>>> posted somewhere. Full output would be most helpful as many developers
>>> are unable to test on Windows.
>>>
>>> Thanks,
>>
>>
>> I am going to put the results into our wiki [1] as a table. I can put
>> the test logs on a private website and link to them in the table. Then,
>> we can collect references to existing bug reports/patches in a
>> structured way. This should simplify the assessment of all packages
>> together. And it might help to sort out deprecated packages.
>
>
> You can find the results in the wiki:
> http://wiki.octave.org/Octave-Forge#GNU_Octave_4.0_compatibility_assessment
>
> The wiki contains a link to my test logs from Win7.
>
> So far, there are 13 packages that seem fit and 10 with bug reports or
> known fixes. For many, I am unsure about their current state. Please
> update the status of packages that you know better than I do.
>

I changed the status of the image package from "broken" to "ok".

The logs show 25 failing tests.  I was expecting 23.  Those 23 failing
tests are for Matlab compatibility.  They fail because no one has fixed
the functions yet; they never worked in the first place.  I simply added
the tests so that at least I won't forget about them.  Those failures are
documented.  I don't want to use xtest for these because I think xtest is
meant for tests that involve some level of randomness.  In this case, the
functions really are not working as I would hope them to.
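
To illustrate the distinction (a minimal sketch with made-up names, not
the actual image package tests), appended to a hypothetical some_fn.m and
run with "test some_fn":

    %!test
    %! ## documented Matlab incompatibility: a plain test keeps failing
    %! ## visibly until the behaviour is actually implemented
    %! expected = [1 2; 3 4];             # hypothetical reference result
    %! assert (some_fn (magic (4)), expected);
    %!xtest
    %! ## xtest would report an "expected failure" instead; I would rather
    %! ## reserve that for tests affected by randomness
    %! assert (some_fn (rand (1, 100)), zeros (1, 100), 1e-10);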

The other 2 failing tests are new, but they are both caused by a random
signal, and the differences are within machine precision.  I should
loosen the tolerance of those tests.
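
Roughly like this (again just a sketch with placeholder function names,
not the real tests): pass a tolerance as the third argument of assert,
where a negative value is interpreted as a relative tolerance, so
differences on the order of eps no longer trip the test.

    %!test
    %! x = rand (1, 64);            # random input signal
    %! obs = some_filter (x);       # hypothetical function under test
    %! exp = reference_filter (x);  # hypothetical reference implementation
    %! ## a relative tolerance of a few eps absorbs rounding differences
    %! ## that are within machine precision
    %! assert (obs, exp, -16*eps);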

Carnë


