
Re: [XForms] automated regression tests for XForms


From: Alessandro Basili
Subject: Re: [XForms] automated regression tests for XForms
Date: Thu, 31 Jan 2013 09:12:21 +0100
User-agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:17.0) Gecko/20130107 Thunderbird/17.0.2


Hi Jean-Marc,

On 29/01/2013 17:06, Jean-Marc Lasgouttes wrote:
> On 29/01/2013 12:47, Jens Thoms Toerring wrote:
>> I guess the simplest thing would be to create a directory named 
>> e.g. 'Testing' in the main XForms directory and then do whatever 
>> you like in there. Once you have an idea how things are going to 
>> be organized we can discuss how to integrate it into the build 
>> system etc. (that's something I'm a bit worried about since I 
>> never managed to get the hang of this automake/autoconf stuff;-)
> 
> For what it's worth, we have started to use some automated tests in
> LyX that may be relevant: 
> http://git.lyx.org/?p=lyx.git;a=tree;f=development/autotests

that is definitely a good help; at least I'll have a reference to look
at. I may well need to come back to you for some details.

> This framework is a child of our former crash test environment
> that sends random events to LyX and makes a bug report when there is
> a crash: 
> http://git.lyx.org/?p=lyx.git;a=tree;f=development/keystest

That is interesting indeed: a random generator of X events can
certainly stress the interface far more than any user possibly could.
What concerns me a bit is that if the generated events differ from one
run to the next, it will be hard to compare test results.
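
Just to make that concrete, such a generator could be little more than
a loop around the XTest extension. The following is only a rough sketch
of my own (keystest certainly does something more elaborate, and it
should of course be pointed at a scratch display such as Xvfb); note
that a fixed seed makes the event sequence identical on every run,
which would also help with comparing results:

/* random_events.c - flood the display with pseudo-random pointer and
 * key events through the XTest extension.  Build with something like:
 *     cc random_events.c -o random_events -lXtst -lX11
 * The fixed seed keeps the event sequence reproducible between runs. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <X11/Xlib.h>
#include <X11/extensions/XTest.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int scr = DefaultScreen(dpy);
    int w = DisplayWidth(dpy, scr);
    int h = DisplayHeight(dpy, scr);

    srand(12345);                      /* fixed seed -> same sequence every run */

    for (long i = 0; i < 100000; i++) {
        /* move the pointer to a random position on the screen */
        XTestFakeMotionEvent(dpy, scr, rand() % w, rand() % h, CurrentTime);

        /* every now and then press and release a random mouse button */
        if (rand() % 4 == 0) {
            unsigned int btn = 1 + rand() % 3;
            XTestFakeButtonEvent(dpy, btn, True, CurrentTime);
            XTestFakeButtonEvent(dpy, btn, False, CurrentTime);
        }

        /* and occasionally a random key press/release */
        if (rand() % 8 == 0) {
            unsigned int kc = 10 + rand() % 80;   /* arbitrary keycode range */
            XTestFakeKeyEvent(dpy, kc, True, CurrentTime);
            XTestFakeKeyEvent(dpy, kc, False, CurrentTime);
        }

        XFlush(dpy);
        usleep(2000);
    }

    XCloseDisplay(dpy);
    return 0;
}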

What could be extremely useful, though, is to use the random generator
only to create the first session, record it with Xnee, and from then on
run Xnee in playback mode. That would remove the need for someone to
drive the test programs by hand in order to generate test reports.

Somehow the workflow may look like this:

- create a test program
- run the program together with Xnee
- run an event generator for a crash test
- if the test fails, look at the report and fix the code;
  otherwise store the session recorded by Xnee
- submit the test program together with the recorded session

On the next build every test program is run with Xnee playing back the
recorded session.
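
For the first step the test programs themselves can stay very small.
Something like the usual XForms skeleton below (just my assumption of
what a first test could look like) already gives Xnee a window and a
button to drive:

/* hello_test.c - minimal XForms test program: one form, one button.
 * Build with something like:  cc hello_test.c -o hello_test -lforms -lX11
 * Xnee can then record and replay the clicks that drive it. */
#include <forms.h>

int main(int argc, char *argv[])
{
    FL_FORM   *form;
    FL_OBJECT *quit;

    fl_initialize(&argc, argv, "XFormsTest", 0, 0);

    form = fl_bgn_form(FL_UP_BOX, 320, 120);
    quit = fl_add_button(FL_NORMAL_BUTTON, 110, 40, 100, 40, "Quit");
    fl_end_form();

    fl_show_form(form, FL_PLACE_CENTER, FL_FULLBORDER, "regression test");

    /* spin the event loop until the Quit button is pressed */
    while (fl_do_forms() != quit)
        ;

    fl_finish();
    return 0;
}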

One small problem we might have with this flow is that not everything
can be triggered by an event generator. For example, there are
functions that open a file for reading and build an object from it,
such as an image. If the image format is not valid the function should
handle it gracefully, but none of the crash-test events will ever
exercise something like this.
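
For cases like that we would need plain, non-interactive checks next to
the Xnee sessions. A rough sketch of what I have in mind, assuming the
flimage_load()/flimage_free() entry points of the bundled image library
(to be adjusted if the actual API differs):

/* bad_image_test.c - non-interactive check: feed a deliberately corrupt
 * file to the image loader and make sure it fails cleanly instead of
 * crashing.  Build with something like:
 *     cc bad_image_test.c -o bad_image_test -lflimage -lforms -lX11
 * (library names and the loader calls are my assumption). */
#include <stdio.h>
#include <stdlib.h>
#include <forms.h>
#include <flimage.h>

int main(int argc, char *argv[])
{
    const char *bad = "corrupt_test.xpm";    /* throw-away fixture file */
    FILE *fp = fopen(bad, "w");

    if (!fp)
        return 2;                            /* could not create the fixture */
    fputs("this is definitely not a valid image\n", fp);
    fclose(fp);

    fl_initialize(&argc, argv, "ImageTest", 0, 0);

    FL_IMAGE *im = flimage_load(bad);        /* assumed loader entry point */
    remove(bad);

    if (im != NULL) {
        fprintf(stderr, "FAIL: loader accepted a corrupt file\n");
        flimage_free(im);
        return 1;
    }

    printf("PASS: corrupt image rejected gracefully\n");
    return 0;
}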

Maybe it is possible to classify functions in such a way that
different types of tests are required. What do you think?

--
PGP Fingerprint DCBE 430E E93D 808B 45E6 7B8A F68D A276 565C 4450
PGP Public Key available at keys.gnugp.net


