Re: [Qemu-devel] [ANNOUNCE] qemu-test: a set of test scripts for QEMU


From: Anthony Liguori
Subject: Re: [Qemu-devel] [ANNOUNCE] qemu-test: a set of test scripts for QEMU
Date: Thu, 29 Dec 2011 18:33:32 -0600
User-agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.23) Gecko/20110922 Lightning/1.0b2 Thunderbird/3.1.15

On 12/29/2011 05:17 PM, Lucas Meneghel Rodrigues wrote:
On 12/29/2011 03:02 PM, Anthony Liguori wrote:
On 12/29/2011 10:53 AM, Avi Kivity wrote:
On 12/29/2011 06:39 PM, Anthony Liguori wrote:

It might have made sense to split the kvm-testing functionality of
autotest, and have autotest drive that. We could even have called it
qemu-test.

I specifically advocated this during Lucas' KVM Forum talk and he was
strongly opposed to it.

Ok, this is a point where I have failed in communicating things.

I've watched the presentation on video just to see if my memory was not
betraying me, and I wouldn't say I was 'strongly opposed'; it's just that there
are some practical problems, not impossible to overcome of course, but worth
thinking about.

So I decided to do some snooping.  Here are some stats:

address@hidden:~/git/autotest/client/tests/kvm/tests$ wc -l *.py
   150 balloon_check.py
    68 boot_savevm.py
   190 cdrom.py
  1875 cgroup.py
   111 cpu_hotplug.py
   170 enospc.py
    71 floppy.py
    72 getfd.py
    89 hdparm.py
     0 __init__.py
   615 ksm_overcommit.py
   107 migration_multi_host.py
   117 migration.py
    85 migration_with_file_transfer.py
    43 migration_with_reboot.py
   138 multi_disk.py
    62 nic_bonding.py
   104 nic_hotplug.py
    60 nmi_watchdog.py
   203 pci_hotplug.py
   182 physical_resources_check.py
   439 qemu_img.py
    48 qemu_iotests.py
   407 qmp_basic.py
   389 qmp_basic_rhel6.py
    54 set_link.py
    67 smbios_table.py
   356 stepmaker.py
   247 steps.py
    43 stop_continue.py
    31 system_reset_bootable.py
   181 timedrift.py
    96 timedrift_with_migration.py
    91 timedrift_with_reboot.py
   103 timedrift_with_stop.py
    28 unittest_kvmctl.py
   121 unittest.py
    90 usb.py
  2174 virtio_console.py
    85 vmstop.py
  9562 total

address@hidden:~/git/autotest/client/tests$ git log --format="%an <%ae>" kvm | sort -u

Amos Kong <address@hidden>
Chris Evich <address@hidden>
Cleber Rosa <address@hidden>
Jerry Tang <address@hidden>
lmr <address@hidden>
Lucas Meneghel Rodrigues <address@hidden>
Lucas Meneghel Rodrigues <address@hidden>
Lukas Doktor <address@hidden>
mbligh <address@hidden>
Onkar N Mahajan <address@hidden>
pradeep <address@hidden>
Qingtang Zhou <address@hidden>
Thomas Jarosch <address@hidden>
Yiqiao Pu <address@hidden>

Which leads me to the following conclusions:

1) No one outside of autotest developers is actually contributing tests to kvm-autotest.

2) Most of the tests are relatively small and simple enough that they could be trivially ported to a stand-alone utility. For instance, looking at pci_hotplug.py, it just executes monitor commands and does basic PCI enumeration in the guest.

I don't see a lot of pitfalls here. At the end of the day, we're not talking about all that much code.
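
To make that concrete, a stand-alone version of something like pci_hotplug could be little more than a short script driving QEMU's QMP socket directly. The sketch below is only an illustration, not code taken from either project: the socket path and device id are made up, and a real test would also verify the enumeration from inside the guest.

#!/usr/bin/env python
# Illustrative sketch of a stand-alone PCI hotplug check driven over QMP.
# The socket path and device id are assumptions for the example.
import json
import socket
import sys

QMP_SOCKET = "/tmp/qemu-test-qmp.sock"   # e.g. QEMU started with -qmp unix:...,server

def qmp_command(f, cmd, **args):
    """Send one QMP command and return its 'return' value, skipping events."""
    msg = {"execute": cmd}
    if args:
        msg["arguments"] = args
    f.write(json.dumps(msg) + "\n")
    f.flush()
    while True:
        reply = json.loads(f.readline())
        if "return" in reply:
            return reply["return"]
        if "error" in reply:
            raise RuntimeError(reply["error"]["desc"])

def main():
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.connect(QMP_SOCKET)
    f = sock.makefile("rw")            # text-mode wrapper around the socket
    json.loads(f.readline())           # consume the QMP greeting banner
    qmp_command(f, "qmp_capabilities")

    # Hot-plug a NIC, check it shows up in query-pci, then remove it again.
    qmp_command(f, "device_add", driver="virtio-net-pci", id="hotplug-nic0")
    devices = qmp_command(f, "query-pci")[0]["devices"]
    found = any(dev.get("qdev_id") == "hotplug-nic0" for dev in devices)
    qmp_command(f, "device_del", id="hotplug-nic0")

    print("PASS" if found else "FAIL")
    return 0 if found else 1

if __name__ == "__main__":
    sys.exit(main())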


* We rely on a fair bit of the autotest libraries/API, so extracting the APIs we
rely on to start a new project would create some code duplication, and
updates/bugfixes to the autotest API wouldn't be easily carried over to the new
project.
* I'm not sure that this per se would fix the perceived problems with the tests.
It'd be nearly the same code, just outside the autotest tree; I fail to see how
that would be much better and therefore justify the work of doing this. Of
course I might be wrong.

It's not about where they live, it's about the ability to execute them in a simple and direct fashion.


I think kvm autotest would get a lot more interest if the test cases
were pulled out of autotest and made more stand-alone. They should also be
more Unix-like: divided into individual executables with independent
command-line options rather than a single do-everything configuration file.

You have a point with regards to making the test cases more stand-alone, but
it's not as easy as one would think, mainly because of the large number of
user-configurable options that we have to pass to the tests.

Configuration options are really the devil when it comes to testing. As a developer, I want my test suite to just run and tell me if I have bugs. I have no interest in configuring it to run a certain way. I don't want to think that much about what I'm testing, I just want the tests to run and tell me if I screwed something up.
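
In practice that could mean each test ships as its own small executable with a handful of options that all have sensible defaults, so the common case is just running it with no arguments at all. A rough sketch of such an entry point, with option names and defaults invented for the example:

#!/usr/bin/env python
# Illustrative entry point for a stand-alone test: a few command-line
# options with sensible defaults instead of a central configuration file.
# The option names and defaults here are invented for the example.
import argparse
import sys

def parse_args():
    parser = argparse.ArgumentParser(description="Stand-alone NIC hotplug test")
    parser.add_argument("--qemu", default="qemu-system-x86_64",
                        help="QEMU binary to test (default: %(default)s)")
    parser.add_argument("--image", default="test-guest.qcow2",
                        help="guest disk image (default: %(default)s)")
    parser.add_argument("--timeout", type=int, default=120,
                        help="seconds to wait for the guest (default: %(default)s)")
    return parser.parse_args()

def main():
    args = parse_args()
    # A real test would boot the guest here, drive QMP and report PASS/FAIL;
    # this only shows the shape of the command line.
    cmd = [args.qemu, "-hda", args.image, "-nographic"]
    print("would run:", " ".join(cmd))
    return 0

if __name__ == "__main__":
    sys.exit(main())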

Regards,

Anthony Liguori

This is a place where
the config file format shines, since we just have to override some parameters in
different variants to get a test working across Windows and Linux guests, for
example, all expressed in a concise way. The config files might look enormous,
but they are actually pretty small compared with the number of test combinations
we can get out of them. This is nothing I'd call insurmountable either.
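
For example, the variant mechanism boils down to overriding a handful of keys per guest type, roughly along these lines (the key names below are just illustrative, not copied from the real config files):

variants:
    - linux:
        shutdown_command = shutdown -h now
        file_transfer_client = scp
    - windows:
        shutdown_command = shutdown /s /t 0
        file_transfer_client = rss

Everything below the variants block applies to all of them, so supporting another guest type mostly means adding one more variant rather than duplicating the test definition.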

If I get more feedback from people saying "Ok, this would make things better for
me and make me more interested", then we'll go that route. I'm sorry if I gave
the impression I was strongly opposed to your idea.




