Is it possible to have unittest run each test in a separate process? That way we don't have to worry about bad memory accesses in one test crashing Python and messing up later tests (and hence can leave failed tests in until we can actually fix them). We are kind of abusing unittest (by using it to test C++ code) so it may not be easy to do.
Daniel Russel wrote:
> Is it possible to have unittest run each test in a separate process?
> That way we don't have to worry about bad memory accesses in one test
> crashing Python and messing up later tests (and hence can leave failed
> tests in until we can actually fix them). We are kind of abusing
> unittest (by using it to test C++ code) so it may not be easy to do.
It is possible, but it'd be ugly, and I don't think it would be an 'improvement', since it would hide a multitude of sins (the OS does a lot of cleanup at process end which hides various bugs). You can always run your tests individually if you so desire. But the nightly builds do not complete if any of the tests fail, regardless of whether they're run in separate processes, and then people queue up at my desk the next day to complain about New Feature X not being usable yet in the nightlies. That's the primary reason why I insist on things not being broken for more than a day or two at a time.
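(For what it's worth, a rough sketch of the "one process per test" approach, assuming a reasonably recent Python where individual tests can be invoked by name with "python -m unittest"; the test IDs below are made up:)

    import subprocess

    # Hypothetical test IDs; in practice these would be discovered from
    # the test modules rather than listed by hand.
    test_ids = ["test_particles.Tests.test_add",
                "test_particles.Tests.test_evaluate"]

    failed = []
    for test_id in test_ids:
        # Each test gets a fresh interpreter, so a crash (e.g. a bad
        # memory access in the C++ extension) only takes down that test.
        if subprocess.call(["python", "-m", "unittest", test_id]) != 0:
            failed.append(test_id)

    print("%d of %d tests failed: %s" % (len(failed), len(test_ids), failed))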
The Python unittest tests are not for testing C++ code, but the Python interface. If you want to write tests for something which is not exposed to Python, it makes more sense to write C++ tests rather than exposing it just so you can test it. I seem to recall that Boost has a unittest framework, which would probably be the best thing to use here. We don't have any C++ tests yet because nobody has written them, but I'm certainly not averse to them in any way. Of course, if the functionality *is* exposed to Python, you'll need a Python testcase, at a minimum to make sure the SWIG wrapping is working correctly, so you may as well use this Python testcase to also test the underlying C++ code unless you want to have a lot of overlap in the testcases.
Ben
On May 29, 2008, at 9:50 AM, Ben Webb wrote:
> But the nightly builds do not complete if any of the tests fail,
> regardless of whether they're run in separate processes, and then
> people queue up at my desk the next day to complain about New Feature X
> not being usable yet in the nightlies.

Ahhh.
That said, it is annoying to have to turn off a test when we discover a bug which is not going to be fixed immediately. It should just stay as a failed test as a reminder of non-working functionality (after all, it is still a bug). If it weren't for the memory corruption issue, we could presumably just tell Python's unittest how many tests we expect to fail and have it modify the return state accordingly. At the very least, such bugs should be stuck somewhere on the wiki (or the doxygen todo list) rather than just working around the failure, but I would much rather have a test failure.
> The Python unittest tests are not for testing C++ code, but the Python
> interface. If you want to write tests for something which is not exposed
> to Python, it makes more sense to write C++ tests rather than exposing
> it just so you can test it. I seem to recall that Boost has a unittest
> framework, which would probably be the best thing to use here.

So where do I put C++ tests when I have written them? I currently just delete them when I am done. As a note, I mostly test in C++ until I have removed the obvious bugs and then convert the tests to Python, since it is hard to debug across SWIG.
I don't think we really want to have two separate test frameworks (or rather, I don't think you really want to have two separate test frameworks :-)
> Of course, if the functionality *is* exposed to Python, you'll need a
> Python testcase, at a minimum to make sure the SWIG wrapping is working
> correctly, so you may as well use this Python testcase to also test the
> underlying C++ code unless you want to have a lot of overlap in the
> testcases.
Is it really worth writing code just to make sure that SWIG works properly? Anything where we have code in IMP.i which modifies return types or memory management needs to be tested in Python, but I don't see the point of just checking that each, more or less identical, C++ function call is handled properly by SWIG.
Daniel Russel wrote:
> If it weren't for the memory corruption issue, we could presumably just
> tell Python's unittest how many tests we expect to fail and have it
> modify the return state accordingly.
I don't believe you can do that, actually - at least, not without writing your own subclass to do it.
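(Something along these lines is presumably what such a subclass would look like - just a sketch; the class name and the choice of two allowed failures are made up, and it assumes a unittest version that supports TextTestRunner's resultclass argument:)

    import unittest

    class ToleratingTestResult(unittest.TextTestResult):
        # Number of known-broken tests we are willing to ignore when
        # deciding whether the run as a whole "passed".
        allowed_failures = 2

        def wasSuccessful(self):
            return (len(self.failures) + len(self.errors)
                    <= self.allowed_failures)

    if __name__ == '__main__':
        runner = unittest.TextTestRunner(resultclass=ToleratingTestResult)
        unittest.main(testRunner=runner)

(Newer versions of unittest also provide an expectedFailure decorator for marking an individual known-broken test, which gets at the same idea without a custom result class.)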
> At the very least, such bugs should be stuck somewhere on the wiki (or
> the doxygen todo list) rather than just working around the failure, but
> I would much rather have a test failure.
Sure, but code that triggers a test failure should not be committed - or, at least, not unless it's going to be fixed within a day or two. Lab culture is to expect SVN to be usable "most of the time".
We already have Bugzilla set up for Modeller bug tracking. It would be trivial for me to add an IMP component if people had bugs to put in it.
> So where do I put C++ tests when I have written them?
There currently isn't anywhere, so you'd just send them to me. Then I have to figure out how to integrate them into the build system.
> I don't think we really want to have two separate test frameworks (or
> rather, I don't think /you/ really want to have two separate test
> frameworks :-)
Indeed. If people are going to be using the functionality in Python, I'd much rather it were tested there, since there are many areas of the SWIG wrapping that can introduce problems (one example is the correct mapping of C++ operator overloads to the corresponding special Python methods - e.g. const operator[] to __getitem__ and non-const operator[] to __setitem__). C++ unittests probably only make sense for internal stuff (which seems to be largely covered by IMP_check and friends anyway) and things that are tough to check in Python, such as C++ reference counting.
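(As an illustration, a test for that operator[] mapping might look roughly like the following - IMP.Vector3D and its constructor are just stand-ins for any wrapped class with const and non-const operator[] overloads:)

    import unittest
    import IMP

    class IndexingTests(unittest.TestCase):
        def test_getitem_setitem(self):
            # const operator[] should map to __getitem__ and the
            # non-const overload to __setitem__.  Vector3D here is only
            # an example; substitute any wrapped type that exposes
            # operator[].
            v = IMP.Vector3D(1.0, 2.0, 3.0)
            self.assertEqual(v[1], 2.0)
            v[1] = 5.0
            self.assertEqual(v[1], 5.0)

    if __name__ == '__main__':
        unittest.main()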
Ben