Hi all,
I'm wondering how hard it would be to generate timings for the tests.
The idea is that each time the unit tests are run on the build machine, I'd like to be able to tell that a) the tests are all passing and b) the tests didn't impact performance compared to the previous build.
Has anyone worked on something like this?
Hi,
I was thinking that one way to do this would be with a new TestDecorator subclass, e.g. with a run method like:
void
TimedTest::run( TestResult *result )
{
    StopWatch watch;                  // some stop-watch helper class (not part of CppUnit)
    TestDecorator::run( result );     // run the decorated test as usual
    m_time = watch.lap();             // record the elapsed time for later reporting
}
This could then be combined with the existing RepeatedTest decorator for average timings.
The question then is how to log the timings. Rather than expose the timing (and possibly other benchmarks) in the Test interface, I'd suggest using a visitor for output logging?
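For instance, here is a minimal, untested sketch of how the two decorators might be combined. TimedTest's constructor, its time() accessor, and ExampleTest::suite() are all assumptions for illustration; only RepeatedTest and TestResult come from the distribution:

#include <cppunit/TestResult.h>
#include <cppunit/extensions/RepeatedTest.h>

void runTimedSuite()
{
    CppUnit::TestResult result;

    // Repeat the suite 10 times and time the whole repeated run.
    TimedTest timed( new CppUnit::RepeatedTest( ExampleTest::suite(), 10 ) );
    timed.run( &result );

    double averagePerRun = timed.time() / 10;   // hypothetical time() accessor
    // ... report averagePerRun however suits the build log
}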
I'll have a go and let you know if it works ;)
Keep us informed of your progress, looks promising :)
In the end I implemented timing directly in Test/TestCase.
It seems reasonable to me that every test run should be timed - though not necessarily always reported.
That question is what decides between using a decorator and modifying Test/TestCase directly.
So far I have only implemented timing output in TextOutputter, by supplying a bool flag to print details of passing tests (timing is not much use if the test fails).
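(Roughly, from the caller's side, the idea is something like the sketch below. The setPrintPassedTests() setter is a made-up name standing in for my flag, not the actual change; the constructor and write() are the existing TextOutputter interface.)

#include <iostream>
#include <cppunit/TestResultCollector.h>
#include <cppunit/TextOutputter.h>

void report( CppUnit::TestResultCollector &collector )
{
    CppUnit::TextOutputter outputter( &collector, std::cout );
    outputter.setPrintPassedTests( true );   // hypothetical flag: also list passing tests,
                                             // which is where per-test timings appear
    outputter.write();
}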
Also, my changes are to a fairly badly hacked (by me, that is ;) version of the 1.7.3 distribution.
What do you think about decorator vs modifying Test/TestCase?
Would you like me to send my changes as they are?
I guess you'd prefer files that build against the original distribution? If so, would you like the 1.6.2 or 1.7.3 version of the files?
Best Regards
--
Jake
The easiest way to add timing for tests is to use a TestListener.
Here is an example I just added to the documentation:
#include <cppunit/TestListener.h>
#include <cppunit/Test.h>
#include <time.h> // for clock()

class TimingListener : public CppUnit::TestListener
{
public:
  void startTest( CppUnit::Test *test )
  {
    _chronometer.start();
  }

  void endTest( CppUnit::Test *test )
  {
    _chronometer.end();
    addTest( test, _chronometer.elapsedTime() );
  }

  // ... (interface to add/read test timing result)

private:
  class Clock
  {
  public:
    Clock() : _startTime( 0 ), _endTime( 0 ) {}

    void start()
    {
      _startTime = clock();
    }

    void end()
    {
      _endTime = clock();
    }

    double elapsedTime() const
    {
      return double( _endTime - _startTime ) / CLOCKS_PER_SEC;
    }

  private:
    clock_t _startTime, _endTime;
  };

  Clock _chronometer;
};
---
To register it:
TimingListener timing;
runner.eventManager().addListener( &timing );
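(For context, a minimal, untested sketch of wiring this into a text runner, assuming the tests are registered with the TestFactoryRegistry; header paths may differ between CppUnit versions:)

#include <cppunit/extensions/TestFactoryRegistry.h>
#include <cppunit/ui/text/TestRunner.h>

int main()
{
    CppUnit::TextUi::TestRunner runner;
    runner.addTest( CppUnit::TestFactoryRegistry::getRegistry().makeTest() );

    TimingListener timing;                          // the listener defined above
    runner.eventManager().addListener( &timing );   // receives startTest/endTest calls

    bool ok = runner.run();
    // timing now holds the per-test durations collected during the run
    return ok ? 0 : 1;
}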
Baptiste.
Looks good - but I'm not too sure about the nested Clock class ;)
Don't you also need to change the outputters to (optionally) output the tests that pass, and to access the listener timings?
Or is it up to the client to deal with the listener after runner.run() is called?
Is this listener something you'd consider adding to the distribution?
Best Regards
--
Jake
I wouldn't dare to assume how timing will be used. Timing tests is already a "perversion" ;-).
The easiest would probably be to extend TestResultCollector and add timing collection (yuk).
Then, extend XmlOutputter (which outputs all tests) and print the timings...
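(A rough, untested sketch of what such an extended collector could look like. The class name, the testTime() accessor, and the map of times are invented for illustration; only TestResultCollector, Test::getName() and clock() are existing API. The XmlOutputter side is left out.)

#include <cppunit/Test.h>
#include <cppunit/TestResultCollector.h>
#include <time.h>
#include <map>
#include <string>

// Collects the usual results plus a clock() time per test.
class TimedResultCollector : public CppUnit::TestResultCollector
{
public:
  void startTest( CppUnit::Test *test )
  {
    CppUnit::TestResultCollector::startTest( test );
    _startTime = clock();
  }

  void endTest( CppUnit::Test *test )
  {
    CppUnit::TestResultCollector::endTest( test );
    _times[ test->getName() ] = double( clock() - _startTime ) / CLOCKS_PER_SEC;
  }

  // Invented accessor: seconds spent in the named test, 0 if unknown.
  double testTime( const std::string &name ) const
  {
    std::map<std::string, double>::const_iterator it = _times.find( name );
    return it == _times.end() ? 0.0 : it->second;
  }

private:
  clock_t _startTime;
  std::map<std::string, double> _times;
};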
Baptiste.