## Re: [sdcc-devel] Regression test ticks

**From:** Bernhard Held - 2006-10-20 09:59:27

> Please, don't startle when you see the regression test tick counts for the
> mcs51 as of today. I've increased the baudrate from 9600 to 57600 and now
> the outcome is about 5x less ticks! All output is framework related and
> not functional to the tests anyway. I call it overhead.

Many thanks for your optimizations, they make testing life much easier.

I've been unhappy with the summary for quite some time. Today I made a small test: first I ran the regression tests with today's svn:

> Summary for 'mcs51': 0 failures, 4843 tests, 630 test cases, 889592 bytes, 143378976 ticks

Then I added an empty dummy test (see below) in order to measure the overhead:

> Summary for mcs51: 0 failures, 4844 tests, 631 test cases, 890706 bytes, 143616060 ticks

The number of bytes increased by 1114, and the number of ticks increased by 237084. Now we know the overhead of an empty test. Let's calculate the average size and tick count per test case:

    889592 / 630 = 1412 (bytes)
    143378976 / 630 ≈ 227586 (ticks)

In this simple approach the average tick count is even smaller than the number of additional ticks needed by dummy.c. Most probably it's necessary to add more tests to evaluate separate numbers for a "test" and a "test case". A further (inexact) result is that about 80% of the average test size (1114 / 1412) is overhead.

The numbers in the regression test results are obviously mostly determined by the overhead. Therefore they are not very useful when evaluating optimizations. I would love to see the numbers in the results minus the overhead. I'll supply the dummy tests and the correct mathematics if somebody helps me with the Python script "collate-results.py".

Bernhard

```c
/* dummy test for evaluation of overhead */
#include <testfwk.h>   /* SDCC regression test framework; defines ASSERT */

void
testDummy(void)
{
    ASSERT(1);
}
```
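The overhead correction proposed above can be sketched in a few lines of Python. This is only a hypothetical illustration of the arithmetic in the mail, not the actual `collate-results.py`; the function and variable names are invented:

```python
# Sketch of the overhead estimate from the mail: run the suite once,
# run it again with one extra empty dummy test, and take the delta
# as the per-test-case overhead.

def overhead_per_case(base, with_dummy):
    """Per-case overhead = difference between the two summary runs."""
    return {k: with_dummy[k] - base[k] for k in ("cases", "bytes", "ticks")}

# Summary figures quoted in the mail for the mcs51 port.
base = {"cases": 630, "bytes": 889592, "ticks": 143378976}
with_dummy = {"cases": 631, "bytes": 890706, "ticks": 143616060}

ov = overhead_per_case(base, with_dummy)
# ov["bytes"] == 1114, ov["ticks"] == 237084

avg_bytes = base["bytes"] // base["cases"]   # 1412 bytes per test case
avg_ticks = base["ticks"] // base["cases"]   # 227585 ticks per test case

# Naively subtracting the dummy overhead from every test case:
net_ticks = base["ticks"] - base["cases"] * ov["ticks"]
# net_ticks comes out negative, which is exactly the anomaly the mail
# points out: one dummy test overestimates the per-case tick overhead,
# so more dummy tests are needed to separate "test" and "test case" costs.
print(ov, avg_bytes, avg_ticks, net_ticks)
```

Running this reproduces the mail's observation that the single-dummy estimate of the tick overhead exceeds the average per-case tick count.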