From: Waylan L. <wa...@gm...> - 2008-08-23 03:59:10
Just an FYI to everyone. I've come to realize more than ever today that just because the tests included in the repo are passing does not mean *anything*. If you look at the numerous commits I've made today, most are changes to the tests themselves. I've spent hours fixing bad tests and only minutes on actual bugs in the code.

It seems that when the tests were altered to fit the output of ElementTree, a large number of new bugs were introduced into the tests. So while it looks like everything is OK, I've found a few glaring blunders. The only way to catch them is either to do your own testing outside the provided tests and fix the bugs - then run the tests and fix all the tests that fail - or to review each and every test by hand and try to spot problems.

I've been doing the former, and in the process I've looked at a few tests and discovered additional problems. When I have, I've been updating the tests to what they should be *even if I haven't fixed the bug*. That means those tests are still failing. Please don't change the test - fix the bug so the test passes. Or, if you disagree with what the test should expect, bring it up here for discussion. Even better, spend some time reviewing the tests for more errors and submit patches. :-)

--
----
Waylan Limberg
wa...@gm...
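For anyone doing that kind of outside testing, the check boils down to rendering each test input and diffing it against the stored expected output. A minimal sketch of such a check - the file layout (a .txt input next to its .html expected output) and the helper name are illustrative, not the project's actual test runner:

    import difflib
    import os

    import markdown  # the python-markdown module under test

    def check_file(txt_path):
        """Render a .txt test file and diff it against the stored .html output."""
        html_path = os.path.splitext(txt_path)[0] + '.html'
        with open(txt_path) as f:
            source = f.read()
        with open(html_path) as f:
            expected = f.read()
        actual = markdown.markdown(source)
        if actual.strip() == expected.strip():
            return True
        print(txt_path, 'FAILED')
        for line in difflib.unified_diff(expected.splitlines(),
                                         actual.splitlines(),
                                         'expected', 'actual', lineterm=''):
            print(line)
        return False

When a check like this fails, the diff tells you whether the code or the stored .html expectation is wrong - which is exactly the judgment call being discussed in this thread.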
From: Artem Y. <ne...@gm...> - 2008-08-23 09:23:30
Waylan Limberg wrote:
> Just an FYI to everyone. I've come to realize more than ever today
> that just because the tests included in the repo are passing does not
> mean *anything*. If you look at the numerous commits I've made today,
> most are changes to the tests themselves. I've spent hours fixing bad
> tests and only minutes on actual bugs in the code.
>
> It seems that when the tests were altered to fit the output of
> ElementTree, a large number of new bugs were introduced into the
> tests. So while it looks like everything is OK, I've found a few
> glaring blunders. The only way to catch them is either to do your own
> testing outside the provided tests and fix the bugs - then run the
> tests and fix all the tests that fail - or to review each and every
> test by hand and try to spot problems.
>

Hmm, the test modification was done this way: I changed the output of NanoDOM without any changes to the core, then ran through all the tests and wrote the results to separate files. So I think problems from switching to ElementTree could only exist in TextPostprocessors. We have only 2 TextPostprocessors, so I don't think there are a lot of problems from this issue.

Also, before porting to ElementTree and modifying the tests, I found some incorrect tests too; most of them concerned inline patterns.

And now I realize why changing the NanoDOM output changed this output from <hr /> to <p><hr /></p> :) I even created a branch called stripped-dom to find out what the problem was, but later I decided to add a Markdown._processHR function.
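I haven't reproduced the actual Markdown._processHR code here, but the underlying idea - making sure a horizontal rule doesn't end up wrapped in a paragraph - can be sketched as a small ElementTree pass. The function name and traversal below are mine, not the real implementation:

    from xml.etree.ElementTree import fromstring, tostring

    def unwrap_hr(root):
        """Replace any <p> whose only content is an <hr /> with the <hr /> itself."""
        for parent in list(root.iter()):
            for i, child in enumerate(list(parent)):
                if (child.tag == 'p' and len(child) == 1
                        and child[0].tag == 'hr'
                        and not (child.text or '').strip()
                        and not (child[0].tail or '').strip()):
                    hr = child[0]
                    hr.tail = child.tail   # keep whatever text followed the paragraph
                    parent[i] = hr         # promote the hr, dropping the <p> wrapper

    root = fromstring('<div><p><hr /></p><p>text</p></div>')
    unwrap_hr(root)
    print(tostring(root).decode())   # <div><hr /><p>text</p></div>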
From: Artem Y. <ne...@gm...> - 2008-08-23 20:30:50
Waylan Limberg wrote:
> I've been doing the former, and in the process I've looked at a few
> tests and discovered additional problems. When I have, I've been
> updating the tests to what they should be *even if I haven't fixed the
> bug*. That means those tests are still failing. Please don't change
> the test - fix the bug so the test passes. Or, if you disagree with
> what the test should expect, bring it up here for discussion.
>

I fixed the bugs; all tests except codehilite and wikilink pass now. By the way, the same problems exist in Markdown 1.7, so they're legacy problems.
From: Waylan L. <wa...@gm...> - 2008-08-23 22:47:58
On Sat, Aug 23, 2008 at 4:30 PM, Artem Yunusov <ne...@gm...> wrote:
> Waylan Limberg wrote:
>> I've been doing the former, and in the process I've looked at a few
>> tests and discovered additional problems. When I have, I've been
>> updating the tests to what they should be *even if I haven't fixed the
>> bug*. That means those tests are still failing. Please don't change
>> the test - fix the bug so the test passes. Or, if you disagree with
>> what the test should expect, bring it up here for discussion.
>>
>
> I fixed the bugs; all tests except codehilite and wikilink pass now. By
> the way, the same problems exist in Markdown 1.7, so they're legacy problems.
>

Thanks Artem. Well, yes, the bug you fixed was legacy, but the ones I fixed yesterday were not. See commit fa014ac. I only changed one line in markdown.py, yet the patch included 956 lines changed in 18 files. The rest were all changes to the tests - tests that used to be right back in 1.7.

--
----
Waylan Limberg
wa...@gm...
From: Yuri T. <qar...@gm...> - 2008-08-26 07:45:47
Waylan, thanks for noticing this! How confident do you feel about the tests at this point? Perhaps we should do more test testing.

I just tried running John Gruber's test suite (MarkdownTest.pl) with the --tidy option, and it appears that the current version does pass the old version of the test suite (from 2004), which is what I used before writing test_markdown.py. However, it fails some of the new ones. This is not a regression, since those represent the more recent changes to Markdown.pl and we most probably never passed them.

Just so that we are all on the same page, I committed those tests into our tree. I also modified MarkdownTest.pl to work with the version of tidy that came with Ubuntu 7.10.

To run the older tests (this will pass):

    perl MarkdownTest/MarkdownTest.pl --tidy --script ./markdown.py --testdir MarkdownTest/Tests_2004

To run the newer tests (some of these will fail):

    perl MarkdownTest/MarkdownTest.pl --tidy --script ./markdown.py --testdir MarkdownTest/Tests_2007

- yuri

>> I fixed the bugs; all tests except codehilite and wikilink pass now. By
>> the way, the same problems exist in Markdown 1.7, so they're legacy problems.
>>
> Thanks Artem. Well, yes, the bug you fixed was legacy, but the ones I
> fixed yesterday were not. See commit fa014ac. I only changed one line
> in markdown.py, yet the patch included 956 lines changed in 18 files.
> The rest were all changes to the tests - tests that used to be right
> back in 1.7.

--
http://sputnik.freewisdom.org/
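The --tidy switch in the commands above passes both the script's output and the expected output through HTML Tidy before comparing, so pure formatting differences don't register as failures. A rough Python equivalent of that normalization step - the tidy flags here are my best recollection of what MarkdownTest.pl passes and may need adjusting for your tidy version:

    import subprocess

    def tidy_normalize(html):
        """Pass HTML through tidy so formatting differences don't count as diffs."""
        proc = subprocess.Popen(
            ['tidy', '--show-body-only', '1', '--quiet', '1', '--show-warnings', '0'],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE)
        out, _ = proc.communicate(html.encode('utf-8'))
        return out.decode('utf-8').strip()

    # Two outputs are treated as equivalent when their tidied forms match:
    # tidy_normalize(markdown_output) == tidy_normalize(expected_html)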