
Coders, please test code using regression tests

Developers
2005-06-14
2013-10-17
  • Harry Mangalam

    Harry Mangalam - 2005-06-14

    Hi All,

    Charlie and I have been bashing on the regression tests to make sure that they catch more errors - and now they do :).  Charlie's recent thread about the benchmarks and file-creation tests turned up a bug that I created/let through by not using the most recent version of ncap.  That is fixed and should be committed later today (and the tests now show a huge improvement over the values Charlie mentioned - thanks largely to Henry's swift 2xpass-in-memory).

    The current regression tests rely on checking single values after a series of NCO manipulations, which can let some oddities through, so we decided to be a bit more careful about this.
    We're now using MD5 checksums of the entire output files and word counts of chunks of text dumps to add two more levels of error checking.  That has already pointed out some interesting things that we're still figuring out (some MD5 differences between output files generated on 64-bit machines vs. 32-bit ones, even though the single-value tests signal OK).
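
    To illustrate the idea, here is a minimal sketch in Python (not the actual test harness; the file names and the expected checksum are made up) of the two new kinds of check: an MD5 checksum of the whole output file, plus a word count of a text dump of it.

        import hashlib

        def md5_of_file(path):
            """Return the MD5 hex digest of an entire file."""
            h = hashlib.md5()
            with open(path, 'rb') as f:
                for chunk in iter(lambda: f.read(65536), b''):
                    h.update(chunk)
            return h.hexdigest()

        # Hypothetical output file and known-good checksum
        observed = md5_of_file('out.nc')
        expected = '0123456789abcdef0123456789abcdef'
        print('MD5 check:', 'OK' if observed == expected else 'MISMATCH')

        # Coarser check: word count of a text dump of the same file
        # (e.g., an ncks dump redirected to out.txt beforehand)
        with open('out.txt') as f:
            n_words = len(f.read().split())
        print('word count:', n_words)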

    However, in verifying that the regression tests and benchmarks worked correctly on the local platforms, I noticed that a recent build of ncap failed some of the regression tests on the ESMF and Opteron platforms.  Charlie thought that this may have been the result of some of the recent changes that Henry committed.  The errors only show up on those 64-bit systems, so they would have been missed by a regression test on his local 32-bit platform.

    So this is a warning/plea to all code contributors to run your changes past the new (and as of this writing, uncommitted) regression tests on the x86-32, AMD64, and ESMF platforms.  It's probably also a good idea to run them automagically every day to make sure nothing has gone bump in the night - I'll work on this.

    Thanks
    Harry

    • Charlie Zender

      Charlie Zender - 2005-06-14

      Hi All,

      I want to chime in on Harry's request.
      Fortunately NSF funded NCO for rapid growth this year.
      We have more opportunity for progress...and mutual interference.
      With four people actively developing code and committing,
      there will be conflicts.
      Let's adopt some ground rules to keep things humming :)

      1. Use doc/ChangeLog to outline your changes
         This is, I think, the best way to keep others apprised.
         This documentation in ChangeLog never goes away and is always
         close to the source code. Easy to find, date-ordered.
         Emacs: C-x 4 a "Type your changes" C-x C-c
         (A sample entry appears below.)
         If you commit changes without updating the ChangeLog, then the
         rest of us have no idea what to expect when we do a CVS update.

      2. Test your commits with the regression tests (e.g., make tst)
         Whenever possible, refrain from committing code that causes
         a new regression (keep working on it in your private tree).
         If you want to commit code that breaks something...fine, if you
         A. Modify the regression test to state when breakage is expected
         B. Send a message explaining the reason for the new breakage
         (Sometimes breakage is needed to get to the next stage,
         e.g., features that should fail but don't).

      Things we might consider:
      3. Adding automatic notification of CVS commits?
         This would help people keep track of what others commit.
         It's relatively high noise, though, since usually you don't care
         about most individual commits.

      4. Nightly builds/regression testing

      If we can all get on board with numbers 1 and 2, then we can
      avoid/postpone 3 and 4.
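
      As a concrete illustration of rule 1, an entry made with the Emacs
      binding above follows the usual GNU ChangeLog format: a header line
      with the date, your name, and e-mail address, then one indented item
      per changed file.  The name, address, and file name below are made up:

          2005-06-14  Your Name  <you@example.com>

                  * somefile.c (some_function): Describe what changed and
                  why, e.g., check whole-file MD5 checksums in the
                  regression tests rather than single values.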

      Thanks,
      Charlie

