Quality Guidelines

Andreas Mangold

Before releasing a new version, the following tests have to be run successfully. All deviations have to be documented.

For all the following checks, use the most current version of IE9 unless stated otherwise.

For each run of SASUnit, make sure that all other SAS sessions have been closed beforehand; otherwise, warnings will appear in the logs.
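SASUnit evaluates the logs itself, but the logs that must be clean (such as run_all.log) can also be scanned quickly by hand. A minimal sketch, assuming the usual ERROR:/WARNING: line prefixes that SAS writes to its logs:

```python
import re

def find_issues(log_text):
    """Collect lines a SAS log flags as ERROR or WARNING."""
    pattern = re.compile(r"^(ERROR|WARNING)[ :]")
    return [line for line in log_text.splitlines() if pattern.match(line)]

# Illustrative log content, not taken from an actual SASUnit run.
sample = (
    "NOTE: The SAS System used 0.01 seconds.\n"
    "WARNING: Apparent symbolic reference X not resolved.\n"
    "ERROR: File WORK.DOESNOTEXIST.DATA does not exist.\n"
)
print(find_issues(sample))
```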

  • On the first (baseline) platform (Windows 7 64 bit / SAS 9.3 64 bit, OS settings English)

    • run example project test suite in overwrite-mode in English
      Please note that the purpose of this project is to demonstrate SASUnit functionality to the end user (i.e. the SAS programmer who wants to test their programs) and to serve as a starting point for their own tests. When comparing boxplot results, note that, as with any report inspection, there may be small visual differences between the actual and the expected outcome.

      • check test scenario page:
        • All scenarios present? Check with scenario programs.
        • Result as expected for each scenario? Result must be red where indicated in test scenario description, otherwise green or white!
        • All scenario logs with exactly those errors and warnings expected from the various test cases?
        • Links to test scenario details and programs functional? Check a small random sample!
        • Correct tooltips on mouse hover for each column except duration? Check a small random sample!
        • Correct overall appearance?
      • check test cases page
        • All scenarios present? Check with scenario programs.
        • Info and Links in header same as in scenario table? Check for a small sample!
        • Links to test case detail pages functional? Check for a small sample!
        • Links to units under test functional? Check for a small sample!
        • Links to logs functional? Check for a small sample!
        • Correct tooltips on mouse hover for each column except duration? Check for a small sample!
        • Information and links on headers of test case details pages as in test cases page? Check for a small sample!
        • Correct overall appearance?
      • check a sample of test case detail pages, at least two occurrences of every assertion type
      • Check page Units under Test
        • tooltips on mouse over correct?
      • Check navigation tree
        • Same scenarios, test cases and tests as on scenario report pages? Check with a sample!
        • Proper tooltips on all levels of navigation tree below scenarios? Check with a sample!
        • Same units under test, test cases and tests as on units under test page? Check with a sample!
        • Proper tooltips on all levels of navigation tree below units under test? Check with a sample!
      • check main page of output:
        • correct title with link to SF?
        • correct footnote with link to SF and correct version and build number?
        • correct overall appearance?
        • run_all.log must not contain any errors or warnings
    • run self-test suite in overwrite-mode in English
      Please note that the purpose of this test suite is to have a number of tests which check for correct functionality of SASUnit itself. This implies that some tests have to fail and that testing involves checking whether every test has the intended outcome.

      • check test scenario page:
        • All scenarios present? Check with scenario programs.
        • Result as expected for each scenario? Result must be red where indicated in test scenario description, otherwise green or white! Test scenario number 001 must be red although not stated in the description.
      • check test cases page
        • All scenarios present? Check with scenario programs. Every program ending in "_test.sas" in the saspgm/test folder is a scenario program.
        • For each scenario
          • All test cases per scenario present? Check with source code for each scenario!
          • Log with exactly those errors and warnings expected from the various test cases?
          • Result as expected for each test case? Result must be red where indicated in test case description, otherwise green or white! Test case 001 of scenario 007 (reportsasunit_emptyscn_test.sas) must fail although not stated in description.
          • For each test case open test case detail page
            • Result as expected for each test? Result must be red where indicated in test description, otherwise green (or white for assertreport only)!
            • Manually check for each occurrence of assertreport (white result) whether results are as indicated in test description!
      • Check page Units under Test
        • All units under test present, specified in at least one test scenario? Check with source code of test scenarios!
        • All units under test correctly specified per program library and with test scenario? Check with source code of test scenarios!
        • Results correct?
      • Check main page of output:
        • run_all.log must not contain any errors or warnings
        • must be as follows, all links must be functional:

| Parameter | Macro variable | Expected value |
| --- | --- | --- |
| Name of project | &g_project | SASUnit |
| Root directory | &g_root | [check root directory and link] |
| Path to test repository | &g_target | doc/sasunit/en |
| Program libraries (macro autocall paths) | &g_sasautos | saspgm/sasunit |
| | &g_sasautos1 | saspgm/test |
| | &g_sasautos2 | saspgm/test/pgmlib1 |
| | &g_sasautos3 | saspgm/test/pgmlib2 |
| SAS configuration file for test scenarios | &g_sascfg | bin/sasunit.9.3.windows.en.cfg |
| Folder for test data | &g_testdata | dat |
| Folder for reference data | &g_refdata | dat |
| Folder for specification documents | &g_doc | doc/spec |
| Path to SASUnit macros | &g_sasunit | saspgm/sasunit |
| SAS log of reporting job | | doc/sasunit/en/run_all.log |
| Platform | &SYSSCPL | W64_7PRO |
| SAS Version | &SYSVLONG4 | 9.03[id of maintenance release] |
| User ID | &SYSUSERID | [user id of operator] |
| SASUnit Language | SASUNIT_LANGUAGE | en |
| Number of test scenarios | | [count] |
| Number of test cases | | [count] |
| Number of assertions | | [count] |
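The fixed expected values above can also be verified mechanically once the label/value pairs have been scraped from the rendered main page. A hypothetical sketch; the `actual` dictionary stands in for values parsed from the HTML report, which is not shown here:

```python
# Expected parameter values taken from the table above.
expected = {
    "&g_project": "SASUnit",
    "&g_target": "doc/sasunit/en",
    "&g_sascfg": "bin/sasunit.9.3.windows.en.cfg",
    "SASUNIT_LANGUAGE": "en",
}

# Hypothetical result of scraping the main page of the report.
actual = {
    "&g_project": "SASUnit",
    "&g_target": "doc/sasunit/en",
    "&g_sascfg": "bin/sasunit.9.3.windows.en.cfg",
    "SASUNIT_LANGUAGE": "en",
}

# An empty result means every sampled value matches the expectation.
mismatches = {k: (v, actual.get(k))
              for k, v in expected.items() if actual.get(k) != v}
print(mismatches)
```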

    • run example project test suite in overwrite-mode in German

      • Compare to English version - differences must be only due to language and run date/time
    • run self-test suite in overwrite-mode in German

      • Compare to English version - differences must be only due to language and run date/time
    • test incremental build facility with example project

      • Touch (save without changes) programs example/saspgm/nobs.sas and getvars_test.sas
      • Run example project test suite in non-overwrite-mode in English
      • Check last run date/time on the scenario overview page - it must have been updated for the nobs.sas and getvars.sas tests and be unchanged for all other scenarios, except for scenario tree1_test.sas, which always runs.
      • Check generation date/time for test case page - must have been updated
      • Check generation date/time of a small sample of test case details pages of test cases for nobs.sas and getvars.sas - must have been updated
      • Check generation date/time of a small sample of test case details pages of other test cases - must be unchanged
    • run doxygen for example project

      • Check whether report has been created properly and all programs have been documented
      • Check contents of navigation tree
      • Check complete documentation for a small sample of programs
    • run doxygen for self test project

      • Check whether report has been created properly and all programs have been documented
    • Check rendering in Firefox (latest version)

      • Open report of example project (Windows SAS 9.3 64 bit, OS settings German, SASUnit English) in Firefox (latest version) and compare with IE9
      • Open doxygen output for the example project in Firefox (latest version) and compare with IE9
  • On each of the following additional platforms, repeat the steps below

    • Platforms
      • Windows 7 64 German / SAS 9.3 64 bit
      • Windows 7 64 English / SAS 9.3 32 bit
      • Windows 7 64 English / SAS 9.2 32 bit
      • Linux Ubuntu 12.04 LTS / SAS 9.3 64 bit
      • Linux Suse 10.3 / SAS 9.2 32 bit
    • Steps
      • run example project test suite in overwrite-mode in English
        • compare to baseline: compare all main pages and a sample of the test case details pages
      • run self-test suite in overwrite-mode in English
        • compare to baseline: compare all main pages and a sample of the test case details pages
      • run example project test suite in overwrite-mode in German
        • compare to baseline: compare all main pages and a sample of the test case details pages
      • run self-test suite in overwrite-mode in German
        • compare to baseline: compare all main pages and a sample of the test case details pages
      • test incremental build facility with example project
        • same procedure as specified with baseline above
      • run doxygen for example project
        • compare to baseline
      • run doxygen for self test project
        • compare to baseline
    • "Compare to baseline" means: Compare the results on each platform to the results of the baseline (Windows 7 64 English / SAS 9.3 64 bit) above. Compare HTML reports using Firefox (latest version) under Microsoft Windows 7.
    • Please note that the sort order of test scenarios differs between Windows and Linux: Linux ignores the underscore when sorting scenario program names, while Windows does not.
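The differing sort order can be reproduced outside SAS. A small sketch contrasting a byte-wise sort (the Windows-style result, where `_` is 0x5F and orders before all letters) with a collation that skips the underscore (the Linux-style result); the file names are illustrative:

```python
names = ["assert_test.sas", "assertlog_test.sas"]

# Byte-wise ordering: '_' (0x5F) sorts before any letter.
byte_order = sorted(names)

# Ordering that ignores the underscore, as locale-aware collation often does.
no_underscore = sorted(names, key=lambda s: s.replace("_", ""))

print(byte_order)      # ['assert_test.sas', 'assertlog_test.sas']
print(no_underscore)   # ['assertlog_test.sas', 'assert_test.sas']
```

The same two names therefore appear in opposite order on the two platforms, which is why a direct page-by-page comparison must allow for it.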

Related

Tickets: #64
Documentation: Development Guidelines
