Content to be discussed and agreed within IOG prior to implementation.
The development of the DDMoRe Product is organized in two-week iterations. At the end of each iteration, stable artifacts are made available to testers for acceptance testing.
Iteration Planning
The development team is consulted during the iteration planning process in order to ensure that the agreed scope is achievable within the time scales. Any inter-dependencies between tasks (both technical and non-technical) are identified during iteration planning, and detailed task resolution details are agreed to ensure the deliverability of the given feature set.
Once the scope of the iteration has been agreed, the project team starts work on the agreed features and tasks.
At the end of the iteration, the stable artifacts are made available to project partners and testers for acceptance testing. In the process of acceptance testing, the features are verified and bugs reported.
Each iteration is named P4.1_IT<no>, where <no> is the iteration number.
An iteration lasts two weeks, but the duration of a particular iteration may be extended. Any extension is agreed before the iteration starts (i.e. during iteration planning).
Deliverables are named 1.2.0-SNAPSHOT_P4.1_<date>, where <date> is the date when the artifact was published to SourceForge.
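The naming convention above can be sketched as a small helper. This is purely illustrative and not part of the project tooling; the function name and the unpadded-day date format are assumptions based on the schedule below.

```python
from datetime import date

def deliverable_name(published: date) -> str:
    """Compose a deliverable name following the 1.2.0-SNAPSHOT_P4.1_<date>
    convention. Illustrative sketch only, not project tooling."""
    # Dates in the iteration schedule use DD-MON-YYYY with an unpadded day
    # (e.g. 9-SEP-2015), so the day is formatted without zero-padding.
    return (f"1.2.0-SNAPSHOT_P4.1_{published.day}"
            f"-{published.strftime('%b').upper()}-{published.year}")

print(deliverable_name(date(2015, 9, 9)))  # 1.2.0-SNAPSHOT_P4.1_9-SEP-2015
```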
Iterations are aligned with the DDMoRe project release schedule, and their dates are outlined below.
Note: the iteration schedule may change if the project members agree.
| Iteration Name | Start Date | End Date | Deliverable Name |
|---|---|---|---|
| P4.1_IT1 | (on-going) | 26-AUG-2015 | 1.2.0-SNAPSHOT_P4.1_26-AUG-2015 |
| P4.1_IT2 | 31-AUG-2015 | 9-SEP-2015 | 1.2.0-SNAPSHOT_P4.1_9-SEP-2015 |
| P4.1_IT3 | 14-SEP-2015 | 23-SEP-2015 | 1.2.0-SNAPSHOT_P4.1_23-SEP-2015 |
| P4.1_IT4 | 28-SEP-2015 | 30-SEP-2015 | 1.2.0-SNAPSHOT_P4.1_30-SEP-2015 |
| P4.1_IT5 | 5-OCT-2015 | 28-OCT-2015 | 1.2.0-SNAPSHOT_P4.1_28-OCT-2015 |
| P4.1_IT6 | 2-NOV-2015 | 12-NOV-2015 | 1.2.0-SNAPSHOT_P4.1_12-NOV-2015 |
| P4.1_IT7 | 16-NOV-2015 | 20-NOV-2015 | P4.1_ALPHA_20-NOV-2015 |
| P4.1_IT8 | 23-NOV-2015 | 3-DEC-2015 | P4.1_ALPHA_INTERIM_3-DEC-2015 |
| P4.1_IT9 | 7-DEC-2015 | 11-DEC-2015 | Demonstrator-1.2 |
The stable SEE artifacts are published to SEE Deliverables, under a location clearly stating the appropriate iteration identifier.
The stable SEE artifacts will be made available for download for a duration of two iterations. Obsolete iteration artifacts will be removed in the first week of the third subsequent iteration, unless there is a request from the partners to keep them longer.
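The retention rule above can be expressed as a short check. This is a minimal sketch of the stated policy, assuming iterations are numbered consecutively; the function and parameter names are hypothetical.

```python
def removal_due(artifact_iteration: int,
                current_iteration: int,
                week_of_iteration: int,
                keep_requested: bool = False) -> bool:
    """Illustrative sketch of the retention policy (not project tooling):
    artifacts published at the end of iteration N remain available during
    the next two iterations and are removed in the first week of the third
    subsequent iteration (N + 3), unless partners request to keep them."""
    if keep_requested:
        return False
    return (current_iteration == artifact_iteration + 3
            and week_of_iteration == 1)

# Artifacts from iteration 1 fall due for removal in week 1 of iteration 4.
print(removal_due(1, 4, 1))  # True
print(removal_due(1, 3, 1))  # False: still within the two-iteration window
```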
The deliverable software artifacts will be uniquely identifiable. The information will be included in:
Some of the tasks assigned to an iteration may not be performed, due to unforeseen factors or due to a change of the iteration scope after the iteration has started. It is therefore understood that the scope of the system delivered for review may differ from the initial scope. Each such amendment to the initial scope will be communicated weekly, and changes agreed on a case-by-case basis.
Note: although the scope may change, the agreed iteration schedule will not. This means that in the case of a scope change, the stable artifacts will still be delivered as initially planned, but with the agreed changes.
Once the SEE artifacts have been delivered, the process of acceptance testing of the iteration deliverables starts. The testers will be responsible for performing any testing activities that verify that the delivered feature set and bug fixes are acceptable.
Testers will report any exposed software defects using the SourceForge ticketing system. The following information must be included in each ticket/bug report:
| Information | Form Field | Description |
|---|---|---|
| Software Version | Found In | The version of the DDMoRe Product under development. |
| Deliverable Identifier | Labels | Identifier of the SEE deliverable in which the defect was found. |
| Steps To Reproduce | Description | The description should list the detailed steps that have to be performed in order to reproduce the issue. |
| Input Data | Attachments | The test data that should be used to reproduce the issue. |
| Expected Behavior | Description | What the correct behavior of the system should be. |
| Referenced Resources | Attachments | Any supporting resources/documents, such as specifications or screenshots, should be attached to the ticket. |
| Workaround | Description | If there is a known way of working around the issue, the tester should state it in the description, so that other testers can apply it if functionality they want to test is affected by the bug. |