Mifos 1.5 Release Test Strategy

Document describing the overall testing strategy and release criteria for the Gazelle C (v1.5) release of Mifos.

Objective

The Gazelle C release provides new features as well as some architecture improvements.  During this release, system testing will emphasize validation and verification of these new features:  Currency Denominated Loan Products and Shutdown.

Testing Goals

For the Gazelle C project, the testing goals are:

  • Functional testing of Gazelle C enhancements using documented test plans and test cases, or automated acceptance tests where applicable.
  • Additional build validation acceptance tests using the new acceptance test framework (see the sketch after this list).
  • Verification of bug fixes and other small enhancements.
  • Acceptance and regression testing of existing Mifos features to confirm no regressions have been introduced during the project.
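As an illustration of the kind of automated acceptance test the new framework is intended to cover, the sketch below drives a Mifos login through Selenium RC from a JUnit test.  This is a minimal sketch only; the server host and port, deployment URL, page locators, and credentials are assumptions for illustration and are not taken from the actual Mifos acceptance test suite.

    import com.thoughtworks.selenium.DefaultSelenium;
    import com.thoughtworks.selenium.Selenium;
    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    public class LoginAcceptanceTest {
        private Selenium selenium;

        @Before
        public void startBrowser() {
            // Assumes a Selenium RC server on localhost:4444 and Mifos deployed at this URL.
            selenium = new DefaultSelenium("localhost", 4444, "*firefox",
                    "http://localhost:8080/mifos/");
            selenium.start();
        }

        @Test
        public void canLogInWithTestCredentials() {
            selenium.open("/mifos/");
            // Locators and credentials below are illustrative placeholders.
            selenium.type("name=username", "mifos");
            selenium.type("name=password", "testmifos");
            selenium.click("css=input[type=submit]");
            selenium.waitForPageToLoad("30000");
            assertTrue("Expected the home page after login",
                    selenium.isTextPresent("Home"));
        }

        @After
        public void stopBrowser() {
            selenium.stop();
        }
    }

A build validation run would execute a small set of tests like this against each new build before deeper manual testing begins.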

 

Testing Non-Goals

The following areas are considered beyond the scope of the Gazelle C testing effort.

  • Uptime or reliability testing.  The standard architecture and platform in use are considered a proven, stable environment.
  • Unit/integration testing is not part of this system testing effort.
  • Verifying accuracy of localized UI translations.
  • Compatibility testing with unsupported web servers, databases, or browsers.  See Test Environments below for the supported test platform.
  • Performance and scalability testing of the overall application.  Focused scalability testing will be done during this release's timeframe, but it will target the subsequent Mifos release, Shamim D.

 

 

Test Schedule

The testing effort for Gazelle C will be executed with the following assumptions:

  • Scheduling and prioritization of testing efforts are driven by the project team's priorities.
  • Risks and project team feedback will be used as a basis for feature and test case priorities. 
  • The automated acceptance test suite will reduce the overall system test schedule to one iteration for this release.

Specific test milestones and dates will be maintained with the Gazelle C project schedule.

Test Environments

Standard software test environment for Gazelle C testing

Mifos Test Server

A test environment is generally available for testing the latest builds of Mifos.  This server is updated automatically with the latest build from the Mifos Continuous Integration Server, and it also contains test data which you can add to and modify for testing purposes.

Please view the Mifos Test Server page for more details about accessing and using the test server.  

Test Data

For basic functional testing, a standard set of test data will be used to ensure consistency and repeatability.  Test data sets are available from mifos.org's resource repository and in the data set area of the "acceptanceTests" project.
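To keep runs repeatable, the standard data sets can be loaded into a local test database before testing begins.  The sketch below shows one way to do this with DbUnit-style flat XML data sets; the JDBC URL, credentials, and data set path are placeholders for illustration and should be replaced with your environment's values and an actual file from the "acceptanceTests" data set area.

    import java.io.File;
    import java.sql.Connection;
    import java.sql.DriverManager;

    import org.dbunit.database.DatabaseConnection;
    import org.dbunit.database.IDatabaseConnection;
    import org.dbunit.dataset.IDataSet;
    import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
    import org.dbunit.operation.DatabaseOperation;

    public class TestDataLoader {
        public static void main(String[] args) throws Exception {
            // JDBC URL and credentials are placeholders for a local Mifos test database.
            Connection jdbc = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/mifos", "mifos", "mifos");
            IDatabaseConnection connection = new DatabaseConnection(jdbc);

            // Path is illustrative; point it at a data set file from the acceptanceTests project.
            IDataSet dataSet = new FlatXmlDataSetBuilder()
                    .build(new File("acceptanceTests/dataSets/standard_test_data.xml"));

            // CLEAN_INSERT empties the tables referenced by the data set and reloads them,
            // giving every test run the same starting state.
            DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
            connection.close();
        }
    }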

Local Test Environment

Testing of features that require non-default configuration or unique data can be done using a local system.  Deployment should include the standard software runtime environment configuration as noted above.  Mifos WAR files are generated by the build server and are available for download.

Mifos can be installed on a local test environment based on the build and install instructions for developers.
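For quick local smoke testing of a downloaded WAR, one option is to run it in an embedded servlet container instead of performing a full installation.  The sketch below uses embedded Jetty; the WAR path, port, and context path are assumptions, and the official build and install instructions remain the authoritative procedure.

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.webapp.WebAppContext;

    public class LocalMifosRunner {
        public static void main(String[] args) throws Exception {
            Server server = new Server(8080);  // port is an arbitrary local choice

            WebAppContext webapp = new WebAppContext();
            webapp.setWar("mifos.war");        // path to the WAR downloaded from the build server
            webapp.setContextPath("/mifos");   // context path assumed for local testing

            server.setHandler(webapp);
            server.start();
            server.join();                     // block until the server is stopped
        }
    }

Note that Mifos still needs its database and configuration in place, so a runner like this only replaces the container setup step, not the rest of the install instructions.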

Test Approach and Execution

New feature testing

New feature testing will be scheduled to follow the development iteration process for Gazelle C.  Test plans and test cases will be documented either in spreadsheets or as part of Mingle user stories.

 

Acceptance and Regression testing

A set of automated and manual test cases from previous releases will be selected to produce a representative set of acceptance and regression tests.  These regression tests will be prioritized and executed as part of the overall test schedule. 

 

Performance and Stress testing

No specific performance testing is targeted for the Gazelle C schedule.  Instead, performance improvements and testing will be focused on the subsequent release, Shamim D.

 

UAT Candidate testing

The team will schedule a User Acceptance Testing (UAT) candidate build that will be made available to existing and prospective Mifos deployments for evaluation and feedback.  The UAT candidate will be a build that has passed initial acceptance testing for Gazelle C features.

Resolved bugs requiring verification

The issue tracker maintains a list of all bugs that have been marked as 'resolved' and are awaiting review and verification by a tester.  All bugs targeted for Gazelle C should be verified by project completion.  Please contact the Mifos Developer Mailing List if you would like to assist with verifying these bugs.

Test Status Execution and Monitoring

Tests for new features and bug fixes will be executed in order of business priority and availability.  Within a feature area, the following priorities will be designated for test cases:

  • 1 (High) - test cases that are run with any new build prior to release, regardless of whether any changes were introduced to the feature.
  • 2 (Medium) - test cases which test major functionality of the features.  These cases are executed with each major release if there have been changes around that feature area.
  • 3 (Low) - test cases which test additional aspects of a new feature, but can be skipped later in a project release when the feature is considered stable.

We will report test status by monitoring test case execution results and issues raised in the issue tracker.

Test case pass/fail totals will be tracked for each milestone, along with reference to issues reported on any test cases.  Test case results should also refer to the build used for test execution.

Issues reported during testing will be monitored to identify the trend of issues being detected and resolved. 

 

 

Release Criteria

The following release criteria will be met prior to completion of the project:

  • Release testing has covered the supported browsers: IE 7 and Firefox 3.0.
  • All new features pass acceptance criteria based on the feature's functional requirements or other agreed release criteria.
  • All the features targeted for this release are feature complete and tested to the satisfaction of the entire team.
  • There are no remaining high-priority (P1 or P2) open issues targeted for the release.  Some high-priority issues identified during testing may be deferred if acceptable workarounds exist.
  • Planned acceptance and regression tests have been executed, with all tests passing or having an acceptable workaround.
  • The final release candidate build has been tested and has passed all acceptance and installation tests.