Gazelle B Release Test Strategy

Document describing the overall testing strategy and release criteria for the Gazelle B release of Mifos®.

Objective

The Gazelle B release provides new features as well as some architecture improvements.  During this release, system testing will emphasize validation and verification of these new features - updated BIRT viewer, import of bulk accounting transactions, and improved Collection Sheet Entry processing.  As part of the Collection Sheet Entry changes, we are teaming with SunGard to focus on performance/functional testing.

Testing Goals

For the Gazelle B project, the primary testing goals are as follows:

  • Functional testing of Gazelle B enhancements using documented test plans and test cases or automated acceptance tests when applicable.
  • Automated load testing for Collection Sheet Entry.
  • Additional build validation acceptance tests using the new acceptance test framework (see the sketch after this list).
  • Verification of bug fixes.
  • Acceptance and regression testing of 1.3.x features to confirm no regressions have been introduced during the project.
  • Testing with two browsers - Firefox and Internet Explorer.
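
To illustrate the build validation goal above, here is a minimal sketch of an automated acceptance test, assuming a Selenium RC-based framework. The host, port, URL, credentials, and element locators are illustrative placeholders, not the actual framework's values.

    import com.thoughtworks.selenium.DefaultSelenium;
    import com.thoughtworks.selenium.Selenium;

    public class LoginSmokeTest {
        public static void main(String[] args) {
            // Hypothetical Selenium RC host/port and application URL.
            Selenium selenium = new DefaultSelenium("localhost", 4444, "*firefox",
                    "http://localhost:8080/");
            selenium.start();
            try {
                // Hypothetical page path, locators, and credentials.
                selenium.open("/mifos/login.jsp");
                selenium.type("username", "mifos");
                selenium.type("password", "testmifos");
                selenium.click("login");
                selenium.waitForPageToLoad("30000");
                // A passing build must at least reach the home page after login.
                if (!selenium.isTextPresent("Home")) {
                    throw new AssertionError("Build validation failed: home page not reached");
                }
            } finally {
                selenium.stop();
            }
        }
    }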


Testing Non-Goals

The following areas are considered beyond the scope of the Gazelle B testing effort.

  • Uptime or reliability testing. The standard architecture and platform in use are considered a proven, stable environment.
  • Unit/integration testing, which is not part of this system testing effort.
  • Verifying accuracy of localized UI translations.
  • Compatibility testing with unsupported web servers, databases, and browsers. See Test Environments for the supported test platform.
  • Performance and scalability testing of the overall application. For this release, performance testing is focused on one strategic area - Collection Sheet Entry.


Test Schedule

The testing effort for Gazelle B will be executed with the following assumptions:

  • Scheduling and prioritization of testing efforts is driven by the project team's priorities.
  • Risks and project team feedback will be used as a basis for feature and test case priorities. 
  • Test results from initial test cycles will influence the priority and order of manual test case execution in later test cycles. 
  • The automated acceptance test suite will reduce the overall system test schedule from 2 iterations down to 1 iteration for this release.

Specific test milestones and dates will be maintained with the Gazelle B project schedule.

Test Environments

Standard software test environment for Gazelle B testing:

  • Operating system: Windows 2003 server
  • Java: JRE 1.6U14
  • Application server: Tomcat 6.0.18
  • Database: MySQL 5.1 Community Edition
  • Report Designer: BIRT 2.5
  • Browser: IE 6 and Firefox 3.0
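
As a quick sanity check that a test machine matches this configuration, a tester might run a small program such as the sketch below. The JDBC URL and credentials are assumptions; substitute the test database's actual settings.

    import java.sql.Connection;
    import java.sql.DatabaseMetaData;
    import java.sql.DriverManager;

    public class EnvironmentCheck {
        public static void main(String[] args) throws Exception {
            // Report the JRE version of the running VM (expected: 1.6.0_14).
            System.out.println("Java: " + System.getProperty("java.version"));

            // Hypothetical connection settings for the local test database.
            Class.forName("com.mysql.jdbc.Driver");
            Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/mifos", "root", "password");
            try {
                DatabaseMetaData meta = conn.getMetaData();
                // Expected output: MySQL 5.1.x
                System.out.println("Database: " + meta.getDatabaseProductName()
                        + " " + meta.getDatabaseProductVersion());
            } finally {
                conn.close();
            }
        }
    }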

Mifos Test Server

Currently, we have one test environment generally available for testing the latest code of Mifos. This server is updated automatically with the latest build from the Mifos Continuous Integration Server. It also contains a set of test data which you can add to and modify for testing purposes.

Please view the Mifos Test Server page for more details about accessing and using the test server.

Test Data

For basic functional testing, a standard set of test data will be used to ensure consistency and repeatability. Test data sets are available from mifos.org's resource repository and in the data set area of the "acceptanceTests" project.
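
To keep runs repeatable, loading a data set into the test database could look like the sketch below, assuming the data sets are in DbUnit's flat XML format; the file name and connection settings here are hypothetical.

    import java.io.FileInputStream;
    import java.sql.Connection;
    import java.sql.DriverManager;

    import org.dbunit.database.DatabaseConnection;
    import org.dbunit.database.IDatabaseConnection;
    import org.dbunit.dataset.IDataSet;
    import org.dbunit.dataset.xml.FlatXmlDataSet;
    import org.dbunit.operation.DatabaseOperation;

    public class LoadTestData {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection settings and data set file name; point
            // these at the local test database and a data set from the
            // "acceptanceTests" project.
            Class.forName("com.mysql.jdbc.Driver");
            Connection jdbc = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/mifos", "root", "password");
            IDatabaseConnection connection = new DatabaseConnection(jdbc);
            try {
                IDataSet dataSet = new FlatXmlDataSet(
                        new FileInputStream("standard_test_data.xml"));
                // CLEAN_INSERT deletes existing rows, then inserts the data set,
                // so every test run starts from the same state.
                DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
            } finally {
                connection.close();
            }
        }
    }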

Local Test Environment

Testing of features that require non-default configuration or unique data can be done on a local system. Deployment should include the standard software runtime environment configuration noted above. Mifos war files are generated by the build server and are available to download.

Mifos can be installed on a local test environment by following the build and install instructions for developers.
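
After deploying the war, a quick smoke check such as the sketch below can confirm the application is up before testing begins; the URL assumes a default Tomcat port and context path.

    import java.net.HttpURLConnection;
    import java.net.URL;

    public class DeploymentSmokeCheck {
        public static void main(String[] args) throws Exception {
            // Hypothetical local URL; adjust the context path to match the
            // deployed war.
            URL url = new URL("http://localhost:8080/mifos/");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setConnectTimeout(5000);
            int status = conn.getResponseCode();
            System.out.println("HTTP status: " + status);
            if (status != 200) {
                throw new AssertionError("Mifos did not respond with 200 OK");
            }
        }
    }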

Test Approach and Execution

New feature testing

New feature testing will be scheduled to follow the development iteration process for Gazelle B. Test plans and test cases will be documented either in spreadsheets or as part of Mingle user stories.


Acceptance and Regression testing

A set of automated and manual test cases from previous releases will be selected to produce a representative set of acceptance and regression tests. These regression tests will be prioritized and executed as part of the overall test schedule.


Performance and Stress testing

Dedicated performance and stress testing will be focused on the area of Collection Sheet Entry. SunGard will plan and execute these tests in coordination with the rest of the Mifos community. Documentation of that testing approach is attached to this thread: http://tinyurl.com/yfweqgg. The document is titled "CollectionSheetEntry_Approach.doc".
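
Independent of that effort, the sketch below shows the general shape of such a load test: several concurrent simulated users issuing requests while failures and latency are tallied. The target URL is hypothetical, and a realistic test would first authenticate and submit actual collection sheet form data; a dedicated load tool would normally drive this, so treat the sketch as illustrative only.

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicLong;

    public class CollectionSheetLoadSketch {
        public static void main(String[] args) throws Exception {
            final int users = 20;            // concurrent simulated users (illustrative)
            final int requestsPerUser = 50;
            // Hypothetical URL for the page under load.
            final String target = "http://localhost:8080/mifos/";
            final AtomicLong totalMillis = new AtomicLong();
            final AtomicLong failures = new AtomicLong();

            ExecutorService pool = Executors.newFixedThreadPool(users);
            for (int i = 0; i < users; i++) {
                pool.execute(new Runnable() {
                    public void run() {
                        for (int r = 0; r < requestsPerUser; r++) {
                            long start = System.currentTimeMillis();
                            try {
                                HttpURLConnection conn = (HttpURLConnection)
                                        new URL(target).openConnection();
                                if (conn.getResponseCode() >= 400) {
                                    failures.incrementAndGet();
                                }
                                conn.disconnect();
                            } catch (Exception e) {
                                failures.incrementAndGet();
                            }
                            totalMillis.addAndGet(System.currentTimeMillis() - start);
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(30, TimeUnit.MINUTES);

            long requests = (long) users * requestsPerUser;
            System.out.println("Requests: " + requests
                    + ", failures: " + failures.get()
                    + ", mean latency: " + (totalMillis.get() / requests) + " ms");
        }
    }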

UAT Candidate testing

The team will schedule a User Acceptance Testing (UAT) candidate build that will be available to existing and future Mifos deployments for evaluation and feedback. The UAT candidate will be a build that has passed initial acceptance testing for Gazelle B features.

Resolved bugs requiring verification

This list in our issue tracker contains all bugs that have been marked as 'resolved' and are awaiting review and verification by a tester. All bugs targeted for Gazelle B should be verified by project completion. Please contact the Mifos Developer Mailing List if you would like to assist with verifying these bugs.

Test Execution and Status Monitoring

Tests for new features and bug fixes will be executed in order of business priority and availability. Within a feature area, the following priorities will be designated for test cases (a selection sketch follows the list):

  • 1 (High) - test cases that are run with any new build prior to release, regardless of whether any changes were introduced to the feature.
  • 2 (Medium) - test cases which test major functionality of the features. These cases are executed with each major project milestone.
  • 3 (Low) - test cases which test additional aspects of a feature, but can be skipped later in a project release when a feature is considered stable.
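
The sketch below shows one way this selection scheme could be expressed in code; the class and method names are illustrative, not part of any existing test tooling.

    import java.util.ArrayList;
    import java.util.List;

    public class TestCaseSelection {

        enum Priority { HIGH, MEDIUM, LOW }

        static class TestCase {
            final String name;
            final Priority priority;
            TestCase(String name, Priority priority) {
                this.name = name;
                this.priority = priority;
            }
        }

        // Selects the cases to run for a given build under the scheme above.
        static List<TestCase> select(List<TestCase> all, boolean majorMilestone,
                                     boolean featureStable) {
            List<TestCase> selected = new ArrayList<TestCase>();
            for (TestCase tc : all) {
                if (tc.priority == Priority.HIGH) {
                    selected.add(tc);          // run with any new build prior to release
                } else if (tc.priority == Priority.MEDIUM && majorMilestone) {
                    selected.add(tc);          // run at each major project milestone
                } else if (tc.priority == Priority.LOW && !featureStable) {
                    selected.add(tc);          // skipped once the feature is stable
                }
            }
            return selected;
        }
    }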

We will report test results by monitoring the results of test case execution and issue tracking.

Test case pass/fail totals will be tracked for each test milestone, along with references to issues reported against any test cases. Test case results should also note the build used for test execution.

Issues reported during testing will be monitored to identify trends in how issues are being detected and resolved.


Release Criteria

The following release criteria will be met prior to completion of the project:

  • All new features pass acceptance criteria based on the feature's functional requirements or other agreed release criteria.
  • All features targeted for this release are feature complete and tested to the satisfaction of the entire team.
  • There are no remaining High (P1 or P2) priority open issues targeted for the release. Some high-priority issues identified during testing may be deferred based on acceptable workarounds.
  • Planned acceptance and regression tests have been executed, with all tests passing or having an acceptable workaround.
  • The final release candidate build has been tested and has passed all acceptance and installation tests.