Case Study · System Test Plan

The Release 1.0 system test plan for a call-center loan app.
Six build cycles. WebLogic + Oracle + MQ scoring. A hard go-live.

System test plan for a bank's home equity loan call-center application. Six weekly build cycles, credit-scoring mainframe integration, cloned call-center desktops, and a three-week no-crash exit bar. Client name and individuals scrubbed; the method is intact.

Build Cycles: 6
Test Regions: 3
Exit Bar: 3 weeks no crash

Key Takeaways

Four things to remember.

01

Positive use cases were the scope

The plan explicitly scoped OUT negative use cases, localization, operations, documentation, unit/FVT, and white-box testing. It could not cover everything in the time available, and it said so in writing.

02

Exit bar: three weeks no server crash

System test exit required no panic / crash / halt / wedge / unexpected process termination on any server for the previous three weeks. Not "incidents resolved." Three consecutive weeks of zero.

03

Reference platform as oracle

The existing origination system ("Some Other Loan App") acted as the reference oracle for offered products — testers entered the same application on both, then compared. That solved the hardest part of test design for free.

04

Escalation process is in the plan

Contact lists, escalation paths, and release management are part of the plan itself, not a separate runbook. The plan is the operations manual.

Overview

This system test plan was written for the Release 1.0 deployment of a home-equity loan application used by Call Center agents on live customer calls. Names of internal groups and sites (Minneapolis Test Group, Fairbanks Call Center) are preserved as they appear in the source; the client is pseudonymized as "Some Client" / "Somebank", the product as "Some Loan App", and individual names are scrubbed.

What makes this plan worth studying is the ruthless specificity of its exit criteria and the operational discipline of its test execution process. It is a plan written to survive a go-live date that will not move.

01

Overview

The "Some Loan App", as deployed in Release 1.0, allows Home Equity Loan Call Center Agents to fit home equity products (loans and lines of credit) to customers. The system is a group of Java programs running on WebLogic, with Oracle storage and a Netscape gateway. Call Center agents interview customers through a Web browser; the system scores credit via a mainframe connection (MQ) and displays eligible products. If a product is accepted, the loan is transmitted to the existing origination platform for document generation and finalization.
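The flow just described (interview, MQ-based scoring, product fit, handoff to origination) can be sketched roughly as follows. Every name in this sketch is illustrative, not taken from the actual system; the real MQ round-trip and product catalog are stubbed out:

```python
from dataclasses import dataclass

@dataclass
class Application:
    customer_id: str
    requested_amount: float

def score_via_mq(app: Application) -> str:
    # Stand-in for the MQ round-trip to the mainframe scoring engine:
    # the real system sends the application over MQ and waits for a
    # credit-bureau-backed risk tier. Here we fake a tier by amount.
    return "A" if app.requested_amount < 50_000 else "B"

# Hypothetical catalog: products keyed by the risk tiers they fit.
PRODUCT_CATALOG = {
    "HELOC-PRIME": {"A"},
    "HE-LOAN-STANDARD": {"A", "B"},
}

def eligible_products(app: Application) -> list[str]:
    """Score the customer, then return the products their tier qualifies for."""
    tier = score_via_mq(app)
    return sorted(name for name, tiers in PRODUCT_CATALOG.items()
                  if tier in tiers)

print(eligible_products(Application("cust-1", 25_000)))
```

An accepted product would then be transmitted to the existing origination platform, which sits outside this sketch.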

02

Scope — What system test IS and IS NOT

The system test scope table was written up front to eliminate later arguments about what "done" meant.

IS

  • Positive use cases (functionality)
  • Capacity and volume
  • Error handling and recovery
  • Standards and regulatory compliance (as covered in the use cases)
  • Client configuration (browser and call center desktop compatibility)
  • Security [scope TBD at plan time]
  • Distributed (leverage Webdotbank testing)
  • Performance
  • Black-box / behavioral testing
  • "Some Loan App" / "Some Other Loan App" status communications
  • Confirmation testing in QA region

IS NOT

  • Negative use cases
  • Operations (paperwork processing, loan initiation, rate updates)
  • Usability or user interface
  • Date and time processing
  • Localization
  • Test database development
  • Documentation
  • Code coverage
  • Software reliability
  • Testing of the complete system
  • Horizontal (end-to-end) integration
  • Data flow or data quality
  • Unit or FVT testing
  • White-box / structural testing

03

Milestone schedule

The plan laid out the six-cycle schedule from unit test through deployment, against real calendar dates.

  • Unit test complete
  • Smoke build delivered and installed
  • System Test Entry Criteria met
  • System Test (six release cycles) — ~6 weeks
  • System Test Launch Meeting
  • Builds 1–6 delivered and installed (weekly)
  • Golden Code review (all bugs fixed: ready for final build)
  • System Test Exit Criteria met
  • System Test Phase Exit Meeting
  • User Acceptance Test (two-week window)
  • Go / No-Go Decision
  • Deployment

04

System Test Entry Criteria

System Test can begin when the following criteria are met:

  • The "Tracker" bug tracking system is in place and available for all project participants.
  • All software objects are under formal, automated source code and configuration management control.
  • The HEG System Support team has configured the System Test clients and servers for testing (cloned call-center agent desktops, LoadRunner Virtual User hosts, Netscape, WebLogic, Oracle including indices and referential integrity, MQ connections, and network infrastructure) and has granted the Test Team access.
  • The Development Teams have code-completed all features and bug fixes scheduled for Release 1.0.
  • The Development Teams have unit-tested all features and bug fixes scheduled for Release 1.0 and transitioned the appropriate bug reports into a "verify" state.
  • Fewer than ten (10) must-fix bugs are open, including bugs found during unit testing. Must-fix status is determined by the Project Manager and the AVP of Home Equity.
  • The Development Teams provide revision-controlled, complete software products to MTG, the Minneapolis Test Group (see Release Management).

05

System Test Continuation Criteria

System Test will continue provided:

  • All software released to the Test Team is accompanied by Release Notes. These must specify the bug reports the Development Teams believe are resolved in each software release.
  • No change is made to the "Some Loan App" — whether in source code, configuration files, or other setup instructions or processes — without an accompanying bug report.
  • Twice-weekly bug review meetings occur until System Test Phase Exit to manage the open bug backlog and bug closure times.

06

System Test Exit Criteria

System Test will end when the following criteria are met:

  • No panic, crash, halt, wedge, unexpected process termination, or other stoppage of processing has occurred on any server software or hardware for the previous three (3) weeks.
  • The Test Team has executed all the planned tests against the GA-candidate software release.
  • The Development Teams have resolved all must-fix bugs (defined by the Project Manager and the AVP, Home Equity Group).
  • The Test Team has checked that all issues in the bug tracking system are either closed or deferred, and, where appropriate, verified by regression and confirmation testing.
  • The open / close curve indicates that product stability and reliability have been achieved.
  • The Project Management Team agrees that the product, as defined during the final cycle of System Test, will satisfy the Call Center Agent's reasonable expectations of quality.
  • The Project Management Team holds a System Test Phase Exit Meeting and agrees that these System Test exit criteria are met.
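The no-crash bar is the one exit criterion that can be checked purely mechanically against an incident log. A minimal sketch, assuming incidents are recorded as dated, categorized entries (the log format and category names here are illustrative):

```python
from datetime import date, timedelta

# Qualifying server stoppages, per the exit criterion's own wording.
STOPPAGE_KINDS = {"panic", "crash", "halt", "wedge", "process-termination"}

def exit_bar_met(incidents: list[tuple[date, str]], today: date) -> bool:
    """True when no qualifying server stoppage occurred in the last 3 weeks."""
    window_start = today - timedelta(weeks=3)
    return not any(kind in STOPPAGE_KINDS and window_start <= when <= today
                   for when, kind in incidents)
```

Any qualifying stoppage resets the clock: the window is measured back from today, so the team must accumulate three consecutive clean weeks, not three clean weeks total.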

07

Test configurations and environments

Testing involved both client systems and server regions.

Client systems

  • LoadRunner Virtual User clients ("LR clients") — Windows NT, configured for large numbers of simultaneous virtual-user sessions, used for stress, performance, and capacity test cases.
  • Call Center Desktop Agent clients ("CC clients") — Windows 95, configured to resemble the Fairbanks Call Center Agent Desktop as closely as possible, used for manual test cases.

Server regions

  • "Some Loan App" QA Region — where CC and LR clients send loan applications during testing.
  • Scoring QA Region — provides credit-bureau scoring to the Some Loan App so it can assign a customer to a credit-risk tier.
  • "Some Other Loan App" Regression Region — the existing origination platform, used as the reference oracle for offered products.

08

Test execution process

The plan described exactly how the team would run test execution, not just what they would test.

  • Test Hours — the weekly envelope
  • Test Cycles — what a single build cycle looked like
  • Test Execution Process — step-by-step (see QA Library test-execution-process)
  • Human Resources — roles, chair time, and responsibilities
  • Escalation Process — with Test Contact List, Support Contact List, and Management Contact List named
  • Test Case and Bug Tracking — the Tracker workflow
  • Release Management — how a build becomes a testable build

09

Risks and contingencies

The plan closed with named risks to the test effort itself (environment instability, late-breaking scope additions, third-party dependencies) and the contingency actions MTG would take if each materialized, followed by a change history, referenced documents, and a frequently asked questions appendix.
