Chapter 17: Software Testing Strategies
A Generic Software Testing Template
• Testing begins at the module level and works “outward”
toward the integration of the entire computer-based system
• Different testing techniques are appropriate at different
points in time
• Testing is conducted by the developer of the software and
an independent test group
• Testing and debugging are different activities, but
debugging must be accommodated in any testing strategy
Verification and Validation (V & V)
• Verification refers to the set of activities that ensure that
software correctly implements a specific function
• Validation refers to a different set of activities that ensure
that the software that has been built is traceable to customer
requirements
• Boehm states that
– Verification: “Are we building the product right?”
– Validation: “Are we building the right product?”
Organizing for Software Testing
• Software analysis and design are constructive tasks
• Software testing can be considered a destructive process
• Who should do the testing?
– Software Developers: should test their own individual units
(informal)
– An Independent Test Group, which may be a part of the software development project team
Software Testing Steps
• Requirements drive high-order tests
• Design drives integration tests
• Code drives unit tests
Unit Testing
• Test cases for a module (or unit) focus on:
– interface
– local data structures
– boundary conditions
– basis paths
– error handling paths
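A minimal sketch of such a unit test in Python; the compute_discount function and its 100-unit threshold are invented for illustration only:

```python
import unittest

def compute_discount(total):
    """Hypothetical unit under test: 10% discount on totals of 100 or more."""
    if total < 0:
        raise ValueError("total must be non-negative")   # error handling path
    return total * 0.9 if total >= 100 else total

class ComputeDiscountTest(unittest.TestCase):
    def test_boundary_at_threshold(self):
        # boundary condition: exactly at the 100 threshold
        self.assertAlmostEqual(compute_discount(100), 90.0)

    def test_below_threshold(self):
        self.assertEqual(compute_discount(99), 99)

    def test_error_handling_path(self):
        # error handling path: invalid input is rejected
        with self.assertRaises(ValueError):
            compute_discount(-1)

if __name__ == "__main__":
    unittest.main()
```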
Unit Testing Environment
• A driver applies the test cases to the module to be tested, and stubs replace the lower-level modules it calls; results are collected for evaluation
• The same aspects are exercised: interface, local data structures, boundary conditions, basis paths, error handling paths
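A sketch of the driver/stub pattern, assuming a hypothetical place_order module whose lower-level inventory call is replaced by a stub:

```python
# Hypothetical module under test: place_order depends on a lower-level reserve() call
def place_order(item, qty, reserve):
    if qty <= 0:
        raise ValueError("qty must be positive")        # error handling path
    return "accepted" if reserve(item, qty) else "rejected"

# Stub standing in for the real inventory module that place_order would normally call
def reserve_stub(item, qty):
    return True

def driver():
    # Driver: applies the test cases to the module and checks the results
    assert place_order("book", 2, reserve_stub) == "accepted"
    try:
        place_order("book", 0, reserve_stub)
    except ValueError:
        pass                                            # error path exercised as expected
    else:
        raise AssertionError("expected ValueError for qty=0")

if __name__ == "__main__":
    driver()
    print("unit test for place_order passed")
```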
Stubs’ Complexity
• Stub A - displays a trace message
• Stub B - displays the passed parameter
• Stub C - returns a value from a table (or external file)
• Stub D - does a table search for the input parameter and returns the associated output parameter
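A sketch of these four stub styles in Python; the billing-rate table and function names are made up for illustration:

```python
# Four hypothetical stubs for a billing module, ordered from simplest to most complex
RATES = {"basic": 10.0, "premium": 25.0}   # canned table (could also be an external file)

def stub_a(*args):
    # Stub A: only show that the call happened
    print("billing module was called")

def stub_b(plan):
    # Stub B: display the parameter that was passed in
    print(f"billing module was called with plan={plan}")

def stub_c(plan):
    # Stub C: return a fixed value taken from the table, ignoring the input
    return RATES["basic"]

def stub_d(plan):
    # Stub D: search the table for the input parameter and return the associated output
    return RATES[plan]
```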
Integration Testing
• Incremental approaches:
– top-down
– bottom-up
– sandwich
• Non-Incremental
– big-bang
Integration Testing: Top Down Approach
• test main first - build stubs
• replace each stub with a module
• conduct the new tests and run the old tests
• a newly added module might fail previously successful test cases; the fault might lie either in the new module or in the interface between the new module and the rest of the product
Integration Testing: Top Down Approach
• Module hierarchy (figure): A calls B, C, and D; B calls E; C calls F; D calls G; E calls H; F calls I; G calls J and K; J calls L and M
• Breadth first order: A B C D E F G H I J K L M
• Depth first order: A B E H C F I D G J L M K
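A minimal sketch of the top-down idea, using invented modules: the top module is first tested against stubs, then a stub is swapped for the real module and the old tests are rerun:

```python
# Hypothetical top-level module A delegating to subordinate modules B and C
def module_a(x, b, c):
    return b(x) + c(x)

# Stubs used while the real B and C are not yet integrated
def stub_b(x):
    return 1

def stub_c(x):
    return 2

# Real module B, integrated in the next step (it replaces stub_b)
def real_b(x):
    return x * x

def run_tests(b, c):
    # The "old" test: A must still combine its subordinates' results correctly
    assert module_a(3, b, c) == b(3) + c(3)

run_tests(stub_b, stub_c)   # step 1: test A against stubs only
run_tests(real_b, stub_c)   # step 2: stub B replaced by the real B; old tests rerun
print("top-down integration steps passed")
```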
Integration Testing: Bottom Up Approach
• lower level modules are combined into builds or clusters
• develop a driver for a cluster
• test the cluster
• replace each driver with the actual module next higher in the hierarchy
Integration Testing: Bottom-Up Approach
• Module hierarchy (figure): the same as in the top-down example
• Build 1: H, E, B
• Build 2: I, F, C, D
• Build 3: L, M, J, K, G
• then integrate Builds 1, 2, and 3 with module A
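As an illustrative sketch (the arithmetic inside the modules is invented), a bottom-up step tests a cluster such as Build 1 through a throwaway driver:

```python
# Hypothetical low-level cluster corresponding to Build 1: H, E, B
def module_h(n):
    return n + 1

def module_e(n):
    return module_h(n) * 2

def module_b(n):
    return module_e(n) - 1

def cluster_driver():
    # Throwaway driver standing in for module A, which is not integrated yet
    assert module_h(1) == 2
    assert module_e(1) == 4
    assert module_b(1) == 3

cluster_driver()
print("Build 1 (H, E, B) passes its cluster tests")
```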
Sandwich Approach
• combine top down and bottom up
• consider
– A,B,C,D,G, and J are logic modules ==> top down
– E,F,H,I,K, and L are functional modules ==> bottom up
• when all modules have been appropriately integrated, the
interfaces between the 2 groups are tested one by one
Steps for Integration Testing
• All modules should be unit tested
• Choose integration testing strategy
• Apply white-box (WB) and black-box (BB) techniques; test input/output parameters
• Exercise all modules and all calls
• Keep records (test results, test activities, faults)
System Testing
• starts after integration testing
• ends when
– we have successfully determined system capabilities
– we have identified and corrected known problems
– we are confident that the system is ready for acceptance
Components of System Testing
• Requirement-based functional tests
• Performance Capabilities
• Stress or Volume tests
• Security Testing
• Recovery Testing
• Quality attributes - reliability, maintainability, integrity
Requirement-Based System Test
• to demonstrate that all functions are available
• test cases derived from requirements
• exercise all functions, classes of output, and system status
• all valid input data is accepted
• all invalid input data is rejected without system failure
• test interfaces to other systems
Requirement-Based System Test
• look for systematic coverage: use a functional coverage
matrix
• this coverage matrix differs from the one used in unit testing: we are now planning the testing for a group of programs instead of a single one (a small sketch follows)
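One lightweight way to record such a matrix is a mapping from each requirement to the system test cases that cover it; the identifiers below are made up:

```python
# Hypothetical functional coverage matrix: requirement -> system-level test cases covering it
coverage = {
    "REQ-01 user login":   ["ST-01", "ST-02"],
    "REQ-02 place order":  ["ST-03"],
    "REQ-03 issue refund": [],            # gap: no system test planned yet
}

uncovered = [req for req, tests in coverage.items() if not tests]
print("requirements without a system test:", uncovered)
```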
Performance capability tests
• examine performance limits and show that performance
objectives are met
• evaluate parameters such as: response time, memory requirements, run-time requirements, and file sizes
• need a source of transactions, a test lab, and instrumentation - hardware or software monitors
• look for hardware or software units that limit performance
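A rough sketch of checking a response-time objective; the 0.5-second budget and the transaction stand-in are assumptions, not from the slides:

```python
import time

def process_transaction():
    # Stand-in for a real transaction; replace with a call into the system under test
    time.sleep(0.01)

def measure(n=100, budget_seconds=0.5):
    worst = 0.0
    for _ in range(n):
        start = time.perf_counter()
        process_transaction()
        worst = max(worst, time.perf_counter() - start)
    print(f"worst response time over {n} transactions: {worst:.3f}s")
    assert worst <= budget_seconds, "performance objective not met"

measure()
```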
Stress or Volume Testing
• is designed to confront the system with abnormal situations
• drive the system to its limit and determine whether it breaks
down
• first test to specification, then break and analyze
• determine how many transactions or records can be
operationally supported
• demand resources in abnormal quantity, frequency, and volume
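A sketch of the "drive it to its limit" idea: keep doubling the volume until the system breaks; the fixed-capacity queue stands in for whatever real resource limits the system:

```python
# Hypothetical system whose limiting resource is a fixed-capacity queue
CAPACITY = 10_000

def submit_batch(queue, n):
    if len(queue) + n > CAPACITY:
        raise OverflowError("queue capacity exceeded")
    queue.extend(range(n))

queue, volume = [], 1_000
while True:
    try:
        submit_batch(queue, volume)
        volume *= 2            # double the load on every round
    except OverflowError:
        print(f"system broke down after ~{len(queue)} queued transactions")
        break
```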
Recovery and Security Testing
• Recovery Testing:
– to confirm that the system with switchover capabilities resumes
processing without transaction compromise
– look for transaction fidelity
– can the system recover from situations that cause it to crash?
• Security Testing
– test the security aspects of the system
Acceptance Testing
• to provide clients/users with confidence and to ensure that the software is ready for use
• begins when system test is complete
• test cases are subset of the system test
• acceptance tests are based on functionality and
performance requirements
• use a typical day's transactions, or a month or year of operation
Acceptance Testing (cont.)
• is usually complete when the client is satisfied
• is formal and held to a predefined schedule and duration
• we need to test for a replacement system
• acceptance tests are run for a customized software product, but if the software is developed for many users, alpha and beta tests are required
Requirements for Acceptance Testing
• tests must be run on operational hardware and software
• tests must stress the system significantly
• all interfaced systems must be in operation throughout the
test
• the test duration should run a full cycle
• tests should exercise the system over a full range of inputs
• all major functions and interfaces should be exercised
• if the entire test run cannot be completed, the new run should restart from the beginning
Alpha Test
• conducted at the developer's site
• users are invited
• developers interact with the users
• errors and usage problems are recorded
Beta Test
• conducted at the customer's site by end users
• the developer is usually not present
• users record and report problems; developers fix them and then release
Regression Test
• involves retesting the software after changes have been made to ensure that its basic functionality has not been affected by the changes
• ensures that no new errors have been introduced
• involves rerunning old test cases
• automation is necessary since this process can be very time consuming (a sketch follows below)
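A sketch of keeping old test cases as data and rerunning them all after each change; the add function and its cases are purely illustrative:

```python
# Old test cases kept as (input, expected) pairs so they can be rerun after every change
REGRESSION_SUITE = [
    ((2, 3), 5),
    ((0, 0), 0),
    ((-1, 1), 0),
]

def add(a, b):
    # The function that was just modified
    return a + b

def run_regression():
    failures = [(args, expected, add(*args))
                for args, expected in REGRESSION_SUITE if add(*args) != expected]
    if failures:
        raise AssertionError(f"regression failures: {failures}")
    print(f"all {len(REGRESSION_SUITE)} regression cases still pass")

run_regression()
```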
Debugging
• occurs when tests reveal faults or defects
• is the programmer's responsibility
• purpose of debugging
– locate the fault (find its cause and how to prevent it)
– correct it
– retest to ensure that the bug has been removed and no new ones introduced
Bug Consequences
• Mild - misspelled output, lack of white space
• Moderate - output may be misleading or redundant
• Annoying - users need tricks to get the system to work
• Disturbing - the system refuses to handle legitimate transactions
• Serious - the system loses track of a transaction and its occurrence
• Very serious - the bug causes the system to perform an incorrect transaction
Bug Consequences (cont.)
• Extreme - the problem is not limited to a few users or a few transaction types; it is frequent and arbitrary
• Intolerable - long-term, unrecoverable corruption of the database; system shutdown may be needed
• Catastrophic - the system fails
• Infectious - corrupts other systems; may cause loss of life