5 AYTech Software Testing Material


Different Types of Web Testing

Cloud Testing
• Principles of cloud computing – "on-demand availability", "as a service" and "pay for the usage".

• Cloud testing refers to testing of resources, such as hardware and software, that are made available on demand.
Software Test Life Cycle (STLC)

Overview of the stages of STLC


Requirement Analysis

[Diagram] A Business Requirement generates one or more Test Requirements; each Test Requirement generates one or more Test Scenarios/Cases; Test Cases are executed/run via Test Procedures/Scripts.
• Activities

o Identify types of tests to be performed.

o Gather details about testing priorities and focus.

o Prepare Requirement Traceability Matrix (RTM).

o Identify test environment details where testing is supposed to be carried out.

o Automation feasibility analysis (if required).

• Deliverables

o RTM

o Automation feasibility report. (if applicable)


Requirement Traceability Matrix
• Requirement Traceability Matrix (RTM) is a document that maps and traces user requirements to test cases.

• The main purpose of the Requirement Traceability Matrix is to verify that every requirement is covered by at least one test case so that no functionality is missed during testing.

• Types of Traceability Matrix

o Forward traceability - It maps requirements to test cases.

o Backward or reverse traceability - It maps test cases to requirements.

o Bi-directional traceability – Forward + Backward.

Advantages of RTM

• RTM helps to measure the percentage of test coverage (a minimal sketch of computing this follows below).

• It highlights any missing requirements or document inconsistencies.

• It shows the overall defect or execution status with a focus on business requirements.
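A minimal sketch (in Python, using hypothetical requirement and test case IDs) of how an RTM can be represented and how the requirement coverage percentage might be computed from it:

# Hypothetical RTM: each requirement ID maps to the test cases that cover it.
rtm = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],            # not yet covered -> flagged as a gap
}

covered = [req for req, tcs in rtm.items() if tcs]
coverage_pct = 100 * len(covered) / len(rtm)

print(f"Requirement coverage: {coverage_pct:.0f}%")              # 67%
print("Uncovered requirements:", [r for r in rtm if not rtm[r]])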

Test Planning
• Test Planning is one of the most important phases of the software test life cycle; the overall testing strategy is defined here.

• This phase is also called the Test Strategy phase.

• A senior QA manager determines effort and cost estimates for the project and prepares and finalizes the Test Plan.
• Activities

o Preparation of test plan/strategy document for various types of testing

o Test tool selection

o Test effort estimation

o Resource planning and determining roles and responsibilities

o Training requirements

• Deliverables

o Test plan/strategy document.

o Effort estimation document.

Test Plan Types:

o Master Test Plan: A single high-level test plan for a project/product that unifies all other test plans.

o Testing Level Specific Test Plans: Plans for each level of testing.

o Testing Type Specific Test Plans: Plans for major types of testing like
Performance Test Plan and Security Test Plan.

o Micro Test Plan: A major project/product has multiple releases, so the activities for each such release need to be planned in detail.

Test Plan Document

1. Analyze the product

2. Develop Test Strategy


3. Define Test Objective

4. Define Test Criteria

5. Resource Planning

6. Plan Test Environment

7. Schedule & Estimation

8. Test Deliverables

• Test Scope

o The components of the system to be tested (hardware, software, middleware, etc.) are defined as "in scope".

o The components of the system that will not be tested also need to be clearly defined as being "out of scope".

• Test strategy: A test strategy is a set of guidelines that explains test design and determines how testing needs to be done.

o It sets the standard for testing processes, activities, and other documents.

o For smaller projects, the test strategy is included inside the test plan.

o For larger projects, there is one test strategy document and a number of different test plans for each phase or level of testing.

• Test Approach: A test approach is the implementation of the test strategy for a project; it defines how testing will be carried out.

o Test approach has two techniques:


▪ Proactive - An approach in which the test design process is
initiated as early as possible in order to find and fix the
defects before the build is created.

▪ Reactive - An approach in which testing is not started until design and coding are completed.

• Test Objective: The objective of testing is to find as many software defects as possible and to ensure that the software under test is bug-free before release.

• Assumptions: Assumptions document the prerequisites which, if not met, could have a negative impact on the test.

o Examples of assumptions include:

▪ Skill level of test resources.

▪ Test budget.

▪ State of the application at the start of testing.

▪ Tools available.

▪ Availability of test equipment.

▪ Test criteria for each stage.

Test Criteria

Entry Criteria: The minimum set of conditions that should be met in order to start the testing activities.

Exit Criteria: The minimum set of conditions that should be met in order to close the testing activities.

Suspension Criteria: Suspension criteria specify the criteria to be used to suspend all or a portion of the testing activities. It assumes that testing cannot go forward and that going backward is also not possible.

Resumption Criteria: Resumption criteria specify when testing can resume after it has been suspended.
Risk Analysis: Identify the test risks and their possible impact on the test effort.

Risks that could impact testing include:

Availability of downstream application test resources to perform the system integration or regression testing.

Implementation of a new test tool.

Sequence and increments of code delivery.

New technology.

Risk mitigation plans are drawn up for the identified risks.

Test Design:

The types of tests that must be conducted.

The different testing levels that are required.

Resource Planning: A resource plan is a detailed summary of all types of resources required to complete the project tasks. Resources could be the people, equipment, and materials needed to complete a project.

Take the right mix of experience to ensure team balance, and have resources with multiple skills as needed.

Test schedule: A test schedule includes the testing steps or tasks, the target
start and end dates, and responsibilities. It should also describe how the test
will be reviewed, tracked, and approved.

Test Milestones: Test Milestones are designed to indicate the start and
completion date of each test.

Test Estimation: In this phase, break the whole project down into small tasks and add an estimate for each task (a minimal sketch follows).
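A minimal sketch (with hypothetical task names and person-day figures) of breaking the testing work into small tasks and rolling up the estimate with an assumed contingency buffer:

# Hypothetical work breakdown with effort in person-days.
tasks = {
    "Review requirements": 2,
    "Write test cases": 5,
    "Prepare test data": 2,
    "Execute functional tests": 8,
    "Regression cycle": 4,
    "Defect retest and closure": 3,
}

base_effort = sum(tasks.values())
contingency = 0.15                       # assumed 15% buffer for risks
total_effort = base_effort * (1 + contingency)

print(f"Base effort: {base_effort} person-days")
print(f"Total with contingency: {total_effort:.1f} person-days")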

Test Data Management:

Test data is actually the input given to a software program.


Identify common test data elements.

Prioritization and allocation of test data.

Generating reports and dashboards for metrics.

Creating and implementing business rules.

Building an automation suite for master data preparation.

Masking, archiving, versioning and aging of data.

Data security issues.
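A minimal sketch (assuming hypothetical customer records) of masking sensitive test data before it is loaded into a test environment, one of the test data management activities listed above:

import hashlib

def mask_email(email: str) -> str:
    # Replace the local part with a short hash so records stay unique but anonymous.
    local, _, domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{digest}@{domain}"

records = [
    {"name": "Jane Doe", "email": "jane.doe@example.com"},
    {"name": "John Roe", "email": "john.roe@example.com"},
]

masked = [{**r, "name": "MASKED", "email": mask_email(r["email"])} for r in records]
print(masked)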

Test Environment: A testing environment is a setup of software and hardware on which the testing team is going to execute test cases. The test environment consists of the real business and user environment, as well as physical environments such as servers and the front-end running environment.

Defect Management process:

A defect is an observed difference between the expectation or prediction and the actual outcome of a test.

The defect tracking mechanisms to be used are also defined here.

• Communication Approach: Various communication mechanisms are used:

o Formal and informal meetings.

o Working sessions.

o Processes, such as defect tracking.

o Techniques such as escalation procedures or the use of white boards for posting.

o Current state of testing.


o Project contact list, meeting audience and frequency of defect
reporting.

• Tools used: Any tools that will be needed to support the testing process
should be included here.

Test Plan Maintenance:


o The test plan is not static; it is updated on an on-demand basis.

o As the project advances there may be changes to, or additional clarity about, the requirements, and the plan needs to accommodate that.

o The test plan is a living document and is therefore maintained throughout the project.

o E.g.: if we are procuring a tool like LoadRunner and it is not ready by the time the project needs it, what is the alternative?

Training in Test Planning Phase

• Identify training that is necessary to provide those skills, if not already acquired.

• Training is given to the tester according to need:

o Training on the application/system.

o Training for any test tools to be used.

o Training about organization or testing process.

o Training on the technology used in the project.

o Training can be classroom or on the job or offline.

Risk Management
• Risk is an event that, if it occurs, adversely affects the ability of a project to
achieve its outcome objectives.
• Risk management is the process of identifying risk, assessing risk, and
taking steps to reduce risk to an acceptable level.

• The generic process for Risk Management involves

o Risk Identification

o Risk Impact or Consequence Assessment

o Risk Prioritization

o Risk Mitigation Planning

Risks during Test Planning

• The following is a sample list of risks that might be listed during the test planning phase.

• The testing schedule is tight. If the start of testing is delayed due to design tasks, the test window cannot be extended beyond the scheduled UAT start date.

• Not enough resources, or resources onboarding too late (the process takes around 15 days).

• Defects are found at a late stage of the cycle or in a late cycle; defects discovered late are most likely due to unclear specifications and are time-consuming to resolve.

• Scope not defined or not completely defined.

• Natural disasters.

• Non-availability of an independent test environment and access to it.

• Delayed testing due to new issues.

• Risk Identification

o Risk identification is the critical first step of the risk management process. Its objective is the early and continuous identification of risks, including those internal and external to the engineering system project.

• Risk Impact or Consequence Assessment:

o In this step, an assessment is made of the impact each risk event could
have on the engineering system project.

o This includes how the event could impact cost, schedule, or technical
performance objectives.

o Additional criteria such as political or economic consequences may also require consideration.

o An assessment is made of the probability (chance) each risk event will occur.

Risk Based Testing

• Risk-based testing is testing that is planned and prioritized for the project based on risks.

• Risk based testing involves testing the functionality which has the highest
impact and probability of failure.

• Risk-based testing starts early in the project.

• Risk-based testing involves both mitigation and contingency.

• Risk-based testing also includes a measurement process that recognizes how well we are working at finding and removing faults in key areas.

• How to perform risk based testing?

o Make a prioritized list of risks (see the sketch after this list).

o Perform testing that explores each risk.

o As risks evaporate and new ones emerge, adjust your test effort to stay
focused on the current crop.
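A minimal sketch (with hypothetical risk areas, and likelihood and impact scored 1-5) of producing the prioritized risk list that drives risk-based testing:

# Risk exposure = likelihood x impact; test the highest-exposure areas first.
risks = [
    {"area": "Payment processing",   "likelihood": 4, "impact": 5},
    {"area": "Report export",        "likelihood": 2, "impact": 2},
    {"area": "Login/authentication", "likelihood": 3, "impact": 5},
]

for r in risks:
    r["exposure"] = r["likelihood"] * r["impact"]

for r in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{r["area"]}: exposure {r["exposure"]}')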
Functional Requirements
• The Functional Requirements Specification documents the operations and
activities that a system must be able to perform.

• Functional Requirements should include:

o Descriptions of data to be entered into the system.

o Descriptions of operations performed by each screen.

o Descriptions of work-flows performed by the system.

o Descriptions of system reports or other outputs.

o Who can enter the data into the system.

o How the system meets applicable regulatory requirements.

• The Functional Requirements Specification is designed to be read by a general audience.

Identify Test requirement based on FRD review

• FRS review is nothing but going through the functional requirements specification document and trying to understand what the target application is going to be like.

Prerequisites to prepare test scenarios:


o The correct version of the SRS document.

o Clear instructions on who is going to work on what and how much time they have got.

o A template to create test scenarios.

o Other information on who to contact in case of a question or who to report to in case of a documentation inconsistency.
Test Scenarios
• Test scenarios provide one-line information about what to test.

• A test scenario is also called a Test Condition or Test Possibility.

• Test scenarios are not external deliverables (not shared with Business Analysts or Dev teams) but are important for internal QA consumption.

• Good test coverage can be achieved by dividing the application into test scenarios, and this reduces the repeatability and complexity of the product.

• Test scenarios are more important when the time to write test cases is not sufficient and team members agree to work from detailed one-liner scenarios.

Test Case Development


• This phase involves creation, verification and rework of test cases & test
scripts.

• Test data is identified/created, reviewed, and then reworked as well.

• Activities

o Create test cases, automation scripts. (if applicable)

o Review and baseline test cases and scripts.


o Create test data. (If Test Environment is available)

• Deliverables

o Test cases/scripts

o Test data

Test Case Document


• Fields in a test case document (an illustrative example follows the list):

o Test case id

o Test Summary/Scenario: Description of the test (what is to be verified?)

o Pre-Condition

o Test Steps: Steps to be executed

o Test data: Input data

o Expected result

o Actual result

o Status: Pass/Fail

o Defect ID
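An illustrative (hypothetical) test case record using the fields above, sketched in Python:

test_case = {
    "Test case ID": "TC-101",
    "Test Summary": "Verify login with valid credentials",
    "Pre-Condition": "User account exists and is active",
    "Test Steps": ["Open login page", "Enter username and password", "Click Login"],
    "Test Data": {"username": "qa_user", "password": "********"},
    "Expected Result": "User is redirected to the home page",
    "Actual Result": "",      # filled in during execution
    "Status": "",             # Pass/Fail after execution
    "Defect ID": "",          # linked only if the test fails
}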

Test Case Optimization Techniques

• Boundary Value analysis

• Equivalence partitioning

• Error guessing

• Decision tables

• State Transition Diagrams


• Orthogonal Array Testing (OAT)

Decision tables

• All the validations specified in the decision boxes should be made as columns in the table.

• All the results mentioned in flow diagram should be covered in the decision
table.

• All combinations of inputs needed to obtain a certain result shall be mentioned in the combinations column and can be included while writing the test cases.

• After completing the decision table one has to just verify whether all the
branches and leaves in the logical tree are covered.

The general structure of a decision table (a worked example follows):

Conditions | Condition Alternatives
Actions    | Action Entries
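A minimal sketch (assuming a hypothetical login rule: access is granted only when both username and password are valid) of deriving test cases from a decision table by enumerating the condition alternatives:

from itertools import product

# Action entry for every combination of the two conditions.
def expected_action(username_valid: bool, password_valid: bool) -> str:
    return "Grant access" if (username_valid and password_valid) else "Show error message"

print("Rule | Username valid | Password valid | Action")
for rule_no, (u, p) in enumerate(product([True, False], repeat=2), start=1):
    print(f"R{rule_no}   | {u!s:14} | {p!s:14} | {expected_action(u, p)}")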

State transition diagrams:

• Identify a finite number of states the model execution goes through.

• Create a state transition diagram showing how the model transitions from one state to the other.

• Assess the model accuracy by analyzing the conditions under which a state change occurs.

• State transition: A transition between two states of a component or system.

• The minimal number of test cases to cover each state is two (a sketch follows).
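A minimal sketch (assuming a hypothetical user-account model with Active, Locked, and Closed states) that derives one test per valid transition from the transition table:

# Valid transitions: (current state, event) -> next state.
transitions = {
    ("Active", "3 failed logins"): "Locked",
    ("Locked", "admin unlock"):    "Active",
    ("Active", "close account"):   "Closed",
}

def apply(state: str, event: str) -> str:
    return transitions.get((state, event), state)   # invalid events leave the state unchanged

# One test case per valid transition (each state/event pair exercised once).
for (state, event), expected in transitions.items():
    actual = apply(state, event)
    assert actual == expected, f"{state} --{event}--> expected {expected}, got {actual}"
    print(f"PASS: {state} --{event}--> {actual}")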

Orthogonal Array Testing (OAT)


• Implementing the OATS technique involves the steps below:

o Identify the independent variables. These will be referred to as "Factors".

o Identify the values which each variable will take. These will be referred to as "Levels".

o Search for an orthogonal array that has all the factors from step 1 and
all the levels from step 2.

o Map the factors and levels with your requirement.

o Translate them into the suitable test cases.

o Look out for the leftover or special test cases, if any (an example mapping follows).
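A minimal sketch (with hypothetical browser/OS/connection factors, each with two levels) that maps an L4(2^3) orthogonal array onto the factors, reducing the 8 exhaustive combinations to 4 balanced test cases:

# L4 orthogonal array: 4 runs x 3 factors; every pair of levels appears together exactly once.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

factors = {
    "Browser":    ["Chrome", "Firefox"],
    "OS":         ["Windows", "Linux"],
    "Connection": ["WiFi", "Ethernet"],
}

names = list(factors)
for run, row in enumerate(L4, start=1):
    combo = {names[i]: factors[names[i]][level] for i, level in enumerate(row)}
    print(f"Test {run}: {combo}")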

Steps to prepare test cases so as to ensure maximum test coverage

• Use decision table test case design technique to attain 100% logical
coverage.

• Boundary value analysis and equivalence partitioning for covering various ranges of inputs (see the sketch after this list).

• Combinations and permutations for field level validations (though not all
permutations are required).

• Error guessing (apart from the errors that can be identified from the above
three steps) with experience as a final touch.
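A minimal sketch (assuming a hypothetical age field that accepts values from 18 to 60) of the input values produced by boundary value analysis and equivalence partitioning:

MIN_AGE, MAX_AGE = 18, 60

# Boundary value analysis: values at, just below, and just above each boundary.
boundary_values = [MIN_AGE - 1, MIN_AGE, MIN_AGE + 1, MAX_AGE - 1, MAX_AGE, MAX_AGE + 1]

# Equivalence partitioning: one representative value per class.
partitions = {
    "invalid: below range": 10,
    "valid: inside range":  35,
    "invalid: above range": 75,
}

print("Boundary values:", boundary_values)        # [17, 18, 19, 59, 60, 61]
for cls, value in partitions.items():
    print(f"{cls}: test with {value}")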
Defect Management

Defect Management Overview

• A defect can be defined as unexpected behavior of the software.

• The elimination of defects from the software depends upon the efficiency of
testing done on the software.

Logging Defect

• Writing a good bug report is a primary responsibility of any tester.

• If the tester does not report a bug correctly, the programmer will most likely reject the bug as irreproducible.

• Try to summarize the problem in as few words as possible, yet effectively.

• Do not combine multiple problems even if they seem to be similar.

• Good practices to write a bug report:

o Report the problem immediately.

o Reproduce the bug three times before writing the bug report.

o Test for the same bug occurrence on other similar modules.

o Write a good bug summary.

Defect Report/Bug Report


• A defect report consists of the following information (an illustrative example follows the list):
o Defect ID

o Defect Summary

o Test case ID: a reference to the failed test case

o Defect description

• Steps to Reproduce

• Expected Result

• Actual Result

o Defect Status

• Open/Assigned/In Progress/Resolved/Released for Testing/Closed/Reopen/Deferred/Cannot Reproduce/Duplicate/Not a Bug

o Defect Severity: Blocker/Critical/Major/Minor/Trivial

o Defect Priority: High/Medium/Low

o Reported by
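An illustrative (hypothetical) defect report using the fields above, sketched in Python:

defect_report = {
    "Defect ID": "BUG-2041",
    "Defect Summary": "Login button unresponsive after session timeout",
    "Test case ID": "TC-101",
    "Steps to Reproduce": [
        "Log in and stay idle until the session times out",
        "Click the Login button on the timeout page",
    ],
    "Expected Result": "User is redirected to the login page",
    "Actual Result": "Button does nothing; no error is shown",
    "Defect Status": "Open",
    "Defect Severity": "Major",
    "Defect Priority": "High",
    "Reported by": "QA tester",
}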

Test Cycle Closure


• The testing team will meet, discuss, and analyze the testing artifacts to identify strategies that have to be implemented in future cycles, taking lessons from the current test cycle.

• The idea is to remove process bottlenecks for future test cycles and share best practices for any similar projects in the future.

• Activities

o Evaluate cycle completion criteria based on time, test coverage, cost, software, critical business objectives, and quality.

o Prepare test metrics based on the above parameters (a sketch follows the deliverables).


o Document the learning out of the project.

o Prepare Test closure report.

o Qualitative and quantitative reporting of the quality of the work product to the customer.

o Test result analysis to find out the defect distribution by type and
severity.

• Deliverables

o Test Closure report

o Test metrics
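A minimal sketch (with hypothetical execution figures) of the kind of test metrics prepared at closure, such as execution rate, pass rate, and defect density:

planned, executed, passed = 120, 115, 103
defects_found, size_kloc = 34, 12.5          # assumed product size in KLOC for defect density

execution_rate = 100 * executed / planned
pass_rate = 100 * passed / executed
defect_density = defects_found / size_kloc

print(f"Execution rate: {execution_rate:.1f}%")   # 95.8%
print(f"Pass rate: {pass_rate:.1f}%")             # 89.6%
print(f"Defect density: {defect_density:.2f} defects/KLOC")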

Test Reports
• Daily status reports

o What did you do today? (E.g., how many test cases were planned vs. executed.)

o What are you planning to do tomorrow?

o Did you face any issues during your day? If yes, how did you resolve
them or are they still open?

o Do you need any inputs for tomorrow? If yes, from whom and what
are they?

Defect Priority and Severity


• High Priority & High Severity: An error which occurs on basic functionality
of the application and will not allow the user to use the system.

• High Priority & Low Severity: The spelling mistakes that happens on the
cover page or heading or title of an application.
• High Severity & Low Priority: An error which occurs on the functionality of
an application and will not allow the user to use the system but that
functionality is rarely used by the end user.

• Low Priority & Low Severity: Any cosmetic or spelling issues which is
within a paragraph or in the report (Not on cover page / title / heading).

Defect Life Cycle / Bug Life Cycle


User Acceptance Test

What is UAT?

• User Acceptance Testing (UAT) is a phase of software development in which the software is tested in the "real world" by the intended audience.

• UAT is also called beta testing, application testing, or end-user testing.

• UAT is the last phase of the software testing process.

• This is typically the last step before the product goes live or before the
delivery of the product is accepted.

• During UAT, actual software users test the software to make sure it can
handle required tasks in real-world scenarios, according to specifications.
Need for UAT

Acceptance Criteria for UAT


• Acceptance criteria are defined on the basis of the following attributes:

o Functional Correctness and Completeness

o Data Integrity

o Data Conversion

o Usability

o Performance

o Timeliness

o Confidentiality and Availability

o Installability and Upgradability

o Scalability
o Documentation

UAT Process

UAT Planning
• The process is almost the same as with the regular test plan.

• The acceptance test activities are carried out in phases.

• First, the basic tests are executed; if the test results are satisfactory, then the more complex scenarios are executed.

• The Acceptance test plan has the following attributes:

o Introduction

o Acceptance Test Category

o Operation Environment

o UAT Test case ID


o Test Title

o Test Objective

o Test Procedure

o Test Schedule

o Resources

UAT Execution
• UAT happens in a conference or war room where the users, PM, and QA team representatives all sit together for a day or two and work through all the acceptance test cases.

• The UAT team is generally a good representation of the real world end
users.

• This team executes the test cases and may additionally perform random tests relevant to them.

• Once all the tests are run and the results are in hand, the acceptance
decision is made.

Go / No-Go Decisions
• The acceptance decision is also called, more colloquially, the Go/No-Go decision.

• If the users are satisfied, it's a Go.

• If the users are not satisfied, it's a No-Go.

• Reaching the acceptance decision typically marks the end of the UAT phase.

• If the customer is satisfied, they will sign off on the UAT, which means approval to go live.
