Agenda
1. Introduction to Testing
2. SDLC Models
3. Types of Testing
4. Testing Flow
5. Test Design Techniques
6. STLC
7. Defect Tracking
8. Automation Testing
9. Configuration Management
What is Software Testing?
Software testing is the process of evaluating a software application with the intent of checking whether it works as per the requirements.
Why is Software Testing required?
To ensure the quality of an application.
When does the Testing phase start?
Testing starts early, from the requirement collection stage itself.
Software Development Life Cycle (SDLC)
It is the step-by-step process which explains how to develop a software/application.
SDLC Models:
Waterfall Model
V Model
Etc.
Waterfall Model
Requirement Collection → Design → Development → Testing → Maintenance
It is the basic sequential model
Applications:
Used in small applications
Used in low-complexity projects (applications that do not tend to change often)
Advantages:
Low cost (since developers are involved in testing most of the time)
Suitable for small, low-complexity applications
Disadvantages:
Time consuming
Fixing bugs costs comparatively more
V Model (Verification & Validation)
[V-model diagram: the left (Verification) arm — Requirement Collection (RC), Design, Coding — is carried out by developers; the right (Validation) arm — Unit Testing, FT, IT, ST, UAT — is carried out by testers, with each testing level mapped to its corresponding development phase.]
Applications:
Used in large applications
Used in high-complexity projects (applications that tend to change often)
Advantages:
Cost of fixing bugs is less
Disadvantages:
Total investment is more
Verification & Validation
Verification: Are we building the product right?
Validation: Are we building the right product?
White Box (Unit) Testing: Knowledge of the internal program design and code is required. Tests are based on coverage of code statements, branches, and paths.
Black Box Testing: Knowledge of the internal program design and code is not required. Tests are based on requirements and functionality.
TYPES OF BLACK BOX TESTING
Smoke Testing
Functional Testing
Integration Testing
Regression Testing
System Testing
Acceptance Testing
Exploratory Testing
Smoke Testing
Testing the basic or critical features of an application before doing rigorous testing
To ensure that the product is stable
Functional Testing
Testing each and every component rigorously according to the requirement specification
Integration Testing
Testing the data flow or interfaces between modules.
The tester should know the product very well, should identify all the possible scenarios, and should prioritize which scenarios should be tested first.
Regression Testing
Re-execution of the same test cases on a different build or release to ensure that changes (i.e. adding/modifying/deleting modules or defect fixes) have not introduced defects into unchanged features.
System Testing
It is end-to-end testing in which the testing environment closely resembles the customer environment (where the real business runs, i.e. the live environment).
Acceptance Testing
Acceptance testing is designed to determine whether the software is fit for use. UAT helps determine whether a software system satisfies its acceptance criteria and enables the buyer to decide whether or not to accept the system.
Alpha Testing
Alpha testing is done before the release of a product to check whether it is functioning properly or not.
Beta Testing
Beta testing is done when the product is given to end users. They use it, and if they find any defects, they report them back to the developers. This is done before the final release of the product.
Compatibility Testing
Testing the application on different software and hardware environments
Hardware Compatibility
Processor: make (Intel, AMD) and speed (32-bit, 64-bit)
RAM: make (Samsung, Transcend) and size (1 GB, 2 GB, 4 GB)
Motherboard (e.g. Mercury)
Graphics cards
Software Compatibility
Different OS (Windows XP, Windows 7, Vista, Linux, Mac OS)
Different browsers (IE, Firefox, Safari, with different versions)
Performance Testing
Testing the stability and response time of an application by applying load (a small load-test sketch follows the list below)
Stability: the ability to withstand the desired number of users
Response time: the time taken to receive a response
Load: the number of users
Types of Performance Testing
Load testing
Stress testing
Volume testing
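As a concrete illustration, here is a minimal load-test sketch in Python. The URL is a placeholder and the third-party requests library is assumed to be installed; it fires a fixed number of concurrent virtual users and summarizes the response times.

```python
# Minimal load-test sketch (illustrative only; the URL is a placeholder).
# Requires the third-party "requests" library: pip install requests
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "http://example.com/"   # placeholder application under test
USERS = 20                    # simulated concurrent users (the "load")

def single_request(_):
    """One virtual user: send a request and return its status and response time."""
    start = time.time()
    response = requests.get(URL, timeout=30)
    elapsed = time.time() - start
    return response.status_code, elapsed

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=USERS) as pool:
        results = list(pool.map(single_request, range(USERS)))

    times = [t for _, t in results]
    errors = [code for code, _ in results if code >= 400]
    print(f"avg response time: {sum(times) / len(times):.3f}s, "
          f"max: {max(times):.3f}s, errors: {len(errors)}")
```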
Exploratory Testing
Testing the application without requirements/test cases
Flow of Testing
[Testing flow diagram: once the code has been developed, it is deployed to the testing environment as a build. The build undergoes smoke testing; if any issues are found, it is sent back to the developers to fix. If there are no issues, black box testing is carried out, and small changes come in as patches. Once there are no issues in system testing, the product goes through alpha testing and is released to the customer, where it undergoes beta testing. Any modifications or extra features added afterwards result in a new version.]
Build:
An executable code drop given by the developers to test the stability of the application in the testing environment. Once a new build comes, a smoke test is done in which the major functionalities are checked; if they work fine, black box testing is carried out, otherwise a new build is given by the developers after fixing the issues found in the old build.
Cycle: Developers deploy the build to the testing environment; if issues exist, test engineers escalate them to the developers, who fix the issues and deploy a new build. That complete rotation is known as one cycle.
Release: the complete cycle from the RC stage to the product-ready stage is known as a release.
Before the release, the application/product undergoes alpha testing.
During the release, the product is delivered to the customer/client.
After the release, the application/product undergoes beta testing at the customer/client site, and it then goes live.
One release has a number of builds/cycles.
[Release diagram: RC → Development → Testing → Alpha Testing → Release to the Client → LIVE]
Version: If, after a release, any modifications are made or extra features are added, those changes go into the next release, which is called a version.
One version has a number of releases; the completion of one project consists of a number of versions.
TEST DESIGN TECHNIQUES
Error Guessing
Equivalence Partition
Boundary Value Analysis
Equivalence Partition
Input data is divided into different equivalence data classes
The input can typically be classified into one valid and two invalid classes, with one test case per class. This method is used to reduce the total number of test cases to a finite, manageable set while still covering the maximum requirements (see the sketch below).
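As an illustration, here is a minimal Python sketch of equivalence partitioning for a hypothetical input field that accepts ages 18–60 (the field and range are assumptions for the example): one representative value is picked from the valid class and one from each invalid class.

```python
# Equivalence partitioning sketch for a hypothetical "age" field (valid range 18-60).
# One valid class and two invalid classes, one representative value per class.

VALID_RANGE = (18, 60)  # assumed requirement for the example

def is_valid_age(age: int) -> bool:
    """Toy implementation under test: accepts ages inside the valid range."""
    return VALID_RANGE[0] <= age <= VALID_RANGE[1]

# One representative test value per equivalence class.
partitions = {
    "invalid: below range": (10, False),
    "valid: inside range":  (35, True),
    "invalid: above range": (75, False),
}

for name, (value, expected) in partitions.items():
    actual = is_valid_age(value)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status} - {name}: age={value}, expected={expected}, actual={actual}")
```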
BOUNDARY VALUE ANALYSIS
Used to identify errors at the boundaries of the input domain rather than at its centre. It is widely recognized that input values at the extreme ends of the input domain cause more errors in the system; more application errors occur at the boundaries of the input domain. Boundary value analysis is often done as part of stress and negative testing. For a boundary value n, one tests n-1, n and n+1 (a sketch follows below).
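A minimal sketch, again assuming the hypothetical 18–60 age field from the previous example: the boundary triplet n-1, n, n+1 is generated for both the lower and the upper boundary.

```python
# Boundary value analysis sketch for the same hypothetical age field (valid range 18-60).
# For each boundary n, test n-1, n and n+1.

MIN_AGE, MAX_AGE = 18, 60  # assumed requirement for the example

def is_valid_age(age: int) -> bool:
    """Toy implementation under test."""
    return MIN_AGE <= age <= MAX_AGE

def boundary_values(n: int) -> list[int]:
    """Return the classic boundary triplet n-1, n, n+1."""
    return [n - 1, n, n + 1]

for boundary in (MIN_AGE, MAX_AGE):
    for value in boundary_values(boundary):
        expected = MIN_AGE <= value <= MAX_AGE
        actual = is_valid_age(value)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status} - boundary {boundary}: age={value}, "
              f"expected={expected}, actual={actual}")
```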
Software Testing Life Cycle (STLC)
It is the step-by-step process which explains how the testing process proceeds while developing the application.
Test Strategy
Test Planning
Test Case Development
Test Execution
Test Summary
Defect Tracking
Test Plan Format
Test Case Id
Reference
Introduction
Resource Requirements
Scope
Approach
Test Deliverables
Entry & Exit Criteria
Dependencies/Risks
Responsibilities
Test Case Template
Test Case ID
Test Case Title
Test Case Description
Preconditions
Expected Result
Actual Result
Status (Pass/Fail)
Smoke Test Template
Columns: No | Req ID | Project ID | URL | Login Details | Description | Environment | Reproducible | Detected By
Test Case Template
Header fields: Title | Test Case Id/Requirement No | Details
Sign-off: Author / Reviewed By / Approved By (with Name and Last Modified Date for each)
Columns: Test Case ID | Test Case Name | Test Case Description | Step | Test Steps | Expected | Actual | Test Status (P/F)
Sample Test Case: HOME PAGE
Test URL: www.qatest.co.in/rail
Preconditions: Open a web browser and enter the given URL in the address bar. The home page must be displayed. All test cases must be executed from this page.
(The Actual Result and Test Status (P/F) columns are filled in during execution.)

Test Case ID: Login01
Test Case Name: Validate Login
Description: To verify that the login name on the login page must be greater than 3 characters
Test Steps:
1. Enter a login name of less than 3 characters (say "a") and a password, and click the Submit button
2. Enter a login name of less than 3 characters (say "ab") and a password, and click the Submit button
3. Enter a login name of 3 characters (say "abc") and a password, and click the Submit button
Expected:
1. An error message "Login not less than 3 characters" must be displayed
2. An error message "Login not less than 3 characters" must be displayed
3. Login successful, or an error message "Invalid Login or Password" must be displayed

Test Case ID: Login02
Test Case Name: Validate Login
Description: To verify that the login name on the login page should not be greater than 10 characters
Test Steps:
1. Enter a login name of greater than 10 characters (say "abcdefghijk") and a password, and click the Submit button
2. Enter a login name of less than 10 characters (say "abcdef") and a password, and click the Submit button
Expected:
1. An error message "Login not greater than 10 characters" must be displayed
2. Login successful, or an error message "Invalid Login or Password" must be displayed

Test Case ID: Login03
Test Case Name: Validate Login
Description: To verify that the login name on the login page does not take special characters
Test Steps:
1. Enter a login name starting with a special character ("!hello") and a password, and click the Submit button
2. Enter a login name ending with a special character ("hello$") and a password, and click the Submit button
3. Enter a login name with a special character in the middle ("he&^llo") and a password, and click the Submit button
Expected:
1-3. An error message "Special chars not allowed in login" must be displayed

Test Case ID: Pwd01
Test Case Name: Validate Password
Description: To verify that the password on the login page must be greater than 6 characters
Test Steps:
1. Enter a password of less than 6 characters (say "a") and a login name, and click the Submit button
2. Enter a password of 6 characters (say "abcdef") and a login name, and click the Submit button
Expected:
1. An error message "Password not less than 6 characters" must be displayed
2. Login successful, or an error message "Invalid Login or Password" must be displayed

Test Case ID: Pwd02
Test Case Name: Validate Password
Description: To verify that the password on the login page must be less than 10 characters
Test Steps:
1. Enter a password of greater than 10 characters and a login name, and click the Submit button
2. Enter a password of less than 10 characters (say "abcdefghi") and a login name, and click the Submit button
Expected:
1. An error message "Password not greater than 10 characters" must be displayed
2. Login successful, or an error message "Invalid Login or Password" must be displayed
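To show how such manual test cases map to automated checks, here is a minimal pytest sketch mirroring Login01–Login03. The validate_login_name function is a hypothetical stand-in for the application's real validation; in practice the steps would be driven through the UI or an API instead.

```python
# Minimal pytest sketch mirroring test cases Login01-Login03.
# validate_login_name is a hypothetical stand-in for the application's real validation.
import re

import pytest

def validate_login_name(name: str) -> str:
    """Toy validation: returns an error message, or "" if the login name is acceptable."""
    if len(name) < 3:
        return "Login not less than 3 characters"
    if len(name) > 10:
        return "Login not greater than 10 characters"
    if not re.fullmatch(r"[A-Za-z0-9]+", name):
        return "Special chars not allowed in login"
    return ""

@pytest.mark.parametrize("login_name, expected_error", [
    ("a",           "Login not less than 3 characters"),     # Login01, step 1
    ("ab",          "Login not less than 3 characters"),     # Login01, step 2
    ("abc",         ""),                                      # Login01, step 3
    ("abcdefghijk", "Login not greater than 10 characters"), # Login02, step 1
    ("abcdef",      ""),                                      # Login02, step 2
    ("!hello",      "Special chars not allowed in login"),   # Login03, step 1
    ("hello$",      "Special chars not allowed in login"),   # Login03, step 2
    ("he&^llo",     "Special chars not allowed in login"),   # Login03, step 3
])
def test_login_name_validation(login_name, expected_error):
    assert validate_login_name(login_name) == expected_error
```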
Traceability Matrix
Traceability Matrix: It is used to map the customer requirements to the prepared test cases.
It ensures that there is at least one test case for each requirement.
It is also used to keep track of passed/failed test cases.
A summary sheet maps each Requirement ID (RC1, RC2, ... RC9) to its number of test cases, along with the total number of requirements and the total number of test cases.

Req      Test Cases              Status
RC1      TC1, TC2, ... TC(n)     Pass
RC2      TC1, TC2, ... TC(n)     Fail
RC(n)    TC1, TC2, ... TC(n)     Not able to Test
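As a simple illustration, a traceability matrix can be represented as a mapping from requirement IDs to test cases. The sketch below uses made-up requirement and test-case IDs; it checks that every requirement has at least one test case and summarizes pass/fail status.

```python
# Traceability matrix sketch with made-up requirement and test case IDs.
# Maps each requirement to its test cases and their latest status.
traceability = {
    "RC1": {"TC1": "Pass", "TC2": "Pass"},
    "RC2": {"TC3": "Fail"},
    "RC3": {},  # no test case yet -> coverage gap
}

# Ensure at least one test case exists for each requirement.
uncovered = [req for req, tcs in traceability.items() if not tcs]
print("Requirements without test cases:", uncovered or "none")

# Keep track of passed/failed test cases per requirement.
for req, tcs in traceability.items():
    passed = sum(1 for status in tcs.values() if status == "Pass")
    failed = sum(1 for status in tcs.values() if status == "Fail")
    print(f"{req}: {len(tcs)} test case(s), {passed} passed, {failed} failed")
```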
Challenges for Testers
Domain knowledge
Time
Test data setup
Test environment setup
Unavailability of the right tools
Team at multiple locations
Developing a good relationship with developers
Bug: "Bug is an error in the software program"
Different Stages of Defect Tracking
1. New
2. Open
3. Assign
4. Test
5. Verified
6. Deferred
7. Reopened
8. Duplicate
9. Rejected
10. Closed
Defect Life Cycle
[Defect life cycle diagram: New → Open → Assign → Fixed → Verified → Closed, with alternative paths to Rejected, Duplicate, Deferred and Reopened.]
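The allowed status changes can be expressed as a small state machine. The sketch below is an assumption-based reading of the diagram above, not a prescribed workflow.

```python
# Defect life cycle as a simple state machine (transitions inferred from the diagram above).
ALLOWED_TRANSITIONS = {
    "New":       {"Open", "Rejected", "Duplicate", "Deferred"},
    "Open":      {"Assign"},
    "Assign":    {"Fixed", "Deferred"},
    "Fixed":     {"Verified", "Reopened"},
    "Reopened":  {"Assign"},
    "Verified":  {"Closed"},
    "Deferred":  {"Open"},
    "Rejected":  set(),
    "Duplicate": set(),
    "Closed":    set(),
}

def move(defect_status: str, new_status: str) -> str:
    """Return the new status if the transition is allowed, otherwise raise an error."""
    if new_status not in ALLOWED_TRANSITIONS[defect_status]:
        raise ValueError(f"Illegal transition: {defect_status} -> {new_status}")
    return new_status

# Example: a defect that is fixed, verified and closed.
status = "New"
for step in ("Open", "Assign", "Fixed", "Verified", "Closed"):
    status = move(status, step)
    print("Defect is now", status)
```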
Severity & Priority
Severity:
Defines how severely the bug impacts the application/product
Severity types: Critical, Major, Minor
Priority:
Defines how urgently the bug needs to be fixed, based on its impact on the business
Developers fix issues in priority order
Priority types: High, Medium, Low
Showstopper/Blocker: does not allow the user to proceed with further testing
Showstoppers are raised as high priority; no workaround is available
Critical/High: major functionality is missing
No workaround can be provided
Major/Medium: a particular feature is unable to function or malfunctions
Minor/Low: cosmetic defects which do not affect the functionality of the system
Defect Report Template
Title: Application crashes on clicking the SAVE button while creating a new user
Req ID: mention the module/feature in which the issue exists
Version Number: 5.0.1
Build Number: 5.1.10
Severity: HIGH (High/Medium/Low) or 1
Priority: HIGH (High/Medium/Low) or 1
Assigned To: Developer-X
Reported By: Your Name
Reported On: Date
Status: New/Open
Environment: Windows 2003/SQL Server 2005/Java 1.6 (OS/Browser/Language with version)
Login Details: Login/Password
Description: The application crashes on clicking the SAVE button while creating a new user; hence a new user cannot be created in the application.
Steps to Reproduce:
1) Log on to the application
2) Navigate to the Users menu > New User
3) Fill in all the user information fields
4) Click the Save button
5) An error page is shown: "ORA1090 Exception: Insert values Error"
Issue Exists in Production: Yes/No
Expected Result: On clicking the SAVE button, a success message "New User has been created successfully" should be displayed.
Actual Result: On clicking the SAVE button, the application crashed.
Screenshot Attached: Yes (attach the screenshot)
Automation Testing
Test automation is the use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions. Test automation tools can be expensive, so automation is usually employed in combination with manual testing. It can be made cost-effective in the longer term, especially when used repeatedly in regression testing.
There are two general approaches to test automation.
Code-driven testing (data-driven testing):
The public (usually) interfaces to classes, modules, or libraries are tested with a variety of input arguments to validate that the results returned are correct (a minimal sketch follows below).
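A minimal sketch of code-driven testing with Python's built-in unittest module, exercising a hypothetical discount-calculation function through its public interface with a variety of inputs.

```python
# Code-driven testing sketch using Python's built-in unittest module.
# calculate_discount is a hypothetical function standing in for the code under test.
import unittest

def calculate_discount(amount: float) -> float:
    """Toy business rule: 10% discount for orders of 1000 or more, otherwise none."""
    if amount < 0:
        raise ValueError("amount must not be negative")
    return amount * 0.10 if amount >= 1000 else 0.0

class TestCalculateDiscount(unittest.TestCase):
    def test_no_discount_below_threshold(self):
        self.assertEqual(calculate_discount(999), 0.0)

    def test_discount_at_threshold(self):
        self.assertEqual(calculate_discount(1000), 100.0)

    def test_negative_amount_rejected(self):
        with self.assertRaises(ValueError):
            calculate_discount(-1)

if __name__ == "__main__":
    unittest.main()
```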
Graphical User Interface (GUI) Testing:
A testing framework generates user interface events such as keystrokes and mouse clicks and observes the resulting changes in the user interface to validate that the observable behaviour of the program is correct (a minimal sketch follows below).
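A minimal GUI-testing sketch using Selenium WebDriver (a third-party tool; the URL and element names below are hypothetical placeholders for the application under test).

```python
# GUI test sketch using Selenium WebDriver (third-party: pip install selenium).
# The URL and element names are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a local Chrome setup
try:
    driver.get("http://example.com/login")                        # placeholder login page
    driver.find_element(By.NAME, "login").send_keys("abc")        # keystrokes
    driver.find_element(By.NAME, "password").send_keys("secret")
    driver.find_element(By.NAME, "submit").click()                # mouse click

    # Observe the resulting UI change and validate the observable behaviour.
    assert "Welcome" in driver.page_source, "expected the welcome page after login"
    print("GUI test passed")
finally:
    driver.quit()
```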
Automation Framework
An automation framework is not a tool to perform a specific task, but an infrastructure that provides a solution into which different tools can plug themselves and do their jobs in a unified manner. A framework is an integrated system that sets the rules for automating a specific product. The framework provides the basis of test automation and simplifies the automation effort. This system integrates function libraries, test data sources, object details and various reusable modules.
There are various types of frameworks (a data-driven sketch follows the list below):
Data-driven testing
Modularity-driven testing
Keyword-driven testing
Hybrid testing
Model-based testing
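A minimal data-driven sketch: the test data is kept outside the test logic (here, an in-memory CSV string read with Python's csv module; in practice it would come from a file or spreadsheet), and the same test routine runs once per data row. The login rule reused here is the hypothetical one from the sample test cases above.

```python
# Data-driven testing sketch: test data lives outside the test logic.
# Here the data is an in-memory CSV string; in practice it would be a file or spreadsheet.
import csv
import io

TEST_DATA = """login_name,expected_error
a,Login not less than 3 characters
abc,
abcdefghijk,Login not greater than 10 characters
"""

def validate_login_name(name: str) -> str:
    """Hypothetical validation rule reused from the sample test cases above."""
    if len(name) < 3:
        return "Login not less than 3 characters"
    if len(name) > 10:
        return "Login not greater than 10 characters"
    return ""

for row in csv.DictReader(io.StringIO(TEST_DATA)):
    expected = row["expected_error"] or ""
    actual = validate_login_name(row["login_name"])
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: login_name={row['login_name']!r}")
```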
Popular Test Automation Tools
Tool Name                          Company Name          Latest Version
HP QuickTest Professional          HP                    11.0
IBM Rational Functional Tester     IBM Rational          8.1.0.3
Parasoft SOAtest                   Parasoft              9.0
Rational Robot                     IBM Rational          2003
Selenium                           Open Source Tool      1.0.6
SilkTest                           Micro Focus           2010
TestComplete                       SmartBear Software    8.0
TestPartner                        Micro Focus           6.3
Visual Studio Test Professional    Microsoft             2010
WATIR                              Open Source Tool      1.6.5
Software configuration management (SCM)
It is the task of tracking and controlling changes in the software. Configuration management practices include revision control and the establishment of baselines. SCM concerns itself with answering the question "Somebody did something; how can one reproduce it?" Often the problem is not reproducing it identically, but reproducing it with controlled, incremental changes. Source configuration management is a related term often used to indicate that a variety of artefacts may be managed and versioned, including software code, hardware, documents, design models, and even the directory structure itself.
The goals of SCM are generally:
Configuration identification - identifying configurations, configuration items and baselines.
Configuration control - implementing a controlled change process. This is usually achieved by setting up a change control board whose primary function is to approve or reject all change requests that are sent against any baseline.
Configuration status accounting - recording and reporting all the necessary information on the status of the development process.
Teamwork - facilitating team interactions related to the process.
Configuration auditing - ensuring that configurations contain all their intended parts and are sound with respect to their specifying documents, including requirements, architectural specifications and user manuals.
Process management - ensuring adherence to the organization's development process.
Environment management - managing the software and hardware that host the system.
Build management - managing the process and tools used for builds.
Defect tracking - making sure every defect has traceability back to the source.