System-level Test Automation:
ensuring a good start
presented by Dorothy Graham and Chris Loder
© Dorothy Graham & Chris Loder 2018
chris.loder@ingenius.com
info@dorothygraham.co.uk
www.DorothyGraham.co.uk
1
System-level Test Automation:
ensuring a good start
Prepared and presented by
Dorothy Graham and Chris Loder
© Dorothy Graham and Chris Loder 2018
www.DorothyGraham.co.uk
info@dorothygraham.co.uk
Twitter: @DorothyGraham
www.ingenius.com
chris.loder@ingenius.com
Twitter: @AutomationChris
2
Tutorial description
• Many organizations invest a lot of effort in test automation at the system level
but then have serious problems later on. As a leader, how can you ensure that
your new automation efforts will get off to a good start? What can you do to
ensure that your automation work provides continuing value?
• This tutorial covers both “theory” and “practice”. Dot Graham explains the
critical issues for getting a good start, and Chris Loder describes his
experiences in getting good automation started at a number of companies.
• We cover the most important management issues for test automation
success, particularly when you are new to automation, and how to choose the
best approaches - no matter which automation tools you use.
• Focusing on system level testing, Dot and Chris explain how automation
affects staffing, who should be responsible for which automation tasks, how
managers can best support automation efforts to promote success, what you
can realistically expect in benefits and how to report them.
• They explain (for non-techies) the key technical issues that can make or break
your automation effort. Come away with your own clarified automation
objectives and a draft test automation strategy that you can use to plan your
own system level test automation.
3
Contents
Introduction to this tutorial
Test automation objectives
Responsibilities for automation tasks
Reporting and benefits
Other management issues
Technical issues / pitfalls
Conclusion
System-level Test Automation:
ensuring a good start
Twitter: @DorothyGraham & @AutomationChris
4
Objectives of this tutorial
• help you achieve better success in starting (or re-starting)
system-level functional automation
– independent of any particular tool
• theory and practice
– generic advice from Dot
– practical implementation advice from Chris
• mainly management plus key technical pitfalls
– including objectives, responsibilities, reporting
• help you plan an effective automation strategy
– Chris’s experience of what works
5
What is today about? (and not about)
• test execution automation (not other tools)
• main focus: system level functional test
automation
• test automation, not testing
• we will NOT cover:
– comparative tool information
– demos of tools (no time; which one would we pick?; visit the expo)
– (but see Chris afterwards for more information)
6
CL
• Connects leading phone systems and CRM platforms with computer telephony integration (CTI)
7
Chris
• grew up in a mining town in Labrador and got my first IT job there.
– started automation then and didn’t know it
• worked for Cognos/IBM, Halogen Software
and now InGenius Software
– written numerous automation frameworks
– used a LOT of tools
– have a patent pending for an algorithm used in an automation framework I wrote while at Halogen
CL
8
Dot
• first job: Bell Labs NJ, programmer put in test
group, wrote test execution and comparison
tools
• Ferranti UK, Police command & control
systems, developer but keen on testing
• independent, specialising in testing
– consultancy & training (Grove Consultants)
– UK BCS SIGIST, 1st EuroStar conference, helped
start ISTQB, many STAR conferences (since ‘92)
– attempted retirement in 2008
9
Shameless commercial plug
www.DorothyGraham.co.uk
info@dorothygraham.co.uk
• Software Test Automation, Part 1 (how to do automation): still relevant today, though we plan to update it at some point
• Foundations of Software Testing: ISTQB Certification, third edition, by Rex Black, Erik van Veenendaal and Dorothy Graham – updated for the ISTQB Foundation Syllabus 2011 and Glossary 2.1
• latest book (2012): Experiences of Test Automation
• TestAutomationPatterns.org (formerly TestAutomationPatterns.wikispaces.com)
10
About you
–other people here today
• may have similar problems to you
• may use the same tools or tools you want to know about
• may have ideas to help you
• you may have knowledge that will help them
–show of hands
• job role (tester, test manager, automator, other)
• automation experience
– none / < 6 months / 6 – 12 months / 1 – 2 yrs / > 2 yrs
• test execution or framework tools used
– (major ones)
11
The tutorial materials
–copies of slides
• please write your own comments / notes on the pages
–the handout pages
• your objectives for this tutorial (vs description)
• a one-page summary of today
– note key “take-aways” as we go through
• exercises: objectives, responsibilities, etc
• your automation strategy / action plan
–free book! (for one person)
12
Contents (next section: Test automation objectives)
13
Good objectives for automation?
– run regression tests evenings and weekends
– increase test coverage
– run tests that are tedious and error-prone if run manually
– gain confidence in the system
– reduce the number of defects found by users
14
Automation objectives exercise
• 1) a list of objectives
– are they good ones?
– are they already in place in your organisation?
Handout, page 2
15
Efficiency and effectiveness
(figure: a 2×2 grid – efficiency runs from manual testing to automated testing, effectiveness from low to high)
• low effectiveness, manual: poor slow testing (worst)
• low effectiveness, automated: poor fast testing (not good, but common)
• high effectiveness, manual: good slow testing (better)
• high effectiveness, automated: good fast testing (greatest benefit)
16
Reduce test execution time
(figure: effort per test cycle – edit tests (maintenance), set-up, execute, analyse failures, clear-up – compared for manual testing, the same tests automated, and more mature automation)
17
Automate x% of tests?
(figure: manual tests and automated tests only partly overlap)
• manual tests automated (the "% manual" figure)
• tests that shouldn't be automated
• tests not automated yet
• automated tests (& verification) not possible to do manually
• new approaches, e.g. monkey testing, HiVAT*
*High Volume Automated Testing – see http://kaner.com
18
What finds most bugs?
(figure: the likelihood of finding bugs is low for regression tests and high for exploratory testing – yet regression tests are what is most often automated)
19
Automation success = find lots of bugs?
• tests find bugs, not automation
• automation is a mechanism for running tests
• the bug-finding ability of a single test is not
affected by the manner in which it is executed
• this can be a dangerous objective
– especially for regression automation!
Automated tests Manual Scripted Exploratory Fix Verification
Experiences of Test Automation, Ch 27, p 503, Ed Allen & Brian Newman
Pattern: SET CLEAR GOALS (TestAutomationPatterns.org)
20
Experience stories
•unattended regression:
• check for new build, run regression tests, publish & email
results
•tests we didn’t automate
• not worth the effort / not possible (contrast test example)
•confidence level in the automation
• nothing released, not even new environment changes
• devs asked for automation run before merging branches
•always found more bugs automating the tests than
automation itself ever found.
CL
21
Goals that work (and don’t)
• working goals for me have been:
– reduce the time it takes for regression testing
– make life easier for other team members
• Not working:
– reduce % of testing
– reduce # of manual testers
– automate x% of the tests
CL
22
Automation objectives exercise
• 1) a list of objectives
– are they good ones?
– are they already in place in your organisation?
• 2) feedback on the list
– for discussion
• 3) select test automation objectives for you
– why are they good ones?
• 4) how will you measure them?
Handout, pages 3 & 4
23
Contents (next section: Responsibilities for automation tasks)
24
What is an automated test?
• a test!
– designed by a tester for a purpose
• test is executed
– implemented / constructed to run automatically
using a tool
– could be run manually also
• who decides what tests to run?
• who decides how a test is run?
25
Who should do the automation work?
• testers? (popular current view)
– not all testers can automate (well)
– not all testers want to automate
– not all automators want to test!
• conflict of responsibilities
– (if you are both tester and automator)
– should I automate tests or run tests manually?
• get additional resources as automators?
– contractors? borrow a developer? tool vendor?
26
Automation roles (car analogy)
• engine: test tool
• passengers: test cases
• car: testware architecture
• driver: tester
• mechanic: test automator
• fleet manager: test automation manager
• car designer: testware / automation architect
27
Responsibilities
Testers:
• test the software
– design tests
– select tests for automation (requires planning / negotiation)
• execute automated tests
– should not need detailed technical expertise
• analyse failed automated tests
– report bugs found by tests
– problems with the tests may need help from the automation team
Automators:
• automate tests (requested by testers)
• support automated testing
– allow testers to execute tests
– help testers debug failed tests
– provide additional tools (home-grown)
• predict
– maintenance effort for software changes
– cost of automating new tests
• improve the automation
– more benefits, less cost
Pattern: AUTOMATION ROLES
28
Example structure & responsibilities
• how we organized
– manual Testers
• built test suites
• tested the product
– automation Developers
• converted the manual tests written by manual testers into
automated tests
• built page classes and test classes
– automation Framework Developers
• built the automation framework for others to use
• didn’t test or write automated tests
• focused on adding functionality to framework
• built and maintained run environments
CL
29
Responsibilities exercise
• list of tasks – who is or should be responsible for
what?
• do you want to change who is responsible for what?
Handout, page 5
30
31
Contents (next section: Reporting and benefits)
32
Reporting failures to developers:
• #1 goal is to reduce the turn-around time from
test case failure to identifying the cause, bug
logging and/or fixing
• make it easy to read and analyze
• provide as much information as possible
• soft vs. hard failures
– soft: log defect, test can continue
– hard: test must be stopped, blocks the rest
CL
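Purely as an illustration of the soft / hard distinction above (not the framework described on this slide), one way a step runner might treat the two – every name here is invented:

```python
class SoftFailure(Exception):
    """Worth logging as a defect, but the test can carry on."""

class HardFailure(Exception):
    """The test cannot meaningfully continue past this point."""

def run_steps(test_name, steps, log_defect):
    """Run each step; collect soft failures, stop at the first hard failure."""
    defects = []
    for step in steps:
        try:
            step()
        except SoftFailure as err:
            defects.append(err)      # record and keep going
        except HardFailure as err:
            defects.append(err)      # record, then stop - the rest is blocked
            break
    for d in defects:
        log_defect(test_name, d)     # e.g. write to the results log / bug tracker
    return not defects               # True = passed with no failures
```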
33
CL
(example failure report, annotated)
• full output for each run attempt
• the automation logs
• the logs for the application under test
• exact failure reason
• screen shots!
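As an illustration of collecting the artifacts listed above when a check fails, here is a hedged sketch using Selenium WebDriver in Python; the paths and helper name are invented, and the framework described in this deck was not necessarily structured this way:

```python
import datetime, shutil, traceback

def report_failure(driver, test_name, reason, aut_log_path, out_dir):
    """Save the evidence a developer needs to analyse a failed check."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    prefix = f"{out_dir}/{test_name}_{stamp}"
    driver.save_screenshot(f"{prefix}.png")             # screen shot at the point of failure
    shutil.copy(aut_log_path, f"{prefix}_aut.log")      # logs of the application under test
    with open(f"{prefix}_automation.log", "w") as log:  # the automation's own log
        log.write(f"FAIL: {test_name}\nReason: {reason}\n")
        log.write(traceback.format_exc())               # stack trace when called from an except block
```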
34
CL
even more
reports!!
35
Reporting progress to managers:
• #1 goal is to communicate the essentials
quickly – “at a glance”
• make it easy to read and analyze
• provide information as concisely as possible
• executives only want a yes or no
CL
36
(example results summary, annotated)
• flaky tests re-tried
• without known failures
• aka "expected fail"
• bug in script or automation
• no. of comparisons to some expected result
CL
37
Is this Return on Investment (ROI)?
• tests are run more often
• tests take less time to run
• it takes less human effort to run tests
• we can test (cover) more of the system
• we can run the equivalent of days / weeks of
manual testing in a few minutes / hours
• faster time to market
these are (good) benefits
but are not ROI
ROI = (benefit – cost) / cost
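A worked example of the formula, using made-up figures purely to show the arithmetic:

```python
# Hypothetical numbers, purely to illustrate the ROI formula above.
cost = 40_000     # e.g. tool licences plus automator effort this year
benefit = 70_000  # e.g. manual execution effort saved over the same year
roi = (benefit - cost) / cost
print(f"ROI = {roi:.0%}")   # 75% for these made-up figures
```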
38
Making benefits visible
• get something running, regularly and
consistently
• don’t bite off more than you can chew
• put your results into the right hands – when they are ready!
• be your own cheerleader!
CL
39
Report to managers by release
(chart: "Automation Test Case Comparison by Release" – number of automated test cases per release version, from 11.7 (Sept 2014) to 17.0 (March 2017), growing from single digits to several thousand on a 0–7000 scale)
CL
40
30,000 foot view
CL
(architecture overview)
• Jenkins used as job scheduler
• collect and build automation and tools
• VMs run the automation
• automation interacts with phones, ICE and CRMs
• generate and email the results
41
Reporting exercise
• what needs to be reported?
• who should it be reported to?
• how is reporting different for different
audiences?
Handout, page 6
42
Contents (next section: Other management issues)
43
What’s the best tool?
Commercial or open source?
(figure: benefits vs cost, with a budget line – quadrants labelled poor benefits / low cost, poor benefits / high cost, good benefits / low cost and good benefits / high cost, plus a "good benefits, moderate cost" point; other labels: "investment in good automation", "commercial tools?", "open source tools?")
44
No “best” tool, but good suitable tools
• start by using 2 or 3 open source tools
– perhaps 1 week each with the same tests
– what other tools do you have, and what do they need to link with?
• what are your objectives for automation?
– what tool features are needed to achieve them?
• what skills?
• commercial tools
– make use of free trials (don't start on day 29 of a 30-day trial)
– give vendors your tests for their demo
• but keep some tests back to try on the day
45
Automation and agile
• can’t do agile without automation
– in agile teams, developer-tester works well
• apply agile principles to automation
– automation sprints, refactor when needed
• support manual and automated tests
• fitting automation into agile development
– ideal: automation is part of “done” for each sprint
– alternative: automation in the following sprint (see next slide)
• may be better for system level tests
See Chapter 1, Lisa Crispin, Experiences of Test Automation book,
www.satisfice.com/articles/agileauto-paper.pdf (James Bach)
46
Automation in agile
(figure: releases A–F across successive sprints – testers manually test the current release, automators automate the best tests for regression, and testers run the growing set of automated regression tests against later releases)
47
Automation and agile – how we did it
• starts with a new feature
• scrum teams: testers and automation engineers
• agreed responsibilities (see previous section)
• goals and QA practices
– QA have separate goals and do not report to dev
• agreed the distribution between manual / automated, unit / system, and REST API / WebDriver tests
• need QA approval before release
CL
48
Pilot project
• reasons
– you’re unique
– many variables /
unknowns at start
• benefits
– find the best way for you
(best practice)
– solve problems once
– establish confidence
(based on experience)
– set realistic targets
• objectives
– demonstrate tool value
– gain experience / skills
in the use of the tool
– identify changes to
existing test process
– set internal standards
and conventions
– refine assessment of
costs and achievable
benefits
Pattern: DO A PILOT – see Chapter 6, Ane Clausen, Experiences of Test Automation book
49
What to explore in the pilot
• build / implement automated tests (architecture)
– different ways to build stable tests (e.g. 10 – 20)
• maintenance
– different versions of the application
– reduce maintenance for most likely changes
• failure analysis
– support for identifying bugs
– coping with common bugs affecting many automated tests
• reporting
Also: naming conventions, measurement
50
Example: my pilot project at InGenius
• get basic framework up and running
– UI interactions
– basic test case flow
– basic reporting
• not automating the product
– actually automating a tool that the manual testers use to
test telephony providers
– delivered a setup tool for manual testers to get latest tool
and install it
• migrating to API calls instead
– to move along faster and eliminate UI slowness, we are
moving to the API level
CL
51
Setting realistic expectations
• Investment
– takes time and effort to
build good automation
– needs continuing support –
new asset not quick project
– different skills to testing
– may take longer to
automate (esp at first)
– isn’t a panacea
• doesn’t replace manual
testing (or testers)
• doesn’t improve quality of
requirements, design, code
or test design
• Returns
– good automation of
appropriate tests can give:
• faster response
• more accuracy
• more frequency
• more reliability
• better quality software?
• fewer quality “slip-backs”
– frees manual testers to do
other (better) manual testing
• more exploratory testing
• testing things that shouldn’t be
automated
52
Example: Setting expectations
• set clear expectations:
– rudimentary framework / basic reporting
– not automating the product but a tool
– 6-8 months before something useable
– 12–18 months before seeing impact
• be blessed with a great manager! :-)
• don’t over-promise
– this can kill your credibility and ultimately your project
– don't be afraid to apply the Scotty Principle (under-promise, over-deliver)
CL
53
On-going automation
• regular “pruning” of tests
– check for overlap, removed features
• each test should earn its place
• you are never finished with automation
– don't "stand still" - schedule regular review and refactoring of the automation
– change tools, hardware when needed
– re-structure if current approach causes problems
– monitor your automation “health”
54
What to automate?
• tests
– give the most value
– most important tests
– tests run most often
– “hassle factor”
– “wally tests”
– human-error-prone
– easy to automate (relatively)
Note: what is great now may not be best later.
What’s great for someone else may not be best for you.
Build in flexibility to choose subsets of tests. Be selective: sample, not everything.
Pattern: AUTOMATE GOOD TESTS
55
Angie Jones: what tests to automate
• gut feel vs. scoring each test on:
• Customer risk: impact, probability of use
• Value of the test: distinctness, probability of fix
• Cost efficiency: ease to write script, quick to write
• History: similarity to weak areas, previous failure
frequency
• end up with a number between 4 and 100
– 75-100 automate, < 25 don’t
• spreadsheet available – email me
– info@dorothygraham.co.uk
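A sketch of the scoring idea only – the factor scales and threshold values below are placeholders for illustration, not Angie Jones's actual spreadsheet:

```python
def automation_score(risk, value, cost_efficiency, history):
    """Each factor scored 1-25 by the team, so the total falls between 4 and 100."""
    return risk + value + cost_efficiency + history

score = automation_score(risk=20, value=18, cost_efficiency=22, history=10)
if score >= 75:
    decision = "automate"
elif score < 25:
    decision = "don't automate"
else:
    decision = "discuss / judgement call"
print(score, decision)   # 70 discuss / judgement call
```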
56
Automated tests / automated testing
Automated tests:
Select / identify test cases to run
Set-up test environment:
• create test environment
• load test data
Repeat for each test case:
• set-up test pre-requisites
• execute
• compare results
• log results
• analyse test failures
• report defect(s)
• clear-up after test case
Clear-up test environment:
• delete unwanted data
• save important data
Summarise results
Automated testing:
Select / identify test cases to run
Set-up test environment:
• create test environment
• load test data
Repeat for each test case:
• set-up test pre-requisites
• execute
• compare results
• log results
• clear-up after test case
Clear-up test environment:
• delete unwanted data
• save important data
Summarise results
Analyse test failures
Report defects
(legend: each step is shown as either a manual process or an automated process)
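A compact sketch of the "Automated testing" flow above as a single test-session loop; every class, method and name here is invented for illustration:

```python
def run_test_session(test_cases, env):
    env.create()                          # set-up test environment
    env.load_test_data()
    results = {}
    for test in test_cases:               # repeat for each test case
        test.set_up_prerequisites(env)
        actual = test.execute(env)
        results[test.name] = test.compare(actual)   # e.g. Pass / Fail / Expected Fail
        test.log_result(results[test.name])
        test.clear_up(env)
    env.delete_unwanted_data()            # clear-up test environment
    env.save_important_data()
    return results                        # summarised; failures analysed afterwards
```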
57
What we automated outside of tests
• my favorite example!
– phone bills
• manual tool set up
– utilities to set up tools and environments
• documentation
• environment set-up
CL
58
What can be lost in automation?
• awareness
– of the application (the tester's big-picture view of the whole)
– of the testing (what’s been tested, what not)
– of the unexpected (bugs not covered by scripts)
• user perspective (esp not testing through GUI)
• intuition (“feel” for where bugs are)
• variety (runs the same tests every time)
• flexibility (adapting to changing context)
Adapt your automation so you don’t
lose what’s important for you!
59
Contents (next section: Technical issues / pitfalls)
60
Comparison of tasks
(figure: the test activities Identify, Design, Build, Execute and Check placed on two scales – intellectual vs clerical, and one-off activity vs activity repeated many times. Identify and Design are the intellectual, one-off activities that govern the quality of the tests; Execute and Check are clerical, repeated many times, and good to automate.)
61
Testware/Automation architecture
(figure: testware architecture)
• Testers write tests (in a DSTL) using High Level Keywords
– abstraction here: easier to write automated tests → widely used
• Test Automator(s) build the Structured Scripts and structured testware beneath the keywords
– abstraction here: easier to maintain, and to change tools → long life
• the Test Execution Tool runs the scripts
Pattern: TESTWARE ARCHITECTURE
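A minimal sketch of the layering in this figure – tester-facing keywords on top of structured scripts that hide the tool; every keyword and function name here is invented:

```python
def login(user, password):            # structured script (tool-specific work happens inside)
    ...                               # e.g. drive the UI or an API to log in

def create_order(item, quantity):     # another structured script
    ...

KEYWORDS = {"login": login, "create order": create_order}

# A tester-written test in the keyword layer (the "DSTL"):
TEST_PLACE_ORDER = [
    ("login", ["chris", "secret"]),
    ("create order", ["widget", "3"]),
]

def run_keyword_test(test):
    for keyword, args in test:
        KEYWORDS[keyword](*args)      # dispatch each keyword to its structured script
```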
62
Easy way out: use the tool’s architecture
• tool will have its own way of organizing tests
– where to put things (for the convenience of the tool!)
– will “lock you in” to that tool – good for vendors!
• a better way (gives independence from tools)
– organise your tests to suit you – keep tool-specific
scripts to a minimum
• build your own framework
– in pre-processing, copy files to where the tool needs
(expects) to find them
– in post-processing, copy back to your structure
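A hedged sketch of the pre-/post-processing idea above; the directory names and tool command are invented:

```python
import shutil, subprocess

OUR_TESTWARE = "testware/suite_smoke"     # organised to suit us
TOOL_WORKDIR = "tool_project/tests"       # where the tool expects to find things
OUR_RESULTS  = "results/suite_smoke"

def run_suite():
    # pre-processing: copy testware into the tool's expected layout
    shutil.copytree(OUR_TESTWARE, TOOL_WORKDIR, dirs_exist_ok=True)
    # run the tool (placeholder command)
    subprocess.run(["the-test-tool", "--run", TOOL_WORKDIR], check=False)
    # post-processing: copy the results back into our own structure
    shutil.copytree(f"{TOOL_WORKDIR}/results", OUR_RESULTS, dirs_exist_ok=True)
```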
63
Tool-specific script ratio
(figure: two testware structures compared – in one, most of the testware consists of tool-specific scripts, giving high maintenance and/or tool-dependence; in the other, tool-specific scripts are a thin layer and most of the testware is not tool-specific)
64
Same tests in four tools
• started with the tests in Visual Test
• company bought WinRunner so we migrated all tests to it
• then HP bought Mercury and end-of-lifed WinRunner in favour of QTP, so we migrated to it
• company was bought by IBM, so we migrated tests to RFT.
• Selenium came out and we saw the light and jumped to it
– since it was open source and needed a framework around it we wrote
one
• finally we abstracted the tool out of the equation and coded to
our own framework.
– still pointed to Selenium, but made changing tools much easier
– current framework uses both Selenium and White but automation
developers don’t see a difference
CL
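A sketch of the shape of that abstraction – the framework described here wrapped Selenium and White in .NET, so this Python version with invented names only illustrates the pattern, not the actual code:

```python
class UiDriver:
    """What automated tests are written against - never a specific tool."""
    def click(self, locator): raise NotImplementedError
    def type_text(self, locator, text): raise NotImplementedError

class SeleniumDriver(UiDriver):
    def __init__(self, webdriver): self.wd = webdriver
    def click(self, locator): self.wd.find_element("css selector", locator).click()
    def type_text(self, locator, text): self.wd.find_element("css selector", locator).send_keys(text)

class DesktopDriver(UiDriver):            # stand-in for a desktop UI tool such as White
    def click(self, locator): ...
    def type_text(self, locator, text): ...

def login_test(ui: UiDriver):             # the test sees only UiDriver, not the tool
    ui.type_text("#user", "chris")
    ui.type_text("#password", "secret")
    ui.click("#login")
```

Swapping tools then means adding another UiDriver implementation; the automation developers writing tests don't see a difference.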
65
Test status – pass or fail?
• tool cannot judge pass or fail
– only “match” or “no match”
– assumption: expected results are correct
• when a test fails (i.e. the software fails)
– need to analyse the failure
• true failure? write up bug report
• test fault? fix the test (e.g. expected result)
• known bug or failure affecting many automated
tests?
– this can eat a lot of time in automated testing
– solution: additional test statuses
66
Known bug / known failure
• minor bug in all test results
• alternatives:
– add bug to expected results
• bug preservation
– ignore test results / don’t run
• untested - miss other things
– analyze every result
• waste of time
• a better way – a new test status: expected fail (known bug)
67
Expected Fail status
Compare to:               | No differences found | Differences found
(true) expected outcome   | Pass                 | Fail
expected fail outcome     | Expected Fail        | Unknown
don't know / missing      | Unknown              | Unknown
• other possible additional test statuses
– test blocked
– environment problem (e.g. network down, timeouts)
– set-up problems (files missing)
– failed again
– test needs to be changed but not done yet
– different version of expected results
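A small sketch of how a results report might assign the statuses in the table above; the function and argument names are invented:

```python
def test_status(differences_found, expected_outcome_kind):
    """expected_outcome_kind: 'true', 'expected-fail' or 'missing'."""
    if expected_outcome_kind == "true":
        return "Fail" if differences_found else "Pass"
    if expected_outcome_kind == "expected-fail":
        return "Unknown" if differences_found else "Expected Fail"
    return "Unknown"                          # no / missing expected outcome

print(test_status(False, "expected-fail"))    # Expected Fail - known bug still present
print(test_status(True,  "expected-fail"))    # Unknown - something else changed, analyse it
```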
68
Expected fail examples
CL
• percentage calculation
– should have been 0.0% but was simply 0
• spacing
– simple text should return "value", but was returning "\r\n value \r\n"
• column adjustment
– the column width should have adjusted automatically for a large value
69
Breadth or depth? the coverage illusion
(figure: major and minor bugs scattered across the application)
• breadth / width is coverage
• depth / "lumpy" testing is selective
• "I've covered / tested everything – haven't missed anything!" – an illusion, a trap
What is better testing?
70
Examples of “coverage”
• navigational tests
– sanity runs
• deep dive into silo
– manual testing removal
• automating the customer bug
– meant all regression tests as well
CL
71
Contents (next section: Conclusion)
72
What next?
• we have looked at a number of ideas about
test automation today
• what is your situation?
– what are the most important things for you now?
– where do you want to go?
– how will you get there?
73
Key automation strategy questions
– what is to be tested?
– what levels & types of testing are to be automated?
– what existing tools do we have?
– what processes must interact with the tools?
– what other tools must the test tools integrate with?
– information (data) requirements (input/output)
– what environments are needed?
– what metrics/reports should be produced?
– how will we show benefits?
– whose support will we need?
74
Your Test Automation Strategy/Plan
• note your plans now (while ideas are fresh,
before your brain overflows)
– what were your objectives for today? Met them?
– review your “take-aways” from each section (p 1)
– look at your automation objectives (p 4)
– look at responsibilities and reporting (p 5, 6)
– identify the top 3 things you need to do, or changes
you want to make to your automation
Handout, page 7
75
Please fill in the evaluation form
• against this tutorial’s description (below)
– half-day (full-day?)
– balance of information and exercises / discussion
– description accurate? (what did you expect?)
• we appreciate:
– improvement suggestions (content, timing etc)
– high marks ;-)
– if you give a lower mark, please explain why
• and put your name on the form – thanks – so we can
ask for further feedback
76
Tutorial description
• Many organizations invest a lot of effort in test automation at the system level
but then have serious problems later on. As a leader, how can you ensure that
your new automation efforts will get off to a good start? What can you do to
ensure that your automation work provides continuing value?
• This tutorial covers both “theory” and “practice”. Dot Graham explains the
critical issues for getting a good start, and Chris Loder describes his
experiences in getting good automation started at a number of companies.
• We cover the most important management issues for test automation
success, particularly when you are new to automation, and how to choose the
best approaches - no matter which automation tools you use.
• Focusing on system level testing, Dot and Chris explain how automation
affects staffing, who should be responsible for which automation tasks, how
managers can best support automation efforts to promote success, what you
can realistically expect in benefits and how to report them.
• They explain (for non-techies) the key technical issues that can make or break
your automation effort. Come away with your own clarified automation
objectives and a draft test automation strategy that you can use to plan your
own system level test automation.
77
Summary: key points
• objectives for automation
• realistic expectations, measurable
• responsibilities: automation and testing
• reporting and benefits
• pilot project, high-level management, developers, agile
• technical issues
• testware architecture, scripting levels
• your strategy / action plan
78
any more questions?
please email us!
info@DorothyGraham.co.uk
chris.loder@ingenius.com
• thank you for coming today
• we hope this was / will be useful for you
• all the best in your automation!
79
Good objectives for automation? (with comments)
– run regression tests evenings and weekends → only if they are worthwhile tests!
– increase test coverage → can be a good one, but depends what is meant by "test" coverage
– run tests that are tedious and error-prone if run manually → good objective
– gain confidence in the system → an objective for testing, but automated regression tests help achieve it
– reduce the number of defects found by users → good objective for testing, maybe not a good objective for automation!
80
Automation success = find lots of bugs?
• tests find bugs, not automation
• automation is a mechanism for running tests
• the bug-finding ability of a single test is not
affected by the manner in which it is executed
• this can be a dangerous objective
– especially for regression automation!
Bugs found by activity: automated tests 9.3% | manual scripted 24.0% | exploratory 58.2% | fix verification 8.4%
Experiences of Test Automation, Ch 27, p 503, Ed Allen & Brian Newman
Pattern: SET CLEAR GOALS (TestAutomationPatterns.org)

System-Level Test Automation: Ensuring a Good Start

  • 1.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 1 System-level Test Automation: ensuring a good start Prepared and presented by Dorothy Graham and Chris Loder © Dorothy Graham and Chris Loder 2018 www.DorothyGraham.co.uk info@dorothygraham.co.uk Twitter: @DorothyGraham www.ingenius.com chris.loder@ingenius.com Twitter: @AutomationChris 2 Tutorial description • Many organizations invest a lot of effort in test automation at the system level but then have serious problems later on. As a leader, how can you ensure that your new automation efforts will get off to a good start? What can you do to ensure that your automation work provides continuing value? • This tutorial covers both “theory” and “practice”. Dot Graham explains the critical issues for getting a good start, and Chris Loder describes his experiences in getting good automation started at a number of companies. • We cover the most important management issues for test automation success, particularly when you are new to automation, and how to choose the best approaches - no matter which automation tools you use. • Focusing on system level testing, Dot and Chris explain how automation affects staffing, who should be responsible for which automation tasks, how managers can best support automation efforts to promote success, what you can realistically expect in benefits and how to report them. • They explain (for non-techies) the key technical issues that can make or break your automation effort. Come away with your own clarified automation objectives and a draft test automation strategy that you can use to plan your own system level test automation.
  • 2.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 3 Contents Introduction to this tutorial Test automation objectives Responsibilities for automation tasks Reporting and benefits Other management issues Technical issues / pitfalls Conclusion System-level Test Automation: ensuring a good start Twitter: @DorothyGraham & @AutomationChris 4 Objectives of this tutorial • help you achieve better success in starting system-level functional automation – independent of any particular tool • theory and practice – generic advice from Dot – practical implementation advice from Chris • mainly management plus key technical pitfalls – including objectives, responsibilities, reporting • help you plan an effective automation strategy – Chris’s experience of what works or re-starting
  • 3.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 5 What is today about? (and not about) • test execution automation (not other tools) • main focus: system level functional test automation • test automation, not testing • we will NOT cover: – comparative tool information – demos of tools (no time, which one, expo) – (but see Chris afterwards for more information) 6 CL • Connects leading phone systems and CRM platforms with computer telephony integration (CTI)
  • 4.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 7 Chris • grew up in mining town in Labrador and got my first IT job there. – started automation then and didn’t know it • worked for Cognos/IBM, Halogen Software and now InGenius Software – written numerous automation frameworks – used a LOT of tools – have a patent pending for algorithm used in automation framework I wrote while at Halogen CL 8 Dot • first job: Bell Labs NJ, programmer put in test group, wrote test execution and comparison tools • Ferranti UK, Police command & control systems, developer but keen on testing • independent, specialising in testing – consultancy & training (Grove Consultants) – UK BCS SIGIST, 1st EuroStar conference, helped start ISTQB, many STAR conferences (since ‘92) – attempted retirement in 2008
  • 5.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 9 Shameless commercial plug www.DorothyGraham.co.uk info@dorothygraham.co.uk Part 1: How to do automation - still relevant today, though we plan to update it at some point THIRD EDITION For your lifelong learning solutions, visit www.cengage.co.uk and course.cengage.com Purchase your next print book, e-book or e-chapter at www.CengageBrain.com Visit the website at www.cengage.co.uk/istqb3 FOUNDATIONS OF Rex Black Erik Van Veenendaal Dorothy Graham Edition updated for ISTQB Foundation Syllabus 2011 andGlossary 2.1 ISTQB CERTIFICATION SOFTWARE TESTING Latest book (2012) TestAutomationPatterns.wikispaces.com TestAutomationPatterns.org 10 About you –other people here today • may have similar problems to you • may use the same tools or tools you want to know about • may have ideas to help you • you may have knowledge that will help them –show of hands • job role (tester, test manager, automator, other) • automation experience – none / < 6 months / 6 – 12 months / 1 – 2 yrs / > 2 yrs • test execution or framework tools used – (major ones)
  • 6.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 11 The tutorial materials –copies of slides • please write your own comments / notes on the pages –the handout pages • your objectives for this tutorial (vs description) • a one-page summary of today – note key “take-aways” as we go through • exercises: objectives, responsibilities, etc • your automation strategy /action plan –free book! (for one person) your objec- tives 12 Contents Introduction to this tutorial Test automation objectives Responsibilities for automation tasks Reporting and benefits Other management issues Technical issues / pitfalls Conclusion System-level Test Automation: ensuring a good start Twitter: @DorothyGraham & @AutomationChris
  • 7.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 13 Good objectives for automation? – run regression tests evenings and weekends – increase test coverage – run tests tedious and error-prone if run manually – gain confidence in the system – reduce the number of defects found by users 14 Automation objectives exercise • 1) a list of objectives – are they good ones? – are they already in place in your organisation? Handout, page 2
  • 8.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 15 fast testing slow testing Effectiveness Low High EfficiencyManual testing Automated Efficiency and effectiveness poor fast testing poor slow testing goodgood greatest benefit not good but common worst better 16 Same tests automated edit tests (maintenance) set-up execute analyse failures clear-up Manual testing More mature automation Reduce test execution time
  • 9.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 17 Automate x% of tests? manual tests automated tests tests that shouldn’t be automated new approaches, e.g. monkey testing, HiVAT* manual tests automated (% manual) tests (& verification) not possible to do manually tests not automated yet *High Volume Automated Testing See http://kaner.com 18 What finds most bugs? regression tests exploratory testing likelihood of finding bugs most often automated What is usually automated?
  • 10.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 19 Automation success = find lots of bugs? • tests find bugs, not automation • automation is a mechanism for running tests • the bug-finding ability of a single test is not affected by the manner in which it is executed • this can be a dangerous objective – especially for regression automation! Automated tests Manual Scripted Exploratory Fix Verification Experiences of Test Automation, Ch 27, p 503, Ed Allen & Brian Newman Pattern: SET CLEAR GOALSTestAutomationPatterns.org 20 Experience stories •unattended regression: • check for new build, run regression tests, publish & email results •tests we didn’t automate • not worth the effort / not possible (contrast test example) •confidence level in the automation • nothing released, not even new environment changes • devs asked for automation run before merging branches •always found more bugs automating the tests than automation itself ever found. CL
  • 11.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 21 Goals that work (and don’t) • working goals for me have been: – reduce the time it takes for regression testing – make life easier for other team members • Not working: – reduce % of testing – reduce # of manual testers – automate x% of the tests CL 22 Automation objectives exercise • 1) a list of objectives – are they good ones? – are they already in place in your organisation? • 2) feedback on the list – for discussion • 3) select test automation objectives for you – why are they good ones? • 4) how will you measure them? Handout, pages 3 & 4
  • 12.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 23 Contents Introduction to this tutorial Test automation objectives Responsibilities for automation tasks Reporting and benefits Other management issues Technical issues / pitfalls Conclusion System-level Test Automation: ensuring a good start Twitter: @DorothyGraham & @AutomationChris 24 What is an automated test? • a test! – designed by a tester for a purpose • test is executed – implemented / constructed to run automatically using a tool – could be run manually also • who decides what tests to run? • who decides how a test is run?
  • 13.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 25 Who should do the automation work? • testers? (popular current view) – not all testers can automate (well) – not all testers want to automate – not all automators want to test! • conflict of responsibilities – (if you are both tester and automator) – should I automate tests or run tests manually? • get additional resources as automators? – contractors? borrow a developer? tool vendor? 26 Automation rolesAutomation roles engine: test tool passengers: test cases car: testware architecture driver: tester mechanic: test automator fleet manager: test automation manager car designer: testware / automation architect
  • 14.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 27 Responsibilities • test the software – design tests – select tests for automation • requires planning / negotiation • execute automated tests – should not need detailed technical expertise • analyse failed automated tests – report bugs found by tests – problems with the tests may need help from the automation team • automate tests (requested by testers) • support automated testing – allow testers to execute tests – help testers debug failed tests – provide additional tools (home- grown) • predict – maintenance effort for software changes – cost of automating new tests • improve the automation – more benefits, less cost Testers Automators Pattern: AUTOMATION ROLES 28 Example structure & responsibilities • how we organized – manual Testers • built test suites • tested the product – automation Developers • converted the manual tests written by manual testers into automated tests • built page classes and test classes – automation Framework Developers • built the automation framework for others to use • didn’t test or write automated tests • focused on adding functionality to framework • built and maintained run environments CL
  • 15.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 29 Responsibilities exercise • list of tasks – who is or should be responsible for what? • do you want to change who is responsible for what? Handout, page 5 30
  • 16.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 31 Contents Introduction to this tutorial Test automation objectives Responsibilities for automation tasks Reporting and benefits Other management issues Technical issues / pitfalls Conclusion System-level Test Automation: ensuring a good start Twitter: @DorothyGraham & @AutomationChris 32 Reporting failures to developers: • #1 goal is to reduce the turn-around time from test case failure to identifying the cause, bug logging and/or fixing • make it easy to read and analyze • provide as much information as possible • soft vs. hard failures – soft: log defect, test can continue – hard: test must be stopped, blocks the rest CL
  • 17.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 33 CL full output for each run attemptthe automation logs the logs for the application under test exact failure reason screen shots! 34 CL even more reports!!
  • 18.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 35 Reporting progress to managers: • #1 goal is to communicate the essentials quickly – “at a glance” • make it easy to read and analyze • provide as concise information as possible • executives only want a yes or no CL 36 flaky tests re-tried without known failures aka “expected fail” bug in script or automation no. of comparisons to some expected result CL
  • 19.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 37 Is this Return on Investment (ROI)? • tests are run more often • tests take less time to run • it takes less human effort to run tests • we can test (cover) more of the system • we can run the equivalent of days / weeks of manual testing in a few minutes / hours • faster time to market these are (good) benefits but are not ROI ROI = (benefit – cost) cost 38 Making benefits visible • get something running, regularly and consistently • don’t bite off more than you can chew • put your results in to the right hands – when they are ready! • be your own cheerleader! CL
  • 20.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 39 Report to managers by release 3 9 0 1000 2000 3000 4000 5000 6000 7000 11.7 11.8 11.9 12.0 16.0 16.1 16.2 17.0 NumberofTests Release Version Automation Test Case Comparison by Release March 2017Sept 2014 CL 40 30,000 foot view CL collect and build automation and tools VM’s run the automation automation interacts with Phones, ICE and CRMs generate and email the results Jenkins used as job scheduler
  • 21.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 41 Reporting exercise • what needs to be reported? • who should it be reported to? • how is reporting different for different audiences? Handout, page 6 42 Contents Introduction to this tutorial Test automation objectives Responsibilities for automation tasks Reporting and benefits Other management issues Technical issues / pitfalls Conclusion System-level Test Automation: ensuring a good start Twitter: @DorothyGraham & @AutomationChris
  • 22.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 43 What’s the best tool? Commercial or open source? poor benefits low cost good benefits high cost good benefits low cost poor benefits high cost benefits cost budget investment in good automation good benefits moderate cost commercial tools? open source tools? 44 No “best” tool, but good suitable tools • start by using 2 or 3 open source tools – perhaps 1 week each with the same tests – what other tools, what do they need to link with • what are your objectives for automation? – what tool features are needed to achieve them? • what skills? • commercial tools – make use of free trials (not day 29 of 30 day trial) – give vendors your tests for their demo • but keep some others back for the day
  • 23.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 45 Automation and agile • can’t do agile without automation – in agile teams, developer-tester works well • apply agile principles to automation – automation sprints, refactor when needed • support manual and automated tests • fitting automation into agile development – ideal: automation is part of “done” for each sprint – alternative: automation in the following sprint -> • may be better for system level tests See Chapter 1, Lisa Crispin, Experiences of Test Automation book, www.satisfice.com/articles/agileauto-paper.pdf (James Bach) 46 Automation in agile A manual testing of this release (testers) A B B CA FEDCBA regression testing (automators automate the best tests) run automated tests (testers)
  • 24.
    System-level Test Automation: ensuringa good start presented by Dorothy Graham and Chris Loder © Dorothy Graham & Chris Loder 2018 chris.loder@ingenius.com info@dorothygraham.co.uk www.DorothyGraham.co.uk 47 Automation and agile – how we did it • starts with a new feature • scrum teams: testers and automation engineers • agreed responsibilities (see previous section) • goals and QA practices – QA have separate goals, not report to dev • distribution between manual/aut, unit/system, Rest/API/Webdriver • need QA approval before release CL 48 Pilot project • reasons – you’re unique – many variables / unknowns at start • benefits – find the best way for you (best practice) – solve problems once – establish confidence (based on experience) – set realistic targets • objectives – demonstrate tool value – gain experience / skills in the use of the tool – identify changes to existing test process – set internal standards and conventions – refine assessment of costs and achievable benefits Pattern: DO A PILOT See Chapter 6, Ane Clausen, Experiences book
49
What to explore in the pilot
• build / implement automated tests (architecture)
  – different ways to build stable tests (e.g. 10 – 20 tests)
• maintenance
  – different versions of the application
  – reduce maintenance for the most likely changes
• failure analysis
  – support for identifying bugs
  – coping with common bugs affecting many automated tests
• reporting
Also: naming conventions, measurement
50
Example: my pilot project at InGenius
• get a basic framework up and running
  – UI interactions
  – basic test case flow
  – basic reporting
• not automating the product
  – actually automating a tool that the manual testers use to test telephony providers
  – delivered a setup tool for manual testers to get the latest tool and install it
• migrating to API calls instead (sketch below)
  – to move along faster and eliminate UI slowness, we are moving to the API level
CL
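As a rough illustration of the UI-to-API move described above, here is a minimal Python sketch of the same check done first through the browser with Selenium and then directly against an application API. The URL, locator and field names are invented for illustration; they are not part of the InGenius tooling.

    # UI-level check (slow): drive the browser to read a status field
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def call_status_via_ui(base_url):
        driver = webdriver.Chrome()
        try:
            driver.get(base_url + "/calls/current")                  # hypothetical page
            return driver.find_element(By.ID, "call-status").text    # hypothetical locator
        finally:
            driver.quit()

    # API-level check (fast): ask the application directly, no UI in the way
    import requests

    def call_status_via_api(base_url):
        response = requests.get(base_url + "/api/calls/current", timeout=10)  # hypothetical endpoint
        response.raise_for_status()
        return response.json()["status"]

The check itself is unchanged; only the layer it talks to differs, which is what makes the API version faster and less fragile than driving the UI.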
51
Setting realistic expectations
• Investment
  – takes time and effort to build good automation
  – needs continuing support
  – a new asset, not a quick project
  – different skills to testing
  – may take longer to automate (especially at first)
  – isn't a panacea
    • doesn't replace manual testing (or testers)
    • doesn't improve the quality of requirements, design, code or test design
• Returns
  – good automation of appropriate tests can give:
    • faster response
    • more accuracy
    • more frequency
    • more reliability
    • better quality software?
    • fewer quality "slip-backs"
  – frees manual testers to do other (better) manual testing
    • more exploratory testing
    • testing things that shouldn't be automated
52
Example: Setting expectations
• set clear expectations:
  – rudimentary framework / basic reporting
  – not automating the product but a tool
  – 6-8 months before something useable
  – 12-18 months before seeing impact
• be blessed with a great manager! :-)
• don't over-promise
  – this can kill your credibility and ultimately your project
  – don't be afraid to apply the Scotty Principle
CL
53
On-going automation
• regular "pruning" of tests
  – check for overlap, removed features
• each test should earn its place
• you are never finished with automation
  – don't "stand still": schedule regular review and re-factoring of the automation
  – change tools and hardware when needed
  – re-structure if the current approach causes problems
  – monitor your automation "health"
54
What to automate?
• tests that
  – give the most value
  – are the most important tests
  – are run most often
  – have a high "hassle factor"
  – are "wally tests"
  – are human-error-prone
  – are easy to automate (relatively)
Note: what is great now may not be best later. What's great for someone else may not be best for you.
Build in flexibility to choose subsets of tests (sketch below)
Be selective: sample, not everything
Pattern: AUTOMATE GOOD TESTS
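One common way to build in the flexibility to choose subsets of tests is to tag them and select by tag at run time. A minimal sketch using pytest markers; the tag names and tests are illustrative, not taken from the tutorial.

    import pytest

    # Tags let the same suite serve a quick smoke run, a full regression run,
    # or an area-specific run (register the markers in pytest.ini to avoid warnings).
    @pytest.mark.smoke
    def test_application_starts():
        assert True

    @pytest.mark.regression
    @pytest.mark.billing
    def test_invoice_total_recalculated():
        assert True

    # Selecting subsets from the command line:
    #   pytest -m smoke
    #   pytest -m "regression and billing"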
55
Angie Jones: what tests to automate
• gut feel vs. scoring the test on:
  – Customer risk: impact, probability of use
  – Value of the test: distinctness, probability of fix
  – Cost efficiency: ease to write the script, quick to write
  – History: similarity to weak areas, previous failure frequency
• end up with a number between 4 and 100
  – 75-100: automate; < 25: don't
• spreadsheet available – email me: info@dorothygraham.co.uk
(a worked sketch of the scoring idea follows below)
56
Automated tests / automated testing
Automated tests:
• Select / identify test cases to run
• Set up the test environment: create test environment, load test data
• Repeat for each test case: set up test pre-requisites, execute, compare results, log results, analyse test failures, report defect(s), clear up after the test case
• Clear up the test environment: delete unwanted data, save important data
• Summarise results
Automated testing:
• Select / identify test cases to run
• Set up the test environment: create test environment, load test data
• Repeat for each test case: set up test pre-requisites, execute, compare results, log results, clear up after the test case
• Clear up the test environment: delete unwanted data, save important data
• Summarise results
• Analyse test failures
• Report defects
(in the original diagram the steps are shaded to show which are an automated process and which a manual process)
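To make the scoring idea concrete, here is a small Python sketch in the spirit of the slide: four factors, each rated from 1 to 25, summed to a score between 4 and 100, with the thresholds from the slide. This is an illustration of the idea only, not Angie Jones's actual spreadsheet formula.

    FACTORS = ("customer_risk", "test_value", "cost_efficiency", "history")

    def automation_score(ratings):
        """Each factor rated 1 (low) to 25 (high); the total falls between 4 and 100."""
        return sum(ratings[factor] for factor in FACTORS)

    def recommendation(score):
        if score >= 75:
            return "automate"
        if score < 25:
            return "don't automate"
        return "judgement call"

    score = automation_score(
        {"customer_risk": 20, "test_value": 15, "cost_efficiency": 22, "history": 10})
    print(score, recommendation(score))    # 67 judgement call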
57
What we automated outside of tests
• my favorite example: phone bills
• manual tool set-up
  – utilities to set up tools and environments
• documentation
• environment set-up
CL
58
What can be lost in automation?
• awareness
  – of the application (the big-picture view of the overall tester)
  – of the testing (what's been tested, what hasn't)
  – of the unexpected (bugs not covered by scripts)
• user perspective (especially when not testing through the GUI)
• intuition ("feel" for where the bugs are)
• variety (runs the same tests every time)
• flexibility (adapting to a changing context)
Adapt your automation so you don't lose what's important for you!
59
Contents
Introduction to this tutorial
Test automation objectives
Responsibilities for automation tasks
Reporting and benefits
Other management issues
Technical issues / pitfalls
Conclusion
System-level Test Automation: ensuring a good start
Twitter: @DorothyGraham & @AutomationChris
60
Comparison of tasks
(diagram: the testing tasks Identify, Design, Build, Execute and Check are placed on a scale from intellectual, one-off activity to clerical activity repeated many times. Identifying and designing tests governs the quality of the tests; executing and checking, being clerical and repeated many times, are good to automate.)
61
Testware / Automation architecture
(diagram: Testers write tests in a domain-specific test language (DSTL) of High Level Keywords; Test Automator(s) implement the keywords as Structured Scripts, the structured testware; the Test Execution Tool runs the scripts. The abstraction between testers and keywords makes automated tests easier to write, so the automation gets widely used; the abstraction between the structured testware and the tool makes the testware easier to maintain and makes it possible to change tools, giving the testware architecture a long life.)
Pattern: TESTWARE ARCHITECTURE
(a code sketch of the layering follows below)
62
Easy way out: use the tool's architecture
• the tool will have its own way of organizing tests
  – where to put things (for the convenience of the tool!)
  – will "lock you in" to that tool: good for vendors!
• a better way (gives independence from tools)
  – organise your tests to suit you
  – keep tool-specific scripts to a minimum
• build your own framework
  – in pre-processing, copy files to where the tool needs (expects) to find them
  – in post-processing, copy them back to your structure
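A minimal Python sketch of the layered testware architecture above, with invented keyword and function names: testers write tests as keyword steps, the keywords map onto structured scripts, and only the bottom layer knows about the execution tool.

    # Bottom layer: the only code that talks to the test execution tool
    def tool_type(locator, text):
        print("type", text, "into", locator)      # stand-in for the real tool call

    def tool_click(locator):
        print("click", locator)                   # stand-in for the real tool call

    # Structured scripts: reusable actions built on the tool layer
    def log_in(user, password):
        tool_type("user-field", user)
        tool_type("password-field", password)
        tool_click("login-button")

    def open_customer(customer_id):
        tool_type("search-box", customer_id)
        tool_click("search-button")

    # High-level keywords: the DSTL the testers write in
    KEYWORDS = {"log in": log_in, "open customer": open_customer}

    def run_test(steps):
        for keyword, *arguments in steps:
            KEYWORDS[keyword](*arguments)

    run_test([
        ("log in", "dot", "secret"),
        ("open customer", "C1234"),
    ])

Because the tests are written only in keywords, changing the execution tool means re-implementing the bottom layer, not rewriting the tests.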
63
Tool-specific script ratio
(diagram: two testware architectures compared. In one, most of the testware between the testers and the test execution tool consists of tool-specific scripts, giving high maintenance and/or tool-dependence; in the other, most of the testware is not tool-specific and only a thin layer is.)
64
Same tests in four tools
• started with the tests in Visual Test
• the company bought WinRunner, so we migrated all the tests to it
• then HP bought Mercury and end-of-lifed WinRunner in favour of QTP, so we migrated to it
• the company was bought by IBM, so we migrated the tests to RFT
• Selenium came out and we saw the light and jumped to it
  – since it was open source and needed a framework around it, we wrote one
• finally we abstracted the tool out of the equation and coded to our own framework (sketch below)
  – it still pointed to Selenium but made changing tool much easier
  – the current framework uses both Selenium and White, but automation developers don't see a difference
CL
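A sketch of the kind of tool-independent layer described in the last bullet: automation developers code against one interface, and the tool behind it can be swapped. White is a .NET library, so this Python sketch pairs Selenium with a placeholder second driver purely to show the shape of the abstraction; the interface and names are invented.

    from abc import ABC, abstractmethod

    class UiDriver(ABC):
        """What automation developers code against; no tool names leak above this layer."""
        @abstractmethod
        def click(self, locator): ...
        @abstractmethod
        def read_text(self, locator): ...

    class SeleniumDriver(UiDriver):
        def __init__(self, webdriver):
            self._webdriver = webdriver
        def click(self, locator):
            self._webdriver.find_element("css selector", locator).click()
        def read_text(self, locator):
            return self._webdriver.find_element("css selector", locator).text

    class DesktopDriver(UiDriver):
        """Placeholder for a second tool (e.g. a desktop UI automation library)."""
        def click(self, locator):
            raise NotImplementedError("wrap the desktop tool's click here")
        def read_text(self, locator):
            raise NotImplementedError("wrap the desktop tool's read here")

    def check_title(driver, expected):
        # Works whichever concrete driver is plugged in
        return driver.read_text("#page-title") == expected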
65
Test status – pass or fail?
• the tool cannot judge pass or fail
  – only "match" or "no match"
  – assumption: the expected results are correct
• when a test fails (i.e. the software fails)
  – need to analyse the failure
    • true failure? write up a bug report
    • test fault? fix the test (e.g. the expected result)
    • known bug, or a failure affecting many automated tests?
  – this can eat a lot of time in automated testing
  – solution: additional test statuses
66
Known bug / known failure
• a minor bug appears in all test results
• alternatives:
  – add the bug to the expected results: bug preservation
  – ignore the test results / don't run the tests: untested, and you miss other things
  – analyse every result: a waste of time
• a better way: a new test status, "expected fail" / "known bug" (sketch below)
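In a pytest-style framework, one way to approximate an "expected fail" status is to mark the test with the known bug, so the run reports XFAIL instead of FAIL and nobody has to re-analyse it every time. The bug id, function and test below are invented for illustration.

    import pytest

    def format_percentage(value):
        # Stand-in for the code under test, still carrying the known bug:
        # returns "0" where "0.0%" is expected.
        return str(value)

    @pytest.mark.xfail(reason="BUG-1234: percentage shown as 0 instead of 0.0%", strict=True)
    def test_zero_percentage_is_formatted():
        assert format_percentage(0) == "0.0%"

    # Reported as XFAIL (expected fail), not FAIL, while BUG-1234 is open.
    # With strict=True the test starts failing the run as soon as the bug is
    # fixed, which prompts removal of the marker.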
67
Expected Fail status
  Compare to:               No differences found   Differences found
  (true) expected outcome   Pass                   Fail
  expected fail outcome     Expected Fail          Unknown
  don't know / missing      Unknown                Unknown
• other possible additional test statuses
  – test blocked
  – environment problem (e.g. network down, timeouts)
  – set-up problems (files missing)
  – failed again
  – test needs to be changed but not done yet
  – different version of expected results
(a sketch of the status decision follows below)
68
Expected fail examples
CL
• percentage calculation
  – should have been 0.0% but was simply 0
• spacing
  – simple text should return "value", but was returning "\r\n value \r\n"
• column adjustment
  – column width should have auto-handled a large value
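The status table above can be implemented directly in the comparison step. A minimal Python sketch; the function name and the idea of storing a separate expected-fail outcome per known bug are illustrative.

    def test_status(actual, expected, expected_fail=None):
        """Return a status following the Expected Fail table above."""
        if expected_fail is not None:
            # A known bug's output is on record: a match gives Expected Fail;
            # any difference (bug fixed? failing differently?) needs analysis.
            return "Expected Fail" if actual == expected_fail else "Unknown"
        if expected is None:
            return "Unknown"        # expected outcome missing / not known
        return "Pass" if actual == expected else "Fail"

    print(test_status("0", "0.0%", expected_fail="0"))   # Expected Fail (known bug still present)
    print(test_status("0.0%", "0.0%"))                   # Pass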
69
Breadth or depth? The coverage illusion
(diagram: broad but shallow testing versus deep, "lumpy" testing of selected areas, with a major bug and a minor bug marked)
• breadth / width is coverage
• depth / "lumpy" testing is selective
• "I've covered / tested everything, haven't missed anything!" is an illusion, a trap
• what is better testing?
70
Examples of "coverage"
• navigational tests
  – sanity runs
• deep dive into a silo
  – manual testing removal
• automating the customer bug
  – meant all the regression tests as well
CL
71
Contents
Introduction to this tutorial
Test automation objectives
Responsibilities for automation tasks
Reporting and benefits
Other management issues
Technical issues / pitfalls
Conclusion
Twitter: @DorothyGraham & @AutomationChris
System-level Test Automation: ensuring a good start
72
What next?
• we have looked at a number of ideas about test automation today
• what is your situation?
  – what are the most important things for you now?
  – where do you want to go?
  – how will you get there?
73
Key automation strategy questions
– what is to be tested?
– what levels & types of testing are to be automated?
– what existing tools do we have?
– what processes must interact with the tools?
– what other tools must the test tools integrate with?
– what are the information (data) requirements (input/output)?
– what environments are needed?
– what metrics/reports should be produced?
– how will we show benefits?
– whose support will we need?
74
Your Test Automation Strategy/Plan
• note your plans now (while ideas are fresh, before your brain overflows)
  – what were your objectives for today? Have you met them?
  – review your "take-aways" from each section (p 1)
  – look at your automation objectives (p 4)
  – look at responsibilities and reporting (p 5, 6)
  – identify the top 3 things you need to do, or changes you want to make to your automation
Handout, page 7
75
Please fill in the evaluation form
• against this tutorial's description (below)
  – half-day (full-day?)
  – balance of information and exercises / discussion
  – description accurate? (what did you expect?)
• we appreciate:
  – improvement suggestions (content, timing etc)
  – high marks ;-)
  – if you give a lower mark, please explain why
• and put your name on the form – thanks
  – so we can ask for further feedback
76
Tutorial description
(repeats the description given at the start of the tutorial)
77
Summary: key points
• objectives for automation
• realistic expectations, measurable
• responsibilities: automation and testing
• reporting and benefits
• pilot project, high-level management, developers, agile
• technical issues
• testware architecture, scripting levels
• your strategy / action plan
Twitter: @DorothyGraham & @AutomationChris
System-level Test Automation: ensuring a good start
78
Any more questions? Please email us!
info@DorothyGraham.co.uk
chris.loder@ingenius.com
• thank you for coming today
• we hope this was / will be useful for you
• all the best in your automation!
79
Good objectives for automation?
– run regression tests evenings and weekends: only if they are worthwhile tests!
– increase test coverage: can be a good one, but depends what is meant by "test coverage"
– run tests that are tedious and error-prone if run manually: good objective
– gain confidence in the system: an objective for testing, but automated regression tests help achieve it
– reduce the number of defects found by users: good objective for testing, maybe not a good objective for automation!
80
Automation success = find lots of bugs?
• tests find bugs, not automation
• automation is a mechanism for running tests
• the bug-finding ability of a single test is not affected by the manner in which it is executed
• this can be a dangerous objective, especially for regression automation!
Bugs found, by test type (Experiences of Test Automation, Ch 27, p 503, Ed Allen & Brian Newman):
  Automated tests    9.3%
  Manual scripted    24.0%
  Exploratory        58.2%
  Fix verification   8.4%
Pattern: SET CLEAR GOALS
TestAutomationPatterns.org