AGILE MOBILE TESTING WORKSHOP
PUNE AGILE CONFERENCE 
JULIAN HARTY 
Creative Commons License 
How to design your mobile apps by Julian Harty is licensed under 
a Creative Commons Attribution-ShareAlike 3.0 Unported License. 
http://creativecommons.org/licenses/by-sa/3.0/deed.en_US 
Contact me: julianharty@gmail.com (Rev: 22 Nov 2014)
AGILE TESTING
http://lisacrispin.com/2011/11/08/using-the-agile-testing-quadrants/
TIME TO USEFUL FEEDBACK (TTUF)
Information is more valuable when it is timely.
CONTINUOUS INTEGRATION (FOR MOBILE APPS)
Raw Ingredients 
• Code 
• Code Repository (git, svn, …) 
• Triggers 
• Build tools 
• Automated tests (see the build sketch after this list)
• Run time environment(s) 
• Emulators 
• Simulators 
• Devices
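As a minimal sketch of the build-tool and automated-test ingredients, here is a Gradle Kotlin DSL build file for a plain JVM Kotlin module (the plugin version is illustrative, and an Android project would need the Android Gradle plugin instead); a CI trigger would then run `gradle test` on every commit:

```kotlin
// build.gradle.kts — a minimal sketch, not a full Android build.
plugins {
    kotlin("jvm") version "1.9.24" // illustrative version
}

repositories {
    mavenCentral()
}

dependencies {
    testImplementation(kotlin("test")) // dependency for the automated tests
}

tasks.test {
    useJUnitPlatform() // run the automated tests on every CI-triggered build
}
```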
WORKSHOP: WHERE ARE THE LATENCIES?
WHERE ARE THE LATENCIES?
• Build times
  • These affect end-to-end unit test runtime
• Commissioning run-time environments
• Automated tests
• Deployment
  • Installing the app so it can be tested (timed in the sketch below)
• App Store approval
• Feedback from the market
• Feedback from the field
• App qualities
  • Failures & defects in use
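One way to make a latency visible is to measure it. Below is a hedged sketch that times the deployment step (installing the app via adb) from a JVM-side Kotlin script; the APK path is hypothetical and adb is assumed to be on the PATH:

```kotlin
import kotlin.system.measureTimeMillis

fun main() {
    // Hypothetical APK path — adjust to your build output.
    val apk = "app/build/outputs/apk/debug/app-debug.apk"
    val millis = measureTimeMillis {
        // adb install -r reinstalls the app, keeping its data
        ProcessBuilder("adb", "install", "-r", apk)
            .inheritIO()
            .start()
            .waitFor()
    }
    println("Deployment latency: $millis ms")
}
```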
TTUF: INTERACTIVE TESTING
TBS
[Diagram: a large T beside a small B and S]
T = Testing, B = Bug reporting, S = Setup
We want to maximize T and minimize B & S.
MINIMIZE SETUP 
Email or SMS URLs to phone 
Have a configuration workstation with all the drivers installed 
Create apps on your build server and make them available
MINIMIZE BUG INVESTIGATION
• Screenshot utilities
• Learn how to access, filter and store device logs (see the sketch below)
• Good quality camera for close-up screenshots
• Write a bug report that will still be valuable when the bug is actually investigated
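For the device-logs point above, here is a minimal sketch that dumps a filtered logcat to a file ready to attach to a bug report; the tag "MyApp" is hypothetical, and adb is assumed to be on the PATH:

```kotlin
import java.io.File

fun main() {
    val log = File("bug-report-logcat.txt")
    // -d dumps the current buffer and exits; "MyApp:V *:S" keeps only our tag
    ProcessBuilder("adb", "logcat", "-d", "MyApp:V", "*:S")
        .redirectOutput(log)
        .start()
        .waitFor()
    println("Saved ${log.length()} bytes of filtered log to ${log.name}")
}
```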
MAXIMIZE TESTING 
Use testing heuristics 
• I SLICED UP FUN (Jonathan Kohl) 
• COP FLUNG GUN (Moolya)
ANTIFRAGILE TESTING 
HEURISTICS & MNEMONICS
http://moolya.com/blogs/2012/04/121/Test-Mobile-applications-with-COP-who-FLUNG-GUN
TEST THIS: Kiwix
Using this guide: http://moolya.com/blogs/
WiFi password: FW4WFAAA
TTUF: AUTOMATED TESTS
TESTABILITY: REDUCE EFFORT
SPENDING MONEY WISELY: TESTABILITY
WHAT IS TESTABILITY?
The concept of designing & implementing software so it is easier to test:
• Testing can be automated 
• Testing can be interactive
SCALES OF TESTABILITY
[Diagram: two scales — interfacing, running from easy to challenging; transparency, running from transparent to opaque]
There are at least two dimensions of testability:
• ease of interfacing
• transparency into the state & behaviour of the software being tested
DESIGNING FOR TESTABILITY: HOOKS
Programmatic Hooks 
To connect test automation easily 
Consider whether to leave them in situ
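A minimal sketch of such a hook in Kotlin, assuming hypothetical names (ApiClient, AppConfig): production code uses the default implementation, while test automation connects through the hook to substitute a deterministic fake.

```kotlin
// Hypothetical seam: the network layer sits behind an interface.
interface ApiClient {
    fun fetchGreeting(): String
}

class RealApiClient : ApiClient {
    override fun fetchGreeting() = "hello from the network" // stands in for a real call
}

object AppConfig {
    // The programmatic hook: tests can replace the client here.
    var apiClient: ApiClient = RealApiClient()
}

fun greet(): String = AppConfig.apiClient.fetchGreeting()

fun main() {
    // Test automation swaps in a fake through the hook:
    AppConfig.apiClient = object : ApiClient {
        override fun fetchGreeting() = "stubbed greeting"
    }
    check(greet() == "stubbed greeting")
    println("hook works: ${greet()}")
}
```

Whether to ship the hook in situ is the trade-off noted above: leaving it in keeps production and test builds identical, at the cost of a mutable seam in released code.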
DESIGNING FOR TESTABILITY: VISIBILITY
“Eyes into the Soul of the machine...”
Expose internal data and state
• Makes some checks easier to confirm
• e.g. confirming that error recovery mechanisms cleaned up the app’s internal state
Beware: 
• Non-test code might start using the data 
• If so, consider formalising the access in an API
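A sketch of what such visibility can look like, with hypothetical names (DownloadManager): a read-only snapshot of internal state lets a check confirm that error recovery really cleaned up.

```kotlin
class DownloadManager {
    private val inFlight = mutableListOf<String>()

    fun start(url: String) { inFlight += url }

    fun failAndRecover(url: String) {
        // Error recovery must leave no stale entries behind.
        inFlight -= url
    }

    // Visibility hook: a read-only view for tests, not for feature code.
    // If non-test code starts depending on it, formalise it in an API.
    fun snapshotInFlightForTest(): List<String> = inFlight.toList()
}

fun main() {
    val dm = DownloadManager()
    dm.start("http://example.com/a")
    dm.failAndRecover("http://example.com/a")
    check(dm.snapshotInFlightForTest().isEmpty()) { "recovery left stale state" }
    println("recovery cleaned up the internal state")
}
```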
TESTABILITY: LAYERING OF CODE
Some aspects were already covered in the Segmented Design topic
Ideally, the testing of each layer or component can be automated independently
Testing of the composite software can then focus on the composite aspects
Beware of emergent behaviour
• Test the qualities: non-functional testing (NFT)
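A small sketch of testing one layer independently, with hypothetical names (Store, CartLayer): the layer under test is exercised against a fake of the layer beneath it.

```kotlin
fun interface Store {                        // the lower layer's contract
    fun priceOf(sku: String): Int
}

class CartLayer(private val store: Store) { // the layer under test
    fun total(skus: List<String>): Int = skus.sumOf { store.priceOf(it) }
}

fun main() {
    val fakeStore = Store { 10 }            // fake lower layer: everything costs 10
    val cart = CartLayer(fakeStore)
    check(cart.total(listOf("a", "b", "c")) == 30)
    println("cart layer tested in isolation from the real store")
}
```

Composite testing would then pair CartLayer with the real Store and focus on their interaction, not on re-testing each layer's own logic.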
TESTABILITY: SEPARATION OF CONCERNS
Separate generic and platform-specific code 
Generic code: 
• Application logic: What the app does, functionality 
Platform-specific code: 
• User Interface 
• Threading 
• Calls to platform-specific APIs
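A hedged sketch of the split, with hypothetical names (totalWithTax, Notifier): the generic logic is pure Kotlin, testable on the JVM without any device, while the platform-specific side sits behind an interface.

```kotlin
// Generic code: pure application logic, runnable and testable anywhere.
fun totalWithTax(subtotal: Double, taxRate: Double): Double =
    subtotal * (1 + taxRate)

// Platform boundary: UI, threading and platform API calls live behind this.
interface Notifier {
    fun show(message: String)
}

class ConsoleNotifier : Notifier {          // stand-in for a platform notification
    override fun show(message: String) = println("NOTIFY: $message")
}

fun checkout(subtotal: Double, notifier: Notifier) {
    val total = totalWithTax(subtotal, 0.18) // illustrative tax rate
    notifier.show("Total due: %.2f".format(total))
}

fun main() {
    checkout(100.0, ConsoleNotifier())
}
```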
TESTABILITY: ISOLATE COMPLEX CODE
Try encapsulating & isolating complex code
• Provide an interface*
• Have excellent automated tests exercise it
• Warn casual developers (and testers) not to tamper with it
• Now the rest of our code is easier to understand & manage
In parallel, consider ways to replace complex code with simpler code
* e.g. see the Facade design pattern
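A minimal sketch of the Facade pattern referenced above, with hypothetical internals (Codec, Retry): the complex code stays private, and the facade is the single entry point that the excellent automated tests exercise.

```kotlin
private class Codec { fun encode(s: String) = s.reversed() }      // 'complex' internals —
private class Retry { fun <T> run(block: () -> T): T = block() }  // do not tamper

// The facade is the only supported entry point.
class MessagingFacade {
    private val codec = Codec()
    private val retry = Retry()

    fun send(message: String): String = retry.run { codec.encode(message) }
}

fun main() {
    val result = MessagingFacade().send("ping")
    check(result == "gnip")
    println("facade sent: $result")
}
```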
BIG PICTURE: BACK TO “VALUE”
SPENDING WISELY? 
FULL LIFECYCLE COSTS
FULL LIFECYCLE COSTS
The initial development effort may be dwarfed by maintenance work
There are trade-offs between reducing the cost of initial development and the cost of maintenance work
Code that costs more to modify is undesirable; well-designed code & good automated tests can reduce the risk and cost of maintenance work
Beware of premature aging of your app’s codebase!
WHERE AND WHEN TO SPEND MONEY ON TESTING?
NOVODA 
Costs 60% more to ‘add’ test automation to Android projects
Who’s willing to sign off on it? 
Where and when does the ROI start?
THINGS TO CONSIDER 
How long do your code bases ‘last’? 
Who pays for ‘maintenance’? 
Where is the expertise to maintain the code? 
Active apps need ongoing nurture & investment even if you’re not changing the functionality
ALTERNATIVES TO TESTING
Testing is not the only way to obtain useful feedback. Sometimes it’s not the best way either.
COMPLEMENTING TESTING WITH OTHER INFORMATION SOURCES
• Crowd Sourcing 
• Log Analysis & Crash Dumps (sketched below)
• Analytics 
• In-app feedback
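As one concrete example of the crash-dump source above, here is a hedged JVM sketch: a default uncaught-exception handler writes crash details to a file (a real app would forward them to a crash-reporting backend; the file name is hypothetical).

```kotlin
import java.io.File

fun installCrashReporter() {
    Thread.setDefaultUncaughtExceptionHandler { thread, throwable ->
        // Capture enough context to be useful when the report is read later.
        File("crash-report.txt").writeText(
            "thread=${thread.name}\n${throwable.stackTraceToString()}"
        )
    }
}

fun main() {
    installCrashReporter()
    // Simulate a defect escaping to the field:
    Thread { error("simulated crash") }.also { it.start(); it.join() }
    println("crash captured: ${File("crash-report.txt").exists()}")
}
```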
VISUALIZATION TOOLS 
uiautomatorviewer (for Android)
Using visualisation tools to help define the test automation interface
SECTION 7: USING MOBILE ANALYTICS
USING MOBILE ANALYTICS
An overview of Mobile Analytics 
How they can help augment our testing
TOPOLOGY
[Diagram: overview of Mobile Analytics — mobile apps sending analytics data → Data Collector → Filter(s) → Database → Analytics WebServer → business view]
Each step may be delayed
TYPES OF EVENTS
[Diagram: events flowing from the mobile app and its analytics library, over an internet connection, to the analytics collector and its database]
• 1:1 app-initiated events
• m:1 app-initiated events
• Library-initiated events
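A hedged sketch of the 1:1 versus m:1 distinction, with all names hypothetical (real analytics SDKs differ): each app-initiated event enters the library one at a time, but the library batches several into a single payload for the collector.

```kotlin
class AnalyticsLibrary(private val batchSize: Int = 4) {
    private val buffer = mutableListOf<String>()

    fun track(event: String) {   // 1:1 — one call per app-initiated event
        buffer += event
        if (buffer.size >= batchSize) flush()
    }

    fun flush() {                // m:1 — many events leave as one payload
        if (buffer.isEmpty()) return
        println("POST to collector: ${buffer.joinToString(",")}")
        buffer.clear()
    }
}

fun main() {
    val analytics = AnalyticsLibrary()
    listOf("screen_view", "tap", "purchase", "share").forEach(analytics::track)
    analytics.track("app_background") // a library might also emit its own events here
    analytics.flush()
}
```

The buffering is one reason the topology slide warns that each step may be delayed: an event can sit in the buffer until the batch fills or is flushed.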
ANALYTICAL QUESTIONS

|             | Past                                         | Present                                        | Future                                                           |
|-------------|----------------------------------------------|------------------------------------------------|------------------------------------------------------------------|
| Information | What’s happened? (Reporting)                 | What’s happening? (Alerts)                     | What will happen? (Forecasting)                                  |
| Insight     | How and why did it happen? (Factor analysis) | What is the next best action? (Recommendation) | What’s the best / worst that can happen? (Modeling / Simulation) |

Supporting techniques include: engineering activity, benchmarking & testing; trends & defect-report extrapolation; software quality models & bottleneck analysis; specification refinement & asset reallocation; failure prediction models.
FISHBONES 
Feasible Practical Useful
