The RAD Approach
Rick Nolle, CLU, FLMI/M, MCP
Vice President of Information Technology
RGA Reinsurance
Session 37-TS October 28, 2002
Agenda
• Fundamentals of IT Projects
• Learning from History
• Rapid Application Development
• Keeping Projects on Track
• Trust but Verify
• Summary
How to Build a Computer Application
• Choose a methodology
• Estimate the size of the project
• Form a project team
• Develop a project plan
• Execute the plan
• Implement the system
When are your development projects usually finished?
• Early: 0%
• On time: 56%
• Late: 43%
How do we change this?
Computerworld poll of CIOs – October 16, 2002
Columbus is RAD
Armstrong is SAD
Two Projects
Columbus and Armstrong
• Two pioneers who took mankind to a new place
• Two missions of a very different nature
• Columbus had a vision
• Armstrong had a mission
Columbus
• Born in Genoa, Italy, around 1451
• Grew up to be an accomplished sea captain
• Shipping in the 1400s was limited to:
  – The Mediterranean Sea
  – The European coast
  – Northern Africa
• Marco Polo explored the East
Columbus
• Columbus, through study and observation, believed the world was round
• Columbus calculated the circumference of the earth at 20,000 miles
• Columbus (mis)calculated that India was 3,000 miles west of Spain
Columbus
• Sought executive sponsorship
  – Italy and Portugal turned him down
  – Spain's Queen Isabella and King Ferdinand stepped up
  – Provided three ships and a crew
• Used new technologies
  – The compass
  – Dead reckoning
Columbus
• After about 30 days of traveling due west, land was sighted
• He called the people Indians, declared the mission a success, and returned a hero
• The reward was even greater than the original expectation
Armstrong
• Began with executive sponsorship
  – JFK's vision in 1961
• Involved tens of thousands of scientists, engineers and others
• Highly organized, very specialized
• All risks managed or eliminated
Armstrong
• The Apollo program cost $25 billion
• Utilized new technology and invented technology that wasn't available:
  – Computers for calculations
  – Rocket science
• The mission was flawless
• The reward was exactly what was envisioned: landed a man on the moon
Armstrong
New Technology
[Figure: first network diagram of the ARPANET, c. 1969]
Application Development Methodologies
• The standard methodology is the Waterfall
• The simplest methodology is Code-and-Fix
• Rapid application development is the Spiral
• Other methodologies include:
  – Evolutionary Delivery
  – Design to Schedule
  – Design to Tools
Waterfall
[Figure: waterfall model cascading through Software Concept → Requirements Analysis → Architecture Design → Detailed Design → Coding and Debugging → Unit Testing → System Testing]
Waterfall
Plusses:
• Works best on complex, well understood projects
• Produces high quality
• Provides structure for weak or inexperienced staff
Minuses:
• Not flexible
• Must know goal at start
• Difficult to swim upstream
RAD Methodology
• It's a spiral methodology, like a cinnamon roll
• Analyst and user work hand in hand
[Figure: RAD Spiral, cycling through Determine Objectives, Identify Risks, Evaluate Alternatives, Develop Deliverables, and Review & Test. Copyright 2000, Carnegie Mellon University]
RAD
Plusses:
• Shortest possible timeline
• Risk decreases as the project progresses
• Ultimate flexibility in defining the end product
• High user visibility
Minuses:
• Complicated model
Rapid Application Development
• Is applicable to entrepreneurial endeavors where the business plan isn't fully developed
• Is a joint effort between developers and the client
• Is an iterative approach involving prototyping and incremental building
• Takes a big, monolithic project and breaks it into many joint efforts
When to Use RAD?
• RAD is a good methodology when there's not an exact need or precise goal
• RAD involves working together to figure out where you're going as you go
• RAD involves uncertainty
Columbus was RAD
Armstrong was SAD
• RAD allows you to change course as you go
• RAD provides for discovery of facts along the way
• RAD builds upon iterative successes
• SAD has detailed up-front planning
• SAD requires specialization and structure
• SAD is more expensive and takes more time
General Strategies for RAD
• Avoid classic mistakes
• Apply development fundamentals
• Manage risks to avoid setbacks
• Apply schedule-oriented practices
Classic Mistakes
1. Feature Creep
2. Gold-plating
3. Shortchanged quality
4. Overly optimistic schedules
5. Inadequate design
6. Silver bullet syndrome
7. Weak staff/Contractor failure
8. Friction between developers and customers
9. Imbalanced product, time and resources
Classic Mistake
Feature Creep
• The average project experiences a 25% addition in features
• Must be actively managed through written requests and project impact analysis
• RAD allows for feature definition at each iteration
• SAD requires a swim up the waterfall
Classic Mistake
Gold Plating
• Parkinson's Law: Work expands to fill available time.
Classic Mistake
Quality Assurance
• An IBM study in the early 1990s found that the products with the lowest defect counts were also the products with the shortest schedules
• The 95% Rule: finding 95% of the bugs before release is the point where projects achieve the shortest schedules, the least effort, and the highest level of user satisfaction
Classic Mistake
Overly Optimistic Schedules
• Underdeveloped requirements/specifications
• Underestimation of interdependencies
• Overestimation of individual capabilities
Classic Mistake
Inadequate Design
• A single design cycle isn't sufficient
• Rush to develop – not spending time in requirements and design
Classic Mistake
Silver Bullet Syndrome
• The beta version of anything
• Brand new hardware that finally makes the architecture possible
• The new tool or version that handles the problem that plagues your project
Classic Mistake
Weak Staff / Contractor Failure
• RAD demands higher-than-average talent
• Business knowledge is imperative
• Consultants infuse knowledge
• Contractors are short-term
Classic Mistake
Poor Client Involvement
Customer involvement will:
• Improve efficiency
• Reduce rework
• Minimize risk
Managing Expectations
• Reality versus perception
Classic Mistake
Imbalanced Projects
Pick two: the software trade-off triangle
[Figure: triangle with Schedule (Fast), Cost (Cheap), and Product (Good) at its corners; you can pick only two]
Fundamentals for Speed
• People
• Process
• Product
• Technology
Manage Risk
• Identify potential risks
• Analyze risks
• Prioritize risks (see the sketch below)
• Control risks
• Monitor risks
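Illustration (not from the original slides): one common way to carry out the analysis and prioritization steps is to score each risk by exposure, that is, the probability of occurring times the size of the loss, and rank from highest to lowest. The risks and numbers below are hypothetical.

```python
# Hypothetical risk register: (risk, probability of occurring, loss in schedule weeks).
risks = [
    ("Key contractor leaves mid-project", 0.20, 12),
    ("Feature creep beyond the plan",     0.50,  6),
    ("New tool fails to deliver",         0.30,  8),
]

# Risk exposure = probability x size of loss; rank highest first so control
# and monitoring effort goes to the risks that matter most.
prioritized = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for name, probability, loss_weeks in prioritized:
    print(f"{name}: exposure = {probability * loss_weeks:.1f} weeks")
```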
Trust but Verify
Algorithms
Testing Time Allocations (worked sketch below)
• Requirements: 20%
• Detailed Design: 25%
• Coding and Debugging: 35%
• Testing: 20%
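Illustration (not from the original slides): applying the percentages above to a hypothetical 12-month schedule gives a quick baseline to verify a project plan against.

```python
# Spread a hypothetical 12-month schedule across the phases using the shares above.
TOTAL_MONTHS = 12
allocations = {
    "Requirements": 0.20,
    "Detailed Design": 0.25,
    "Coding and Debugging": 0.35,
    "Testing": 0.20,
}

assert abs(sum(allocations.values()) - 1.0) < 1e-9  # shares must cover the whole schedule

for phase, share in allocations.items():
    print(f"{phase}: {share * TOTAL_MONTHS:.1f} months")
```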
Algorithms
Size Estimation
• Function point estimation
• Size estimation software
• Comparative estimations
Function Points

Function point weights
                        Low   Medium   High
Number of inputs          3        4      6
Number of outputs         4        5      7
Processes                 3        4      6
Internal interfaces       7       10     15
External interfaces       5        7     10

Count of each function
                        Low   Medium   High
Number of inputs          4        2      7
Number of outputs         3        7      5
Inquiries                 3        2      2
Internal interfaces       5        2      4
External interfaces       6        1      1

Total function points
Number of inputs        3 x 4 = 12    4 x 2 = 8     6 x 7 = 42
Number of outputs       4 x 3 = 12    5 x 7 = 35    7 x 5 = 35
Processes               3 x 3 = 9     4 x 2 = 8     6 x 2 = 12
Internal interfaces     7 x 5 = 35   10 x 2 = 20   15 x 4 = 60
External interfaces     5 x 6 = 30    7 x 1 = 7    10 x 1 = 10
Sum of total points                                 335
Influence multiplier                              x 1.15
Adjusted total                                      385
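Illustration (not from the original slides): the table's arithmetic can be reproduced with a short script. The weights, counts, and 1.15 influence multiplier come straight from the table; the variable names are ours.

```python
# Function point weights and counted functions, (Low, Medium, High), from the table above.
weights = {
    "inputs":              (3, 4, 6),
    "outputs":             (4, 5, 7),
    "processes":           (3, 4, 6),
    "internal interfaces": (7, 10, 15),
    "external interfaces": (5, 7, 10),
}
counts = {
    "inputs":              (4, 2, 7),
    "outputs":             (3, 7, 5),
    "processes":           (3, 2, 2),
    "internal interfaces": (5, 2, 4),
    "external interfaces": (6, 1, 1),
}
INFLUENCE_MULTIPLIER = 1.15

# Multiply weight by count at each complexity level and sum over every function type.
unadjusted = sum(w * c for name in weights for w, c in zip(weights[name], counts[name]))
adjusted = unadjusted * INFLUENCE_MULTIPLIER

print(unadjusted)       # 335
print(round(adjusted))  # 385
```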
Algorithms
Teams, Plans and Timelines
• Estimating the software schedule (see the sketch below):
  – Schedule in months = 3.0 x (person-months)^(1/3)
  – Schedule in months = (function points)^k, where k is roughly 0.43
• To increase the speed of a project:
  – Decrease the size of the product
  – Increase the people on the team
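Illustration (not from the original slides): a small sketch applying both rules of thumb, with hypothetical inputs of 64 person-months of effort and the 385 adjusted function points computed earlier.

```python
def schedule_from_effort(person_months: float) -> float:
    """Schedule in months = 3.0 * person-months ^ (1/3)."""
    return 3.0 * person_months ** (1.0 / 3.0)

def schedule_from_function_points(function_points: float, k: float = 0.43) -> float:
    """Schedule in months = function points ^ k, with k roughly 0.43."""
    return function_points ** k

print(f"{schedule_from_effort(64):.1f} months")             # 12.0 months
print(f"{schedule_from_function_points(385):.1f} months")   # about 12.9 months
```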
Conclusion
• Rapid Application Development is right for many projects
• When RAD isn't right, the large project can be broken down into many smaller RAD efforts
• Time is money
Rapid Application Development
Bibliography
Rapid Development, Steve McConnell, Microsoft Press, 1996
Spiral Development: Experience, Principles and Refinements, Barry Boehm, Software Engineering Institute, 2000
IT / Metrics / Benchmark Resources & Links, www.metricnet.com, Howard Rubin, 2002
Various articles, www.garynorth.com, Gary North, 1998
Various articles, www.sei.cmu.edu, Carnegie Mellon University, 2002