04-Task Analysis
Topics
• User analysis
• Task analysis
• Domain analysis
• Requirements analysis
01-User Analysis
Know Your User
• Identify characteristics of target user
population
– Age, gender, culture, language
– Education (literacy? numeracy?)
– Physical limitations
– Computer experience (typing?)
– Motivation, attitude
– Domain experience
– Application experience
– Work environment and other social context
– Relationships and communication patterns
Multiple Classes of Users
• Many applications have several kinds of users
– By role (student, teacher)
– By characteristics (age, motivation)
• Example: Olympic Message System
– Athletes
– Friends & family
– Telephone operators
– Sysadmins
Personas
• A persona is a fictitious character used as a specific
representative of a user class
– Yoshi is a 20-year-old pole vaulter from Tokyo who speaks some
English
– Bob is an IBM sysadmin in New York
– Fritz is the 50-year-old father of a German swimmer
• Advantages
– Convenient handle for talking about user classes
– Focuses on a typical user, rather than an extreme
– Encourages empathy
• Disadvantages
– May be misleading
– Stereotype trap
How To Do User Analysis
• Techniques
– Questionnaires
– Interviews
– Observation
• Obstacles
– Developers and users are sometimes
systematically isolated from each other
• Tech support shields developers from users
• Marketing shields users from developers
– Some users are expensive to talk to
• Doctors, executives, union members
02-Task Analysis
Task Analysis
• Identify the individual tasks the program
should help the user accomplish
• Each task is a goal (what, not how)
• Often helps to start with overall goal of the
system and then decompose it hierarchically
into tasks
Essential Parts of Task Analysis
• What needs to be done?
– Goal
• What must be done first to make it possible?
– Preconditions
• Tasks on which this task depends
• Information that must be known to the user
• What steps are involved in doing the task?
– Subtasks
– Subtasks may be decomposed recursively
Example from OMS
• Goal
– Send message to another athlete
• Preconditions
– Must know: my country code, my username, my
password, the other athlete’s name
• Subtasks
– Log in (identify yourself)
– Identify recipient
– Record message
– Hang up
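As a minimal sketch (not part of the original slides), the goal / preconditions / subtasks structure of a task analysis can be written down as a small data structure. The Python class and field names below are illustrative assumptions, using the OMS "send message" task as the example:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """One node in a task analysis: a goal, what must hold first, and its subtasks."""
    goal: str
    preconditions: List[str] = field(default_factory=list)   # tasks done first / info the user must know
    subtasks: List["Task"] = field(default_factory=list)     # may be decomposed recursively

# The OMS example from the slide above
send_message = Task(
    goal="Send message to another athlete",
    preconditions=[
        "know my country code", "know my username",
        "know my password", "know the other athlete's name",
    ],
    subtasks=[
        Task(goal="Log in (identify yourself)"),
        Task(goal="Identify recipient"),
        Task(goal="Record message"),
        Task(goal="Hang up"),
    ],
)

def outline(task: Task, depth: int = 0) -> None:
    """Print the task hierarchy as an indented outline."""
    print("  " * depth + task.goal)
    for sub in task.subtasks:
        outline(sub, depth + 1)

outline(send_message)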
Other Questions to Ask About a
Task
• Where is the task performed?
– At a kiosk, standing up
• What is the environment like? Noisy, dirty, dangerous?
– Outside
• How often is the task performed?
– Perhaps a couple times a day
• What are its time or resource constraints?
– A minute or two (might be pressed for time!)
• How is the task learned?
– By trying it
– By watching others
– Classroom training? (probably not)
• What can go wrong? (Exceptions, errors, emergencies)
– Enter wrong country code
– Enter wrong user name
– Get distracted while recording message
• Who else is involved in the task?
How to Do a Task Analysis
• Interviews with users
• Direct observation of users performing tasks
Example: Elevator Task Analysis
• Suppose we’re designing the Student Center
elevator interface
• What are the tasks?
03-Domain Analysis
Domain Analysis
• Identify important things in the domain
– People (user classes)
• Athletes, friends & family, sysadmins
– Physical objects
• Namecard, telephone
– Information objects
• Messages, accounts
[Diagram: entity boxes for Athlete, Sysadmin, Namecard, Account, Message]
Domain Analysis
• Determine important relations between the things
– Athletes have accounts
– Accounts have messages
– Family & friends know athletes
– Sysadmins register athletes or create accounts
[Diagram: Sysadmin creates Account; Athlete has an Account; Account holds Messages]
Domain Analysis
• Identify multiplicities of things and relations
– Numbers are best, but simple multiplicity
indicators (!,?,+,*) help too
[Diagram with multiplicities: ~100 Sysadmins create Accounts; 10,000 Athletes each have exactly one (!) Account; each Account holds [0-100] Messages]
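One way to make the entities, relations, and multiplicities concrete is to sketch them as a tiny data model. The following Python sketch is illustrative only; the class and field names are assumptions, not part of the actual OMS design:

from dataclasses import dataclass, field
from typing import List

@dataclass
class Message:
    text: str

@dataclass
class Account:
    # Multiplicity from the diagram: each account holds 0-100 messages
    messages: List[Message] = field(default_factory=list)

@dataclass
class Athlete:
    name: str
    # Exactly one account per athlete ("!" in the multiplicity notation)
    account: Account = field(default_factory=Account)

@dataclass
class Sysadmin:
    name: str

    def create_account(self, athlete_name: str) -> Athlete:
        """Sysadmins register athletes, i.e. create their accounts."""
        return Athlete(name=athlete_name, account=Account())

# Roughly 100 sysadmins serve ~10,000 athletes in the OMS domain analysis.
admin = Sysadmin(name="Bob")
yoshi = admin.create_account("Yoshi")
yoshi.account.messages.append(Message(text="Good luck in the final!"))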
Feedback to User & Task Analysis
• People entities that should really be treated as
user classes
• Missing tasks
– CRUD: Create, Read, Update, Delete
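A quick way to use CRUD as a completeness check is to list, for each information object, which create/read/update/delete tasks the analysis already covers. The checklist below is a hypothetical sketch in Python, not part of the original OMS analysis:

# Hypothetical CRUD coverage check: which tasks exist for each entity?
tasks_by_entity = {
    "Message": {"create": "Record message", "read": "Listen to message",
                "update": None, "delete": None},
    "Account": {"create": "Register athlete", "read": "Look up account",
                "update": None, "delete": None},
}

for entity, ops in tasks_by_entity.items():
    missing = [op for op, task in ops.items() if task is None]
    if missing:
        print(f"{entity}: no task found for {', '.join(missing)} -- "
              "decide whether these tasks are really needed")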
Example: Twitter Domain Analysis
• Suppose we’re reimplementing Twitter.
• What are its entities, relationships, and
multiplicities?
04-Requirements Analysis
Requirements Analysis
• Requirements: what should the system do?
[Diagram: Users, Tasks, and Domain analyses feed into Requirements]
Requirements:
what, how and why?
What: Two aims
• Understand as much as possible about users, task, context
• Produce a stable set of requirements
How:
• Data gathering activities
• Data analysis activities
• Expression as ‘requirements’
• All of this is iterative
Requirements:
what, how and why?
Why
• Requirements definition: the stage where
failure occurs most commonly
• Getting requirements right is crucial
Volere Requirements Template
• The Volere Requirements Specification is a
template used as a basis for discovering and
communicating the requirements of software
systems.
• Treats each requirement as belonging to a type
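The Volere "snow card" treats each requirement as a record with a fixed set of fields. The Python dataclass below sketches a few of the commonly cited fields; the field names are paraphrased from the template and the example values are invented:

from dataclasses import dataclass

@dataclass
class VolereRequirement:
    """A few of the fields on a Volere-style requirement 'snow card'."""
    number: int
    req_type: str           # e.g. functional, look-and-feel, usability, performance
    description: str        # one-sentence intent of the requirement
    rationale: str          # why the requirement matters
    fit_criterion: str      # measurable test that the requirement is met
    priority: str           # e.g. derived from customer satisfaction/dissatisfaction

example = VolereRequirement(
    number=42,
    req_type="usability",
    description="A first-time athlete can send a message without help.",
    rationale="Most OMS users will use the system only once or twice.",
    fit_criterion="9 of 10 first-time users complete the task within 3 minutes.",
    priority="high",
)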
Establishing requirements
• What do users want? What do users need? Requirements need
clarification, refinement, completion, re-scoping
• Input: requirements document (maybe)
• Output: stable requirements
• Why "establish"? Requirements arise from understanding users’ needs,
and can be justified & related to data
Different kinds of requirements
• Functional:
– What the system should
do
– Historically the focus of
requirements activities
– A functional
requirement for a word
processor may be that it
should support a variety
of formatting styles
Different kinds of requirements
• Non-Functional:
– Portability, response
time, etc.
– A non-functional
requirement for a
word processor
might be that it must
be able to run on a
variety of platforms
such as PCs, Macs
and Unix machines
Different kinds of requirements
• Data:
– Data requirements capture the type,
volatility, size/amount, persistence,
accuracy, and value of the required data
– How will they be
stored (e.g.,
database)?
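As an illustration only (the schema and column names are assumptions, not something the slides specify), a data requirement about how messages are stored might eventually be realized as a small database schema, e.g. with SQLite from Python:

import sqlite3

# A minimal, hypothetical schema for stored messages; the columns reflect
# data requirements such as size, persistence, and accuracy (timestamps).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE messages (
        id          INTEGER PRIMARY KEY,
        sender      TEXT NOT NULL,
        recipient   TEXT NOT NULL,
        recorded_at TEXT NOT NULL,      -- ISO 8601 timestamp
        audio_path  TEXT NOT NULL       -- where the recording itself lives
    )
""")
conn.execute(
    "INSERT INTO messages (sender, recipient, recorded_at, audio_path) "
    "VALUES (?, ?, ?, ?)",
    ("fritz", "yoshi", "2011-02-14T09:30:00", "/audio/msg-0001.wav"),
)
print(conn.execute("SELECT COUNT(*) FROM messages").fetchone()[0])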
Environment or context of use
Physical
• dusty, noisy, vibration, light, heat,
humidity, …. (e.g., ATM)
Social
• sharing of files, view of files
synchronously across great distances,
work individually, privacy for clients
Organizational
• hierarchy, IT department’s attitude,
user support, communications
structure and infrastructure,
availability of training
Users: Who are they?
Characteristics
• ability, background, attitude to computers
System use: novice, expert, casual, frequent
• Novice: step-by-step (prompted), constrained, clear
information
• Expert: flexibility, access/power
• Frequent: short cuts
• Casual/infrequent: clear instructions, e.g., menu paths
What are the users’ capabilities?
Size
• of hands may affect the size and positioning of input buttons
Motor abilities
• may affect the suitability of certain input and output devices
Height
• if designing a physical kiosk
Strength
• a child’s toy requires little strength to operate, but greater strength to change batteries
Disabilities
• (e.g., sight, hearing)
Requirements Vary
• What factors (environmental, user, usability)
would affect the following systems?
– Self-service filling and payment system for a petrol
(gas) station
– On-board ship data analysis system for geologists
searching for oil
– Fashion clothes website
Personas
• A persona: the personal facade that one presents to the world
• Capture user characteristics
• Not real people, but synthesized from real user characteristics
• Should not be idealized
• Bring them to life with a name, characteristics, goals, and personal background
Data gathering for requirements
• Interviews
– Props, e.g., sample
scenarios of use,
prototypes, can be
used in interviews
– Good for exploring
issues
– But they are time-consuming, and it may be
infeasible to visit everyone
Data gathering for requirements
• Focus groups
– Group interviews
– Good at gaining a
consensus view
and/or highlighting
areas of conflict
– But can be
dominated by
individuals
Data gathering for requirements
Questionnaires
• Often used in conjunction with other techniques
• Can give quantitative or qualitative data
• Good for answering specific questions from a
large, dispersed group of people
Researching similar products
• Good for prompting requirements
Data gathering for requirements
• Direct observation
– Gain insights into
stakeholders’ tasks
– Good for understanding
the nature and context
of the tasks
– But it requires time and
commitment from a
member of the design
team, and it can
result in a huge amount
of data
Data gathering for requirements
• Indirect observation
– It largely involves analyzing
textual material generated
indirectly, e.g.,
• Diaries
• Interaction logging (key
presses, mouse /
device movements)
– Not often used in
requirements activity
– Good for logging current
tasks
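Interaction logging usually amounts to appending timestamped events to a file for later analysis. The Python snippet below is a generic sketch; the event names and log-file path are made up, not a specific logging tool:

import json
import time

LOG_PATH = "interaction_log.jsonl"   # hypothetical log file, one JSON event per line

def log_event(kind: str, **details) -> None:
    """Append one timestamped interaction event for later offline analysis."""
    event = {"t": time.time(), "kind": kind, **details}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(event) + "\n")

# Example events a study might capture
log_event("key_press", key="Enter")
log_event("mouse_move", x=120, y=340)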
Data gathering for requirements
• Studying documentation
– Procedures and rules are often written down in
manuals
– Good source of data about the steps involved in
an activity, and any regulations governing a task
– Not to be used in isolation
– Good for understanding legislation, and getting
background information
– Takes no stakeholder time, which is the limiting
factor for the other techniques
Data gathering for requirements
• Contextual Inquiry
– An approach to ethnographic study (the scientific
description of individual human societies) where the
user is the expert and the designer is the apprentice
– A form of interview, but
• at users’ workplace (workstation)
• 2 to 3 hours long
– Four main principles:
• Context: see workplace & what happens
• Partnership: user and developer collaborate
• Interpretation: observations interpreted by user and
developer together
• Focus: project focus to understand what to look for
Problems with data gathering
• Identifying stakeholders: users, managers, developers,
customer reps? union reps? shareholders?
• Involving stakeholders: workshops, interviews, workplace
studies, co-opting stakeholders onto the development team
• ‘Real’ users, not managers: traditionally a problem in
software engineering, but better now
Problems with data gathering
Requirements management:
• version control, ownership
Communication between parties:
• within development team
• with customer/user
• between users… different parts of an organization use different terminology
Domain knowledge distributed and implicit:
• difficult to dig up and understand
• knowledge articulation (extraction of knowledge from individuals)
• Availability of key people
Problems with data gathering
• Political problems within the organization
• Dominance of certain stakeholders
• Economic and business environment changes
• Balancing functional and usability demands
Some basic guidelines
• Focus on identifying the stakeholders’ needs
• Involve all the stakeholder groups
• Involve more than one representative from each stakeholder group
• Use a combination of data gathering techniques
Some basic guidelines
• Support the process with props such as prototypes and task descriptions
• Run a pilot session
• You will need to compromise on the data you collect and the analysis to be
done, but before you can make sensible compromises, you need to know what
you’d really like
• Consider carefully how to record the data
Data interpretation and analysis
• This phase starts soon after the data gathering phase
• Initial interpretation comes before deeper analysis
• Different approaches emphasize different elements, e.g., class
diagrams for object-oriented systems, entity-relationship
diagrams for data-intensive systems
Task descriptions
• Scenarios: an informal narrative story; simple, ‘natural’, personal;
not generalizable
• Use cases: assume interaction with a system; assume a detailed
understanding of the interaction
• Essential use cases: abstract away from the details; do not carry
the same assumptions as use cases
Scenario: university admissions
office
• You walk in and are greeted by the supervisor,
who starts by saying something like:
– “Well, this is where the admissions forms arrive.
We receive about 50 a day during the peak
application period. Brian here opens the forms and
checks that they are complete, that is, that all the
documentation has been included. You see, we
require copies of relevant school exam results or
evidence of work experience before we can
process the application. Depending on the result
of this initial inspection, the forms get passed to…”
Use Case Diagram
[diagrams omitted]
Task analysis
• Task descriptions are often used to envision new systems or devices
• Task analysis is used mainly to investigate an existing situation
• It is important not to focus on superficial activities: What are people
trying to achieve? Why are they trying to achieve it? How are they
going about it?
• Many techniques exist; the most popular is Hierarchical Task
Analysis (HTA)
Hierarchical Task Analysis
• Involves breaking a task down into subtasks, then sub-subtasks,
and so on. These are grouped as plans which specify how the
tasks might be performed in practice
• HTA focuses on physical and observable actions, and includes
looking at actions not related to software or an interaction device
• Start with a user goal, which is examined so that the main tasks
for achieving it are identified
• Tasks are sub-divided into subtasks
Hierarchical Task Analysis
0. In order to buy a DVD
1. locate DVD
2. add DVD to shopping basket
3. enter payment details
4. complete address
5. confirm order
plan 0:
• If regular user do 1-2-5.
• If new user do 1-2-3-4-5.
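The HTA above can also be written down as data, with plan 0 expressed as the subtask sequence each kind of user follows. This Python sketch mirrors the DVD example; the function name is illustrative:

# Subtasks of goal 0, "buy a DVD", keyed by their HTA numbers
subtasks = {
    1: "locate DVD",
    2: "add DVD to shopping basket",
    3: "enter payment details",
    4: "complete address",
    5: "confirm order",
}

# Plan 0: which subtasks are done, and in what order, for each kind of user
plans = {
    "regular user": [1, 2, 5],
    "new user": [1, 2, 3, 4, 5],
}

def steps_for(user_kind: str) -> list[str]:
    """Expand a plan into the ordered list of subtask descriptions."""
    return [subtasks[n] for n in plans[user_kind]]

print(steps_for("regular user"))  # ['locate DVD', 'add DVD to shopping basket', 'confirm order']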
Best Practices For Writing User
Requirements
Understand the User Perspective:
1.Gain a deep understanding of the target
user group, their characteristics,
preferences, and goals.
2.Identify the user personas and their specific
needs, roles, and responsibilities within the
system.
Best Practices For Writing User
Requirements
Involve Users and Stakeholders:
1.Collaborate closely with users and relevant
stakeholders throughout the requirement-
gathering process.
2.Conduct user interviews, surveys, or
usability tests to gather insights and
validate requirements.
Best Practices For Writing User
Requirements
Use User-Centric Language:
1.Express user requirements in user-centric
language, using terms and concepts familiar
to the intended users.
2.Avoid technical jargon or complex
terminology that may confuse or alienate
the users.
Best Practices For Writing User
Requirements
Focus on User Goals and Tasks:
1.Identify the goals and tasks users need to
accomplish using the software system.
2.Frame requirements around the specific
actions or functionalities that support user
goals and tasks.
Best Practices For Writing User
Requirements
Be Specific and Concrete:
1.Clearly define the desired behavior or
outcomes from the user’s perspective.
2.Use concrete examples, scenarios, or
stories to illustrate the desired interactions
or workflows.
Best Practices For Writing User
Requirements
Consider User Experience (UX) Design:
1.Address the user experience aspects such
as ease of use, intuitiveness, and efficiency
in completing tasks.
2.Specify requirements related to navigation,
layout, interaction design, and visual
aesthetics.
Best Practices For Writing User
Requirements
Include Performance Expectations:
1.Define user requirements related to system
responsiveness, speed, and efficiency.
2.Specify performance targets or thresholds
that contribute to a satisfactory user
experience.
Best Practices For Writing User
Requirements
Prioritize User Needs:
1.Assign priorities to user requirements to
ensure the most critical needs are
addressed first.
2.Prioritization helps in resource allocation,
decision-making, and trade-off analysis.
Best Practices For Writing User
Requirements
Validate with User Feedback:
1.Regularly validate user requirements with
user representatives or usability experts.
2.Seek feedback on prototypes, mock-ups, or
design iterations to ensure that
requirements align with user expectations.
Best Practices For Writing User
Requirements
Consider Accessibility and Inclusivity:
1.Incorporate requirements to address
accessibility and inclusivity considerations.
2.Ensure the software system accommodates
diverse user needs, including those with
disabilities or different cultural
backgrounds.
Best Practices For Writing User
Requirements
Review and Iterate:
1.Conduct regular reviews and iterations of
user requirements with stakeholders and
the development team.
2.Ensure that the requirements are complete,
accurate, and align with the overall project
goals.
Managing User Requirements throughout
the UI Development Lifecycle
Requirements Elicitation
• To gather user requirements effectively, employ various
techniques during the requirements elicitation phase.
Consider these practices:
• Conduct interviews with users, stakeholders, and subject
matter experts to understand their needs, preferences, and
expectations.
• Utilize surveys or questionnaires to collect feedback from a
broader user population, allowing for a comprehensive
understanding of their requirements.
• Conduct observations or user shadowing sessions to gain
insights into how users interact with existing systems or
perform their tasks.
Managing User Requirements throughout
the UI Development Lifecycle
• Analysis and Refinement
• Once user requirements are gathered, analyzing and
refining them is essential to ensure clarity,
completeness, and consistency. Consider these
practices:
• Organize and categorize user requirements based on
their similarities or related functionalities to identify
patterns or commonalities.
• Remove any duplicate or conflicting requirements to
eliminate ambiguity and ensure consistency.
• Collaborate with users and stakeholders to validate
and refine the requirements, ensuring they accurately
capture the desired functionality and user experience.
Managing User Requirements throughout
the UI Development Lifecycle
• Traceability and Impact Analysis
• Establishing traceability between user requirements and
other project artifacts is crucial for impact analysis and
change management. Consider these practices:
• Use unique identifiers or tags to link user requirements to
design decisions, test cases, and other project artifacts.
• Maintain a traceability matrix that shows the relationships
between user requirements and other project elements,
enabling impact analysis during changes.
• Assess the impact of proposed changes on user
requirements to understand the potential consequences
and make informed decisions.
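A traceability matrix can be as simple as a mapping from requirement IDs to the artifacts that depend on them, which also makes impact analysis a lookup. The requirement IDs and artifact names in this Python sketch are hypothetical:

# Hypothetical traceability matrix: requirement ID -> dependent artifacts
traceability = {
    "UR-12": {"designs": ["wireframe-login"], "tests": ["test_login_flow"]},
    "UR-15": {"designs": ["wireframe-inbox"], "tests": ["test_list_messages",
                                                        "test_delete_message"]},
}

def impact_of(requirement_id: str) -> list[str]:
    """List every artifact that must be re-examined if this requirement changes."""
    links = traceability.get(requirement_id, {})
    return [artifact for group in links.values() for artifact in group]

print(impact_of("UR-15"))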
Managing User Requirements throughout
the UI Development Lifecycle
• User-Centric Design and Prototyping
• Incorporating user requirements into the design and
prototyping stages of the software development process
helps validate and refine the user experience. Consider
these practices:
• Create design mock-ups, wireframes, or interactive
prototypes that reflect the user requirements, allowing
users to provide feedback and validate the proposed
solutions.
• Conduct usability testing sessions with users to gather
insights and identify any usability issues or areas for
improvement.
• Iteratively refine the design and prototype based on user
feedback, ensuring that the final product meets user
expectations and needs.
Managing User Requirements throughout
the UI Development Lifecycle
• User Acceptance Testing
• Involving users in the acceptance testing phase ensures
that the developed software meets their requirements and
expectations. Consider these practices:
• Develop user acceptance testing scenarios that align with
the documented user requirements.
• Collaborate with users to perform acceptance testing,
allowing them to validate whether the software meets their
needs and performs as expected.
• Address any identified issues or discrepancies between the
software and user requirements, ensuring necessary
adjustments are made before deployment.
Common Errors in User Analysis
• Describing what your ideal users should be, rather than what they actually
are
– “Users should be literate in English, fluent in spoken Swahili, right-handed,
and color-blind”
Common Errors in Task Analysis
• Thinking from the system’s point of view, rather than the user’s
– “Notify user about appointment”
– vs. “Get a notification about appointment”
• Fixating too early on a UI design vision
– “The system bell will ring to notify the user about an appointment…”
• Bogging down in what users do now (concrete tasks), rather than why
they do it (essential tasks)
– “Save file to disk”
– vs. “Make sure my work is kept”
• Duplicating a bad existing procedure in software
• Failing to capture good aspects of existing procedure
Hints for Better User & Task
Analysis
• Questions to ask
– Why do you do this? (goal)
– How do you do it? (subtasks)
• Look for weaknesses in current situation
– Goal failures, wasted time, user irritation
• Contextual inquiry
• Participatory design
Contextual Inquiry
• Observe users doing real work in the real work
environment
• Be concrete
• Establish a master-apprentice relationship
– User shows how and talks about it
– Interviewer watches and asks questions
• Challenge assumptions and probe surprises
Participatory Design
• Include representative users directly in the
design team
• OMS design team included an Olympic athlete
as a consultant
DATA ANALYSIS
Qualitative vs Quantitative Data
Qualitative Data
Overview:
• Deals with descriptions.
• Data can be observed but not measured.
• Colors, textures, smells, tastes, appearance, beauty, etc.
• Qualitative → Quality

Quantitative Data
Overview:
• Deals with numbers.
• Data which can be measured.
• Length, height, area, volume, weight, speed, time, temperature,
humidity, sound levels, cost, members, ages, etc.
• Quantitative → Quantity

Example 1: Oil Painting
Qualitative data:
• red/green color, gold frame
• smells old and musty
• texture shows brush strokes of oil paint
• peaceful scene of the country
• masterful brush strokes
Quantitative data:
• picture is 10" by 14"
• with frame 14" by 18"
• weighs 8.5 pounds
• surface area of painting is 140 sq. in.
• cost $300

Example 2: Latte
Qualitative data:
• robust aroma
• frothy appearance
• strong taste
• glass cup
Quantitative data:
• 12 ounces of latte
• serving temperature 150° F
• serving cup 7 inches in height
• cost $4.95

Example 3: Freshman Class
Qualitative data:
• friendly demeanors
• civic minded
• environmentalists
• positive school spirit
Quantitative data:
• 672 students
• 394 girls, 278 boys
• 68% on honor roll
• 150 students accelerated in mathematics
Exercise (picture slides omitted): For each picture, make one qualitative observation and one
quantitative observation, and explain why each observation is qualitative or quantitative.