Making a better web form
Caroline Jarrett, Effortmark Limited, caroline.jarrett@effortmark.co.uk
Christopher Minott, Loanbright.com, chris@loanbright.com
ABSTRACT
Which usability changes make the biggest difference? We were surprised by some of the results when we
tested a selection of changes to a typical on-line form. Changes included layout, wording of questions,
and addition of extra pages. The only one to make a significant difference was re-working the introduction
to the form: re-designing the preamble and adding pages for 'about us' and 'contact us'.
INTRODUCTION
Background
LoanBright is a web business that finds customers for mortgage providers and offers a range of tools to
help mortgage providers manage their sales process.
The basic flow is:
- user arrives at a form by clicking on an advertisement (generally, low conversion) or a sponsored
link from a search engine (generally, much higher conversion)
- user provides a selection of information about the mortgage they want and some personal details
- when the form is submitted, the quote finder sends the information to appropriate mortgage
providers
- the mortgage providers bid for the business
- the quote finder selects the best four providers and displays them to the user.
The business is funded by the fees paid by the mortgage providers for the leads that they receive.
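The selection step at the end of that flow is easy to picture in code. The sketch below is ours, in Python, and assumes providers are ranked purely by the fee they bid for the lead; the actual selection criteria are not described in this paper.

    # Minimal sketch of the 'best four providers' step. Ranking purely by
    # bid amount is an assumption for illustration.
    def best_four(bids: dict[str, float]) -> list[str]:
        """Return the four highest-bidding providers for a lead."""
        return sorted(bids, key=bids.get, reverse=True)[:4]

    print(best_four({"A": 12.0, "B": 9.5, "C": 14.0, "D": 8.0, "E": 11.0}))
    # -> ['C', 'A', 'E', 'B']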
The marketplace is especially competitive. Respectable web businesses are fighting it out with traditional
financial institutions that are moving to the web, and there is also a nasty presence of spammers.
A key business measurement is the ‘conversion rate’: the percentage of users who complete a form (all
the way to the ‘submit’ or ‘send’ button) out of those who arrive at it. It is the complement of the
‘drop-out rate’.
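In code the measurement is trivial; this helper (ours, purely for clarity) makes the definition concrete.

    def conversion_rate(completions: int, arrivals: int) -> float:
        """Percentage of arriving users who complete the form."""
        return 100.0 * completions / arrivals

    # e.g. 54 completed forms from 1,000 arrivals:
    print(conversion_rate(54, 1000))   # 5.4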
The problem
LoanBright’s form, shown in Figure 1 below, had done good service for several years, but the feeling
within LoanBright was that it was looking old-fashioned and might be inefficient. (They have a variety of
forms with minor variations, but this one is representative.)
Also, some competitors used photos of people and/or property, and LoanBright wondered whether these
might create a more attractive or engaging appearance, resulting in a higher conversion rate.
Figure 1 The top of the original form.
Figure 2 The top of a typical competitive form.
METHOD Phase 1: Expert Inspection
Because there were some differences of opinion about which changes would have the greatest effect,
LoanBright decided to start with an expert inspection by Caroline Jarrett of Effortmark Ltd.
She reviewed their form, and those of seven competitors. This was feasible because all of these forms
are fairly short, typically three or four screenfuls of questions. Figure 2 shows a typical competitor’s form.
This was an ordinary expert review, and is included here as background for the second phase.
RESULTS Phase 1: Expert Inspection
Expert inspection of the form and of competitive forms showed that the LoanBright form was one of the
best available. Typical faults of competitors’ forms included:
• nothing to show how much work was required
• too many questions relative to the purpose of the form
• questions with confusing wording
• questions asking for personal data without justifying why it was required
• excessively crowded screens.
For example, Figure 3 points out a few of the faults of one of the competitive forms.
Figure 3 The top of a typical competitive form, with some of its faults called out: the form has 5 pages,
too many for this task; the preamble is split between the left margin and the top of the form; an
unnecessary instruction (‘do not use the back button’); and a question about military service, which
applies to few users and should be left to later in the process.
The LoanBright form had many good points:
• a single-page form, approximately three screenfuls at 800 x 600 resolution
• spacious layout with simple dividers between sections
• ‘sensible’ appearance without gimmicks
• most questions had clear and obvious wording
However, there were points where the Loanbright form could be improved:
1. ‘wall of words’ preamble: some of the words are unnecessary, and the presentation as a solid
block of text is uninviting
2. nothing to indicate that the organization behind the form is a respectable business (no ‘about us’
or ‘contact us’).
3. colour scheme might be seen as dull or unattractive
4. the question “have you ever had a bankruptcy?” (a question about credit history is a standard
item on these forms) did not offer a choice of ‘never’; instead, the user had to pick ‘none /
bankruptcy over 7 years ago’
5. some other minor changes to wording.
LoanBright decided to address point 4 immediately, so we have no data on whether it is a significant
change or merely a minor good idea. Point 5 was left aside for the future.
METHOD Phase 2: Comparative testing
These forms are reached through advertising or after using a search engine, so visits to them are spur of
the moment rather than planned destinations. Financial and organisational constraints meant that
traditional usability testing with users was not practical. Fortunately, LoanBright was willing to try a
selection of form variants on the live service.
As we had complete control over how forms were served to users, we elected to create many different
versions of the form with a variety of changes made in each version. These forms were served to a
randomized, balanced stream of traffic so that each version was seen by directly comparable audiences.
There was no observation of users, or indeed any contact with them other than their interaction with
whatever form was served. We therefore have no qualitative data about why one variant worked better,
or worse, than another.
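To illustrate how such serving can work, here is a minimal sketch in Python; it is our illustration, not LoanBright's implementation. Hashing a session identifier, rather than drawing at random on every request, keeps a visitor on the same variant if they reload the page, while still spreading traffic evenly across the variants.

    import hashlib

    VARIANT_IDS = list(range(10, 30))   # forms 10 to 29, as in Table 1

    def assign_variant(session_id: str) -> int:
        """Deterministically map a visitor to one form variant."""
        digest = hashlib.md5(session_id.encode()).hexdigest()
        return VARIANT_IDS[int(digest, 16) % len(VARIANT_IDS)]

    print(assign_variant("visitor-42"))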
The forms we tested had combinations of these variations:
1. Centered in the browser window / left justified in the browser window
2. Coloured backgrounds in fields / plain (white) backgrounds in fields
3. Blue background behind form / yellow background behind form / plain (white)
background
4. Reworked preamble / reworked preamble plus links for “About us” and “Contact us” /
original preamble
5. Small photo of woman / large photo of woman / photo of house and people (Figure 4) / no photo
Figure 4 Photo of house/people
Figure 5 shows three of the variants tested.
We put together the variants and started testing; only then did we draw up Table 1, showing all the
variants tested. At that point, we noticed that it would have been better:
• to test more forms with the old preamble
• to test more than one variant with the new preamble but no ‘about us/contact us’ links
• to include some forms with a photo and the old preamble.
As the testing had already started, we decided to continue with the variants we had already prepared
rather than adding any extra variants.
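For context, the five variations define a far larger design space than the 20 forms we actually served; this short sketch (ours) enumerates it:

    from itertools import product

    alignments  = ["centred", "left-justified"]
    fields      = ["coloured", "plain"]
    backgrounds = ["blue", "yellow", "plain"]
    preambles   = ["reworked", "reworked + about/contact", "original"]
    photos      = ["small woman", "large woman", "house/people", "none"]

    combinations = list(product(alignments, fields, backgrounds, preambles, photos))
    print(len(combinations))   # 144 possible variants; we served 20

Serving 20 of 144 combinations inevitably leaves some changes entangled with others, which is the root of the gaps listed above.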
Form 28: centred in browser; coloured fields
Form 10: new preamble / no 'start over' button
Form 26: centred in browser; yellow background; coloured fields; new preamble / no 'start over' button;
small photo of woman
Figure 5 Some of the variants tested
Table 1 Form variants tested. Form 19 is the baseline (no changes at all).
Form | centred | coloured fields | background | new preamble / no 'start over' | about us / contact us | photo
10 y y y
11 y y y small photo of woman
12 y y photo of house / people
13 blue y y
14 y y small photo of woman
15 y large photo of woman
16 y y photo of house / people
17 y y y
18 y blue y y
19
20 y y y
21 y y blue y y
22 y y yellow y y
23 y y y y small photo of woman
24 y blue y y
25 y y y small photo of woman
26 y y yellow y y small photo of woman
27 y y y small photo of woman
28 y
29 y y y photo of house / people
RESULTS Phase 2: Comparative testing
A preliminary test on a very poor stream of traffic (typical conversion rate around 1%) showed that the
variants were comparable to or better than the original form. LoanBright therefore decided to expose the
variants to a better stream of traffic, this time with an anticipated conversion rate of around 5%.
During October 2003, the average conversion rate across all the variants was 7.3%, about 2.5 percentage
points higher than anticipated and therefore roughly 33% better. This was encouraging and in itself
justified the work of producing the variants.
However, as Table 2 shows, we had difficulty working out which changes gave the best results. Three of
the new variants actually performed slightly worse than the original. It was also hard to relate the
rankings in Table 2 to any particular group of changes.
Table 2 First stream of traffic, October 2003
Form | Conversion rate | Rank (October 2003)
21 | 10.2% | 1
11 | 9.1% | 2
13 | 9.0% | 3
27 | 9.0% | 4
26 | 8.8% | 5
20 | 8.7% | 6
24 | 8.3% | 7
18 | 8.2% | 8
23 | 7.5% | 9
10 | 7.2% | 10
16 | 7.2% | 11
25 | 7.1% | 12
15 | 6.7% | 13
22 | 6.5% | 14
17 | 6.3% | 15
29 | 6.1% | 16
19 (original) | 5.4% | 17
28 | 5.0% | 18
12 | 4.9% | 19
14 | 4.7% | 20
As the average improvement was good, LoanBright decided to try the variants on a better stream of traffic
(typical conversion rates around 10%) during November to see whether the improvement was
maintained.
We were pleased that the average conversion rate again improved by 2.5 percentage points. Table 3
was also difficult to interpret, as forms that had previously done well dropped in the rankings and others
rose.
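One way to quantify that instability is a rank correlation between the two months' orderings; this analysis is ours and was not part of the original study. Table 3 already pairs each form's October rank with its November position, so the figures can be read straight off it.

    from scipy.stats import spearmanr

    # October ranks read down the rows of Table 3; the row order itself
    # is the November ranking (1 = best).
    october_ranks = [14, 3, 15, 4, 11, 5, 2, 7, 6, 12,
                     16, 1, 19, 8, 13, 9, 10, 20, 17, 18]
    november_ranks = list(range(1, 21))

    rho, p = spearmanr(october_ranks, november_ranks)
    print(f"rho = {rho:.2f}, p = {p:.3f}")   # low rho = unstable rankings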
Table 3 Second stream of traffic, November 2003
Form | Rank from October 2003 | Conversion rate (November 2003)
22 | 14 | 15.5%
27 | 3 | 14.2%
17 | 15 | 14.0%
13 | 4 | 13.4%
10 | 11 | 13.3%
26 | 5 | 13.3%
11 | 2 | 13.1%
24 | 7 | 12.8%
20 | 6 | 12.8%
25 | 12 | 12.1%
29 | 16 | 11.8%
21 | 1 | 11.5%
12 | 19 | 11.3%
18 | 8 | 11.3%
15 | 13 | 11.2%
23 | 9 | 11.2%
16 | 10 | 10.9%
14 | 20 | 10.7%
19 (original) | 17 | 10.1%
28 | 18 | 8.7%
We decided that the next step needed to be some statistical analysis. We used Michael Hughes’s
Usability Data Analyzer (1) to compare forms with a specific change to the others without that change.
New preamble is better
The first analysis compared the variants with the original preamble (forms 19 and 28) against the variants
with the new preamble (all the others). Again, in hindsight it was a pity that we had only two of one type
and 18 of the other. Both tests, October and November, showed a significant difference (p = 0.047
October, p = 0.006 November) between the conversion rates for the variants with the original preamble
and those with the new preamble.
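For readers without the Usability Data Analyzer, the comparison can be reproduced with a standard two-sample t-test. The sketch below is ours, not the tool's internals; assuming equal variances, it gives a p-value close to the reported 0.006 on the November figures from Table 3.

    from scipy.stats import ttest_ind

    old_preamble = [10.1, 8.7]   # forms 19 and 28, November rates (%)
    new_preamble = [15.5, 14.2, 14.0, 13.4, 13.3, 13.3, 13.1, 12.8, 12.8,
                    12.1, 11.8, 11.5, 11.3, 11.3, 11.2, 11.2, 10.9, 10.7]

    t, p = ttest_ind(new_preamble, old_preamble)   # equal variances by default
    print(f"t = {t:.2f}, p = {p:.3f}")             # p comes out near 0.006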
No significant differences for any other changes
From this point, we assumed that the new preamble was definitely better, so the remaining tests excluded
the two forms with the old preamble. Table 4 below gives our analysis of the effects of four changes (a
sketch of how such comparisons can be scripted follows the list):
1. Adding a photo (we did not distinguish between type of photo or size of photo, due to small sample
sizes)
2. Adding coloured backgrounds to fields
3. Centring the form in the browser window
4. Adding a blue background to the window.
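The same test can be run for each change in turn by grouping forms on a feature flag. The sketch below is ours; the rates and flags in it are invented purely to show the shape of the analysis, and the real values are in Tables 1 and 3.

    from statistics import mean
    from scipy.stats import ttest_ind

    # Invented example data: a November rate plus feature flags per form.
    forms = [
        {"rate": 13.5, "photo": True,  "coloured": True,  "centred": True,  "blue": False},
        {"rate": 12.0, "photo": True,  "coloured": False, "centred": False, "blue": True},
        {"rate": 12.8, "photo": False, "coloured": True,  "centred": False, "blue": True},
        {"rate": 11.9, "photo": False, "coloured": False, "centred": True,  "blue": False},
    ]

    for feature in ("photo", "coloured", "centred", "blue"):
        with_change = [f["rate"] for f in forms if f[feature]]
        without = [f["rate"] for f in forms if not f[feature]]
        t, p = ttest_ind(with_change, without)
        print(f"{feature}: {mean(with_change):.1f}% vs {mean(without):.1f}%, p = {p:.3f}")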
Table 4 Analysis of effects of changes
Change | Mean conversion without change | SD | Mean conversion with change | SD | p | Conclusion
1. Adding a photo | 13.1%, N=8 | 1.4% | 12.0%, N=10 | 1.2% | .085 | Adding a photo is slightly worse
2. Adding coloured backgrounds to fields | 11.9%, N=8 | 1.0% | 12.9%, N=10 | 1.5% | .115 | Inconclusive
3. Centring the form in the browser window | 12.4%, N=12 | 1.3% | 12.7%, N=6 | 1.6% | .684 | Definitely no difference
4. Adding a blue background to the window | 12.2%, N=4 | 1.0% | 12.2%, N=12 | 1.2% | .998 | Definitely no difference
CONCLUSIONS AND SPECULATION
We conclude from this study that:
• a clear, short, neatly arranged statement of the purpose of the form in the preamble is better than
similar but longer information presented as two blocks of text.
• tinkering with a design that is reasonably tidy and organized in the first place is unlikely to make a
major difference to the success of a form.
We also speculate that offering variants of the same form with minor visual changes may in itself increase
conversion rates, simply because potential users are being presented with something a little different
each time.
REFERENCES
1. Michael Hughes, “The Usability Data Analyzer”, available for download at
http://www.mindspring.com/~mikehughes