Measuring Mobile Voice and Data Quality of Experience: Call for Input
Contents

Section 1: Introduction
Section 2: Understanding the consumer experience
Section 3: Technical performance metrics
Annex 1: How to make submissions
Section 1
1 Introduction
Purpose of this document
1.1 Consumers and citizens are growing increasingly dependent on mobile networks to
make phone calls and access data services. The performance of these networks can
vary between operators, by location and time of day and may not always meet the
expectations of consumers. In this document we use the phrase ‘quality of
experience’ (‘QoE’) to describe the technical performance 1 of the services delivered
to consumers.
1.3 For fixed broadband services, Ofcom has, for several years, collected information on
broadband speeds. This information has enabled consumers to improve their
purchasing decisions, and appears to have driven improvements in service quality by
operators. In this Call for Input, we wish to explore whether there is similar
information that we might provide in the mobile arena. Specifically, we want to
identify what network and/or service performance information Ofcom could gather
which accurately reflects the consumer QoE and which we could publish in a way
that would assist consumers in making informed choices about the mobile service
they purchase.
1.5 Alongside this Call for Input, where appropriate, we will seek to engage directly with
the mobile network operators to ensure that any other relevant information can be
taken into account.
Footnote 1: By technical performance, we are referring to the operation of the network and services (i.e. the coverage, speed, capacity and reliability) rather than customer service related aspects of a mobile service such as billing, call centres and sales.
Footnote 2: Section 3 of the Communications Act 2003.
1.8 Our 2012 Communications Market Report 3 found that 94% of UK adults use a mobile phone and that, with the rise in smartphones and tablets, consumers increasingly rely on mobile networks to provide a mobile data connection as well as a voice and text service. Our Infrastructure Report 2012 Update found that the capacity of the UK’s communications infrastructure is changing quickly, in response to a rapid increase in consumers’ take-up and use of communications services and the resulting investment by operators. 4 The volume of data transferred via mobile devices more than doubled between 2011 and 2012.
1.9 In a time of such rapid change, it is all the more important that consumers have
access to timely and accurate information on the quality of services available in the
market.
1.10 There is also increasing Government interest in ensuring that UK consumers are able
to access mobile services which meet all their needs and expectations. The UK
already has a high level of mobile signal coverage; based on figures derived from
operator predicted coverage models 5, we estimate that 99.7% of UK premises receive an outdoor 2G signal 6 from at least one operator and 93.6% of premises receive a signal from all operators. The Government has announced an initiative aimed at
extending existing mobile voice coverage further still through its mobile infrastructure
project 7 in recognition of the importance of mobile services to citizens and the
economy.
1.12 Coverage of 3G services is lower than 2G, but improving, with a recently increased (outdoor) coverage obligation on operators to reach 90% of UK premises, due to be met by June 2013. Furthermore, Vodafone and Telefónica (O2) have recently begun to share their radio access networks, a development which has the potential to materially reduce the number of partial not-spots 8 in the UK.
1.13 Additionally, we will require that one of the 800MHz licensees in the current 4G spectrum auction deliver a high-speed mobile data service indoors to 98% of UK premises and 95% of premises in each of the Nations by the end of 2017. We anticipate that the resulting outdoor coverage will be materially higher than the indoor requirements.
Footnote 3: Figure 5.55 - http://stakeholders.ofcom.org.uk/binaries/research/cmr/cmr12/CMR_UK_2012.pdf
Footnote 4: http://stakeholders.ofcom.org.uk/binaries/research/telecoms-research/infrastructure-report/Infrastructure-report2012.pdf
Footnote 5: Each mobile operator uses planning tools to predict signal strength in different areas. However, as with any planning tool, these predictions are subject to a margin of error and do not necessarily account for all the factors that can affect the quality of a mobile voice call or data session.
Footnote 6: i.e. the signal is predicted to be sufficiently strong to make and sustain a call while outside.
Footnote 7: http://www.culture.gov.uk/what_we_do/telecommunications_and_online/8757.aspx
Footnote 8: We define partial not-spots as areas with coverage from one or more MNOs, but not all MNOs.
1.14 These developments are expected to bring about significant mobile coverage
improvements for consumers over the next few years, but it will be important to keep
track of how these improvements progress.
Coverage vs quality
1.15 The reach and coverage of mobile signals is just one part of delivering a mobile
service. Consumers sometimes experience misalignment between predicted
coverage and their day-to-day experience of using their mobile phones. Consumers have raised concerns about their ability to make and receive calls or use the internet on their mobile phones directly with Ofcom 9, and such concerns have also been raised with MPs and our own Advisory Committees and through reports in the media.
1.16 We know that a number of factors can affect consumers’ QoE. Poor QoE can, for example, result from localised low signal quality caused by ‘signal shadowing’ from buildings, from the number of people using the network in a particular area, or from the performance of a particular handset. While some of these factors may be
outside the control of the mobile operators (for example, handset performance), the
technical performance of each operator’s network does represent the key
differentiator in the consumer QoE delivered by different networks.
1.17 There are signs that mobile network operators (MNOs) are increasingly competing on issues of service quality and reliability, particularly as they ready themselves to offer new 4G services (subject to the outcome of the 4G auction in 2013). For example, Orange provides a Network Performance Promise 10 which compensates consumers for dropped calls, and Vodafone recently likened the depth of its network coverage to a ‘deep pan’ pizza 11, able to provide service deeper into buildings.
1.18 The extent to which operators are incentivised to improve their consumers’ QoE is in
part related to the competitive advantage that they can gain from offering higher quality. However, unless consumers are able to take the QoE offered by different
operators into account when making purchasing decisions, there is less incentive for
operators to invest in improving it.
4G Auction
1.19 2013 is likely to see rapid change in the mobile market as the 4G auction concludes
and operators roll out 4G networks. We are currently considering the potential scope
of initial research into 4G QoE, with a likely initial focus on connection speeds and coverage. The lessons from any 4G-specific research will be combined with the feedback we receive to this Call for Input and will inform Ofcom’s longer term research objectives in the area of mobile QoE across 2G, 3G and 4G networks.
Footnote 9: We estimate that mobile coverage and quality issues represent approximately 5% of all mobile complaints received by Ofcom.
Footnote 10: http://help.orange.co.uk/orangeuk/support/personal/480099/2
Footnote 11: https://www.vodafone.co.uk/our-network-and-coverage/what-makes-a-great-network/index.htm
1.21 In section 3 we set out the type of data that would be necessary to produce the
relevant consumer information and how this data could be collected.
Next steps
1.22 We welcome feedback from all stakeholders on this Call for Input. We intend to
review responses in April before deciding on how best to proceed.
1.23 We are particularly keen to get the views of stakeholders representing the needs of
consumers in different parts of the UK to ensure we have a clear view of the
information that consumers would find useful when purchasing mobile services. If
there is sufficient interest from stakeholders, we propose to host a workshop in
March to facilitate the development of ideas and options. To register your interest in a
workshop, please contact us by 15 February 2013.
1.24 Details on how to contact us and how to respond to this Call for Input are provided in
Annex 1.
Section 2
2 Understanding the consumer experience
2.2 Under section 14 of the Act we are required to make arrangements to find out about
the experiences of consumers using electronic communications services and the way
they are provided, and we do this by carrying out research into their experiences of
these services. Under section 15 of the Act we have a duty to publish the results of
our research and to take account of it in carrying out our functions; for example we
do this through our annual Communications Market Reports, and our Consumer
Experience Reports.
2.3 We may also inform our thinking by conducting economic or technical research,
and/or by engaging with consumer groups and industry. We gather data directly from
industry on a regular basis.
2.4 In addition, and in keeping with our duty to consider the interests of consumers and
citizens, we also seek to provide advice and information to help consumers make
better and more informed decisions about their telecommunications services.
Consumer information plays a critical role in ensuring competitive communications
markets, and we noted this in our Customer Service Satisfaction report in December
2012 12. A lack of information may lead consumers to make poor purchasing
decisions, or inhibit them from switching provider. If such information is not readily
available or is presented in a complex way, there may be a case for Ofcom to intervene to address these issues in the interests of consumers and to protect them.
2.5 To update our understanding of consumers’ experience of using their mobile phone
and to help us keep track of improvements in consumers’ QoE, we carried out a
consumer survey in November 2012. We expect to conduct research of this kind
annually. This research helps us understand whether and to what extent mobile
phone reception issues affect consumers and, if so, what types of problems are most
prevalent and of most concern.
2.6 Our research also sought to examine whether there are differences in consumers’
QoE in urban and rural areas and in each of the Nations. We have published a report
of our findings alongside this Call for Input 13 and provide highlights of the results
relevant to consumers’ QoE below.
Footnote 12: http://media.ofcom.org.uk/2012/12/04/latest-customer-service-satisfaction-levels-revealed-2/
Footnote 13: Mobile Coverage Report: http://stakeholders.ofcom.org.uk/binaries/consultations/mobile-voice-data-experience/annexes/usage.pdf
2.7 The ability to make or receive calls or texts is consistently selected by consumers as the most important feature when thinking about their mobile operator, followed closely by the price of the service (38% and 34% respectively for the UK as a whole – Figure 1). In
Wales and Northern Ireland, and in rural areas, the ability to make or receive calls is
particularly important when selecting an operator. Mobile users in Wales (53%) and
Northern Ireland (47%) are significantly more likely than those in England (37%) or
Scotland (34%) to say that this is the most important factor when choosing a provider
and users living in rural areas are significantly more likely than those in urban areas
to say this (45% v. 37%). This may be a reflection of a poorer consumer experience
in those locations, although we do not have sufficient information to determine this for
certain.
Figure 1: Most important element when considering mobile provider, by nation and
urban/rural
[Bar chart omitted. Series: ability to make/receive calls; cost of phone service/contract; speed or reliability of internet. Values are percentages by nation and urban/rural.]
2.8 Mobile users were also asked about the importance of the ability to make or receive
calls alongside other aspects of mobile reception (Figure 2). The ability to make and
receive calls remains the most important for mobile users when thinking about their
mobile provider by a considerable margin (50% of UK mobile users). This is
particularly so for those in Northern Ireland (68%). Quality of voice calls is the next
most frequently cited aspect among UK users, with 16% saying this is most
important.
2.9 Our survey found that in the UK as a whole overall satisfaction with mobile providers
was 81% 14 (6% reported they are dissatisfied). There are no differences in levels of
overall satisfaction by urban or rural location or by nation.
2.10 When considering mobile functions and services, illustrated in Figure 3, the highest
level of satisfaction is with the handset, with 78% of users either somewhat or very
satisfied. This is followed by satisfaction with the ability to make or receive calls or
text messages (74%).
2.11 The proportion of people satisfied with the speed or reliability of the internet is lower, with 47% either somewhat or very satisfied. However, when filtered to include only those who use the internet on their mobile phone, the proportion saying they are either somewhat or very
satisfied increases to 70%.
Footnote 14: Another recent Ofcom survey (http://stakeholders.ofcom.org.uk/market-data-research/market-data/consumer-experience-reports/consumer-experience/) found that overall satisfaction with mobile phone services was higher than this, at 89%. The difference may be explained by question ordering; in our November 2012 survey the question about overall satisfaction was positioned immediately after several questions about individual aspects of service, which may have had some influence over what the respondent was considering when rating the ‘overall’ service.
2.12 There are also some differences between the nations, shown in Figure 4, below.
2.13 Users in Scotland are the most satisfied with speed or reliability of the internet (55%),
with those in Wales being the least likely to be satisfied (37%).
2.14 Users in Northern Ireland are the most likely to report dissatisfaction with their ability
to make or receive calls or text messages. This is almost double the proportion who
are dissatisfied with this aspect of service in England (13%).
[Figures 3 and 4 omitted: stacked bar charts of satisfaction with ability to make/receive calls, cost, speed/reliability of internet, handset and customer service, for the UK and by nation (panels including Wales and Northern Ireland). Response scale: very dissatisfied, somewhat dissatisfied, neither satisfied nor dissatisfied, somewhat satisfied, very satisfied.]
The majority of consumers are satisfied with different aspects of mobile reception
2.15 We also asked respondents about their satisfaction with various aspects of mobile
reception (Figure 5). The element with the highest level of satisfaction among UK
mobile users is good quality voice calls (78%). This is followed by calls not getting
cut off (75%), mobile reception (74%) and text messages sent/delivered without
delay (also 74%). Figure 5 shows that just under half (48%) said that they were
satisfied with using the internet, though this rises to 71% when filtered to include only
those who use the internet on their mobile.
Satisfaction with aspects of mobile reception is lower in rural areas and of the four UK
nations is lowest in Northern Ireland
2.16 There are some differences between urban and rural users. Rural users are more
likely than those in urban areas to be very dissatisfied with their ability to make or
receive calls (6% vs. 3%). Figure 6, below, shows differences in satisfaction between
the nations. Users in Northern Ireland appear to be the least satisfied with their
ability to make or receive calls (18% report that they are dissatisfied). They also have
the highest levels of dissatisfaction with the quality of voice calls (12%) and calls not
getting cut off (12%).
[Figures 5 and 6 omitted: stacked bar charts of satisfaction with making/receiving calls, good quality voice calls, texts sent/delivered without delay, calls not getting cut off and ability to use the internet, with panels by nation (England, Scotland, Wales, Northern Ireland). Response scale: very dissatisfied, somewhat dissatisfied, neither satisfied nor dissatisfied, somewhat satisfied, very satisfied.]
Over half of UK mobile users say they have experienced problems with reception –
this rises to six in ten in Wales and three-quarters in Northern Ireland.
2.17 Just over half (53%) of UK mobile users have at some point experienced issues with mobile reception, with 12% experiencing four or more problems.
2.18 The most common problem is having no signal/reception on phone (34%), followed
by poor sound quality/sound breaks up, call ending unexpectedly and being unable to
use the mobile internet (all 15%), being unable to make/connect a call even though
the phone shows “bars” present and text messages not arriving or arriving late (both
13%), being unable to send text messages (12%), and being unable to send or
receive emails (8%).
2.19 There are no statistically significant differences between users living in rural and
urban locations. However, among the nations (Figure 7), mobile users in Northern
Ireland are significantly more likely than those in England, Scotland and Wales ever
to experience a problem (75% vs. 52%, 51% and 60%). Around a third (32%) of
people in Northern Ireland say they have experienced four or more of these
problems.
Figure 7: Mobile phone users who have ever experienced problems with reception, by
nation
[Bar chart omitted. Categories shown for each nation (England, Scotland, Wales, Northern Ireland): any problem; no problem; no signal/reception on phone; poor sound quality/call breaks up; call ends unexpectedly; unable to use mobile internet; unable to make/connect a call; text message does not arrive/arrives late; unable to send text messages; unable to send/receive emails. Values are percentages.]
2.20 Figure 8 shows the frequency with which problems are experienced. Having no
signal or reception on the phone is experienced most frequently, with 10% of mobile
users saying they experience this frequently. The frequency across the other
problems we asked about is lower, with between two and five per cent of mobile
users frequently experiencing these.
[Figure 8 omitted: bar chart showing how often each problem is experienced (frequently, sometimes, rarely) for: no signal/reception on phone; poor sound quality/call breaks up; call ends unexpectedly; unable to use mobile internet; unable to make/connect a call; text message does not arrive/arrives late; unable to send text messages; unable to send/receive emails.]
Conclusions
2.22 These results show us that, as one might expect, being able to use a mobile phone to make and receive calls is very important to consumers. Many mobile users say that
they experience no problems at all 15 and the majority of UK mobile users are
satisfied with their mobile service overall (81%). However, a significant minority of
consumers (most notably in Northern Ireland) experience a range of recurring
problems when they try to use their mobile phones (see paragraph 2.20 and figure 7
above).
2.23 The mobile reception issue consumers are most dissatisfied about is their ability to
make and receive calls (12% in England, 8% in Scotland, and 11% in Wales are
dissatisfied), with those in Northern Ireland most likely to be dissatisfied (18%). No
mobile signal or reception (in order to make or receive calls or texts) is also the most
common problem consumers say they have experienced (more than twice as many
people have ever experienced that problem compared to each of the other problems
we asked about – see Figure 7 and paragraph 2.20 above).
2.24 The research clearly indicates that some consumers are not wholly satisfied with the QoE of their mobile service, and that various aspects of QoE are important to them. This suggests that if appropriate information were available to allow consumers to compare operators’ QoE, they would use this information when purchasing mobile services and select packages/providers that better suited their needs.
Footnote 15: 43% in England, 48% in Scotland, 39% in Wales and 24% in Northern Ireland stated they had never experienced a problem.
2.25 We have considered similar matters in the past in the 2010 Quality of Service
research report 16. This research found that consumers particularly valued information
on price and network quality of service. Ofcom’s accreditation scheme for price
comparison websites encourages clear and accurate consumer information on price.
As a result of the research report, we also considered the information provided to
consumers on fixed line broadband speeds. This initiative has brought about improvements in both the information provided by broadband providers and the speeds consumers can expect from their broadband service.
2.26 Adopting a similar approach to the technical performance of mobile networks has the
potential to bring about further improvements for those who have a poor experience
with mobile performance. Ofcom has a number of existing publications where this
information could be published, including (but not limited to) the Communications
Market Report, Infrastructure Report and Consumer Experience Report.
2.27 Publishing information on QoE will enable consumers to make better informed
purchasing decisions and drive competition between operators. This in turn will result
in improved network performance for the benefit of consumers.
Footnote 16: http://stakeholders.ofcom.org.uk/consultations/topcomm/qos-report/
Section 3
3 Technical performance metrics
3.3 Although it is likely that QoE information will need to be presented and tailored to meet the needs of different consumer groups, there are a number of core characteristics of QoE information that we think will be universally applicable. These include:
• The consumer ‘use case’. For any given location, QoE can vary depending on whether the consumer is indoors or outdoors and whether they are on the move (whether on foot, in a motor vehicle or on a train).
• Network performance by time of day and day of week. Our experience from
measuring fixed broadband is that performance can degrade at peak usage
times. For mobile broadband (and potentially voice calls) similar effects may be
present and consumers may wish to know which operator provides the highest
network capacity in the areas they wish to use their mobile service.
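As a purely illustrative sketch of the time-of-day point above, the Python snippet below groups hypothetical throughput samples into operator/weekday/hour buckets and reports a median for each bucket. The field names and record format are assumptions made for illustration; they are not an Ofcom or MNO data format.

# Illustrative sketch only (not an Ofcom or MNO data format): bucket
# throughput samples by operator, weekday and hour to expose peak-time
# degradation in mobile broadband performance.
from collections import defaultdict
from datetime import datetime
from statistics import median

def median_speed_by_time(samples):
    """Return the median download speed for each (operator, weekday, hour) bucket."""
    buckets = defaultdict(list)
    for s in samples:
        t = datetime.fromisoformat(s["timestamp"])
        buckets[(s["operator"], t.strftime("%a"), t.hour)].append(s["download_kbps"])
    return {key: median(values) for key, values in buckets.items()}

# Hypothetical samples, e.g. from crowd-sourced or drive-test measurements.
samples = [
    {"operator": "MNO-A", "timestamp": "2013-02-04T18:30:00", "download_kbps": 950},
    {"operator": "MNO-A", "timestamp": "2013-02-04T03:10:00", "download_kbps": 2100},
]
print(median_speed_by_time(samples))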
3.4 For those consumers who particularly value the quality of voice and text services, we
consider that there are a number of important QoE metrics:
• Locations in which they are able to reliably make and receive a call under
different use cases
3.5 For mobile data services, the following information may be of use to consumers:
3.6 For each of the metrics above, given that performance may vary in different parts of
the UK, under different use cases and at different times, there may be merit in
providing information at a local level, by use case and by time of day.
3.9 It is clearly important that published information is accurate and up-to-date if the market for mobile services is to operate effectively. Given the rapid rate of change in
the market (particularly with the advent of 4G services) regular updates to the
information will be required. We currently collect coverage data annually, but we
welcome views on how often information should be refreshed to ensure that mobile
markets work effectively.
3.11 The data for some of these metrics may be readily available from MNO operational systems at the granularity required – in which case the data can be directly converted
to the information provided to consumers. For example, MNOs may already collect
data on dropped calls.
3.12 Where suitable data is not available to produce particular QoE metrics, then it may
be necessary to use proxies. For example, data on actual network coverage may not
be readily available from the MNO operational systems, but can be estimated using
planning models (potentially validated with field measurement). This is the approach
we currently use when reporting network coverage – predicted signal strengths
produced from MNO planning tools are used as a proxy of actual coverage.
3.13 Although proxies may be less accurate than actual data (and hence may be of less use to consumers who wish to know whether they can use a specific service in a specific location and use case), if defined correctly they can still be used by consumers to make comparisons between MNOs.
3.14 Under our Infrastructure Reporting duty we are required to report on the state of
communications networks in the UK as well as the services they carry. As such, the
network data we collect for the Infrastructure Report may also be useful for deriving
QoE proxies.
3.16 There are advantages and disadvantages to each approach. Actual performance data will typically better represent the consumer experience in that it provides information based on consumers’ actual usage and location. However, the disadvantage of actual performance data is that it will only provide data where the tests are carried out (i.e. it will not cover 100% of a geographic area). It will, for example, provide no data in not-spots (by definition), and in areas where little data is available (such as highly rural areas) it may not be possible to derive statistically robust comparisons between operators.
3.17 Predicted performance is likely to offer far more granular geographic data (as
operator planning tools can operate down to a high level of geographic granularity)
but accuracy of predictions will be subject to error margins given the complexity of
predicting radio propagation in cluttered environments and inside buildings. In addition, planners cannot always predict how heavily a network will be used. As such, predicted performance will not always reflect actual performance.
Predicted performance
3.19 We already collect data from MNOs on predicted signal strength for 2G and 3G networks. We use this to estimate geographic and premises coverage across the UK. This data is collected at a granularity of 200m x 200m and is the basis of the coverage information we publish in the CMR and Infrastructure Reports.
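To make the use of this pixel-level data concrete, here is an illustrative Python sketch of how premises coverage from at least one operator and from all operators (the kind of figures quoted in paragraph 1.10) could be derived from per-pixel predicted signal strengths. The record layout and the -81 dBm threshold are assumptions for illustration only, not Ofcom’s actual methodology.

# Illustrative sketch only: estimate premises coverage from per-pixel
# predicted signal strength. The pixel records, field names and the
# -81 dBm threshold are assumptions, not Ofcom's actual methodology.
def premises_coverage(pixels, operators, threshold_dbm=-81.0):
    """Return (share of premises covered by at least one operator,
    share of premises covered by all operators)."""
    total = sum(p["premises"] for p in pixels)
    any_covered = sum(
        p["premises"] for p in pixels
        if any(p["signal_dbm"][op] >= threshold_dbm for op in operators)
    )
    all_covered = sum(
        p["premises"] for p in pixels
        if all(p["signal_dbm"][op] >= threshold_dbm for op in operators)
    )
    return any_covered / total, all_covered / total

# One record per 200m x 200m pixel (hypothetical values).
pixels = [
    {"premises": 120, "signal_dbm": {"MNO-A": -75, "MNO-B": -88}},
    {"premises": 40, "signal_dbm": {"MNO-A": -92, "MNO-B": -79}},
]
print(premises_coverage(pixels, ["MNO-A", "MNO-B"]))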
3.20 In addition to signal strength, there may be other metrics generated by MNOs’ planning tools that it would be appropriate for us to gather. For example, for a given location (e.g. a 200m x 200m pixel) or cell site footprint, the data types shown in Figure 7 may provide additional valuable information.
Figure 7: Possible predicted performance metrics available from MNO planning tools

Metric: Signal to noise and interference ratio
Benefit in collecting the data: Potentially a better indicator than signal strength alone in estimating network coverage.

Metric: Network technology (e.g. 2G, 3G, HSDPA, LTE) and which 3GPP software revision has been rolled out
Benefit in collecting the data: Would allow the roll out of different technology types to be tracked. This could be used as a proxy for mobile broadband performance.

Metric: The radio spectrum band and number of carriers in use
Benefit in collecting the data: Provides insight into spectrum utilisation and network capacity.

Metric: The backhaul arrangements for a given cell site
Benefit in collecting the data: Provides insights into speed, capacity and potentially latency of mobile broadband.

Metric: The geographic area, number of premises, vehicles per day and/or predicted number of calls per day
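For illustration only, the metrics in Figure 7 could be captured in a simple per-pixel (or per-cell) record along the following lines. The field names and types are assumptions made for this sketch; actual planning-tool exports will differ between MNOs.

# Illustrative sketch only: a possible record for the planning-tool metrics
# listed in Figure 7. Field names and types are assumptions; actual exports
# will differ between MNOs.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PredictedCellMetrics:
    pixel_id: str                            # e.g. a 200m x 200m grid reference
    sinr_db: float                           # signal to noise and interference ratio
    technology: str                          # e.g. "2G", "3G", "HSDPA", "LTE"
    spectrum_band_mhz: int                   # radio spectrum band in use
    carriers: int                            # number of carriers in use
    backhaul: str                            # e.g. "fibre", "microwave"
    premises: Optional[int] = None           # premises within the pixel or cell footprint
    predicted_calls_per_day: Optional[int] = None

example = PredictedCellMetrics(
    pixel_id="SJ8990-1234", sinr_db=12.5, technology="LTE",
    spectrum_band_mhz=800, carriers=2, backhaul="fibre", premises=150,
)
print(example)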
3.21 We will seek to engage directly with the MNOs to explore which metrics produced by their planning tools might be useful in deriving proxies of QoE.
Actual performance
3.22 Whilst predicted performance data is generally only available from MNOs’ planning
tools, actual performance data can be provided by MNOs or third parties.
3.23 Typically, actual performance metrics are collected by third parties (often on behalf of MNOs) by placing test calls and data sessions on the networks in different locations. Often referred to as ‘drive testing’, this approach seeks to mimic end user behaviour and so provides a good insight into consumer QoE. The main disadvantage of drive testing is
the high costs required to cover a representative sample of the UK, particularly if it
has to be repeated at regular intervals.
3.24 As an alternative to drive testing, MNOs are likely to have very rich data from their
operational systems which could provide very granular information on service quality
– effectively analysing the performance data associated with the millions of calls and
data sessions made each day on their networks, rather than relying on a small
number of drive tests. The advent of “big data” tools has made it possible to process this data cost-effectively, and it may be possible to produce suitable proxies of QoE.
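As a simple illustration of the kind of QoE proxy that might be derived from operational records, the sketch below computes a dropped-call rate per cell from a list of call records. The record fields are hypothetical; real MNO systems use their own schemas and far larger data volumes.

# Illustrative sketch only: a dropped-call-rate proxy computed per cell from
# hypothetical operational call records; real MNO systems use their own
# schemas and far larger data volumes.
from collections import Counter

def dropped_call_rate(call_records):
    """Return dropped calls as a share of all calls, keyed by cell."""
    totals, dropped = Counter(), Counter()
    for rec in call_records:
        totals[rec["cell_id"]] += 1
        if rec["outcome"] == "dropped":
            dropped[rec["cell_id"]] += 1
    return {cell: dropped[cell] / count for cell, count in totals.items()}

records = [
    {"cell_id": "cell-001", "outcome": "completed"},
    {"cell_id": "cell-001", "outcome": "dropped"},
    {"cell_id": "cell-002", "outcome": "completed"},
]
print(dropped_call_rate(records))  # {'cell-001': 0.5, 'cell-002': 0.0}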
3.25 We intend to explore with the MNOs what actual performance data they collect, but
we also welcome the views of other stakeholders on the types of actual performance
data that are available.
Collection approaches
3.26 There are several alternative methodologies for collecting the underlying data that is
needed to provide consumer information. Each will have different merits with respect
to the granularity of the data collected, the costs of collection and the quality of the
data.
3.27 Broadly, we envisage that the data will come from either the MNOs themselves, from
third parties or from a hybrid approach.
3.28 As outlined above, MNOs may be able to extract a wide range of relevant data from
their existing planning tools and operational systems. We recognise that the
information available may vary between MNOs and so work would be required to
identify a common set of metrics that would allow MNO performance to be compared
fairly.
3.29 Ofcom has previously commissioned research into mobile network performance from
third parties 17 and, as described in paragraph 1.19, is considering research into 4G QoE in 2013. Ofcom also undertakes research using third party data to measure fixed broadband 18. By commissioning a third party contractor to undertake this work we are not reliant on the service providers to extract data from their systems, and we have been able to collect data that those systems do not hold. It also ensures
that data is collected across operators in a consistent way and is truly independent of
the operators.
3.30 There are a number of approaches third parties adopt to collect data. These include:
• crowd sourcing, using an application downloaded to volunteers’ smartphones;
• drive testing and/or walk testing; and
• fixed measurement probes.
3.31 Each approach has its merits. Crowd sourcing can be a cost-effective way to gather large quantities of data, and it reflects actual user locations and use. However, it does not cover all locations, it is not always clear where the device is located when the test is made (so it could be indoors or outdoors, in a bag or in the user’s hand) and sufficient numbers of volunteers are needed for robust results. It also may not be possible to gather all the metrics required. This raises potential challenges in gathering statistically robust results and specific important information. There may also be costs associated with recruiting volunteers for crowd sourcing.
Footnote 17: http://media.ofcom.org.uk/2011/05/26/mobile-broadband-speeds-revealed/
Footnote 18: http://stakeholders.ofcom.org.uk/market-data-research/other/telecoms-research/broadband-speeds/broadband-speeds-may2012/
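A minimal sketch of one way to guard against the robustness concern described above: only report a crowd-sourced comparison for an area where an operator has at least a minimum number of tests. The threshold of 100 tests and the field names are arbitrary placeholders, not a statistically derived rule.

# Illustrative sketch only: require a minimum number of crowd-sourced tests
# per operator in an area before a comparison is reported. The threshold of
# 100 tests is an arbitrary placeholder, not a statistically derived value.
from collections import Counter

def publishable_operators(tests, area, min_tests=100):
    """Return operators with enough tests in the given area to report a comparison."""
    counts = Counter(t["operator"] for t in tests if t["area"] == area)
    return sorted(op for op, n in counts.items() if n >= min_tests)

tests = ([{"operator": "MNO-A", "area": "Cardiff"}] * 120
         + [{"operator": "MNO-B", "area": "Cardiff"}] * 40)
print(publishable_operators(tests, "Cardiff"))  # ['MNO-A']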
3.32 Drive testing and/or walk testing measurements are taken in a far more controlled
environment because the location of the tests is chosen and the location of the test
device is known. Tests can be repeated in specific locations as required. However,
the cost of data collection can be higher, mainly because of travel costs and,
potentially, call and data charges. Drive testing may be more appropriate if targeted data collection is required: for example, concentrating tests in areas which are predicted to have poor performance or are otherwise of specific interest, and/or in sample areas to allow predicted performance to be validated against actual performance data.
3.33 Fixed probes are likely to give robust, comparable data because tests are completed
in the same place at regular intervals. However, only these locations are sampled.
Data costs can be high because of the high volume of traffic sent.
3.34 In our mobile broadband research in 2011, we used all three of the measurement approaches discussed above – measurement devices in fixed locations, drive testing in a small number of case study areas and an application downloaded to volunteers’ smartphones. We wish to explore through this CFI and through our planned work in
2013 the most effective approaches to third-party collection of mobile QoE
information.
Other approaches
3.35 Hybrid approaches, where third parties collect data from mobile operators’ systems,
may provide a good balance of cost vs. independence and quality/depth of data.
Such an approach could ensure that data were comparable between operators and
consistently analysed. Potentially, the third party could also aggregate data before it
is provided to Ofcom.
3.36 Industry-led initiative. Our primary objective is to ensure consumers have access to accurate and comparable information on mobile performance, and this does not necessarily require Ofcom to collect and publish all the relevant data (although we do have duties to collect and publish certain data). An industry-led initiative could achieve a similar outcome, possibly in conjunction with comparison websites or
consumer information bodies. However, the absence of such an initiative to date
suggests that the necessary incentives or coordination are not in place.
3.37 We recognise that the collection of any data will incur costs, whether for Ofcom or
operators. It is therefore important that we are proportionate when collecting data –
balancing the benefits that are derived from providing information to consumers
against the costs of collecting it.
3.38 We believe that Ofcom has a role to play in collecting some form of third party data to
ensure information is independent and accurate. We would welcome comments on Ofcom taking this role, and on whether that role should be more focused on validating data from operators or on collecting the data itself.
Annex 1
How to make submissions
A1.2 We invite written views and comments on the issues raised in this document, to be
made by 5pm on 1 April 2013.
A1.3 We are particularly keen to get the views of stakeholders representing the needs of
consumers in different parts of the UK to ensure we have a clear view of the
information that consumers would find useful when purchasing mobile services. If
there is sufficient interest from stakeholders, we propose to host a workshop in
March to facilitate the development of ideas and options. To register your interest
in a workshop, please contact us by 15 February 2013.
• post to the address below, marked with the title of the consultation ‘Measuring
mobile quality of experience’ (and a completed consultation response cover
sheet – see last page).
Ruth John
Ofcom
Riverside House
2A Southwark Bridge Road
London SE1 9HA
A1.5 Note that we do not need a hard copy in addition to an electronic version. We will
acknowledge receipt of responses if they are submitted using the online web form
but not otherwise.
Confidentiality
A1.6 We believe it is important for everyone interested in an issue to see the views
expressed by consultation respondents. We will therefore usually publish all
responses on our website, www.ofcom.org.uk, ideally on receipt. If you think your
response should be kept confidential, please specify which parts of your response, or whether all of it, should be kept confidential, and why. Please also place such parts in a separate annex.
A1.7 If someone asks us to keep part or all of a response confidential, we will treat this
request seriously and will try to respect this. But sometimes we will need to publish
all responses, including those that are marked as confidential, in order to meet legal
obligations.
A1.8 Please also note that copyright and all other intellectual property in responses will
be assumed to be licensed to Ofcom to use. Our approach to intellectual property rights is explained further on our website at
http://www.ofcom.org.uk/about/accoun/disclaimer/
BASIC DETAILS
Consultation title:
To (Ofcom contact):
Name of respondent:
CONFIDENTIALITY
Please indicate below which parts of your response you consider to be confidential, giving your reasons why:
If you want part of your response, your name or your organisation not to be published, can
Ofcom still publish a reference to the contents of your response (including, for any
confidential parts, a general summary that does not disclose the specific information or
enable you to be identified)?
DECLARATION
I confirm that the correspondence supplied with this cover sheet is a formal consultation
response that Ofcom can publish. However, in supplying this response, I understand that
Ofcom may need to publish all responses, including those which are marked as confidential,
in order to meet legal obligations. If I have sent my response by email, Ofcom can disregard
any standard e-mail text about not disclosing email contents and attachments.