8/2/2019 Measuring QoE
1/6
mm2010-20 1
Abstract: In the (mobile) communications domain, Quality of Experience (QoE) has become the ultimate metric influencing the success or failure of new applications and services. Since it is increasingly argued in the literature that QoE should be seen as a multi-dimensional concept, this paper approaches QoE from an interdisciplinary perspective. We discuss a framework that can be used for the evaluation of QoE in mobile settings. Additionally, QoE related to a mobile video-streaming application was evaluated by a user panel in a semi-real-life context. In this study, the users' subjective evaluation of relevant QoE dimensions is related to a set of objective parameters.

Index Terms: Performance evaluation, objective evaluation techniques, subjective evaluation techniques, QoS, QoE, measurement
I. INTRODUCTION
Over the last years, Quality of Experience (QoE) has become one of the ultimate business metrics in the industry as well as an important research topic in the academic world. In this respect, the academic interest in the QoE concept is not limited to one single discipline or research field, such as telecommunications, Human-Computer Interaction, or user-driven innovation. Moreover, it concentrates on various aspects such as the evaluation and optimization of standardized measures, the development of new measurement approaches, the definition and elements of QoE, and its links with related concepts such as User Experience.
In this paper, we mainly focus on the measurement of QoE. More specifically, we try to integrate both objective, technical aspects and subjective, user-related dimensions in our QoE measurement approach. Although the former are usually comprised in traditional definitions of QoE [1], the link to subjective dimensions inherent to users' experiences is often not made explicit. The conceptual notion of QoE underlying this paper consists of four main elements, i.e., the user, the (ICT) product (comprising the technical aspects), the use process, and the context. These components consist of a number of subdimensions [2].

Manuscript received February 27, 2010. This work was supported by the IBBT GR@SP (Bridging the gap between QoE and QoS for mobile applications) project, funded by the IBBT (Interdisciplinary institute for BroadBand Technology), a research institute founded by the Flemish Government in 2004. W. Joseph is a Post-Doctoral Fellow of the FWO-V (Research Foundation - Flanders).

István Ketykó, Wout Joseph and Luc Martens are with the Department of Information Technology, Ghent University/IBBT, Gaston Crommenlaan 8, B-9050 Ghent, Belgium (phone: +32-9-33-14918; e-mail: [email protected], [email protected], [email protected]).

Katrien De Moor and Lieven De Marez are with the Department of Communication Sciences, Ghent University/IBBT, Korte Meer 7-9-11, B-9000 Ghent, Belgium (e-mail: [email protected], [email protected]).

The plea for a multi-dimensional and more user-centric
definition of QoE has also found its way into the empirical
research. In this respect, the value of frameworks that enable
true user-centric evaluation of QoE in natural environments is
emphasized in the literature [3]. On the other hand, totally
moving away from the controlled test environment increases
the research complexity considerably. In this paper, we
therefore present results from an empirical study which
combines objective and subjective aspects of QoE related to
the use of a mobile video-streaming application in a mobile,
semi-natural setting. Related to the semi-natural setting, six different usage contexts were defined: at home (indoor/outdoor), travelling (train and bus/outdoor), and at work (indoor/outdoor). The test users could decide when they
wanted to watch the clip, as long as they respected these usage
contexts. We investigate whether there is a relation between
monitored, physical parameters and subjective parameters
(explicit evaluation of QoE aspects by test users).
Section II of the paper gives a short overview of the related
work. Section III describes the measurement setup and
concept, introduces the MyExperience application and
discusses the parameters. Section IV presents the most
important results and introduces our QoE model, and Section
V is dedicated to our conclusions.
II. RELATED WORK
In the literature, several objective (e.g., PSNR, PSQA, etc.) and subjective (e.g., SVQA) video quality evaluation metrics and methods can be found [4]-[6]. Although most of these have already been validated and standardized, they hold a number of limitations: usually, this type of research takes place in controlled research settings, focuses on very narrow technical parameters, and often, users are not actively involved. As a result, few studies have focused on the multi-dimensional evaluation of QoE in natural, real-life environments. In some recent studies on mobile TV and User Experience related to mobile video, the extension towards such natural, daily-life environments was made by organizing, e.g., field trials or conducting Living Laboratory studies [7].
PERFORMING QoE-MEASUREMENTS IN AN ACTUAL 3G NETWORK
István Ketykó, Katrien De Moor, Wout Joseph, Member, IEEE, Luc Martens, Member, IEEE, Lieven De Marez
In [3], the relevance of such Living Labs is discussed in the context of measuring QoE. As Living Labs bring the lab to the people and draw on real people's experiences, QoE research in such settings will likely yield more accurate results and have a higher ecological validity than research in controlled environments [8].

The study presented in this paper therefore draws on a framework for evaluating QoE in mobile, Living Lab settings [3] and took place in a semi-Living Lab setting. It was conducted in a real 3G environment, and users were able to perform the tests in their own natural environment, but they were asked to take into account a number of predefined contexts and could not freely choose the content; therefore, we treat this as a semi-Living Lab setting.
III. METHOD AND STUDY SET-UP

A. Mobile Video Streaming
The mobile video streaming measurement is performed in a commercial Universal Mobile Telecommunications System (UMTS) / High Speed Packet Access (HSPA) network in Belgium [9]. Fig. 1 shows an overview of the framework for QoE measurements of mobile video streaming. The video server is a Darwin Streaming Server 5.5.5 (DSS) [10] connected to the academic network at Ghent University. The mobile client is an HTC Touch HD smart phone with a UMTS/HSPA communication interface, and the video player is the HTC Streaming Player 1.1. The communication protocols used are based on the Internet Protocol (IP) architecture. For session setup and control, Real Time Streaming Protocol (RTSP) over Transmission Control Protocol (TCP) is used, and for the streaming flow itself, Real-time Transport Protocol (RTP) over User Datagram Protocol (UDP) [11].

Fig. 1. Overview of the framework for QoE measurements of mobile video streaming in a UMTS network.
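As an illustration of the session-setup signaling, the sketch below composes an RTSP DESCRIBE request of the kind a client sends over TCP before the RTP flow starts. The stream URL and CSeq value are hypothetical placeholders, not taken from the study.

```python
# Illustrative sketch: composing an RTSP/1.0 DESCRIBE request (RFC 2326).
# The URL below is a made-up placeholder, not the study's server.

def build_describe_request(url: str, cseq: int = 1) -> str:
    """Return an RTSP/1.0 DESCRIBE request for the given stream URL."""
    return (
        f"DESCRIBE {url} RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        "Accept: application/sdp\r\n"
        "\r\n"
    )

request = build_describe_request("rtsp://example.org/trailer.3gp")
print(request.splitlines()[0])  # DESCRIBE rtsp://example.org/trailer.3gp RTSP/1.0
```

The server's SDP answer to such a request then describes the audio and video tracks that the client sets up individually.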
B. Measurement Method
The measurement approach is partly based on the Context-Aware Experience Sampling Method [12]. This data collection method, which is based on the Experience Sampling Method [13], combines objective and subjective data for evaluating user experiences. It draws on certain information (e.g., location, time, event, etc.) to trigger self-reports from test users in a more sophisticated way: in a certain context or when a specific event happens, an automatic signal is provided. As a result, this method allows various possibilities to study and evaluate users' experiences with ubiquitous computing applications in a daily life context.
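A minimal sketch of this event-triggered sampling idea is given below. This is not the actual MyExperience implementation; the event names are invented for illustration only.

```python
# Sketch (not the MyExperience implementation) of context-aware experience
# sampling: a self-report prompt is raised only when a monitored event of
# interest occurs. Event names here are hypothetical.

def should_trigger(event: dict) -> bool:
    # Prompt a questionnaire right before and after a video-watching moment.
    return event.get("type") in {"video_start", "video_end"}

log = []
for event in [{"type": "rssi_change"}, {"type": "video_start"}, {"type": "video_end"}]:
    if should_trigger(event):
        log.append(f"prompt self-report after {event['type']}")

print(len(log))  # 2
```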
In-situ Measurement Tools
Table I compares some state-of-the-art tools created for automatically capturing data in field experiments [14]. The data can roughly be categorized as relating to usage, context, or subjective user feedback. All three types of information have been captured with success in proof-of-concept studies. The star (*) indicates which category is implemented in the given tool.

TABLE I. EXISTING TOOLS FOR CAPTURING DATA
Tool               Context  Usage  User Feedback  Platform
ContextPhone [15]  *                              Symbian S60
SocioXensor [16]   *        *      *              Windows Mobile
RECON [17]         *        *      *              Windows Mobile
MyExperience [18]  *        *                     Windows Mobile
Larger companies such as Nokia have developed in-house tools which may be even more advanced than these, but those are not openly available. In this study, the Context-Aware Experience Sampling tool called MyExperience is used at the client-side. This tool was chosen because, at the moment of the tests, it was the most suitable accessible implementation for our experiment.

Fig. 2. Summary of the scalability and situatedness of current data collection approaches and where MyExperience [18] fits in.

Fig. 2 summarizes the scalability and situatedness of the current data collection approaches and shows where MyExperience fits in.
Test Users and Usage Context
In total, 18 people participated in the presented study: 9 of them female, 9 male. The study consisted of two phases and started with a short pre-usage questionnaire in which the users were asked to write down attributes of a good experience with a mobile application, to give an indication of their expectations towards the use of advanced mobile applications, and to list possible thresholds.

During the usage phase, which is the most important one for this paper, the test users took home the HTC Touch HD. They were asked to watch a predefined movie trailer [19] (the content variable is kept stable) in six different usage contexts during a normal day (in their natural setting): these usage contexts were at home indoor/outdoor, travelling train and bus/outdoor, at
work indoor/outdoor. The test users could decide when they wanted to watch the clip, as long as they respected these usage contexts, and were instructed to focus on the quality of the sound and screen in these different usage contexts. Using the MyExperience tool, the test users received a number of pop-up questions (self-reporting) on the device, related to their emotions, current physical and social context, and the quality of their perceived user experience right before and after every video watching moment. All answers were saved in a database.
The Stimuli Material Format
For the stimuli material, a nature movie trailer of 2:10 minutes was used. The video file was encoded with the ffmpeg tool from the original file in HD quality downloaded from the Apple Trailers website [19]. The video quality vs. mobile network bandwidth trade-off caused considerable difficulty in finding the most appropriate encoded version for UMTS.
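For illustration, an ffmpeg command line matching the encoding targets of Table II might be assembled as below. The file names are placeholders, and the flag spellings follow current ffmpeg releases, which likely differ from the ffmpeg version available at the time of the study.

```python
# Sketch of an ffmpeg invocation for the Table II targets: H.264 video at
# 128 kbit/s, 240x132, 15 fps; AAC-LC audio at 64 kbit/s, stereo, 44100 Hz.
# File names are hypothetical placeholders.

args = [
    "ffmpeg", "-i", "earth_trailer_hd.mov",
    "-c:v", "libx264", "-b:v", "128k", "-s", "240x132", "-r", "15",
    "-c:a", "aac", "-b:a", "64k", "-ac", "2", "-ar", "44100",
    "earth_trailer_umts.mp4",
]
# subprocess.run(args, check=True)  # would perform the actual transcode
print(args[0])  # ffmpeg
```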
TABLE II. THE ENCODING DETAILS FOR THE STIMULI MATERIAL
Audio                           Video
Codec: AAC LC                   Codec: H.264 [email protected]
Bandwidth: 64 Kbit/s            Bandwidth: 128 Kbit/s
Channels: 2                     Resolution: 240*132
Sampling frequency: 44100 Hz    Framerate: 15 fps
Table II lists the encoding details chosen for this study. The total bit rate is 192 Kbit/s; this good-quality video thus uses three times the bit rate of YouTube's mobile video streams, which at the time were 64 Kbit/s videos encoded with the H.263 codec and mono AMR-NB sound.
Observed Parameters
Since it is not possible to capture network-layer packet information at the client-side (Windows Mobile OS 6.1 does not support running any packet capture application, like tcpdump), the capturing of packet information was performed at the server-side using the Wireshark tool [20]. In this way, all the RTSP, RTP, and Real-time Transport Control Protocol (RTCP) packets are captured and used to calculate the necessary information. In the case of video streaming, the two important network parameters to investigate are the packet loss and the jitter [21]. Since the video player is a closed-source application and the DSS does not provide network jitter information, the RTCP packets are used to obtain all the necessary values.

RSSI is obtained from the MyExperience application and logged event-based. Every time Windows Mobile OS detects a change of the RSSI, it reports this to the MyExperience tool, and the old and new values are logged into a file.
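Since the loss and jitter values come from RTCP packets captured at the server, a sketch of decoding a single RTCP Receiver Report block (24-byte layout per RFC 3550, Section 6.4.1) is given below. The sample bytes are fabricated for illustration; this is not the tooling used in the study.

```python
import struct

# Sketch: decoding one 24-byte RTCP Receiver Report block (RFC 3550,
# Sec. 6.4.1), the structure carrying cumulative packet loss and
# interarrival jitter. The sample bytes below are fabricated.

def parse_report_block(data: bytes) -> dict:
    ssrc, frac_cum, ehsn, jitter, lsr, dlsr = struct.unpack("!IIIIII", data[:24])
    return {
        "ssrc": ssrc,
        "fraction_lost": frac_cum >> 24,           # top 8 bits
        "cumulative_lost": frac_cum & 0x00FFFFFF,  # bottom 24 bits
        "interarrival_jitter": jitter,             # in RTP timestamp units
    }

block = struct.pack("!IIIIII", 0x1234, (3 << 24) | 17, 1000, 42, 0, 0)
print(parse_report_block(block)["cumulative_lost"])  # 17
```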
The packet loss and jitter values were obtained at the server-side, from the RTCP Receiver Reports (RR). These reports are sent periodically (every 5 seconds) from the video client to the streaming server during the playback session. There are separate records for the audio and the video track. The packet loss values correspond to the cumulative IP packet loss that occurred during the video watching session. Every RTCP RR contains an estimation report of the inter-arrival jitter of the RTP packets received from the streaming server. The client application calculates a running estimate of this mean using (1) according to [11], where j_i is the ith estimated mean jitter in milliseconds, j_{i-1} is the (i-1)th estimated mean jitter, t_i is the reception timestamp of the ith packet, and t_{i-1} is the reception timestamp of the (i-1)th packet:
j_i = (15 · j_{i-1} + |t_i - t_{i-1}|) / 16.    (1)
It is important to note that, in the case of RTCP, the reported value only reflects the most recent few hundred milliseconds before the value was calculated (the last 16 packets).
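The smoothing in (1) can be sketched as a loop that blends each new inter-arrival difference into the previous estimate with a 15/16 gain. The timestamps below are invented millisecond values; note that the full RFC 3550 jitter definition uses differences of transit times rather than raw inter-arrival gaps.

```python
# Sketch of the running jitter estimate: each new inter-arrival difference
# is blended into the previous estimate with a 15/16 smoothing gain.
# Timestamps are illustrative reception times in milliseconds.

def update_jitter(prev_jitter: float, t_i: float, t_prev: float) -> float:
    return (15.0 * prev_jitter + abs(t_i - t_prev)) / 16.0

timestamps = [0.0, 60.0, 125.0, 180.0, 245.0]  # reception times [ms]
jitter = 0.0
for prev, cur in zip(timestamps, timestamps[1:]):
    jitter = update_jitter(jitter, cur, prev)
print(round(jitter, 2))  # 13.95
```

The 15/16 gain is what limits the estimate's memory to roughly the last 16 packets, as noted above.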
For the analysis, the packet loss rate and the highest jitter values were used. The packet loss rate PLR [%] is defined as follows:
PLR = 100 · (lost packets) / (total number of packets).    (2)
2765 packets belonged to the video track and 5602 packets to the audio track. The number of packets with video payload is lower because the video streaming server packs the video payload into the largest possible IP packets (the MTU is 1478 bytes). If the remaining payload is not large enough, the rest of the video payload is sent in a new IP packet (with a size around 400 bytes). In the case of the audio content, the IP packet sizes are always around 200 bytes and the packets are sent three at a time. The average inter-departure time for the packets is around 60 ms for both payloads.
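A worked check of Eq. (2) against the per-track packet totals quoted above can be sketched as follows; the loss counts (5 video and 11 audio packets) are hypothetical numbers chosen for illustration.

```python
# Worked check of the packet loss rate definition, Eq. (2), using the
# per-track totals from the text. The lost-packet counts are hypothetical.

def packet_loss_rate(lost: int, total: int) -> float:
    return 100.0 * lost / total

video_plr = packet_loss_rate(5, 2765)   # 2765 video packets per session
audio_plr = packet_loss_rate(11, 5602)  # 5602 audio packets per session
print(round(video_plr, 3), round(audio_plr, 3))  # 0.181 0.196
```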
TABLE III. THE MEASURED OBJECTIVE PARAMETERS AND NOTATIONS
Parameter                    Values       Unit   Sampling rate
Video packet loss rate (VL)  [0, 100]     %      Cumulated loss data in every 5 seconds
Video packet jitter (VJ)     [0, ∞[       ms     Estimated mean jitter in every 5 seconds
Audio packet loss rate (AL)  [0, 100]     %      Cumulated loss data in every 5 seconds
Audio packet jitter (AJ)     [0, ∞[       ms     Estimated mean jitter in every 5 seconds
RSSI                         [-113, -51]  dBm    Event-based
Table III summarizes the measured objective parameters (with value ranges and units) monitored during the experiment. Sampling rates are also indicated.

Table IV gives an overview of the subjective parameters that are used in the analysis. These subjective parameters were derived from the self-report data, i.e., the explicit evaluation of a set of QoE aspects by the test users on the device. A first set of items is related to quality aspects, a second one to emotional aspects (see Table IV). For each item, the respondents indicated their position on 5-point/10-point Likert scales (totally disagree / disagree / neutral / agree / totally agree). In Table IV, the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (KMO) is also indicated. The KMO is an index for comparing the magnitudes of the observed correlation coefficients to the magnitudes of the partial correlation coefficients. This measure varies between 0 and 1,
and values closer to 1 are better. A value of 0.6 is a suggested minimum. The KMO is 0.75 for the 6 items related to the quality aspects, and 0.84 for the 11 items related to the emotional aspects.
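The KMO index just described can be sketched numerically: partial correlations are derived from the inverse of the correlation matrix, and the index compares the two sums of squared off-diagonal coefficients. The data below are randomly generated placeholders, not the study's item scores.

```python
import numpy as np

# Sketch of the KMO sampling-adequacy index: squared observed correlations
# compared against squared partial correlations (obtained from the inverse
# of the correlation matrix). Input data are random placeholders.

def kmo(data: np.ndarray) -> float:
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d
    mask = ~np.eye(corr.shape[0], dtype=bool)   # off-diagonal entries only
    r2 = (corr[mask] ** 2).sum()
    p2 = (partial[mask] ** 2).sum()
    return r2 / (r2 + p2)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + 0.5 * rng.normal(size=(200, 6))  # 6 correlated items
print(0.0 < kmo(items) <= 1.0)  # True
```

Items driven by a common factor yield small partial correlations and hence a KMO close to 1, which is what the 0.6 threshold screens for.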
TABLE IV. SUBJECTIVE PARAMETERS
Quality aspects (6 items), KMO=0.75
Content                        5-point scale
Fit to feeling                 5-point scale
Sound quality                  5-point scale
Picture quality                5-point scale
Fluentness                     5-point scale
Audio/Video synchronization    5-point scale
Emotional aspects (11 items), KMO=0.84
Not disappointed               5-point scale
Happy                          5-point scale
Enjoyed                        5-point scale
Not frustrated                 5-point scale
Rich experience                5-point scale
Relaxed                        5-point scale
Not bored                      5-point scale
Curious                        5-point scale
No time-waste                  5-point scale
Not doing other stuff          5-point scale
Not distracted                 5-point scale
General QoE evaluation (QoE)   10-point scale
IV. ANALYSIS AND RESULTS

A. Components of QoE
We reduced the number of question items by using the statistical method called Principal Component Analysis (PCA) [22]. PCA involves a mathematical procedure that transforms a number of possibly correlated variables into a smaller number of uncorrelated variables called principal components (PCs). First, all items were rescaled in the same direction. Secondly, we performed the PCA with Varimax rotation (a linear transformation that maximizes the factor loadings while keeping the orthogonality of the PCs) on both sets of items. Next, we performed an analysis of reliability on the PCs that were found (using Cronbach's alpha for components with more than two items, and Pearson's correlation test for components with two items).
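The two reliability statistics mentioned can be sketched as follows; the score matrix is a random placeholder standing in for the rescaled item scores.

```python
import numpy as np

# Sketch of the two reliability checks: Cronbach's alpha for components
# with more than two items, and Pearson's r for two-item components.
# The score matrix below is a random placeholder.

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    return np.corrcoef(x, y)[0, 1]

rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 1))
scores = latent + 0.7 * rng.normal(size=(100, 4))  # 4 items, one construct
print(0.0 < cronbach_alpha(scores) <= 1.0)  # True
```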
The results of the PCAs are the following five orthogonal (uncorrelated) PCs: the spatial quality, the temporal quality, the emotional satisfaction, the interest, and the focus level. These PCs represent five different aspects of the general QoE, and the names given to the PCs are intended to express these aspects. Subsequently, these 5 PCs will be used for the further analyses instead of the numerous original question items.
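The pipeline just described can be sketched numerically: PCA on the item correlation matrix, component retention by Kaiser's eigenvalue-greater-than-1 criterion, then a varimax rotation of the loadings. The data are random placeholders, and this is a textbook implementation rather than the statistical package used in the study.

```python
import numpy as np

# Sketch: PCA loadings from the correlation matrix, Kaiser's criterion
# (eigenvalue > 1), then varimax rotation. Data are random placeholders.

def pca_loadings(data):
    z = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)
    corr = np.corrcoef(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 1.0                       # Kaiser's criterion
    return eigvecs[:, keep] * np.sqrt(eigvals[keep])

def varimax(loadings, iters=100):
    p, k = loadings.shape
    rot = np.eye(k)
    for _ in range(iters):
        lam = loadings @ rot
        u, _, vt = np.linalg.svd(
            loadings.T @ (lam ** 3 - lam @ np.diag((lam ** 2).sum(axis=0)) / p))
        rot = u @ vt
    return loadings @ rot

rng = np.random.default_rng(2)
latent = rng.normal(size=(120, 1))
items = latent + 0.8 * rng.normal(size=(120, 6))  # 6 items, one construct
rotated = varimax(pca_loadings(items))
print(rotated.shape[0])  # 6
```

Varimax keeps the components orthogonal while concentrating each item's loading on one component, which is what makes the named PCs interpretable.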
PCA on Quality Aspects
The PCA related to the quality aspects resulted in 2 PCs with eigenvalues higher than 1 (according to Kaiser's criterion), explaining 62.17% of the total variance of the original 6 items (see Table V).

The 1st PC is named spatial quality to reflect those question items which have high factor loadings in it. In [23], spatial parameters related to videos are characterized by picture resolution and color depth. Our spatial quality PC is formed by the content, the sound quality, the fit to feeling, and the picture quality question items. The highest factor loading, of content, indicates that it plays the largest role in forming the spatial quality QoE aspect.
TABLE V. PCA ON QUALITY ASPECTS
1ST PRINCIPAL COMPONENT (SPATIAL QUALITY)
Variability explained: 43.26%    Reliability: Cronbach's alpha=0.73
Question Item     Factor Loading  Communality
Content           0.83            0.71
Sound quality     0.72            0.57
Fit to feeling    0.72            0.52
Picture quality   0.56            0.66
2ND PRINCIPAL COMPONENT (TEMPORAL QUALITY)
Variability explained: 18.91%    Reliability: r=0.28, p=0.000
Question Item     Factor Loading  Communality
Fluentness        0.85            0.75
A/V sync.         0.65            0.53
The 2nd PC is named temporal quality to reflect the contained fluentness and A/V synchronization question items, in conformance with [23], where temporality describes the dynamic nature of a motion picture.

PCA on Emotional Aspects
We did the same for the emotional aspects, and this yielded 3 PCs with eigenvalues higher than 1, explaining 61.65% of the total variance of the 11 original items (see Table VI).
TABLE VI. PCA ON EMOTIONAL ASPECTS
1ST PRINCIPAL COMPONENT (EMOTIONAL SATISFACTION)
Variability explained: 49.35%    Reliability: Cronbach's alpha=0.89
Question Item     Factor Loading  Communality
Not disappointed  0.83            0.70
Happy             0.76            0.68
Enjoyed           0.74            0.76
Not frustrated    0.73            0.67
Rich experience   0.70            0.73
Relaxed           0.67            0.61
Not bored         0.51            0.73
2ND PRINCIPAL COMPONENT (INTEREST)
Variability explained: 12.50%    Reliability: r=0.44, p=0.000
Question Item     Factor Loading  Communality
Curious           0.86            0.75
No time-waste     0.69            0.75
3RD PRINCIPAL COMPONENT (FOCUS LEVEL)
Variability explained: 9.83%     Reliability: r=0.57, p=0.000
Question Item          Factor Loading  Communality
Not doing other stuff  0.85            0.77
Not distracted         0.81            0.73
The 1st PC is named emotional satisfaction to reflect the contained question items. Here, the level of disappointment has the highest weight in forming this QoE aspect. The 2nd PC is called interest because it is related to the curious and no time-waste items. Finally, the 3rd PC is called focus level since it refers to the focus level of the test users.

Correlations between the PCs and the General QoE
Table VII lists the correlations between the PCs and the general QoE. There is a large correlation with the spatial quality (r=0.82) and the emotional satisfaction (r=0.73) PCs. There is a medium correlation with the temporal quality (r=0.43) and the interest (r=0.37) PCs. There is only a small
correlation with the focus level (r=0.12) PC; this correlation is not significant. These correlations show that the spatial quality and the emotional satisfaction were the most important aspects of the general QoE for our test users during the experiment.
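As a sketch of the correlation analysis behind this ranking, Pearson's r can be computed between component scores and the general QoE score. The data below are synthetic, with coefficients chosen so that the spatial component dominates, mirroring the reported ordering.

```python
import numpy as np

# Sketch of the Table VII style analysis: Pearson correlation of each
# component score with the general QoE score. All scores are simulated.

rng = np.random.default_rng(3)
n = 112
spatial = rng.normal(size=n)
temporal = rng.normal(size=n)
qoe = 0.8 * spatial + 0.4 * temporal + 0.4 * rng.normal(size=n)

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

r_spatial = pearson(spatial, qoe)
r_temporal = pearson(temporal, qoe)
print(r_spatial > r_temporal)  # True
```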
TABLE VII. CORRELATIONS BETWEEN THE PRINCIPAL COMPONENTS AND GENERAL QOE
Principal Component     Pearson's correlation coefficient  Significance
Spatial quality         r=0.82                             p=0.000
Temporal quality        r=0.43                             p=0.000
Emotional satisfaction  r=0.73                             p=0.000
Interest                r=0.37                             p=0.000
Focus                   r=0.12                             p=0.193
B. Modeling of QoE Components by QoS Parameters
There is obviously an association between some QoE aspects and QoS (objective parameters). The spatial and temporal quality QoE aspects correlate with the objective parameters, since these were also created from technical-quality related question items (e.g., sound and picture quality, fluentness, etc.). Table VIII lists these correlations.
TABLE VIII. CORRELATIONS BETWEEN OBJECTIVE PARAMETERS AND QOE ASPECTS
Objective parameter          QoE               Spatial quality   Temporal quality
Video packet loss rate (VL)  r=-0.60, p=0.000  r=-0.45, p=0.000  r=-0.36, p=0.000
Video packet jitter (VJ)     r=0.02, p=0.846   r=0.02, p=0.863   r=0.04, p=0.719
Audio packet loss rate (AL)  r=-0.62, p=0.000  r=-0.46, p=0.000  r=-0.34, p=0.000
Audio packet jitter (AJ)     r=-0.64, p=0.000  r=-0.64, p=0.000  r=-0.36, p=0.000
RSSI                         r=-0.34, p=0.000  r=-0.25, p=0.008  r=-0.08, p=0.380
Based on the correlations in Table VIII, the average spatial quality (SQ), temporal quality (TQ), and general QoE values can be modeled by multiple linear regression models (Eq. (3)-(5)) [24]. The average general QoE is modeled by the following multiple linear regression model (3):
QoE = 8.49 - 0.02 · AL - 0.01 · VL - 1.12 · AJ - 0.04 · RSSI.    (3)
The R2 value is 43.5%. The model shows that the audio packet loss rate (AL), the audio packet jitter (AJ), the video packet loss rate (VL), and the RSSI all affect the general QoE. The involved objective parameters, however, are not the same for the different QoE aspects. The average spatial quality (SQ) is modeled by the
following multiple linear regression model (4):
SQ = 3.86 - 0.74 · AJ - 0.01 · RSSI.    (4)
Eq. (4) is the result of a stepwise regression analysis (backward elimination; the stepping-method criterion is p(F)>0.10) that removes the non-relevant objective parameters and keeps only the significant ones in the model. The R2 value is 25.7%. According to this model, the spatial quality QoE aspect is affected only by the audio packet jitter (AJ) and the RSSI.
The average temporal quality (TQ) is modeled by the following multiple linear regression model (5):
TQ = 3.65 - 2.60·10^-4 · VJ - 0.1 · VL.    (5)
Eq. (5) is the result of the same stepwise regression analysis (backward elimination; the stepping-method criterion is p(F)>0.10), removing the non-relevant objective parameters and keeping only the significant ones in the model. The R2 value is 12.9%. The model of the temporal quality QoE aspect contains the video packet jitter (VJ) and the video packet loss rate (VL) parameters.
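The fitting step behind these models can be sketched as ordinary least squares of a QoE score on the objective parameters, with R2 reported. The data below are simulated placeholders, not the study's measurements; a full stepwise backward elimination (dropping predictors with p(F) > 0.10) would additionally require per-coefficient significance tests.

```python
import numpy as np

# Sketch of fitting a multiple linear regression of QoE on the objective
# parameters and computing R^2. All data are simulated placeholders.

rng = np.random.default_rng(4)
n = 112
AL, VL, AJ = rng.uniform(0, 10, n), rng.uniform(0, 10, n), rng.uniform(0, 2, n)
RSSI = rng.uniform(-113, -51, n)
qoe = 8.5 - 0.02 * AL - 0.01 * VL - 1.1 * AJ + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), AL, VL, AJ, RSSI])  # intercept + predictors
beta, *_ = np.linalg.lstsq(X, qoe, rcond=None)       # OLS coefficients
pred = X @ beta
r2 = 1 - ((qoe - pred) ** 2).sum() / ((qoe - qoe.mean()) ** 2).sum()
print(0.0 <= r2 <= 1.0)  # True
```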
C. Interference of Context on QoE Components
It is expected that the context has an influence on the general QoE. Different locations and circumstances of the users determine these contexts.

Fig. 3 shows the mean scores of the sound quality question item in different locations. The result of the 1-way Analysis of Variance (ANOVA [24]) confirms that the difference in the means of sound quality is significant (F=4.465, p=0.005) at the 95% confidence level. As expected, the test users found the mean sound quality lower while travelling than in indoor contexts (at home or at work).
Fig. 3. Interference of the location on the sound quality.
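The 1-way ANOVA F statistic used in this comparison can be sketched as follows; the three groups of 5-point ratings below are invented for illustration, not the study's data.

```python
import numpy as np

# Sketch of the 1-way ANOVA F statistic: between-group mean square over
# within-group mean square. The three rating groups are hypothetical.

def anova_f(groups):
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

home = [4, 5, 4, 4, 5]
work = [4, 4, 5, 4, 4]
travel = [2, 3, 2, 3, 2]
print(round(anova_f([home, work, travel]), 2))  # 22.75
```

A large F relative to the F(k-1, n-k) distribution, as in the reported F=4.465 with p=0.005, indicates that the group means genuinely differ.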
Fig. 4 illustrates the mean values of the focus level PC in two different contexts. The focus level of the test users was higher when no people were around them, and lower in the presence of anyone else. The 1-way ANOVA confirms this difference at the 95% confidence level (F=4.932, p=0.028): the focus level of the test users was higher when they watched the video stream alone.
D. Discussion and Future Work
Eq. (3)-(5) can be used for modeling the technical-quality related QoE aspects. If the individual preferences of the users are known (by gathering their statistics and opinions during the usage of a product) related to the importance of the different QoE aspects, it is possible to optimize the video streams for different network circumstances. This study shows
that the spatial quality and the emotional satisfaction PCs are the most relevant QoE aspects for our test users. For the spatial quality, the content plays the most important role. Future studies using this concept have to provide exact answers regarding the influence of broad versus personalized content on QoE.
Fig. 4. Interference of the number of people around on the focus level PC.
V. CONCLUSION
In this paper, the measurement concept of QoE related to mobile video streaming in a 3G network environment is introduced. The state-of-the-art tools created for automatic data capturing in field experiments are compared. Results of a semi-Living Lab experiment using the MyExperience in-situ measurement tool are presented and analyzed. Five orthogonal QoE aspects are proposed by performing Principal Component Analysis on 17 question items from the questionnaires. Technical-quality related QoE aspects are modeled by the technical QoS parameters (at wireless infrastructure and network level) measured during the experiment. It is shown that the QoE of mobile video streaming is influenced both by the QoS and by the context.
REFERENCES
[1] D. Lopez, F. Gonzalez, L. Bellido, A. Alonso, "Adaptive multimedia streaming over IP based on customer oriented metrics," Proceedings of the International Symposium on Computer Networks, ISCN 2006, 2006.
[2] GR@SP, IBBT-ISBO project (2009-2011), Bridging the gap between QoE and QoS. Available: https://projects.ibbt.be/grasp
[3] K. De Moor, I. Ketykó, W. Joseph, T. Deryckere, L. De Marez, L. Martens, G. Verleye, "Proposed Framework for Evaluating Quality of Experience in a Mobile, Testbed-oriented Living Lab Setting," Mobile Networks and Applications, 2010.
[4] VQEG, "Final report from the video quality experts group on the validation of objective models of video quality assessment, Phase I," http://www.vqeg.org/, 2000.
[5] S. Jumisko-Pyykkö, J. Häkkinen, "Evaluation of subjective video quality of mobile devices," MULTIMEDIA '05: Proceedings of the 13th annual ACM international conference on Multimedia, pp. 535-538, 2005.
[6] H. Knoche, J. McCarthy, M. Sasse, "Can small be beautiful?: assessing image resolution requirements for mobile TV," MULTIMEDIA '05: Proceedings of the 13th annual ACM international conference on Multimedia, 2005.
[7] D. Schuurman, T. Evens, L. De Marez, "A Living Lab Approach for Mobile TV," in Proceedings of EuroITV2009, pp. 189-196, 2009.
[8] J. Schumacher, V-P. Niitamo (eds.), European Living Labs - a new approach for human centric regional innovation, Wissenschaftlicher Verlag Berlin, Berlin, 2008.
[9] 3GPP TS 25.308, "High Speed Downlink Packet Access (HSDPA); Overall description; Stage 2," Release 9, Sept. 2009.
[10] Darwin Streaming Server 5.5.5. Available: http://dss.macosforge.org
[11] H. Schulzrinne, "RTP: A Transport Protocol for Real-Time Applications," IETF RFC 3550, July 2003. Available: http://www.ietf.org/rfc/rfc3550.txt
[12] S. S. Intille, J. Rondoni, C. Kukla, I. Ancona, and L. Bao, "A context-aware experience sampling tool," in CHI '03 Extended Abstracts on Human Factors in Computing Systems, ACM, pp. 972-973, 2003.
[13] M. Csikszentmihalyi, R. Larson, "Validity and reliability of the Experience Sampling Method," Journal of Nervous & Mental Disease, 175 (9), pp. 526-536, 1987.
[14] K. L. Jensen, L. Larsen, "Evaluating the mobile and ubiquitous user experience," in Proceedings of IMUx 2008, 2008.
[15] M. Raento, A. Oulasvirta, R. Petit, H. Toivonen, "ContextPhone - A prototyping platform for context-aware mobile applications," IEEE Pervasive Computing, 4 (2), pp. 51-59, 2005.
[16] I. Mulder, G. H. ter Hofte, J. Kort, "SocioXensor: Measuring user behaviour and user eXperience in ConteXt with mobile devices," in Proceedings of Measuring Behavior 2005, The 5th International Conference on Methods and Techniques in Behavioral Research, pp. 355-358, 2005.
[17] T. H. Mogensen, C. Ølholm, "Recon: A framework for remote automated field evaluation of mobile systems," Master's thesis, Aalborg University, 2007.
[18] J. Froehlich, M. Chen, S. Consolvo, B. Harrison, J. Landay, "MyExperience: A System for In Situ Tracing and Capturing of User Feedback on Mobile Phones," Proceedings of the 5th international conference on Mobile systems, applications and services, pp. 57-70, 2007.
[19] The Walt Disney Company. (September 15, 2009). Earth movie trailer [Online]. Available: http://www.apple.com/trailers/disney/earth
[20] G. Combs. (September 15, 2009). Ethereal [Online]. Available: http://www.wireshark.org
[21] H. Riiser, P. Halvorsen, C. Griwodz, B. Hestnes, "Performance Measurements and Evaluation of Video Streaming in HSDPA Networks With 16QAM Modulation," in IEEE International Conference on Multimedia & Expo (ICME), 2008.
[22] K. Pearson, "On Lines and Planes of Closest Fit to Systems of Points in Space," Philosophical Magazine, 2, pp. 559-572, 1901.
[23] R. T. Apteker, J. A. Fisher, V. S. Kisimov, H. Neishlos, "Video Acceptability and Frame Rate," IEEE MultiMedia, vol. 2, no. 3, pp. 32-40, Sept. 1995.
[24] M. H. Kutner, C. J. Nachtsheim, J. Neter, W. Li, Applied Linear Statistical Models, 5th edition, New York, McGraw-Hill, 2005.