A Test-bed For Quality of Multimedia Experience Evaluation of Sensory Effects
Christian Timmerer and Markus Waltl
Klagenfurt University (UNIKLU), Faculty of Technical Sciences (TEWI), Department of Information Technology (ITEC), Multimedia Communication (MMC)
http://research.timmerer.com | http://blog.timmerer.com | mailto:[email protected]
Co-authors: Markus Waltl, Christian Timmerer, and Hermann Hellwagner
Acknowledgment: This work is supported in part by the European Commission in the context of the InterMedia project. Further information is available at http://intermedia.miralab.unige.ch/.
Slides available at http://www.slideshare.net/christian.timmerer
QoMEX’09
Christian Timmerer, Klagenfurt University, Austria
Outline
• Introduction / Motivation
• Sensory Effect Description Language
• Test-Bed: Annotation Tool and Simulator
• Test Environment and Preliminary Results
• Conclusions
• Demo & Video
2009/07/31
Introduction
• Universal Multimedia Access (UMA)
– Anywhere, anytime, any device + technically feasible
– Main focus on devices and network connectivity issues
• Universal Multimedia Experience (UME)
– Takes the user into account
• Multimedia Adaptation and Quality Models/Metrics
– Single modality (i.e., audio, image, or video only) or a simple combination of two modalities (i.e., audio and video)
• Triple user characterization model
– Sensorial, e.g., sharpness, brightness
– Perceptual, e.g., what/where is the content
– Emotional, e.g., feeling, sensation
• Ambient Intelligence
– Add'l light effects are highly appreciated for both audio and visual content
– Calls for a scientific framework to capture, measure, quantify, judge, and explain the user experience
F. Pereira, “A triple user characterization model for video adaptation and quality of experience evaluation,” Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1–4.
B. de Ruyter, E. Aarts, “Ambient intelligence: visualizing the future”, Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203–208.
E. Aarts, B. de Ruyter, “New research perspectives on Ambient Intelligence”, Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5–14.
Motivation
• Consumption of multimedia content may also stimulate senses other than vision or audition
– Olfaction, mechanoreception, equilibrioception, thermoception, …
• Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
… giving her/him the sensation of being part of the particular media
➪ worthwhile, informative user experience
Sensory Effect Description Language (SEDL)
• XML Schema-based language for describing sensory effects
– Basic building blocks to describe, e.g., light, wind, fog, vibration, scent
– MPEG-V Part 3, Sensory Information
– Adopted MPEG-21 DIA tools for adding time information (synchronization)
• Actual effects are not part of SEDL but defined within the Sensory Effect Vocabulary (SEV)
– Extensibility: additional effects can be added easily w/o affecting SEDL
– Flexibility: each application domain may define its own sensory effects
• A description conforming to SEDL ::= Sensory Effect Metadata (SEM)
– May be associated with any kind of multimedia content (e.g., movies, music, Web sites, games)
– Steers sensory devices like fans, vibration chairs, lamps, etc. via an appropriate mediation device
➪ Increase the experience of the user ➪ worthwhile, informative user experience
Sensory Effect Description Language (cont’d)
SEM              ::= [DescriptionMetadata] (Declarations|GroupOfEffects|Effect|ReferenceEffect)+
Declarations     ::= (GroupOfEffects|Effect|Parameter)+
GroupOfEffects   ::= timestamp EffectDefinition EffectDefinition (EffectDefinition)*
Effect           ::= timestamp EffectDefinition
EffectDefinition ::= [activate] [duration] [fade] [alt] [priority] [intensity] [position] [adaptability]
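Following the grammar above, a SEM description might look like the sketch below. Element and attribute names are illustrative assumptions only, not the exact MPEG-V Part 3 schema; note that a GroupOfEffects carries one timestamp and at least two effect definitions, as the grammar requires.

```xml
<!-- Illustrative sketch, not the normative MPEG-V Part 3 syntax. -->
<SEM>
  <DescriptionMetadata>...</DescriptionMetadata>
  <!-- One timestamp, two or more effects rendered together -->
  <GroupOfEffects timestamp="00:01:30">
    <Effect type="wind" intensity="0.5" duration="5s" fade="1s"/>
    <Effect type="vibration" intensity="0.8" priority="1"/>
  </GroupOfEffects>
  <!-- A single effect with its own timestamp -->
  <Effect type="light" timestamp="00:01:40" intensity="0.6" position="front"/>
</SEM>
```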
Test-Bed: Annotation Tool and Simulator
• Annotation Tool: SEVino
• Simulator: SESim
Test Environment
• Based on the amBX (Ambient Experience) system + SDK
– Two fan devices, a wrist rumbler, two sound speakers, a subwoofer, two lights, and a wall washer
– Everything controlled by SEM descriptions except the light effects ➪ automatic color calculation is deployed
• Advantages
– Reduction of description size
– Speeds up the authoring stage
• Different automatic color calculation methods may lead to different user experiences
– (1) Average color in the RGB color space
– (2–4) Dominant color in the RGB, HSV, and HMMD color spaces
– (2–4) require more computational resources than (1) due to the management of color bins; amBX supports only RGB
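The dominant-color methods (2–4) can be sketched as follows for the RGB case: pixels are quantized into color bins and the most populated bin determines the light color. This is a minimal illustration, not the test-bed's actual implementation; the bin size and function name are assumptions, and the HSV/HMMD variants would quantize in those color spaces instead.

```python
# Sketch of dominant-color calculation via color bins (method 2, RGB).
# A frame is assumed to be a flat list of (R, G, B) tuples; this is an
# illustrative assumption, not the amBX SDK or SESim API.
from collections import Counter

def dominant_color(frame, bin_size=32):
    """Quantize each pixel into bins of bin_size per channel and
    return the center color of the most populated bin."""
    bins = Counter(
        (r // bin_size, g // bin_size, b // bin_size) for r, g, b in frame
    )
    br, bg, bb = bins.most_common(1)[0][0]
    half = bin_size // 2
    # Represent the winning bin by its center color.
    return (br * bin_size + half, bg * bin_size + half, bb * bin_size + half)
```

Maintaining and ranking the bins is what makes (2–4) costlier than the simple average of method (1).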
Preliminary Results
– More or less constant color pattern
– A lot of different colors which change very rapidly
2009/07/31
Color calculation is performed only on every p-th frame (p = 5) for efficiency reasons
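The average-color method (1) with this frame sampling can be sketched as follows. It is a minimal illustration under the assumption that a frame is a flat list of (R, G, B) tuples; the function names are hypothetical, not the test-bed's API.

```python
# Sketch of automatic light-color calculation, method (1): average color
# in RGB, recomputed only on every p-th frame (p = 5 in the test-bed)
# and reused in between for efficiency.

def average_color(frame):
    """Return the average (R, G, B) color of one video frame."""
    n = len(frame)
    r = sum(px[0] for px in frame) // n
    g = sum(px[1] for px in frame) // n
    b = sum(px[2] for px in frame) // n
    return (r, g, b)

def light_colors(frames, p=5):
    """Compute a light color on every p-th frame; hold it otherwise."""
    colors, current = [], (0, 0, 0)
    for i, frame in enumerate(frames):
        if i % p == 0:
            current = average_color(frame)
        colors.append(current)
    return colors
```

Holding the color between sampled frames is what makes method (1) real-time applicable while still reacting immediately at each sampling point.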
Conclusions
• Test-bed for the QoMEX evaluation of sensory effects
– SEVino: a video annotation tool for sensory effects
– SESim: a corresponding simulation tool
– A real-world test environment based on the amBX system and SDK
• Major findings
– Average color for the automatic color calculation ➪ immediate reaction to color changes, appealing effects, low computational requirements, real-time applicable
– RGB, HSV, and HMMD dominant color ➪ smoother reaction to color changes, higher computational requirements
• Future work
– Optimization of the automatic color calculation (real-time)
– Subjective tests (already started; stay tuned)
– (Semi-)automatic extraction of sensory effect information
References
• M. Waltl, C. Timmerer, and H. Hellwagner, “A Test-Bed for Quality of Multimedia Experience Evaluation of Sensory Effects”, Proceedings of the First International Workshop on Quality of Multimedia Experience (QoMEX 2009), San Diego, USA, July 29-31, 2009.
• C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, “Interfacing with Virtual Worlds”, accepted for publication in the Proceedings of the 2009 NEM Summit, Saint-Malo, France, September 28-30, 2009.
• C. Timmerer, “MPEG-V: Media Context and Control”, 89th ISO/IEC JTC 1/SC 29/WG 11 (MPEG) Meeting, London, UK, June 2009. https://www-itec.uni-klu.ac.at/mmc/blog/2009/07/08/mpeg-v-media-context-and-control/
• MPEG-V: http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V
• MPEG-V reflector: http://lists.uni-klu.ac.at/mailman/listinfo/metaverse
Demo & Video
Thank you for your attention
... questions, comments, etc. are welcome …
Ass.-Prof. Dipl.-Ing. Dr. Christian Timmerer
Klagenfurt University, Department of Information Technology (ITEC)
Universitätsstrasse 65-67, A-9020 Klagenfurt, Austria
[email protected]
http://research.timmerer.com/
Tel: +43/463/2700 3621, Fax: +43/463/2700 3699
© Copyright: Christian Timmerer