
THE PERCEPTION OF SYMMETRY IN THE MOVING IMAGE
Multi-Level Computational Analysis of Cinematographic Scene Structure and its Visual Reception

Jakob Suchan, Mehul Bhatt (University of Bremen, DE), and Stella Yu (UC Berkeley, USA)

Figure 1: Multi-level analysis of symmetry in the composition of the moving image, at the scene level, object level, and image level. (GRAND BUDAPEST HOTEL, WES ANDERSON, 2014)

1 INTRODUCTION

This research is driven by cognitive film studies focussed on visuo-spatial perception, where the key emphasis is on the systematic study and generation of evidence that can characterise and establish correlates between principles for the synthesis of the moving image and its cognitive (e.g., embodied visuo-auditory, emotional) recipient effects on observers [Suchan and Bhatt 2016b; Suchan and Bhatt 2016a]. Within this context, we focus on the case of “symmetry” in the cinematographic structure of the moving image, and propose a multi-level model for interpreting symmetric patterns therein. This provides the foundation for integrating scene analysis with the analysis of its visuo-spatial perception based on eye-tracking data. This is achieved by integrating the computational semantic interpretation of the scene [Suchan and Bhatt 2016b], involving scene objects (people, objects in the scene) and cinematographic aids (camera movement, shot types, cuts, and scene structure), with perceptual artefacts (fixations, saccades, scan-paths, areas of attention).
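To make the constituents of this integration concrete, the following is a minimal Python sketch of how semantic scene annotations and perceptual artefacts could be represented side by side. The class and field names are hypothetical illustrations, not the framework's actual data model.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SceneObject:
    label: str                         # e.g. "character", "prop"
    bbox: Tuple[int, int, int, int]    # (x, y, w, h) in frame coordinates
    frame: int                         # frame index of the annotation

@dataclass
class CinematographicAid:
    kind: str                          # e.g. "cut", "camera_movement", "shot_type"
    value: str                         # e.g. "pan_left", "close_up"
    start_frame: int
    end_frame: int

@dataclass
class Fixation:
    x: float                           # gaze position in frame coordinates
    y: float
    onset_ms: float
    duration_ms: float

@dataclass
class SceneInterpretation:
    objects: List[SceneObject] = field(default_factory=list)
    aids: List[CinematographicAid] = field(default_factory=list)
    fixations: List[Fixation] = field(default_factory=list)   # per-spectator perceptual artefacts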

2 SYMMETRY IN SPACE AND TIME: A MULTI-LEVEL MODEL

Symmetry is represented within our computational framework as a multi-level model allowing formal analysis at different layers of abstraction; we look at symmetry on three levels (L1–L3):

(L1). SCENE LEVEL Symmetry in the editing of a scene is defined by the symmetric use of cinematographic aids, e.g., intercutting between characters, or symmetric camera movements.

(L2). OBJECT LEVEL Symmetry on the object level is defined based on the placement of objects and people in the frame, e.g., placing characters on the symmetry axis of the frame.

(L3). IMAGE LEVEL Image-level symmetry is defined based on low-level features that support a symmetry axis, i.e., contrast edges, textures, etc. On this level, symmetry can occur in multiple places, and objects can be symmetric within themselves (a minimal scoring sketch for this level is given after this section's summary).


Our multi-level model of symmetry provides a comprehensive characterisation of the symmetric structure of a scene, connecting high-level conceptual categories to low-level visual features in the image.
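As a concrete illustration of image-level (L3) symmetry, the sketch below scores how strongly low-level edge features in a frame support a vertical symmetry axis through the frame centre, by comparing an edge map against its mirror image. It is an assumed, simplified baseline using OpenCV and NumPy, not the feature extraction used in this work.

import cv2
import numpy as np

def vertical_symmetry_score(frame_bgr: np.ndarray) -> float:
    """Score in [0, 1] for how well low-level edge features support a
    vertical symmetry axis through the frame centre (illustrative only)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200).astype(np.float32) / 255.0

    h, w = edges.shape
    half = w // 2
    left = edges[:, :half]
    right_mirrored = np.fliplr(edges[:, w - half:])

    # Fraction of edge pixels that have a mirrored counterpart on the other side.
    overlap = np.minimum(left, right_mirrored).sum()
    total = max(left.sum(), right_mirrored.sum(), 1.0)
    return float(overlap / total)

# Usage: score = vertical_symmetry_score(cv2.imread("frame_0001.png"))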

3 VISUAL PERCEPTION OF SYMMETRY

The reception of symmetry is studied by analysing the eye-movement behaviour of spectators and correlating it with the multi-level symmetry model of the scene (Sec. 2). Perceptual data encompasses individual eye-movements and aggregated gaze data of multiple spectators.
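By way of illustration, aggregated gaze data of multiple spectators can be summarised as a coarse attention map over the frame; the sketch below is a hypothetical minimal version of such an aggregation, not the analysis pipeline used here.

import numpy as np

def attention_map(fixations_per_spectator, frame_w, frame_h, bins=(32, 18)):
    """Aggregate the fixations of multiple spectators into a coarse 2D
    attention histogram over the frame (illustrative sketch only).

    fixations_per_spectator: list of per-spectator lists of (x, y) gaze
    positions in frame coordinates."""
    points = np.array([p for spectator in fixations_per_spectator
                       for p in spectator], dtype=float)
    heat, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins,
                                range=[[0, frame_w], [0, frame_h]])
    return heat / max(heat.sum(), 1.0)   # normalise to a distribution

# Usage with toy data: two spectators watching a 1920x1080 frame.
# heat = attention_map([[(960, 540), (970, 530)], [(950, 545)]], 1920, 1080)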

EXPERIMENTAL DATASET We investigate the perception of symmetry in the context of an eye-tracking dataset.¹ We particularly focus on select films of director Wes Anderson, whose use of symmetry in scene structuring and character placement, as well as in editing and camera movement, is well known.

SYMMETRY, EYE-MOVEMENT PATTERNS, AND GAZE TRANSITIONS Building on [Yu 2010], our experiment focusses on analysing the relationship between spatio-temporal symmetry and the resulting gaze transitions vis-à-vis low-level features and corresponding high-level objects. Our preliminary results suggest that symmetry in the composition of frames and in the editing reduces gaze transitions on the object level, and that symmetric editing and camera movement reduce eye-movements after a cut.
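The two measures behind these preliminary observations can be approximated as follows; this is a simplified reconstruction under assumed data formats (fixations as (x, y) points, object annotations as labelled bounding boxes, gaze samples at 60 Hz), not the study's analysis code.

from typing import List, Optional, Tuple
import numpy as np

BBox = Tuple[int, int, int, int]    # (x, y, w, h) object annotation in the frame

def fixated_object(fx: float, fy: float,
                   objects: List[Tuple[str, BBox]]) -> Optional[str]:
    """Return the label of the annotated object a fixation lands on, if any."""
    for label, (x, y, w, h) in objects:
        if x <= fx <= x + w and y <= fy <= y + h:
            return label
    return None

def object_level_transitions(fixations: List[Tuple[float, float]],
                             objects: List[Tuple[str, BBox]]) -> int:
    """Count gaze transitions between distinct annotated objects."""
    labels = [fixated_object(fx, fy, objects) for fx, fy in fixations]
    labels = [l for l in labels if l is not None]
    return sum(1 for a, b in zip(labels, labels[1:]) if a != b)

def post_cut_displacement(gaze_xy: np.ndarray, cut_idx: int, window: int = 30) -> float:
    """Total gaze displacement (in pixels) over `window` samples after a cut;
    at 60 Hz, 30 samples cover roughly the first half second after the cut."""
    segment = gaze_xy[cut_idx:cut_idx + window + 1]
    return float(np.linalg.norm(np.diff(segment, axis=0), axis=1).sum())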

References

SUCHAN, J., AND BHATT, M. 2016. Semantic Question-Answering with Video and Eye-Tracking Data – AI Foundations for Human Visual Perception Driven Cognitive Film Studies. In Intl. Joint Conference on Artificial Intelligence (IJCAI 2016).

SUCHAN, J., AND BHATT, M. 2016. The Geometry of a Scene: On Deep Semantics for Visual Perception Driven Cognitive Film Studies. In WACV 2016, IEEE.

YU, S. X. 2010. Feature Transitions with Saccadic Search: Size, Color, and Orientation Are Not Alike. In NIPS 2010, Curran Associates, Inc., 2523–2531.

¹ The eye-tracking dataset was generated from experiments involving a total of 31 subjects, each viewing 16 scenes drawn from 14 films. We conducted the experiment with the stationary Tobii X2-60 Eye Tracker, collecting eye-movement data at a rate of 60 Hz.