
Demo: Detecting Group Formations using iBeacon Technology

Kleomenis Katevas†, Laurissa Tokarchuk†, Hamed Haddadi†, Richard G. Clegg†, Muhammad Irfan†‡

†Queen Mary University of London, UK. ‡University of Genova, Italy. [email protected]

1. INTRODUCTION

Researchers from different disciplines have examined crowd behavior in the past by employing a variety of methods, including ethnographic studies, computer vision techniques, and manual annotation-based data analysis. However, because of the inherent difficulties in collecting, processing and analyzing such data, large data sets for study are difficult to obtain. In this work we present a system for detecting stationary interactions inside crowds that relies entirely on sensors available in a modern smartphone, namely Bluetooth Smart (BLE) and the accelerometer. By utilizing Apple's iBeacon™ implementation of Bluetooth Smart through SensingKit¹, our open-source multi-platform mobile sensing framework [1], we are able to detect the proximity of users carrying a smartphone in their pocket. We then use an algorithm based on graph theory to predict group interactions inside the crowd. Previous work in this area has been limited to detecting interactions between only two people; our approach therefore goes beyond the current state of the art in its ability to detect group formations involving more than two people. Our approach is particularly beneficial to the design and implementation of crowd behavior analytics, the design of influence strategies, and algorithms for crowd reconfiguration.
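To illustrate the pipeline described above, the following sketch classifies each user as stationary from the variability of their linear-acceleration magnitudes, builds a proximity graph between stationary users, and reports its connected components as candidate group formations. This is a simplified illustration only: the function names, the 0.05 g threshold, and the input format are assumptions, and the actual algorithm used by our system is described in [2].

```python
from collections import defaultdict
from statistics import pstdev

# Hypothetical threshold: a window of linear-acceleration magnitudes
# (gravity removed, in g) with a standard deviation below this value is
# treated as 'stationary'. The deployed classifier is not specified here.
STATIONARY_STDEV = 0.05

def is_stationary(accel_window):
    """Label a user 'stationary' when their acceleration barely varies."""
    return pstdev(accel_window) < STATIONARY_STDEV

def detect_groups(proximity_pairs, accel_windows):
    """Sketch of graph-based group detection: nodes are stationary users,
    edges are detected beacon proximities between two stationary users,
    and each connected component with two or more members is reported
    as a candidate group formation."""
    stationary = {u for u, w in accel_windows.items() if is_stationary(w)}
    # Build an undirected adjacency list over stationary users only.
    adj = defaultdict(set)
    for a, b in proximity_pairs:
        if a in stationary and b in stationary:
            adj[a].add(b)
            adj[b].add(a)
    # Collect connected components with an iterative depth-first search.
    groups, seen = [], set()
    for node in adj:
        if node in seen:
            continue
        stack, component = [node], set()
        while stack:
            u = stack.pop()
            if u not in seen:
                seen.add(u)
                component.add(u)
                stack.extend(adj[u] - seen)
        if len(component) >= 2:
            groups.append(component)
    return groups
```

For example, if users a, b and c stand still within beacon range of one another while d walks past, only the group {a, b, c} is reported.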

2. DEMONSTRATION

As shown in Figure 1, our system consists of an iOS mobile application titled CrowdSense [3] and a cloud-based server back-end. Each user needs to have CrowdSense installed, enabling the device to both broadcast as an iBeacon™ and simultaneously scan for other beacons in the space. The app also collects the device's linear acceleration, classifying the user's motion activity as either 'stationary' or 'in motion'. Our back-end receives the data in real time and visualizes the detected group formations through a web interface. Additionally, it produces

¹https://www.sensingkit.org

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).

MobiSys'17, June 19-23, 2017, Niagara Falls, NY, USA
© 2017 Copyright held by the owner/author(s).

ACM ISBN 978-1-4503-4928-4/17/06.

DOI: http://dx.doi.org/10.1145/3081333.3089335

analytics about the detected crowd, such as types of interactions over time, clustering, and degree distributions. For more information about how our technology works, please refer to [2].
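One of the reported analytics, the degree distribution, can be computed directly from the interaction graph. The sketch below is illustrative only (the function name and edge-list format are assumptions; the back-end's exact metrics follow [2]):

```python
from collections import Counter

def degree_distribution(interaction_edges):
    """Count each user's interaction partners (their degree) in an
    undirected interaction graph, then return a mapping from degree
    value to the number of users having that degree."""
    degree = Counter()
    for a, b in interaction_edges:
        degree[a] += 1
        degree[b] += 1
    # Map degree value -> number of users with that degree.
    return dict(Counter(degree.values()))
```

For instance, a triangle of interactions a-b-c plus an extra edge c-d gives degrees a: 2, b: 2, c: 3, d: 1, so the distribution is {2: 2, 3: 1, 1: 1}.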

Our setup includes several iPhone devices with the CrowdSense app installed, which will be given to people interested in our demo. Since CrowdSense is available in the App Store, users can also use their personal devices to participate. The back-end will analyze the streamed data, detect the users' social interactions in real time, and visualize them on a screen through a web interface. A video recording of an actual social event (e.g. a conference coffee break) will also be displayed on another screen, while our system analyzes the more complex interaction scenarios happening in that event.

The purpose of this demo is to present a technology capable of providing in-depth analytics about how a crowd behaves during social events. A video showcase of the demo can be found at: https://www.sensingkit.org/MobiSys17-Demo.m4v.

Figure 1: System Design (a. Example of a social event; b. CrowdSense App; c. Back-end)

Acknowledgments

This work is supported by funding from the UK Defence Science and Technology Laboratory.

3. REFERENCES

[1] K. Katevas, H. Haddadi, and L. Tokarchuk. SensingKit: Evaluating the sensor power consumption in iOS devices. In 2016 12th International Conference on Intelligent Environments (IE), pages 222–225, Sept 2016.

[2] K. Katevas, H. Haddadi, L. Tokarchuk, and R. G. Clegg. Detecting group formations using iBeacon technology. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, UbiComp '16, pages 742–752, New York, NY, USA, 2016. ACM.

[3] K. Katevas. CrowdSense for iOS. https://itunes.apple.com/app/crowdsense/id930853606.