
Motion Origami

Daniel Bartoš¹

¹ FAMU, Prague, Czech Republic, daniel.bartos@gmail.com

Abstract. The Motion Origami project explores live performance strategies focused on gesture-based control of sound. The sound processing involves granular sound synthesis in a Max/MSP patch. Motion Origami is designed for a live performance scenario as an interactive music instrument and works either with live audio input or with pre-recorded sound material. This live interface prototype explores gesture-based music composition and performance techniques. The sound transformations are driven by hand gestures, while the use of a motion tracking device lets the user build up specific experience and virtuosity.

Keywords: Leap Motion sensor, gesture recognition, motion tracking, music expression, granular synthesis, Max/MSP

Introduction

The name of the Motion Origami project¹ is inspired by the Japanese art tradition of origami folding. The actual process of paper folding is reflected in a specific compositional strategy which uses captured hand gestures. In other words, the performing artist, the musician, is able to 'fold' sounds into new sound objects with his own hand gestures. The simple 'folds' therefore result in complex soundscapes during the performance.

¹ Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 1. Motion Origami – Max/MSP patch with the Leap Motion

This paper describes an original Max/MSP patch which uses the Smooth Overlapping Granular Synthesis object sogs~² as its main sound-transforming element. The novelty of this approach lies not in the production of new code or a particular design, but in making creative use of existing, advanced audio processing tools in sound and music performance. The project shows how a live interface prototype can be turned into an original compositional tool. As such, the Motion Origami project offers a recipe for approaching the design of an experimental music instrument, demonstrating an approach similar to rapid prototyping applied to the domain of advanced Max/MSP audio processing.

Leap Motion

The Leap Motion³ sensor was designed for touchless tracking of hand gestures and movements. In conjunction with the Leap Motion SDK, the sensor becomes a sophisticated controller and delivers complex information about hand movements in real time. Hand gestures are captured with high accuracy, along with individual finger positions, rotations and fingertip accelerations.
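As an illustration of the data the sensor delivers, the following minimal Python sketch polls one tracking frame through the (now legacy) Leap Motion v2 Python SDK. The attribute names follow that SDK; the returned dictionary layout is an assumption reused by the later sketches in this paper, not something the Motion Origami patch defines.

```python
# Minimal polling sketch using the (now legacy) Leap Motion v2 Python SDK.
import Leap

controller = Leap.Controller()

def read_hand():
    frame = controller.frame()            # most recent tracking frame
    if frame.hands.is_empty:
        return None                       # no hand above the sensor
    hand = frame.hands[0]
    return {
        "palm_xyz": (hand.palm_position.x,    # millimetres, sensor-centred
                     hand.palm_position.y,
                     hand.palm_position.z),
        "grab": hand.grab_strength,           # 0.0 (open hand) .. 1.0 (fist)
        "pitch": hand.direction.pitch,        # Euler angles in radians
        "yaw": hand.direction.yaw,
        "roll": hand.palm_normal.roll,
    }
```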

The Leap Motion sensor was introduced to the market in mid-2013 and swiftly found its place among other devices designed for body movement and gesture tracking, such as the Kinect⁴ and the Wii controller⁵. Such devices usually perform general body tracking and can be repurposed. The Leap Motion, on the contrary, is designed to capture hand gestures and movements only. In fact, the Leap Motion sensor can be thought of as a special interactive space⁶ of a predefined volume of air. Common music applications of the Leap Motion sensor are primarily based on imitations of the physical interfaces of existing musical instruments. This is true especially for the following projects: Air-Keyes, Crystal Piano, AirHarp, etc. (Han 2014). The main reason is that the virtual keyboard represents an ideal setup for testing system latency (Silva 2013), as a low-latency response is one of the most important requirements for any real-time music performance application. The latency of the Leap Motion, as advertised by the manufacturer, is anywhere from 5 ms to 20 ms, but this figure obviously depends on the whole system configuration and the components used. Another category of existing Leap Motion applications is represented by various hybrid instruments. A specific selection of such projects is described in the paper Lessons Learned in Exploring the Leap Motion™ (Han, Gold 2014).

The Motion Origami Body & Sound Interaction

The theme of physical interaction and sound processing is thoroughly investigated in the paper Sound Design as Human Matter Interaction (Wei 2013), where the most important keyword is the term material computation⁷. In an extended sense, we can think of the Leap Motion sensor as constituting an interactive space of its own, where the calculations and interaction take place. Gesture recognition in conjunction with music performance is also being explored by IRCAM's ISMM research team⁸.

² Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed online at http://forumnet.ircam.fr/product/max-sound-box
³ Leap Motion; the complete sensor and SDK specification can be accessed online at http://leapmotion.com
⁴ Microsoft Kinect is a sensor using a depth map to create a 3D space representation, developed by Microsoft for the Microsoft© Xbox 360 game console. The Kinect specification can be accessed online at www.microsoft.com/en-us/kinectforwindows
⁵ The Wii Remote is part of the Nintendo© Wii console developed by Nintendo© Company, Ltd. The Wii specification can be accessed online at www.nintendo.com/wii
⁶ An interactive space of 8 cubic feet (0.22 m³), as stated by the manufacturer; more information can be accessed online at http://leapmotion.com
⁷ "Real-time, continuously responsive environments for the design of technical systems for human-computer interaction design…". Ibid., p. 2010.

The physical body & sound interaction concept is also present, for example, in the projects of Marco Donnarumma, who uses a set of repurposed biosensors⁹.

In the case of the Motion Origami Max/MSP patch, the performer's physical gesture interaction is the primary source of the resulting sound transformations. The performer creates new sound objects with the captured hand gestures. The hand gestures in the Motion Origami patch are recognised in the interactive space above the sensor and are turned into a control mechanism coded in the Max/MSP patch. A single recognised hand gesture initialises the audio buffer with the incoming audio. The buffer acts as a starting point for building the granular synthesis soundscape. The hand gestures control the granular synthesis engine parameters, along with the timing of the buffer initialisation with new audio material during the live performance. The hand gestures also control the wet/dry ratio of the audio signal input and the multichannel audio distribution via the Ambisonics engine¹⁰.
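The control flow just described can be summarised in a short Python sketch. It captures the logic only: fill_buffer_from_live_input() and set_granular_params() are hypothetical stand-ins for the corresponding actions inside the Max/MSP patch, and the grab threshold is an assumed value.

```python
# Sketch of the gesture-to-control flow, assuming the read_hand() dictionary
# from the previous listing. The two called functions are hypothetical
# stand-ins for actions inside the Max/MSP patch.
GRAB_THRESHOLD = 0.9    # assumed: grab strength above this counts as a fist

class OrigamiControl:
    def __init__(self):
        self.grabbed = False    # edge detector for the "grab" gesture

    def on_frame(self, hand):
        if hand is None:
            return
        if hand["grab"] > GRAB_THRESHOLD:
            if not self.grabbed:
                fill_buffer_from_live_input()   # hypothetical: sample live input
                self.grabbed = True
        else:
            self.grabbed = False                # hand opened: re-arm the gesture
            set_granular_params(hand["palm_xyz"])  # hypothetical: drive sogs~
```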

Motion Origami Patch Implementation

The Motion Origami patch is programmed in Max/MSP¹¹. Data from the Leap Motion sensor are captured by the Swirly Leap object¹². An updated version of the patch uses the current and well-documented IRCAM leapmotion object¹³.

⁸ IRCAM – ISMM team {SOUND MUSIC MOVEMENT} INTERACTION; more information can be accessed online at http://ismm.ircam.fr
⁹ Marco Donnarumma; project presentations can be accessed online at www.marcodonnarumma.com
¹⁰ Ambisonics Tools from the Institute for Computer Music and Sound Technology (ICST), Zurich University of the Arts, can be accessed online at https://www.zhdk.ch/index.php?id=icst_ambisonicsexternals
¹¹ Max/MSP, a visual programming environment by Cycling '74; more information can be accessed online at http://www.cycling74.com

Figure 2: Motion Origami – the individual patch sections explained

The Smooth Overlapping Granular Synthesis object sogs~¹⁴ was chosen because it offers simple and creative control over the audio captured in the audio buffer and can be used for specific navigation and exploration of that buffer. The sogs~ object also mimics the paper folding technique in the sense that the original paper surface is substituted with a 2D plane made of two individual parameters: the grain position and the grain size. The data from the Leap Motion sensor are mapped to drive the sogs~ object with those two selected parameters: the performer then navigates the space of the audio buffer defined by the grain size and the grain position parameters respectively.
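A minimal sketch of this 2D mapping could look as follows; the coordinate ranges and grain limits are illustrative assumptions, not the scalings used in the patch.

```python
# Sketch of the 2D "paper plane" mapping: palm position -> grain parameters.
def clamp01(v):
    return max(0.0, min(1.0, v))

def map_to_grain(palm_xyz, buffer_ms=2000.0):
    x, y, _ = palm_xyz
    # horizontal hand position -> grain read position within the buffer
    pos_norm = clamp01((x + 300.0) / 600.0)    # assume x spans roughly -300..300 mm
    grain_position_ms = pos_norm * buffer_ms
    # vertical hand position -> grain duration
    size_norm = clamp01((y - 100.0) / 500.0)   # assume y spans roughly 100..600 mm
    grain_size_ms = 10.0 + size_norm * 490.0   # e.g. 10..500 ms grains
    return grain_position_ms, grain_size_ms
```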

The wet/dry mix ratio, which is also mapped to hand gestures, offers detailed control over the sound merging and accentuates the actual 'sound folds'. These 'sound folds' can build up a certain level of complexity thanks to the fact that the live audio source is coupled with the audio material stored in the buffer. Although the audio is modified in the granulation process, it shares the same spectral and tonal characteristics with the original sound source. This in turn creates elaborate sound transformations, which can be thought of as the introduced 'sound folding' process. The patch recognises a specific gesture which is required to start the initialisation of the audio buffer. In this way the audio buffer is filled with new audio material. The buffer initialisation starts with a closed-hand gesture, which can be paraphrased as a 'grab the sound' gesture. At this very moment, the buffer is filled with the live sound input and becomes available to the sogs~ object. Subsequent hand gestures control various aspects of the granular synthesis engine: a horizontal hand swipe controls the grain position selection, while vertical hand movement controls the time length of a grain. Moreover, the overall palm position above the sensor in the x-y plane defines the sound source position in the multichannel Ambisonics space and adds a multichannel spatialisation layer to the performance. The other variables recognised by the Leap Motion, such as yaw, pitch and roll, are alternatively mapped to extended FX processing (reverb, distortion, etc.), depending on the performance scenario.
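The spatialisation and FX mappings can be sketched in the same spirit. The use of atan2 to derive an azimuth, the plane chosen for the palm position, and all scalings are assumptions made for illustration, not the patch's actual values.

```python
import math

clamp01 = lambda v: max(0.0, min(1.0, v))

# Sketch: palm position in the horizontal plane -> Ambisonics azimuth and
# distance; hand Euler angles -> optional FX sends (names are assumptions).
def map_spatial_and_fx(hand):
    x, _, z = hand["palm_xyz"]
    azimuth = math.atan2(x, -z)                   # source angle around the listener
    distance = clamp01(math.hypot(x, z) / 300.0)  # 0 = centre, 1 = edge of the box
    fx = {
        "reverb":     clamp01((hand["pitch"] + math.pi / 2) / math.pi),
        "distortion": clamp01((hand["roll"] + math.pi) / (2 * math.pi)),
    }
    return azimuth, distance, fx
```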

¹² Swirly Leap, Max/MSP access to the Leap Motion API, written by Tom Ritchford from New York University. The project can be accessed online at https://github.com/rec/swirly-leap
¹³ The well-documented Max/MSP object leapmotion by IRCAM; more information can be accessed online at http://ismm.ircam.fr/leapmotion/
¹⁴ Max/MSP object sogs~ (Smooth Overlap Granular Synthesis) by Norbert Schnell, IRCAM – Centre Pompidou; more information can be accessed at http://forumnet.ircam.fr/product/max-sound-box

Figure 3. Motion Origami – the recognised gestures illustrated

Conclusions & Performance

The most inviting application of the Motion Origami patch¹⁵ is a vocal performance, or simply any music instrument which leaves enough space for interaction with the sensor itself. The beauty of the live performance approach lies in the fact that the performer can interact with his/her captured music material and add multiple layers of expression solely by using hand gestures. A new layer of improvisation can be introduced while new themes and phrases emerge. The performer then interacts with new music material, which is based on the sound qualities of the original music instrument or the vocal performance. The performer can control the following parameters in the Motion Origami live interface: time-based selection of a phrase sampled into the buffer; grain size and its position in the buffer; wet/dry mix ratio; and the Ambisonics sound source position (if applicable).

Using gestures in music composition and performance proves to be very intuitive. The sensor alone has to be 'learned' to be operated properly, and this fact develops a specific virtuosity over time. The Leap Motion sensor with the Motion Origami patch opens up an exciting new field of music composition and sound processing coupled with immediate gestural interaction. The biggest challenge in gesture-based performance is the recognition of quantized gestures (Potter 2013). While parametric control of the various patch elements does not present any technical problem tracking-wise, the recognition of quantized and unique gestures proved difficult throughout the development phase of the patch. For example, while playing on a virtual keyboard, one can limit the keystrokes to a specific scale and thus reduce mis-triggered notes. But when it comes to evoking a specific functionality (sampling initialisation, sound bank toggle, etc.), the gestures have to be recognized with exceptionally high precision, as those decisions form an integral part of the performance itself. This aspect of live performance imposes specific constraints which have to be considered in live performance scenarios when using the Leap Motion sensor. The quality of the tracking also depends on the ambient light conditions and the overall sensor setup. For example, direct light reaching the sensor's surface can introduce inconsistency in the tracking.
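A common remedy, sketched below, is to combine hysteresis with a minimum hold time: a quantized gesture fires only after its tracked value has stayed above an 'on' threshold for a short interval, and cannot re-fire until the value falls below a lower 'off' threshold. The thresholds are illustrative assumptions; the Motion Origami patch may resolve this differently.

```python
import time

# Debounced one-shot gesture: hysteresis (ON/OFF thresholds) plus a hold time.
class DebouncedGesture:
    ON, OFF, HOLD_S = 0.9, 0.6, 0.15

    def __init__(self):
        self.since = None      # time the value first crossed the ON threshold
        self.active = False

    def update(self, value):
        """Return True exactly once per confirmed gesture."""
        now = time.monotonic()
        if self.active:
            if value < self.OFF:
                self.active = False    # released: allow the next trigger
            return False
        if value >= self.ON:
            if self.since is None:
                self.since = now
            elif now - self.since >= self.HOLD_S:
                self.active, self.since = True, None
                return True
        else:
            self.since = None
        return False
```

Fed with the grab strength from the earlier sketches, update() returns True exactly once per deliberate fist, which makes it suitable for one-shot actions such as the buffer initialisation.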

¹⁵ Motion Origami; the project presentation can be accessed online at http://www.danielbartos.com/motion-origami

Figure 4: Motion Origami – a detail of the subpatch controlling the audio buffer initialisation

Overall, the Leap Motion is very suitable for various intuitive music composition and performance scenarios. Occasional errors in tracking can be overcome with good patch design and a restricted use of quantized gestures – leaving the quantized gestures to be serviced by traditional hardware controllers. That said, the Leap Motion sensor excels in intuitive gesture interaction performance and gesture-based music composition strategies.

Acknowledgements. The Motion Origami project was funded by the research grant scheme "Sensors as music instruments" at HAMU, the Music Academy in Prague, run by Michal Rataj. Special thanks go to Michal Rataj¹⁶ and Matthew Ostrowski¹⁷ for their support, their help with the patch design and their ideas about performance and music composition.

References

Han, Jihyun; Gold, Nicolas. 2014. "Lessons Learned in Exploring the Leap Motion™ Sensor for Gesture-based Instrument Design," in Caramiaux, B.; Tahiroğlu, K.; Fiebrink, R.; Tanaka, A. (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'14), London 2014, Goldsmiths, University of London: 371-374.

Potter, Leigh Ellen; Araullo, Jake; Carter, Lewis. 2013. "The Leap Motion controller: A view on sign language," Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, Adelaide, ACM, New York: 175-178.

Silva, Eduardo S.; de Abreu, Jader Anderson O.; de Almeida, Janiel Henrique P.; Teichrieb, Veronica; Ramalho, Geber L. 2013. "A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments," Proceedings of the Brazilian Symposium on Computer Music, Brasília, SBCM: 59-70.

Schwarz, Diemo. 2006. "Concatenative sound synthesis: The early years," Journal of New Music Research 35(1): 3–22 (March).

Wei, Sha Xin; Freed, Adrian; Navab, Navid. 2013. "Sound Design As Human Matter Interaction," Proceedings of CHI'13 Extended Abstracts on Human Factors in Computing Systems, Paris, ACM, New York: 2009-2018.

¹⁶ Michal Rataj – music composer and researcher at HAMU (Music Academy in Prague). More information can be accessed online at http://michalrataj.com
¹⁷ Matthew Ostrowski – an expert on sensors and multimedia programming based at Harvestworks NYC, who offered valuable insights into Leap Motion programming. More information can be accessed online at http://www.ostrowski.info and http://www.harvestworks.org