Core Audio in iOS 6 (CocoaConf San Jose, April 2013)


Transcript of Core Audio in iOS 6 (CocoaConf San Jose, April 2013)

Page 1: Core Audio in iOS 6 (CocoaConf San Jose, April 2013)

Core Audio in iOS 6
Chris Adamson • @invalidname

CocoaConf San Jose
April 20, 2013

Slides and code available on my blog: http://www.subfurther.com/blog

Monday, April 29, 13

Page 2: Core Audio in iOS 6 (CocoaConf San Jose, April 2013)

Plug!

Page 3

The Reviews Are In!

Page 7

Legitimate copies!

• Amazon (paper or Kindle)

• Barnes & Noble (paper or Nook)

• Apple (iBooks)

• Direct from InformIT (paper, eBook [.epub + .mobi + .pdf], or Bundle)

Page 8

What You’ll Learn

• What Core Audio does and doesn’t do

• When to use and not use it

• What’s new in Core Audio for iOS 6

Page 12

Simple things should be simple,complex things should be possible.

–Alan Kay

AV Foundation, Media Player

Core Audio

Page 13

Core Audio

• Low-level C framework for processing audio

• Capture, play-out, real-time or off-line processing

• The “complex things should be possible” part of audio on OS X and iOS

Page 14

Chris’ CA Taxonomy

• Engines: process streams of audio

• Capture, play-out, mixing, effects processing

• Helpers: deal with formats, encodings, etc.

• File I/O, stream I/O, format conversion, iOS “session” management

Page 15

Helpers: Audio File

• Read from / write to multiple audio file types (.aiff, .wav, .caf, .m4a, .mp3) in a content-agnostic way

• Get metadata (data format, duration, iTunes/ID3 info)

Page 16

Helpers: Audio File Stream

• Read audio from non-random-access source like a network stream

• Discover encoding and encapsulation on the fly, then deliver audio packets to client application

Page 17

Helpers: Converters

• Convert buffers of audio to and from different encodings

• One side must be in an uncompressed format (i.e., Linear PCM)

Page 18

Helpers: ExtAudioFile

• Combine file I/O and format conversion

• Read a compressed file into PCM buffers

• Write PCM buffers into a compressed file

Page 19

Helpers: Audio Session

• iOS-only API to negotiate use of audio resources with the rest of the system

• Determine whether your app mixes with other apps’ audio, honors the ring/silent switch, can play in the background, etc.

• Get notified of audio interruptions

• See also AVAudioSession

Page 20

Engines: Audio Units

• Low-latency (~10ms) processing of capture/play-out audio data

• Effects, mixing, etc.

• Connect units manually or via an AUGraph

• Much more on this topic momentarily…

Page 21

Engines: Audio Queue

• Convenience API for recording or play-out, built atop audio units

• Rather than processing on-demand and on Core Audio’s thread, your callback provides or receives buffers of audio (at whatever size is convenient to you)

• Higher latency, naturally

• Supports compressed formats (MP3, AAC)

Page 22

Engines: Open AL

• API for 3D spatialized audio, implemented atop audio units

• Set a source’s properties (x/y/z coordinates, orientation, audio buffer, etc.), OpenAL renders what it sounds like to the listener from that location

Page 23

Engines and Helpers

• Audio Units

• Audio Queue

• Open AL

• Audio File

• Audio File Stream

• Audio Converter

• ExtAudioFile

• Audio Session

Page 24

Audio Units

Page 25

Audio Unit

AUSomething

Page 26

Types of Audio Units

• Output (which also do input)

• Generator

• Converter

• Effect

• Mixer

• Music

Page 27

Pull Model

AUSomething

Page 28

Pull Model

AUSomething

AudioUnitRender()

Page 29

Pull Model

AUSomething

AUSomethingElse

Page 30

Buses (aka, Elements)

AUSomething

AUSomethingElse

AUSomethingElse

Page 31

AUGraph

AUSomething

AUSomethingElse

AUSomethingElse

Page 32

Render Callbacks

AUSomething

AUSomethingElse

OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;
    return noErr;
}

Page 33

AURemoteIO

• Output unit used for play-out, capture

• A Core Audio thread repeatedly and automatically calls AudioUnitRender()

• Must set EnableIO property to explicitly enable capture and/or play-out

• Capture requires setting appropriate AudioSession category

Page 34

Create AURemoteIO

CheckError(NewAUGraph(&_auGraph),
           "couldn't create au graph");
CheckError(AUGraphOpen(_auGraph),
           "couldn't open au graph");
AudioComponentDescription componentDesc;
componentDesc.componentType = kAudioUnitType_Output;
componentDesc.componentSubType = kAudioUnitSubType_RemoteIO;
componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
AUNode remoteIONode;
CheckError(AUGraphAddNode(_auGraph,
                          &componentDesc,
                          &remoteIONode),
           "couldn't add remote io node");

Page 35

Getting an AudioUnit from AUNode

CheckError(AUGraphNodeInfo(self.auGraph,
                           remoteIONode,
                           NULL,
                           &_remoteIOUnit),
           "couldn't get remote io unit from node");

Page 40

AURemoteIO Buses

AURemoteIO
bus 0: to output H/W
bus 1: from input H/W
bus 1: to app
bus 0: from app

Page 41

EnableIO

UInt32 oneFlag = 1;
UInt32 busZero = 0;
CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioOutputUnitProperty_EnableIO,
                                kAudioUnitScope_Output,
                                busZero,
                                &oneFlag,
                                sizeof(oneFlag)),
           "couldn't enable remote io output");
UInt32 busOne = 1;
CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioOutputUnitProperty_EnableIO,
                                kAudioUnitScope_Input,
                                busOne,
                                &oneFlag,
                                sizeof(oneFlag)),
           "couldn't enable remote io input");

Page 42

Pass Through

AURemoteIO

bus 1: from input H/W
bus 0: to output H/W

Page 43

Connect In to Out

UInt32 busZero = 0;
UInt32 busOne = 1;
CheckError(AUGraphConnectNodeInput(self.auGraph,
                                   remoteIONode,
                                   busOne,
                                   remoteIONode,
                                   busZero),
           "couldn't connect remote io bus 1 to 0");

Page 44

Pass-Through with Effect

bus 0to output H/W

AURemoteIO

AUEffect

bus 1from input H/W

Page 45

Demo: Delay Effect

New in iOS 6!

Page 46

Creating the AUDelay

componentDesc.componentType = kAudioUnitType_Effect;
componentDesc.componentSubType = kAudioUnitSubType_Delay;
componentDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
AUNode effectNode;
CheckError(AUGraphAddNode(self.auGraph,
                          &componentDesc,
                          &effectNode),
           "couldn't create effect node");
AudioUnit effectUnit;
CheckError(AUGraphNodeInfo(self.auGraph,
                           effectNode,
                           NULL,
                           &effectUnit),
           "couldn't get effect unit from node");

Page 47

The problem with effect units

• Audio Units available since iPhone OS 2.0 prefer int formats

• Effect units arrived with iOS 5 (armv7 era) and only work with the float format

• Have to set the AUEffect unit’s format on AURemoteIO

Page 48

Setting formats

AudioStreamBasicDescription effectDataFormat;
UInt32 propSize = sizeof (effectDataFormat);
CheckError(AudioUnitGetProperty(effectUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output,
                                busZero,
                                &effectDataFormat,
                                &propSize),
           "couldn't read effect format");
CheckError(AudioUnitSetProperty(self.remoteIOUnit,
                                kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output,
                                busOne,
                                &effectDataFormat,
                                propSize),
           "couldn't set bus one output format");

Then repeat AudioUnitSetProperty() for input scope / bus 0

Page 49

AUNewTimePitch

• New in iOS 6!

• Allows you to change pitch independent of time, or time independent of pitch

• How do you use it?

Page 50

AUTimePitch

AudioComponentDescription effectcd = {0};
effectcd.componentType = kAudioUnitType_FormatConverter;
effectcd.componentSubType = kAudioUnitSubType_NewTimePitch;
effectcd.componentManufacturer = kAudioUnitManufacturer_Apple;
AUNode effectNode;
CheckError(AUGraphAddNode(self.auGraph,
                          &effectcd,
                          &effectNode),
           "couldn't get effect node [time/pitch]");

Notice the type is AUFormatConverter, not AUEffect

Page 51

AudioUnitParameters.h

// Parameters for AUNewTimePitch
enum {
    // Global, rate, 1/32 -> 32.0, 1.0
    kNewTimePitchParam_Rate                 = 0,
    // Global, Cents, -2400 -> 2400, 1.0
    kNewTimePitchParam_Pitch                = 1,
    // Global, generic, 3.0 -> 32.0, 8.0
    kNewTimePitchParam_Overlap              = 4,
    // Global, Boolean, 0->1, 1
    kNewTimePitchParam_EnablePeakLocking    = 6
};

This is the entire documentation for the AUNewTimePitch parameters

Page 52

AUNewTimePitch parameters

• Rate: kNewTimePitchParam_Rate takes a Float32 rate from 1/32 speed to 32x speed.

• Use powers of 2: 1/32, 1/16, …, 2, 4, 8…

• Pitch: kNewTimePitchParam_Pitch takes a Float32 representing cents, meaning 1/100 of a musical semitone

Page 53

Pitch shifting

• Pitch can vary, time does not

• Suitable for real-time sources, such as audio capture

Page 54

Demo: Pitch Shift

New in iOS 6!

Page 55

Rate shifting

• Rate can vary, pitch does not

• Think of 1.5x and 2x speed modes in Podcasts app

• Not suitable for real-time sources, as data will be consumed faster. Files work well.

• Sources must be able to map time systems with kAudioUnitProperty_InputSamplesInOutput

Page 56

Demo: Rate Shift

New in iOS 6!

Page 57

AUSplitter

AUSplitter

AUSomethingElse

AUSomethingElse

New in iOS 6!

Page 58

AUMatrixMixer

AUMatrixMixer

AUSomethingElse

AUSomethingElse

AUSomethingElse

AUSomethingElse

AUSomethingElse

New in iOS 6!

Page 59

Audio Queues (and the APIs that help them)

Page 60

AudioQueue

• Easier than AURemoteIO - provide data when you want to, less time pressure, can accept or provide compressed formats (MP3, AAC)

• Recording queue - receive buffers of captured audio in a callback

• Play-out queue - enqueue buffers of audio to play, optionally refill in a callback

Page 65

Common AQ scenarios

• File player - Read from file and “prime” queue buffers, start queue, when called back with used buffer, refill from next part of file

• Synthesis - Maintain state in your own code, write raw samples into buffers during callbacks

Page 66

Web Radio

• Project from Thursday’s workshop

• Use Audio File Stream Services to pick out audio data from a network stream

• Enqueue these packets as new AQ buffers

• Dispose used buffers in callback

Page 70

Parsing web radio

[Diagram: NSData buffers from the network are parsed into packets, which fill numbered AudioQueueBuffers]

NSURLConnection delivers NSData buffers, containing audio and framing info. We pass it to Audio File Services.

Audio File Services calls us back with parsed packets of audio data.

We create an AudioQueueBuffer with those packets and enqueue it for play-out.

Page 71

A complex thing!

• What if we want to see that data after it’s been decoded to PCM and is about to be played?

• e.g., spectrum analysis, effects, visualizers

• AudioQueue design is “fire-and-forget”

Page 72

AudioQueue Tap!

http://www.last.fm/music/Spinal+Tap

Page 73

AudioQueueProcessingTap

• Set as a property on the Audio Queue

• Calls back to your function with decoded (PCM) audio data

• Three types: pre- or post- effects (that the AQ performs), or siphon. First two can modify the data.

• Only documentation is in AudioQueue.h

Page 74

Creating an AQ Tap

// create the tap
UInt32 maxFrames = 0;
AudioStreamBasicDescription tapFormat = {0};
AudioQueueProcessingTapRef tapRef;
CheckError(AudioQueueProcessingTapNew(audioQueue,
                                      tapProc,
                                      (__bridge void *)(player),
                                      kAudioQueueProcessingTap_PreEffects,
                                      &maxFrames,
                                      &tapFormat,
                                      &tapRef),
           "couldn't create AQ tap");

Notice that you receive maxFrames and tapFormat. These do not appear to be settable.

Page 75

AQ Tap Proc

void tapProc (void *inClientData,
              AudioQueueProcessingTapRef inAQTap,
              UInt32 inNumberFrames,
              AudioTimeStamp *ioTimeStamp,
              UInt32 *ioFlags,
              UInt32 *outNumberFrames,
              AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inClientData;
    UInt32 getSourceFlags = 0;
    UInt32 getSourceFrames = 0;
    AudioQueueProcessingTapGetSourceAudio(inAQTap,
                                          inNumberFrames,
                                          ioTimeStamp,
                                          &getSourceFlags,
                                          &getSourceFrames,
                                          ioData);
    // then do something with ioData
    // ...

Page 77

So what should we do with the audio?

Let’s apply our pitch-shift effect

Page 79

Shouldn’t this work?

AUEffect ← AudioUnitRender()

Page 80

AudioUnitRender()

• Last argument is an AudioBufferList, whose AudioBuffer members have mData pointers

• If mData != NULL, audio unit does its thing with those samples

• If mData == NULL, the audio unit pulls from whatever it’s connected to

• So we just call with AudioBufferList ioData we got from tap callback, right?

Page 81

Psych!

• AQ tap provides data as signed ints

• Effect units only work with floating point

• We need to do an on-the-spot format conversion

Page 82

invalidname’s convert-and-effect recipe

AUConverter → AUEffect → AUConverter → AUGenericOutput

OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;
    return noErr;
}

Note: red arrows are float format, yellow arrows are int

Page 83

How it works

• AUGraph: AUConverter → AUEffect → AUConverter → AUGenericOutput

• Top AUConverter is connected to a render callback function

Page 84

The trick!

• Copy mData pointer to a state variable and NULL it in ioData

• Call AudioUnitRender() on the output unit. The NULL makes it pull from the graph.

• Top of the graph pulls on render callback, which gives it back the mData we copied off.

Page 85

Yes, really

This is the rest of tapProc():

    // copy off the ioData so the graph can read from it
    // in render callback
    player.preRenderData = ioData->mBuffers[0].mData;
    ioData->mBuffers[0].mData = NULL;

    OSStatus renderErr = noErr;
    AudioUnitRenderActionFlags actionFlags = 0;
    renderErr = AudioUnitRender(player.genericOutputUnit,
                                &actionFlags,
                                player.renderTimeStamp,
                                0,
                                inNumberFrames,
                                ioData);
    NSLog (@"AudioUnitRender, renderErr = %ld", renderErr);
}

Page 86

Yes, really

OSStatus converterInputRenderCallback (void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
    CCFWebRadioPlayer *player = (__bridge CCFWebRadioPlayer*) inRefCon;
    // read from buffer
    ioData->mBuffers[0].mData = player.preRenderData;
    return noErr;
}

This is the render callback that supplies data to the int→float converter

Page 87

Demo: AQ Tap + AUNewTimePitch

New in iOS 6!

Page 89

Meanwhile, in a mini-bus near Copenhagen…

Page 90

Audiobus

Page 91

Audiobus

• Allows multiple audio apps to exchange data in realtime

• Works by sending raw audio data over MIDI

• Actually approved by Apple

• Actually supported in GarageBand

Page 94

Supporting Audiobus

• Get the SDK from audiob.us

• Enable background mode, add an audiobus-compatible URL scheme, get API key from audiob.us

• Create and use ABAudiobusController, ABOutputPort/ABInputPort, and ABAudiobusAudioUnitWrapper

Page 95

Wrapping up…

Page 96

Takeaways

• Core Audio fundamentals never change

• New stuff is added as properties, typedefs, enums, etc.

• Watch the SDK API diffs document to find the new stuff

• Hope you like header files and experimentation

Page 97

Q&A

• Slides will be posted to slideshare.net/invalidname

• Code will be linked from there and my blog

• Watch the CocoaConf Glassboard, @invalidname on Twitter/ADN, or my blog for announcements

• Thanks!
