Video Killed the Rolex Star (CocoaConf San Jose, November, 2015)


Video Killed The Rolex Star

Chris Adamson • @invalidname CocoaConf San Jose • November, 2015

Media Support in watchOS 2.0

watchOS 2.0

• App Extension runs on watch, not on iPhone

• New watchOS APIs for media playback and recording

• Prepare yourself, they’re limited!

From watchOS 2.0 Transition Guide

You must implement your extension using the frameworks in the watchOS SDK instead of the iOS SDK. For any features not available in the provided frameworks, you must rely on your iPhone app to perform the corresponding task.

From watchOS 2.0 Transition Guide

Your extension now stores files and data on Apple Watch. Any data that is not part of your Watch app or WatchKit extension bundle must be fetched from the network or from the companion iOS app running on the user’s iPhone. You cannot rely on a shared group container to exchange files with your iOS app. Fetching files involves transferring them wirelessly to Apple Watch.

From watchOS 2.0 Transition Guide

Media files. The Watch app handles audio and video playback in your app. If your WatchKit extension downloads media files from the network or the companion iOS app, you must place those files in a shared group container that is accessible to both your Watch app and WatchKit extension. For more information about managing media-related files, see Managing Your Media.

Media functionality

• Video playback

• Audio playback

• Audio recording

Video Playback

• WKInterfaceMovie — Canned UI component for movie playback

• WKInterfaceController — A/V features provided by primary UI controller class

WKInterfaceMovie

A WKInterfaceMovie object lets you play back video and audio content directly from your interface. A movie object displays a poster image with a play button on top of it. When the user taps the play button, WatchKit plays the movie in a modal interface.

// WKInterfaceMovie.h
// WatchKit

WK_AVAILABLE_WATCHOS_ONLY(2.0)
@interface WKInterfaceMovie : WKInterfaceObject

- (void)setMovieURL:(NSURL *)URL;
- (void)setVideoGravity:(WKVideoGravity)videoGravity; // default is WKVideoGravityResizeAspect
- (void)setLoops:(BOOL)loops;
- (void)setPosterImage:(nullable WKImage *)posterImage;

@end
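For context, a minimal Swift sketch of driving this from an interface controller; the controller class, the outlet, and the bundled "clip.m4v" resource are assumptions, not from the slides:

import WatchKit

class MovieController: WKInterfaceController {
    @IBOutlet var movie: WKInterfaceMovie!  // hypothetical outlet, wired in the storyboard

    override func awakeWithContext(context: AnyObject?) {
        super.awakeWithContext(context)
        // hypothetical clip shipped in the Watch app bundle
        if let url = NSBundle.mainBundle().URLForResource("clip", withExtension: "m4v") {
            movie.setMovieURL(url)
            movie.setVideoGravity(.ResizeAspect)  // the default, shown for clarity
            movie.setLoops(false)
        }
    }
}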

Video Gravity

• Conceptually identical to the video gravity constants in AV Foundation

• Resize — stretch pixels to fill container

• Aspect (Fit) — honoring aspect ratio, scale to reach one set of bounds (top/bottom or right/left), then letter-/pillar-box

• Aspect Fill — honoring aspect ratio, scale to reach both sets of bounds, allowing contents to be clipped if needed

Original 16:9 frame, shown under each gravity setting:

• WKVideoGravity.ResizeAspect (the default for WKInterfaceMovie.videoGravity)

• WKVideoGravity.ResizeAspectFill

• WKVideoGravity.Resize (please never do this)

WKInterfaceController

• Media playback and recording methods provided by the base controller class

• Can use these to play video whenever the app decides it’s time to do so

- (void)presentMediaPlayerControllerWithURL:(NSURL *)URL
                                    options:(nullable NSDictionary *)options
                                 completion:(void (^)(BOOL didPlayToEnd,
                                                      NSTimeInterval endTime,
                                                      NSError * __nullable error))completion WK_AVAILABLE_WATCHOS_ONLY(2.0);

- (void)dismissMediaPlayerController WK_AVAILABLE_WATCHOS_ONLY(2.0);

The options dictionary of presentMediaPlayerControllerWithURL:options:completion: takes a video gravity key, whose value is one of the WKVideoGravity constants.
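A minimal Swift sketch of the call; movieURL is hypothetical, and I'm assuming WKMediaPlayerControllerOptionsVideoGravityKey as the gravity key:

presentMediaPlayerControllerWithURL(movieURL,
    options: [WKMediaPlayerControllerOptionsVideoGravityKey:
              WKVideoGravity.ResizeAspect.rawValue]) { didPlayToEnd, endTime, error in
    // didPlayToEnd is false if the user dismissed playback early
    NSLog("didPlayToEnd: \(didPlayToEnd), endTime: \(endTime), error: \(error)")
}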

Video Considerations

1280x720 = 921,600 pixels
320x180 = 57,600 pixels

1/16 the size!

Encoding Guidelines

• Video Codec: H.264 High profile

• Bitrate: 160 kbps, 30 frames/sec

• Size: 320x180 (landscape), 208x260 (portrait)

• Audio: 32 kbps

160 kbps video: 93% smaller file size!

WKInterfaceController Audio

• Playback works just like video

• Same recommendation for audio bitrate: 32 kbps

• Audio playback always uses Bluetooth headphones/speakers if paired, otherwise internal speaker

Audio Recording

- (void)presentAudioRecordingControllerWithOutputURL:(NSURL *)URL
                                              preset:(WKAudioRecordingPreset)preset
                                     maximumDuration:(NSTimeInterval)maximumDuration
                                         actionTitle:(nullable NSString *)actionTitle
                                          completion:(void (^)(BOOL didSave,
                                                               NSError * __nullable error))completion WK_AVAILABLE_WATCHOS_ONLY(2.0);

- (void)dismissAudioRecordingController WK_AVAILABLE_WATCHOS_ONLY(2.0);

presentAudioRecordingControllerWithOutputURL:

• actionTitle — A string to use in an “end recording” button once audio capture is underway

• preset — A WKAudioRecordingPreset quality preset:

• NarrowBandSpeech (8 kHz sampling, 24 kbps AAC)

• WideBandSpeech (16 kHz sampling, 32 kbps AAC)

• HighQualityAudio (44.1 kHz sampling, 96 kbps AAC)
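Putting those parameters together, a minimal Swift sketch; recordingURL, the duration, and the "Send" title are assumptions:

presentAudioRecordingControllerWithOutputURL(recordingURL,
    preset: .WideBandSpeech,    // 16 kHz sampling, 32 kbps AAC
    maximumDuration: 30.0,
    actionTitle: "Send") { didSave, error in
    NSLog("didSave: \(didSave), error: \(error)")
}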

Audio Player

• “Headless” API for playing audio programmatically

• Assumes app provides own UI, or doesn’t have one

WKAudioFileAsset

+ (instancetype)assetWithURL:(NSURL *)URL;

+ (instancetype)assetWithURL:(NSURL *)URL title:(nullable NSString *)title albumTitle:(nullable NSString *)albumTitle artist:(nullable NSString *)artist;

WKAudioFilePlayerItem

• status — .Unknown, .ReadyToPlay, .Failed

• Notifications — Time jumped, Played to End, Failed to Play to End

+ (WKAudioFilePlayerItem *)playerItemWithAsset:(WKAudioFileAsset *)asset;

@property (nonatomic, readonly) WKAudioFileAsset *asset;
@property (nonatomic, readonly) WKAudioFilePlayerItemStatus status;
@property (nonatomic, readonly, nullable) NSError *error;
@property (nonatomic, readonly) NSTimeInterval currentTime;

WKAudioFilePlayer

+ (instancetype)playerWithPlayerItem:(WKAudioFilePlayerItem *)item;

- (void)play;
- (void)pause;

- (void)replaceCurrentItemWithPlayerItem:(nullable WKAudioFilePlayerItem *)item;

@property (nonatomic, readonly, nullable) WKAudioFilePlayerItem *currentItem;
@property (nonatomic, readonly) WKAudioFilePlayerStatus status;
@property (nonatomic, readonly, nullable) NSError *error;
@property (nonatomic) float rate;
@property (nonatomic, readonly) NSTimeInterval currentTime;

WKAudioFileQueuePlayer

@interface WKAudioFileQueuePlayer : WKAudioFilePlayer

+ (instancetype)queuePlayerWithItems:(NSArray<WKAudioFilePlayerItem *> *)items;

- (void)advanceToNextItem;
- (void)appendItem:(WKAudioFilePlayerItem *)item;
- (void)removeItem:(WKAudioFilePlayerItem *)item;
- (void)removeAllItems;

@property (nonatomic, readonly) NSArray<WKAudioFilePlayerItem *> *items;

@end
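A minimal sketch of the headless playback path in Swift, assuming fileURL points at an audio file already on the watch:

let asset = WKAudioFileAsset(URL: fileURL)
let item = WKAudioFilePlayerItem(asset: asset)
let player = WKAudioFilePlayer(playerItem: item)
// real code should wait for item.status == .ReadyToPlay, and watch for the
// Failed to Play to End notification, before trusting this to be audible
player.play()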

And that’s it!

Video Killed The Rolex Star

Chris Adamson • @invalidname CocoaConf San Jose • November, 2015

Slides available at slideshare.net/invalidname Code (eventually) at github.com/invalidname

Now wait a darn minute!

AVFoundation, Core Image, and Core Audio are huge and complex, but required for lots of app types. Will any audio, video, or image APIs be available? Will they only be possible through limited, high-level interfaces?

http://www.marco.org/2015/05/28/watch-sdk-questions

Available System Technologies

Extensions built specifically for watchOS 2 have access to the following system frameworks:

ClockKit, Contacts, Core Data, Core Foundation, Core Graphics, Core Location, Core Motion, EventKit, Foundation, HealthKit, HomeKit, ImageIO, MapKit, Mobile Core Services, PassKit, Security, Watch Connectivity, WatchKit

Notice the absence of AV Foundation, Core Audio, Core Media, and Core Video

From watchOS 2.0 Transition Guide

You must implement your extension using the frameworks in the watchOS SDK instead of the iOS SDK. For any features not available in the provided frameworks, you must rely on your iPhone app to perform the corresponding task.

overcast.fm

AUGraph

AUFilePlayer → AURemoteIO

AVAudioEngine is conceptually similar, but can’t do a step we need later, so this is the Core Audio approach

AUGraph

AUFilePlayer → AUNewTimePitch → AURemoteIO

Offline AUGraphs

AUFilePlayer → AUNewTimePitch → AUGenericOutput

AudioUnitRender() + ExtAudioFileWrite()

OSStatus timeShift(NSURL *inSourceURL, NSURL *inDestinationURL, float inSpeed) {
    OSStatus err = noErr;

    // create graph
    AUGraph auGraph;
    err = NewAUGraph(&auGraph);
    if (err != noErr) { goto fail; } // goto fail, go directly to fail...
    AudioComponentDescription compDesc = {0};

    // file player
    NSLog(@"Making AUFilePlayer");
    AUNode filePlayerNode;
    AudioUnit filePlayerUnit;
    compDesc.componentType = kAudioUnitType_Generator;
    compDesc.componentSubType = kAudioUnitSubType_AudioFilePlayer;
    compDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    err = AUGraphAddNode(auGraph, &compDesc, &filePlayerNode);
    if (err != noErr) { goto fail; } // goto fail, go directly to fail...
    err = AUGraphNodeInfo(auGraph, filePlayerNode, NULL, &filePlayerUnit);
    if (err != noErr) { goto fail; } // goto fail, go directly to fail...

    // AUNewTimePitch
    NSLog(@"Making AUNewTimePitch");
    AUNode timePitchNode;
    AudioUnit timePitchUnit;
    memset(&compDesc, 0, sizeof(compDesc));
    compDesc.componentType = kAudioUnitType_FormatConverter;
    compDesc.componentSubType = kAudioUnitSubType_NewTimePitch;
    compDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // and another 100 lines or so of this!

?

Watch Connectivity

The Watch Connectivity framework (WatchConnectivity.framework) provides a two-way communications conduit between an iOS app and a WatchKit extension on a paired Apple Watch. Apps use this framework to pass files and data back and forth. Most transfers happen in the background when the receiving app is inactive. When the app wakes up, it is notified of any data that arrived while it was inactive. Live communication is also possible when both apps are active.

Demo (beginning)

Phone: Activate WCSession

if WCSession.isSupported() {
    WCSession.defaultSession().delegate = self
    WCSession.defaultSession().activateSession()
    NSLog("iPhone WCSession ready")
}

Watch: Activate WCSession

if WCSession.isSupported() {
    WCSession.defaultSession().delegate = self
    WCSession.defaultSession().activateSession()
    NSLog("Watch WCSession ready")
}

Phone: transfer file

let transfer = WCSession.defaultSession().transferFile(url, metadata: nil)
NSLog("transferring: \(transfer)")

//MARK: WatchConnectivity delegate
func session(session: WCSession,
             didFinishFileTransfer fileTransfer: WCSessionFileTransfer,
             error: NSError?) {
    NSLog("didFinishFileTransfer: \(fileTransfer), error: \(error)")
}

Watch: receive file

func session(session: WCSession, didReceiveFile file: WCSessionFile) {
    NSLog("didReceiveFile: \(file)")
    let docsURL = NSFileManager.defaultManager().URLsForDirectory(
        NSSearchPathDirectory.DocumentDirectory,
        inDomains: NSSearchPathDomainMask.UserDomainMask).first
    let storageURL = docsURL?.URLByAppendingPathComponent(
        file.fileURL.lastPathComponent!)
    do {
        try NSFileManager.defaultManager().copyItemAtURL(file.fileURL,
                                                         toURL: storageURL!)
        pushControllerWithName("player", context: storageURL)
    } catch let error as NSError {
        NSLog("copy error: \(error)")
    }
}

file: If you want to keep the file referenced by this parameter, you must move it synchronously to a new location during your implementation of this method. If you do not move the file, the system deletes it after this method returns.

Demo (continued!)

What about video?

AVAssetExportSession

Export Preset Names for Apple Devices

You use these export options to produce files that can be played on the specific Apple devices.

let AVAssetExportPresetAppleM4VCellular: String
let AVAssetExportPresetAppleM4ViPod: String
let AVAssetExportPresetAppleM4V480pSD: String
let AVAssetExportPresetAppleM4VAppleTV: String
let AVAssetExportPresetAppleM4VWiFi: String
let AVAssetExportPresetAppleM4V720pHD: String
let AVAssetExportPresetAppleM4V1080pHD: String
let AVAssetExportPresetAppleProRes422LPCM: String
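On the iPhone, the preset path is only a few lines, as in this sketch (asset and outputURL are assumed). The catch: none of these presets targets a watch-sized frame like 320x180, which is why AVAssetWriter comes next.

if let export = AVAssetExportSession(asset: asset,
        presetName: AVAssetExportPresetAppleM4VCellular) {
    export.outputURL = outputURL
    export.outputFileType = AVFileTypeAppleM4V
    export.exportAsynchronouslyWithCompletionHandler {
        NSLog("status: \(export.status.rawValue), error: \(export.error)")
    }
}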

AVAssetWriter

• Low-level access for writing media files

• Allows you to specify output size, encoding settings, bitrate, etc.

• Requires you to write each CMSampleBuffer individually

AVAssetWriterInput

- initWithMediaType:outputSettings:sourceFormatHint:

Video Settings

These constants define dictionary keys for configuring video compression settings for video assets:

let AVVideoCodecKey: String
let AVVideoCodecH264: String
let AVVideoCodecJPEG: String
let AVVideoCodecAppleProRes4444: String
let AVVideoCodecAppleProRes422: String
let AVVideoWidthKey: String
let AVVideoHeightKey: String
let AVVideoCompressionPropertiesKey: String
let AVVideoAverageBitRateKey: String
let AVVideoQualityKey: String
let AVVideoMaxKeyFrameIntervalKey: String
let AVVideoProfileLevelKey: String
let AVVideoProfileLevelH264Baseline30: String
let AVVideoProfileLevelH264Baseline31: String
let AVVideoProfileLevelH264Baseline41: String
let AVVideoProfileLevelH264Main30: String
let AVVideoProfileLevelH264Main31: String
let AVVideoProfileLevelH264Main32: String
let AVVideoProfileLevelH264Main41: String
let AVVideoProfileLevelH264High40: String
let AVVideoProfileLevelH264High41: String
let AVVideoPixelAspectRatioKey: String
let AVVideoPixelAspectRatioHorizontalSpacingKey: String
let AVVideoPixelAspectRatioVerticalSpacingKey: String
let AVVideoCleanApertureKey: String
let AVVideoCleanApertureWidthKey: String
let AVVideoCleanApertureHeightKey: String
let AVVideoCleanApertureHorizontalOffsetKey: String
let AVVideoCleanApertureVerticalOffsetKey: String
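To hit the earlier encoding guidelines, the output settings might look like this sketch (values taken from the guidelines above; the variable names are mine):

let videoSettings: [String: AnyObject] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 320,
    AVVideoHeightKey: 180,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 160_000,                      // 160 kbps
        AVVideoProfileLevelKey: AVVideoProfileLevelH264High40   // High profile
    ]
]
let videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                    outputSettings: videoSettings)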

Audio path: AVAssetReaderOutput → AVAssetWriterInput (output settings to change bitrate)

Video path: AVAssetReaderOutput → AVAssetWriterInput (output settings to change size, encoding, bitrate)
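Sketching the video path end to end in Swift (iPhone side; all names are assumptions, and error handling plus the parallel audio path are omitted):

import AVFoundation

func transcodeForWatch(sourceURL: NSURL, destinationURL: NSURL,
                       videoSettings: [String: AnyObject]) throws {
    let asset = AVURLAsset(URL: sourceURL)
    let videoTrack = asset.tracksWithMediaType(AVMediaTypeVideo).first!

    // the reader decodes to pixel buffers so the writer can re-encode at watch size/bitrate
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
            Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)])
    reader.addOutput(readerOutput)

    let writer = try AVAssetWriter(URL: destinationURL, fileType: AVFileTypeMPEG4)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo,
                                         outputSettings: videoSettings)
    writer.addInput(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSessionAtSourceTime(kCMTimeZero)

    // pull samples from the reader and push them to the writer until the track runs dry
    let queue = dispatch_queue_create("transcode", DISPATCH_QUEUE_SERIAL)
    writerInput.requestMediaDataWhenReadyOnQueue(queue) {
        while writerInput.readyForMoreMediaData {
            if let buffer = readerOutput.copyNextSampleBuffer() {
                writerInput.appendSampleBuffer(buffer)
            } else {
                writerInput.markAsFinished()
                writer.finishWritingWithCompletionHandler {
                    NSLog("finished: \(writer.status.rawValue)")
                }
                break
            }
        }
    }
}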

Takeaways

• Basic support for file-based audio/video playback and audio recording

• Playback files are either in your bundle or downloaded by your iOS app + extension

• Any downloading or media processing needs to be performed on your iPhone, then sent to watch extension/app via Watch Connectivity

Video Killed The Rolex Star

Chris Adamson • @invalidname CocoaConf San Jose • November, 2015

Slides will be at slideshare.net/invalidname Code (eventually, maybe) at github.com/invalidname