Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)

Posted on 13 January 2015

Description

Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.

Transcript of Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)

Core Image: The Most Fun API You’re Not Using

Chris Adamson • @invalidname CocoaConf Columbus, August 2014

“Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images.”

Agenda

• Images, Filters, and Contexts

• The Core Image Filter Gallery

• Neat Tricks with Built-In Filters

• Core Image on OS X

Core Image, Core Concepts

• Core Image is lazy: it describes work now and performs it later

• A chain of filters describes a “recipe” of processing steps to be applied to one or more images

• “Stringly typed”

• You only get pixels when you render

Typical Workflow

• Start with a source CIImage

• Apply one or more filters

• Render resulting CIImage to a CIContext, or convert CIImage out to another type

• A few filters take or produce types other than CIImage (CIQRCodeGenerator)
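As a minimal sketch of that round trip on iOS (sourceImage is an assumed UIImage, and CISepiaTone stands in for any filter):

CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];

CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:input forKey:kCIInputImageKey];
[sepia setValue:@0.8 forKey:kCIInputIntensityKey];
CIImage *output = [sepia valueForKey:kCIOutputImageKey];

// Pixels are only computed here, at render time (offscreen context)
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:output fromRect:[output extent]];
UIImage *rendered = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);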

CIImage

• An image provided to or produced by Core Image

• But no bitmap of pixel data!

• Immutable

• -imageByCroppingToRect:, -imageByApplyingTransform:

• -extent — a CGRect of the image’s size

CIImage sources

• NSURL

• CGImageRef

• Bitmap or JPEG/PNG/TIFF in NSData

• OpenGL texture

• Core Video image/pixel buffer
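In sketch form, one constructor per source; the variables here (imageURL, someImage, jpegData, textureName, texSize, pixelBuffer) are placeholders:

CIImage *fromURL     = [CIImage imageWithContentsOfURL:imageURL];
CIImage *fromCGImage = [CIImage imageWithCGImage:someImage.CGImage];
CIImage *fromData    = [CIImage imageWithData:jpegData];
CIImage *fromTexture = [CIImage imageWithTexture:textureName
                                             size:texSize
                                          flipped:YES
                                       colorSpace:nil];
CIImage *fromBuffer  = [CIImage imageWithCVPixelBuffer:pixelBuffer];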

CIContext

• Rendering destination for a CIImage (-[drawImage:inRect:fromRect:])

• This is where you get pixels (also, this is the processor-intensive part)

• On iOS, it must be created from an EAGLContext to draw to the screen; on OS X, it can be created from a CGContextRef

• Can also produce output as a CGImageRef, bitmap data, or a CVPixelBuffer (iOS only)


CIFilter

• Performs an image processing operation

• Typically takes and produces a CIImage

• All parameters are provided via -[setValue:forKey:]

• Stringly-typed!

• Output is retrieved with -[valueForKey:]
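Since everything is keyed by string, filters can describe themselves at runtime. A quick sketch of poking at one (note there is no compile-time checking of keys):

CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
NSLog(@"inputs: %@", [filter inputKeys]);       // inputImage, inputSaturation, ...
NSLog(@"attributes: %@", [filter attributes]);  // per-key types, defaults, min/max

[filter setValue:@1.2 forKey:@"inputSaturation"]; // misspell a key and you find out at runtime
CIImage *output = [filter valueForKey:kCIOutputImageKey];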

“I can has filterz?” –Core Image Cat

Yes, you can has Filterz!

Core Image Filter Reference

• Filter name and parameters: note the type and number to provide

• Categories: watch for CICategoryBuiltIn and CICategoryVideo

• Example figure

• Availability: watch for versioning and OS X-only filters

Filter Categories

• Group filters by functionality: CICategoryBlur, CICategoryGenerator, CICategoryCompositeOperation, etc.

• Also group filters by availability and appropriateness: CICategoryBuiltIn, CICategoryVideo, CICategoryNonSquarePixels
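The categories are queryable at runtime, which is a handy way to see exactly what a given OS version ships. A short sketch:

// Every built-in filter on this platform/version
NSArray *builtIn = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"%lu built-in filters", (unsigned long)[builtIn count]);

// Categories can be intersected, e.g. built-in filters that are video-safe
NSArray *videoSafe = [CIFilter filterNamesInCategories:
                      @[kCICategoryBuiltIn, kCICategoryVideo]];
NSLog(@"%lu video-safe filters", (unsigned long)[videoSafe count]);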

CICategoryGenerator

• No input image, just produces an output

• CICategoryGradient is also output-only

• Example: CICheckerboardGenerator

CICategoryBlur

• Algorithmically spreads/blends pixels

• CICategorySharpen offers an opposite effect

• Example: CIGaussianBlur
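One wrinkle worth knowing: CIGaussianBlur samples outward, so the output's extent is larger than the input's. A minimal sketch that crops back to the original rect (inputImage is an assumed CIImage):

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:inputImage forKey:kCIInputImageKey];
[blur setValue:@10.0 forKey:kCIInputRadiusKey];
CIImage *blurred = [[blur valueForKey:kCIOutputImageKey]
                    imageByCroppingToRect:[inputImage extent]];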

CICategoryColorAdjustment

• Changes distribution of color throughout an image

• Example: CIColorControls (adjusts saturation, brightness, contrast)

CICategoryColorEffect

• Color changes that affect the subjective nature of the image

• Example: CIPhotoEffectNoir

CICategoryDistortionEffect

• Moves pixels to achieve an effect

• Example: CITorusLensDistortion

CICategoryStylize

• Various stylistic effects

• Example: CIPointillize

CICategoryGeometryAdjustment

• Moves pixels via cropping, affine transforms, etc.

• Example: CICrop

CICategoryTileEffect

• Repeatedly copies all or part of an image

• Example: CIAffineTile

CICategoryCompositeOperation

• Combines multiple images

• Example: CISourceOverCompositing

Demo

Creating the CIColorControls filter, setting its input values, setting the input image, and getting the output image back as a UIImage.
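In sketch form, with the _colorControlsFilter ivar named as on the slides; the sliders, source image property, and image view are assumed stand-ins:

// Creating the filter (e.g., in -viewDidLoad)
_colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];

// Setting input values (e.g., from slider callbacks)
[_colorControlsFilter setValue:@(self.saturationSlider.value) forKey:@"inputSaturation"];
[_colorControlsFilter setValue:@(self.brightnessSlider.value) forKey:@"inputBrightness"];
[_colorControlsFilter setValue:@(self.contrastSlider.value)   forKey:@"inputContrast"];

// Setting the input image
CIImage *inputImage = [CIImage imageWithCGImage:self.sourceImage.CGImage];
[_colorControlsFilter setValue:inputImage forKey:kCIInputImageKey];

// Getting the output image
CIImage *ciImage = [_colorControlsFilter valueForKey:kCIOutputImageKey];
self.imageView.image = [UIImage imageWithCIImage:ciImage];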

Other output options

• Use a CIContext

• -[drawImage:inRect:fromRect:] draws pixels to the EAGLContext (iOS) or CGContextRef (OS X) that the CIContext was created from.

• CIContext can also render to a void* bitmap

• On iOS, can create a CVPixelBufferRef, typically used for writing to a file with AVAssetWriter

Chaining filters

• Use the output of one filter as the input to the next

• This doesn’t cost anything, because the CIImages just hold state, not pixels
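A sketch of the hand-off, assuming two already-configured filters named sepiaFilter and blurFilter:

[blurFilter setValue:[sepiaFilter valueForKey:kCIOutputImageKey]
              forKey:kCIInputImageKey];
CIImage *chained = [blurFilter valueForKey:kCIOutputImageKey]; // still no pixels yet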

Demo

Creating CIContext

The context is created from the GL context, clearing out the default color space.

Note: This is in a subclass of GLKView
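A sketch of that setup, with _ciContext as an assumed ivar name in the GLKView subclass:

// Lazily build a CIContext from the view's EAGLContext,
// clearing out the default color space for performance
if (!_ciContext) {
    _ciContext = [CIContext contextWithEAGLContext:self.context
                                           options:@{kCIContextWorkingColorSpace : [NSNull null]}];
}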

Set up the filters

The demo builds a Sepia Tone filter (_sepiaToneFilter); a Mask to Alpha filter (_circleMaskFilter), which turns the bundled circle-mask-100x100.png into an alpha mask (_circleMask); and a Blend with Mask filter (_blendWithMaskFilter), whose background is a constant-color fill (_backgroundAlphaFill) produced by a constant color generator (_constantColorGeneratorFilter).
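A sketch of that setup, using the ivar names from the slides; CIMaskToAlpha and CIBlendWithAlphaMask match the slide titles, and the transparent fill color is an assumption:

// Sepia Tone
_sepiaToneFilter = [CIFilter filterWithName:@"CISepiaTone"];
[_sepiaToneFilter setValue:@1.0 forKey:@"inputIntensity"];

// Mask to Alpha: turn the circle image into an alpha mask
UIImage *circleImageUI = [UIImage imageNamed:@"circle-mask-100x100.png"];
_circleMaskFilter = [CIFilter filterWithName:@"CIMaskToAlpha"];
[_circleMaskFilter setValue:[CIImage imageWithCGImage:circleImageUI.CGImage]
                     forKey:kCIInputImageKey];
_circleMask = [_circleMaskFilter valueForKey:kCIOutputImageKey];

// Constant color generator provides the background fill
_constantColorGeneratorFilter = [CIFilter filterWithName:@"CIConstantColorGenerator"];
[_constantColorGeneratorFilter setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0 alpha:0.0]
                                 forKey:@"inputColor"]; // assumed: fully transparent
_backgroundAlphaFill = [_constantColorGeneratorFilter valueForKey:kCIOutputImageKey];

// Blend with mask: foreground over background, through the circle mask
_blendWithMaskFilter = [CIFilter filterWithName:@"CIBlendWithAlphaMask"];
[_blendWithMaskFilter setValue:_backgroundAlphaFill forKey:@"inputBackgroundImage"];
[_blendWithMaskFilter setValue:_circleMask forKey:@"inputMaskImage"];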

Apply filters

The source image is wrapped in a CIImage, run through the sepia filter, and the sepia output is then set as the input to blend-with-mask.
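As a sketch (sourceCGImage stands in for wherever the demo gets its CGImageRef):

// Get CIImage from source image
CIImage *loupeImage = [CIImage imageWithCGImage:sourceCGImage];

// Apply sepia filter
[_sepiaToneFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [_sepiaToneFilter valueForKey:kCIOutputImageKey];

// Set sepia-filtered image as input to blend-with-mask
[_blendWithMaskFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [_blendWithMaskFilter valueForKey:kCIOutputImageKey];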

Render in CIContext

The drawing bounds are converted from points to pixels (the GL-on-Retina fix), drawing to the CIContext draws to the EAGLContext it's based on, and the GLKView's contents are refreshed immediately.
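A sketch of the drawing code, continuing with the _ciContext and loupeImage names from above:

// Make sure the view's EAGLContext is current
if ([EAGLContext currentContext] != self.context) {
    [EAGLContext setCurrentContext:self.context];
}

// GL-on-Retina fix: CIContext draws in pixels, not points
CGRect drawBoundsInPoints = self.bounds;
CGRect drawBoundsInPixels =
    CGRectMake(0.0, 0.0,
               drawBoundsInPoints.size.width  * self.contentScaleFactor,
               drawBoundsInPoints.size.height * self.contentScaleFactor);

// Drawing to CIContext draws to the EAGLContext it's based on
[_ciContext drawImage:loupeImage
               inRect:drawBoundsInPixels
             fromRect:[loupeImage extent]];

// Refresh GLKView contents immediately
[self display];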

Working with Video

• AVFoundation AVCaptureVideoDataOutput and AVAssetReader deliver CMSampleBuffers

• CMSampleBuffers have timing information and CVImageBuffers/CVPixelBuffers

• +[CIImage imageWithCVPixelBuffer:]
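A sketch of the delegate callback that feeds frames to Core Image (the filter chain comes in the demo below):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef cvBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *bufferCIImage = [CIImage imageWithCVPixelBuffer:cvBuffer];
    // ...run bufferCIImage through the filter chain, then render
}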

Demo

Chroma Key (“green screen”) recipe

• Use a CIColorCube to map green-ish colors to transparent

• Use CISourceOverCompositing to draw this alpha’ed image over another image

CIColorCube

Maps colors from one RGB “cube” to another

http://en.wikipedia.org/wiki/RGB_color_space

Using CIColorCube

CIColorCube maps green(-ish) colors to 0.0 alpha, all other colors pass through

CISourceOverCompositing

Takes an inputImage and an inputBackgroundImage and produces an outputImage, with the input drawn over the background.

CIColorCube Data

// Thresholds for the keyed-out color; these example values bracket green
// (hue 120° is about 0.33 after normalization) and are illustrative only
float minHueAngle = 0.3f, maxHueAngle = 0.4f;
float minSaturation = 0.2f, maxSaturation = 1.0f;
float minBrightness = 0.2f, maxBrightness = 1.0f;

const unsigned int size = 64;
size_t cubeDataSize = size * size * size * sizeof(float) * 4;
float *keyCubeData = (float *)malloc(cubeDataSize);
float rgb[3], hsv[3], *keyC = keyCubeData;

// Populate cube with a simple gradient going from 0 to 1
for (int z = 0; z < size; z++) {
    rgb[2] = ((double)z)/(size-1); // Blue value
    for (int y = 0; y < size; y++) {
        rgb[1] = ((double)y)/(size-1); // Green value
        for (int x = 0; x < size; x++) {
            rgb[0] = ((double)x)/(size-1); // Red value

            // Convert RGB to HSV
            // You can find publicly available rgbToHSV functions on the Internet
            RGBtoHSV(rgb[0], rgb[1], rgb[2], &hsv[0], &hsv[1], &hsv[2]);

            // RGBtoHSV uses 0 to 360 for hue; normalize to 0 to 1
            // to match the thresholds above
            hsv[0] /= 360.0;

            // Use the hue value to determine which colors to make transparent.
            // The minimum and maximum hue angle depends on
            // the color you want to remove
            bool keyed = (hsv[0] > minHueAngle && hsv[0] < maxHueAngle) &&
                         (hsv[1] > minSaturation && hsv[1] < maxSaturation) &&
                         (hsv[2] > minBrightness && hsv[2] < maxBrightness);

            float alpha = keyed ? 0.0f : 1.0f;

            // Re-calculate the cube pointer: 4 floats (RGBA) per entry
            keyC = keyCubeData + (((z * size * size) + (y * size) + x) * 4);

            // Calculate premultiplied alpha values for the cube
            keyC[0] = rgb[0] * alpha;
            keyC[1] = rgb[1] * alpha;
            keyC[2] = rgb[2] * alpha;
            keyC[3] = alpha;
        }
    }
}

See “Chroma Key Filter Recipe” in Core Image Programming Guide

Create CIColorCube from mapping data

// build the color cube filter and set its data to above
self.colorCubeFilter = [CIFilter filterWithName:@"CIColorCube"];
[self.colorCubeFilter setValue:[NSNumber numberWithInt:size]
                        forKey:@"inputCubeDimension"];
NSData *data = [NSData dataWithBytesNoCopy:keyCubeData
                                    length:cubeDataSize
                              freeWhenDone:YES];
[self.colorCubeFilter setValue:data forKey:@"inputCubeData"];

Create CISourceOverCompositing

// source over filter
self.backgroundImage = [UIImage imageNamed:@"img_washington_small_02.jpg"];
self.backgroundCIImage = [CIImage imageWithCGImage:self.backgroundImage.CGImage];
self.sourceOverFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[self.sourceOverFilter setValue:self.backgroundCIImage
                     forKeyPath:@"inputBackgroundImage"];

Apply Filters in Capture Callback

CIImage *bufferCIImage = [CIImage imageWithCVPixelBuffer:cvBuffer];

[self.colorCubeFilter setValue:bufferCIImage
                        forKey:kCIInputImageKey];
CIImage *keyedCameraImage = [self.colorCubeFilter valueForKey:kCIOutputImageKey];

[self.sourceOverFilter setValue:keyedCameraImage
                     forKeyPath:kCIInputImageKey];

CIImage *compositedImage = [self.sourceOverFilter valueForKeyPath:kCIOutputImageKey];

Then draw compositedImage to CIContext as before

Other Points of Interest

• CIQRCodeGenerator filter — converts data (e.g., a string) to a QR Code

• CILenticularHaloGenerator filter — aka, lens flare

• CIDetector — Class (not a filter) to find features in images. Currently only supports face finding (returned as an array of CIFeatures). Optionally detects smiles and eye blinks within faces.

• CIImage has a red-eye enhancement that takes the array of face CIFeatures to tell it where to apply the effect
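A sketch of QR generation; the message string and correction level here are illustrative:

CIFilter *qrFilter = [CIFilter filterWithName:@"CIQRCodeGenerator"];
NSData *message = [@"http://subfurther.com/blog"
                   dataUsingEncoding:NSISOLatin1StringEncoding];
[qrFilter setValue:message forKey:@"inputMessage"];
[qrFilter setValue:@"M" forKey:@"inputCorrectionLevel"]; // L, M, Q, or H
CIImage *qrImage = [qrFilter valueForKey:kCIOutputImageKey];

// The raw output is tiny (one pixel per module); scale it up before rendering
CIImage *bigQR = [qrImage imageByApplyingTransform:
                  CGAffineTransformMakeScale(10.0, 10.0)];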

Core Image on OS X

• Core Image is part of QuartzCore (or Image Kit), so you don’t @import CoreImage

• Many more filters are available

• Can create your own filter with OpenGL Shading Language (plus some CI extensions). See CIKernel.

• Custom kernels are also available in iOS 8

• Filters can be set on CALayers

CALayer Filters on OS X

• Views must be layer-backed (obviously)

• Must also call -[NSView setLayerUsesCoreImageFilters:] on 10.9+

• CALayer has properties: filters, compositingFilter, backgroundFilters, minificationFilter, magnificationFilter

• These exist on iOS, but do nothing

Demo

Adding CIPixellate to layer’s filters

The demo creates a CIPixellate filter with default values and sets it into the layer's filters array.

Updating a layer’s filters

An update method changes the filter's parameter in place; because the layer copies its filters array, the change has to go through -setValue:forKeyPath: on the layer itself.
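A sketch of both steps, assuming an NSImageView outlet and a slider action (names hypothetical):

- (void)awakeFromNib {
    // View must be layer-backed, and on 10.9+ must opt in to CI filters
    self.imageView.wantsLayer = YES;
    self.imageView.layerUsesCoreImageFilters = YES;

    CIFilter *pixellate = [CIFilter filterWithName:@"CIPixellate"];
    [pixellate setDefaults];
    pixellate.name = @"pixellate"; // names the filter so the key path below can reach it
    self.imageView.layer.filters = @[pixellate];
}

- (IBAction)scaleSliderChanged:(NSSlider *)sender {
    // The layer copied our filter, so update it via key path on the layer
    [self.imageView.layer setValue:@([sender doubleValue])
                        forKeyPath:@"filters.pixellate.inputScale"];
}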

Wrap Up: Stuff to Remember

• Get psyched about filters, but remember to check that they’re on your targeted platform/version.

• Drawing to a CIContext on iOS must be GL-backed (e.g., with a GLKView)

Q&A

Slides and code will be posted to:

http://www.slideshare.net/invalidname/

@invalidname http://subfurther.com/blog