Swifter Swift Image Processing With GPUImage
I'm a big fan of Apple's Core Image technology: my Nodality application is based entirely around Core Image filters. However, for new users, the code for adding a simple filter to an image is a little oblique and the implementation is very "stringly" typed.
This post looks at an alternative, GPUImage from Brad Larson. GPUImage is a framework containing a rich set of image filters, many of which aren't in Core Image. It has a far simpler and more strongly typed API and, in some cases, is faster than Core Image.
To kick off, let's look at the code required to apply a Gaussian blur to an image (inputImage) using Core Image:
let inputImage = UIImage()
let ciContext = CIContext(options: nil)

let blurFilter = CIFilter(name: "CIGaussianBlur")
blurFilter.setValue(CIImage(image: inputImage), forKey: "inputImage")
blurFilter.setValue(10, forKey: "inputRadius")

let outputImageData = blurFilter.valueForKey("outputImage") as CIImage!
let outputImageRef: CGImage = ciContext.createCGImage(outputImageData, fromRect: outputImageData.extent())
let outputImage = UIImage(CGImage: outputImageRef)!
...not only do we need to explicitly define a context, but both the filter name and its parameter keys are strings, and a few extra steps are needed to convert the filter's output into a UIImage.
Here's the same functionality using GPUImage:
let inputImage = UIImage()

let blurFilter = GPUImageGaussianBlurFilter()
blurFilter.blurRadiusInPixels = 10

let outputImage = blurFilter.imageByFilteringImage(inputImage)
Here, both the filter and its blur radius parameter are properly typed and the filter returns a UIImage instance.
On the flip side, there is some setting up to do. Once you've got a local copy of GPUImage, drag the framework project into your application's project. Then, under the application target's build phases, add a target dependency, a reference to GPUImage.framework under "Link Binary With Libraries," and a Copy Files phase.
Your build phases screen should look like this:
Then, by simply importing GPUImage, you're ready to roll.
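Importing the framework is a single line at the top of any Swift file that uses it:

import GPUImage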
To show off some of the funkier filters contained in GPUImage, I've created a little demonstration app, GPUImageDemo.
The app demonstrates Polar Pixellate, Polka Dot, Sketch, Threshold Sketch, Toon, Smooth Toon, Emboss, Sphere Refraction and Glass Sphere - none of which are available in Core Image.
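As a quick illustration of how these filters are used (this is a sketch rather than code from the demo app, and the parameter values and "photo.jpg" asset name are placeholders), the sphere refraction filter follows the same pattern as the blur example above, with strongly typed properties:

import UIKit
import GPUImage

// Illustrative only: the values below are placeholders, not the
// settings used in GPUImageDemo.
let inputImage = UIImage(named: "photo.jpg")!   // hypothetical asset name

let sphereFilter = GPUImageSphereRefractionFilter()
sphereFilter.radius = 0.5             // fraction of the image the sphere occupies
sphereFilter.refractiveIndex = 0.71

let sphereImage = sphereFilter.imageByFilteringImage(inputImage)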
The filtering work is all done in my GPUImageDelegate class, where a switch statement declares a GPUImageOutput variable (the class that includes the imageByFilteringImage() method) and sets it to the appropriate concrete filter class depending on the user interface.
For example, if the picker is set to Threshold Sketch, the following case statement is executed:
case ImageFilter.ThresholdSketch:
    gpuImageFilter = GPUImageThresholdSketchFilter()

    if let gpuImageFilter = gpuImageFilter as? GPUImageThresholdSketchFilter {
        if values.count > 1 {
            gpuImageFilter.edgeStrength = values[0]
            gpuImageFilter.threshold = values[1]
        }
    }
If you build this project, you may encounter a build error on the documentation target. I've simply deleted this target on affected machines.
GPUImage is fast enough to filter video. I've taken my recent two million particles experiment and added a post-processing step that consists of a cartoon filter and an emboss filter. These are packaged together in a GPUImageFilterGroup:
let toonFilter = GPUImageSmoothToonFilter()
let embossFilter = GPUImageEmbossFilter()
let filterGroup = GPUImageFilterGroup()

toonFilter.threshold = 1
embossFilter.intensity = 2

filterGroup.addFilter(toonFilter)
filterGroup.addFilter(embossFilter)

toonFilter.addTarget(embossFilter)

filterGroup.initialFilters = [ toonFilter ]
filterGroup.terminalFilter = embossFilter
Since GPUImageFilterGroup extends GPUImageOutput, I can take the output from the Metal texture, create a UIImage instance from it and pass it to the composite filter:
self.imageView.image = self.filterGroup.imageByFilteringImage(UIImage(CGImage: imageRef)!)
On my iPad Air 2, the final result of 2,000,000 particles with a two-filter post process on a 1,024 x 1,024 image still runs at around 20 frames per second. Here's a real-time screen capture:
The source code for my GPUImageDemo is available at my GitHub repository here and GPUImage lives here.
Published at DZone with permission of Simon Gladman, DZone MVB. See the original article here.