Swift Filter Chaining Demo Application
Off the back of my last Swift experiment, an image tone curve editor, I thought I'd go one step further and create an iPad app that allows users to build a chain of image filters.
My Filter Chaining Demo presents filters as a series of nodes inside a UICollectionView at the bottom of the screen. The first node, rendered as a circle, allows the user to select an image; subsequent nodes, rendered as squares, allow the user to select a filter and, in the middle panel, edit its parameters (using my numeric dials) or change its filter type.
At the top of the screen are two images: on the left, with a blue border, is a render of the filter chain up to the selected filter, and on the right, with a black border, is a render of the entire filter chain.
The state of the application is modelled with an array of UserDefinedFilter instances. These contain the filter type and the user-set values of the parameters for that filter. I've also used Swift's ability to overload operators to create a bespoke '==' operator. Since each UserDefinedFilter has a UUID constant, my new '==' looks like this:
```swift
func == (left: UserDefinedFilter, right: UserDefinedFilter) -> Bool
{
    return left.uuid == right.uuid
}
```
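For context, a minimal sketch of how UserDefinedFilter might be declared to support that overload; everything beyond the uuid constant and filter property described in the article is an assumption:

```swift
// A sketch of the UserDefinedFilter model — property names beyond
// uuid and filter are assumptions for illustration.
class UserDefinedFilter
{
    let uuid = NSUUID().UUIDString  // constant identity used by the == overload
    var filter: Filter?             // nil for the terminal image input/output nodes
}
```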
Each UserDefinedFilter has a Filter instance which contains an instance of a CIFilter. Filter also has an array of FilterParameter structures. For example, the Color Controls filter contains three FilterParameter instances for saturation, brightness and contrast.
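A hedged sketch of what those two types might look like, based on the description above (the property names and ranges are my assumptions, not the article's code):

```swift
// Hypothetical shapes for Filter and FilterParameter.
struct FilterParameter
{
    let name: String    // the CIFilter input key, e.g. "inputSaturation"
    var value: Float    // the user-set value, edited with a numeric dial
    let minValue: Float
    let maxValue: Float
}

struct Filter
{
    let ciFilter: CIFilter            // the wrapped Core Image filter
    var parameters: [FilterParameter] // e.g. saturation, brightness, contrast
}

// For example, the Color Controls filter (CIColorControls):
let colorControls = Filter(
    ciFilter: CIFilter(name: "CIColorControls"),
    parameters: [
        FilterParameter(name: "inputSaturation", value: 1, minValue: 0, maxValue: 2),
        FilterParameter(name: "inputBrightness", value: 0, minValue: -1, maxValue: 1),
        FilterParameter(name: "inputContrast",   value: 1, minValue: 0, maxValue: 4)])
```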
The code in the view controller contains three main components for the three sections: FiltersCollectionView contains the filter nodes, FilterParameterEditor contains a picker to change the filter and parameter values, and ImagePreview contains the two images. The filtering 'engine' is all done inside a separate class, FilteringDelegate.
The FiltersCollectionView component is actually a UIControl with a UICollectionView added to it as a subview. It acts as the UICollectionView's data source and delegate, so it has to implement two protocols: UICollectionViewDataSource and UICollectionViewDelegate. As a data source, the component returns the number of items in the user's array of filters and the class I want to use as an item renderer, FiltersCollectionViewCell.
Implementing a custom item renderer takes a few steps: just after instantiation, I have to register the renderer's class:
```swift
uiCollectionView = UICollectionView(frame: CGRectZero, collectionViewLayout: layout)
uiCollectionView.registerClass(FiltersCollectionViewCell.self,
    forCellWithReuseIdentifier: "Cell")
```
...and inside the data source's collectionView(_:cellForItemAtIndexPath:) method, I have to dequeue a cell of that class and inject the correct item into the renderer:
```swift
func collectionView(collectionView: UICollectionView,
    cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell
{
    let cell = collectionView.dequeueReusableCellWithReuseIdentifier("Cell",
        forIndexPath: indexPath) as FiltersCollectionViewCell

    cell.userDefinedFilter = userDefinedFilters[indexPath.item]

    return cell
}
```
FiltersCollectionViewCell extends UICollectionViewCell and comes with variables such as selected. So, using that along with a few properties of the user-defined filter being passed in, I set the colours and shape of the renderer internally:
```swift
func updateUI()
{
    label.textColor = selected ? UIColor.blueColor() : UIColor.lightGrayColor()
    backgroundColor = UIColor.whiteColor()

    if let userDefinedFilterConst = userDefinedFilter
    {
        layer.borderWidth = 2
        layer.cornerRadius = (userDefinedFilterConst.isImageInputNode || userDefinedFilterConst.isImageOutputNode)
            ? frame.width / 2
            : 10
        layer.borderColor = userDefinedFilterConst.isImageOutputNode
            ? UIColor.blackColor().CGColor
            : selected ? UIColor.blueColor().CGColor : UIColor.lightGrayColor().CGColor
    }
}
```
When the user selects an item in the collection view, it sends an action for the .ValueChanged control event which is picked up by the view controller. This sets its own selectedFilter property which, through a didSet observer, sets the filter on the FilterParameterEditor.
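That property observer might look something like this; a minimal sketch assuming the view controller holds a filterParameterEditor reference (the names here are illustrative, not from the article's source):

```swift
// The didSet observer pushes each new selection through to the editor.
var selectedFilter: UserDefinedFilter?
{
    didSet
    {
        filterParameterEditor.userDefinedFilter = selectedFilter
    }
}
```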
FilterParameterEditor contains both a picker to change the filter type and launches an image picker, so it implements four additional protocols: UIPickerViewDataSource, UIPickerViewDelegate, UINavigationControllerDelegate and UIImagePickerControllerDelegate. I would normally aim for classes to have fewer responsibilities than this, so this class is ripe for refactoring.
The FilterParameterEditor responds when its userDefinedFilter property is changed via, you guessed it, a didSet observer. If its userDefinedFilter has a filter (the terminal nodes don't), it creates the correct number of numeric dials - one for each of the filter's parameters. If the node happens to be the first, image-loading node, it creates a 'Load Image' button.
When any of the dials are changed by the user, FilterParameterEditor dispatches an action for .ValueChanged which, again, is picked up in the view controller and this is when the filtering magic begins.
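Because both components are UIControls, the view controller can observe them with plain target-action wiring; a sketch (the selector names are my assumptions):

```swift
// Target-action wiring in the view controller. Selectors are hypothetical.
filtersCollectionView.addTarget(self,
    action: "filterSelectionChanged",
    forControlEvents: .ValueChanged)

filterParameterEditor.addTarget(self,
    action: "filterParameterChanged",
    forControlEvents: .ValueChanged)
```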
The view controller has a constant instance of FilteringDelegate which exposes an applyFilters() function. This accepts all of the user-defined filters, the currently selected filter and a callback function that is invoked once the filtering is done. So, the view controller invokes that function like so:
```swift
filteringDelegate.applyFilters(userDefinedFilters,
    selectedUserDefinedFilter: selectedFilter!,
    imagesDidChange)
```
Using Tobias's Async library, I send all that, along with a reference to the CIContext, off to a background thread through applyFiltersAsync().
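The hand-off might be sketched as below, assuming the Async library's background/main queue helpers; the exact signature of applyFiltersAsync() inside FilteringDelegate is my assumption:

```swift
// A sketch of applyFilters(): filter on a background queue,
// invoke the callback on the main queue.
func applyFilters(userDefinedFilters: [UserDefinedFilter],
    selectedUserDefinedFilter: UserDefinedFilter,
    imagesDidChange: FilteredImages -> Void)
{
    Async.background
    {
        let images = self.applyFiltersAsync(userDefinedFilters,
            selectedUserDefinedFilter: selectedUserDefinedFilter)

        Async.main
        {
            imagesDidChange(images)
        }
    }
}
```

Doing the Core Image work off the main thread keeps the dials responsive while the chain re-renders.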
In a nutshell, applyFiltersAsync() loops over all the user-defined filters, and through all the filter parameters of each, and creates a chain of filters. When it reaches either the selected filter or the final filter, it does a little extra work and creates an actual UIImage instance that can be displayed on the screen:
```swift
if userDefinedFilter == selectedUserDefinedFilter || index == userDefinedFilters.count - 2
{
    let filteredImageRef = context.createCGImage(filteredImageData,
        fromRect: filteredImageData.extent())
    let filteredImage = UIImage(CGImage: filteredImageRef)

    if userDefinedFilter == selectedUserDefinedFilter
    {
        selectedImage = filteredImage
    }

    if index == userDefinedFilters.count - 2
    {
        finalImage = filteredImage
    }
}
```
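The chaining step itself can be sketched as follows: each filter's output becomes the next filter's input image. This is a simplified version under my own variable names, not the article's exact code:

```swift
// Chain the filters: feed each CIFilter's output into the next one's input.
var filteredImageData = CIImage(image: sourceImage)

for userDefinedFilter in userDefinedFilters
{
    if let filter = userDefinedFilter.filter
    {
        let ciFilter = filter.ciFilter

        ciFilter.setValue(filteredImageData, forKey: kCIInputImageKey)

        // apply the user-set parameter values...
        for parameter in filter.parameters
        {
            ciFilter.setValue(parameter.value, forKey: parameter.name)
        }

        // ...and pass the result along the chain
        filteredImageData = ciFilter.outputImage
    }
}
```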
These images are passed back to the view controller wrapped up in a FilteredImages struct via the callback function, which then hands those two images to the image preview widget to be displayed:
```swift
func imagesDidChange(images: FilteredImages)
{
    imagePreview.filteredImages = images
}
```
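FilteredImages presumably pairs the two renders described earlier; a guess at its shape:

```swift
// A hypothetical FilteredImages struct carrying both renders.
struct FilteredImages
{
    let selectedImage: UIImage // render up to the selected node (blue border)
    let finalImage: UIImage    // render of the whole chain (black border)
}
```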
Now we have an almost usable little application for creating complex chains of filters. However, there's still plenty of room for improvement. On my personal product backlog are:
- Removing loops over arrays in favour of a more functional approach
- Tidying up replicated code in the view controller and moving it into observers
- Implementing PHImageManager (see this article at NSHipster)
- Animating the transitions when adding and removing components such as numeric dials
- Using CoreData to persist application state
- Adding my Tone Curve widget, natch.
Published at DZone with permission of Simon Gladman, DZone MVB. See the original article here.