# A Core Image Transverse Chromatic Aberration Filter in Swift

### Take a peek at some more image trickery from Simon Gladman as he takes you through his kernel line-by-line.

Transverse or lateral chromatic aberration is an optical artifact caused by different wavelengths of light focussing at different positions on a camera's focal plane. It appears as blue and purple fringing which increases towards the edge of an image. In addition to the Wikipedia entry on chromatic aberration, there's a great article at Photography Life which discusses the phenomenon in detail.

Although it's the bane of many photographers, I've created a Core Image filter that simulates the effect, which can be used to give images a lo-fi, grungy look. The basic mechanics of the filter are pretty simple: it essentially consists of three zoom blurs, each with a slightly different offset for red, green, and blue. The technique borrows from the Session 515: Developing Core Image Filters for iOS talk at WWDC 2014.
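For intuition, the three-zoom-blur idea can be approximated with built-in filters before writing any kernel code. This is a rough sketch of the concept, not the filter developed below: it zoom-blurs the image three times with increasing amounts, isolates one channel from each result with `CIColorMatrix`, and adds the channels back together. The function name and parameter choices are mine.

```swift
import CoreImage

// Rough approximation using built-in filters: three zoom blurs of
// increasing strength, one per color channel, summed back together.
func approximateChromaticAberration(_ image: CIImage, amount: CGFloat) -> CIImage? {
    let center = CIVector(x: image.extent.midX, y: image.extent.midY)

    // Zoom-blur the whole image from its center by the given amount.
    func zoomed(_ amount: CGFloat) -> CIImage? {
        CIFilter(name: "CIZoomBlur", parameters: [
            kCIInputImageKey: image,
            kCIInputCenterKey: center,
            kCIInputAmountKey: amount])?.outputImage
    }

    // Keep only one color channel of an image, zeroing the others.
    func channel(_ image: CIImage?, r: CGFloat, g: CGFloat, b: CGFloat) -> CIImage? {
        image?.applyingFilter("CIColorMatrix", parameters: [
            "inputRVector": CIVector(x: r, y: 0, z: 0, w: 0),
            "inputGVector": CIVector(x: 0, y: g, z: 0, w: 0),
            "inputBVector": CIVector(x: 0, y: 0, z: b, w: 0)])
    }

    // Red moves least, blue most - mirroring the kernel's offsets.
    guard let red   = channel(zoomed(amount * 1.0), r: 1, g: 0, b: 0),
          let green = channel(zoomed(amount * 1.5), r: 0, g: 1, b: 0),
          let blue  = channel(zoomed(amount * 2.0), r: 0, g: 0, b: 1) else { return nil }

    return red
        .applyingFilter("CIAdditionCompositing", parameters: [kCIInputBackgroundImageKey: green])
        .applyingFilter("CIAdditionCompositing", parameters: [kCIInputBackgroundImageKey: blue])
}
```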

Let's look at the general kernel I've written for the filter and then step through it line by line:

```swift
let transverseChromaticAberrationKernel = CIKernel(string:
"kernel vec4 chromaticAberrationFunction(sampler image, vec2 size, float sampleCount, float start, float blur) {" +
"  int sampleCountInt = int(floor(sampleCount));" + // 1
"  vec4 accumulator = vec4(0.0);" + // 2
"  vec2 dc = destCoord(); " + // 3
"  float normalisedValue = length(((dc / size) - 0.5) * 2.0);" + // 4
"  float strength = clamp((normalisedValue - start) * (1.0 / (1.0 - start)), 0.0, 1.0); " + // 5

"  vec2 vector = normalize((dc - (size / 2.0)) / size);" + // 6
"  vec2 velocity = vector * strength * blur; " + // 7

"  vec2 redOffset = -vector * strength * (blur * 1.0); " + // 8
"  vec2 greenOffset = -vector * strength * (blur * 1.5); " + // 8
"  vec2 blueOffset = -vector * strength * (blur * 2.0); " + // 8

"  for (int i=0; i < sampleCountInt; i++) { " + // 9
"      accumulator.r += sample(image, samplerTransform (image, dc + redOffset)).r; " + // 10
"      redOffset -= velocity / sampleCount; " + // 11

"      accumulator.g += sample(image, samplerTransform (image, dc + greenOffset)).g; " +
"      greenOffset -= velocity / sampleCount; " +

"      accumulator.b += sample(image, samplerTransform (image, dc + blueOffset)).b; " +
"      blueOffset -= velocity / sampleCount; " +
"  } " +
"  return vec4(vec3(accumulator / sampleCount), 1.0); " + // 12

"}")
```

1. Because Core Image Kernel Language only allows float scalar arguments and `sampleCount` needs to be an integer to construct a loop, I create an `int` version of it.
2. As the filter loops over samples, it will accumulate their red, green, and blue values into this vector. It's declared as a `vec4`, although only the red, green, and blue components are used.
3. `destCoord()` returns the position of the pixel currently being computed in the coordinate space of the image being rendered.
4. Although the filter can calculate the size of the image with `samplerSize()`, passing the size as an argument reduces the amount of processing the kernel needs to do. This line converts the coordinates to the range -1 to +1 for both axes and takes the length of the result, giving the pixel's normalised distance from the center.
5. `strength` is a normalized value that starts at zero at the beginning of the effect and reaches one at the edge of the image.
6. `vector` is the direction of the effect, which radiates from the center of the image. `normalize` scales the vector to unit length, so it carries only direction.
7. Multiplying the direction by the strength and by the maximum blur argument gives a velocity vector describing how much blur, and in what direction, the filter applies to the current pixel.
8. Transverse chromatic aberration affects different wavelengths of light by different amounts. The filter simulates this by offsetting the effect the least for red and the most for blue.
9. The filter loops `sampleCountInt` times, taking one sample per color channel on each pass.
10. For each color, the filter accumulates a sample offset along the direction of the vector, effectively summing the pixels along a radial line.
11. The offset for each color is decremented by a fraction of the velocity, so successive samples step along that radial line.
12. The accumulated colors are averaged and returned with an alpha value of 1.0.
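To run the kernel, it needs to be applied with an output extent and a region-of-interest callback. Here's a minimal sketch assuming the modern `apply(extent:roiCallback:arguments:)` API (the Swift 2-era equivalent was `applyWithExtent`); the function name and default parameter values are illustrative, and `transverseChromaticAberrationKernel` is the kernel defined above.

```swift
import CoreImage

// Illustrative wrapper around the kernel defined earlier.
func applyChromaticAberration(to image: CIImage,
                              sampleCount: CGFloat = 40,
                              start: CGFloat = 0.75,
                              blur: CGFloat = 20) -> CIImage? {
    guard let kernel = transverseChromaticAberrationKernel else { return nil }

    let extent = image.extent
    // vec2 arguments are passed to the kernel as CIVectors.
    let size = CIVector(x: extent.width, y: extent.height)

    // The ROI callback tells Core Image which region of the input the
    // kernel reads to produce a given output region. The samples reach
    // at most `blur * 2.0` pixels from the destination coordinate
    // (the blue offset), so pad the rect by that much.
    return kernel.apply(extent: extent,
                        roiCallback: { _, rect in
                            rect.insetBy(dx: -blur * 2.0, dy: -blur * 2.0)
                        },
                        arguments: [image, size, sampleCount, start, blur])
}
```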

The number of samples controls both the quality of the effect and the performance of the filter. With a large maximum blur of 20 but only 3 samples, the effect looks coarse and banded. With the same blur amount and 40 samples, the effect is a lot smoother. The falloff parameter controls where the effect begins: a value of 0.75 means the effect begins three-quarters of the distance from the center of the image to the edge.

### Core Image for Swift

There's a full `CIFilter` implementation of this filter in the Filterpedia repository. However, if you'd like to learn more about writing custom Core Image kernels with Core Image Kernel Language, may I recommend my book, Core Image for Swift. It is available from Apple's iBooks Store and, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets which the PDF version doesn't.

Core Image for Swift from iBooks Store

Core Image for Swift from Gumroad


Published at DZone with permission of Simon Gladman, DZone MVB.
