A Core Image Transverse Chromatic Aberration Filter in Swift

Take a peek at some more image trickery from Simon Gladman as he takes you through his kernel line-by-line.

By Simon Gladman · May. 26, 16 · Opinion



Transverse or lateral chromatic aberration is an optical artifact caused by different wavelengths of light focussing at different positions on a camera's focal plane. It appears as blue and purple fringing which increases towards the edge of an image. In addition to the Wikipedia entry on chromatic aberration, there's a great article here at Photography Life which discusses the phenomenon in great detail.


Although it's the bane of many photographers, I've created a Core Image filter that simulates the effect, which can be used to add a lo-fi, grungy look to images. The basic mechanics of the filter are pretty simple: it essentially consists of three zoom filters, each with a slightly different offset for red, green, and blue. The technique borrows from Session 515, "Developing Core Image Filters for iOS", at WWDC 2014.
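Before dropping down to a custom kernel, the "three zoom filters" idea can be roughed out with built-in filters. The sketch below (the function name and parameter choices are mine, not from the filter itself) isolates each channel with CIColorMatrix, zoom-blurs it by a different amount, and recombines the results additively. It lacks the radial falloff the custom kernel provides, but it shows the per-channel offset principle:

```swift
import CoreImage

// Rough, illustrative approximation using built-in filters only.
func roughChromaticAberration(image: CIImage, blur: CGFloat) -> CIImage {
    let center = CIVector(x: image.extent.midX, y: image.extent.midY)
    let zero = CIVector(x: 0, y: 0, z: 0, w: 0)

    // Isolate one channel, then zoom-blur it by `amount`.
    func channel(_ vector: CIVector, key: String, amount: CGFloat) -> CIImage {
        var params: [String: Any] = ["inputRVector": zero,
                                     "inputGVector": zero,
                                     "inputBVector": zero]
        params[key] = vector  // pass this channel through unchanged
        return image
            .applyingFilter("CIColorMatrix", parameters: params)
            .applyingFilter("CIZoomBlur", parameters: [kCIInputCenterKey: center,
                                                       kCIInputAmountKey: amount])
    }

    let red   = channel(CIVector(x: 1, y: 0, z: 0, w: 0), key: "inputRVector", amount: blur)
    let green = channel(CIVector(x: 0, y: 1, z: 0, w: 0), key: "inputGVector", amount: blur * 1.5)
    let blue  = channel(CIVector(x: 0, y: 0, z: 1, w: 0), key: "inputBVector", amount: blur * 2.0)

    // Sum the three single-channel images back into one.
    return red
        .applyingFilter("CIAdditionCompositing", parameters: [kCIInputBackgroundImageKey: green])
        .applyingFilter("CIAdditionCompositing", parameters: [kCIInputBackgroundImageKey: blue])
}
```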


Let's look at the general kernel I've written for the filter and then step through it line by line:


let transverseChromaticAberrationKernel = CIKernel(string:
        "kernel vec4 chromaticAberrationFunction(sampler image, vec2 size, float sampleCount, float start, float blur) {" +
        "  int sampleCountInt = int(floor(sampleCount));" + // 1
        "  vec4 accumulator = vec4(0.0);" + // 2
        "  vec2 dc = destCoord(); " + // 3
        "  float normalisedValue = length(((dc / size) - 0.5) * 2.0);" + // 4
        "  float strength = clamp((normalisedValue - start) * (1.0 / (1.0 - start)), 0.0, 1.0); " + // 5

        "  vec2 vector = normalize((dc - (size / 2.0)) / size);" + // 6
        "  vec2 velocity = vector * strength * blur; " + // 7

        "  vec2 redOffset = -vector * strength * (blur * 1.0); " + // 8
        "  vec2 greenOffset = -vector * strength * (blur * 1.5); " + // 8
        "  vec2 blueOffset = -vector * strength * (blur * 2.0); " + // 8

        "  for (int i=0; i < sampleCountInt; i++) { " + // 9
        "      accumulator.r += sample(image, samplerTransform (image, dc + redOffset)).r; " + // 10
        "      redOffset -= velocity / sampleCount; " + // 11

        "      accumulator.g += sample(image, samplerTransform (image, dc + greenOffset)).g; " +
        "      greenOffset -= velocity / sampleCount; " +

        "      accumulator.b += sample(image, samplerTransform (image, dc + blueOffset)).b; " +
        "      blueOffset -= velocity / sampleCount; " +
        "  } " +
        "  return vec4(vec3(accumulator / sampleCount), 1.0); " + // 12

        "}")


  1. Because the Core Image Kernel Language only allows float scalar arguments, and sampleCount needs to be an integer to construct a loop, I create an int version of it. 
  2. As the filter loops over samples, it accumulates their red, green, and blue values into this vector (a vec4, of which only the first three components are used). 
  3. destCoord() returns the position of the pixel currently being computed, in the coordinate space of the image being rendered. 
  4. Although the filter could calculate the size of the image with samplerSize(), passing the size as an argument reduces the amount of processing the kernel needs to do. This line converts the coordinates to the range -1 to +1 on both axes. 
  5. strength is a normalized value that is zero where the effect begins and reaches one at the edge of the image. 
  6. vector is the direction of the effect, which radiates from the center of the image. normalize scales the vector to unit length. 
  7. Multiplying the direction by the strength and by the maximum blur argument gives a velocity vector describing how much blur the filter applies to the current pixel, and in which direction.
  8. Transverse chromatic aberration varies with wavelength: shorter wavelengths are refracted more strongly, so the filter offsets the effect the least for red and the most for blue.
  9. The filter loops sampleCount times.
  10. For each color, the filter accumulates a sample offset along the direction of the vector, effectively summing the pixels along a radial line.
  11. The offset for each color is decremented after every sample.
  12. The accumulated colors are averaged and returned with an alpha value of 1.0.
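To run the kernel, wrap it in a call to apply(extent:roiCallback:arguments:) (applyWithExtent(_:roiCallback:arguments:) in the Swift 2-era API this article was written against). Here's a minimal sketch, assuming the transverseChromaticAberrationKernel above compiled successfully; the function name and default values are illustrative:

```swift
import CoreImage

// Illustrative wrapper: argument order must match the kernel signature
// (sampler image, vec2 size, float sampleCount, float start, float blur).
func applyChromaticAberration(to inputImage: CIImage,
                              sampleCount: CGFloat = 40,
                              start: CGFloat = 0.75,
                              blur: CGFloat = 20) -> CIImage? {
    guard let kernel = transverseChromaticAberrationKernel else { return nil }
    let extent = inputImage.extent
    let size = CIVector(x: extent.width, y: extent.height)
    return kernel.apply(
        extent: extent,
        // The kernel samples up to `blur` pixels away from the destination
        // coordinate, so the region of interest is the target rect grown
        // by that amount in each direction.
        roiCallback: { _, rect in rect.insetBy(dx: -blur, dy: -blur) },
        arguments: [inputImage, size, sampleCount, start, blur])
}
```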

The number of samples controls both the quality of the effect and the performance of the filter. With a large maximum blur of 20 but only 3 samples, the effect looks like this:



But with the same blur amount and 40 samples, the effect is a lot smoother:



The falloff parameter (the kernel's start argument) controls where the effect begins: a value of 0.75 means the effect begins three-quarters of the way from the center of the image to the edge:
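The arithmetic behind that is just line 5 of the kernel. A hypothetical Swift helper mirroring it:

```swift
// Mirrors the kernel's strength computation (line 5); illustrative only.
// normalisedValue is the pixel's distance from the center, where 1.0 is
// the distance from the center to the middle of an edge.
func strength(normalisedValue: Double, start: Double) -> Double {
    return min(max((normalisedValue - start) / (1.0 - start), 0.0), 1.0)
}
```

With start = 0.75, a pixel half-way to the edge (normalisedValue = 0.5) gets strength 0 and is untouched, a pixel 87.5% of the way out gets (0.875 - 0.75) / 0.25 = 0.5, and the edges reach the full strength of 1.0.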



Core Image for Swift

There's a full CIFilter implementation of this filter in my Filterpedia repository. However, if you'd like to learn more about writing custom Core Image kernels with the Core Image Kernel Language, may I recommend my book, Core Image for Swift. It's available from Apple's iBooks Store or, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets that the PDF version doesn't.

Core Image for Swift from iBooks Store

Core Image for Swift from Gumroad


Published at DZone with permission of Simon Gladman, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
