Recreating Kai's Power Tools Goo in Swift

Create a time machine in under ten lines for a blast from the past and have some fun with your images using SwiftGoo.


If, like me, you're old enough to fondly remember Kai's Power Tools Goo software, here's a fun little Swift and Core Image demo that will take you back to those heady days.

SwiftGoo uses a custom Core Image warp kernel to push an image's pixels around the screen based upon a user's touch. There are alternative techniques, such as the mesh deformation used in PhotoGoo, written by fellow Swifter Jameson Quave.

SwiftGoo's code isn't complicated - in fact, I rustled it up on a plane journey on my trusty MacBook Pro. 

A Goo Engine in Under Ten Lines

The engine that drives SwiftGoo is a short Core Image warp kernel. A warp kernel is designed solely for changing the geometry of an input image: given the coordinates of the pixel it's currently computing in the output image, it returns the coordinates of where the filter should sample from in the input image. For example, a warp kernel that returns the destination coordinates with the 'y' component subtracted from the image height will turn an image upside down.
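To make that concrete, here's a minimal warp kernel that does exactly that vertical flip. This is an illustrative sketch, not code from the SwiftGoo source; the names flipKernel, flipVertical and imageHeight are my own:

```swift
import CoreImage

// A warp kernel that turns an image upside down: for each output pixel,
// sample from the same x coordinate, but with y measured from the
// opposite edge. The image height is passed in as an argument because
// kernel code cannot query the input image's extent directly.
let flipKernel = CIWarpKernel(string:
    "kernel vec2 flipVertical(float imageHeight)" +
    "{" +
    "    return vec2(destCoord().x, imageHeight - destCoord().y);" +
    "}")
```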

The SwiftGoo kernel accepts arguments for the effect radius, the force to be applied (i.e. how much distortion to apply), the location of the user's touch and the direction that the touch is travelling in.

It calculates the distance between the current pixel coordinate (destCoord()) and the touch location. If that distance is less than the effect radius, it returns the current coordinate offset by the touch direction, scaled by the force and by a smoothstepped falloff of the normalised distance.

The kernel is written in the Core Image Kernel Language and passed to a CIWarpKernel as a string, so the code required is:

let warpKernel = CIWarpKernel(string:
    "kernel vec2 gooWarp(float radius, float force, vec2 location, vec2 direction)" +
    "{" +
    "    float dist = distance(location, destCoord());" +
    "    if (dist < radius)" +
    "    {" +
    "        float normalisedDistance = 1.0 - (dist / radius);" +
    "        float smoothedDistance = smoothstep(0.0, 1.0, normalisedDistance);" +
    "        return destCoord() + (direction * force) * smoothedDistance;" +
    "    }" +
    "    else" +
    "    {" +
    "        return destCoord();" +
    "    }" +
    "}")

The kernel is executed inside my view controller's touchesMoved function. In here, I loop over the touch's coalesced touches and use the difference between locationInView and previousLocationInView to create a CIVector, which is the direction of the touch movement. The radius and force are either hard coded or based on the touch's force, depending on whether 3D Touch is available.
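That touch handling can be sketched like this. The structure follows the description above, but the variable names and the fallback force value are illustrative, not lifted from the SwiftGoo source:

```swift
// Swift 2 era signature, matching the APIs used elsewhere in this article.
override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
    guard let touch = touches.first,
        coalescedTouches = event?.coalescedTouchesForTouch(touch) else { return }

    for coalescedTouch in coalescedTouches {
        let location = coalescedTouch.locationInView(view)
        let previousLocation = coalescedTouch.previousLocationInView(view)

        // The direction of travel is the delta between consecutive locations.
        let direction = CIVector(
            x: location.x - previousLocation.x,
            y: location.y - previousLocation.y)

        // With 3D Touch, scale the effect by pressure; otherwise fall back
        // to a hard coded value.
        let force = traitCollection.forceTouchCapability == .Available
            ? coalescedTouch.force / coalescedTouch.maximumPossibleForce
            : 0.5

        // ...invoke the warp kernel here with radius, force, location
        // and direction...
    }
}
```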

With these values in place, the warp kernel defined above is invoked with:

let arguments = [radius, force, location, direction]

    let image = warpKernel.applyWithExtent(
        accumulator.extent,
        roiCallback:
        {
            (index, rect) in
            return rect
        },
        inputImage: accumulator.image(),
        arguments: arguments)

Throughout the loop, the updated image is accumulated with a CIImageAccumulator and the final image is displayed using OpenGL with a GLKView.
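The accumulate-and-display steps can be sketched as follows. The extent, the ciContext set-up and the render rectangle here are illustrative assumptions, not the SwiftGoo source:

```swift
// An accumulator sized to the working image, using 8-bit ARGB storage.
let accumulator = CIImageAccumulator(
    extent: CGRect(x: 0, y: 0, width: 640, height: 640),
    format: kCIFormatARGB8)

// After each kernel invocation, feed the warped image back into the
// accumulator so that successive touches build on the existing distortion.
accumulator.setImage(image)

// Inside the GLKView's draw method, render the accumulated image with a
// CIContext created from the view's EAGLContext.
ciContext.drawImage(accumulator.image(),
    inRect: CGRect(x: 0, y: 0, width: 640, height: 640),
    fromRect: accumulator.image().extent)
```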

The final project is, as always, available in the SwiftGoo GitHub repository.

Core Image for Swift

If you'd like to learn more about custom image filters using Core Image kernels or how to display filter output images using OpenGL, may I suggest my book, Core Image for Swift.

Core Image for Swift is available from Apple's iBooks Store or, as a PDF, from Gumroad. IMHO, the iBooks version is better, especially as it contains video assets which the PDF version doesn't.
