BristlePaint: Embossed Painting with Individual Bristles using SpriteKit Normal Mapping


Simon Gladman continues his experiments with the Apple Pencil to create painting effects on the iPad using Swift.


Following on from FurrySketch and MercurialPaint, my experiments with painting and drawing techniques for iOS in Swift continue with BristlePaint. BristlePaint draws the individual bristles of a brush and uses SpriteKit's normal mapping to give the image a nice, glossy embossed effect.

This demo is geared quite heavily toward the iPad Pro and Pencil. The drawing code uses the Pencil's force and its azimuth and altitude angles to control the brush effect, but, that said, there's no reason why that code couldn't be changed to work with standard touch events.

As a demo, all the code is bundled into a single view controller. So, without further ado, let's jump in and see how it's all put together.


Simply put, BristlePaint uses two Core Image image accumulators (CIImageAccumulator) to store separate images: the visible, coloured image and a grayscale bump map. Those images are converted to SpriteKit textures (with the grayscale bump map converted to an RGB normal map by textureByGeneratingNormalMapWithSmoothness()), which are then mapped to a single SpriteKit sprite.
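One thing worth noting before diving in: a sprite's normal texture only produces a visible embossed effect when there's a light in the scene for it to react to. A minimal sketch of the kind of node setup assumed here (the names, sizes, and bitmask values are my own, not taken from the project):

```swift
import SpriteKit

// Minimal sketch: a single sprite carries both the diffuse texture and the
// normal texture, and an SKLightNode supplies the light that makes the
// embossing visible. The light's categoryBitMask must overlap the sprite's
// lightingBitMask, otherwise the normal map has no visible effect.
let scene = SKScene(size: CGSize(width: 1024, height: 1024))

let backgroundNode = SKSpriteNode()
backgroundNode.size = scene.size
backgroundNode.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
backgroundNode.lightingBitMask = 1

let light = SKLightNode()
light.position = CGPoint(x: scene.size.width / 2, y: scene.size.height)
light.categoryBitMask = 1

scene.addChild(backgroundNode)
scene.addChild(light)
```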

Touch Handling 

Because creating the textures isn't instantaneous, rather than attempting a final render with each invocation of touchesMoved(), I store the relevant touch information for each of the coalesced touches in an array. To do this, I define a type alias and declare that array at the class level:

typealias TouchDatum = (location: CGPoint, force: CGFloat, azimuthVector: CGVector, azimuthAngle: CGFloat)

    var touchData = [TouchDatum]()

And, inside touchesMoved(), I use map to populate that array:

    guard let
        touch = touches.first,
        coalescedTouches = event?.coalescedTouchesForTouch(touch) where
        touch.type == UITouchType.Stylus else { return }

    touchData.appendContentsOf(coalescedTouches.map({
        ($0.locationInView(view),
        $0.force / $0.maximumPossibleForce,
        $0.azimuthUnitVectorInView(view),
        $0.azimuthAngleInView(view))
    }))

It's in touchesEnded() that I create a path from that touch data and queue it up to be rendered in the background. To create the path, I have a static function (I made it static to be sure it could have no side effects) named pathFromTouches which returns a CGPath from the array of TouchDatum:

    guard let path = ViewController.pathFromTouches(touchData, bristleAngles: bristleAngles) else { return }


The bristleAngles array contains CGFloats which define the angle of each bristle. I've populated mine with twenty values, which will give me a brush with, unsurprisingly, twenty bristles. pathFromTouches loops over each bristle and then over every item in touchData, generating a UIBezierPath that uses the force and angles to give a paint brush effect that mimics a real-world brush:

    let bezierPath = UIBezierPath()

    guard let firstTouchDatum = touchData.first else { return nil }

    for i in 0 ..< bristleAngles.count
    {
        let firstBristleAngle = bristleAngles[i]

        let x = firstTouchDatum.location.x + sin(firstBristleAngle) * forceToRadius(firstTouchDatum.force)
        let y = firstTouchDatum.location.y + cos(firstBristleAngle) * forceToRadius(firstTouchDatum.force)

        bezierPath.moveToPoint(CGPoint(x: x, y: y))

        for touchDatum in touchData
        {
            let bristleAngle = bristleAngles[i]

            let x = touchDatum.location.x + sin(bristleAngle + touchDatum.azimuthAngle)
                * forceToRadius(touchDatum.force)
                * touchDatum.azimuthVector.dy

            let y = touchDatum.location.y + cos(bristleAngle + touchDatum.azimuthAngle)
                * forceToRadius(touchDatum.force)
                * touchDatum.azimuthVector.dx

            bezierPath.addLineToPoint(CGPoint(x: x, y: y))
        }
    }

    return bezierPath.CGPath

When that path is returned to touchesEnded(), it's appended to an array of paths pending rendering and drawPendingPath() is invoked which will attempt to render it:

    pendingPaths.append((path, origin, diffuseColor, temporaryLayer))


Now we have the path for the user's gesture, it's time to convert it to the maps that SpriteKit requires, and this is done on a background thread to keep the user interface responsive. drawPendingPath() picks the first item from the pendingPaths array:

    guard pendingPaths.count > 0 else { return }

    let pendingPath = pendingPaths.removeFirst()

...and then in the background uses another pure, static function, textureFromPath(), to create SpriteKit textures from that path. Since the compositing technique and the image accumulators are different for the diffuse and normal maps, they have to be passed into textureFromPath(), so it has quite a long signature:

    static func textureFromPath(path: CGPathRef,
        origin: CGPoint,
        imageAccumulator: CIImageAccumulator,
        compositeFilter: CIFilter,
        color: CGColorRef,
        lineWidth: CGFloat) -> SKTexture

But the guts of the function are pretty simple: it uses a CGContext to generate a UIImage from the supplied path:


    UIGraphicsBeginImageContext(size)

    let cgContext = UIGraphicsGetCurrentContext()

    CGContextSetLineWidth(cgContext, lineWidth)
    CGContextSetLineCap(cgContext, CGLineCap.Round)
    CGContextSetStrokeColorWithColor(cgContext, color)

    CGContextAddPath(cgContext, path)
    CGContextStrokePath(cgContext)

    let drawnImage = UIGraphicsGetImageFromCurrentImageContext()

    UIGraphicsEndImageContext()


Then, using the accumulator and the composite filter, it composites the new image over the previous one:

    compositeFilter.setValue(CIImage(image: drawnImage),
        forKey: kCIInputImageKey)
    compositeFilter.setValue(imageAccumulator.image(),
        forKey: kCIInputBackgroundImageKey)

    imageAccumulator.setImage(compositeFilter.valueForKey(kCIOutputImageKey) as! CIImage)

    let filteredImageRef = ciContext.createCGImage(imageAccumulator.image(),
        fromRect: CGRect(origin: CGPointZero, size: size))

...and finally creates and returns a SpriteKit texture from the composited image:

    return SKTexture(CGImage: filteredImageRef)

drawPendingPath() invokes this method twice, first for the diffuse map and second for the normal map:

    let diffuseMap = ViewController.textureFromPath(pendingPath.path,
        origin: pendingPath.origin,
        imageAccumulator: self.diffuseImageAccumulator,
        compositeFilter: self.diffuseCompositeFilter,
        color: pendingPath.color.CGColor,
        lineWidth: 2)

    let normalMap = ViewController.textureFromPath(pendingPath.path,
        origin: pendingPath.origin,
        imageAccumulator: self.normalImageAccumulator,
        compositeFilter: self.normalCompositeFilter,
        color: UIColor(white: 1, alpha: 0.1).CGColor,
        lineWidth: 2)
        .textureByGeneratingNormalMapWithSmoothness(0.75, contrast: 3)

...and sets the texture and normalTexture on the background SpriteKit node:

    backgroundNode.texture = diffuseMap
    backgroundNode.normalTexture = normalMap
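Under the hood, generating a normal map from a bump map amounts to treating the grayscale values as heights and converting their gradient into RGB-encoded normals. A minimal sketch of that idea in plain Swift (my own illustration of the principle, not SpriteKit's actual implementation):

```swift
import Foundation

// A tiny grayscale bump map: a single bright "bristle stroke" pixel in the middle.
let height: [[Double]] = [
    [0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0]
]

// Derive a unit normal from the height gradient using central differences,
// clamping at the edges. In a normal map, red/green encode x and y; blue encodes z.
func normalAt(_ x: Int, _ y: Int) -> (x: Double, y: Double, z: Double)
{
    let left = height[y][max(x - 1, 0)]
    let right = height[y][min(x + 1, height[y].count - 1)]
    let below = height[max(y - 1, 0)][x]
    let above = height[min(y + 1, height.count - 1)][x]

    // The normal tilts against the slope; the constant z term keeps it unit length.
    let dx = left - right
    let dy = below - above
    let length = sqrt(dx * dx + dy * dy + 1)

    return (dx / length, dy / length, 1 / length)
}
```

Flat regions come out as straight-up normals (0, 0, 1), while the edges of a stroke tilt toward the light, which is what produces the embossed look.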

After that, drawPendingPath() invokes itself to render any other gesture paths that may have been added to pendingPaths during that process.
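That drain-one-item-then-recurse pattern can be sketched in plain Swift (simplified to run synchronously, with strings standing in for the pending path tuples):

```swift
import Foundation

// Simplified, synchronous sketch of the drawPendingPath() pattern: render the
// first queued item, then recurse until the queue is empty. Strings stand in
// for the (path, origin, colour, layer) tuples used by BristlePaint.
var pendingPaths = ["stroke 1", "stroke 2", "stroke 3"]
var renderedPaths = [String]()

func drawPendingPath()
{
    guard pendingPaths.count > 0 else { return }

    let pendingPath = pendingPaths.removeFirst()

    // In BristlePaint, this is where the diffuse and normal textures are created.
    renderedPaths.append(pendingPath)

    // Recurse to pick up anything queued while this item was being rendered.
    drawPendingPath()
}
```

Because each invocation removes exactly one item before recursing, strokes are always rendered in the order the user drew them.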


SpriteKit's normal mapping offers a convenient way to create a pseudo-3D embossed drawing, and textureByGeneratingNormalMapWithSmoothness() makes converting an easily generated bump map to a normal map super easy. By doing that work on a background thread, the user interface can be kept responsive. Furthermore, utilising the data from the Pencil allows the brush to mimic a real paint brush by following its angles and increasing the spread with pressure.

As always, the code for this project is available in my GitHub repository here.


Published at DZone with permission of Simon Gladman, DZone MVB. See the original article here.

