Swift Tone Curve Editor

By Simon Gladman · Sep. 17, 2014



One of the nice features in my Nodality application is a little widget for editing the points of a tone curve. A tone curve changes the luminosity of an image for a given tonal range. For example, increasing the value of the leftmost point on the curve makes shadows brighter and, conversely, decreasing the value of the rightmost point makes highlights darker.

The CIToneCurve filter is part of Core Image and accepts five points that define the curve. This blog post looks at creating a Swift application that lets a user load an image and edit its tone curve using five vertical sliders.

I've built the application with two main controls: a ToneCurveEditor, which contains the vertical sliders, and an ImageWidget, which loads and displays an image and applies the filter to it. The main ViewController hosts them both.
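For context, here's a minimal sketch of how the view controller might create and wire up the two controls. The property names match the layout code below, but the exact wiring, the injected viewController property on ImageWidget, and the change-handler name are illustrative assumptions rather than the actual project code:

// Sketch only: assumes ToneCurveEditor is a UIControl subclass and that
// ImageWidget exposes viewController and curveValues properties.
class ViewController: UIViewController
{
    let toneCurveEditor = ToneCurveEditor(frame: CGRectZero)
    let imageWidget = ImageWidget(frame: CGRectZero)

    override func viewDidLoad()
    {
        super.viewDidLoad()

        imageWidget.viewController = self // injected reference used later to present the image picker

        toneCurveEditor.addTarget(self, action: "toneCurveEditorChangeHandler:", forControlEvents: .ValueChanged)

        view.addSubview(toneCurveEditor)
        view.addSubview(imageWidget)
    }

    func toneCurveEditorChangeHandler(toneCurveEditor: ToneCurveEditor)
    {
        imageWidget.curveValues = toneCurveEditor.curveValues
    }
}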

Up until now, I haven't really thought about responding to layout changes, but this application demands that the user interface responds sensibly to different orientations. In landscape, I want the tone curve controls on the left and the image on the right; in portrait, I want the tone curve controls below and the image above:


To do this, the view controller's overridden viewDidLayoutSubviews() is a little smarter than my previous versions. It compares its frame's width and height to work out which orientation it's in, then sizes and positions the components accordingly:

    override func viewDidLayoutSubviews()
    {
        super.viewDidLayoutSubviews()
        
        let topMargin = Int(topLayoutGuide.length)
        
        if view.frame.size.width < view.frame.size.height
        {
            // portrait mode: image above, tone curve controls below
            let widgetWidth = Int(view.frame.size.width)
            let widgetHeight = Int(view.frame.size.height) / 2
            
            imageWidget.frame = CGRect(x: 5, y: topMargin, width: widgetWidth - 10, height: widgetHeight - topMargin - topMargin)
            toneCurveEditor.frame = CGRect(x: 0, y: widgetHeight, width: widgetWidth, height: widgetHeight - 50)
        }
        else
        {
            // landscape mode: tone curve controls on the left, image on the right
            let widgetWidth = Int(view.frame.size.width) / 2
            let widgetHeight = Int(view.frame.size.height)
            
            imageWidget.frame = CGRect(x: widgetWidth, y: topMargin, width: widgetWidth - 5, height: widgetHeight - topMargin - topMargin)
            toneCurveEditor.frame = CGRect(x: 0, y: 0, width: widgetWidth, height: widgetHeight - 50)
        }
    }
Now, when my device is rotated, the user interface smoothly transitions between the two layouts.

The ToneCurveEditor contains five sliders that set the tone curve values and a CALayer subclass, ToneCurveEditorCurveLayer, which renders a curve joining the values.
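For reference, a rough sketch of how that curve layer might be created and attached inside ToneCurveEditor; the curveLayer property name and the setup details are assumptions:

    // Sketch only: ToneCurveEditorCurveLayer is the article's CALayer subclass,
    // but this particular wiring is illustrative.
    let curveLayer = ToneCurveEditorCurveLayer()

    func createCurveLayer()
    {
        curveLayer.contentsScale = UIScreen.mainScreen().scale // keep the curve crisp on Retina displays
        curveLayer.frame = bounds

        layer.addSublayer(curveLayer)
    }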

I could manually create each slider, but it's far better to create them in a loop. This reduces code duplication, and if, for example, CIToneCurve changes in the future to support more curve points, all I'd have to do is change the upper bound of the loop to create more sliders.

The slider creation is done inside ToneCurveEditor's overridden init() function. Each one is an instance of UISlider that is rotated by 90°, given an action handler, and added as a subview:
    func createSliders()
    {
        let rotateTransform = CGAffineTransformIdentity
        
        for i in 0..<5
        {
            let slider = UISlider(frame: CGRectZero)
            
            // Rotate the slider 90° anticlockwise so it runs vertically.
            slider.transform = CGAffineTransformRotate(rotateTransform, CGFloat(-90.0 * M_PI / 180.0))
            slider.addTarget(self, action: "sliderChangeHandler:", forControlEvents: .ValueChanged)
            
            sliders.append(slider)
            
            addSubview(slider)
        }
    }

When any slider is changed, I regenerate the ToneCurveEditor's curveValues array and dispatch a change event. Because curveValues has a didSet observer, ToneCurveEditor redraws the background curve when it changes:

    var curveValues : [Double] = [Double](count: 5, repeatedValue: 0.0)
    {
        didSet
        {
            for (i : Int, value : Double) in enumerate(curveValues)
            {
                sliders[i].value = Float(value)
            }
            
            drawCurve() // forces a curveLayer.setNeedsDisplay()
        }
    }
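The slider action handler itself isn't shown above; a minimal sketch of how it might rebuild curveValues and dispatch the change event, assuming ToneCurveEditor is a UIControl subclass, looks like this:

    // Sketch only: assumes ToneCurveEditor extends UIControl so it can send ValueChanged.
    func sliderChangeHandler(slider: UISlider)
    {
        curveValues = sliders.map({ Double($0.value) })

        sendActionsForControlEvents(.ValueChanged)
    }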

Down in the ToneCurveEditorCurveLayer, I've overridden drawInContext() to draw a set of Bézier curves linking the points together. The best way of doing this would be with a Catmull-Rom spline, but I ran out of time to implement that, so I do it with a series of cubic curves, which look slightly nicer than a set of straight lines. Stop Press: now updated with a Hermite spline, see this blog post for more information.


This is done by creating a new UIBezierPath, looping over the curve points and adding cubic Bezier sections for each point:

for (i: Int, value: Double) in enumerate(curveValues)
{
    let pathPointX = i * (widgetWidth / curveValues.count) + (widgetWidth / curveValues.count / 2)
    let pathPointY = thumbRadius + margin + widgetHeight - Int(Double(widgetHeight) * value)

    if i == 0
    {
        previousPoint = CGPoint(x: pathPointX, y: pathPointY)

        path.moveToPoint(previousPoint)
    }
    else
    {
        // TODO - implement as Catmull-Rom
        let currentPoint = CGPoint(x: pathPointX, y: pathPointY)

        let controlPointOne = CGPoint(x: currentPoint.x, y: previousPoint.y)
        let controlPointTwo = CGPoint(x: previousPoint.x, y: currentPoint.y)

        path.addCurveToPoint(currentPoint, controlPoint1: controlPointOne, controlPoint2: controlPointTwo)

        previousPoint = currentPoint
    }
}
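That loop lives inside the layer's drawing override. A sketch of the surrounding code could look like the following, where the stroke colour and line width are placeholders and UIGraphicsPushContext() is one way to stroke a UIBezierPath from within drawInContext():

override func drawInContext(ctx: CGContext!)
{
    let path = UIBezierPath()
    var previousPoint = CGPointZero

    // ...the loop above builds the path here...

    // Sketch only: stroke attributes are placeholders.
    UIGraphicsPushContext(ctx)

    UIColor.blueColor().setStroke()
    path.lineWidth = 2
    path.stroke()

    UIGraphicsPopContext()
}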

With the end result looking like this:

The ImageWidget contains a UIButton, which allows the user to load an image, and a UIImageView, which displays the image.

When the button is tapped, ImageWidget uses an injected reference to its parent view controller to invoke presentViewController() with a UIImagePickerController. Because ImageWidget implements the UIImagePickerControllerDelegate protocol, once the user has selected an image, imagePickerController() is invoked.
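As a rough sketch, the button's action handler might present the picker like this; the handler name and the viewController property name are assumptions:

    // Sketch only: viewController is the injected reference to the parent view controller.
    func loadImageButtonClickHandler()
    {
        let imagePicker = UIImagePickerController()

        imagePicker.sourceType = .PhotoLibrary
        imagePicker.delegate = self // also requires conforming to UINavigationControllerDelegate

        viewController?.presentViewController(imagePicker, animated: true, completion: nil)
    }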

I found that trying to filter large images (my Fuji XE-1 gives 16MP images) was painfully slow and started to crash the application. So, I've written a rather natty extension to UIImage that resizes the loaded image to within a certain bounding square:

extension UIImage
{
    func resizeToBoundingSquare(#boundingSquareSideLength : CGFloat) -> UIImage
    {
        let imgScale = self.size.width > self.size.height ? boundingSquareSideLength / self.size.width : boundingSquareSideLength / self.size.height
        let newWidth = self.size.width * imgScale
        let newHeight = self.size.height * imgScale
        let newSize = CGSize(width: newWidth, height: newHeight)
        
        UIGraphicsBeginImageContext(newSize)
        
        self.drawInRect(CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
        
        let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
        
        UIGraphicsEndImageContext();
        
        return resizedImage
    }
}
In a production application, you could use this smaller proxy image during interaction and swap in the full-resolution image for the final render.
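Pulling those pieces together, the imagePickerController() callback described below might look something like this; the exact property names are assumptions:

    // Sketch only: loadedImage is ImageWidget's stored image; resizing it here means
    // every subsequent filter pass works on the smaller proxy.
    func imagePickerController(picker: UIImagePickerController!, didFinishPickingMediaWithInfo info: [NSObject : AnyObject]!)
    {
        if let rawImage = info[UIImagePickerControllerOriginalImage] as? UIImage
        {
            loadedImage = rawImage.resizeToBoundingSquare(boundingSquareSideLength: 1024)
        }

        picker.dismissViewControllerAnimated(true, completion: nil)
    }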

So, imagePickerController()'s main responsibility is to create a copy of the raw image to fit within a 1024 x 1024 square. When either the image or the injected curve points change, ImageWidget has to apply the tone curve filter to the resized image. To keep the user interface responsive, the filtering is done in a background thread (again using Tobias's Async library). So the instance method looks like this:

    func applyFilterAsync()
    {
        backgroundBlock = Async.background
        {
            if !self.filterIsRunning && self.loadedImage != nil
            {
                self.filterIsRunning = true
                self.filteredImage = ImageWidget.applyFilter(loadedImage: self.loadedImage!, curveValues: self.curveValues, ciContext: self.ciContext, filter: self.filter)
            }
        }
        .main
        {
            self.imageView.image = self.filteredImage
            self.filterIsRunning = false
        }
    }
...and the class function that does the hard work looks like this:

    class func applyFilter(#loadedImage: UIImage, curveValues: [Double], ciContext: CIContext, filter: CIFilter) -> UIImage
    {
        let coreImage = CIImage(image: loadedImage)
        
        filter.setValue(coreImage, forKey: kCIInputImageKey)
        
        filter.setValue(CIVector(x: 0.0, y: CGFloat(curveValues[0])), forKey: "inputPoint0")
        filter.setValue(CIVector(x: 0.25, y: CGFloat(curveValues[1])), forKey: "inputPoint1")
        filter.setValue(CIVector(x: 0.5, y: CGFloat(curveValues[2])), forKey: "inputPoint2")
        filter.setValue(CIVector(x: 0.75, y: CGFloat(curveValues[3])), forKey: "inputPoint3")
        filter.setValue(CIVector(x: 1.0, y: CGFloat(curveValues[4])), forKey: "inputPoint4")
        
        let filteredImageData = filter.valueForKey(kCIOutputImageKey) as CIImage
        let filteredImageRef = ciContext.createCGImage(filteredImageData, fromRect: filteredImageData.extent())
        let filteredImage = UIImage(CGImage: filteredImageRef)
       
        return filteredImage
    }
Both the CIContext and the CIFilter are instance constants that only need to be instantiated once and can then be reused, so I pass them into applyFilter().
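For completeness, a minimal sketch of how those two constants might be declared inside ImageWidget:

    // Sketch only: created once and reused for every filter pass.
    let ciContext = CIContext(options: nil)
    let filter = CIFilter(name: "CIToneCurve")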

The combination of using a proxy image and doing the filtering in the background makes the application very smooth and responsive.

And that's about that. As always, the source code is available in my GitHub repository. Enjoy!



Published at DZone with permission of Simon Gladman, DZone MVB. See the original article here.
