3 Approaches to Applying Blur Effects in iOS

It’s a fairly common requirement to apply a blur effect to iOS content. This article introduces three different approaches, using three different iOS frameworks: UIKit, Core Image, and Metal.

1. Using UIBlurEffect to Blur an Image

UIBlurEffect is a neatly designed visual effect in UIKit. Paired with a UIVisualEffectView, it places a blurring view on top of the background content.

To add the blur effect, do the following:

// Create the blur effect and an (initially empty) effect view
let blurEffect = UIBlurEffect(style: .light)
let blurEffectView = UIVisualEffectView()

// Size the effect view and center it in the image view's own coordinate space,
// since it is added as a subview of imageView
blurEffectView.frame = CGRect(x: 0, y: 0, width: imageView.frame.width, height: 400)
blurEffectView.center = CGPoint(x: imageView.bounds.midX, y: imageView.bounds.midY)
self.imageView.addSubview(blurEffectView)

// Setting the effect inside an animation block fades the blur in
UIView.animate(withDuration: 5) {
    blurEffectView.effect = blurEffect
}

Here’s what is happening: assigning the effect inside the animation block makes UIKit fade the blur in over the five-second duration. Under the hood, UIVisualEffectView contains two private subviews (UIVisualEffectBackdropView and UIVisualEffectSubView) that would be interesting to explore.
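One quick, non-authoritative way to peek at those internals is to print the class names of the effect view's subviews after it has been added to the hierarchy; the private class names are implementation details and may differ between iOS versions:

// Dump the private subview hierarchy of the visual effect view.
// The class names are private implementation details and can change between iOS releases.
for (index, subview) in blurEffectView.subviews.enumerated() {
    print("Subview \(index): \(type(of: subview))")
}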

Pros

- Very little code and no manual image processing; UIKit applies the effect for you.
- The blur can be animated and uses the system-provided styles, so it matches the platform look.

Cons

- There is no public API to control the blur radius; you can only choose from the predefined styles.
- It blurs whatever is behind the view rather than producing a blurred UIImage you can reuse.

2. Applying CIFilter to the Image

CIFilter is the image processor in the Core Image framework. It ships with dozens of built-in image filters and also lets you build custom filters. Core Image can render on the GPU (via Metal, or OpenGL/OpenGL ES on older systems) for real-time performance, or fall back to CPU-based rendering with Quartz 2D when real-time performance isn’t required.
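Which renderer is used depends largely on how the CIContext is created. Here is a minimal sketch of both options; which one your app should prefer depends on your use case:

import CoreImage
import Metal

// GPU-backed context: preferred when a Metal device is available
let gpuContext: CIContext
if let device = MTLCreateSystemDefaultDevice() {
    gpuContext = CIContext(mtlDevice: device)
} else {
    gpuContext = CIContext()
}

// CPU-based context: acceptable when real-time performance is not required
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])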

To apply the blur effect in this example, we use a CIFilter of type CIGaussianBlur:

extension UIImage {
    func blurredImage(with context: CIContext, radius: CGFloat, atRect: CGRect) -> UIImage? {
        guard let ciImg = CIImage(image: self) else { return nil }

        // Blur only the requested region of the image
        let croppedCiImg = ciImg.cropped(to: atRect)
        let blur = CIFilter(name: "CIGaussianBlur")
        blur?.setValue(croppedCiImg, forKey: kCIInputImageKey)
        blur?.setValue(radius, forKey: kCIInputRadiusKey)

        // Composite the blurred region back over the original image and
        // render the result with the supplied CIContext
        if let ciImgWithBlurredRect = blur?.outputImage?.composited(over: ciImg),
           let outputImg = context.createCGImage(ciImgWithBlurredRect, from: ciImgWithBlurredRect.extent) {
            return UIImage(cgImage: outputImg)
        }
        return nil
    }
}
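A possible call site, assuming the imageView from the first section; the asset name and blur region below are placeholders for illustration:

let context = CIContext()
if let image = UIImage(named: "street.png"),  // placeholder asset name
   let blurred = image.blurredImage(with: context,
                                    radius: 10,
                                    atRect: CGRect(x: 0, y: 0, width: image.size.width, height: 200)) {
    imageView.image = blurred
}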

Pros

- Full control over the blur radius, and the blur can be limited to a chosen region of the image.
- The output is a regular UIImage that can be cached, saved, or reused anywhere.

Cons

- More code than the UIKit approach, and you have to create and manage a CIContext yourself.
- Processing large images takes time, especially with a CPU-based context, so it may not suit real-time use.

3. Metal, the GPU-Accelerated Way

Metal lets us use the GPU directly to perform compute and graphics work, which also frees up the CPU for other operations. The GPU is optimized for highly parallel workloads and can run graphics tasks faster and more efficiently than the CPU. Several Apple frameworks rely on it: Core Image uses Metal under the hood to delegate its graphics workload to the GPU, and Core ML uses Metal to perform its low-level operations on the GPU.

The setup below loads an image into a Metal texture, creates a Metal-backed CIContext, and renders the blurred result into an MTKView:

// Instance of MTKView
@IBOutlet weak var mtkView: MTKView!

// Metal resources
var device: MTLDevice!
var commandQueue: MTLCommandQueue!
var sourceTexture: MTLTexture!

// Core Image resources
var context: CIContext!
let filter = CIFilter(name: "CIGaussianBlur")!
let colorSpace = CGColorSpaceCreateDeviceRGB()

override func viewDidLoad() {
    super.viewDidLoad()

    // Set up the Metal device and command queue
    device = MTLCreateSystemDefaultDevice()
    commandQueue = device.makeCommandQueue()

    // Load the source image into a Metal texture
    let textureLoader = MTKTextureLoader(device: device)
    sourceTexture = try! textureLoader.newTexture(cgImage: UIImage(named: "street.png")!.cgImage!)

    // Configure the MTKView; framebufferOnly must be false so Core Image can write to the drawable
    mtkView.delegate = self
    mtkView.device = device
    mtkView.framebufferOnly = false

    // Create a Core Image context backed by the same Metal device
    context = CIContext(mtlDevice: device)
}
extension ViewController: MTKViewDelegate {
    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        // Not used here, but required by MTKViewDelegate
    }

    func draw(in view: MTKView) {
        if let currentDrawable = view.currentDrawable,
           let commandBuffer = commandQueue.makeCommandBuffer() {

            // Wrap the Metal texture in a CIImage and apply the Gaussian blur
            let inputImage = CIImage(mtlTexture: sourceTexture)!.oriented(.down)
            filter.setValue(inputImage, forKey: kCIInputImageKey)
            filter.setValue(10.0, forKey: kCIInputRadiusKey)

            // Render the filtered image into the drawable's texture on the GPU
            context.render(filter.outputImage!,
                           to: currentDrawable.texture,
                           commandBuffer: commandBuffer,
                           bounds: mtkView.bounds,
                           colorSpace: colorSpace)

            // Present the drawable and commit the work to the GPU
            commandBuffer.present(currentDrawable)
            commandBuffer.commit()
        }
    }
}

Pros

- Filtering and rendering happen entirely on the GPU, keeping the CPU free; this suits real-time work such as blurring live video or animated content.

Cons

- Considerably more setup: a device, a command queue, textures, and an MTKView delegate are all needed just to blur one image.

Conclusion

This article covered three different approaches to adding a blur effect to an image in iOS. The same approaches can also be used to blur any instance of UIView, for example by snapshotting the view into an image first, as sketched below.
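As a rough sketch of that idea, you can snapshot any view into a UIImage with UIGraphicsImageRenderer and then reuse the blurredImage(with:radius:atRect:) extension from the Core Image section; someView and the blur parameters below are placeholders:

// Snapshot an arbitrary UIView into a UIImage, then blur it with Core Image.
let renderer = UIGraphicsImageRenderer(bounds: someView.bounds)
let snapshot = renderer.image { rendererContext in
    someView.layer.render(in: rendererContext.cgContext)
}

let ciContext = CIContext()
let blurredSnapshot = snapshot.blurredImage(with: ciContext,
                                            radius: 8,
                                            atRect: CGRect(origin: .zero, size: snapshot.size))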

There is a lot more potential to explore. Thank you for reading. Please leave any questions you might have in the comments.

All code mentioned above can be found in this GitHub repo.

Original post: https://betterprogramming.pub/three-approaches-to-apply-blur-effect-in-ios-c1c941d862c3