iOS 8 introduced an interesting visual effect: when you pull down on the home screen to reveal the search field, the background gets blurred. The blur is interactive, as you completely control it with the movement of your finger. Interactive animations typically feel more natural, because this is how objects behave in the real world when you interact with them.

Blurring methods

There are a couple of ways you can go about blurring an image. You could use Core Image filters:

// preparation for blurring
let ciContext = CIContext(options: nil)
let ciImage = CIImage(image: originalImage)
let ciFilter = CIFilter(name: "CIGaussianBlur")
ciFilter.setValue(ciImage, forKey: kCIInputImageKey)

// actual blur; can be run many times with different
// radii without repeating the preparation
ciFilter.setValue(radius, forKey: "inputRadius")

let cgImage = ciContext.createCGImage(ciFilter.outputImage, fromRect: ciImage.extent())
let blurredImage = UIImage(CGImage: cgImage)!

CIContext, CIImage and CIFilter can be created just once. For each calculation, you just set the blur radius and run the filter.

The GPUImage framework can also be used for this purpose:

let gpuBlurFilter = GPUImageGaussianBlurFilter()
gpuBlurFilter.blurRadiusInPixels = CGFloat(radius)

let blurredImage = gpuBlurFilter.imageByFilteringImage(originalImage)

Again, you create the filter just once. This Gaussian blur looks a bit different from what iOS does, but it may still be good enough for your purpose.

GPUImage also has a filter named GPUImageiOSBlurFilter that attempts to get closer to the actual effect used on iOS. It looks quite nice, but it alters the image significantly even for small blur radii. This causes a glitch: the animation "jumps" from the original image to a very different one as soon as it starts.

Finally, you can use the Apple-provided UIImage+ImageEffects category. This category can reproduce the actual blur effect used on iOS, and it doesn't have problems with small blur radii.

let blurredImage = originalImage.applyBlurWithRadius(radius,
    tintColor: nil,
    saturationDeltaFactor: 1.0,
    maskImage: nil)
Different blur radii: 0, 1, 2, 5, 20

Blurring speed

Another important aspect of the blur effect, aside from its quality, is speed. Let's see how fast blurs can be computed using the methods above. If we could process them 60 times per second, we could blur on the fly and the animation would be smooth.

iPhone 5C blurring speed
iPhone 5S blurring speed

Unfortunately, as the chart shows, the measured frame rates are too low. Core Image blurring on the iPhone 5S does exceed the desired 60 FPS, but you can't really target just that device (yet).

You are probably wondering whether the new UIBlurEffect class can help here. UIBlurEffect was introduced in iOS 8 and can be used to create an overlay that blurs everything behind it. It works in real time, but unfortunately the blur radius can't be specified.
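If a fixed blur strength is enough for your use case, the effect is simple to set up. A minimal sketch (the imageView it is added to is an assumption):

```swift
// iOS 8 only: blurs everything rendered behind the effect view.
// The style is fixed; there is no radius parameter to animate.
let blurEffect = UIBlurEffect(style: .Light)
let blurView = UIVisualEffectView(effect: blurEffect)
blurView.frame = imageView.bounds
imageView.addSubview(blurView)
```

This is fine for a static overlay, but it offers no hook for driving the blur amount with a gesture.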

I also tried FXBlurView. It lets you specify the blur radius, but you get the same image for slightly different radii. It's as if the given blur radius were rounded to the nearest integer.

Blurring in stages

Another approach we can try is to precalculate blurs for a couple of blur radii and then interpolate the intermediate images. We start by placing the original image at the bottom.
The image blurred with the first radius we've chosen goes on top of it, with its alpha set to zero. At this point only the original image is visible.

To transition to the next stage, we just gradually increase the alpha of the second image. What we get is the original image morphing into a slightly blurred one. We then continue in the same fashion, always overlaying the current image with the next one at the appropriate alpha.

There is a tradeoff between the quality of the interpolated blurs and the time required for preprocessing. With only 2 or 3 stages, it's visible that something other than pure blurring is going on. 10 stages look almost like the real thing, but preprocessing them takes around 0.9 seconds on an iPhone 5C (0.5 on a 5S). Depending on your use case, this may or may not be acceptable. Preprocessing will, however, be faster if you're blurring only a smaller part of the screen.
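The preprocessing step could be sketched like this, using Apple's category. `kNumberOfStages` and `kMaxBlurRadius` are assumed constants, and the radii here grow linearly, though a non-linear spacing may interpolate better:

```swift
// Stage 0 is the original, unblurred image.
var blurredImages = [originalImage]

for stage in 1...kNumberOfStages {
    let radius = CGFloat(stage) * kMaxBlurRadius / CGFloat(kNumberOfStages)
    let blurred = originalImage.applyBlurWithRadius(radius,
        tintColor: nil,
        saturationDeltaFactor: 1.0,
        maskImage: nil)
    blurredImages.append(blurred)
}

// Append the last image once more so that indexing
// blurIndex + 1 in the scroll handler never goes out of bounds.
blurredImages.append(blurredImages.last!)
```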

Using 1, 3, 10 or 50 stages

Here's how this looks in actual code. We use an invisible full-screen scroll view to control how blurred the image is. When the content offset is at zero, we see the original image. As you scroll down, the image gets blurred.

func scrollViewDidScroll(scrollView: UIScrollView) {
	let r = Double(scrollView.contentOffset.y / CGFloat(kScrollViewTravel))
	let blur = max(0, min(1, r)) * Double(kNumberOfStages)
	let blurIndex = Int(blur)
	let blurRemainder = blur - Double(blurIndex)

	firstImageView.image = blurredImages[blurIndex]
	secondImageView.image = blurredImages[blurIndex + 1]
	secondImageView.alpha = CGFloat(blurRemainder)
}

This works smoothly, as nothing is calculated during the animation; we only display the appropriate image with the correct alpha at each step.

It may be useful to do the preprocessing on a background thread. Apple's category is thread-safe: you can call its blurring method from a background thread and use the resulting UIImage on the main thread.
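A background preprocessing run might look like this. This is a sketch: `preprocessBlurredImages()` is a hypothetical helper that builds and returns the array of stage images using Apple's category:

```swift
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    // Safe: the category's blur method may be called off the main thread.
    let images = self.preprocessBlurredImages()

    dispatch_async(dispatch_get_main_queue()) {
        // Hand the finished UIImages back to the main thread for display.
        self.blurredImages = images
        self.firstImageView.image = images.first
    }
}
```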

With Core Image, be careful not to create a CIFilter on the main thread and then use it on a background thread. Create all the objects you need on the background thread and run the blurring there.
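In practice that means moving the whole setup into the background block, not just the filter run. A sketch, reusing the names from the Core Image snippet above:

```swift
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)) {
    // Create the context, image and filter on this background thread...
    let context = CIContext(options: nil)
    let image = CIImage(image: originalImage)
    let filter = CIFilter(name: "CIGaussianBlur")
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(radius, forKey: "inputRadius")

    // ...and run the blur on the same thread.
    let cgImage = context.createCGImage(filter.outputImage, fromRect: image.extent())
    let blurredImage = UIImage(CGImage: cgImage)!
}
```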

GPUImage is not thread-safe. It may seem to work, but we've seen strange, hard-to-reproduce bugs appear, so be cautious.

Share your experiences with us. We've created a simple demo app written in Swift; you can download it here. Try changing the number of stages to see how it looks and how long preprocessing takes. We've also included the benchmark class used to measure the frame rates of the different blurring methods.