Apple provides a few ways to convert images to grayscale or apply a grayscale effect to UIViews, but their results (and their performance) vary. Before picking one, decide what you want to use it for and how it should behave.

TL;DR

The three ways of applying grayscale to images in Swift that I cover in this article are:

  • CIFilter, which preserves picture quality but takes a while,
  • CoreGraphics, which takes less time and can be applied to multiple images at once, but turns transparent areas black,
  • simply adding the grayscale modifier to a SwiftUI View.

Want details? Follow me!

The first way - CIFilter

When I think about converting an image to grayscale, photo filters immediately come to mind. There are plenty of apps that let you apply filters to images, and I think that's also the most common use of grayscale in Swift.

In that case, I’d use CIFilter, which was created precisely for adding effects and filters to images. CIFilter offers a filter named CIPhotoEffectNoir, the same black-and-white effect used in the iOS Photos app.

There are also two similar effects that can be applied to images:

  • CIPhotoEffectMono,
  • and CIPhotoEffectTonal.

So if you simply want to add grayscale to your image without losing image quality, just apply this function:

import CoreImage
import UIKit

func grayscale(image: UIImage) -> UIImage? {
    let context = CIContext(options: nil)
    // CIPhotoEffectNoir is the black-and-white effect known from the Photos app.
    if let filter = CIFilter(name: "CIPhotoEffectNoir") {
        filter.setValue(CIImage(image: image), forKey: kCIInputImageKey)
        if let output = filter.outputImage,
           let cgImage = context.createCGImage(output, from: output.extent) {
            // Keep the original scale and orientation so the result matches the input.
            return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
        }
    }
    return nil
}
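
For context, here is a quick usage sketch (the asset name and the image view are assumptions of mine, not part of the original code). Swapping the name string passed to CIFilter(name:) for "CIPhotoEffectMono" or "CIPhotoEffectTonal" gives the other two effects:

// Hypothetical call site - "portrait" is assumed to exist in the asset catalog.
if let original = UIImage(named: "portrait") {
    imageView.image = grayscale(image: original)   // imageView is an assumed UIImageView
}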

Although this effect works like a charm, as all CIFilters do, the problem is timing. Processing an image with CIPhotoEffectNoir (or the other filters) can take a couple of seconds, because Core Image preserves the full picture quality. So if you want to convert more than one image at once, it will take a while.
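
If that delay would block the UI, one common mitigation (a sketch of my own, not part of the original article) is to run the conversion off the main thread and hand the result back on the main queue:

func grayscaleInBackground(image: UIImage, completion: @escaping (UIImage?) -> Void) {
    // Do the expensive Core Image work on a background queue...
    DispatchQueue.global(qos: .userInitiated).async {
        let result = grayscale(image: image)
        // ...and deliver the result on the main queue, where UI updates belong.
        DispatchQueue.main.async {
            completion(result)
        }
    }
}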

The second way - CGColorSpaceCreateDeviceGray

Another option, which takes less time and can be applied to more than one image at a time, is CoreGraphics.

Let’s take a look at our function:

func convertToGrayScale(image: UIImage) -> UIImage? {
    let imageRect = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
    // A single-channel grayscale color space with no alpha channel.
    let colorSpace = CGColorSpaceCreateDeviceGray()
    let width = image.size.width
    let height = image.size.height
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)
    let context = CGContext(data: nil,
                            width: Int(width),
                            height: Int(height),
                            bitsPerComponent: 8,
                            bytesPerRow: 0,
                            space: colorSpace,
                            bitmapInfo: bitmapInfo.rawValue)
    if let cgImg = image.cgImage, let context = context {
        // Drawing into the grayscale context performs the conversion.
        context.draw(cgImg, in: imageRect)
        if let grayImage = context.makeImage() {
            return UIImage(cgImage: grayImage)
        }
    }
    // Return nil instead of an empty UIImage if the conversion fails.
    return nil
}
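
Since this approach is fast enough for batches, here is a minimal usage sketch (the images array of UIImage values is an assumption of mine):

// Hypothetical batch conversion - `images` is assumed to be [UIImage].
let grayImages = images.compactMap { convertToGrayScale(image: $0) }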

It works much faster, but it also has its limitations. CGImageAlphaInfo.none cannot represent transparency, so images that contain transparent areas (for example most PNGs) lose it: transparent backgrounds turn black.

This could also be handled by replacing colors, but in this tutorial I’d rather focus only on grayscale.
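
That said, if all you need is to avoid black backgrounds, one simple workaround (my own addition, not part of the original approach) is to flatten the image onto an opaque white background before converting it:

// Draws the image over an opaque white background, then grayscales the result.
func grayscaleOnWhite(image: UIImage) -> UIImage? {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    let flattened = renderer.image { context in
        UIColor.white.setFill()
        context.fill(CGRect(origin: .zero, size: image.size))
        image.draw(at: .zero)
    }
    return convertToGrayScale(image: flattened)
}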

The third way - SwiftUI

At last, I’ll show you the simplest way to apply grayscale. This one works only with SwiftUI, so you can’t use it in older, UIKit-only apps. You simply add the grayscale modifier to a View:

func grayscale(_ amount: Double) -> some View

where "amount" is the intensity of grayscale you want to apply on a view. This option is very easy to implement, but you must remember that it also doesn’t work as fast as transforming image to a `CGColorSpace` object.

Wrapping up

Now, that's it. I hope that thanks to this tutorial, the grayscale image conversion in Swift is no longer rocket science or black magic to you!