Tags: swift, xcode, core-image

Swift Vision API Person Segmentation Not Masking Person


I've been trying to learn Apple's Vision API to segment people out of a photo. The issue I'm having is that the background mask image completely replaces my "Selfie" image. The selfie image is clear and isn't blurry, so I don't think image quality is the problem. This is the same implementation I see in this video: https://developer.apple.com/videos/play/wwdc2021/10040/

Can anyone check out my implementation and let me know if I'm missing something? My expectation is that this "space" image will replace my selfie's background.

import UIKit
import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

class ImageSegmentationVC: UIViewController {
    
    let backgroundImageView: UIImageView = {
        let imageView = UIImageView(image: UIImage(named: "Selfie"))
        imageView.translatesAutoresizingMaskIntoConstraints = false
        imageView.contentMode = .scaleAspectFit
        return imageView
    }()
    
    // The Vision person segmentation request
    private var segmentationRequest = VNGeneratePersonSegmentationRequest()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        
        view.addSubview(backgroundImageView)
        backgroundImageView.topAnchor.constraint(equalTo: view.topAnchor).isActive = true
        backgroundImageView.leadingAnchor.constraint(equalTo: view.leadingAnchor).isActive = true
        backgroundImageView.trailingAnchor.constraint(equalTo: view.trailingAnchor).isActive = true
        backgroundImageView.bottomAnchor.constraint(equalTo: view.bottomAnchor).isActive = true
        
        guard let backgroundImage = backgroundImageView.image else { return }
        initializeRequests(for: backgroundImage)
        
    }
    
    private func initializeRequests(for image: UIImage) {
        // Update segmentation properties
        segmentationRequest.qualityLevel = .accurate
        segmentationRequest.outputPixelFormat = kCVPixelFormatType_OneComponent8
        
        if let image = image.cgImage {
            generatePhoto(backgroundImage: image)
        }
    }
    
    func generatePhoto(backgroundImage: CGImage) {
        // 2. Create the request handler and perform the segmentation request
        let requestHandler = VNImageRequestHandler(cgImage: backgroundImage, options: [:])
        
        try? requestHandler.perform([segmentationRequest])
        
        guard let maskPixelBuffer = segmentationRequest.results?.first?.pixelBuffer else {
            return
        }
        
        // Composite the person over the new background using the mask
        applyingMask(buffer: maskPixelBuffer)
    }
    
    func applyingMask(buffer: CVPixelBuffer) {
        guard let cgImage = backgroundImageView.image?.cgImage,
              let backgroundUIImage = UIImage(named: "starfield"),
              let background = CIImage(image: backgroundUIImage) else { return }
        
        let input = CIImage(cgImage: cgImage)
        let mask = CIImage(cvPixelBuffer: buffer)
        
        // Scale the mask up to the input image's dimensions
        let maskScaleX = input.extent.width / mask.extent.width
        let maskScaleY = input.extent.height / mask.extent.height
        let maskScaled = mask.transformed(by: CGAffineTransform(scaleX: maskScaleX, y: maskScaleY))
        
        // Scale the replacement background to the input image's dimensions
        let backgroundScaleX = input.extent.width / background.extent.width
        let backgroundScaleY = input.extent.height / background.extent.height
        let backgroundScaled = background.transformed(by: CGAffineTransform(scaleX: backgroundScaleX, y: backgroundScaleY))
        
        // Blend: where the mask's red channel is high, the input (person) shows;
        // elsewhere, the scaled background shows through
        let blendFilter = CIFilter.blendWithRedMask()
        blendFilter.inputImage = input
        blendFilter.backgroundImage = backgroundScaled
        blendFilter.maskImage = maskScaled
        
        guard let blendedImage = blendFilter.outputImage else { return }
        backgroundImageView.image = UIImage(ciImage: blendedImage)
    }
}

This is what it looks like currently: [screenshot: the background image completely replaces the selfie]
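
A side note on displaying the result: a UIImage created with UIImage(ciImage:) is not backed by a bitmap, and UIImageView can render it inconsistently. If you see blank or distorted output, one thing worth trying (a minimal sketch of an alternative to the last line of applyingMask(buffer:), not part of the original code) is rendering the blended CIImage into a CGImage through a CIContext first:

// Sketch: render to a bitmap-backed UIImage before assigning to the image view.
// Creating a CIContext is expensive; real code should reuse a single instance.
let context = CIContext()
if let blended = blendFilter.outputImage,
   let cgImage = context.createCGImage(blended, from: blended.extent) {
    backgroundImageView.image = UIImage(cgImage: cgImage)
}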


Solution

  • The solution I have above is correct. The reason it didn't work is that I was running it on the Simulator; when I tested on an actual device, it worked. I'm not sure why the Simulator didn't work. I'm currently running an Intel 2018 MacBook Pro.
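
  • For anyone who hits the same wall on the Simulator: the try? in generatePhoto(backgroundImage:) swallows any error, so before assuming the blend is wrong, it can help to surface failures and sanity-check the mask. A minimal diagnostic sketch (the do/catch and print statements are debugging additions, not part of the working code):

do {
    try requestHandler.perform([segmentationRequest])
} catch {
    // On the Simulator, Vision requests can fail where they succeed on device.
    print("Person segmentation failed: \(error)")
    return
}

if let mask = segmentationRequest.results?.first?.pixelBuffer {
    // Sanity-check the mask dimensions before scaling and blending.
    print("Mask size: \(CVPixelBufferGetWidth(mask)) x \(CVPixelBufferGetHeight(mask))")
} else {
    print("No segmentation mask returned")
}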