
Quick Tip: CameraKit Output -> UIImage

aidan Posts: 32 🔥

Hi y'all,

A useful addition to CameraKit is grabbing the AR view's output and converting it into a UIImage: imagine you want to build an image gallery, generate a GIF, or run some other image operation. Following the same protocol structure as PhotoCapture and Record, here's how to get an image out of CameraKit.

Create a new class somewhere as follows:

class CameraKitOutputToImage: NSObject, Output, OutputRequiringPixelBuffer, SCCameraKitOutputRequiringPixelBufferDelegate {

    // Opting in here is what makes CameraKit deliver pixel buffers to
    // didOutputVideoSampleBuffer below.
    var currentlyRequiresPixelBuffer: Bool

    var delegate: SCCameraKitOutputRequiringPixelBufferDelegate?

    func outputChangedRequirements(_ output: OutputRequiringPixelBuffer) {
        // No-op: this output's requirements never change.
    }

    func cameraKit(_ cameraKit: CameraKitProtocol, didOutputTexture texture: Texture) {
        // Not needed: we work with sample buffers instead of textures.
    }

    func cameraKit(_ cameraKit: CameraKitProtocol, didOutputVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // Called once per processed frame; convert to UIImage here.
        let image = sampleBuffer.image()
        // Do something with `image` (display it, append it to a GIF, etc.).
    }

    func cameraKit(_ cameraKit: CameraKitProtocol, didOutputAudioSampleBuffer sampleBuffer: CMSampleBuffer) {
        // Audio isn't needed for image capture.
    }

    override init() {
        self.currentlyRequiresPixelBuffer = true
        super.init()
    }
}

The gist is that we're creating a new output with currentlyRequiresPixelBuffer set to true, which means we'll receive each frame as a CMSampleBuffer in func cameraKit(_ cameraKit: CameraKitProtocol, didOutputVideoSampleBuffer sampleBuffer: CMSampleBuffer).

We then need to convert the CMSampleBuffer to a UIImage, which we can do with an extension on CMSampleBuffer as follows:

extension CMSampleBuffer {
    /// Wraps the pixel buffer in a CIImage-backed UIImage. Cheap, but some
    /// APIs (e.g. JPEG encoding) may not work with CIImage-backed images.
    /// https://stackoverflow.com/questions/15726761/make-an-uiimage-from-a-cmsamplebuffer
    func image(orientation: UIImage.Orientation = .up, scale: CGFloat = 1.0) -> UIImage? {
        if let buffer = CMSampleBufferGetImageBuffer(self) {
            let ciImage = CIImage(cvPixelBuffer: buffer)

            return UIImage(ciImage: ciImage, scale: scale, orientation: orientation)
        }

        return nil
    }

    /// Renders the pixel buffer into a CGImage-backed UIImage. Slower, but
    /// safe to encode or save. Note: creating a CIContext is expensive, so
    /// consider caching one if you call this on every frame.
    func imageWithCGImage(orientation: UIImage.Orientation = .up, scale: CGFloat = 1.0) -> UIImage? {
        if let buffer = CMSampleBufferGetImageBuffer(self) {
            let ciImage = CIImage(cvPixelBuffer: buffer)

            let context = CIContext(options: nil)

            guard let cg = context.createCGImage(ciImage, from: ciImage.extent) else {
                return nil
            }

            return UIImage(cgImage: cg, scale: scale, orientation: orientation)
        }

        return nil
    }
}
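To make the difference between the two variants concrete, here's a quick sketch of how you might use them from didOutputVideoSampleBuffer. This is an illustration, not part of the post's class: `sampleBuffer` is assumed to come from that callback, and `imageView` is a hypothetical UIImageView you own.

```swift
import UIKit
import CoreMedia

// Sketch: `sampleBuffer` arrives via didOutputVideoSampleBuffer.
func consume(_ sampleBuffer: CMSampleBuffer, imageView: UIImageView) {
    // CIImage-backed: cheap, fine for on-screen display.
    if let quick = sampleBuffer.image() {
        DispatchQueue.main.async { imageView.image = quick }
    }

    // CGImage-backed: use this variant when you need to encode or persist
    // the frame, since a CIImage-backed UIImage may fail to encode.
    if let rendered = sampleBuffer.imageWithCGImage(orientation: .right),
       let jpeg = rendered.jpegData(compressionQuality: 0.8) {
        let url = URL(fileURLWithPath: NSTemporaryDirectory())
            .appendingPathComponent("frame.jpg")
        try? jpeg.write(to: url)
    }
}
```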

Then we use cameraKit's add(output:) to add our new class as an output.

let cameraFrame = CameraKitOutputToImage()
cameraKit.add(output: cameraFrame)

With everything together, you should start receiving UIImages to do whatever you want with! Hope this helps.
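As one concrete use of those frames, here's a hedged sketch of saving roughly one frame per second to the photo library. The throttling logic and the FrameSaver type are my own additions, not Camera Kit API; UIImageWriteToSavedPhotosAlbum is standard UIKit and requires an NSPhotoLibraryAddUsageDescription entry in Info.plist.

```swift
import UIKit
import CoreMedia

// Sketch only: call handle(_:) from didOutputVideoSampleBuffer in the
// CameraKitOutputToImage class above.
final class FrameSaver {
    private var lastCapture = Date.distantPast

    func handle(_ sampleBuffer: CMSampleBuffer) {
        // The camera delivers ~30-60 buffers per second; only keep one per second.
        guard Date().timeIntervalSince(lastCapture) >= 1.0 else { return }
        lastCapture = Date()

        // Use the CGImage-backed variant: saving needs a renderable image,
        // and a CIImage-backed UIImage may not save correctly.
        guard let image = sampleBuffer.imageWithCGImage() else { return }
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}
```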

Comments

  • stevenxu
    stevenxu Posts: 612 👻

    @aidan you are a godsend...! On behalf of the current community & future (when we're out of closed beta), thank you!
