Hi,
My end goal is to convert the CMSampleBuffer within func frameAvailable(_ sampleBuffer: CMSampleBuffer!) to a CGImage.
I’m initializing DeepAR with:
let cgFrame = NSRectToCGRect(NSScreen.main!.frame)
self.arView = self.deepAR.initializeView(withFrame: cgFrame)
self.deepAR.startCapture(withOutputWidth: 1280, outputHeight: 720, subframe: cgFrame)
self.deepAR.delegate = self
captureOutput only enqueues the sampleBuffer. At this point, I can read the video sampleBuffer fine:
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // Hand the raw camera frame to DeepAR for processing.
    self.deepAR.enqueueCameraFrame(sampleBuffer, mirror: true)
}
frameAvailable doesn't return any processed video feed. The sampleBuffer isn't nil, but there's nothing to convert to an image:
func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
    // context is a CIContext property on this class.
    let ciImage = CIImage(cvImageBuffer: sampleBuffer.imageBuffer!)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
    DispatchQueue.main.async { [unowned self] in
        self.frame = cgImage
    }
}
Any tips on how I can get the processed feed to convert to an image would be appreciated.
Thanks,
Jon
jelena
March 8, 2023, 7:18pm
Hi, I think the problem here is just an invalid subframe size. It should be in the [0-1] range. Example from our GitHub:
self.deepAr.startCapture(withOutputWidth: 720, outputHeight: 1280, subframe: CGRect(x: 0.0, y: 0.0, width: 1.0, height: 1.0))
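Applied to your setup, the call would look something like this (a sketch; the view frame keeps screen coordinates, only the subframe is normalized):
let cgFrame = NSRectToCGRect(NSScreen.main!.frame)
self.arView = self.deepAR.initializeView(withFrame: cgFrame)
// The subframe is a normalized rect, not pixel coordinates.
self.deepAR.startCapture(withOutputWidth: 1280,
                         outputHeight: 720,
                         subframe: CGRect(x: 0.0, y: 0.0, width: 1.0, height: 1.0))
self.deepAR.delegate = self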
Thanks for the response. That got me closer.
In frameAvailable, there is still no feed in the CIImage, but the properties look correct now (image attached from debugger). I'm not seeing any errors or output, so I'm at a loss. I'm wondering whether I'm processing the sampleBuffer incorrectly? I'm not convinced the videoBuffer is being dropped, but it's a bit of a black box.
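As a sanity check, I can dump the buffer's basic properties with a rough helper like this (logBufferInfo is just my own sketch) to confirm whether the buffer actually carries pixels:
// Rough debugging sketch: log the pixel buffer's size and format.
func logBufferInfo(_ sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        print("no image buffer attached")
        return
    }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let format = CVPixelBufferGetPixelFormatType(pixelBuffer)
    print("pixel buffer: \(width)x\(height), format \(format)")
}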
Thanks for the help,
Jon
jelena
March 9, 2023, 4:21pm
Have you checked this code for reference? We convert buffers to CIImage in this example:
//
// ViewController.m
// videoprocessing-ios-objc
//
// Created by Matej Trbara on 29/08/2020.
// Copyright © 2020 Kodbiro. All rights reserved.
//
#import "ViewController.h"
#import <DeepAR/ARView.h>
#import <DeepAR/CameraController.h>
#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>
#import <MobileCoreServices/MobileCoreServices.h>
#import "OffscreenProcessingViewController.h"
#define USE_EXTERNAL_CAMERA 0
#define ITEMS_PER_ROW ((CGFloat)3)
#define SECTION_INSETS (UIEdgeInsetsMake(20.0, 20.0, 20.0, 20.0))
This file has been truncated.
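In Swift terms, the pattern that example follows is roughly this (a sketch of the standard Core Image conversion, not the exact code from the truncated file):
import CoreImage
import CoreVideo

// Reuse one CIContext; creating one per frame is expensive.
let ciContext = CIContext()

// Standard CVPixelBuffer -> CGImage conversion via Core Image.
func cgImage(from pixelBuffer: CVPixelBuffer) -> CGImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    return ciContext.createCGImage(ciImage, from: ciImage.extent)
}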
Thanks for sharing. I can't get this to work with the macOS GitHub sample you provide either, but it seems to work fine on iOS.
I’ll keep exploring other options.
Jon