Hi guys,
How can I achieve the following behavior on macOS when a USB device (camera/mic/speaker) is connected:
When a third-party video conferencing app is not in a meeting, ensure the app defaults to the USB device (camera/mic/speaker).
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (camera/mic/speaker).
I want to use an IOKit extension to hide or ignore the built-in camera to meet this requirement.
However, the extension can't be loaded on macOS 15.4.1 (build 24E263) due to invalid permissions. I also tried it on macOS 14.4.1, where it loads, but it doesn't load automatically after a restart because the KDK version doesn't match.
Could you please give me some suggestions? Is it possible to hide the built-in camera on M-series Macs? Is there any other way to achieve this? Thanks a lot.
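One hedged direction worth exploring before hiding hardware: on macOS 13 and later, AVCaptureDevice exposes a settable userPreferredCamera, and apps that adopt systemPreferredCamera (as Continuity Camera-aware apps do) follow it automatically. Also note that CMIO camera extensions are the supported replacement for camera kernel/DriverKit extensions on current macOS. Below is a minimal sketch of nudging the preference toward the USB camera; the class name is illustrative, and it assumes the conferencing app honors the system preference (apps that pick devices manually will not follow it).

import AVFoundation

// Minimal sketch: prefer the USB camera whenever one is connected.
// Assumes the target apps adopt AVCaptureDevice.systemPreferredCamera /
// userPreferredCamera (macOS 13+).
final class ExternalCameraPreference {
    // .external requires macOS 14; on macOS 13 use .externalUnknown instead.
    private let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external, .builtInWideAngleCamera],
        mediaType: .video,
        position: .unspecified)
    private var observation: NSKeyValueObservation?

    func start() {
        // The devices array is KVO-observable; this fires on USB plug/unplug.
        observation = discovery.observe(\.devices, options: [.initial, .new]) { session, _ in
            if let usbCamera = session.devices.first(where: { $0.deviceType == .external }) {
                // Nudge the system-wide preference toward the USB camera.
                AVCaptureDevice.userPreferredCamera = usbCamera
            }
        }
    }
}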
Continuity Camera
Support automatic camera switching and high-quality, high-resolution photo capture in your macOS app when iPhone is used as a camera for Mac.
Posts under Continuity Camera tag
A functioning Multiplatform app, which includes use of Continuity Camera on an M1 Mac mini running Sequoia 15.5, works correctly, capturing photos with AVCapturePhoto. However, that app (and a test app just for Continuity Camera) crashes at the delegate callback when run on a 2017 MacBook Pro under macOS 13.7.5. The app was created with Xcode 16 (various releases) using Swift 6 (I also tried Swift 5). Compiling and running the test app with Xcode 15.2 on the 13.7.5 machine also crashes at the delegate callback.
The iPhone 15 Continuity Camera gets detected and set up correctly, and preview video works correctly. It's when the capturePhoto code runs that the crash occurs.
The relevant capture code is:
func capturePhoto() {
    let captureSettings = AVCapturePhotoSettings()
    captureSettings.flashMode = .auto
    photoOutput.maxPhotoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: captureSettings, delegate: PhotoDelegate.shared)
    print("**** CameraManager: capturePhoto")
}
and the delegate callbacks are:
class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    nonisolated(unsafe) static let shared = PhotoDelegate()

    // MARK: - Delegate callbacks
    func photoOutput(
        _ output: AVCapturePhotoOutput,
        didFinishProcessingPhoto photo: AVCapturePhoto,
        error: (any Error)?
    ) {
        print("**** CameraManager: didFinishProcessingPhoto")
        guard let pData = photo.fileDataRepresentation() else {
            print("**** photoOutput is empty")
            return
        }
        print("**** photoOutput data is \(pData.count) bytes")
    }

    func photoOutput(
        _ output: AVCapturePhotoOutput,
        willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings
    ) {
        print("**** CameraManager: willBeginCaptureFor")
    }

    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        print("**** CameraManager: willCapturePhotoFor")
    }
}
The significant parts of the crash report are:
Crashed Thread: 3 Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000000
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [30850]
VM Region Info: 0 is not in any region. Bytes before following region: 4296495104
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
UNUSED SPACE AT START
--->
__TEXT 100175000-10017f000 [ 40K] r-x/r-x SM=COW ...tinuityCamera
Thread 0:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x7ff803aed552 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x7ff803afb6cd mach_msg2_internal + 78
2 libsystem_kernel.dylib 0x7ff803af4584 mach_msg_overwrite + 692
3 libsystem_kernel.dylib 0x7ff803aed83a mach_msg + 19
4 CoreFoundation 0x7ff803c07f8f __CFRunLoopServiceMachPort + 145
5 CoreFoundation 0x7ff803c06a10 __CFRunLoopRun + 1365
6 CoreFoundation 0x7ff803c05e51 CFRunLoopRunSpecific + 560
7 HIToolbox 0x7ff80d694f3d RunCurrentEventLoopInMode + 292
8 HIToolbox 0x7ff80d694d4e ReceiveNextEventCommon + 657
9 HIToolbox 0x7ff80d694aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64
10 AppKit 0x7ff806ca59d8 _DPSNextEvent + 858
11 AppKit 0x7ff806ca4882 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
12 AppKit 0x7ff806c96ef7 -[NSApplication run] + 586
13 AppKit 0x7ff806c6b111 NSApplicationMain + 817
14 SwiftUI 0x7ff90e03a9fb 0x7ff90dfb4000 + 551419
15 SwiftUI 0x7ff90f0778b4 0x7ff90dfb4000 + 17578164
16 SwiftUI 0x7ff90e9906cf 0x7ff90dfb4000 + 10340047
17 ContinuityCamera 0x10017b49e 0x100175000 + 25758
18 dyld 0x7ff8037d1418 start + 1896
Thread 1:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0
Thread 2:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0
Thread 3 Crashed:: Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
0 ??? 0x0 ???
1 AVFCapture 0x7ff82045996c StreamAsyncStillCaptureCallback + 61
2 CoreMediaIO 0x7ff813a4358f __94-[CMIOExtensionProviderHostContext captureAsyncStillImageWithStreamID:uniqueID:options:reply:]_block_invoke + 498
3 libxpc.dylib 0x7ff803875b33 _xpc_connection_reply_callout + 36
4 libxpc.dylib 0x7ff803875ab2 _xpc_connection_call_reply_async + 69
5 libdispatch.dylib 0x7ff80398b099 _dispatch_client_callout3 + 8
6 libdispatch.dylib 0x7ff8039a6795 _dispatch_mach_msg_async_reply_invoke + 387
7 libdispatch.dylib 0x7ff803991088 _dispatch_lane_serial_drain + 393
8 libdispatch.dylib 0x7ff803991d6c _dispatch_lane_invoke + 417
9 libdispatch.dylib 0x7ff80399c3fc _dispatch_workloop_worker_thread + 765
10 libsystem_pthread.dylib 0x7ff803b28c55 _pthread_wqthread + 327
11 libsystem_pthread.dylib 0x7ff803b27bbf start_wqthread + 15
Of course, the MacBook Pro is an old device, but Continuity Camera works with the installed Photo Booth app, so photo capture should be possible.
Any thoughts on solving this situation would be appreciated.
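One possible fallback on the older OS, assuming the null-pointer callback is specific to AVCapturePhotoOutput's XPC reply path on macOS 13, is to grab a single frame from an AVCaptureVideoDataOutput attached to the same session instead of calling capturePhoto(with:delegate:). A minimal sketch under that assumption; the FrameGrabber type and wiring are illustrative, not a confirmed fix:

import AVFoundation
import CoreMedia

// Sketch: capture one frame from the video feed instead of AVCapturePhotoOutput.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let output = AVCaptureVideoDataOutput()
    private var handler: ((CVPixelBuffer) -> Void)?

    // Attach to the same session that drives the preview.
    func attach(to session: AVCaptureSession) {
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frame.grab"))
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    // Deliver the next frame that arrives, then stop.
    func grabNextFrame(_ completion: @escaping (CVPixelBuffer) -> Void) {
        handler = completion
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let deliver = handler,
              let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        handler = nil
        deliver(buffer)
    }
}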
Regards, Michaela
I'm developing a macOS application using Swift and a camera extension. I'm utilizing the Vision framework's VNGeneratePersonSegmentationRequest to apply a background blur effect. However, I'm experiencing significant lag in the video feed. I've tried optimizing the request, but the issue persists. Could anyone provide insights or suggestions on how to resolve this lagging issue?
Details:
Platform: macOS
Language: Swift
Framework: Vision
The code snippet I am using is below:
import AVFoundation
import AppKit
import CoreImage.CIFilterBuiltins
import CoreMedia
import Vision

class ViewController: NSViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var frameCounter = 0
    let frameSkipRate = 2
    private let visionQueue = DispatchQueue(label: "com.example.visionQueue")
    // Reuse one CIContext for all frames; creating a context per frame is expensive.
    private let context = CIContext()
    // Streaming state shared with the camera extension (simplified declarations for context):
    var needToStream = false
    var enqueued = false
    var readyToEnqueue = false
    var sinkQueue: CMSimpleQueue?

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        frameCounter += 1
        // Process every other frame to reduce load.
        if frameCounter % frameSkipRate != 0 {
            return
        }
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        performPersonSegmentation(on: ciImage) { [self] mask in
            guard let mask = mask else { return }
            let blurredBackground = self.applyBlur(to: ciImage)
            let resultImage = self.composeImage(with: blurredBackground, mask: mask, original: ciImage)
            let nsImage = ciImageToNSImage(ciImage: resultImage)
            DispatchQueue.main.async { [self] in
                // Update the NSImageView or other UI elements with the composite image.
                if needToStream {
                    if (enqueued == false || readyToEnqueue == true), let queue = self.sinkQueue {
                        enqueued = true
                        readyToEnqueue = false
                        if let cgImage = nsImage.cgImage(forProposedRect: nil, context: nil, hints: nil) {
                            // enqueue(_:_:) hands the frame to the extension's sink (defined elsewhere).
                            enqueue(queue, cgImage)
                        }
                    }
                }
            }
        }
    }

    private func performPersonSegmentation(on image: CIImage, completion: @escaping (CIImage?) -> Void) {
        let request = VNGeneratePersonSegmentationRequest()
        request.qualityLevel = .fast // Adjust quality level as needed.
        request.outputPixelFormat = kCVPixelFormatType_OneComponent8
        let handler = VNImageRequestHandler(ciImage: image, options: [:])
        visionQueue.async {
            do {
                try handler.perform([request])
                guard let result = request.results?.first as? VNPixelBufferObservation else {
                    completion(nil)
                    return
                }
                let maskImage = CIImage(cvPixelBuffer: result.pixelBuffer)
                completion(maskImage)
            } catch {
                print("Error performing segmentation: \(error)")
                completion(nil)
            }
        }
    }

    private func composeImage(with blurredBackground: CIImage, mask: CIImage, original: CIImage) -> CIImage {
        // Invert the mask so the person stays sharp and the background gets the blur.
        let invertedMask = mask.applyingFilter("CIColorInvert")
        // Resize the mask to match the original image.
        let resizedMask = invertedMask.transformed(by: CGAffineTransform(
            scaleX: original.extent.width / invertedMask.extent.width,
            y: original.extent.height / invertedMask.extent.height))
        // Blend the images using the mask.
        let blendFilter = CIFilter(name: "CIBlendWithMask")!
        blendFilter.setValue(blurredBackground, forKey: kCIInputImageKey)
        blendFilter.setValue(original, forKey: kCIInputBackgroundImageKey)
        blendFilter.setValue(resizedMask, forKey: kCIInputMaskImageKey)
        return blendFilter.outputImage ?? original
    }

    private func ciImageToNSImage(ciImage: CIImage) -> NSImage {
        let cgImage = context.createCGImage(ciImage, from: ciImage.extent)!
        return NSImage(cgImage: cgImage, size: ciImage.extent.size)
    }

    private func applyBlur(to image: CIImage) -> CIImage {
        let blurFilter = CIFilter.gaussianBlur()
        blurFilter.inputImage = image
        blurFilter.radius = 7.0 // Adjust the blur radius as needed.
        return blurFilter.outputImage ?? image
    }
}
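A few things that typically reduce this kind of lag: reuse a single CIContext across frames (as in the snippet above), try request.qualityLevel = .balanced, run segmentation on a downscaled copy of the frame, and avoid the per-frame CIImage -> CGImage -> NSImage -> CGImage round-trip. Below is a minimal sketch of the last point, assuming the sink can accept CVPixelBuffers; the FrameRenderer name and the BGRA/IOSurface attributes are illustrative choices, not from the original post.

import CoreImage
import CoreVideo

// Sketch: render the composited CIImage straight into a pooled pixel buffer
// instead of converting through CGImage/NSImage on every frame.
final class FrameRenderer {
    private let context = CIContext() // one shared, reusable context
    private var pool: CVPixelBufferPool?

    func render(_ image: CIImage) -> CVPixelBuffer? {
        let width = Int(image.extent.width), height = Int(image.extent.height)
        if pool == nil {
            // IOSurface-backed BGRA buffers are broadly compatible with CMIO sinks.
            let attrs: [String: Any] = [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String: width,
                kCVPixelBufferHeightKey as String: height,
                kCVPixelBufferIOSurfacePropertiesKey as String: [:]
            ]
            CVPixelBufferPoolCreate(nil, nil, attrs as CFDictionary, &pool)
        }
        guard let pool else { return nil }
        var buffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
        guard let buffer else { return nil }
        context.render(image, to: buffer)
        return buffer
    }
}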