CameraKit crashes if on audio call (iOS)
Tested with Discord and FaceTime - if I have an active call going, it messes with the AVCaptureSession/ARKit somehow and causes CameraKit to fail, crashing the whole app.

If I set `AVSessionInput.audioEnabled` to `false` it at least doesn't crash anymore:

```swift
avSession = AVSessionInput(session: self.captureSession)
avSession!.audioEnabled = false
```
BUT the camera still freezes up due to something wrong with the ARSession:
```
2023-04-08 04:00:13.437839-0400 Gotcha[1259:72917] [Session] ARSession <0x13f250d00>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x2808bcc00 {Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2809c8b40 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}
```
Only happens when on call! Very strange behavior.
Comments
Update: The error doesn't seem to have anything to do with microphone or video usage in other apps (I disabled their permissions).

Now my hunch is that the error is a conflict with `AVAudioSession` - no matter what audio settings are set in advance, CameraKit will shut down background audio, which makes me think it collides with audio calls. Is it possible that CameraKit is overriding `AVAudioSession` settings?

FYI I've now tested with FaceTime and Discord audio calling and both cause CameraKit to fail.

iPhone 12 Pro running iOS 16.4.1
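In case it's useful, this is roughly what I mean by audio settings being set in advance (a simplified, illustrative sketch rather than my exact code):

```swift
import AVFoundation

// Simplified sketch: configure the shared audio session to mix with other
// audio (e.g. an active FaceTime/Discord call) before CameraKit starts.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord,
                                 mode: .videoChat,
                                 options: [.mixWithOthers, .allowBluetooth])
    try audioSession.setActive(true)
} catch {
    print("Failed to configure AVAudioSession: \(error)")
}
// Even with this in place, background audio stops as soon as CameraKit starts,
// which is why I suspect CameraKit is overriding the category behind the scenes.
```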
FYI this is an important use case for my app, as I'm making Lens games streamable.

Here's a little more so you can see what's going on: the camera loads, then the Lens loads up (the Lens has world tracking functionality), but after ~0.1s the entire view freezes. It can still be dismissed, though. Another potentially helpful detail: when the view occasionally loads without displaying the Lens (just the camera feed), it does not freeze up and the camera feed continues to stream in.
Update #2: Confirming this error does NOT occur within a face-tracking Lens (back or front camera doesn't matter), so it seems to be a core issue with how ARKit is handled inside CameraKit. I have a suspicion that the `ARSessionInput(session: ARSession())` provided to `CameraKit.start()` is not being used, as I also don't receive any events in my `ARSessionDelegate`, so it's potentially ignoring my attempt to use `providesAudioData = false`, which could be the reason it's failing.
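For reference, this is roughly the AR setup I'm handing to CameraKit (simplified sketch; whether CameraKit actually uses this session and configuration is exactly what's in question):

```swift
import ARKit

// Simplified sketch of what I'm passing in. Whether CameraKit honors this
// session/configuration (and its providesAudioData setting) is the open question.
let arSession = ARSession()

let configuration = ARWorldTrackingConfiguration()
configuration.providesAudioData = false  // I don't want ARKit capturing audio

arSession.run(configuration)

let arInput = ARSessionInput(session: arSession)
// arInput is then passed to CameraKit.start(...) alongside the AVSessionInput.
```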
TL;DR: background audio call + CameraKit + Lens with world tracking = CRASH!
Hi @aidan, thank you for reporting this and your patience while we looked into it. It may be that the compass, microphone, or some other sensor (used by ARKit in this case) is unavailable during a call; hence the message: "Code=102 'Required sensor failed'". (Well, maybe "sensor is unavailable" would have been more apt.)
To handle this case more generally, you might handle the interruption: stop your `ARSessionInput` and `AVSessionInput` when a user gets a call, and start them again once the interruption has ended. As a rough example, you might implement the handling of interruptions along these lines:

```swift
NotificationCenter.default.addObserver(self,
                                       selector: #selector(wasInterrupted),
                                       name: .AVCaptureSessionWasInterrupted,
                                       object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(interruptionEnded),
                                       name: .AVCaptureSessionInterruptionEnded,
                                       object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(interruptionEnded),
                                       name: UIApplication.didBecomeActiveNotification,
                                       object: nil)
```
Or whatever cases you wish to handle, and then stop and start your inputs:
```swift
@objc func wasInterrupted() {
    cameraController?.arInput.stopRunning()
    cameraController?.cameraInput?.stopRunning()
}

@objc func interruptionEnded() {
    if cameraController?.arInput.isRunning == false {
        cameraController?.arInput.startRunning()
        cameraController?.cameraInput?.startRunning()
    }
}
```
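One possible refinement (a rough, untested sketch rather than something we've verified): inspect the interruption reason so you only pause the inputs when the audio hardware is actually taken by another client, such as a phone or FaceTime call, e.g. in place of the simpler `wasInterrupted()` above:

```swift
// Rough, untested sketch: only pause when the interruption is audio-related,
// e.g. the audio hardware was claimed by a phone/FaceTime/Discord call.
@objc func wasInterrupted(_ notification: Notification) {
    guard let reasonValue = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
          let reason = AVCaptureSession.InterruptionReason(rawValue: reasonValue) else {
        return
    }
    if reason == .audioDeviceInUseByAnotherClient {
        cameraController?.arInput.stopRunning()
        cameraController?.cameraInput?.stopRunning()
    }
}
```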
Let us know whether this addresses the issue or if you have any further questions. Thanks again.
Hey @arashp, thanks for the detailed response. However, for my use case I need ARKit to stay alive during an active call. If I'm understanding you correctly, the compass is an optional feature in ARKit, so would I be able to disable the compass while using CameraKit? Then I would be able to stream the CameraKit session during screen broadcasting. Thanks so much.
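For example, something along these lines is what I have in mind, assuming the ARKit configuration were exposed somewhere (hypothetical sketch):

```swift
import ARKit

// Hypothetical sketch: a world tracking configuration that doesn't rely on the
// compass. .gravity alignment doesn't require heading, whereas .gravityAndHeading
// does. The open question is whether CameraKit exposes any way to influence this.
let configuration = ARWorldTrackingConfiguration()
configuration.worldAlignment = .gravity   // no compass/heading needed
configuration.providesAudioData = false   // and no ARKit audio capture
```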
@arashp @stevenxu any update on this issue? This is putting me in the unfortunate position of having to migrate off Camera Kit.
Big hunch: this fix is as simple as setting `ARWorldTrackingConfiguration().providesAudioData` to `false`. This is done somewhere inside `CameraKit.start`, which I don't have access to. If I override the delegate, I can see that the ARSession is indeed sending audio data:

```
ARSession did output audio sample buffer data
```
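Here's roughly the delegate override I'm using to observe that (sketch; `MyCameraViewController` is a stand-in for my actual view controller):

```swift
import ARKit
import CoreMedia

// Sketch of the delegate override used to observe ARKit's audio callbacks.
// MyCameraViewController is a placeholder for my actual view controller.
extension MyCameraViewController: ARSessionDelegate {
    func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        // This keeps firing even though I set providesAudioData = false,
        // which is why I think CameraKit runs its own configuration.
        print("ARSession did output audio sample buffer data")
    }
}
```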
@aidan Thanks for holding us accountable, and for your patience!! I've updated the Jira ticket with your hunch and eng just confirmed they are scoping it this week with your information.

This would be a huge L for our team if you were to migrate off, but I completely understand why given our lack of progress and the time it's taken to resolve this. For that, I apologize. No excuses on our side.

Hang tight as I'm looking to get a full update for you by the date in the calendar invite I sent.
Hi there,
I’m facing a similar issue with my app, which involves several different lenses. When I don't grant microphone permission, some lenses work fine without needing it. However, in some cases, when the camera opens to start the lens, it freezes and displays an error message in the Xcode console. I'm pretty sure I haven't used any audio-related features in these lenses directly, but it seems that certain features like WorldTracking might be indirectly using audio, causing an error when a user denies microphone permission. You can find the error message from the Xcode console and the devices/software with their versions used during these tests below. I would greatly appreciate any help or information about the ongoing process to solve this problem if there is any.
Device: iPhone 11, iOS: 17.5.1
Flutter Version: 3.19.5
SCCameraKit SDK Version: 1.30.0
Lens Studio Version: 4.55.1
Xcode Version: 15.4

Error Message:

```
ARSession <0x132fc4b70>: did fail with error: Error Domain=com.apple.arkit.error Code=104 "Microphone access not authorized." UserInfo={NSLocalizedRecoverySuggestion=Make sure that the application has the required microphone privacy settings., NSLocalizedDescription=Microphone access not authorized., NSLocalizedFailureReason=The app does not have permission to use the microphone.}
```
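In case it helps, the mitigation I'm experimenting with on the iOS side is to gate the audio input on the microphone authorization status, along the lines of the `audioEnabled` workaround mentioned earlier in this thread (rough sketch, not verified; `captureSession` stands in for the existing AVCaptureSession):

```swift
import AVFoundation

// Rough sketch (not verified): only enable CameraKit's audio input when the
// user has actually granted microphone access, to avoid ARKit's
// "Microphone access not authorized" failure.
// captureSession is assumed to be the existing AVCaptureSession.
let micStatus = AVCaptureDevice.authorizationStatus(for: .audio)
let avInput = AVSessionInput(session: captureSession)
avInput.audioEnabled = (micStatus == .authorized)
```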