CameraKit crashes if on audio call (iOS)

aidan
aidan Posts: 32 🔥
edited April 2023 in General #1

Tested with Discord and FaceTime - if I have an active call going, it interferes with the AVCaptureSession/ARKit somehow and causes CameraKit to fail, crashing the whole app.

If I set AVSessionInput.audioEnabled to false it at least doesn't crash anymore:

    avSession = AVSessionInput(session: self.captureSession)
    avSession!.audioEnabled = false

BUT the camera still freezes up due to something wrong with the ARSession:

2023-04-08 04:00:13.437839-0400 Gotcha[1259:72917] [Session] ARSession <0x13f250d00>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x2808bcc00 {Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2809c8b40 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}

Only happens when on call! Very strange behavior.
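
For reference, here's a rough sketch of that workaround, only turning audio off when a call is actually in progress. The CXCallObserver check, helper name, and import are illustrative assumptions (untested); the AVSessionInput usage is just what's shown above:

    import AVFoundation
    import CallKit
    import SCSDKCameraKit  // assumption: the Camera Kit module that provides AVSessionInput

    // Illustrative helper: disable Camera Kit's audio input while a call (FaceTime, Discord, ...) is active.
    func makeAVInput(for captureSession: AVCaptureSession) -> AVSessionInput {
        let input = AVSessionInput(session: captureSession)
        let onCall = CXCallObserver().calls.contains { !$0.hasEnded }
        input.audioEnabled = !onCall  // audio off during a call, to avoid the crash described above
        return input
    }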

Comments

  • aidan
    aidan Posts: 32 🔥
    edited April 2023 #2

    Update: The error doesn't seem to have anything to do with Microphone or Video usage in other apps (I disabled their permissions)

    Now my hunch is that the error is a conflict with AVAudioSession - no matter what audio settings I set in advance, CameraKit shuts down background audio, which makes me think it collides with audio calls.

    Is it possible that CameraKit is overriding AVAudioSession settings? FYI I've now tested with FaceTime and Discord audio calling and both cause CameraKit to fail.

    iPhone 12 Pro running iOS 16.4.1

    FYI this is an important use case for my app, as I'm making Lens games streamable ;)

    Here's a little more so you can see what's going on: the camera loads, then the Lens loads up (the Lens has world-tracking functionality), but after ~0.1s the entire view freezes. It can still be dismissed, though. Another potentially helpful detail: when the view occasionally loads without displaying the Lens (just the camera feed), it does not freeze and the camera feed continues to stream in.
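
    One way to test the AVAudioSession hunch above: configure the shared audio session to mix with other audio before starting CameraKit, then check whether CameraKit resets it afterwards. Untested, just a diagnostic sketch:

        import AVFoundation

        // Ask for a category that can coexist with an ongoing call; if CameraKit later
        // overrides this, that would explain the collision with FaceTime/Discord audio.
        func configureSharedAudioSession() throws {
            let audioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(.playAndRecord,
                                         mode: .videoRecording,
                                         options: [.mixWithOthers, .allowBluetooth])
            try audioSession.setActive(true)
        }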

  • aidan
    aidan Posts: 32 🔥
    edited April 2023 #3

    Update #2: Confirming this error does NOT occur with a face-tracking Lens (back or front camera doesn't matter), so it seems to be a core issue with how ARKit is handled inside CameraKit. I suspect the ARSessionInput(session: ARSession()) provided to CameraKit.start() is not actually being used, since I also don't receive any events in my ARSessionDelegate - so it's potentially ignoring my attempt to use providesAudioData = false, which could be the reason it's failing.

    TL;DR: background audio call + CameraKit + Lens with worldtracking = CRASH!
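
    To illustrate the hunch about the session being ignored: if CameraKit actually used the ARSession handed to it, the delegate below should fire and no audio should be requested (sketch only):

        import ARKit

        // Sketch of what gets handed to CameraKit, per the post above
        final class MyARSessionDelegate: NSObject, ARSessionDelegate {
            func session(_ session: ARSession, didUpdate frame: ARFrame) {
                // In practice this never fires, which suggests CameraKit isn't using this session
                print("got ARFrame")
            }
        }

        let sessionDelegate = MyARSessionDelegate()
        let arSession = ARSession()
        arSession.delegate = sessionDelegate

        let config = ARWorldTrackingConfiguration()
        config.providesAudioData = false  // the setting that appears to be ignored
        arSession.run(config)

        // Passed to CameraKit via ARSessionInput, as in the original post
        let arInput = ARSessionInput(session: arSession)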

  • aidan
    aidan Posts: 32 🔥

    @arashp @stevenxu Update #3: Confirming it does indeed suspend when receiving a call from both FaceTime and Discord during an active CameraKit session, and it never recovers, appearing frozen on its last rendered frame. The app remains stable, the view can be dismissed, and there's no error in the console.

  • stevenxu
    stevenxu Posts: 612 👻

    @aidan Thanks for following up on that! Updated our ticket for @arashp and eng team to investigate when they get a chance.

  • stevenxu
    stevenxu Posts: 612 👻

    @aidan We didn't forget about this! Will follow up with our eng team later today. Last I checked, they are aware of this bug and it's a matter of bandwidth to get to this when they can. Thanks for your patience!

  • aidan
    aidan Posts: 32 🔥

    @stevenxu woooot! thanks for the ping. If it's realistic to have it working by AWE (May 31st), I'd love to know, as I'm going to be showing it off there and running trials with the first batch of Twitch streamers

    otherwise full steam ahead on gameplay over here!!

  • arashp
    arashp Posts: 52 🔥🔥
    edited May 2023 #8

    Hi @aidan, thank you for reporting this and your patience while we looked into it. It may be that the compass, microphone, or some other sensor used by ARKit in this case is unavailable during a call; hence the message: "Code=102 'Required sensor failed'". (Well, maybe "sensor is unavailable" would've been more apt.)

    To handle this case more generally, you might handle the interruption: stop your ARSessionInput and AVSessionInput when a user gets a call, and start them once the interruption has ended. As a rough example, you might implement the handling of interruptions along these lines:

    // Capture session interrupted, e.g. by an incoming FaceTime/Discord call
    NotificationCenter.default.addObserver(self, selector: #selector(wasInterrupted),
                                           name: .AVCaptureSessionWasInterrupted, object: nil)

    // Capture session interruption is over
    NotificationCenter.default.addObserver(self, selector: #selector(interruptionEnded),
                                           name: .AVCaptureSessionInterruptionEnded, object: nil)

    // App came back to the foreground
    NotificationCenter.default.addObserver(self, selector: #selector(interruptionEnded),
                                           name: UIApplication.didBecomeActiveNotification, object: nil)

    Or whatever cases you wish to handle, and then stop and start your inputs:

        @objc
        func wasInterrupted() {
            // Stop the ARKit and camera inputs while the interruption (e.g. the call) is active
            cameraController?.arInput.stopRunning()
            cameraController?.cameraInput?.stopRunning()
        }

        @objc
        func interruptionEnded() {
            // Restart the inputs once the interruption is over
            if cameraController?.arInput.isRunning == false {
                cameraController?.arInput.startRunning()
                cameraController?.cameraInput?.startRunning()
            }
        }
    
    

    Let us know whether this addresses the issue or if you have any further questions. Thanks again.
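
    Side note: the same stop/start logic could also hang off ARKit's own interruption callbacks, if that's more convenient. A rough sketch, reusing the cameraController from the snippet above:

        // ARSessionDelegate conforms to ARSessionObserver, so these can live on the same delegate object
        func sessionWasInterrupted(_ session: ARSession) {
            cameraController?.arInput.stopRunning()
            cameraController?.cameraInput?.stopRunning()
        }

        func sessionInterruptionEnded(_ session: ARSession) {
            cameraController?.arInput.startRunning()
            cameraController?.cameraInput?.startRunning()
        }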

  • aidan
    aidan Posts: 32 🔥

    Hey @arashp, thanks for the detailed response. However, for my use case I need ARKit to stay alive during an active call. If I'm understanding you correctly, the compass is an optional feature in ARKit, so would I be able to disable the compass while using CameraKit? Then I'd be able to stream the CameraKit session during screen broadcasting. Thanks so much
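
    (For context, in plain ARKit the compass/heading only seems to be needed when heading alignment is requested - something like the sketch below - though I don't know whether CameraKit exposes this configuration:)

        import ARKit

        // World tracking without a compass/heading dependency (plain ARKit, not CameraKit)
        let config = ARWorldTrackingConfiguration()
        config.worldAlignment = .gravity  // .gravityAndHeading is the alignment that needs the compass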

  • arashp
    arashp Posts: 52 🔥🔥

    Appreciate the use case and feedback, @aidan. Another potential cause here is that the microphone is unavailable during a call. I have asked our team to investigate. Thanks again for reporting this.

  • aidan
    aidan Posts: 32 🔥
    edited September 2023 #11

    @arashp @stevenxu Any update on this issue? This is putting me in the unfortunate position of having to migrate off Camera Kit.

    Big hunch the fix is as simple as setting ARWorldTrackingConfiguration().providesAudioData to false. This is done somewhere inside CameraKit.start, which I don't have access to. If I override the delegate, I can see that the ARSession is indeed sending audio data ("ARSession did output audio sample buffer data").
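
    Roughly the kind of delegate override that shows the audio buffers arriving (sketch):

        import ARKit
        import CoreMedia

        // ARSessionObserver callback - if providesAudioData were actually false, this should never fire
        func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
            print("ARSession did output audio sample buffer data")
        }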

  • stevenxu
    stevenxu Posts: 612 👻

    @aidan Thanks for holding us accountable and your patience!! I've updated the jira ticket with your hunch and eng just confirmed they are scoping it this week with your information.

    This would be a huge L for our team if you were to migrate off, but completely understand why given our lack of progress and time to resolve this. For that, I apologize. No excuses on our side.

    Hang tight as I'm looking to get a full update for you by the date in the calendar invite sent.

  • Hi there,

    I’m facing a similar issue with my app, which involves several different lenses. When I don't grant microphone permission, some lenses work fine without needing it. However, in some cases, when the camera opens to start the lens, it freezes and displays an error message in the Xcode console. I'm pretty sure I haven't used any audio-related features in these lenses directly, but it seems that certain features like WorldTracking might be indirectly using audio, causing an error when a user denies microphone permission. You can find the error message from the Xcode console and the devices/software with their versions used during these tests below. I would greatly appreciate any help or information about the ongoing process to solve this problem if there is any.

    Device: iPhone 11, iOS: 17.5.1
    Flutter Version: 3.19.5
    SCCameraKit SDK Version: 1.30.0
    Lens Studio Version: 4.55.1
    Xcode Version: 15.4

    Error Message:

    ARSession <0x132fc4b70>: did fail with error: Error Domain=com.apple.arkit.error Code=104 "Microphone access not authorized." UserInfo={NSLocalizedRecoverySuggestion=Make sure that the application has the required microphone privacy settings., NSLocalizedDescription=Microphone access not authorized., NSLocalizedFailureReason=The app does not have permission to use the microphone.}
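
    A possible mitigation, based on the workaround earlier in this thread (untested sketch - it may not help if the audio request comes from CameraKit's internal ARSession rather than the AVSessionInput):

        import AVFoundation

        // Only enable Camera Kit's audio input when the microphone is actually authorized,
        // so Lenses that indirectly pull in audio don't hit the Code=104 failure.
        let micAuthorized = AVCaptureDevice.authorizationStatus(for: .audio) == .authorized
        let avInput = AVSessionInput(session: captureSession)  // captureSession: your AVCaptureSession
        avInput.audioEnabled = micAuthorized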
