
Objective-C

SlideUp
SlideUp Posts: 46 🔥

Hey,

I see the iOS samples are in Swift. Does Camera Kit support Objective-C as well? If so, would you be able to share a basic sample of using Camera Kit in Objective-C?

Thanks

Comments

  • stevenxu
    stevenxu Posts: 612 👻

    @SlideUp Checking on this w/ team. Hang tight

  • SlideUp
    SlideUp Posts: 46 🔥

When we import the generated header ("-Swift.h") for the UI SDK, we still can't access the classes properly. I think the SDK isn't fully exposing its classes to Objective-C.

  • SlideUp
    SlideUp Posts: 46 🔥

    Here's what we tried

  • arashp
    arashp Posts: 52 🔥🔥

Hi @SlideUp, thanks for sharing this! Yeah, basically, while many parts of the Reference UI are accessible in Objective-C, not all are at the moment. The Reference UI is meant to provide reusable UI components, and most developers are using Swift and SwiftUI. SnapchatDelegate, to take one of your examples, is accessible in Swift only.

    However, the code is available to you and you can make these elements accessible in Objective-C, for example by doing:

@objc public protocol SnapchatDelegate: AnyObject

You will then have to do the same for other types. Some of them (e.g. enums) need a little more than just the @objc attribute to be visible to Objective-C.
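A self-contained sketch of the mechanics (these type names are made-up stand-ins for illustration, not the actual Reference UI declarations):

```swift
import Foundation

// Made-up stand-in types to show the mechanics; these are not the
// actual Reference UI declarations.

// Marking a Swift protocol @objc exposes it to Objective-C.
// (@objc protocols are implicitly class-constrained, matching AnyObject.)
@objc public protocol CaptureDelegate: AnyObject {
    func captureDidFinish()
}

// A Swift enum needs more than @objc: it must also declare an Int raw
// type to be representable in Objective-C.
@objc public enum CapturePosition: Int {
    case front
    case back
}

// NSObject subclasses are visible to Objective-C and can adopt the
// exposed protocol.
public class CaptureController: NSObject, CaptureDelegate {
    public private(set) var position: CapturePosition = .front
    public func captureDidFinish() { position = .back }
}
```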

Is your codebase currently in Objective-C, or is it simply the language you prefer? I would like to pass your feedback on to our team.

  • SlideUp
    SlideUp Posts: 46 🔥

@arashp thanks for the info. Our main codebase is Swift, but one of our developers likes to create quick prototypes of features in Objective-C, and that's why I was asking! Thanks

  • arashp
    arashp Posts: 52 🔥🔥

    That makes perfect sense. Thanks for letting us know @SlideUp.

  • SlideUp
    SlideUp Posts: 46 🔥

We rewrote a basic version of the demo app 100% in Objective-C successfully. We're able to start a session and apply a lens. It was a great learning experience for the Camera Kit flow 🤓

There's a 10-second delay before the camera preview shows for the first time. We're currently investigating it.

We also implemented a custom remote API following the sample and noticed it's not a fully available feature yet.

  • SlideUp
    SlideUp Posts: 46 🔥

    @arashp why does it take 9 seconds for this method to complete?

    [self.cameraKit startWithInput:input
                           arInput:arInput
                    cameraPosition:AVCaptureDevicePositionFront
                  videoOrientation:AVCaptureVideoOrientationPortrait
                      dataProvider:dataProvider
                      hintDelegate:nil
          textInputContextProvider:nil
    agreementsPresentationContextProvider:nil];

    see logs below.

    it's quite slow.

    2022-10-05 19:57:32.807949-0300 camkit[345:39583] self.cameraKit startWithInput started
    2022-10-05 19:57:41.976361-0300 camkit[345:39583] Metal API Validation Enabled
    [SG][I] [GL::Device] platformDeviceString: Apple Inc. Apple A10 GPU OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    2022-10-05 19:57:42.078955-0300 camkit[345:39583] self.cameraKit startWithInput ended

  • stevenxu
    stevenxu Posts: 612 👻

    @SlideUp

    We re-wrote a basic version of the demo app 100% in objective c successfully. We’re able to start a session and apply a lens. It was a great learning experience of the camera kit flow 🤓

    Wow nice!! Any key learnings to share?

    why does it take 9 seconds for this method to complete?

Checking with the team... Is that 9-second delay unique to Objective-C, or does it happen in general?

  • SlideUp
    SlideUp Posts: 46 🔥

The Swift code doesn't have the 9-second delay on initialization.

But it has this error in the logs:

    2022-10-07 15:44:50.294130-0300 CameraKitSample[2271:633813] Metal API Validation Enabled
    [SG][I] [GL::Device] platformDeviceString: Apple Inc. Apple A10 GPU OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    [SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
    [SG][I] CORE: SG GL Version: 30
    2022-10-07 15:44:51.186687-0300 CameraKitSample[2271:633835] Task .<3> finished with error [-999] Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo={NSErrorFailingURLStringKey=https://cf-st.sc-cdn.net/aps/bolt/aHR0cHM6Ly9ib2x0LWdjZG4uc2MtY2RuLm5ldC8zL0pla25nMU5PSEV1bkRUd2VWM2lyUD9ibz1FaGdhQUJvQU1nRjlPZ0VFUWdZSXRjdUJtZ1pJQWxBU1lBRSUzRCZ1Yz0xOA._FMpng, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=https://cf-st.sc-cdn.net/aps/bolt/aHR0cHM6Ly9ib2x0LWdjZG4uc2MtY2RuLm5ldC8zL0pla25nMU5PSEV1bkRUd2VWM2lyUD9ibz1FaGdhQUJvQU1nRjlPZ0VFUWdZSXRjdUJtZ1pJQWxBU1lBRSUzRCZ1Yz0xOA._FMpng}
    2022-10-07 15:44:51.222122-0300 CameraKitSample[2271:633905] Connection 5: unable to determine interface type without an established connection
    2022-10-07 15:44:51.222164-0300 CameraKitSample[2271:633905] Connection 5: unable to determine fallback status without a connection
    2022-10-07 15:44:51.222282-0300 CameraKitSample[2271:633905] Task .<3> HTTP load failed, 0/0 bytes (error code: -999 [1:89])

  • SlideUp
    SlideUp Posts: 46 🔥

    Solved the issue!

    My mistake.

    The documentation says it clearly:

        // It's important you start the capture session after starting the CameraKit session
        // because the CameraKit input and session configures the capture session implicitly and you may run into a
        // race condition which causes some audio and video output frames to be lost, resulting in a blank preview view
    

    I was starting it before CameraKit.

    B)
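For anyone else hitting this, the fix is just call ordering. A rough Swift sketch, assuming `cameraKit`, `input`, `arInput`, `dataProvider`, and `captureSession` are already configured as in the sample app (the `start(...)` call mirrors the Objective-C selector quoted earlier in this thread):

```swift
import AVFoundation
// Sketch only; assumes the Camera Kit module is imported and the
// session objects above are set up as in the sample app.

// 1. Start the Camera Kit session first: it configures the underlying
//    AVCaptureSession implicitly.
cameraKit.start(
    input: input,
    arInput: arInput,
    cameraPosition: .front,
    videoOrientation: .portrait,
    dataProvider: dataProvider,
    hintDelegate: nil,
    textInputContextProvider: nil,
    agreementsPresentationContextProvider: nil
)

// 2. Only then start the AVCaptureSession. Starting it before Camera
//    Kit races the implicit configuration and can leave the preview
//    blank or add seconds of delay.
captureSession.startRunning()
```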

  • stevenxu
    stevenxu Posts: 612 👻

    @SlideUp Ah phew okay thanks for closing the loop!

  • arashp
    arashp Posts: 52 🔥🔥

Thanks for the update @SlideUp, and sorry I missed it! Glad to hear you got a demo app working in Objective-C. And yes, you found the root cause: that method should take milliseconds rather than seconds if you start the capture session after the Camera Kit session.
