Objective-C
Comments
-
Specifically, could you show this in Objective-C? https://github.com/Snapchat/camera-kit-reference/blob/main/samples/ios/CameraKitSample/CameraKitSample/AppDelegate.swift
Thanks
0 -
Hi @SlideUp, thanks for sharing this! Yeah, basically, while many parts of the Reference UI are accessible in Objective-C, not all of it is at the moment. The Reference UI is meant to provide reusable UI components, and most developers are using Swift and SwiftUI.
SnapchatDelegate, to take one of your examples, is accessible in Swift only. However, the code is available to you, and you can make these elements accessible in Objective-C, for example by declaring:
@objc public protocol SnapchatDelegate: AnyObject
You will then have to do the same for other types. Some of them (e.g. enums) need a little more than just the @objc attribute to be visible in Objective-C.
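For example, a minimal Swift sketch of those annotations (SnapchatDelegate is the name from the thread; the method and the ExampleScreen enum are hypothetical, only there to show what exposing an enum takes):

import UIKit

// @objc plus the AnyObject constraint is all a protocol needs to become visible to Objective-C.
@objc public protocol SnapchatDelegate: AnyObject {
    func snapchatRequested(from viewController: UIViewController, screen: ExampleScreen)
}

// Enums need a bit more: they must be backed by an Int raw value,
// and cases with associated values cannot be exposed to Objective-C at all.
@objc public enum ExampleScreen: Int {
    case profile
    case lens
}

Objective-C code can then adopt SnapchatDelegate like any ordinary protocol.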
Is your codebase currently in Objective-C, or is it just the language you prefer? I would like to pass your feedback on to our team.
1 -
We rewrote a basic version of the demo app 100% in Objective-C successfully. We’re able to start a session and apply a lens. It was a great way to learn the Camera Kit flow 🤓
There’s a 10-second delay before the camera preview shows for the first time. We’re currently investigating it.
We also implemented a custom remote API following the sample and noticed it’s not a fully available feature yet.
1 -
@arashp why does it take 9 seconds for this method to complete?
[self.cameraKit startWithInput:input arInput:arInput cameraPosition:AVCaptureDevicePositionFront videoOrientation:AVCaptureVideoOrientationPortrait dataProvider:dataProvider hintDelegate:nil textInputContextProvider:nil agreementsPresentationContextProvider:nil];
See the logs below; it's quite slow.
2022-10-05 19:57:32.807949-0300 camkit[345:39583] self.cameraKit startWithInput started
2022-10-05 19:57:41.976361-0300 camkit[345:39583] Metal API Validation Enabled
[SG][I] [GL::Device] platformDeviceString: Apple Inc. Apple A10 GPU OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
2022-10-05 19:57:42.078955-0300 camkit[345:39583] self.cameraKit startWithInput ended
0 -
We rewrote a basic version of the demo app 100% in Objective-C successfully. We’re able to start a session and apply a lens. It was a great way to learn the Camera Kit flow 🤓
Wow nice!! Any key learnings to share?
why does it take 9 seconds for this method to complete?
Checking with the team... Is that 9-second delay unique to Objective-C, or does it happen in general?
0 -
The Swift code doesn't have the 9-second delay on initialization.
But it has this error in the logs:
2022-10-07 15:44:50.294130-0300 CameraKitSample[2271:633813] Metal API Validation Enabled
[SG][I] [GL::Device] platformDeviceString: Apple Inc. Apple A10 GPU OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
[SG][I] CORE: OpenGL Version string: OpenGL ES 3.0 Metal - 66.6
[SG][I] CORE: SG GL Version: 30
2022-10-07 15:44:51.186687-0300 CameraKitSample[2271:633835] Task.<3> finished with error [-999] Error Domain=NSURLErrorDomain Code=-999 "cancelled" UserInfo={NSErrorFailingURLStringKey=https://cf-st.sc-cdn.net/aps/bolt/aHR0cHM6Ly9ib2x0LWdjZG4uc2MtY2RuLm5ldC8zL0pla25nMU5PSEV1bkRUd2VWM2lyUD9ibz1FaGdhQUJvQU1nRjlPZ0VFUWdZSXRjdUJtZ1pJQWxBU1lBRSUzRCZ1Yz0xOA._FMpng, NSLocalizedDescription=cancelled, NSErrorFailingURLKey=https://cf-st.sc-cdn.net/aps/bolt/aHR0cHM6Ly9ib2x0LWdjZG4uc2MtY2RuLm5ldC8zL0pla25nMU5PSEV1bkRUd2VWM2lyUD9ibz1FaGdhQUJvQU1nRjlPZ0VFUWdZSXRjdUJtZ1pJQWxBU1lBRSUzRCZ1Yz0xOA._FMpng}
2022-10-07 15:44:51.222122-0300 CameraKitSample[2271:633905] Connection 5: unable to determine interface type without an established connection
2022-10-07 15:44:51.222164-0300 CameraKitSample[2271:633905] Connection 5: unable to determine fallback status without a connection
2022-10-07 15:44:51.222282-0300 CameraKitSample[2271:633905] Task.<3> HTTP load failed, 0/0 bytes (error code: -999 [1:89])
0 -
Solved the issue!
My mistake.
The documentation says it clearly:
// It's important you start the capture session after starting the CameraKit session
// because the CameraKit input and session configures the capture session implicitly and you may run into a
// race condition which causes some audio and video output frames to be lost, resulting in a blank preview view
I was starting it before CameraKit.
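For anyone hitting the same blank preview, here is a rough Swift sketch of the corrected ordering, modeled on the CameraKitSample; the Session, AVSessionInput, and ARSessionInput names come from the reference sample, and the initializers and nil parameters are assumptions rather than exact API:

import AVFoundation
import SCSDKCameraKit

let captureSession = AVCaptureSession()
let cameraKit = Session()
let input = AVSessionInput(session: captureSession)
let arInput = ARSessionInput()

// 1. Start the Camera Kit session first; its input configures the capture session implicitly.
cameraKit.start(
    input: input,
    arInput: arInput,
    cameraPosition: .front,
    videoOrientation: .portrait,
    dataProvider: nil, // the sample passes a DataProviderComponent here
    hintDelegate: nil,
    textInputContextProvider: nil,
    agreementsPresentationContextProvider: nil
)

// 2. Only now start the AVCaptureSession (off the main thread), so its configuration
//    isn't racing Camera Kit's and the preview doesn't come up blank.
DispatchQueue.global(qos: .userInitiated).async {
    captureSession.startRunning()
}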
1