Heads up! We're sunsetting this forum to streamline our community space.

Join us on Discord for support, updates, and all things Snap AR! See you there!

arashp

Comments

  • Hello @"Deniz Çetin", thank you for the question about Remote APIs and Flutter. The Flutter Sample does not include Remote API support at the moment. If you'd like to develop this functionality, you might, first, import the Cat Fact example from our iOS Sample App to the Flutter App. There are three steps to implementing…
  • Hi @"Jorge Costa" -- thanks for explaining the use case. We do expect the overall memory footprint to increase until all Lenses are cached or until the app hits the Lens cache size limit. And yes clear does just clears the currently applied Lens. We will need to investigate further if you find the memory footprint…
  • Thanks @stevenxu, @"Jorge Costa". The Lens repository is written to disk. It is refreshed every 12 hours or on every app launch. There is an API for determining the size of the cache: SCCameraKitCacheConfig (see the cache-size sketch after this list). But there is no API for clearing or invalidating the cache. Could you tell us a little more about the use case?
  • Hi @Rafa, thank you for the follow-up! The compressed size of the iOS SDK is ~12MB. This is what users would download from the App Store. The size of the SDK inside the Xcode filesystem will be bigger--roughly 29MB. But that's not the same as download size.
  • Hi @Rafa, we're always working to optimize the size and impact of the SDK on your apps. The download size is ~12MB on iOS and ~14MB on Android. On-device size can vary, however--especially depending on what Lenses are cached at the time--but 30MB is not unexpected in my experience. On Android, you can use Play Feature…
  • Hi Sam, sorry for the delay here. This code looks correct, and I assume you are passing configureDataProvider() into your Camera Kit session. The issue may be with your custom test Lens. Have you tried it with Buckingham Palace? It is a template found in Lens Studio. If you can get Buckingham Palace working, then it might…
  • Hi @"Ali Raza", thank you for reporting this problem. Are you able to reproduce it on our sample app? Without seeing your code and knowing the differences between the two devices, we are unable to tell why taking a photo/video might work on one device but not another. I am assuming the devices have the same permissions and…
  • Thank you for the feedback. Yes, this appears to be related to the ARKit session thread that you mentioned, which is still on our radar.
  • Hi @aidan, we checked on this and we have not published such performance metrics on iOS. However, running one of our sample apps on that same iPhone may provide a point of comparison. Apologies for the delay on this. https://github.com/Snapchat/camera-kit-reference/tree/main/samples/ios I am not aware of any way of measuring…
  • Hi @randomgoose, thanks for the questions. You are right that Camera Kit sample apps handle touches somewhat differently than Snapchat. If you are using our ReferenceUI, the code that handles touches is in CameraViewController.setupActions. You can modify that behavior for your use case. Regarding your other question, yes,…
  • Hi @pocketaapp, one more note on this. If you'd like to get a list of Lens IDs, given a group ID, the repository classes are the place to look. In iOS, for instance, you can use LensRepository.lenses(groupID:) (see the sketch after this list). On Android, there are two options for getting such Lens Group metadata:…
  • Hi @pocketaapp, welcome and thanks for the question. We have a guide for downloading Lenses before users open the camera screen. You can find some code samples that capture our best practices in that guide (see also the prefetch sketch after this list). https://docs.snap.com/camera-kit/quick-start/integrate-sdk/integrate-sdk-app/guides/prefetch-lenses In addition to…
  • Thank you for your awesome post @aidan. I am guessing it would be useful to get a callback when ARSession is ready so that the app doesn't have to wait and poll?
  • Hi @"Gabriele Conte", thank you for reporting this. The Swift code looks good. So the difference must be caused by something else. We can definitely look into it once we have the information requested by my colleague @stevenxu.
  • Appreciate the use case and feedback @aidan. Another potential cause here is that the microphone is unavailable during a call. I have asked our team to investigate. Thanks again for reporting this.
  • Hi @aidan, thank you for reporting this and for your patience while we looked into it. It may be that the compass, microphone, or some other sensor--used by ARKit in this case--is unavailable during a call; hence the message: "Code=102 'Required sensor failed'". (Well, maybe "sensor is unavailable" would've been more apt.)…
  • Hi @Sani, thank you for reporting this issue. I am able to reproduce it. It looks like the Camera Kit SDK is overriding the AVAudioSession category downstream from where you set it. I have opened a ticket so that the team can investigate (see the AVAudioSession sketch after this list for one possible shape of a fix). In the meantime, since Camera Kit is changing the category, a potential workaround is to…
  • Hi @"Ameer Hamza", thank you for the question. That is correct: at the moment, you would need to port CameraActionsView to SwiftUI. This class has not been implemented in SwiftUI yet.
  • Hi @"Shreyash Shah", thank you for the question. SCCameraKitSession should take 10 to 30 milliseconds to initialize. In most contexts this should have no performance impact. However, we do recommend not running multiple instances of SCCameraKitSession at the same time. It is possible to do so, but it can have a performance…
  • Hi @"Ameer Hamza", thank you for raising this. If you are looking to change a private method or else many things in ReferenceUI, you may wish to make a deep copy of the files in ReferenceUI. In other words, copy the ReferenceUI folders from the Pod project to your app. In this particular case,…
  • Hi @"AliveNow Creative Tech", thank you for reporting this issue. We are unable to reproduce the delay or failure to apply a Lens in the iOS sample app running SDK v1.20.0. To find the cause of this could you please answer the following questions? -- Is this happening with every Lens or some Lenses? -- Is it happening on…
  • Hi @"Mubassir Hayat". Thank you for this. Zooming out to the original question, we have captured the best practices for setting up a camera pipeline and recording AR Experiences in our sample apps: https://github.com/Snapchat/camera-kit-reference Once you have the recording, it would be up to you to send it up to your…
  • Hi @Sani, thank you for the question. The time to update the Lens is set by our caching policy: it can take up to 12 hours for Lenses to be updated in the app. There is no workaround at the moment. However, this is an area we are actively working on, and I have passed on your feedback to the team.
  • Hi @Rafa, thank you for this question. We don't use SPM in the sample apps--in that sense, we are not explicitly supporting it. But all our individual frameworks should be SPM-compatible.
  • Hi @"Andrey Rylov", regarding your question about error handling: sample apps default to crashing in this scenario, but we provide an api to handle such errors: Android https://kit.snapchat.com/reference/CameraKit/android/1.19.0/-camera-kit/com.snap.camerakit/-session/-builder/handle-errors-with.html iOS…
  • Hi @"Andrey Rylov", thank you for the feedback. I will discuss with the team
  • As mentioned on that thread, besides the workaround, the Camera Kit team is discussing ways of avoiding this issue by restructuring the example provided in the iOS Sample App.
  • To avoid this confusion in the future, the team is investigating a fix that migrates the CameraKit setup from the AppDelegate to the SceneDelegate.
  • A quick note on the original error: "[error] Error, [LSAGLView::drawTexture] LS::Exception : [LSAGLView] attempt to use deleted framebuffer". As mentioned, this was due to a difference in Info.plist. Investigating this further, removing the Scene Configuration key from Info.plist should resolve the issue. This key is added to…
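
The sketches below expand on a few of the replies above. First, the cache-size API from the reply to @stevenxu and @"Jorge Costa": a minimal sketch assuming the Swift-bridged names CacheConfig(lensContentMaxSize:), LensesConfig(cacheConfig:), and SessionConfig for the SCCameraKit* types; verify the exact signatures against the SDK reference.

```swift
import SCSDKCameraKit

// Hedged sketch: cap the Lens content cache at session creation.
// CacheConfig / LensesConfig / SessionConfig are assumed Swift-bridged
// names for the SCCameraKit* types; verify signatures in the SDK docs.
let cacheConfig = CacheConfig(lensContentMaxSize: 150 * 1024 * 1024) // 150 MB cap
let cameraKit = Session(
    sessionConfig: SessionConfig(apiToken: "<your-api-token>"), // placeholder credentials
    lensesConfig: LensesConfig(cacheConfig: cacheConfig),
    errorHandler: nil
)
```

As noted above, this only caps how much Lens content is kept on disk; there is still no API to clear or invalidate the cache.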
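Next, the iOS half of the Lens Group metadata answer to @pocketaapp: a hedged sketch of LensRepository.lenses(groupID:), where the synchronous [Lens] return type and the id property are assumptions.

```swift
import SCSDKCameraKit

// Hedged sketch: list the Lens IDs in a Lens Group via the repository
// accessor named above. The [Lens] return type and `id` property are
// assumptions; check the Camera Kit reference for the exact shape.
func printLensIDs(session: Session, groupID: String) {
    let lenses = session.lenses.repository.lenses(groupID: groupID)
    for lens in lenses {
        print("group \(groupID) contains lens \(lens.id)")
    }
}
```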
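Third, the prefetch flow from the guide linked in the other @pocketaapp reply, so Lenses are cached before the camera screen opens. The prefetcher accessor and the prefetch(lenses:completion:) signature are assumptions drawn from that guide; treat the linked guide as canonical.

```swift
import SCSDKCameraKit

// Hedged sketch: warm the Lens cache before showing the camera screen.
// The prefetcher accessor and completion signature are assumptions; the
// prefetch-lenses guide linked above has the canonical best practices.
func prefetchLenses(session: Session, groupID: String) {
    let lenses = session.lenses.repository.lenses(groupID: groupID)
    _ = session.lenses.prefetcher.prefetch(lenses: lenses) { success in
        print(success ? "Lens group \(groupID) cached" : "Prefetch failed")
    }
}
```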
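Finally, one possible shape for the AVAudioSession fix mentioned to @Sani: since Camera Kit overrides the audio category downstream, re-apply your own category after the Camera Kit session has configured audio. AVAudioSession is the standard AVFoundation API; the specific category, mode, and call timing here are illustrative assumptions.

```swift
import AVFoundation

// Hedged workaround sketch: re-apply the app's AVAudioSession category
// after Camera Kit has (re)configured the shared audio session.
func restoreAudioCategory() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord,
                                     mode: .videoRecording,
                                     options: [.defaultToSpeaker])
        try audioSession.setActive(true)
    } catch {
        print("Could not restore audio session category: \(error)")
    }
}
```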