ForumChris 👻

Comments

  • Hello @"Angelique A", it is hard to answer for your specific workflow. To answer more generally, you can use Camera Kit to use the capabilities of Lens Studio in your own app or website. Definitely checkout the features available in our Lens Studio documentation, or provide additional information on your creative workflow…
  • Hello @c18e, glad you are interested in the capabilities of Camera Kit. Here is our image marker documentation, which can help you get started with scanning specific visual trackers. For the ability to upload content and easily position it, you have a couple of different options. You can either build these…
  • Hey @anasshaikh045, you can get positional data for several tracked elements via body tracking and adjust the sizes of AR content based on this. Note that if you set up your project following our template examples, AR content that is bound to these elements will scale automatically.
  • @"ThalesLens official" yeah there currently isnt this API but I will make a feature request for the team
  • Hello @"mrm.designs" for Remote Assets the Lens will need to be uploaded to an associated organization's Lens Folder as that is how the Assets are validated. Non-Remote Asset Lenses should still be able to be uploaded through the previous flow. Can you confirm in Business Manager (business.snapchat.com > Members) that you…
  • @"ThalesLens official" can you provide more information on what the use case is and what API you are hoping to use? Currently the only way to publish a Lens is via Lens Studio
  • Hey @aidan. Unfortunately, that isn't possible; Lenses need to be delivered via the Camera Kit portal. That Lens is included in the sample apps so you can confirm the view is working even when your online Lenses are not delivered. It is delivered through internal settings that cannot be enabled at this time…
  • Hey @"Jorge Costa", yep it's possible! Take a look at this forum post which shows the workflow for importing an image and will largely be the same as for a GLB model. Additionally, here is the image example on our docs page You will leverage a Remote Media Module (which can be added from the Lens Studio Resources Panel)…
  • @kforte318 You should have a delete option if you scroll to the right within that table.
  • @"Eric G" just to confirm, you now have the model coming in but the baked in materials/ textures are not? If you use the built in GLTF material it should render correctly:
  • Hey @"Blnk Digital" I took a look at your project and think I understand where the confusion is coming from. The generated code that comes with an API Module isn't quite geared towards the RemoteMediaModule use case. I put together an example project that should hopefully show the changes you would need to make for this…
  • Hey @Kevando, to send data from the Lens to the app you will want to use Remote APIs. In the Camera Kit sample app, you can look at the CatFactRemoteApiServiceProvider for an example implementation and adapt that to send a request from the Lens once the user has completed it (a Lens-side sketch follows this list). Your use case is most similar to the Button Press…
  • Hey @"Blnk Digital" yeah you get these once you create an API spec at https://my-lenses.snapchat.com/apis or you can use the Placeholder one provided. To import the API module, go to Asset Library > APIs and Placeholder (or your created one).
  • Hi @"Blnk Digital" Yes this is possible through Remote APIs: You would set them up as normal, and then send the image object in the body of the response from the app. If looking at the CatFactRemoteApiService example you would respond to the request from the Lens like this: "import_image" -> {// Get the image however makes…
  • Hi @"Daniel eXplorins" I just tested for my account and was successful. Can you confirm that you are not getting the API to show in the list when following the setup steps in the Camera Kit docs: https://docs.snap.com/camera-kit/guides/tutorials/communicating-between-lenses-and-app#creating-an-api-spec Additionally, if the…
  • Hello @"Blnk | Michael" To follow up on my comment from Office Hours, this is possible and there are a couple of ways to approach it. From the Lens standpoint: 1. Start from the Dynamic Lens Template: This template shows how to read in Launch Parameters 2. Upload to Remote Assets: Once you have your assets uploaded to…
  • Hello Artiphon, this is not currently possible, but it is on our roadmap. We will report back when it is released.
  • Hey @Roy, we have seen this wrong-visibility error off and on in the past and are looking into it further to resolve it. With that said, this message should not block the actual communication between the app and the Lens. Can you confirm whether you are using your own API spec or the placeholder one for Camera Kit? If you can…
  • Hello @"Jorge Costa" That is possible through Remote APIs Since all requests via Remote APIs are initiated from the Lens you would send it to the app at launch and then call your onTap function when you get the response. In the Button Press example in the Lens you would interpret parsedResponse with the values you are…
  • Hello @"Jorge Costa" , it sounds like you may benefit from Remote APIs With this setup, you could send a request from the Lens to the app, then within the app respond to the request with the new line to display once that SwiftUI Button is pressed. This way, the Lens will stay applied only once, but the information in the…
  • Hello @onurk, depending on your use case there are a couple of approaches you could take. For a scenario where people are moving through the frame while AR content remains in the background, this could be achieved through a Segmentation Texture. You could set the segmentation type to body and click the inverted…
  • Hello, we are happy to help debug why some Lenses are causing the app to crash. Have you been able to reproduce this in the sample app? We will follow up offline to get the Lens IDs for these Lenses as well. As for the 1.4 SDK version in the Camera Kit portal: this is the placeholder value that we use when a Lens is…
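For the Remote API replies above (e.g. to @Kevando), here is a minimal Lens-side sketch of sending a request to the app, following the pattern from the Camera Kit docs and the Button Press example. The endpoint ID "button_pressed" and the "message" response field are placeholders; substitute the names from your own API spec.

```
//@input Asset.RemoteServiceModule remoteServiceModule

// Build a request against an endpoint defined in your API spec
var request = global.RemoteApiRequest.create();
request.endpointId = "button_pressed"; // placeholder endpoint name

// Send the request to the app and handle its response
script.remoteServiceModule.performApiRequest(request, function (response) {
    if (response.statusCode === 1) { // 1 indicates success
        var parsedResponse = JSON.parse(response.body);
        print("App responded: " + parsedResponse.message); // placeholder field
    } else {
        print("Request failed with status code " + response.statusCode);
    }
});
```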
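For the GLB question (the reply to @"Jorge Costa" about importing a model), here is a sketch of loading a GLB at runtime with the Remote Media Module. It assumes the model's URL is already known in the Lens (for example, received via a Remote API response); the URL, the parentObject input, and the material input are placeholders.

```
//@input Asset.RemoteServiceModule remoteServiceModule
//@input Asset.RemoteMediaModule remoteMediaModule
//@input SceneObject parentObject
//@input Asset.Material gltfMaterial

// Wrap the remote URL in a resource the media module can load
// (placeholder URL; in practice this would come from your app)
var resource = script.remoteServiceModule.makeResourceFromUrl("https://example.com/model.glb");

script.remoteMediaModule.loadResourceAsGltfAsset(
    resource,
    function (gltfAsset) {
        // Instantiate the loaded model under the chosen parent object,
        // rendering it with the built-in GLTF material
        gltfAsset.tryInstantiate(script.parentObject, script.gltfMaterial);
    },
    function (errorMessage) {
        print("Failed to load GLB: " + errorMessage);
    }
);
```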
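And for the Launch Parameters route mentioned in the Dynamic Lens template reply, a short sketch of reading a value the app passed in at launch; the key "modelUrl" is a placeholder, so use whichever keys your app actually sends.

```
// Read a launch parameter passed from the app, as in the Dynamic Lens template
// ("modelUrl" is a placeholder key)
if (global.launchParams) {
    var modelUrl = global.launchParams.getString("modelUrl");
    print("Launched with: " + modelUrl);
} else {
    print("No launch parameters provided");
}
```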