Camera Kit for web surface tracking
Hi,
I'm new to Camera Kit for Web and I'd like to get a simple demo running: just a cube that stays on a surface, or similar. So far I've tried a couple of lenses and existing templates (e.g. Portal), but no luck with Camera Kit for Web; the object simply doesn't show.
Can anyone share a working example of surface tracking on the web, please? The existing face filter demos work great, but surface tracking doesn't.
Any pointers are highly appreciated.
Thank you
Best Answers
-
Hi @Thor!
For surface tracking to work, the user needs to interact with the output canvas (typically by tapping it) so that Safari can request permission for the application to use the gyroscope and accelerometer. You may want to add a "tap here to grant permission" prompt in your Lens.
Additionally, when you set your source on your Camera Kit session, you'll need to specify that it is a back-facing camera in order for surface tracking to be enabled. Instructions for this are here: https://docs.snap.com/camera-kit/quick-start/integrate-sdk/integrate-sdk-web/web-configuration#source
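For illustration, here's a minimal sketch of such a tap handler. It assumes iOS Safari, where `DeviceMotionEvent.requestPermission()` exists (other browsers don't expose it, hence the feature detection), and a `session` variable holding your CameraKitSession; adapt the names to your setup:

```typescript
// Sketch: request motion-sensor access from a user gesture on iOS Safari.
// The permission request must happen inside a user-initiated event handler,
// so we listen for a tap/click on the Camera Kit output canvas.
session.output.live.addEventListener("click", async () => {
  // requestPermission() only exists on iOS Safari 13+, so feature-detect it.
  const motionEvent = DeviceMotionEvent as unknown as {
    requestPermission?: () => Promise<"granted" | "denied">;
  };
  if (typeof motionEvent.requestPermission === "function") {
    const state = await motionEvent.requestPermission();
    if (state !== "granted") {
      console.warn("Motion access denied; surface tracking cannot run.");
    }
  }
});
```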
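As a rough sketch of what that looks like in code (assembled from the snippets later in this thread, so treat it as an assumption rather than a verified integration):

```typescript
// Sketch: request the environment-facing (back) camera, then tell Camera Kit
// it is a back camera so surface tracking can use the motion sensors.
const mediaStream = await navigator.mediaDevices.getUserMedia({
  video: { facingMode: { exact: "environment" } },
});
await session.setSource(createMediaStreamSource(mediaStream, { cameraType: "back" }));
```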
If you still have issues please let me know!
-
Hi Michael, thank you. I had already granted the permissions and clicked the canvas to trigger the sensor permission popup.
What was missing was the instruction to use the back camera:

```
await session.setSource(mediaStream, { cameraType: 'back' });
```

However, this also means that I should not use:

```
const source = createMediaStreamSource(mediaStream);
```

but instead use the mediaStream directly:

```
const mediaStream = await navigator.mediaDevices.getUserMedia({
  audio: false,
  // video: true,
  video: { facingMode: { exact: "environment" } },
});
await session.setSource(mediaStream, { cameraType: 'back' });
```
Thanks for the pointers.
-
@Thor, no problem. You can still use `createMediaStreamSource`; the second argument allows options to be specified:

```
const source = createMediaStreamSource(mediaStream, { cameraType: "back" });
```
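Putting the pieces from this thread together, a condensed end-to-end sketch might look like the following. `TOKEN`, `MY_GROUP`, and the `root` element are placeholders from the earlier post; this is an illustration assembled from the snippets above, not an official sample:

```typescript
import { bootstrapCameraKit, createMediaStreamSource } from "@snap/camera-kit";

window.addEventListener("load", async () => {
  // TOKEN and MY_GROUP are placeholders for your API token and LensGroup ID
  // from the Camera Kit Portal.
  const cameraKit = await bootstrapCameraKit({ apiToken: TOKEN });
  const session = await cameraKit.createSession();
  document.getElementById("root")!.replaceWith(session.output.live);

  // Both the environment-facing camera and cameraType "back" are needed
  // for surface tracking.
  const mediaStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: { exact: "environment" } },
  });
  await session.setSource(createMediaStreamSource(mediaStream, { cameraType: "back" }));

  const { lenses } = await cameraKit.lensRepository.loadLensGroups([MY_GROUP]);
  await session.applyLens(lenses[0]);
  session.play();
});
```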
Answers
-
Hi,
Thank you for your feedback. I agree, we should add a good example of this to the https://camera-kit.snapchat.com/websdk/sample/basic list. Until then, you can try a lens called "CamKit Anim Object" or "CamKit Look Around" from the default set of lenses in the Camera Kit portal. Launch a browser on a mobile device with a back camera and make sure the camera is not mirrored (https://camera-kit.snapchat.com/websdk/sample/transform). That should let you try surface tracking.
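On the mirroring point, a small sketch based on the transform calls seen elsewhere in this thread (treat the `Transform2D` usage as an assumption about your setup):

```typescript
// Sketch: mirroring is typically applied for front cameras via
// source.setTransform(Transform2D.MirrorX). For surface tracking with the
// back camera, leave the source un-mirrored, i.e. simply don't set MirrorX.
const source = createMediaStreamSource(mediaStream, { cameraType: "back" });
await session.setSource(source); // no MirrorX transform applied
```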
Thanks!
-
Hi @JacekC, thank you for your reply. Unfortunately, it also doesn't work with the proposed lenses. Could you please have a look at my code? Maybe you can spot where I'm going wrong.
```
window.addEventListener("load", async () => {
  try {
    // The JSON_WEB_TOKEN can be found in the SnapKit Portal, where it's called the ApiToken.
    // Bootstrapping Camera Kit downloads the WebAssembly executable which contains the rendering engine.
    const cameraKit = await bootstrapCameraKit({ apiToken: TOKEN });

    // Creating the session initializes the rendering engine and creates a CameraKitSession instance. The
    // CameraKitSession is used to interact with the rendering engine -- for example, setting an input video
    // source and applying Lenses.
    const session = await cameraKit.createSession();

    // When an error occurs, it means the current Lens could not be rendered. A real application will want to
    // do something more sophisticated here -- like asking the user to pick a different Lens, for example.
    session.events.addEventListener("error", (event) => console.error(event.detail));

    // We'll use the canvas-output element as a placeholder in our HTML, replacing it here with the <canvas>
    // output from CameraKitSession -- in particular, we're using the `live` output, which renders the Lens's
    // Live RenderTarget.
    //
    // (For more on RenderTargets, see the Lens Studio documentation:
    // https://docs.snap.com/lens-studio/references/guides/lens-features/scene-set-up/camera#live-target-and-capture-target)
    document.getElementById("root")!.replaceWith(session.output.live);

    // We use the LensRepository to fetch a list of Lenses -- these are identified by a LensGroup ID.
    // LensGroups are configured in the Camera Kit Portal, where their IDs can be found.
    const { lenses } = await cameraKit.lensRepository.loadLensGroups([MY_GROUP]);
    console.log("lenses", lenses[0]);

    try {
      // The `source` here is a CameraKitSource, which can be created using a variety of helper methods
      // provided by Camera Kit. For example, to create a source from the device's camera, use
      // `createUserMediaSource`.
      const mediaStream = await navigator.mediaDevices.getUserMedia({
        audio: false,
        // video: true,
        video: { facingMode: { exact: "environment" } },
      });
      const source = createMediaStreamSource(mediaStream);
      // const userMediaSource = await createUserMediaSource({ video: { facingMode: 'back' } });
      // userMediaSource.setTransform(Transform2D.MirrorY);
      console.log("mediaSource", source);
      await session.setSource(source);
      // source.setTransform(Transform2D.MirrorX);
      session.applyLens(lenses[0]);
      session.play();
    } catch (err) {
      console.error(err);
    }
  } catch (error: unknown) {
    console.error(error);
  }
});
```
Thank you
-
Hi,
"CamKit Anim Object" or "CamKit Look Around" are the lensIds?
Sorry for confusion, these are the names of the two lenses in the default Lens Group that you get in the camera-kit portal: https://camera-kit.snapchat.com/organizations/ . You should see 20+ lenses there available for you to try and experiment with.