
Dual Camera in Camera Kit Web

Kevando
Kevando Posts: 76 🔥🔥

hey @Michael Mase, we talked about this on the office hours call, and i'd like to show you what i have in mind for what will be: the greatest show in the world.

https://t.snapchat.com/IKlXTsj9

here i've got a lens that "records" your face movements using the selfie camera, and then when you flip the camera and it image-tracks a Pringles can, it plays back the "recorded" blendshapes on the Pringles face with a mesh i made.

what i want is to drop the record part of the experience so you can puppeteer the face on the Pringles can live

Comments

  • Kevando
    Kevando Posts: 76 🔥🔥

    oh yeah, i also have this lens on a camera kit website

    https://banana-party.netlify.app/lens/0

    it's pretty janky, and i have no idea what i'm doing when it comes to the media stream stuff we talked about on the call, with constructing a second canvas and whatnot. if there's good documentation or a tutorial around this that isn't necessarily Camera Kit specific, please share it. i feel i know javascript pretty well, but this is all new territory for me
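
    For reference, the usual Camera Kit Web wiring for a page like this looks roughly like the sketch below. It follows the public @snap/camera-kit package (bootstrapCameraKit, createMediaStreamSource, setSource, applyLens), but the API token and lens/group IDs are placeholders, so treat it as a starting point rather than code from this project.

        import { bootstrapCameraKit, createMediaStreamSource } from '@snap/camera-kit';

        // Placeholders -- substitute your own Camera Kit credentials and lens IDs.
        const API_TOKEN = '<your-api-token>';
        const LENS_ID = '<your-lens-id>';
        const LENS_GROUP_ID = '<your-lens-group-id>';

        async function start(): Promise<void> {
          // Boot the SDK and create a session; session.output.live is the canvas
          // that Camera Kit renders the lens into.
          const cameraKit = await bootstrapCameraKit({ apiToken: API_TOKEN });
          const session = await cameraKit.createSession();
          document.body.appendChild(session.output.live);

          // Grab the webcam as an ordinary MediaStream and hand it to the session.
          const stream = await navigator.mediaDevices.getUserMedia({ video: true });
          const source = createMediaStreamSource(stream, { cameraType: 'front' });
          await session.setSource(source);

          // Load the lens, apply it, and start rendering.
          const lens = await cameraKit.lensRepository.loadLens(LENS_ID, LENS_GROUP_ID);
          await session.applyLens(lens);
          session.play();
        }

        start().catch(console.error);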

  • stevenxu
    stevenxu Posts: 612 👻

    @Kevando thanks for sharing here on the forum. Just created a ticket for Michael + the web team to take a look when he gets a chance.

  • Michael Mase
    Michael Mase Posts: 66 🔥🔥

    Hi @Kevando, are you still having issues with this? If you could share the Lens ID and describe how it works, I can try to help! :)

  • Kevando
    Kevando Posts: 76 🔥🔥

    darn, i didn't have email notifications on. Yes, here is the lens ID:

    7e1f5d44-5ae7-41ec-b8b2-d504e12b2e37

    ONE
    The way my lens works is with the "face mesh expressions playback" asset. Such a cool asset. I didn't really change how it works too much. I have it "record" my face, which just saves the face expression blendshapes to a variable in the script. I think i just made it global.

    TWO
    Then you switch from the front (selfie) camera to the back (world) camera, it says "find a pringles can", and it uses an image marker to attach a face mesh i made to the Pringles can.

    THREE
    That face mesh is connected to the "face recorder" and plays back your expressions on the mesh.

    A video is worth a billion forum posts, so I will upload that shortly.
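
    In outline, steps ONE through THREE boil down to buffering blendshape weights per frame while the selfie camera is active, then replaying them onto the second face mesh once the image marker finds the can. A rough sketch of that data flow is below; the names FaceFrame, readExpressionWeights, and applyWeightsToMesh are hypothetical stand-ins, not the actual API of the "face mesh expressions playback" asset.

        // Hypothetical sketch of the record-then-playback flow; the real Lens Studio
        // asset handles this internally.
        type FaceFrame = { time: number; weights: Record<string, number> };

        const recording: FaceFrame[] = []; // the "global variable" holding the take
        let startTime = 0;

        // ONE: while the selfie camera is active, capture the expression weights each frame.
        function onSelfieFrame(now: number, readExpressionWeights: () => Record<string, number>): void {
          if (recording.length === 0) startTime = now;
          recording.push({ time: now - startTime, weights: readExpressionWeights() });
        }

        // TWO + THREE: once the image marker finds the can, replay the buffered weights
        // onto the custom face mesh attached to it.
        function onPlaybackFrame(elapsed: number, applyWeightsToMesh: (w: Record<string, number>) => void): void {
          const frame = recording.find((f) => f.time >= elapsed) ?? recording[recording.length - 1];
          if (frame) applyWeightsToMesh(frame.weights);
        }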

  • Kevando
    Kevando Posts: 76 🔥🔥

    https://www.dropbox.com/scl/fi/zgjleyon7p3q1fqt562zo/pringles-lens-how-it-works.mp4?rlkey=8z86udtijflt4ja25fduycu5o&dl=0

    ok @Michael Mase, here is a link to a video i just made with the lens in Snapchat. hopefully this makes sense.

    essentially I have exactly what i want, only it's in stages. what i really want is for this to all happen in real time. so you point your phone camera at the Pringles Can and he comes to life!

  • Kevando
    Kevando Posts: 76 🔥🔥

    https://t.snapchat.com/chnVBXCz

    and here is the lens for reference :D

  • Michael Mase
    Michael Mase Posts: 66 🔥🔥

    Hey @Kevando! You should be able to do this out of the box with the Web SDK by calling applyConstraints on your webcam MediaStream's video track and switching back and forth between the 'user' and 'environment' facing modes. If you have any questions, please let me know! :)
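
    A minimal sketch of that suggestion is below, assuming the browser honors applyConstraints with a new facingMode on a live track (some browsers and devices require requesting a fresh stream via getUserMedia instead):

        // Flip an existing webcam MediaStream between the front ('user') and back
        // ('environment') cameras by re-constraining its video track.
        async function setFacingMode(
          stream: MediaStream,
          facingMode: 'user' | 'environment'
        ): Promise<void> {
          const [videoTrack] = stream.getVideoTracks();
          await videoTrack.applyConstraints({ facingMode });
        }

        async function demo(): Promise<void> {
          // Start on the selfie camera for the face "recording" stage...
          const stream = await navigator.mediaDevices.getUserMedia({
            video: { facingMode: 'user' },
          });

          // ...then switch to the back camera for the "find a Pringles can" stage.
          await setFacingMode(stream, 'environment');
        }

        demo().catch(console.error);

    Depending on the SDK version, the Camera Kit source may also need to be told which camera is active (createMediaStreamSource accepts a cameraType option) so the lens switches from selfie to world tracking; worth double-checking against the current Web SDK docs.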
