
External Images into Lens

Can we bring external images into a Lens? We'd like to pull in images from outside the camera roll.

Reference: https://lens.snapchat.com/7731733ff75c4cd793c5e4b9519c683b

Best Answer

  • ForumChris
    ForumChris Posts: 23 👻
    #2 Answer ✓

    Hey @Blnk Digital

    I took a look at your project and think I understand where the confusion is coming from. The generated code that comes with an API Module isn't quite geared towards the RemoteMediaModule use case.

    I put together an example project that should hopefully show the changes you would need to make for this use case. I added comments directly in the files but let me know if any of it isn't clear.

    Here is the API documentation for the RemoteMediaModule as well, but the attached example will probably be a bit clearer than the API docs.

Answers

  • stevenxu
    stevenxu Posts: 612 👻

    @Blnk | Michael You should be able to; I remember there was a request for this a while back. Let me double check with @ForumChris

  • Thanks Steven! Assuming we'll be able to achieve this through the Remote API; just wanted to clarify before jumping in. 🙏

  • ForumChris
    ForumChris Posts: 23 👻

    Hi @Blnk Digital

    Yes, this is possible through Remote APIs:

    You would set them up as normal, and then send the image bytes in the body of the response from the app.

    If you are looking at the CatFactRemoteApiService example, you would respond to the request from the Lens like this:

    "import_image" -> {
        // Get the image bytes however makes sense for your use case.
        // Option 1: read a bundled raw resource:
        val imageBytes = context.resources.openRawResource(R.raw.image).use { it.readBytes() }

        // Option 2: fetch it over HTTP instead:
        // var connection: HttpURLConnection? = null
        // val imageBytes = try {
        //     connection = URL(BASE_URL).openConnection() as HttpURLConnection
        //     connection.inputStream.use { it.readBytes() }
        // } finally {
        //     connection?.disconnect()
        // }

        // Respond with the raw bytes as the body
        onResponse.accept(request.toSuccessResponse(body = imageBytes))
    }

    This sends the image bytes to the Lens.

    In the Lens, you load the image through the RemoteMediaModule. When you import the spec into Lens Studio via the Asset Library, you get a few generated files, one of which is the API Module; it contains a few helpers plus the endpoints listed for your spec. You will want to do something very similar to handleAPIResponse, but change it a bit to read in the image.

    The main thing is that rather than parsing the body of the response, you pass the entire response into a RemoteMediaModule to interpret the image. So something like:

    // Get a reference to a Remote Media Module
    //@input Asset.RemoteMediaModule mediaModule;

    function handleImageResponse(response, cb) {
        if (response.statusCode !== 1) {
            var errorMessage = getErrorMessage(response);
            print(errorMessage);
            cb(true, errorMessage);
        } else {
            script.mediaModule.loadResourceAsImageTexture(
                response.asResource(),
                function(texture) {
                    cb(false, texture);
                },
                function(error) {
                    print(error);
                    cb(true, error);
                });
        }
    }

    ApiModule.prototype.import_image = function(image_id, cb) {
        var req = global.RemoteApiRequest.create();
        req.endpoint = "import_image";
        req.parameters = {
            "image_id": image_id
        };
        this.remoteServiceModule.performApiRequest(req, function(response) {
            if (cb) {
                handleImageResponse(response, cb);
            }
        });
    };
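    The callback flow above can be sanity-checked outside Lens Studio by stubbing the pieces the script touches. Everything below (the fake mediaModule, response, texture, and getErrorMessage) is a plain-JavaScript stand-in, not the real Lens Studio API:

```javascript
// Stubs standing in for Lens Studio objects
function getErrorMessage(response) {
  return "request failed with status " + response.statusCode;
}

var script = {
  mediaModule: {
    // Mimics RemoteMediaModule.loadResourceAsImageTexture by
    // immediately "decoding" the resource into a fake texture.
    loadResourceAsImageTexture: function (resource, onSuccess, onError) {
      onSuccess({ kind: "texture", from: resource });
    }
  }
};

function handleImageResponse(response, cb) {
  if (response.statusCode !== 1) {
    cb(true, getErrorMessage(response));
  } else {
    script.mediaModule.loadResourceAsImageTexture(
      response.asResource(),
      function (texture) { cb(false, texture); },
      function (error) { cb(true, error); }
    );
  }
}

// Success path: statusCode 1 yields a texture
var result;
handleImageResponse(
  { statusCode: 1, asResource: function () { return "resource-1"; } },
  function (isError, value) { result = { isError: isError, value: value }; }
);
console.log(result.isError, result.value.kind); // → false texture

// Error path: any other status yields an error message
var errResult;
handleImageResponse(
  { statusCode: 0 },
  function (isError, value) { errResult = { isError: isError, value: value }; }
);
console.log(errResult.isError, errResult.value); // → true request failed with status 0
```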
    

    This will call the callback and pass the texture to it as the second parameter. Then, for whichever image you wish to change, you set its baseTex value:

    //@input Component.Image screenImage;

    var displayImage = function(isError, texture) {
        if (!isError) {
            script.screenImage.mainMaterial.mainPass.baseTex = texture;
        }
    };

    // import_image is defined on ApiModule.prototype, so call it
    // on your API module instance
    apiModule.import_image("1", displayImage);
  • Hey @ForumChris - we are receiving an error "ApiModule not found". Do you know what could be wrong? Thanks!

  • Hey @ForumChris, just wanted to bump this. Does ApiModule need to be imported from somewhere?

  • ForumChris
    ForumChris Posts: 23 👻

    Hey @Blnk Digital, yeah, you get these once you create an API spec at https://my-lenses.snapchat.com/apis, or you can use the provided Placeholder one.

    To import the API module, go to Asset Library > APIs and choose Placeholder (or the spec you created).
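    One other thing worth checking when you hit "ApiModule not found"-style errors: in the generated code, import_image lives on ApiModule.prototype, so it is only reachable through an object created from the constructor, not on the constructor itself. A plain-JavaScript sketch of that pattern (the constructor signature and the stubbed RemoteServiceModule here are assumptions for illustration; check your generated module):

```javascript
// Simplified stand-in for the generated API module pattern
function ApiModule(remoteServiceModule) {
  this.remoteServiceModule = remoteServiceModule;
}

ApiModule.prototype.import_image = function (imageId, cb) {
  var req = { endpoint: "import_image", parameters: { image_id: imageId } };
  this.remoteServiceModule.performApiRequest(req, function (response) {
    if (cb) { cb(response); }
  });
};

// Stub for the RemoteServiceModule input the real script would declare
var fakeRemoteServiceModule = {
  performApiRequest: function (req, onResponse) {
    onResponse({ statusCode: 1, endpoint: req.endpoint });
  }
};

// Wrong: the method is not on the constructor itself
console.log(typeof ApiModule.import_image); // → undefined

// Right: create an instance, then call the method on it
var api = new ApiModule(fakeRemoteServiceModule);
var seen;
api.import_image("1", function (response) { seen = response; });
console.log(seen.endpoint); // → import_image
```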

  • Hey Chris, we are still having some issues getting this working. What file would the last 2 code snippets go in? I'm not super familiar with how to execute scripts in Lens Studio.

    Also, what is a remote media module? Is there documentation on this? I'm not seeing any.

    Attaching the current lens studio file if you are able to directly tell us where we are going wrong.

    https://we.tl/t-IO68twP07H

    Thanks!
