
While using "Detect, track and classify objects with a custom classification model on Android", as explained here: link to tutorial

I am bound to use the remote model only while a Wi-Fi network is connected (as it should be, to save users from cellular data usage). I want to use the remote model BUT save it to cache after it is downloaded, for later use. I cannot bundle all the models as local models to begin with, because the APK would be too big (it would bundle the models).

This is how I am supposed to load a local model:

LocalModel localModel =
new LocalModel.Builder()
    .setAssetFilePath("model.tflite")
    // or .setAbsoluteFilePath(absolute file path to model file)
    // or .setUri(URI to model file)
    .build();

This is how to load a model from Firebase:

DownloadConditions downloadConditions = new DownloadConditions.Builder()
                    .requireWifi()
                    .build();
RemoteModelManager.getInstance().download(remoteModel, downloadConditions)
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void unused) {
                RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
                        .addOnSuccessListener(aBoolean -> {
                            tensorDownloaded = aBoolean;
                            AppendLogError("loadTensor::tensor model loaded::3");
                        });
            }
        });

Maybe I can save the model when the download finishes? If yes, how do I save it? Something like this:

RemoteModelManager.getInstance().download(remoteModel, downloadConditions)
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void unused) {
                RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
                        .addOnSuccessListener(aBoolean -> {
                            RemoteModelManager.getInstance().getDownloadedModels(/* HOW TO SAVE THE MODEL? */);
                            tensorDownloaded = aBoolean;
                        });
            }
        });

Any other recommendation on how to keep the APK size down while still managing several models would be great.

Frank van Puffelen
kfir

2 Answers


After successfully downloading the remote model, you can obtain the downloaded models using the getDownloadedModels() method of RemoteModelManager, as you mentioned:

RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
    .addOnSuccessListener(aBoolean -> {
        if (aBoolean) {
            RemoteModelManager.getInstance().getDownloadedModels(remoteModel)
                .addOnSuccessListener(downloadedModels -> {
                    // Save the downloaded model for later usage
                    // You can copy or move the model to the desired location
                });
        }
    });

In the addOnSuccessListener callback, you can retrieve the downloaded model using the getDownloadedModels() method. This will give you a list of DownloadedModel objects representing the downloaded models. You can save this model to the desired location in your cache or storage using standard file operations. For example, you can copy or move the model to a specific directory:

// Requires: java.io.File, java.io.IOException,
// java.nio.file.Files, java.nio.file.StandardCopyOption
String destinationPath = "/path/to/destination/model.tflite";
File sourceFile = downloadedModels.get(0).getFile();
File destinationFile = new File(destinationPath);

try {
    // Copy the downloaded model to the destination file
    Files.copy(sourceFile.toPath(), destinationFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
    // Or you can move the downloaded model to the destination file
    // Files.move(sourceFile.toPath(), destinationFile.toPath(), StandardCopyOption.REPLACE_EXISTING);

    // Now the model is saved at the destination path for later usage
} catch (IOException e) {
    // Handle the exception
}

Adjust the destinationPath variable to specify the desired location and file name for the saved model. By following these steps, you can save the downloaded remote model to your cache or storage for later use while keeping the APK small. Remember to handle errors and any permissions required for the file operations.

BugsCreator
  • This is more ChatGPT plagiarism from this user. – tchrist Jun 30 '23 at 19:43
  • See, the answer is mine, just rewritten with it. – BugsCreator Jun 30 '23 at 20:19
  • I have researched this topic a lot. I saw the pattern you suggested when using the TensorFlow Lite library, but when downloading with the new ML Kit library you use "CustomRemoteModel", which does not have any file interface; "downloadedModels.get(0).getFile()" does not exist. I have found no way to save the remote model locally. I am updating my answer to use the downloaded local model. – kfir Jul 11 '23 at 07:34

Following the answer from @BugsCreator and Google ML Kit support, I have concluded that the best approach is to check whether the model was already downloaded and download it only if needed; there is no need to save it locally. (The "save to cache" idea can still be useful if you want a local copy to prevent a future download in case the cache is erased or data optimizations are performed by the Google APIs.)

I am posting my final code in case it helps someone.

RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
        .addOnSuccessListener(aBoolean -> {
            if (!aBoolean) {
                RemoteModelManager.getInstance().download(remoteModel, downloadConditions)
                        .addOnSuccessListener(new OnSuccessListener<Void>() {
                            @Override
                            public void onSuccess(Void unused) {
                                // Confirm the download actually completed before using the model.
                                // Note: the inner lambda parameter must not be named aBoolean,
                                // or it would shadow the outer one and fail to compile.
                                RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
                                        .addOnSuccessListener(downloaded -> tensorDownloaded = downloaded);
                            }
                        })
                        .addOnFailureListener(new OnFailureListener() {
                            @Override
                            public void onFailure(@NonNull Exception e) {
                                AppendLogError("loadTensor::" + e.toString());
                                tensorLoadOnce = false;
                                ToastMeVeryShort("Wifi needed for first downloading of models", true);
                            }
                        });
            } else {
                // Model is already on the device; no download needed.
                tensorDownloaded = aBoolean;
            }
        });

Edit 1:

I have researched this topic extensively. When downloading with the new ML Kit library, you use "CustomRemoteModel", which does not expose any file interface, so I cannot save the downloaded model locally for future use.

Also: when Wi-Fi is not available, ML Kit has a bug caused by SDK 34 concerning the ML Kit internal broadcast receivers. I have reported it here: link to bug report

I now use Firebase Storage to host the model and download the file once to internal storage. This is easier to control anyway: you can check whether Wi-Fi is available and ask the user before downloading, or even allow downloading over cellular.
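That gating logic (use the cached file, otherwise download on Wi-Fi, otherwise ask the user) can be sketched as plain Java. The class, enum, and method names here are illustrative, not part of any ML Kit or Firebase API:

```java
// Sketch of a download policy for the model file; all names are hypothetical.
public class ModelDownloadPolicy {

    public enum DownloadAction { USE_CACHED, DOWNLOAD_NOW, ASK_USER }

    // modelCached: the .tflite file already exists in internal storage
    // wifiConnected / userAllowsCellular: would be queried from
    // ConnectivityManager and app settings in a real app
    public static DownloadAction decide(boolean modelCached,
                                        boolean wifiConnected,
                                        boolean userAllowsCellular) {
        if (modelCached) return DownloadAction.USE_CACHED;      // never re-download
        if (wifiConnected) return DownloadAction.DOWNLOAD_NOW;  // free for the user
        return userAllowsCellular ? DownloadAction.DOWNLOAD_NOW
                                  : DownloadAction.ASK_USER;    // prompt before using data
    }
}
```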

This is the code:

private void downloadModelToInternal() {

    File rootPath = new File(mContext.getFilesDir() , MODEL_DIRECTORY_NAME );
    if(!rootPath.exists()) {
        rootPath.mkdirs();
    }

    final File localFile = new File(rootPath,localModelName);
    if (!localFile.exists()) {
        FirebaseStorage storage = FirebaseStorage.getInstance();
        String bucket = "gs://appbucketnameonfirebase";
        StorageReference storageRef = storage.getReferenceFromUrl(bucket);
        StorageReference  islandRef = storageRef.child(localModelName);

        islandRef.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
            @Override
            public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
                
                InitModelAfterDownload();
                ToastMeVeryShort("Model downloaded once", true);                    
            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception exception) {
                tensorLoadOnce.set(false);
                ToastMeVeryShort("Cannot download model", true);
                
            }
        });
    }
    else
    {
        InitModelAfterDownload();
    }
}
private void InitModelAfterDownload()
{
    try {
        File rootPath = new File(mContext.getFilesDir(), MODEL_DIRECTORY_NAME );
        if (!rootPath.exists()) {
            rootPath.mkdirs();
        }
        final File localFile = new File(rootPath, localModelName);
        localModel =
                new LocalModel.Builder()
                        .setAbsoluteFilePath(localFile.getAbsolutePath())
                        .build();
        tensorDownloaded = true;
    }
    catch (Exception ex)
    {
        // Log instead of swallowing the exception silently
        AppendLogError("InitModelAfterDownload::" + ex.toString());
    }
}
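For completeness, once localModel is built this way, it can be plugged into the object detector from the tutorial. This is only a sketch; the specific options shown are examples, not requirements:

```java
// Sketch: wiring the LocalModel into ML Kit's custom object detector.
CustomObjectDetectorOptions options =
        new CustomObjectDetectorOptions.Builder(localModel)
                .setDetectorMode(CustomObjectDetectorOptions.SINGLE_IMAGE_MODE)
                .enableClassification()
                .setClassificationConfidenceThreshold(0.5f)
                .setMaxPerObjectLabelCount(3)
                .build();
ObjectDetector objectDetector = ObjectDetection.getClient(options);
```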
kfir