While using the "Detect, track and classify objects with a custom classification model on Android" tutorial, as explained here: link to tutorial,
I am bound to downloading the remote model only while a Wi-Fi network is connected (as it is supposed to be, to save users' cellular data). I want to use the remote model BUT keep it cached after the download so it can be used later. I cannot bundle all the models as local models to begin with, because the APK would become too big.
This is how I am supposed to load a local model:
LocalModel localModel =
        new LocalModel.Builder()
                .setAssetFilePath("model.tflite")
                // or .setAbsoluteFilePath(absolute file path to model file)
                // or .setUri(URI to model file)
                .build();
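For context, the remoteModel used below is built from the Firebase-hosted model, presumably along these lines (the model name "my_remote_model" is only a placeholder for whatever name the model was published under in the Firebase console):

import com.google.mlkit.common.model.CustomRemoteModel;
import com.google.mlkit.linkfirebase.FirebaseModelSource;

// "my_remote_model" is a placeholder -- the real name is whatever the
// model was published under in the Firebase console.
CustomRemoteModel remoteModel =
        new CustomRemoteModel.Builder(
                new FirebaseModelSource.Builder("my_remote_model").build())
                .build();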
This is how I download the model from Firebase:
DownloadConditions downloadConditions = new DownloadConditions.Builder()
        .requireWifi()
        .build();

RemoteModelManager.getInstance().download(remoteModel, downloadConditions)
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void unused) {
                RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
                        .addOnSuccessListener(aBoolean -> {
                            tensorDownloaded = aBoolean;
                            AppendLogError("loadTesnsor::tensor model loaded::3");
                        });
            }
        });
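Once the remote model is available, the tutorial builds the detector directly from it, roughly like the following sketch (the detector mode, confidence threshold, and label count here are just example values):

import com.google.mlkit.vision.objects.ObjectDetection;
import com.google.mlkit.vision.objects.ObjectDetector;
import com.google.mlkit.vision.objects.custom.CustomObjectDetectorOptions;

// Build detector options around the downloaded remote model.
CustomObjectDetectorOptions options =
        new CustomObjectDetectorOptions.Builder(remoteModel)
                .setDetectorMode(CustomObjectDetectorOptions.SINGLE_IMAGE_MODE)
                .enableClassification()
                .setClassificationConfidenceThreshold(0.5f)
                .setMaxPerObjectLabelCount(3)
                .build();

ObjectDetector objectDetector = ObjectDetection.getClient(options);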
Maybe I can save the model once the download has finished? If so, how do I save it? Something like this:
RemoteModelManager.getInstance().download(remoteModel, downloadConditions)
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void unused) {
                RemoteModelManager.getInstance().isModelDownloaded(remoteModel)
                        .addOnSuccessListener(aBoolean -> {
                            RemoteModelManager.getInstance().getDownloadedModels(HOW TO SAVE THE MODEL)
                            tensorDownloaded = aBoolean;
                        });
            }
        });
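As far as I can tell, getDownloadedModels takes the model class and returns the set of models ML Kit has already stored on the device, so the call would look something like this sketch (it only lists the cached models, it does not write anything to a file):

import android.util.Log;
import com.google.mlkit.common.model.CustomRemoteModel;
import com.google.mlkit.common.model.RemoteModelManager;
import java.util.Set;

RemoteModelManager.getInstance()
        .getDownloadedModels(CustomRemoteModel.class)
        .addOnSuccessListener((Set<CustomRemoteModel> models) -> {
            // The set contains the remote models ML Kit has already
            // downloaded and cached on the device.
            Log.d("Models", "Models cached on device: " + models.size());
        });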
Any other recommendation on how to keep the APK size small while still managing several models would be great.