I have a model that uses dropout for MC (Monte Carlo) prediction. It works fine in Python on my desktop. I’d like to run the same algorithm on mobile, but dropout does not seem to be applied there: I get the same result on every MC iteration when evaluating the model on mobile.
First, I load the model in PyTorch and enable dropout at inference time, as described in other questions [1]:
def enable_dropout(model):
    # Put the model in eval mode, then switch only the Dropout layers back to train mode
    model.eval()
    for m in model.modules():
        if m.__class__.__name__.startswith('Dropout'):
            m.train()
    return model
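For context, on the desktop the MC prediction is essentially a number of stochastic forward passes after enabling dropout, roughly as sketched below (x and n_samples are placeholders for my actual input tensor and number of MC iterations):

model = enable_dropout(model)          # dropout active, everything else in eval mode
mc_scores = []
with torch.no_grad():
    for _ in range(n_samples):         # MC iterations
        mc_scores.append(model(x))     # each pass should differ because dropout is stochastic
mc_scores = torch.stack(mc_scores)     # mean/std over dim 0 give the MC estimate

On the desktop this gives a different score on each pass, as expected.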
Next, I optimize the model for mobile. I pass an optimization_blocklist so that the dropout layers are not removed during optimization:
from torch.utils.mobile_optimizer import MobileOptimizerType, optimize_for_mobile

torchscript_model = torch.jit.script(model)
optimized_model = optimize_for_mobile(
    torchscript_model,
    optimization_blocklist={MobileOptimizerType.REMOVE_DROPOUT})
optimized_model._save_for_lite_interpreter(ptlFile)
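For reference, this is roughly how I would sanity-check the export on the desktop side (only a sketch, assuming the variable names from the snippet above; x again stands for an example input tensor and the model is assumed to return a single tensor):

from torch.jit.mobile import _load_for_lite_interpreter

# Ops baked into the optimized module; 'aten::dropout' should be listed
# if the blocklist really kept the dropout layers in place.
print(torch.jit.export_opnames(optimized_model))

# Reload the saved .ptl with the lite interpreter on the desktop and compare
# two forward passes; if they already match here, the issue is in the export
# rather than in the Android runtime.
lite_model = _load_for_lite_interpreter(ptlFile)
print(torch.allclose(lite_model(x), lite_model(x)))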
Lastly, on Android, I load the model and then run the MC iterations (simplified example):
module = LiteModuleLoader.loadModuleFromAsset( this.getAssets(), "<ptlFile name>" );
<snip MC iterations>
float[] score = module.forward(IValue.from(inputTensor)).toTensor().getDataAsFloatArray();
</snip>
Each iteration I get the same score values. Any suggestions on how to achieve the same behaviour on mobile as on the desktop? If I remove REMOVE_DROPOUT from the optimization blocklist, the exported model is smaller, so the dropout layers do seem to end up in the mobile model; they just don’t appear to be active during evaluation on mobile.
PyTorch 2.2.1 on the desktop.
pytorch_android_lite 1.13.1 on Android.
[1] Measuring uncertainty using MC Dropout on pytorch