I am looking to use TensorFlow.js in the browser as part of my ReactJS app. The model I am trying to use (as a proof of concept) is the toxicity model. It is meant to detect overly toxic reviews and prevent the form from being submitted.
The issue is that when I load the model, it is downloaded over the network, which takes quite a while.
useEffect(() => {
  async function loadModel() {
    // Make sure the WebGL backend is registered before loading the model.
    await tfjs.setBackend('webgl');
    setIsModelLoading(true);
    const threshold = 0.6; // minimum confidence for a label to count as a match
    const toxicityModel = await toxicityClassifier.load(threshold);
    setModel(toxicityModel);
    setIsModelLoading(false);
  }
  // Only kick off the load once; `model` starts out as null.
  if (model === null) {
    loadModel();
  }
}, [model]);

const predictToxicity = async (value) => {
  const predictions = await model.classify([value]);
  console.log(predictions);
  // Keep only the labels the model flagged as toxic.
  setTextToxicity(
    predictions
      .filter((item) => item.results[0].match === true)
      .map((item) => item.label),
  );
};
Above is the useEffect
hook from my code. The toxicityClassifier.load
call takes a good 10 seconds, which prevents my predictToxicity
function from working correctly (model is still null because it has not finished loading).
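In the meantime, one way to stop classify from being called on a null model is a small guard. This is a minimal sketch, not the actual component code; safeClassify is a hypothetical helper name, and classify is the method the loaded toxicity model exposes:

```javascript
// Hedged sketch: refuse to classify until the model has finished loading.
// The caller can use the null return to keep showing a loading state.
async function safeClassify(model, value) {
  if (model === null) {
    return null; // model is still downloading
  }
  return model.classify([value]);
}
```

predictToxicity could then await safeClassify(model, value) and bail out when it returns null.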
On further investigation, I found that this is because the model is downloaded over the network when the component first loads. Is there any way I can package the model with the React code rather than have it download over the network every time?
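A related pattern worth noting: TensorFlow.js can persist models in the browser via its indexeddb:// URL scheme, so the network download happens only on the first visit. The sketch below assumes a raw graph model loaded with tf.loadGraphModel (the toxicity wrapper does not obviously expose its underlying model, so this may not apply to it directly); the cache key and remote URL are hypothetical placeholders:

```javascript
// Hedged sketch: download once, then serve from IndexedDB on later visits.
// Assumes the tfjs global `tf` is available in the browser.
const CACHE_KEY = 'indexeddb://toxicity-model'; // hypothetical cache key

// Pure helper: given the object returned by tf.io.listModels()
// (keyed by saved-model URL), decide where to load from.
function pickModelSource(savedModels, cacheKey, remoteUrl) {
  return Object.keys(savedModels).includes(cacheKey) ? cacheKey : remoteUrl;
}

// Browser-only part: try the cache first, persist after the first download.
async function loadCachedGraphModel(remoteUrl) {
  const saved = await tf.io.listModels();
  const source = pickModelSource(saved, CACHE_KEY, remoteUrl);
  const model = await tf.loadGraphModel(source);
  if (source !== CACHE_KEY) {
    await model.save(CACHE_KEY); // cache for the next page load
  }
  return model;
}
```

This does not bundle the model with the React build, but it does remove the repeated network download after the first load.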
Another way is to start a worker in the background when the page first loads, but that adds some complexity, and I would prefer not to do it for now.