I have tried to run this model from the link:
https://www.kaggle.com/code/alexfordna/garbage-classification-mobilenetv2-92-accuracy/notebook
When I ran it on Colab with a similar but smaller dataset (2100 images across 6 classes), it worked well. But then I added this code to predict an uploaded image:
from google.colab import files
from PIL import Image
import numpy as np
import matplotlib.pyplot as plt

def process_uploaded_image(image_path, target_size=(224, 224)):
    img = Image.open(image_path)
    img = img.resize(target_size)
    img_array = np.array(img)
    if img_array.shape[-1] == 4:  # drop the alpha channel of RGBA images
        img_array = img_array[..., :3]
    img_array = img_array / 255.0
    img_array = np.expand_dims(img_array, axis=0)
    img_array = mobilenetv2.preprocess_input(img_array)
    return img_array

uploaded = files.upload()
for fn in uploaded.keys():
    processed_image = process_uploaded_image(fn, target_size=IMAGE_SIZE)
    preds = model.predict(processed_image)
    pred_class = np.argmax(preds, axis=1)
    plt.imshow(Image.open(fn))  # display the uploaded image
    plt.title(f'Predicted class: {categories[pred_class[0]]}')
    plt.axis('off')
    plt.show()
    print(f'File {fn} is predicted as: {categories[pred_class[0]]}')
The result is a wrong prediction. For example, the model always predicts my input as the “trash” class. After I restart the runtime it switches to a different class, but the prediction is still wrong.
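I am not sure whether the extra division by 255.0 before preprocess_input matters, so one check I am thinking of is to run the same file through preprocess_input alone and compare the probabilities. This is only a sketch, assuming model, IMAGE_SIZE, categories, and the uploaded file name fn are still in scope:

import numpy as np
from PIL import Image
from tensorflow.keras.applications.mobilenet_v2 import preprocess_input

def load_without_rescale(image_path, target_size=(224, 224)):
    # Same loading steps as above, but without the extra / 255.0 before preprocess_input
    img = Image.open(image_path).convert('RGB')  # convert() also drops any alpha channel
    img = img.resize(target_size)
    arr = np.array(img, dtype=np.float32)        # raw 0-255 pixel values
    arr = np.expand_dims(arr, axis=0)
    return preprocess_input(arr)                 # MobileNetV2 scaling to [-1, 1]

alt_preds = model.predict(load_without_rescale(fn, target_size=IMAGE_SIZE))
print("Probabilities without the extra /255:", alt_preds[0])
print("Predicted class:", categories[np.argmax(alt_preds[0])])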
I also added this code to check the prediction probabilities:
preds = model.predict(processed_image)
pred_probs = preds[0] # Get the prediction probabilities for the first (and only) batch
print("Prediction probabilities:", pred_probs)
pred_class = np.argmax(pred_probs)
print("Predicted class:", categories[pred_class])
The output:

1/1 ━━━━━━━━━━━━━━━━━━━━ 0s 24ms/step
Prediction probabilities: [0.31027108 0.12315894 0.47848797 0.00863316 0.07789086 0.00155797]
Predicted class: metal
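For readability, here is a small sketch that prints each probability next to its category name (it assumes the order of categories matches the model's output classes, e.g. the training generator's class_indices):

# Pair each class name with its predicted probability
for name, prob in zip(categories, pred_probs):
    print(f"{name}: {prob:.4f}")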
Why is this happening, and how can I get the model to predict correctly?
Sometimes it works, but most of the time it does not.