I’m working on binary classification with deep learning.
My network consists of fully connected layers.
The batch size is 512, learning rate is 0.001, and the dropout rate is 0.5.
The dataset is imbalanced at a 9:1 ratio, so I applied focal loss, downsampling, and oversampling to address the class imbalance.
For convenience, I define judging a 1 as 0 as a "false alarm" and judging a 0 as 1 as "missed".
Focal loss gave the best performance: an 18.03% false alarm rate and a 13.68% missed rate.
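For reference, here is a minimal sketch of the binary focal loss I used, written in NumPy over predicted probabilities. The alpha and gamma values are illustrative defaults from the focal loss paper, not necessarily my tuned values:

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Binary focal loss on predicted probabilities p and labels y.
    alpha/gamma are illustrative defaults, not my tuned values."""
    p = np.clip(p, 1e-7, 1 - 1e-7)
    p_t = np.where(y == 1, p, 1 - p)  # probability assigned to the true class
    # (1 - p_t)^gamma down-weights easy, well-classified examples
    return np.mean(-alpha * (1 - p_t) ** gamma * np.log(p_t))
```

With `gamma=0` and `alpha=1` this reduces to plain binary cross-entropy; raising `gamma` focuses training on the hard examples from the minority class.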
When the false alarm rate goes down, the missed rate goes up, and vice versa.
I want both the false alarm and missed rates to fall.
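To illustrate the trade-off I mean, here is a toy sketch (random scores at the question's 9:1 imbalance, not my actual data) showing how moving the decision threshold trades one error rate for the other, using my definitions of false alarm and missed:

```python
import numpy as np

def error_rates(p, y, thr):
    """Per the definitions above: false alarm = a 1 judged as 0,
    missed = a 0 judged as 1."""
    pred = (p >= thr).astype(int)
    false_alarm = np.mean(pred[y == 1] == 0)
    missed = np.mean(pred[y == 0] == 1)
    return false_alarm, missed

# Toy scores at a 9:1 imbalance with overlapping score distributions
# (purely illustrative, not my model's outputs).
rng = np.random.default_rng(0)
y = (rng.random(2000) < 0.1).astype(int)
p = 0.4 * y + 0.4 * rng.random(2000)

for thr in (0.3, 0.5, 0.7):
    fa, miss = error_rates(p, y, thr)
    print(f"threshold={thr}: false alarm={fa:.2%}, missed={miss:.2%}")
```

Raising the threshold increases the false alarm rate and decreases the missed rate; lowering it does the opposite, which is exactly the see-saw behavior I am seeing.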
I tried tuning (changing the network structure, the batch size, and the learning rate), but it doesn't work out the way I want.
So, I have two questions.
- Are false alarm and missed rates at this level typical for binary classification on data like this?
- Are there methods other than focal loss, downsampling, and oversampling that I could try?