After setting a manual seed in PyTorch, suppose we apply dropout to the same layer under two scenarios:
- Scenario 1: 10% dropout on linear layer 6 with seed 2024
- Scenario 2: 20% dropout on the same linear layer 6 with the same seed 2024
Will Scenario 2 always (deterministically) drop the exact same neurons as Scenario 1 (the original 10%), plus some additional neurons (the remaining 10%)?
Or does Scenario 2 drop a fresh random set of neurons that may or may not overlap with the original 10% from Scenario 1?
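One way to probe this empirically is to re-seed before each dropout call and compare the resulting masks directly. This is a minimal sketch (the layer size, tensor of ones, and helper name `dropout_mask` are illustrative choices, not anything prescribed by PyTorch); whether the subset property holds can in principle depend on the backend's dropout kernel, so the script checks rather than assumes it:

```python
import torch

def dropout_mask(p, seed=2024, n=16):
    # Re-seed so both scenarios start from the same RNG state.
    torch.manual_seed(seed)
    x = torch.ones(n)
    out = torch.nn.functional.dropout(x, p=p, training=True)
    # Dropped units are exactly zero; kept units are scaled to 1/(1-p).
    return out == 0

dropped_10 = dropout_mask(0.10)
dropped_20 = dropout_mask(0.20)

# True iff every unit dropped at 10% is also dropped at 20%.
is_subset = bool((dropped_10 & ~dropped_20).sum() == 0)
print("10% drops:", dropped_10.nonzero().flatten().tolist())
print("20% drops:", dropped_20.nonzero().flatten().tolist())
print("10% set is a subset of 20% set:", is_subset)
```

Running with a larger `n` makes the comparison more informative, since at `n=16` the two masks may coincide by chance.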