Topic 1 Question 109
During batch training of a neural network, you notice that the loss oscillates. How should you adjust your model to ensure that it converges?
A. Decrease the size of the training batch.
B. Decrease the learning rate hyperparameter.
C. Increase the learning rate hyperparameter.
D. Increase the size of the training batch.
User votes
Comments (5)
- Selected answer: B
B. A larger learning rate can reduce training time, but it may cause the loss to oscillate and miss the optimal model parameter values.
👍 5 hiromi 2022/12/21
- Selected answer: B
👍 2 mymy9418 2022/12/17
- Selected answer: B
A large learning rate results in instability or oscillations, so the first fix to try is gradually decreasing the learning rate. https://towardsdatascience.com/8-common-pitfalls-in-neural-network-training-workarounds-for-them-7d3de51763ad
👍 1 enghabeth 2023/02/09
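The effect the commenters describe can be reproduced with a minimal sketch (not from the question itself): plain gradient descent on the quadratic loss f(x) = x², where a too-large learning rate makes the iterates overshoot the minimum and flip sign each step, while a smaller rate converges smoothly. The function name and step counts are illustrative choices, not part of the exam material.

```python
def gradient_descent(lr, steps=20, x0=1.0):
    # Minimize f(x) = x^2, whose gradient is f'(x) = 2x.
    # Update rule: x <- x - lr * f'(x)
    x = x0
    history = []
    for _ in range(steps):
        x = x - lr * 2 * x
        history.append(x)
    return history

# Too-large learning rate: each update overshoots the minimum at 0,
# so the iterate flips sign every step and grows in magnitude.
oscillating = gradient_descent(lr=1.1)

# Smaller learning rate: the iterate shrinks toward 0 monotonically.
converging = gradient_descent(lr=0.1)
```

With `lr=1.1` each step multiplies x by (1 - 2·1.1) = -1.2, so the loss oscillates and diverges; with `lr=0.1` each step multiplies x by 0.8, so it converges. This is the intuition behind answer B.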