Topic 1 Question 23
A Machine Learning Specialist is building a convolutional neural network (CNN) that will classify 10 types of animals. The Specialist has built a series of layers in a neural network that will take an input image of an animal, pass it through a series of convolutional and pooling layers, and then finally pass it through a dense, fully connected layer with 10 nodes. The Specialist would like the network's output to be a probability distribution over how likely it is that the input image belongs to each of the 10 classes. Which function will produce the desired output?
A. Dropout
B. Smooth L1 loss
C. Softmax
D. Rectified linear units (ReLU)
Comments (17)
DonaldCMLIN (2021/09/22, 👍 29):
C seems the most suitable choice; softmax turns numbers into probabilities.
https://medium.com/data-science-bootcamp/understand-the-softmax-function-in-minutes-f3a59641e86d

rsimham (2021/09/22, 👍 14):
C is right. The softmax function is used for multi-class predictions.
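The comments above describe softmax as turning numbers into probabilities. A minimal NumPy sketch of that idea (the logit values below are made up purely for illustration):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Hypothetical raw scores from a 10-node dense layer (one per animal class)
logits = np.array([2.1, 0.3, -1.0, 0.5, 1.7, -0.2, 0.0, 3.2, 0.9, -0.5])
probs = softmax(logits)

print(probs)        # every value lies in (0, 1)
print(probs.sum())  # sums to 1.0, i.e. a valid probability distribution
```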
felbuch (2021/10/21, 👍 9):
C --> Softmax.
Let's go over the alternatives:
A. Dropout --> Not really a function, but a method to avoid overfitting. It consists of dropping some neurons during training so that the model's performance does not become overly dependent on any single neuron.
B. Smooth L1 loss --> A loss function, i.e. the quantity the entire neural network minimizes during training. It is not an activation function.
C. Softmax --> The standard function for multi-class classification problems, such as classifying an animal into one of 10 categories.
D. Rectified linear units (ReLU) --> An activation function typically used in the first and intermediate (hidden) layers, not the final layer. In any case, it would not make sense for classification output because its values can exceed 1 (and probabilities cannot).
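To tie the four options together, here is a rough Keras sketch of the architecture described in the question (the input size, filter counts, and dropout rate are illustrative assumptions, not given in the question): ReLU serves as the hidden-layer activation, Dropout is a regularization layer rather than an output function, Smooth L1 would be a loss rather than an activation, and softmax on the final 10-node dense layer yields the class probabilities.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Minimal CNN sketch: input shape and layer widths are illustrative assumptions
model = models.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(32, 3, activation="relu"),  # ReLU: hidden-layer activation
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),                      # Dropout: regularization, not an output function
    layers.Dense(10, activation="softmax"),   # Softmax: 10-class probability distribution
])

# Cross-entropy (not Smooth L1) is the usual loss paired with a softmax output
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```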