Topic 1 Question 59
An AI practitioner has built a deep learning model to classify the types of materials in images. The AI practitioner now wants to measure the model's performance.
Which metric will help the AI practitioner evaluate the performance of the model?
A. Confusion matrix
B. Correlation matrix
C. R2 score
D. Mean squared error (MSE)
User votes
Comments (3)
A. Confusion matrix
A confusion matrix is a useful metric for evaluating the performance of a classification model. It provides a summary of prediction results on a classification problem, showing the number of correct and incorrect predictions broken down by each class. This helps the AI practitioner understand how well the model is distinguishing between different types of materials in the images.
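To make this concrete, here is a minimal sketch of building a confusion matrix by hand for a multi-class material classifier. The class names and label lists are purely illustrative, not from the question:

```python
from collections import Counter

# Hypothetical ground-truth and predicted labels for a 3-class
# material classifier (class names are illustrative only).
y_true = ["metal", "wood", "metal", "glass", "wood", "glass", "metal", "wood"]
y_pred = ["metal", "wood", "glass", "glass", "wood", "metal", "metal", "metal"]

classes = sorted(set(y_true))  # ["glass", "metal", "wood"]

# Count (true, predicted) pairs; rows = true class, columns = predicted class.
counts = Counter(zip(y_true, y_pred))
matrix = [[counts[(t, p)] for p in classes] for t in classes]

for cls, row in zip(classes, matrix):
    print(cls, row)
```

The diagonal holds correct predictions per class; off-diagonal cells show which materials the model confuses with which, which is exactly the per-class breakdown the answer describes. In practice this is typically computed with `sklearn.metrics.confusion_matrix`.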
👍 2 · dehkon · 2024/11/07 · Selected answer: A
The model is performing a classification task (identifying types of materials), and confusion matrices are specifically designed for evaluating classification models.
👍 2 · Blair77 · 2024/11/12 · Selected answer: A
A. Confusion matrix is a key metric for evaluating classification models. It provides a summary of the model's predictions, showing the true positive, false positive, true negative, and false negative counts. This allows the AI practitioner to understand how well the model is classifying the different types of materials, and helps in calculating other important metrics like accuracy, precision, recall, and F1-score.
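The derived metrics this comment mentions follow directly from the four confusion-matrix counts. A short sketch with hypothetical counts (the numbers are illustrative only, e.g. for a binary "metal" vs. "not metal" split):

```python
# Hypothetical binary confusion-matrix counts (illustrative only).
tp, fp, tn, fn = 40, 10, 45, 5

accuracy = (tp + tn) / (tp + fp + tn + fn)   # correct predictions / all predictions
precision = tp / (tp + fp)                   # of predicted positives, how many were right
recall = tp / (tp + fn)                      # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two

print(accuracy, precision, recall, f1)
```

With these counts, accuracy is 0.85 and precision is 0.8, showing how a single confusion matrix yields all four metrics at once.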
👍 1 · Jessiii · 2025/02/11