Topic 1 Question 226
A bank wants to launch a low-rate credit promotion campaign. The bank must identify which customers to target with the promotion and wants to make sure that each customer's full credit history is considered when an approval or denial decision is made.
The bank's data science team used the XGBoost algorithm to train a classification model based on account transaction features. The data science team deployed the model by using the Amazon SageMaker model hosting service. The accuracy of the model is sufficient, but the data science team wants to be able to explain why the model denies the promotion to some customers.
What should the data science team do to meet this requirement in the MOST operationally efficient manner?
A. Create a SageMaker notebook instance. Upload the model artifact to the notebook. Use the plot_importance() method in the Python XGBoost interface to create a feature importance chart for the individual predictions.
B. Retrain the model by using SageMaker Debugger. Configure Debugger to calculate and collect Shapley values. Create a chart that shows features and SHapley Additive exPlanations (SHAP) values to explain how the features affect the model outcomes.
C. Set up and run an explainability job powered by SageMaker Clarify to analyze the individual customer data, using the training data as a baseline. Create a chart that shows features and SHapley Additive exPlanations (SHAP) values to explain how the features affect the model outcomes.
D. Use SageMaker Model Monitor to create Shapley values that help explain model behavior. Store the Shapley values in Amazon S3. Create a chart that shows features and SHapley Additive exPlanations (SHAP) values to explain how the features affect the model outcomes.
User votes
Comments (6)
- Selected Answer: C
"Explain individual model predictions
Customers and internal stakeholders both want transparency into how models make their predictions. SageMaker Clarify integrates with SageMaker Experiments to show you the importance of each model input for a specific prediction. Results can be made available to customer-facing employees so that they have an understanding of the model’s behavior when making decisions based on model predictions."
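For concreteness, here is a minimal sketch of what such a Clarify explainability job could look like with the SageMaker Python SDK. The role ARN, S3 paths, model name, and feature headers below are placeholders, not values from the question.

```python
# Hypothetical sketch of a SageMaker Clarify explainability (SHAP) job.
# All resource names, paths, and headers are placeholders.
import sagemaker
from sagemaker import clarify

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder

clarify_processor = clarify.SageMakerClarifyProcessor(
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    sagemaker_session=session,
)

# Baseline drawn from the training data, as option C suggests.
shap_config = clarify.SHAPConfig(
    baseline="s3://my-bucket/clarify/baseline.csv",  # placeholder path
    num_samples=100,
    agg_method="mean_abs",
)

model_config = clarify.ModelConfig(
    model_name="xgboost-credit-promo",  # deployed SageMaker model (placeholder)
    instance_type="ml.m5.xlarge",
    instance_count=1,
    accept_type="text/csv",
)

data_config = clarify.DataConfig(
    s3_data_input_path="s3://my-bucket/clarify/customers-to-score.csv",  # placeholder
    s3_output_path="s3://my-bucket/clarify/explainability-output/",      # placeholder
    headers=["feature_1", "feature_2", "feature_3"],  # account transaction features (placeholders)
    dataset_type="text/csv",
)

# Runs a shadow endpoint against the model and writes SHAP values to S3.
clarify_processor.run_explainability(
    data_config=data_config,
    model_config=model_config,
    explainability_config=shap_config,
)
```

The job writes per-record SHAP values along with an aggregated report to the S3 output path, which is the basis for the per-customer explanation chart described in option C.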
👍 3
sevosevo (2023/03/18) - Selected Answer: C
It's between B and C. SageMaker Clarify is used to promote transparency and accountability in machine learning models. That's what we are looking for: why the model denies the promotion to some customers.
👍 3
blanco750 (2023/03/20) - Selected Answer: B
Retrain the model by using SageMaker Debugger. Configure Debugger to calculate and collect Shapley values. Create a chart that shows features and SHapley Additive exPlanations (SHAP) values to explain how the features affect the model outcomes.
While A, C, and D are all options for explaining the model's behavior, the most efficient way to meet the bank's requirements is to use SageMaker Debugger to calculate and collect Shapley values for each prediction. This allows the data science team to easily explain why the model denied the promotion to certain customers. SageMaker Debugger also provides built-in integration with SageMaker Studio, which enables data scientists to visualize the Shapley values and other debugging information through a user-friendly interface.
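For comparison, the Debugger route this commenter favors would mean retraining the model with a Debugger hook that saves SHAP-related tensor collections. A rough sketch, assuming the built-in XGBoost collections "full_shap" and "average_shap" are used; the role, image version, hyperparameters, and S3 paths are placeholders.

```python
# Rough sketch (placeholders throughout) of retraining the XGBoost model with
# SageMaker Debugger configured to save SHAP-related collections.
import sagemaker
from sagemaker import image_uris
from sagemaker.debugger import CollectionConfig, DebuggerHookConfig
from sagemaker.estimator import Estimator

session = sagemaker.Session()
role = "arn:aws:iam::111122223333:role/SageMakerExecutionRole"  # placeholder

xgb_image = image_uris.retrieve("xgboost", session.boto_region_name, version="1.5-1")

estimator = Estimator(
    image_uri=xgb_image,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    output_path="s3://my-bucket/xgb-output/",  # placeholder
    sagemaker_session=session,
    debugger_hook_config=DebuggerHookConfig(
        s3_output_path="s3://my-bucket/debugger-output/",  # placeholder
        collection_configs=[
            # Built-in XGBoost collections that capture SHAP values during training.
            CollectionConfig(name="average_shap", parameters={"save_interval": "10"}),
            CollectionConfig(name="full_shap", parameters={"save_interval": "10"}),
        ],
    ),
)

estimator.set_hyperparameters(objective="binary:logistic", num_round=100)  # placeholders
estimator.fit({"train": "s3://my-bucket/train/"})  # placeholder channel
```

Note that this captures SHAP values at training time and requires a retraining run, which is the operational cost the other commenters weigh against the Clarify approach.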
👍 3
Gaby999 (2023/04/25)