Topic 1 Question 178
You work for a bank. You have created a custom model to predict whether a loan application should be flagged for human review. The input features are stored in a BigQuery table. The model is performing well, and you plan to deploy it to production. Due to compliance requirements, the model must provide explanations for each prediction. You want to add this functionality to your model code with minimal effort and provide explanations that are as accurate as possible. What should you do?
A. Create an AutoML tabular model by using the BigQuery data with integrated Vertex Explainable AI.
B. Create a BigQuery ML deep neural network model and use the ML.EXPLAIN_PREDICT method with the num_integral_steps parameter.
C. Upload the custom model to Vertex AI Model Registry and configure feature-based attribution by using sampled Shapley with input baselines.
D. Update the custom serving container to include sampled Shapley-based explanations in the prediction outputs.
Comments (3)
- Selected answer: D
A and B are out because you already have a model; C does not provide an explanation for each prediction. Therefore, D meets all the criteria.
👍 1 · pikachu007 · 2024/01/11
- Selected answer: A
Not a deep neural network for sure (B). Out of the remaining 3, A is the simplest approach.
👍 1 · b1a8fae · 2024/01/11
- Selected answer: D
pikachu007's answer made me reconsider.
👍 1 · b1a8fae · 2024/01/12
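For reference, below is a minimal sketch of what the Model Registry approach discussed above could look like with the google-cloud-aiplatform Python SDK: it uploads an already-trained custom model together with a sampled Shapley explanation configuration (including input baselines) and then requests an explained prediction. The project ID, bucket path, serving container image, feature names, and baseline values are illustrative placeholders, not details from the question.

```python
# Sketch only: all identifiers below (project, bucket, image, features) are
# hypothetical placeholders used to illustrate the configuration.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Sampled Shapley attribution; path_count is the number of feature permutations
# sampled per prediction (higher = more accurate attributions, higher latency).
explanation_parameters = aiplatform.explain.ExplanationParameters(
    {"sampled_shapley_attribution": {"path_count": 25}}
)

# Describe the tabular input and the baseline instance that attributions are
# computed against (hypothetical feature names and baseline values).
explanation_metadata = aiplatform.explain.ExplanationMetadata(
    inputs={
        "features": {
            "index_feature_mapping": ["loan_amount", "credit_score", "term_months"],
            "encoding": "BAG_OF_FEATURES",
            "input_baselines": [[0.0, 650.0, 36.0]],
        }
    },
    outputs={"flag_for_review": {}},
)

# Upload the existing custom model to the Model Registry with the explanation
# configuration attached; the model code itself does not change.
model = aiplatform.Model.upload(
    display_name="loan-review-classifier",
    artifact_uri="gs://my-bucket/loan-model/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
    ),
    explanation_parameters=explanation_parameters,
    explanation_metadata=explanation_metadata,
)

endpoint = model.deploy(machine_type="n1-standard-4")

# explain() returns per-feature attributions alongside each prediction.
response = endpoint.explain(instances=[[25000.0, 710.0, 36.0]])
for explanation in response.explanations:
    print(explanation.attributions)
```

In this setup, path_count is the main knob for trading attribution accuracy against prediction latency, while the input baselines define the reference instance that each feature's contribution is measured against.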