Topic 1 Question 242
Your team is training a large number of ML models that use different algorithms, parameters, and datasets. Some models are trained in Vertex AI Pipelines, and some are trained on Vertex AI Workbench notebook instances. Your team wants to compare the performance of the models across both services. You want to minimize the effort required to store the parameters and metrics. What should you do?
A. Implement an additional step for all the models running in pipelines and notebooks to export parameters and metrics to BigQuery.
B. Create a Vertex AI experiment. Submit all the pipelines as experiment runs. For models trained on notebooks, log parameters and metrics by using the Vertex AI SDK.
C. Implement all models in Vertex AI Pipelines. Create a Vertex AI experiment, and associate all pipeline runs with that experiment.
D. Store all model parameters and metrics as model metadata by using the Vertex AI Metadata API.
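The notebook-side logging described in option B can be sketched with the Vertex AI SDK (`google-cloud-aiplatform`). This is a minimal sketch, not a runnable standalone script: it assumes a GCP project with Vertex AI enabled, and the project ID, experiment name, and run name below are placeholders.

```python
# Sketch: log parameters and metrics from a Workbench notebook into a
# Vertex AI experiment, so they appear alongside pipeline runs.
# Requires google-cloud-aiplatform and valid GCP credentials.
from google.cloud import aiplatform

aiplatform.init(
    project="my-project",           # placeholder project ID
    location="us-central1",
    experiment="model-comparison",  # placeholder experiment name
)

aiplatform.start_run(run="notebook-run-1")  # placeholder run name
aiplatform.log_params({"algorithm": "xgboost", "max_depth": 6})
aiplatform.log_metrics({"rmse": 0.42, "mae": 0.31})
aiplatform.end_run()
```

Pipeline runs can be associated with the same experiment when they are submitted, which lets both sources of runs be compared side by side in the Experiments UI.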
Comments (1)
- Option I believe is correct: C
Options A and B: logging metrics to BigQuery involves additional setup and integration effort. Option D: loading Vertex ML Metadata into a pandas DataFrame for visualization requires manual work and doesn't leverage the built-in visualization tools.
👍 1 · pikachu007 · 2024/01/13