Topic 1 Question 257
You recently trained an XGBoost model on tabular data. You plan to expose the model for internal use as an HTTP microservice. After deployment, you expect a small number of incoming requests. You want to productionize the model with the least amount of effort and latency. What should you do?
A. Deploy the model to BigQuery ML by using CREATE MODEL with the BOOSTED_TREE_REGRESSOR statement, and invoke the BigQuery API from the microservice.
B. Build a Flask-based app. Package the app in a custom container on Vertex AI, and deploy it to Vertex AI Endpoints.
C. Build a Flask-based app. Package the app in a Docker image, and deploy it to Google Kubernetes Engine in Autopilot mode.
D. Use a prebuilt XGBoost Vertex container to create a model, and deploy it to Vertex AI Endpoints.
User votes
Comments (1)
- Selected answer: D
Prebuilt container: it eliminates the need to build and manage a custom container, reducing development time and complexity. Vertex AI Endpoints: it provides managed serving infrastructure with low latency and high availability. Minimal effort: the workflow is just creating a Vertex AI model and deploying it to an endpoint (a minimal deployment sketch follows below).
👍 1 · pikachu007, 2024/01/13
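As a rough illustration of option D, here is a minimal sketch using the google-cloud-aiplatform Python SDK. The project ID, region, GCS path, container version tag, and feature values are placeholders (assumptions, not from the question); the artifact directory is assumed to contain the saved booster in a file name the prebuilt XGBoost serving container accepts (e.g. model.bst).

```python
# Minimal sketch, assuming a hypothetical project/bucket and a trained XGBoost
# booster already exported to Cloud Storage in a supported format (e.g. model.bst).
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # placeholder project/region

# Register the model with a prebuilt XGBoost prediction container --
# no custom Flask app or Dockerfile needed.
model = aiplatform.Model.upload(
    display_name="xgboost-tabular-model",
    artifact_uri="gs://my-bucket/xgboost-model/",  # placeholder GCS directory with model.bst
    serving_container_image_uri=(
        # Prebuilt container; pick the tag matching the XGBoost version used for training.
        "us-docker.pkg.dev/vertex-ai/prediction/xgboost-cpu.1-7:latest"
    ),
)

# Deploy to a managed Vertex AI endpoint; one small node is enough for low traffic.
endpoint = model.deploy(machine_type="n1-standard-2")

# The internal microservice can now send online prediction requests over HTTP.
prediction = endpoint.predict(instances=[[0.5, 1.2, 3.4, 0.0]])  # example feature vector
print(prediction.predictions)
```

Compared with options B and C, this avoids writing and maintaining serving code and container images, which is why it is the least-effort path for a lightly used internal service.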