Topic 1 Question 189
You are implementing a batch inference ML pipeline in Google Cloud. The model was developed using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset containing 10 TB of data that is stored in a BigQuery table. How should you perform the inference?
A. Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
B. Import the TensorFlow model by using the CREATE MODEL statement in BigQuery ML. Apply the historical data to the TensorFlow model.
C. Export the historical data to Cloud Storage in CSV format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
D. Configure a Vertex AI batch prediction job to apply the model to the historical data in BigQuery.
Comments (1)
- Selected answer: D
Limitations of the other options:
- A and C (export to Cloud Storage): Exporting 10 TB of data to Cloud Storage incurs additional storage costs, transfer time, and data-management complexity, and it is unnecessary because Vertex AI batch prediction can read input directly from a BigQuery table.
- B (BigQuery ML): While BigQuery ML can import some TensorFlow SavedModels, it has limitations on model size and supported architectures, and it is not as well suited to large-scale batch inference with an existing SavedModel as a Vertex AI batch prediction job.
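Option D can be sketched with the Vertex AI Python SDK, which lets a batch prediction job read input from and write output to BigQuery without any intermediate export. The project, dataset, table, and model ID below are hypothetical placeholders, not values from the question; this is a minimal sketch, not a production configuration.

```python
# Hypothetical sketch: a Vertex AI batch prediction job that reads the
# historical data directly from BigQuery and writes predictions back to
# BigQuery, so the 10 TB never needs to be exported to Cloud Storage.
# All resource names (project, dataset, table, model ID) are placeholders.

def bq_uri(project: str, dataset: str, table: str = "") -> str:
    """Build a BigQuery URI of the form bq://project.dataset[.table]."""
    suffix = f".{table}" if table else ""
    return f"bq://{project}.{dataset}{suffix}"

def run_batch_prediction(project: str, region: str, model_id: str) -> None:
    # Imported here so the URI helper above stays usable without the SDK.
    from google.cloud import aiplatform

    aiplatform.init(project=project, location=region)
    model = aiplatform.Model(model_name=model_id)

    # instances_format / predictions_format "bigquery" tells the job to
    # consume rows from the source table and emit predictions as rows in
    # a destination dataset, keeping the data in BigQuery end to end.
    model.batch_predict(
        job_display_name="historical-batch-inference",
        bigquery_source=bq_uri(project, "warehouse", "historical_data"),
        bigquery_destination_prefix=bq_uri(project, "predictions"),
        instances_format="bigquery",
        predictions_format="bigquery",
        machine_type="n1-standard-4",
        sync=True,  # block until the job finishes
    )

if __name__ == "__main__":
    run_batch_prediction("my-project", "us-central1", "1234567890")
```

The same job can be configured in the Cloud Console or with `gcloud` by choosing BigQuery as both the input source and the output destination.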
👍 1 · pikachu007 · 2024/01/12