Topic 1 Question 281
You work at a large organization that recently decided to move their ML and data workloads to Google Cloud. The data engineering team has exported the structured data to a Cloud Storage bucket in Avro format. You need to propose a workflow that performs analytics, creates features, and hosts the features that your ML models use for online prediction. How should you configure the pipeline?
A. Ingest the Avro files into Cloud Spanner to perform analytics. Use a Dataflow pipeline to create the features, and store them in Vertex AI Feature Store for online prediction.
B. Ingest the Avro files into BigQuery to perform analytics. Use a Dataflow pipeline to create the features, and store them in Vertex AI Feature Store for online prediction.
C. Ingest the Avro files into Cloud Spanner to perform analytics. Use a Dataflow pipeline to create the features, and store them in BigQuery for online prediction.
D. Ingest the Avro files into BigQuery to perform analytics. Use BigQuery SQL to create features and store them in a separate BigQuery table for online prediction.
Comments (6)
- Selected Answer: B
  My answer: B.
  "You need to propose a workflow that performs analytics, creates features, and hosts...": ingest the Avro files into BigQuery to perform analytics.
  "...workflow that performs analytics, creates features...": use a Dataflow pipeline to create the features.
  "...and hosts the features that your ML models use for online prediction": store them in Vertex AI Feature Store for online prediction.
  👍 8 · guilhermebutzke · 2024/02/19
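A minimal sketch of the ingest step described in the comment above, using the google-cloud-bigquery Python client to load the exported Avro files from Cloud Storage into a BigQuery table. The project ID, bucket path, and dataset/table names are all hypothetical; BigQuery load jobs accept AVRO as a source format and read the schema from the Avro files themselves.

```python
# Hypothetical sketch: load Avro files exported to Cloud Storage into BigQuery.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.AVRO,   # Avro files carry their own schema
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-export-bucket/structured/*.avro",  # hypothetical export location
    "my-project.analytics.raw_events",          # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete

table = client.get_table("my-project.analytics.raw_events")
print(f"Loaded {table.num_rows} rows for analytics in BigQuery.")
```

From there, the analytics can run as standard BigQuery SQL, and the Dataflow feature pipeline can read the table directly.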
- Selected Answer: B
  Vertex AI Feature Store is designed for managing and serving features for online prediction with low latency.
  👍 2 · emsherff · 2024/04/09
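To illustrate the online-serving piece this comment refers to, here is a minimal, hypothetical sketch using the google-cloud-aiplatform SDK's Featurestore resource classes to fetch feature values at prediction time. The featurestore name, entity type, entity ID, and feature IDs are assumptions.

```python
# Hypothetical sketch: low-latency online read from Vertex AI Feature Store.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")  # hypothetical

# "customer_features" featurestore and "customer" entity type are assumptions.
featurestore = aiplatform.Featurestore(featurestore_name="customer_features")
entity_type = featurestore.get_entity_type(entity_type_id="customer")

# Fetch the latest feature values for one entity to feed an online prediction.
features_df = entity_type.read(
    entity_ids=["customer_123"],                 # hypothetical entity ID
    feature_ids=["total_spend", "is_weekend"],   # hypothetical feature IDs
)
print(features_df)
```

Note that this uses the Featurestore resource classes; the newer BigQuery-backed generation of Vertex AI Feature Store exposes a different online-serving API.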
- Selected Answer: A
  I think the answer is A because BigQuery does not support the Avro format but Cloud Spanner does.
  👍 1 · MultiCloudIronMan · 2024/04/06
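For completeness on option B's middle step, a minimal, hypothetical Apache Beam pipeline for the Dataflow feature-creation stage is sketched below; note, regarding the comment above, that BigQuery load jobs do accept Avro as a source format, as in the earlier load sketch. The project, table names, schema, and feature logic here are all assumptions.

```python
# Hypothetical sketch: Dataflow (Apache Beam) feature-creation step that reads
# raw rows from BigQuery, derives features, and writes them to a features table.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def to_feature_row(row):
    """Turn a raw row (hypothetical schema) into a feature row."""
    return {
        "customer_id": row["customer_id"],
        "total_spend": float(row["amount"]),
        "is_weekend": 1 if row["day_of_week"] in (6, 7) else 0,
    }


options = PipelineOptions(
    runner="DataflowRunner",            # or "DirectRunner" for local testing
    project="my-project",               # hypothetical project ID
    region="us-central1",
    temp_location="gs://my-export-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.io.ReadFromBigQuery(table="my-project:analytics.raw_events")
        | "CreateFeatures" >> beam.Map(to_feature_row)
        | "WriteFeatures" >> beam.io.WriteToBigQuery(
            "my-project:analytics.customer_features",
            schema="customer_id:STRING,total_spend:FLOAT,is_weekend:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The resulting features table could then be batch-ingested into the Vertex AI Feature Store entity type used in the earlier online-read sketch.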