Topic 1 Question 165
You want to train an AutoML model to predict house prices by using a small public dataset stored in BigQuery. You need to prepare the data and want to use the simplest, most efficient approach. What should you do?
A. Write a query that preprocesses the data by using BigQuery and creates a new table. Create a Vertex AI managed dataset with the new table as the data source.
B. Use Dataflow to preprocess the data. Write the output in TFRecord format to a Cloud Storage bucket.
C. Write a query that preprocesses the data by using BigQuery. Export the query results as CSV files, and use those files to create a Vertex AI managed dataset.
D. Use a Vertex AI Workbench notebook instance to preprocess the data by using the pandas library. Export the data as CSV files, and use those files to create a Vertex AI managed dataset.
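As a sketch of the approach in option A, the preprocessing can be a single query that materializes a cleaned table in BigQuery, whose `bq://` URI is then used as the source of a Vertex AI managed dataset. All project, dataset, table, and column names below are hypothetical, and the `google-cloud-aiplatform` calls are shown commented out because they require a live GCP project:

```python
# Hypothetical names throughout; adjust to your own project.
PROJECT = "my-project"

# Step 1: preprocess in BigQuery and materialize the result as a new table.
PREPROCESS_SQL = f"""
CREATE OR REPLACE TABLE `{PROJECT}.housing.prices_clean` AS
SELECT
  price,
  bedrooms,
  bathrooms,
  sqft_living,
  IFNULL(year_renovated, year_built) AS effective_year
FROM `{PROJECT}.housing.prices_raw`
WHERE price IS NOT NULL
"""

def bq_uri(project: str, dataset: str, table: str) -> str:
    """Build the bq:// URI that Vertex AI accepts as a BigQuery data source."""
    return f"bq://{project}.{dataset}.{table}"

source = bq_uri(PROJECT, "housing", "prices_clean")

# Step 2: register the table as a Vertex AI managed (tabular) dataset.
# Requires the google-cloud-aiplatform package and a live project:
# from google.cloud import aiplatform
# aiplatform.init(project=PROJECT, location="us-central1")
# aiplatform.TabularDataset.create(
#     display_name="house-prices", bq_source=source)
print(source)
```

The point of this flow is that the data never leaves BigQuery: no export to CSV or TFRecord in Cloud Storage is needed, which is why it is the simplest path for a small tabular dataset already stored there.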
User votes
Comments (4)
- Answer believed correct: B
- Dataflow seems like the easiest and most scalable way to deal with this issue. Option B. (👍 1)
- kalle_balle (2024/01/06): A seems the easiest to me: preprocess the data in BigQuery (where the input table is stored) and export directly as a Vertex AI managed dataset. (👍 1)
- b1a8fae (2024/01/08): I go for A. (👍 1)
- vale_76_na_xxx (2024/01/08)