Topic 1 Question 77
You need to execute a batch prediction on 100 million records in a BigQuery table with a custom TensorFlow DNN regressor model, and then store the predicted results in a BigQuery table. You want to minimize the effort required to build this inference pipeline. What should you do?
A. Import the TensorFlow model with BigQuery ML, and run the ML.PREDICT function.
B. Use the TensorFlow BigQuery reader to load the data, and use the BigQuery API to write the results to BigQuery.
C. Create a Dataflow pipeline to convert the data in BigQuery to TFRecords. Run a batch inference on Vertex AI Prediction, and write the results to BigQuery.
D. Load the TensorFlow SavedModel in a Dataflow pipeline. Use the BigQuery I/O connector with a custom function to perform the inference within the pipeline, and write the results to BigQuery.
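Option A keeps everything inside BigQuery, which is why commenters favor it: BigQuery ML can import a TensorFlow SavedModel and run batch prediction with plain SQL. A minimal sketch, assuming the model has been exported to Cloud Storage; the bucket path, dataset, and table names below are illustrative, not from the question:

```sql
-- Import the TensorFlow SavedModel into BigQuery ML
-- (bucket and dataset names are hypothetical).
CREATE OR REPLACE MODEL `mydataset.dnn_regressor`
  OPTIONS (MODEL_TYPE = 'TENSORFLOW',
           MODEL_PATH = 'gs://my_bucket/saved_model/*');

-- Run batch prediction over the 100M-row source table
-- and persist the results to a BigQuery table.
CREATE OR REPLACE TABLE `mydataset.predictions` AS
SELECT *
FROM ML.PREDICT(MODEL `mydataset.dnn_regressor`,
                TABLE `mydataset.input_records`);
```

No Dataflow pipeline, TFRecord conversion, or external serving infrastructure is needed, which matches the requirement to minimize effort.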
Comments (6)
- hiromi (2022/12/18) 👍 4
  Suggested answer: A
- enghabeth (2023/02/08) 👍 2
  Suggested answer: A
  For this: https://cloud.google.com/bigquery-ml/docs/reference/standard-sql/bigqueryml-syntax-inference-overview "Predict the label, either a numerical value for regression tasks or a categorical value for classification tasks" applies to a DNN regression model.
- M25 (2023/05/08) 👍 2
  Suggested answer: A
  Went with A