Topic 1 Question 255
You have recently used TensorFlow to train a classification model on tabular data. You have created a Dataflow pipeline that can transform several terabytes of data into training or prediction datasets consisting of TFRecords. You now need to productionize the model, and you want the predictions to be automatically uploaded to a BigQuery table on a weekly schedule. What should you do?
A. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and the ModelBatchPredictOp components.
B. Import the model into Vertex AI and deploy it to a Vertex AI endpoint. Create a Dataflow pipeline that reuses the data processing logic, sends requests to the endpoint, and then uploads predictions to a BigQuery table.
C. Import the model into Vertex AI. On Vertex AI Pipelines, create a pipeline that uses the DataflowPythonJobOp and the ModelBatchPredictOp components.
D. Import the model into BigQuery. Implement the data processing logic in a SQL query. On Vertex AI Pipelines, create a pipeline that uses the BigqueryQueryJobOp and the BigqueryPredictModelJobOp components.
Comments (1)
- Answer I believe is correct: B
Option A: Vertex AI Pipelines are excellent for orchestrating ML workflows, but might not be as efficient as Dataflow for large-scale data processing, especially with the existing Dataflow logic.
Option C: While Vertex AI Pipelines can handle model loading and prediction, Dataflow is better suited for large-scale data processing and BigQuery integration.
Option D: BigQuery ML is primarily for in-database model training and prediction, not ideal for external models or large-scale data processing.
👍 1 · pikachu007 · 2024/01/13
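For illustration, a minimal Apache Beam sketch of the pattern in option B (read the TFRecords, call the deployed Vertex AI endpoint, and write predictions to BigQuery) might look like the following. All project IDs, bucket paths, endpoint IDs, and table names are hypothetical placeholders, and the feature extraction is simplified.

```python
# Hedged sketch of option B: a Beam/Dataflow pipeline that reuses the TFRecord
# preprocessing output, calls a deployed Vertex AI endpoint, and writes to BigQuery.
# PROJECT, REGION, ENDPOINT_ID, INPUT_PATTERN, and OUTPUT_TABLE are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
import tensorflow as tf
from google.cloud import aiplatform

PROJECT = "my-project"                                          # placeholder
REGION = "us-central1"                                          # placeholder
ENDPOINT_ID = "1234567890"                                      # placeholder endpoint ID
INPUT_PATTERN = "gs://my-bucket/prediction-input/*.tfrecord"    # placeholder
OUTPUT_TABLE = "my-project:ml.weekly_predictions"               # placeholder


class PredictFn(beam.DoFn):
    """Parses each serialized tf.train.Example and calls the online endpoint."""

    def setup(self):
        aiplatform.init(project=PROJECT, location=REGION)
        self.endpoint = aiplatform.Endpoint(ENDPOINT_ID)

    def process(self, serialized_example):
        example = tf.train.Example.FromString(serialized_example)
        # Simplified feature extraction; real code would match the model's schema.
        instance = {
            name: list(feat.float_list.value) or list(feat.int64_list.value)
            for name, feat in example.features.feature.items()
        }
        response = self.endpoint.predict(instances=[instance])
        yield {"prediction": str(response.predictions[0])}


def run():
    with beam.Pipeline(options=PipelineOptions()) as pipeline:
        (
            pipeline
            | "ReadTFRecords" >> beam.io.ReadFromTFRecord(INPUT_PATTERN)
            | "Predict" >> beam.ParDo(PredictFn())
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                OUTPUT_TABLE,
                schema="prediction:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Such a Dataflow job could then be launched on a weekly schedule, for example by Cloud Scheduler triggering a Dataflow template.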
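For contrast, the Vertex AI Pipelines pattern referenced in options A and C would look roughly like the sketch below, chaining DataflowPythonJobOp (to run the existing preprocessing code) with ModelBatchPredictOp (to write batch predictions to BigQuery). All resource names, paths, and table URIs are hypothetical placeholders, and exact parameter names may differ across google-cloud-pipeline-components versions.

```python
# Hedged sketch of the pipeline-components pattern from options A and C.
# Resource names, paths, and table URIs are placeholders, not from the question.
from kfp import dsl
from google_cloud_pipeline_components.types import artifact_types
from google_cloud_pipeline_components.v1.dataflow import DataflowPythonJobOp
from google_cloud_pipeline_components.v1.wait_gcp_resources import WaitGcpResourcesOp
from google_cloud_pipeline_components.v1.batch_predict_job import ModelBatchPredictOp

PROJECT = "my-project"                                                        # placeholder
REGION = "us-central1"                                                        # placeholder
MODEL_NAME = "projects/my-project/locations/us-central1/models/1234567890"    # placeholder


@dsl.pipeline(name="weekly-batch-prediction")
def weekly_batch_prediction():
    # Run the existing Dataflow preprocessing code to produce prediction TFRecords.
    preprocess = DataflowPythonJobOp(
        project=PROJECT,
        location=REGION,
        python_module_path="gs://my-bucket/src/preprocess.py",   # placeholder
        temp_location="gs://my-bucket/tmp",                      # placeholder
        args=["--output", "gs://my-bucket/prediction-input/"],   # placeholder
    )
    # DataflowPythonJobOp launches the job asynchronously; wait for completion.
    wait = WaitGcpResourcesOp(gcp_resources=preprocess.outputs["gcp_resources"])

    # Reference the model that was already imported into Vertex AI.
    model = dsl.importer(
        artifact_uri=f"https://{REGION}-aiplatform.googleapis.com/v1/{MODEL_NAME}",
        artifact_class=artifact_types.VertexModel,
        metadata={"resourceName": MODEL_NAME},
    )

    # Batch prediction over the TFRecords, writing results directly to BigQuery.
    ModelBatchPredictOp(
        project=PROJECT,
        location=REGION,
        model=model.output,
        job_display_name="weekly-predictions",
        gcs_source_uris=["gs://my-bucket/prediction-input/*"],   # placeholder
        instances_format="tf-record",
        predictions_format="bigquery",
        bigquery_destination_output_uri="bq://my-project.ml.weekly_predictions",  # placeholder
    ).after(wait)
```

A compiled pipeline like this could be run weekly using a Vertex AI Pipelines schedule, with no endpoint deployment required.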