ExamTopics

Professional Machine Learning Engineer
  • Topic 1 Question 77

    You need to execute a batch prediction on 100 million records in a BigQuery table with a custom TensorFlow DNN regressor model, and then store the predicted results in a BigQuery table. You want to minimize the effort required to build this inference pipeline. What should you do?

• Import the TensorFlow model with BigQuery ML, and run the ML.PREDICT function.

    • Use the TensorFlow BigQuery reader to load the data, and use the BigQuery API to write the results to BigQuery.

    • Create a Dataflow pipeline to convert the data in BigQuery to TFRecords. Run a batch inference on Vertex AI Prediction, and write the results to BigQuery.

    • Load the TensorFlow SavedModel in a Dataflow pipeline. Use the BigQuery I/O connector with a custom function to perform the inference within the pipeline, and write the results to BigQuery.
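
    For reference, the first option relies on BigQuery ML's ability to import a TensorFlow SavedModel and score rows with ML.PREDICT, keeping the entire inference step inside BigQuery. Below is a minimal sketch of that approach, submitted through the google-cloud-bigquery Python client; the project, dataset, table, and Cloud Storage names are hypothetical placeholders, not values from the question.

```python
# Minimal sketch of the BigQuery ML import-and-predict approach (first option).
# All project, dataset, table, and bucket names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Register the TensorFlow SavedModel (exported to Cloud Storage) as a BigQuery ML model.
create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.dnn_regressor`
OPTIONS (
  MODEL_TYPE = 'TENSORFLOW',
  MODEL_PATH = 'gs://my-bucket/saved_model/*'   -- hypothetical SavedModel location
)
"""
client.query(create_model_sql).result()

# Run batch inference over the source table and materialize the predictions in BigQuery.
predict_sql = """
CREATE OR REPLACE TABLE `my_dataset.predictions` AS
SELECT *
FROM ML.PREDICT(
  MODEL `my_dataset.dnn_regressor`,
  (SELECT * FROM `my_dataset.input_records`)    -- hypothetical 100M-row input table
)
"""
client.query(predict_sql).result()
```

    Because the data never leaves BigQuery and no separate serving or pipeline infrastructure is provisioned, this path tends to involve the least build effort of the listed approaches, which is the property the question asks you to optimize for.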

