Topic 1 Question 3
Your company is building a near real-time streaming pipeline to process JSON telemetry data from small appliances. You need to process messages arriving at a Pub/Sub topic, capitalize letters in the serial number field, and write results to BigQuery. You want to use a managed service and write a minimal amount of code for underlying transformations. What should you do?
A. Use a Pub/Sub to BigQuery subscription, write results directly to BigQuery, and schedule a transformation query to run every five minutes.
B. Use a Pub/Sub to Cloud Storage subscription, and write a Cloud Run service that is triggered when objects arrive in the bucket, performs the transformations, and writes the results to BigQuery.
C. Use the "Pub/Sub to BigQuery" Dataflow template with a UDF, and write the results to BigQuery.
D. Use a Pub/Sub push subscription, and write a Cloud Run service that accepts the messages, performs the transformations, and writes the results to BigQuery.
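For context on the Dataflow-template option the commenters vote for: the "Pub/Sub to BigQuery" template accepts an optional JavaScript UDF that is applied to each message before the write to BigQuery, which is why it needs only a few lines of code. A minimal sketch of such a UDF is below; the function name `process` and the field name `serial_number` are assumptions for illustration, not taken from the question.

```javascript
// Hypothetical UDF for the "Pub/Sub to BigQuery" Dataflow template.
// The template passes each Pub/Sub message payload in as a JSON string
// and expects a JSON string back.
function process(inJson) {
  const row = JSON.parse(inJson);

  // Uppercase the serial number field, if present (field name is assumed).
  if (typeof row.serial_number === "string") {
    row.serial_number = row.serial_number.toUpperCase();
  }

  // The returned JSON string is what the template writes to BigQuery.
  return JSON.stringify(row);
}
```

When launching the template, this file would be staged in Cloud Storage and referenced via the template's UDF path and function-name parameters; no pipeline code has to be written or deployed beyond this function.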
User votes
Comments (4)
- Voted answer: C — trashbox (2025/01/22), 👍 1
  A Dataflow UDF is a simpler coding option than a Cloud Run service.
- Voted answer: C — rich_maverick (2025/02/26), 👍 1
  I agree that C is the best answer. However, answer A is doable, is also low/no code, and is also considered acceptable.
- Voted answer: A — bc3f222 (2025/02/27), 👍 1
  Pub/Sub to BigQuery is now the recommended solution; Dataflow is no longer needed.