Topic 1 Question 45
You need to create a data pipeline that streams event information from applications in multiple Google Cloud regions into BigQuery for near real-time analysis. The data requires transformation before loading. You want to create the pipeline using a visual interface. What should you do?
A. Push event information to a Pub/Sub topic. Create a Dataflow job using the Dataflow job builder.
B. Push event information to a Pub/Sub topic. Create a Cloud Run function to subscribe to the Pub/Sub topic, apply transformations, and insert the data into BigQuery.
C. Push event information to a Pub/Sub topic. Create a BigQuery subscription in Pub/Sub.
D. Push event information to Cloud Storage, and create an external table in BigQuery. Create a BigQuery scheduled job that executes once each day to apply transformations.
Comments (2)
- Selected answer: A
The best answer is A: push event information to a Pub/Sub topic and create a Dataflow job using the Dataflow job builder. It directly addresses every requirement: Pub/Sub ingests streaming events from multiple regions, Dataflow applies the transformations, a Dataflow streaming pipeline delivers near real-time analysis, and the Dataflow job builder provides the visual interface. Option B (Cloud Run) is less suited to robust streaming pipelines and offers no visual pipeline builder. Option C (BigQuery subscription) performs no data transformation. Option D (Cloud Storage with a daily scheduled job) is a batch process, not near real-time streaming.
👍 1 · n2183712847 · 2025/02/27
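To make the accepted answer concrete, below is a minimal Apache Beam sketch of the kind of streaming pipeline the Dataflow job builder assembles visually: read from Pub/Sub, transform, and stream into BigQuery. The project, topic, table, and field names are hypothetical placeholders, not part of the question.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_transform(message: bytes) -> dict:
    """Decode a Pub/Sub event and apply a simple transformation (hypothetical fields)."""
    event = json.loads(message.decode("utf-8"))
    event["event_name"] = event.get("event_name", "").lower()  # example transform
    return event


options = PipelineOptions(streaming=True)  # streaming mode for near real-time analysis

with beam.Pipeline(options=options) as p:
    (
        p
        # Read raw event bytes from a (hypothetical) Pub/Sub topic.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events")
        # Apply the transformation step the question requires.
        | "Transform" >> beam.Map(parse_and_transform)
        # Stream rows into an existing (hypothetical) BigQuery table.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

The job builder generates an equivalent pipeline from the visual canvas, so no code is required in practice; the sketch only shows what the generated Pub/Sub-to-BigQuery streaming job does.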
- Selected answer: A
The Dataflow job builder is visual, and Dataflow is real-time.
👍 1 · n2183712847 · 2025/03/08