Topic 1 Question 282
You are using a Dataflow streaming job to read messages from a message bus that does not support exactly-once delivery. Your job then applies some transformations and loads the result into BigQuery. You want to ensure that your data is being streamed into BigQuery with exactly-once delivery semantics. You expect your ingestion throughput into BigQuery to be about 1.5 GB per second. What should you do?
A. Use the BigQuery Storage Write API and ensure that your target BigQuery table is regional.
B. Use the BigQuery Storage Write API and ensure that your target BigQuery table is multiregional.
C. Use the BigQuery Streaming API and ensure that your target BigQuery table is regional.
D. Use the BigQuery Streaming API and ensure that your target BigQuery table is multiregional.
User votes
Comments (2)
Selected Answer: A
Voting on A
👍 2 · Ed_Kim · 2024/01/02

Selected Answer: A
- BigQuery Storage Write API: This API is designed for high-throughput, low-latency ingestion into BigQuery, and it provides built-in mechanisms to prevent data duplication, which is essential for exactly-once delivery semantics (a minimal pipeline sketch follows this comment).
- Regional table: A regional BigQuery table co-located with the Dataflow job keeps the write path within a single region, which can give better performance and lower latency.
👍 1 · raaad · 2024/01/09
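
A minimal sketch of such a pipeline in Apache Beam (the SDK that Dataflow runs), assuming a hypothetical Pub/Sub topic, project, and table; in Beam streaming pipelines, the STORAGE_WRITE_API method defaults to exactly-once write semantics:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # The message bus lacks exactly-once delivery, so duplicates may
        # arrive here; Pub/Sub stands in as the source for this sketch.
        | "ReadFromBus" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/my-topic")
        # Apply the job's transformations; here, decode bytes into a row dict.
        | "Transform" >> beam.Map(lambda msg: {"payload": msg.decode("utf-8")})
        # Storage Write API sink; this method defaults to exactly-once
        # delivery semantics in streaming mode.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="my-project:my_dataset.my_table",  # hypothetical table
            schema="payload:STRING",
            method=beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API,
        )
    )
```

By contrast, the legacy Streaming API (tabledata.insertAll) offers only best-effort deduplication, which is why options C and D cannot guarantee exactly-once delivery.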