Topic 1 Question 100
You have a requirement to insert minute-resolution data from 50,000 sensors into a BigQuery table. You expect significant growth in data volume and need the data to be available within 1 minute of ingestion for real-time analysis of aggregated trends. What should you do?
A. Use bq load to load a batch of sensor data every 60 seconds.
B. Use a Cloud Dataflow pipeline to stream data into the BigQuery table.
C. Use the INSERT statement to insert a batch of data every 60 seconds.
D. Use the MERGE statement to apply updates in batch every 60 seconds.
Comments (17)
jvg637 (2020/03/19, 28 upvotes): Correct - B. I think we need a pipeline, so it's B to me.

[Removed] (2020/03/22, 16 upvotes): The option I think is correct: B

MaxNRG (2021/12/30, 7 upvotes): It's B. If we expect growth, we'll need a buffer (that will be Pub/Sub) and a Dataflow pipeline to stream the data into BigQuery. The tabledata.insertAll method is not valid here.
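The architecture described in the comments (Pub/Sub as a buffer, Dataflow streaming into BigQuery) could be sketched as a minimal Apache Beam pipeline. The topic name, table name, and schema below are illustrative assumptions, not part of the question; the Beam import is deferred so the parsing helper works on its own.

```python
# Hypothetical sketch of option B: Pub/Sub -> Dataflow (Apache Beam) -> BigQuery
# via streaming inserts. Project, topic, table, and schema are assumed names.
import json


def to_row(message_bytes):
    """Parse one Pub/Sub message (JSON bytes) into a BigQuery row dict."""
    record = json.loads(message_bytes.decode("utf-8"))
    return {
        "sensor_id": record["sensor_id"],
        "ts": record["ts"],  # ISO 8601 timestamp string, e.g. "2024-01-01T00:00:00Z"
        "reading": float(record["reading"]),
    }


def build_pipeline(argv=None):
    # Imported here so to_row() above stays usable without apache_beam installed.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(argv, streaming=True)
    pipeline = beam.Pipeline(options=opts)
    (
        pipeline
        | "ReadSensors" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/sensor-readings")  # assumed topic
        | "ParseJson" >> beam.Map(to_row)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:iot.sensor_readings",  # assumed table
            schema="sensor_id:STRING,ts:TIMESTAMP,reading:FLOAT",
            method=beam.io.WriteToBigQuery.Method.STREAMING_INSERTS,
        )
    )
    return pipeline


if __name__ == "__main__":
    build_pipeline().run()
```

Run on Dataflow with `--runner=DataflowRunner`; streaming inserts make rows available for query within seconds, which satisfies the one-minute freshness requirement, and Dataflow autoscaling absorbs the expected growth in volume.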