Topic 1 Question 14
Your ecommerce website captures user clickstream data to analyze customer traffic patterns in real time and support personalization features on your website. You plan to analyze this data using big data tools. You need a low-latency solution that can store 8 TB of data and can scale to millions of read and write requests per second. What should you do?
A. Write your data into Bigtable and use Dataproc and the Apache HBase libraries for analysis.
B. Deploy a Cloud SQL environment with read replicas for improved performance. Use Datastream to export data to Cloud Storage and analyze with Dataproc and the Cloud Storage connector.
C. Use Memorystore to handle your low-latency requirements and for real-time analytics.
D. Stream your data into BigQuery and use Dataproc and the BigQuery Storage API to analyze large volumes of data.
Comments (10)
A: Write your data into Bigtable and use Dataproc and the Apache HBase libraries for analysis.
👍 4 · pk349 · 2022/12/25 — Answer is A. This is clickstream and time-series data, 8 TB in size, with low-latency reads and writes. The correct answer is A: use Bigtable for storage and either the cbt CLI or the HBase API to interact with the data.
👍 3 · Kloudgeek · 2022/12/19 — Selected answer: A. B couldn't handle this volume of writes and reads, D wouldn't be able to handle the write load, and C wouldn't be suited for this.
👍 2 · fredcaram · 2022/12/19
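As a rough sketch of the approach the top comment describes (Bigtable storage accessed through the HBase-compatible Java client), the snippet below writes and reads a single clickstream row. The project ID, instance ID, table name, column family, and row-key format are placeholder assumptions, not values from the question.

```java
import com.google.cloud.bigtable.hbase.BigtableConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class ClickstreamBigtableSketch {

    // Placeholder identifiers -- replace with your own project, instance, and table.
    private static final String PROJECT_ID = "my-project";
    private static final String INSTANCE_ID = "my-bigtable-instance";
    private static final String TABLE_NAME = "clickstream";
    private static final byte[] CF = Bytes.toBytes("events");

    public static void main(String[] args) throws Exception {
        // BigtableConfiguration.connect() returns a standard HBase Connection
        // backed by Cloud Bigtable, so existing HBase-style code works unchanged.
        try (Connection connection = BigtableConfiguration.connect(PROJECT_ID, INSTANCE_ID);
             Table table = connection.getTable(TableName.valueOf(TABLE_NAME))) {

            // Assumed row-key schema: user ID + event time keeps one user's events
            // contiguous; real keys should be designed to avoid write hotspots.
            String rowKey = "user123#2022-12-25T12:00:00Z";

            // Low-latency single-row write (Put) for one clickstream event.
            Put put = new Put(Bytes.toBytes(rowKey));
            put.addColumn(CF, Bytes.toBytes("page"), Bytes.toBytes("/product/42"));
            put.addColumn(CF, Bytes.toBytes("referrer"), Bytes.toBytes("/search?q=shoes"));
            table.put(put);

            // Low-latency single-row read (Get) for the same event.
            Result result = table.get(new Get(Bytes.toBytes(rowKey)));
            String page = Bytes.toString(result.getValue(CF, Bytes.toBytes("page")));
            System.out.println("page visited: " + page);
        }
    }
}
```

Because this client exposes the standard HBase API on top of Bigtable, the same code can also run in Dataproc jobs that use the Apache HBase libraries, which is why option A fits the low-latency, high-throughput requirement.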