Topic 1 Question 298
You are a developer at a company that operates an ecommerce website. The website stores the customer order data in a Cloud SQL for PostgreSQL database. Data scientists on the marketing team access this data to run their reports. Every time they run these reports, the website's performance is negatively affected. You want to provide access to up-to-date customer order datasets without affecting your website. What should you do?
A. Configure Cloud Scheduler to run an hourly Cloud Function that exports the data from the Cloud SQL database into CSV format and sends the data to a Cloud Storage bucket.
B. Set up a Bigtable table for the data science team. Configure the application to perform dual writes to both Cloud SQL and Bigtable simultaneously.
C. Set up a BigQuery dataset for the data science team. Configure Datastream to replicate the relevant Cloud SQL tables in BigQuery.
D. Create a clone of the PostgreSQL database instance for the data science team. Schedule a job to create a new clone every 15 minutes.
Comments (1)
- Suggested answer: C
BigQuery for Analytics: BigQuery is a serverless data warehouse designed for large-scale analytics. It can handle complex queries over large datasets without touching the operational Cloud SQL database that serves the website.
Datastream for Replication: Datastream continuously replicates the relevant tables from Cloud SQL into BigQuery using change data capture, so the data scientists always have access to up-to-date customer order data while adding virtually no query load to Cloud SQL. They can run their reports and analytics directly on BigQuery, which is optimized for exactly this kind of workload (see the query sketch after this comment).
👍 2 | anshad666 | 2024/10/11
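
For illustration, here is a minimal sketch of how the marketing data scientists could run a report once Datastream has replicated the order tables into BigQuery. It uses the google-cloud-bigquery Python client; the project, dataset, table, and column names (my-project, ecommerce_replica, orders, customer_id, total_amount) are hypothetical placeholders, not part of the question.

    # Minimal sketch: query the Datastream-replicated order data in BigQuery.
    # All resource and column names below are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses Application Default Credentials

    # The report runs against the BigQuery replica, so the Cloud SQL instance
    # serving the website receives no additional query load.
    query = """
        SELECT customer_id, SUM(total_amount) AS lifetime_value
        FROM `my-project.ecommerce_replica.orders`
        GROUP BY customer_id
        ORDER BY lifetime_value DESC
        LIMIT 100
    """

    for row in client.query(query).result():
        print(row.customer_id, row.lifetime_value)

Because the analytical queries run entirely in BigQuery, the website's transactional workload on Cloud SQL is isolated from the reporting load, which is the core reason option C is preferred over hourly exports (A), dual writes (B), or repeated clones (D).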