Topic 1 Question 68
Your team uses the Google Ads platform to visualize metrics. You want to export the data to BigQuery to get more granular insights. You need to execute a one-time transfer of historical data and automatically update data daily. You want a solution that is low-code, serverless, and requires minimal maintenance. What should you do?
A. Export the historical data to BigQuery by using BigQuery Data Transfer Service. Use Cloud Composer for daily automation.
B. Export the historical data to Cloud Storage by using Storage Transfer Service. Use Pub/Sub to trigger a Dataflow template that loads data for daily automation.
C. Export the historical data as a CSV file. Import the file into BigQuery for analysis. Use Cloud Composer for daily automation.
D. Export the historical data to BigQuery by using BigQuery Data Transfer Service. Use BigQuery Data Transfer Service for daily automation.
Community vote
Comments (1)
- Selected answer: D
The best option is D: BigQuery Data Transfer Service (DTS) for both. DTS directly handles both the historical backfill and the daily Google Ads transfer into BigQuery in a low-code, serverless, managed way. Option A (DTS + Composer) is incorrect because Composer adds unnecessary complexity for daily updates that DTS can schedule itself. Option B (Storage Transfer Service + Dataflow) is incorrect because it is overly complex, involves multiple services, is not low-code, and Storage Transfer Service does not pull Google Ads data directly. Option C (CSV + Composer) is incorrect because a manual CSV export and import is not automated, and Composer is again more complex than needed. Therefore Option D, using BigQuery DTS for both the historical and the daily transfers, is the simplest, most direct, best-fit solution.
👍 1 · n2183712847 · 2025/03/05
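To make the answer concrete, here is a minimal sketch of configuring both transfers with the BigQuery Data Transfer Service Python client (google-cloud-bigquery-datatransfer). The project ID, dataset, customer ID, and date range are hypothetical placeholders, and the Google Ads data source/parameter names should be verified against the current DTS documentation for your environment.

```python
# Minimal sketch: one DTS transfer config covers both the recurring daily
# load and a one-time historical backfill. Project, dataset, customer ID,
# and dates below are hypothetical placeholders.
import datetime

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-project")  # hypothetical project ID

# 1) Daily automation: a scheduled Google Ads transfer into BigQuery.
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="ads_reports",        # hypothetical dataset
    display_name="Google Ads daily transfer",
    data_source_id="google_ads",                 # Google Ads connector
    params={"customer_id": "123-456-7890"},      # hypothetical customer ID
    schedule="every 24 hours",
)
config = client.create_transfer_config(
    parent=parent, transfer_config=transfer_config
)
print(f"Created transfer config: {config.name}")

# 2) One-time historical load: request a backfill over a past date range.
start = datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc)
end = datetime.datetime(2025, 1, 1, tzinfo=datetime.timezone.utc)
backfill = client.start_manual_transfer_runs(
    bigquery_datatransfer.StartManualTransferRunsRequest(
        parent=config.name,
        requested_time_range={"start_time": start, "end_time": end},
    )
)
print(f"Requested {len(backfill.runs)} backfill run(s)")
```

In practice the same setup can be done with no code at all, in the console or with `bq mk --transfer_config`, which is what makes the DTS route low-code, serverless, and essentially maintenance-free.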