Topic 1 Question 186
Your new customer has requested daily reports that show their net consumption of Google Cloud compute resources and who used the resources. You need to quickly and efficiently generate these daily reports. What should you do?
A. Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
B. Filter data in Cloud Logging by project, resource, and user; then export the data in CSV format.
C. Filter data in Cloud Logging by project, log type, resource, and user; then import the data into BigQuery.
D. Export Cloud Logging data to Cloud Storage in CSV format. Cleanse the data using Dataprep, filtering by project, resource, and user.
User votes
Comments (17)
A. Do daily exports of Cloud Logging data to BigQuery. Create views filtering by project, log type, resource, and user.
You cannot import custom or filtered billing criteria into BigQuery. There are three types of Cloud Billing data tables with a fixed schema, which must be further drilled down via BigQuery views.
Reference: https://cloud.google.com/billing/docs/how-to/export-data-bigquery#setup
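Answer A can be sketched with the `gcloud` and `bq` CLIs. This is a hedged, illustrative example: the project, dataset, sink, and view names are hypothetical, and the exported table name and log filter depend on which audit logs you actually route (it is not runnable without a configured GCP project).

```shell
# Create a BigQuery dataset to receive the logs (names are illustrative).
bq mk --dataset my-project:compute_logs

# Route Compute Engine audit logs to BigQuery via a log sink.
gcloud logging sinks create compute-usage-sink \
  bigquery.googleapis.com/projects/my-project/datasets/compute_logs \
  --log-filter='resource.type="gce_instance" AND logName:"cloudaudit.googleapis.com"'

# Create a view filtering by project, log type, resource, and user.
# The exported table name follows the log name; adjust to match your export.
bq query --use_legacy_sql=false '
CREATE OR REPLACE VIEW `my-project.compute_logs.daily_usage` AS
SELECT
  timestamp,
  logName,
  resource.type AS resource_type,
  protopayload_auditlog.authenticationInfo.principalEmail AS user,
  protopayload_auditlog.methodName AS action
FROM `my-project.compute_logs.cloudaudit_googleapis_com_activity_*`
WHERE resource.type = "gce_instance"'
```

The daily report then becomes a scheduled query (or export) over the view, rather than re-filtering raw logs each day.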
👍 5 · AWSandeep · 2022/09/02 · Selected Answer: A
A. Option D isn't filtering by log type. B and C are discarded because you need to drill down the exported logs in BigQuery or a similar tool.
👍 3 · devaid · 2022/10/05 · Selected Answer: A
B and D do not consider the log type field. C looks good and I would go for it. However, A looks equally good, and I've found a CloudSkillsBoost lab that describes exactly what answer A does, i.e. exporting logs to BQ and then creating a VIEW: https://www.cloudskillsboost.google/focuses/6100?parent=catalog I think the advantage of exporting complete logs (i.e. filtering them after they reach BQ) is that if we later want to adjust the reporting, we still have the complete logs with all fields available, whereas with C we would need to take extra steps.
👍 3 · maci_f · 2023/01/25