Topic 1 Question 25
You want to use Google Stackdriver Logging to monitor Google BigQuery usage. You need an instant notification to be sent to your monitoring tool when new data is appended to a certain table using an insert job, but you do not want to receive notifications for other tables. What should you do?
A. Make a call to the Stackdriver API to list all logs, and apply an advanced filter.
B. In the Stackdriver logging admin interface, enable a log sink export to BigQuery.
C. In the Stackdriver logging admin interface, enable a log sink export to Google Cloud Pub/Sub, and subscribe to the topic from your monitoring tool.
D. Using the Stackdriver API, create a project sink with an advanced log filter to export to Pub/Sub, and subscribe to the topic from your monitoring tool.
Comments (17)
I would choose D. A and B are wrong since they don't send any notification to the monitoring tool. C has no filter on what will be notified, and we want only certain tables.
👍 44 · jvg637 · 2020/03/11

D, as the key requirement is to get a notification for a particular table. This can be achieved by using an advanced log filter to match only that table's logs and creating a project sink to Cloud Pub/Sub for the notification. Refer to the GCP documentation on Advanced Logs Filters: https://cloud.google.com/logging/docs/view/advanced-queries
A is wrong: an advanced filter helps with filtering, but no notification is sent. B is wrong: it would export all the logs, and BigQuery does not provide notifications. C is wrong: it would export all the logs.
👍 13 · MaxNRG · 2021/11/10

It's definitely D. From https://cloud.google.com/logging/docs/reference/v2/rest/v2/projects.sinks: "a sink used to export log entries to one of the following destinations in any project: a Cloud Storage bucket, a BigQuery dataset, a Pub/Sub topic or a Cloud Logging log bucket. A logs filter controls which log entries are exported." B is wrong because why would you export to BigQuery again if you want to be notified by your monitoring tool?
👍 5 · fire558787 · 2021/08/16
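The approach in option D can be sketched with the `gcloud` CLI. This is a minimal sketch, not a definitive recipe: the project ID, topic name, sink name, dataset and table names below are placeholders, and the log-filter fields assume the legacy BigQuery AuditData audit-log format.

```shell
# Sketch of option D (hypothetical names throughout).
# Assumes the legacy BigQuery AuditData audit-log format.

# 1. Create the Pub/Sub topic the monitoring tool will subscribe to.
gcloud pubsub topics create bq-insert-notify

# 2. Create a project-level sink whose advanced log filter matches only
#    completed load (insert) jobs targeting one specific table.
gcloud logging sinks create bq-table-insert-sink \
  pubsub.googleapis.com/projects/my-project/topics/bq-insert-notify \
  --log-filter='resource.type="bigquery_resource"
    protoPayload.methodName="jobservice.jobcompleted"
    protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.datasetId="mydataset"
    protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.tableId="mytable"'

# 3. Grant the sink's writer identity permission to publish to the topic,
#    then point the monitoring tool at a subscription on that topic.
```

Because the filter names a specific destination table, logs for other tables never reach the topic, which satisfies the "no notifications for other tables" requirement.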