Topic 1 Question 244
You have a Bigtable instance that consists of three nodes that store personally identifiable information (PII) data. You need to log all read or write operations, including any metadata or configuration reads of this database table, in your company’s Security Information and Event Management (SIEM) system. What should you do?
A.
• Navigate to Cloud Monitoring in the Google Cloud console, and create a custom monitoring job for the Bigtable instance to track all changes.
• Create an alert by using webhook endpoints, with the SIEM endpoint as a receiver.
B.
• Navigate to the Audit Logs page in the Google Cloud console, and enable Admin Write logs for the Bigtable instance.
• Create a Cloud Functions instance to export logs from Cloud Logging to your SIEM.
C.
• Navigate to the Audit Logs page in the Google Cloud console, and enable Data Read, Data Write, and Admin Read logs for the Bigtable instance.
• Create a Pub/Sub topic as a Cloud Logging sink destination, and add your SIEM as a subscriber to the topic.
D.
• Install the Ops Agent on the Bigtable instance during configuration.
• Create a service account with read permissions for the Bigtable instance.
• Create a custom Dataflow job with this service account to export logs to the company's SIEM system.
Comments (7)
- Selected Answer: C
Option C is the most appropriate choice for capturing audit and data access logs from a Bigtable instance and sending them to your SIEM system.
- Enabling Data Read, Data Write, and Admin Read logs for the Bigtable instance ensures that the audit logs capture all read and write operations on the table's data, as well as administrative reads such as metadata and configuration access.
- Creating a Pub/Sub topic as a Cloud Logging sink destination allows you to export the logs from Cloud Logging to Pub/Sub. This is a common approach for sending logs to external systems, including SIEMs.
- Adding your SIEM as a subscriber to the Pub/Sub topic ensures that the logs are forwarded to your SIEM system, allowing you to monitor and analyze them for security and compliance purposes.
NB: A Cloud Logging sink destination is a configuration that specifies where logs collected by Google Cloud's Cloud Logging service should be sent or exported. It allows you to control the destination of logs generated by various Google Cloud services, such as Compute Engine, Cloud Storage, BigQuery, and more.
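For reference, here is a minimal Python sketch of the flow this comment describes, using the google-cloud-logging and google-cloud-pubsub client libraries. The project, topic, sink, and subscription names are placeholders I made up, and the log filter is an assumption that may need tuning for your environment:

```python
# Sketch only: route Bigtable audit logs to Pub/Sub via a Cloud Logging sink.
# "my-project", "bigtable-audit-topic", "bigtable-audit-sink", and
# "siem-subscription" are placeholder names.
from google.cloud import logging, pubsub_v1

PROJECT = "my-project"

# 1. Create the Pub/Sub topic that will receive the audit log entries.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "bigtable-audit-topic")
publisher.create_topic(request={"name": topic_path})

# 2. Create a Cloud Logging sink that matches Bigtable audit log entries.
#    Data Read/Write and Admin Read land in the data_access log; Admin Write
#    (always on) lands in the activity log.
logging_client = logging.Client(project=PROJECT)
sink = logging_client.sink(
    "bigtable-audit-sink",
    filter_=(
        '(log_id("cloudaudit.googleapis.com/data_access") OR '
        'log_id("cloudaudit.googleapis.com/activity")) AND '
        'protoPayload.serviceName=("bigtable.googleapis.com" OR '
        '"bigtableadmin.googleapis.com")'
    ),
    destination=f"pubsub.googleapis.com/{topic_path}",
)
sink.create()
# After creation, grant the sink's writer identity roles/pubsub.publisher
# on the topic so the sink can actually deliver entries.

# 3. Create the subscription the SIEM pulls from (or configure a push
#    subscription pointing at the SIEM's ingestion endpoint).
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, "siem-subscription")
subscriber.create_subscription(request={"name": sub_path, "topic": topic_path})
```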
👍 4 · taylz876 · 2023/10/06

- Selected Answer: B
Enabling Admin Write logs for the Bigtable instance in Cloud Logging captures administrative write actions on the instance, including configuration changes and metadata reads related to it. Creating a Cloud Functions instance configured to export logs from Cloud Logging lets you route the captured logs to your SIEM in a format it can understand; the function acts as a serverless processor that forwards the logs via an appropriate method, such as an API call or a message queue.
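If you went this route, the forwarding function might look something like the sketch below: a Python Cloud Function (functions-framework) triggered by a Pub/Sub-based log sink, which is one common way to feed a function from Cloud Logging. The SIEM endpoint URL is hypothetical:

```python
# Hypothetical sketch of the log-forwarding function described above.
# SIEM_ENDPOINT is a made-up URL; a real function would add authentication
# headers, batching, and retry handling.
import base64
import json

import functions_framework
import requests

SIEM_ENDPOINT = "https://siem.example.com/ingest"  # placeholder


@functions_framework.cloud_event
def forward_log_entry(cloud_event):
    # Pub/Sub delivers the LogEntry base64-encoded inside the CloudEvent.
    payload = base64.b64decode(cloud_event.data["message"]["data"])
    log_entry = json.loads(payload)
    requests.post(SIEM_ENDPOINT, json=log_entry, timeout=10)
```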
👍 2 · qannik · 2023/08/05

- Selected Answer: C
Data Access audit logs (except for BigQuery's) are disabled by default, so you need to enable them.
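For context, enabling these logs amounts to adding an auditConfigs entry to the project's IAM policy, which is what the console's Audit Logs page does for you. A rough Python sketch via the Cloud Resource Manager v1 API follows; the project ID is a placeholder and the exact Bigtable service names are an assumption:

```python
# Sketch: enable Data Read, Data Write, and Admin Read audit logs for
# Bigtable by editing the project IAM policy's auditConfigs.
# "my-project" is a placeholder; a real script would merge with any
# existing auditConfigs instead of blindly appending.
from googleapiclient import discovery

PROJECT = "my-project"

crm = discovery.build("cloudresourcemanager", "v1")
policy = crm.projects().getIamPolicy(resource=PROJECT, body={}).execute()

# Assumed service names for Bigtable data and admin operations.
for service in ("bigtable.googleapis.com", "bigtableadmin.googleapis.com"):
    policy.setdefault("auditConfigs", []).append(
        {
            "service": service,
            "auditLogConfigs": [
                {"logType": "DATA_READ"},
                {"logType": "DATA_WRITE"},
                {"logType": "ADMIN_READ"},
            ],
        }
    )

crm.projects().setIamPolicy(resource=PROJECT, body={"policy": policy}).execute()
```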
👍 2 · 3arle · 2023/08/09