Topic 1 Question 312
A team at your organization collects logs in an on-premises security information and event management system (SIEM). You must provide a subset of Google Cloud logs for the SIEM, and minimize the risk of data exposure in your cloud environment. What should you do?
A. Create a new BigQuery dataset. Stream all logs to this dataset. Provide the on-premises SIEM system access to the data in BigQuery by using workload identity federation, and let the SIEM team filter for the relevant log data.
B. Define a log view for the relevant logs. Provide access to the log view to a principal from your on-premises identity provider by using workforce identity federation.
C. Create a log sink for the relevant logs. Send the logs to Pub/Sub. Retrieve the logs from Pub/Sub and push them to the SIEM by using Dataflow.
D. Filter for the relevant logs. Store the logs in a Cloud Storage bucket. Grant a service account access to the bucket. Provide the service account key to the SIEM team.
Community vote
Comments (7)
going with C..
👍 2 · kalbd2212 · 2024/11/21 · Suggested answer: C
Why C is correct:
- Log sink for filtering: a log sink lets you filter and export only the relevant logs, so unnecessary data is never sent and the risk of data exposure is reduced.
- Pub/Sub for delivery: exporting logs to Pub/Sub enables real-time streaming of the filtered logs to external systems, so the SIEM receives them promptly and securely.
- Dataflow for transformation and transfer: Dataflow can process and transform the logs as needed before pushing them to the on-premises SIEM.
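The sink-to-Pub/Sub part of option C can be sketched with the gcloud CLI. This is a minimal illustration only: the project ID, topic name, sink name, and log filter below are placeholder assumptions, not values from the question.

```shell
# Placeholder values -- replace with your own project, topic, and filter.
PROJECT_ID="my-project"
TOPIC="siem-logs"

# 1. Create the Pub/Sub topic that will receive the exported logs.
gcloud pubsub topics create "$TOPIC" --project="$PROJECT_ID"

# 2. Create a log sink that exports only the relevant subset of logs
#    (example filter: warnings and above from Compute Engine instances).
gcloud logging sinks create siem-sink \
  "pubsub.googleapis.com/projects/$PROJECT_ID/topics/$TOPIC" \
  --log-filter='resource.type="gce_instance" AND severity>=WARNING' \
  --project="$PROJECT_ID"

# 3. Grant the sink's writer identity (printed by the previous command)
#    permission to publish to the topic. The member below is a placeholder.
gcloud pubsub topics add-iam-policy-binding "$TOPIC" \
  --member="serviceAccount:SINK_WRITER_IDENTITY" \
  --role="roles/pubsub.publisher" \
  --project="$PROJECT_ID"
```

A Dataflow job (for example, one based on a Pub/Sub-to-HTTP or custom pipeline) would then subscribe to the topic and forward the messages to the on-premises SIEM.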
👍 2 · BPzen · 2024/11/28 · Suggested answer: B
Answer B
👍 1 · abdelrahman89 · 2024/10/24