Topic 1 Question 385
A company stores its internal data in an Amazon S3 bucket. All existing data in the S3 bucket is protected with server-side encryption with Amazon S3 managed keys (SSE-S3). S3 Versioning is enabled. A SysOps administrator must replicate the internal data to another S3 bucket in a different AWS account for disaster recovery. All the existing data has already been copied from the source S3 bucket to the destination S3 bucket.
Which replication solution is MOST operationally efficient?
A. Add a replication rule to the source bucket and specify the destination bucket. Create a bucket policy for the destination bucket to allow the owner of the source bucket to replicate objects.
B. Schedule an AWS Batch job with Amazon EventBridge to copy new objects from the source bucket to the destination bucket. Create a Batch Operations IAM role in the destination account.
C. Configure an Amazon S3 event notification for the source bucket to invoke an AWS Lambda function to copy new objects to the destination bucket. Ensure that the Lambda function has cross-account access permissions.
D. Run a scheduled script on an Amazon EC2 instance to copy new objects from the source bucket to the destination bucket. Assign cross-account access permissions to the EC2 instance's role.
Comments (5)
- Selected Answer: C
C is correct. "All the existing data is copied already," so the solution only needs to handle new objects. https://docs.aws.amazon.com/lambda/latest/dg/with-s3.html
👍 1 · WinAndWin, 2024/01/01
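For concreteness, the Lambda-based approach in option C would look roughly like the following minimal sketch. The bucket name, environment variable, and handler wiring are hypothetical, and cross-account access is not shown: the function's execution role would additionally need a bucket policy in the destination account that allows it to write objects.

```python
import os
import urllib.parse

import boto3

# Hypothetical destination bucket in the DR account. Cross-account access
# must be granted separately: a bucket policy in the destination account
# that allows this function's execution role to write objects.
DEST_BUCKET = os.environ.get("DEST_BUCKET", "internal-data-dr")

s3 = boto3.client("s3")


def handler(event, context):
    """Copy each newly created object to the destination bucket."""
    for record in event["Records"]:
        src_bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        s3.copy_object(
            Bucket=DEST_BUCKET,
            Key=key,
            CopySource={"Bucket": src_bucket, "Key": key},
            # Re-encrypt the copy with SSE-S3, matching the source.
            ServerSideEncryption="AES256",
            # Give the destination account ownership of the copy
            # (needed only if the destination bucket still uses ACLs).
            ACL="bucket-owner-full-control",
        )
```

This works, but it means owning a function, its permissions, and its failure handling, which is exactly the complexity the next comment weighs against built-in replication.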
- Selected Answer: A
A: Using S3 cross-account replication, with a replication rule on the source bucket and a bucket policy on the destination bucket, is the most operationally efficient solution for replicating data between S3 buckets in different AWS accounts (see the configuration sketch after the comments).
While S3 event notifications and Lambda functions can be powerful, they add complexity, and setting up cross-account access permissions for Lambda functions involves additional configuration.
👍 1 · nharaz, 2024/01/01
- Answer A seems to be way more efficient.
👍 1 · Kipalom, 2024/01/01
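To make the consensus answer concrete, here is a minimal boto3 sketch of option A. The bucket names, account IDs, and replication role ARN are hypothetical, and the role itself is assumed to already exist; the replication rule is configured in the source account, while the bucket policy is applied with destination-account credentials.

```python
import json

import boto3

# Hypothetical names and IDs for illustration.
SOURCE_BUCKET = "internal-data-source"
DEST_BUCKET = "internal-data-dr"
DEST_ACCOUNT_ID = "222222222222"
REPLICATION_ROLE_ARN = "arn:aws:iam::111111111111:role/s3-replication-role"

# In the source account: add a replication rule to the source bucket.
# S3 Versioning is already enabled on both buckets in this scenario.
source_s3 = boto3.client("s3")
source_s3.put_bucket_replication(
    Bucket=SOURCE_BUCKET,
    ReplicationConfiguration={
        "Role": REPLICATION_ROLE_ARN,
        "Rules": [
            {
                "ID": "dr-replication",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {},  # replicate the whole bucket
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": f"arn:aws:s3:::{DEST_BUCKET}",
                    "Account": DEST_ACCOUNT_ID,
                    # Make the destination account own the replicas.
                    "AccessControlTranslation": {"Owner": "Destination"},
                },
            }
        ],
    },
)

# In the destination account: allow the replication role to write replicas.
dest_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReplicationFromSourceAccount",
            "Effect": "Allow",
            "Principal": {"AWS": REPLICATION_ROLE_ARN},
            "Action": [
                "s3:ReplicateObject",
                "s3:ReplicateDelete",
                "s3:ReplicateTags",
                "s3:ObjectOwnerOverrideToBucketOwner",
            ],
            "Resource": f"arn:aws:s3:::{DEST_BUCKET}/*",
        }
    ],
}
dest_s3 = boto3.client("s3")  # run with destination-account credentials
dest_s3.put_bucket_policy(Bucket=DEST_BUCKET, Policy=json.dumps(dest_policy))
```

Once this one-time configuration is in place, S3 replicates new objects automatically with nothing to schedule, run, or monitor, which is why option A is the most operationally efficient choice.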