Topic 1 Question 66
You set up a streaming data insert into a Redis cluster via a Kafka cluster. Both clusters are running on Compute Engine instances. You need to encrypt data at rest with encryption keys that you can create, rotate, and destroy as needed. What should you do?
A. Create a dedicated service account, and use encryption at rest to reference your data stored in your Compute Engine cluster instances as part of your API service calls.
B. Create encryption keys in Cloud Key Management Service. Use those keys to encrypt your data in all of the Compute Engine cluster instances.
C. Create encryption keys locally. Upload your encryption keys to Cloud Key Management Service. Use those keys to encrypt your data in all of the Compute Engine cluster instances.
D. Create encryption keys in Cloud Key Management Service. Reference those keys in your API service calls when accessing the data in your Compute Engine cluster instances.
User votes
Comments (17)
Dear Admin, almost every answer is incorrect. Please check the comments and update your website.
👍 35 · SonuKhan1 · 2021/10/29 · correct: B
👍 19 · [Removed] · 2020/03/21 · I think the correct answer is B
A makes no sense; you need to use your own keys. You also don't just create keys locally and "upload" them: to make that work you would have to import them into KMS properly (wrapping them with the KMS public key), not simply upload them, so C is out as well. That leaves B and D. Cloud KMS is a cloud-hosted key management service that lets you manage cryptographic keys for your cloud services the same way you do on-premises: you can generate, use, rotate, and destroy cryptographic keys there. Since you want to encrypt data at rest, the answer is B; you don't reference the keys in your API calls. https://cloud.google.com/compute/docs/disks/customer-managed-encryption
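Not part of the original comment, but a minimal Python sketch of the key lifecycle the comment describes (create, schedule rotation for, and destroy a key) using the google-cloud-kms client library. The project, location, key ring, and key names are hypothetical placeholders.

```python
import time
from google.cloud import kms

client = kms.KeyManagementServiceClient()

project_id = "my-project"        # hypothetical project
location_id = "us-central1"      # hypothetical location
key_ring_id = "streaming-ring"   # hypothetical key ring name
key_id = "disk-cmek-key"         # hypothetical key name

# Create a key ring to hold the key.
location_name = f"projects/{project_id}/locations/{location_id}"
key_ring = client.create_key_ring(
    request={"parent": location_name, "key_ring_id": key_ring_id, "key_ring": {}}
)

# Create a symmetric key with an automatic 90-day rotation schedule.
key = {
    "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
    "rotation_period": {"seconds": 90 * 24 * 60 * 60},
    "next_rotation_time": {"seconds": int(time.time()) + 90 * 24 * 60 * 60},
}
crypto_key = client.create_crypto_key(
    request={"parent": key_ring.name, "crypto_key_id": key_id, "crypto_key": key}
)
print("Created key:", crypto_key.name)

# Destroy a key version when it is no longer needed (shown here only to
# illustrate the lifecycle; destruction is scheduled, not immediate).
version_name = f"{crypto_key.name}/cryptoKeyVersions/1"
client.destroy_crypto_key_version(request={"name": version_name})
```

To actually protect the Compute Engine data at rest, the key created above would be attached to the instances' disks as a customer-managed encryption key (CMEK), as described in the linked documentation, rather than referenced in data-access API calls.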
👍 7 · MaxNRG · 2021/12/21