Topic 1 Question 125
Your team is building a service that performs compute-heavy processing on batches of data. The data is processed faster based on the speed and number of CPUs on the machine. These batches of data vary in size and may arrive at any time from multiple third-party sources. You need to ensure that third parties are able to upload their data securely. You want to minimize costs, while ensuring that the data is processed as quickly as possible. What should you do?
A. Provide a secure file transfer protocol (SFTP) server on a Compute Engine instance so that third parties can upload batches of data, and provide appropriate credentials to the server. Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group. Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
B. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket. Use a standard Google Kubernetes Engine (GKE) cluster and maintain two services: one that processes the batches of data, and one that monitors Cloud Storage for new batches of data. Stop the processing service when there are no batches of data to process.
C. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket. Create a Cloud Function with a google.storage.object.finalize Cloud Storage trigger. Write code so that the function can scale up a Compute Engine autoscaling managed instance group. Use an image pre-loaded with the data processing software that terminates the instances when processing completes.
D. Provide a Cloud Storage bucket so that third parties can upload batches of data, and provide appropriate Identity and Access Management (IAM) access to the bucket. Use Cloud Monitoring to detect new batches of data in the bucket and trigger a Cloud Function that processes the data. Configure the Cloud Function to use the largest CPU possible to minimize the runtime of the processing.
Community vote
Comments (3)
- Selected answer: C
I would go with C; using GCS is cost-effective and secure compared to the other options. D's Cloud Function with a large CPU results in high cost.
👍 3 · Jason_Cloud_at · 2023/10/26
- Selected answer: C
I would say C. GCS is not that expensive, and you can set lifecycle rules to archive old data. GCE is better suited to compute-heavy batch jobs than Cloud Functions.
👍 2 · Andrei_Z · 2023/11/21
- Selected answer: C
The recommended solution is option C:
Provide a Cloud Storage bucket for third parties to upload batches of data, and use a Cloud Function with a google.storage.object.finalize trigger to scale up a Compute Engine autoscaling managed instance group. This approach ensures secure data uploads to a Cloud Storage bucket with proper IAM access controls.
The Cloud Function, triggered upon new object finalization in the bucket, scales up a managed instance group with pre-loaded data processing software, optimizing for compute-heavy tasks. The instances terminate upon completion, minimizing costs.
This design efficiently leverages serverless and autoscaling capabilities, ensuring quick and cost-effective processing of data batches arriving at varying times from multiple sources.
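To make the mechanism in option C concrete, here is a minimal sketch of a background Cloud Function bound to the google.storage.object.finalize trigger that resizes a managed instance group via the google-cloud-compute client library. The project, zone, group name, and sizing constants (`BYTES_PER_INSTANCE`, `MAX_INSTANCES`) are hypothetical placeholders, not values from the question; real sizing would depend on the processing software's throughput.

```python
import math

# Hypothetical tuning parameters: assume each VM can absorb ~512 MiB per batch,
# and cap the group to bound cost. Adjust for the actual workload.
BYTES_PER_INSTANCE = 512 * 1024 * 1024
MAX_INSTANCES = 10


def target_instances(object_size_bytes: int, current_size: int) -> int:
    """Compute the managed-instance-group size needed for a new batch.

    Never shrinks the group (instances terminate themselves when done,
    per the image described in option C) and never exceeds the cap.
    """
    needed = math.ceil(object_size_bytes / BYTES_PER_INSTANCE)
    return min(MAX_INSTANCES, max(current_size, needed))


def on_batch_uploaded(event, context):
    """Entry point for the google.storage.object.finalize trigger.

    `event` is the Cloud Storage object metadata; its "size" field is a
    string with the object's size in bytes.
    """
    # Lazy import keeps cold starts fast and the pure logic testable offline.
    from google.cloud import compute_v1

    size_bytes = int(event.get("size", 0))
    client = compute_v1.InstanceGroupManagersClient()
    # Hypothetical project/zone/group names for illustration.
    mig = client.get(project="my-project", zone="us-central1-a",
                     instance_group_manager="batch-workers")
    client.resize(project="my-project", zone="us-central1-a",
                  instance_group_manager="batch-workers",
                  size=target_instances(size_bytes, mig.target_size))
```

The scaling decision is kept in a pure function so it can be unit-tested without GCP credentials; the function only grows the group, since the pre-loaded image terminates each instance once its processing completes.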
👍 1 · xhilmi · 2023/12/06