Topic 1 Question 2
At Dress4Win, an operations engineer wants to create a low-cost solution to remotely archive copies of database backup files. The database files are compressed tar files stored in their current data center. How should he proceed?
A. Create a cron script using gsutil to copy the files to a Coldline Storage bucket.
B. Create a cron script using gsutil to copy the files to a Regional Storage bucket.
C. Create a Storage Transfer Service job to copy the files to a Coldline Storage bucket.
D. Create a Storage Transfer Service job to copy the files to a Regional Storage bucket.
Explanation
Follow these rules of thumb when deciding whether to use gsutil or Storage Transfer Service:
- When transferring data from an on-premises location, use gsutil.
- When transferring data from another cloud storage provider, use Storage Transfer Service.
- Otherwise, evaluate both tools with respect to your specific scenario. Use this guidance as a starting point. The specific details of your transfer scenario will also help you determine which tool is more appropriate.
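To make the cron-plus-gsutil approach concrete, a minimal sketch of a scheduled copy to a Coldline bucket is shown below. The bucket name and backup path are hypothetical, and it assumes the bucket was created beforehand with the Coldline storage class (e.g. `gsutil mb -c coldline gs://dress4win-db-archive`):

```shell
# Hypothetical crontab entry (crontab -e): run nightly at 02:00.
# -m enables parallel transfers; rsync -r recursively syncs the
# local backup directory and skips files already in the bucket,
# so only new tarballs are uploaded each night.
0 2 * * * gsutil -m rsync -r /var/backups/db gs://dress4win-db-archive
```

Because the bucket's default storage class is Coldline, objects written by this job are billed at archive rates, which matches the low-cost requirement in the question.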
Comments (17)
Should be C: https://cloud.google.com/storage-transfer/docs/on-prem-overview — especially since Google's docs explicitly state that custom scripts are unreliable, slow, insecure, and difficult to maintain and troubleshoot.
👍 33 — Ayzen, 2020/04/27

Answer should be C. As per the latest case study on the Google Cloud website, they have 1 PB of DB storage, of which 600 TB is used, so that gives you the size of the data. These are the rules of thumb per the GCP documentation:
- Transferring from another cloud storage provider: use Storage Transfer Service
- Transferring less than 1 TB from on-premises: use gsutil
- Transferring more than 1 TB from on-premises: use Transfer Service for on-premises data
👍 15 — SamirJ, 2020/10/14

IMPORTANT: Dress4Win is no longer part of the officially listed case studies: https://cloud.google.com/certification/guides/professional-cloud-architect

👍 5 — jabrrJ68w02ond1, 2022/11/01