Topic 1 Question 71
You are migrating data from a legacy on-premises MySQL database to Google Cloud. The database contains various tables with different data types and sizes, including large tables with millions of rows and transactional data. You need to migrate this data while maintaining data integrity and minimizing downtime and cost. What should you do?
A. Set up a Cloud Composer environment to orchestrate a custom data pipeline. Use a Python script to extract data from the MySQL database and load it to MySQL on Compute Engine.
B. Export the MySQL database to CSV files, transfer the files to Cloud Storage by using Storage Transfer Service, and load the files into a Cloud SQL for MySQL instance.
C. Use Database Migration Service to replicate the MySQL database to a Cloud SQL for MySQL instance.
D. Use Cloud Data Fusion to migrate the MySQL database to MySQL on Compute Engine.
Comments (1)
- Suggested answer: C
The best option is C: use Database Migration Service (DMS). DMS is purpose-built for migrating MySQL to Cloud SQL, preserving data integrity and minimizing downtime through continuous replication. Option A (Composer, Python, Compute Engine) is incorrect because Composer adds unnecessary orchestration complexity, a custom Python script is manual and error-prone, and MySQL on Compute Engine is less managed than Cloud SQL. Option B (CSV export, Storage Transfer Service, Cloud SQL) is incorrect because CSV export is slow for large tables, risks data-integrity issues, and requires significant downtime. Option D (Data Fusion to Compute Engine) is incorrect because Data Fusion is overkill for a straightforward migration, and Compute Engine is less managed than Cloud SQL. Therefore, option C, DMS, is the most direct and efficient migration solution (see the sketch after this comment).
👍 1 · n2183712847 · 2025/03/05
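For context, a DMS migration like the one option C describes can be set up from the console, gcloud, or the Python client library. Below is a minimal, hypothetical sketch using the google-cloud-dms Python client; the project, region, job ID, and connection-profile names are placeholders, and it assumes the source (on-premises MySQL) and destination (Cloud SQL) connection profiles have already been created.

```python
# Minimal sketch: create a continuous DMS migration job (MySQL -> Cloud SQL).
# All IDs below (project, region, profiles, job name) are placeholders.
from google.cloud import clouddms_v1

client = clouddms_v1.DataMigrationServiceClient()
parent = "projects/my-project/locations/us-central1"  # placeholder project/region

# CONTINUOUS replication keeps the source and Cloud SQL in sync until
# cutover, which is what minimizes downtime for large transactional tables.
job = clouddms_v1.MigrationJob(
    display_name="legacy-mysql-to-cloudsql",
    type_=clouddms_v1.MigrationJob.Type.CONTINUOUS,
    source=f"{parent}/connectionProfiles/onprem-mysql-profile",        # assumed to exist
    destination=f"{parent}/connectionProfiles/cloudsql-mysql-profile", # assumed to exist
)

operation = client.create_migration_job(
    parent=parent,
    migration_job=job,
    migration_job_id="legacy-mysql-to-cloudsql",
)
print(f"Created migration job: {operation.result().name}")
```

Note that creating the job is only the first step: it still has to be verified and started (the API exposes RPCs such as start_migration_job), and cutover happens by promoting the Cloud SQL instance once replication has caught up.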