Topic 1 Question 261
You want to migrate your existing Teradata data warehouse to BigQuery. You want to move the historical data to BigQuery by using the most efficient method that requires the least amount of programming, but local storage space on your existing data warehouse is limited. What should you do?
A. Use BigQuery Data Transfer Service by using the Java Database Connectivity (JDBC) driver with FastExport connection.
B. Create a Teradata Parallel Transporter (TPT) export script to export the historical data, and import to BigQuery by using the bq command-line tool.
C. Use BigQuery Data Transfer Service with the Teradata Parallel Transporter (TPT) tbuild utility.
D. Create a script to export the historical data, and upload in batches to Cloud Storage. Set up a BigQuery Data Transfer Service instance from Cloud Storage to BigQuery.
Comments (2)
Comment by rahulvin (2023/12/30, 👍 2) — Selected answer: A
https://cloud.google.com/bigquery/docs/migration/teradata-overview#extraction_method
Lack of local storage pushes this toward the JDBC driver with FastExport.

Comment by raaad (2024/01/05, 👍 2) — Selected answer: A
- Reduced local storage: by using FastExport, data is streamed directly from Teradata to BigQuery without the need for local staging storage, which addresses the storage limitation.
- Minimal programming: BigQuery Data Transfer Service offers a managed, user-friendly setup, eliminating the need for extensive scripting or coding.
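For reference, setting up the transfer described in option A is largely declarative rather than programmatic. Below is a minimal sketch of creating a Teradata transfer configuration with the bq command-line tool; the project, dataset, bucket, database, and agent names are placeholders, and the exact `--params` keys are assumptions that should be checked against the BigQuery Teradata migration guide.

```shell
# Hypothetical sketch only: create a BigQuery Data Transfer Service
# configuration for an on-premises Teradata migration. All identifiers
# (project, dataset, bucket, database) are placeholders.
bq mk --transfer_config \
  --project_id=my-project \
  --target_dataset=my_dataset \
  --display_name="Teradata historical load" \
  --data_source=on_premises \
  --params='{
      "database_type": "Teradata",
      "database_name": "warehouse_db",
      "bucket": "my-staging-bucket"
    }'
```

The on-premises migration agent that accompanies the transfer then performs the extraction; per the linked overview, choosing JDBC with FastExport lets it stream rows out of Teradata instead of staging full TPT export files on the limited local disk.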