Topic 1 Question 122
You decided to use Cloud Datastore to ingest vehicle telemetry data in real time. You want to build a storage system that will account for long-term data growth while keeping costs low. You also want to create periodic snapshots of the data so that you can perform point-in-time (PIT) recovery, or clone a copy of the data into Cloud Datastore in a different environment. You want to archive these snapshots for a long time. Which two methods can accomplish this? (Choose two.)
A. Use managed export, and store the data in a Cloud Storage bucket using the Nearline or Coldline storage class.
B. Use managed export, and then import to Cloud Datastore in a separate project under a unique namespace reserved for that export.
C. Use managed export, and then import the data into a BigQuery table created just for that export, and delete temporary export files.
D. Write an application that uses the Cloud Datastore client libraries to read all the entities. Treat each entity as a BigQuery table row via BigQuery streaming insert. Assign an export timestamp to each export, and attach it as an extra column on each row. Make sure that the BigQuery table is partitioned using the export timestamp column.
E. Write an application that uses the Cloud Datastore client libraries to read all the entities. Format the exported data into a JSON file. Apply compression before storing the data in Cloud Source Repositories.
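Option D above describes a hand-rolled export pipeline. As a rough illustration only (the project, kind, dataset, and table names below are hypothetical, and a real job would batch the inserts and handle non-JSON-serializable property types), such an application might look like this:

    from datetime import datetime, timezone
    from google.cloud import bigquery, datastore

    # All identifiers below are hypothetical placeholders.
    PROJECT = "my-project"
    KIND = "VehicleTelemetry"
    TABLE_ID = "my-project.telemetry_archive.datastore_export"

    datastore_client = datastore.Client(project=PROJECT)
    bq_client = bigquery.Client(project=PROJECT)

    # One timestamp per export run, attached to every row and used as the
    # partitioning column of the pre-created, time-partitioned table.
    export_ts = datetime.now(timezone.utc).isoformat()

    rows = []
    for entity in datastore_client.query(kind=KIND).fetch():
        row = dict(entity)                  # entity properties become columns
        row["export_timestamp"] = export_ts
        rows.append(row)

    # Streaming insert into BigQuery; assumes property values are JSON-serializable.
    errors = bq_client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery streaming insert errors: {errors}")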
Comments (17)
- 👍 35 | Ganshank | 2020/04/13
A and C. https://cloud.google.com/datastore/docs/export-import-entities
C: To import only a subset of entities, or to import data into BigQuery, you must specify an entity filter in your export.
B: Not correct, since you want to store the snapshot in a different environment than Datastore. Though this statement is true: data exported from one Datastore mode database can be imported into another Datastore mode database, even one in another project.
A is correct: under billing and pricing for managed exports and imports in Datastore, output files stored in Cloud Storage count towards your Cloud Storage data storage costs.
Steps to export all the entities:
- In the Google Cloud Console, go to the Datastore Export page.
- Set the Namespace field to All Namespaces, and set the Kind field to All Kinds.
- Below Destination, enter the name of your Cloud Storage bucket.
- Click Export.
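The same export can also be triggered programmatically, for example from a scheduled job. A minimal sketch using the Datastore Admin client library, assuming a placeholder project my-project and an existing bucket gs://my-datastore-exports whose default storage class is Nearline or Coldline:

    from google.cloud import datastore_admin_v1

    # Placeholder names: project and bucket are illustrative only.
    client = datastore_admin_v1.DatastoreAdminClient()
    operation = client.export_entities(
        request={
            "project_id": "my-project",
            "output_url_prefix": "gs://my-datastore-exports",
            # An empty entity_filter exports all kinds in all namespaces.
        }
    )
    response = operation.result()   # blocks until the export operation finishes
    print(response.output_url)      # path to the .overall_export_metadata file

The output_url returned here is what a later managed import would consume.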
- 👍 23 | atnafu2020 | 2020/08/26
A for sure. Then I was undecided between B and C; B has high costs and C has low costs (storage is more expensive in Datastore). However, the question says that you want the data to be usable by Datastore, and there is no native way to export data from BigQuery back to Datastore, hence the only two options that allow the data to be restored to Datastore are A and B.
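To illustrate the restore path discussed above, here is a minimal sketch of a managed import of an earlier export into a Datastore database in a different project; the target project ID and the export metadata path are hypothetical:

    from google.cloud import datastore_admin_v1

    # Hypothetical target project and metadata file path from an earlier export.
    client = datastore_admin_v1.DatastoreAdminClient()
    operation = client.import_entities(
        request={
            "project_id": "my-clone-project",
            "input_url": (
                "gs://my-datastore-exports/2020-08-26T00:00:00_1234/"
                "2020-08-26T00:00:00_1234.overall_export_metadata"
            ),
        }
    )
    operation.result()   # blocks until the import finishes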
- 👍 7 | fire558787 | 2021/08/18