Topic 2 Question 78
What is the recommended action to switch between SSD and HDD storage for your Google Cloud Bigtable instance?
A. Create a third instance and sync the data from the two storage types via batch jobs.
B. Export the data from the existing instance and import the data into a new instance.
C. Run parallel instances where one is HDD and the other is SSD.
D. The selection is final and you must continue using the same storage type.
Correct Answer: B
Explanation
When you create a Cloud Bigtable instance and cluster, your choice of SSD or HDD storage for the cluster is permanent. You cannot use the Google Cloud Platform Console to change the type of storage that is used for the cluster. If you need to convert an existing HDD cluster to SSD, or vice-versa, you can export the data from the existing instance and import the data into a new instance.
Alternatively, you can write a Cloud Dataflow or Hadoop MapReduce job that copies the data from one instance to another.
Reference: https://cloud.google.com/bigtable/docs/choosing-ssd-hdd
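The Dataflow route uses Beam's Bigtable connectors; as a simpler single-process illustration of the same copy idea, here is a minimal sketch using the google-cloud-bigtable Python client. The project, instance, and table names are hypothetical, and the destination table (with matching column families) is assumed to already exist on the new SSD instance:

    from google.cloud import bigtable

    client = bigtable.Client(project="my-project")
    src_table = client.instance("my-hdd-instance").table("my-table")
    dst_table = client.instance("my-ssd-instance").table("my-table")  # pre-created

    batch = []
    for row in src_table.read_rows():
        # Rebuild each row cell by cell, preserving family, qualifier,
        # value, and timestamp.
        new_row = dst_table.direct_row(row.row_key)
        for family, qualifiers in row.cells.items():
            for qualifier, cells in qualifiers.items():
                for cell in cells:
                    new_row.set_cell(family, qualifier, cell.value,
                                     timestamp=cell.timestamp)
        batch.append(new_row)
        if len(batch) == 500:           # flush mutations in batches
            dst_table.mutate_rows(batch)
            batch = []
    if batch:
        dst_table.mutate_rows(batch)

For production-sized tables you would use the documented export/import path or a parallel Dataflow job instead; this sketch only shows the copy mechanics.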
Comments (17)
QUESTION 2
Your company built a TensorFlow neural-network model with a large number of neurons and layers. The model fits well for the training data. However, when tested against new data, it performs poorly. What method can you employ to address this?
A. Threading
B. Serialization
C. Dropout Methods
D. Dimensionality Reduction
Correct Answer: C
👍 9 · nez15 · 2019/12/23
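A model that fits the training data but performs poorly on new data is overfitting; dropout randomly zeroes activations during training to counter this. A minimal sketch with tf.keras, where the layer sizes and dropout rates are illustrative rather than taken from the question:

    import tensorflow as tf

    # Over-parameterized classifier: each Dropout layer randomly zeroes
    # 50% of the preceding activations during training, which reduces
    # co-adaptation between neurons. Dropout is a no-op at inference.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(100,)),
        tf.keras.layers.Dense(512, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")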
QUESTION 16
You need to store and analyze social media postings in Google BigQuery at a rate of 10,000 messages per minute in near real-time. You initially design the application to use streaming inserts for individual postings. Your application also performs data aggregations right after the streaming inserts. You discover that the queries after streaming inserts do not exhibit strong consistency, and reports from the queries might miss in-flight data. How can you adjust your application design?
A. Re-write the application to load accumulated data every 2 minutes.
B. Convert the streaming insert code to batch load for individual messages.
C. Load the original message to Google Cloud SQL, and export the table every hour to BigQuery via streaming inserts.
D. Estimate the average latency for data availability after streaming inserts, and always run queries after waiting twice as long.
Correct Answer: D
👍 8 · nez15 · 2019/12/23
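For context, this is the streaming-insert path the question describes, sketched with the google-cloud-bigquery Python client; the project, dataset, table, and row schema here are hypothetical. Rows inserted this way land in the streaming buffer and become queryable only after a small, variable delay, which is the consistency gap the question is about:

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    table_id = "my-project.social.postings"  # hypothetical table

    rows = [{"user": "alice", "message": "hello",
             "posted_at": "2019-12-23T10:00:00Z"}]

    # Streaming inserts return quickly, but the rows sit in a streaming
    # buffer and are only eventually visible to queries, so aggregations
    # run immediately afterward may miss in-flight data.
    errors = client.insert_rows_json(table_id, rows)
    if errors:
        print("insert errors:", errors)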
QUESTION 156
Your company is selecting a system to centralize data ingestion and delivery. You are considering messaging and data integration systems to address the requirements. The key requirements are:
- The ability to seek to a particular offset in a topic, possibly back to the start of all data ever captured
- Support for publish/subscribe semantics on hundreds of topics
- Retain per-key ordering
Which system should you choose?
A. Apache Kafka
B. Cloud Storage
C. Cloud Pub/Sub
D. Firebase Cloud Messaging
Correct Answer: A
👍 6 · nez15 · 2019/12/24
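The deciding requirement is seeking to an arbitrary offset, which Kafka supports natively (and Cloud Pub/Sub did not at the time of this question). A minimal sketch using the kafka-python client; the broker address and topic name are hypothetical:

    from kafka import KafkaConsumer, TopicPartition

    consumer = KafkaConsumer(bootstrap_servers="broker:9092")
    partition = TopicPartition("events", 0)
    consumer.assign([partition])

    # Seek back to offset 0 to replay all retained data from the start
    # of the partition. Kafka preserves ordering within a partition, and
    # records sharing a key hash to the same partition, which is what
    # gives the per-key ordering the question requires.
    consumer.seek(partition, 0)
    for record in consumer:
        print(record.offset, record.key, record.value)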