Topic 1 Question 138
You have several Spark jobs that run on a Cloud Dataproc cluster on a schedule. Some of the jobs run in sequence, and some of the jobs run concurrently. You need to automate this process. What should you do?
A. Create a Cloud Dataproc Workflow Template
B. Create an initialization action to execute the jobs
C. Create a Directed Acyclic Graph in Cloud Composer
D. Create a Bash script that uses the Cloud SDK to create a cluster, execute jobs, and then tear down the cluster
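For reference, a Dataproc workflow template can express both the sequential and the concurrent parts of the question: each job is a step, and `prerequisiteStepIds` defines ordering, so steps with the same prerequisite run concurrently. A minimal sketch (all cluster names, classes, and bucket paths below are hypothetical placeholders):

```yaml
# Hypothetical template: job-b and job-c run concurrently after
# job-a completes; job-d waits for both. Names/paths are placeholders.
placement:
  managedCluster:
    clusterName: example-cluster        # hypothetical
    config:
      gceClusterConfig:
        zoneUri: us-central1-a
jobs:
- stepId: job-a
  sparkJob:
    mainClass: com.example.JobA         # hypothetical
    jarFileUris:
    - gs://example-bucket/job-a.jar     # hypothetical
- stepId: job-b
  prerequisiteStepIds: [job-a]          # runs after job-a
  sparkJob:
    mainClass: com.example.JobB
    jarFileUris:
    - gs://example-bucket/job-b.jar
- stepId: job-c
  prerequisiteStepIds: [job-a]          # runs concurrently with job-b
  sparkJob:
    mainClass: com.example.JobC
    jarFileUris:
    - gs://example-bucket/job-c.jar
- stepId: job-d
  prerequisiteStepIds: [job-b, job-c]   # waits for both branches
  sparkJob:
    mainClass: com.example.JobD
    jarFileUris:
    - gs://example-bucket/job-d.jar
```

Note that the template defines the dependency graph but not the schedule; running it on a schedule still requires an external trigger such as Cloud Composer or Cloud Scheduler, which is the crux of the A-versus-C debate in the comments below.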
User votes
Comments (11)
- Choice believed correct: C
C. Composer fits better for scheduling Dataproc workflows; check the documentation: https://cloud.google.com/dataproc/docs/concepts/workflows/workflow-schedule-solutions
Also, A alone is not enough: a Dataproc Workflow Template does not have a native scheduling option.
👍 4 | devaid | 2022/10/16: Correct answer is A. https://cloud.google.com/dataproc/docs/concepts/workflows/using-workflows
👍 3 | LP_PDE | 2022/10/02: C. Create a Directed Acyclic Graph in Cloud Composer
👍 3 | AzureDP900 | 2022/12/31
