Topic 1 Question 295
You are designing the architecture to process your data from Cloud Storage to BigQuery by using Dataflow. The network team provided you with the Shared VPC network and subnetwork to be used by your pipelines. You need to enable the deployment of the pipeline on the Shared VPC network. What should you do?
A. Assign the compute.networkUser role to the Dataflow service agent.
B. Assign the compute.networkUser role to the service account that executes the Dataflow pipeline.
C. Assign the dataflow.admin role to the Dataflow service agent.
D. Assign the dataflow.admin role to the service account that executes the Dataflow pipeline.
User votes
Comments (4)
- Selected Answer: A
- The Dataflow service agent is responsible for setting up and managing the network resources that Dataflow requires.
- By granting the compute.networkUser role to this service agent, we enable it to provision the necessary network resources within the Shared VPC for your Dataflow job.
👍 2 — raaad, 2024/01/11

B. Assign the compute.networkUser role to the service account that executes the Dataflow pipeline. See the reference: https://cloud.google.com/dataflow/docs/guides/specifying-networks
👍 1 — GCP001, 2024/01/07

- Selected Answer: B
Option B is correct.
Explanation: You need to grant the compute.networkUser role to the service account that executes the pipeline, because it must deploy the necessary worker nodes in the Shared VPC host project.
Option A is incorrect: the Dataflow service agent is a Google-managed service account and is not responsible for running or deploying workers in the Shared VPC.
Options C and D are incorrect: dataflow.admin grants elevated privileges to create and manage all Dataflow components, not to deploy resources in the Shared VPC.
👍 1 — BIGQUERY_ALT_ALT, 2024/01/11
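The setup the comments describe can be sketched with two commands: grant the role on the host project, then launch the job against the shared subnetwork. This is a minimal sketch only; the project IDs, service-account name, region, subnetwork name, and pipeline file are hypothetical placeholders, not values given in the question.

```shell
# Sketch: all IDs below (HOST_PROJECT_ID, SERVICE_PROJECT_ID, the
# service-account name, region, and subnetwork) are placeholders.

# 1. Grant the worker service account that runs the Dataflow pipeline
#    the Compute Network User role on the Shared VPC host project.
gcloud projects add-iam-policy-binding HOST_PROJECT_ID \
  --member="serviceAccount:dataflow-worker@SERVICE_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/compute.networkUser"

# 2. Launch the pipeline, referencing the shared subnetwork by its full
#    resource URL (required when the subnetwork lives in a host project).
python my_pipeline.py \
  --runner=DataflowRunner \
  --project=SERVICE_PROJECT_ID \
  --region=us-central1 \
  --service_account_email=dataflow-worker@SERVICE_PROJECT_ID.iam.gserviceaccount.com \
  --subnetwork=https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/us-central1/subnetworks/SHARED_SUBNET
```

Note that the role can also be granted on the individual subnetwork instead of the whole host project, which is the tighter-scoped option when the network team only shares one subnet.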