Topic 1 Question 233
You are building a custom image classification model and plan to use Vertex AI Pipelines to implement the end-to-end training. Your dataset consists of images that need to be preprocessed before they can be used to train the model. The preprocessing steps include resizing the images, converting them to grayscale, and extracting features. You have already implemented some Python functions for the preprocessing tasks. Which components should you use in your pipeline?
A. DataprocSparkBatchOp and CustomTrainingJobOp
B. DataflowPythonJobOp, WaitGcpResourcesOp, and CustomTrainingJobOp
C. dsl.ParallelFor, dsl.component, and CustomTrainingJobOp
D. ImageDatasetImportDataOp, dsl.component, and AutoMLImageTrainingJobRunOp
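For context, the "Python functions for the preprocessing tasks" mentioned in the question might look roughly like the minimal, library-free sketch below. The function names, the luminance weights, and the toy feature (per-row mean intensity) are all illustrative assumptions, not from the question; in option B, functions like these would be packaged into a Dataflow Python job launched from the pipeline via DataflowPythonJobOp so they run in parallel across the dataset.

```python
# Hypothetical stand-ins for the three preprocessing steps named in the
# question: resizing, grayscale conversion, and feature extraction.
# Images are plain nested lists of (r, g, b) tuples so the sketch needs
# no third-party imaging library.

def to_grayscale(image):
    """Convert each (r, g, b) pixel to a single luminance value
    (standard Rec. 601 weights, used here for illustration)."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in image]

def resize(image, factor):
    """Crude downsample: keep every `factor`-th row and column."""
    return [row[::factor] for row in image[::factor]]

def extract_features(image):
    """A trivial feature vector: mean intensity per row."""
    return [sum(row) / len(row) for row in image]

def preprocess(image):
    # The end-to-end per-image transform that each Dataflow worker
    # would apply in the option-B pipeline.
    return extract_features(resize(to_grayscale(image), 2))

if __name__ == "__main__":
    # A 4x4 dummy RGB image with intensity increasing along the diagonal.
    img = [[(10 * (x + y), 0, 0) for x in range(4)] for y in range(4)]
    print(preprocess(img))
```

Running per-image functions like these inside Dataflow is what makes option B attractive: the same code applies to one image or to millions, with Dataflow handling the parallelism.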
User votes
Comments (5)
- Selected Answer: B
My Answer: B
Looking at the options, DataflowPythonJobOp can be used to parallelize the preprocessing tasks, which suits image resizing, grayscale conversion, and feature extraction. dsl.ParallelFor could also parallelize tasks, but it is not the most straightforward option for image preprocessing.
Generally, DataflowPythonJobOp is followed by WaitGcpResourcesOp.
👍 4

guilhermebutzke, 2024/02/15 - Selected Answer: B
B is definitely right, no doubt
👍 2

Dirtie_Sinkie, 2024/09/17 - Selected Answer: B
A. DataprocSparkBatchOp: while capable of data processing, it is less well suited to image-specific tasks like resizing and grayscale conversion than DataflowPythonJobOp.
C. dsl.ParallelFor, dsl.component: these offer flexibility but require more manual orchestration and are potentially less efficient for image preprocessing than DataflowPythonJobOp.
D. ImageDatasetImportDataOp, AutoMLImageTrainingJobRunOp: these components are designed for AutoML image training and are not directly compatible with custom preprocessing and custom training tasks.
👍 1

pikachu007, 2024/01/12