Topic 1 Question 263
You are developing a custom TensorFlow classification model based on tabular data. Your raw data is stored in BigQuery; it contains hundreds of millions of rows and includes both categorical and numerical features. You need to use a MaxMin scaler on some numerical features and apply one-hot encoding to some categorical features, such as SKU names. Your model will be trained over multiple epochs. You want to minimize the effort and cost of your solution. What should you do?
- A. Write a SQL query to create a separate lookup table to scale the numerical features. Deploy a TensorFlow-based model from Hugging Face to BigQuery to encode the text features. Feed the resulting BigQuery view into Vertex AI Training.
- B. Use BigQuery to scale the numerical features. Feed the features into Vertex AI Training. Allow TensorFlow to perform the one-hot text encoding.
- C. Use TFX components with Dataflow to encode the text features and scale the numerical features. Export the results to Cloud Storage as TFRecords. Feed the data into Vertex AI Training.
- D. Write a SQL query to create a separate lookup table to scale the numerical features. Perform the one-hot text encoding in BigQuery. Feed the resulting BigQuery view into Vertex AI Training.
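For context, the scaling step that options B and D delegate to BigQuery can be expressed with window functions over the raw table. The sketch below is only an illustration under assumed names (the project, dataset, table, and column names are hypothetical, not part of the question):

```python
# Hypothetical sketch: push min-max scaling into BigQuery (the scaling step in
# options B and D). Project, dataset, table, and column names are illustrative.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

scaling_sql = """
CREATE OR REPLACE VIEW `my_project.my_dataset.training_features_v` AS
SELECT
  sku_name,                                            -- categorical feature, left as-is
  SAFE_DIVIDE(price - MIN(price) OVER (),
              MAX(price) OVER () - MIN(price) OVER ()) AS price_scaled,  -- min-max scaled
  label
FROM `my_project.my_dataset.raw_sales`
"""

client.query(scaling_sql).result()  # wait for the view to be (re)created
```

Because the scaling runs where the data already lives, no copy of the hundreds of millions of rows has to be moved or re-processed outside BigQuery.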
Comments (1)
- Selected answer: B
Option B keeps the inexpensive numerical scaling in BigQuery, where the data already lives, and lets TensorFlow handle the one-hot encoding of categorical features inside the model, which minimizes both effort and cost. Option A involves creating a separate lookup table and deploying a Hugging Face model in BigQuery, increasing complexity and cost. Option C: while TFX offers robust preprocessing capabilities, it adds overhead for this use case and requires knowledge of Dataflow. Option D: performing one-hot encoding in BigQuery can be less efficient than TensorFlow's optimized implementation.
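A minimal sketch of the TensorFlow side of option B, assuming the scaled features are fed to the model as named inputs (the SKU vocabulary, column names, and layer sizes below are hypothetical): a Keras `StringLookup` layer with `output_mode="one_hot"` performs the SKU encoding inside the model, so no separate encoding pass over the raw data is needed.

```python
# Minimal sketch of option B's TensorFlow side: one-hot encode SKU names inside
# the model with a Keras preprocessing layer. All names and sizes are illustrative.
import tensorflow as tf

# Vocabulary of SKU names; in practice this could come from a small
# SELECT DISTINCT sku_name query against BigQuery.
sku_vocab = ["SKU-001", "SKU-002", "SKU-003"]  # illustrative values

sku_input = tf.keras.Input(shape=(1,), dtype=tf.string, name="sku_name")
price_input = tf.keras.Input(shape=(1,), dtype=tf.float32, name="price_scaled")

# StringLookup with output_mode="one_hot" maps each SKU string to a one-hot
# vector; one slot is reserved for out-of-vocabulary values.
sku_one_hot = tf.keras.layers.StringLookup(
    vocabulary=sku_vocab, output_mode="one_hot")(sku_input)

features = tf.keras.layers.Concatenate()([sku_one_hot, price_input])
hidden = tf.keras.layers.Dense(32, activation="relu")(features)
output = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)

model = tf.keras.Model(inputs=[sku_input, price_input], outputs=output)
model.compile(optimizer="adam", loss="binary_crossentropy")
```

Because the encoding lives inside the model graph, the same mapping is applied automatically across every training epoch and again at serving time, with no separate lookup table to maintain.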
👍 1 · pikachu007 · 2024/01/13