Topic 1 Question 21
You need to create a weekly aggregated sales report based on a large volume of data. You want to use Python to design an efficient process for generating this report. What should you do?
A. Create a Cloud Run function that uses NumPy. Use Cloud Scheduler to schedule the function to run once a week.
B. Create a Colab Enterprise notebook and use the bigframes.pandas library. Schedule the notebook to execute once a week.
C. Create a Cloud Data Fusion and Wrangler flow. Schedule the flow to run once a week.
D. Create a Dataflow directed acyclic graph (DAG) coded in Python. Use Cloud Scheduler to schedule the code to run once a week.
User votes
Comments (1)
- Selected answer: B
Option B (Colab Enterprise + bigframes.pandas) and Option D (Dataflow DAG in Python) are the most efficient options for handling large data volumes and using Python.
Option B is likely more straightforward and faster to implement for generating a report, especially if the analyst is familiar with Pandas and notebooks. bigframes.pandas simplifies interaction with BigQuery data within a Python environment for reporting purposes.
Option D is more robust and scalable for general data pipelines and potentially more complex transformations, but might be overkill for a weekly reporting task compared to the ease of use offered by bigframes.pandas in Colab Enterprise.
Given the requirement for an "efficient process for generating this report" in Python, Option B is the simpler, more efficient path: a scheduled Colab Enterprise notebook can produce the weekly aggregated sales report directly from BigQuery using the familiar Pandas-like syntax of bigframes.pandas.
Final Answer: B
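To illustrate why Option B is straightforward, here is a minimal sketch of the weekly aggregation logic. It uses plain pandas on in-memory sample data so it runs anywhere; bigframes.pandas exposes the same pandas-style API, but in Colab Enterprise you would load the table with `bigframes.pandas.read_gbq(...)` so the computation is pushed down to BigQuery. The table and column names (`order_date`, `amount`) are hypothetical.

```python
import pandas as pd

# Sample sales data. In Colab Enterprise you would instead do:
#   import bigframes.pandas as bpd
#   sales = bpd.read_gbq("project.dataset.sales")   # executes in BigQuery
# and the aggregation below would work the same way.
sales = pd.DataFrame({
    "order_date": pd.to_datetime(
        ["2025-02-03", "2025-02-05", "2025-02-12", "2025-02-13"]),
    "amount": [100.0, 50.0, 200.0, 25.0],
})

# Aggregate to weekly totals: bucket each order into the Monday that starts
# its week (weeks anchored to end on Sunday), then sum amounts per bucket.
weekly = (
    sales.assign(week=sales["order_date"].dt.to_period("W-SUN").dt.start_time)
         .groupby("week", as_index=False)["amount"]
         .sum()
)
print(weekly)
```

Scheduling the notebook to run once a week (a built-in Colab Enterprise feature) then regenerates the report without any extra orchestration code.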
👍 1 — n2183712847, 2025/02/27