Topic 1 Question 77
A company needs to configure a real-time data ingestion architecture for its application. The company needs an API, a process that transforms data as the data is streamed, and a storage solution for the data. Which solution will meet these requirements with the LEAST operational overhead?
A. Deploy an Amazon EC2 instance to host an API that sends data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
B. Deploy an Amazon EC2 instance to host an API that sends data to AWS Glue. Stop source/destination checking on the EC2 instance. Use AWS Glue to transform the data and to send the data to Amazon S3.
C. Configure an Amazon API Gateway API to send data to an Amazon Kinesis data stream. Create an Amazon Kinesis Data Firehose delivery stream that uses the Kinesis data stream as a data source. Use AWS Lambda functions to transform the data. Use the Kinesis Data Firehose delivery stream to send the data to Amazon S3.
D. Configure an Amazon API Gateway API to send data to AWS Glue. Use AWS Lambda functions to transform the data. Use AWS Glue to send the data to Amazon S3.
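The transform step in option C relies on Kinesis Data Firehose invoking a Lambda function on each batch of records it pulls from the Kinesis data stream. Below is a minimal sketch of such a transform handler; the field names and the transformation itself are illustrative assumptions, but the records/recordId/result/data response contract is what Firehose expects.

```python
import base64
import json

def lambda_handler(event, context):
    """Transform records as Kinesis Data Firehose streams them toward S3.

    Firehose batches records from the Kinesis data stream, invokes this
    function, and expects each record back with its recordId, a result of
    'Ok', 'Dropped', or 'ProcessingFailed', and base64-encoded data.
    """
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        # Illustrative transformation: keep a subset of fields and add a flag.
        transformed = {
            "id": payload.get("id"),
            "value": payload.get("value"),
            "processed": True,
        }

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                (json.dumps(transformed) + "\n").encode("utf-8")
            ).decode("utf-8"),
        })

    return {"records": output}
```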
Comments (9)
Selected Answer: C
(A) You don't need to deploy an EC2 instance to host an API; that adds operational overhead. (B) Same as A. (C) Is the answer. (D) AWS Glue gets data from S3, not from API Gateway, and AWS Glue can do ETL by itself, so Lambda would not be needed. Nonsense. https://aws.amazon.com/glue/
👍 30 · 123jhl0 · 2022/10/17

Selected Answer: C
C is the correct answer.
👍 2 · Cristian93 · 2022/10/25

Gotta love all those ChatGPT answers y'all are throwing at us.
Kinesis Data Firehose is NEAR real-time, not real-time like your bots tell you.
👍 2 · UnluckyDucky · 2023/03/17
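For reference, the pieces of option C can be wired together with the AWS SDK as sketched below. All ARNs, names, and the IAM role are placeholder assumptions, and the API Gateway front end (a service integration calling kinesis:PutRecord) is not shown; this only creates the Firehose delivery stream that reads from the Kinesis data stream, transforms records with the Lambda function above, and delivers to S3.

```python
import boto3

firehose = boto3.client("firehose")

# Placeholder ARNs; the role must allow Firehose to read the Kinesis stream,
# invoke the transform Lambda function, and write objects to the bucket.
STREAM_ARN = "arn:aws:kinesis:us-east-1:123456789012:stream/ingest-stream"
LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:transform-records"
BUCKET_ARN = "arn:aws:s3:::ingest-bucket"
ROLE_ARN = "arn:aws:iam::123456789012:role/firehose-delivery-role"

firehose.create_delivery_stream(
    DeliveryStreamName="ingest-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    # Source: the existing Kinesis data stream fed by API Gateway.
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": STREAM_ARN,
        "RoleARN": ROLE_ARN,
    },
    # Destination: S3, with the Lambda transform applied in flight.
    ExtendedS3DestinationConfiguration={
        "BucketARN": BUCKET_ARN,
        "RoleARN": ROLE_ARN,
        "ProcessingConfiguration": {
            "Enabled": True,
            "Processors": [
                {
                    "Type": "Lambda",
                    "Parameters": [
                        {"ParameterName": "LambdaArn", "ParameterValue": LAMBDA_ARN}
                    ],
                }
            ],
        },
    },
)
```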