Topic 1 Question 23
A company wants to build a generative AI application by using Amazon Bedrock and needs to choose a foundation model (FM). The company wants to know how much information can fit into one prompt. Which consideration will inform the company's decision?
A. Temperature
B. Context window
C. Batch size
D. Model size
Comments (6)
Selected Answer: B
A company needs to know the maximum input size for a single prompt when choosing a Foundation Model (FM) in Amazon Bedrock.
A. Temperature: Controls the randomness of the output, not the length of the input prompt; it affects creativity, not input size.
B. Context window: Defines the maximum length of input the model can process, so it directly limits how much information can be included in one prompt.
C. Batch size: The number of prompts processed at once; it affects throughput, not the length of an individual prompt.
D. Model size: Relates to the model's overall capacity and complexity, not to the input prompt length; size impacts performance, not input limits.
Therefore, B. Context window is the correct answer.
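To make the distinction concrete, here is a minimal sketch (my own, not part of the question) of calling a Bedrock text model with boto3: temperature is passed as an inference parameter that only shapes the output, while the prompt plus the requested output budget must fit inside the chosen model's context window. The region, model ID, and token values are illustrative assumptions.

```python
# Minimal sketch: invoking a Bedrock text model with boto3.
# Region, model ID, and token values are illustrative assumptions.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

prompt = "Summarize the attached meeting notes in three bullet points."

# The whole prompt (plus the output budget) must fit inside the model's
# context window; temperature only controls randomness of the generation.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,       # output budget, also counted against the context window
    "temperature": 0.2,      # randomness of the output, not an input-size limit
    "messages": [{"role": "user", "content": prompt}],
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed example model ID
    body=body,
)
print(json.loads(response["body"].read())["content"][0]["text"])
```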
👍 5 · Moon · 2024/12/31
Selected Answer: B
The context window refers to the maximum number of tokens (words or pieces of words) that a foundation model can process in a single input prompt.
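As a rough illustration (my own sketch, not from the thread), a prompt's fit can be sanity-checked before calling the model. The 4-characters-per-token ratio and the 8,000-token window below are assumptions; the real limit and tokenizer depend on the chosen FM.

```python
# Rough sketch: check whether a prompt is likely to fit an assumed context window.
# The chars-per-token ratio and window size are illustrative assumptions only.
def estimate_tokens(text: str) -> int:
    """Very rough heuristic: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_context_window(prompt: str, context_window: int = 8_000,
                        reserved_for_output: int = 512) -> bool:
    """Input tokens plus the planned output budget must stay within the window."""
    return estimate_tokens(prompt) + reserved_for_output <= context_window

long_prompt = "..." * 10_000  # stand-in for a large document pasted into the prompt
print(fits_context_window(long_prompt))  # False: this prompt would overflow the window
```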
👍 2 · jove · 2024/11/05
Selected Answer: B
B. Context window
The context window of a foundation model determines the maximum amount of text that can be processed in a single prompt. A larger context window allows for more complex and informative prompts, while a smaller context window limits the amount of information that can be provided.
The other options are not directly related to the maximum prompt length:
Temperature: Controls the randomness of the model's output.
Batch size: The number of samples processed in a single batch during training or inference.
Model size: The number of parameters in the model, which affects its complexity and performance.
Therefore, when choosing a foundation model for a generative AI application, the company should carefully consider the context window to ensure that it can accommodate the desired input length.
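As a small illustrative sketch (assuming the boto3 "bedrock" control-plane client, us-east-1, and a text-output filter), the available FMs can be listed for comparison; to my knowledge the context window is not part of these summaries, so it still has to be checked in each model's documentation or model card.

```python
# Illustrative sketch: list text-output foundation models available in Bedrock
# so their documented context windows can be compared. Region and the output
# modality filter are assumptions for the example.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.list_foundation_models(byOutputModality="TEXT")
for summary in response["modelSummaries"]:
    # modelId is what invoke_model expects; the context window for each model
    # is looked up in the provider's documentation / model card.
    print(f'{summary["providerName"]}: {summary["modelId"]}')
```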
👍 2 · eesa · 2024/12/09