Topic 1 Question 83
A company wants to assess the costs that are associated with using a large language model (LLM) to generate inferences. The company wants to use Amazon Bedrock to build generative AI applications. Which factor will drive the inference costs?
A. Number of tokens consumed
B. Temperature value
C. Amount of data used to train the LLM
D. Total training time
Comments (4)
- Selected Answer: A
A is correct. The token is the basic unit that a generative AI model operates on.
👍 3 · PHD_CHENG · 2024/11/19 - Selected Answer: A
A. Number of tokens consumed. More tokens used = higher cost.
All the other options affect training costs, not inference costs. The correct answer is A.
👍 2 · OnePG · 2025/02/04 - Selected Answer: A
Number of tokens consumed while processing. Tokens are the basic units of input and output that a generative AI model operates on, representing words, subwords, or other linguistic units.
👍 1 · 85b5b55 · 2025/02/01
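The comments above can be turned into a quick cost estimate. Amazon Bedrock's on-demand pricing bills per token, with separate rates for input and output tokens. The sketch below uses hypothetical per-1,000-token prices (the function name and the rates are illustrative assumptions, not actual Bedrock prices; check the current pricing page for your model):

```python
# Hypothetical per-1,000-token rates (illustrative only; actual Amazon
# Bedrock rates vary by model and region).
PRICE_PER_1K_INPUT_TOKENS = 0.003   # USD per 1,000 input tokens
PRICE_PER_1K_OUTPUT_TOKENS = 0.015  # USD per 1,000 output tokens

def estimate_inference_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate on-demand inference cost from token counts alone.

    Cost scales linearly with tokens consumed; training-related
    factors (dataset size, training time) do not appear here.
    """
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT_TOKENS \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT_TOKENS

# A prompt of 2,000 tokens producing a 500-token response:
print(f"${estimate_inference_cost(2000, 500):.4f}")
```

Note that doubling the tokens doubles the cost, while changing the temperature leaves the cost unchanged, which is exactly why option A is the answer.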