Topic 1 Question 73
A company has documents that are missing some words because of a database error. The company wants to build an ML model that can suggest potential words to fill in the missing text. Which type of model meets this requirement?
A. Topic modeling
B. Clustering models
C. Prescriptive ML models
D. BERT-based models
Comments (3)
BERT (Bidirectional Encoder Representations from Transformers) is a language model designed to understand context in text by considering the words on both the left and right of a given position. Because BERT is pre-trained with a masked language modeling objective, predicting deliberately hidden words from their surrounding context, BERT-based models are well suited to suggesting words that fill gaps in text.
👍 7 · dehkon · 2024/11/07 · Selected Answer: D
Answer: D. BERT-based models
BERT (Bidirectional Encoder Representations from Transformers) uses a masked language modeling approach. It learns how to predict missing or “masked” words in a sentence based on the surrounding context. This makes a BERT-based model ideal for suggesting potential words to fill in missing text.
👍 1 · may2021_r · 2024/12/29 · Selected Answer: D
D. BERT-based models: BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained language model that can be fine-tuned for various natural language processing tasks, including text completion. BERT-based models are particularly effective at predicting missing words or filling in gaps in text because they understand context in both directions (to the left and right of the missing word). This makes them ideal for suggesting potential words to fill in missing text.
👍 1 · Jessiii · 2025/02/11