Topic 1 Question 124
A data scientist has developed a machine learning translation model for English to Japanese by using Amazon SageMaker's built-in seq2seq algorithm with 500,000 aligned sentence pairs. While testing with sample sentences, the data scientist finds that the translation quality is reasonable for sentences as short as five words. However, the quality becomes unacceptable when a sentence is 100 words long. Which action will resolve the problem?
A. Change preprocessing to use n-grams.
B. Add more nodes to the recurrent neural network (RNN) than the largest sentence's word count.
C. Adjust hyperparameters related to the attention mechanism.
D. Choose a different weight initialization type.
User votes
Comments (7)
I agree with answer C, the attention mechanism. The disadvantage of an encoder-decoder framework is that model performance decreases as the length of the source sequence increases, because of the limit on how much information the fixed-length encoded feature vector can contain. To tackle this problem, in 2015, Bahdanau et al. proposed the attention mechanism. With an attention mechanism, the decoder tries to find the locations in the encoder sequence where the most important information is located, and uses that information together with previously decoded words to predict the next token in the sequence.
👍 24 · cnethers · 2021/09/28 · Selected answer: C
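The Bahdanau-style attention described in the comment above can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not SageMaker's actual implementation; the weight matrices `Wd`, `We` and vector `v` are hypothetical learned parameters, initialized randomly here:

```python
import numpy as np

def bahdanau_attention(decoder_state, encoder_states, Wd, We, v):
    """Additive (Bahdanau) attention over one decoder step.

    Scores each encoder state h_i with e_i = v^T tanh(Wd s + We h_i),
    softmaxes the scores into weights, and returns the weighted sum
    of encoder states (the context vector) plus the weights.
    """
    scores = np.tanh(encoder_states @ We.T + decoder_state @ Wd.T) @ v
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    context = weights @ encoder_states        # weighted sum over source positions
    return context, weights

# Toy example: 6 source positions, hidden size 4, attention size 5.
rng = np.random.default_rng(0)
T, h, a = 6, 4, 5
enc = rng.normal(size=(T, h))                 # encoder hidden states h_1..h_T
dec = rng.normal(size=(h,))                   # current decoder state s
Wd, We, v = rng.normal(size=(a, h)), rng.normal(size=(a, h)), rng.normal(size=(a,))
ctx, w = bahdanau_attention(dec, enc, Wd, We, v)
```

Because the context vector is recomputed at every decoder step from all encoder states, the model is not limited by a single fixed-length encoding, which is why attention helps on long sentences.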
I go with C.
👍 4 · peterfish · 2022/07/18 · Selected answer: C
C. Adjust hyperparameters related to the attention mechanism.
The seq2seq algorithm uses an attention mechanism to dynamically focus on relevant parts of the input sequence for each output sequence element. Increasing the attention mechanism's ability to learn dependencies between long input and output sequences might help improve the translation quality for long sentences.
The data scientist could try adjusting relevant hyperparameters such as attention depth or attention scale, or try a different attention mechanism such as scaled dot-product attention, to see if that improves the translation quality for long sentences.
👍 4 · AjoseO · 2023/02/13
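As a concrete illustration of the tuning the comment above suggests, here is a sketch of an attention-related hyperparameter set one might pass to SageMaker's built-in seq2seq estimator. The names follow the seq2seq hyperparameter documentation, but treat them as assumptions to verify against the current docs; the values are illustrative guesses, not tuned settings:

```python
# Illustrative (untuned) hyperparameters for SageMaker's built-in seq2seq.
# Parameter names are taken from the seq2seq docs; verify before use.
hyperparameters = {
    "rnn_attention_type": "mlp",              # attention score function (e.g. mlp, dot, bilinear)
    "rnn_attention_num_hidden": "512",        # size of the attention layer
    "rnn_attention_in_upper_layers": "true",  # apply attention in upper RNN layers too
    "max_seq_len_source": "120",              # raise caps so 100-word sentences
    "max_seq_len_target": "120",              # are not truncated during training
}
```

These would typically be supplied via `Estimator.set_hyperparameters(**hyperparameters)` in the SageMaker Python SDK before calling `fit`; raising the maximum sequence lengths matters here because sentences longer than the cap are truncated, which by itself degrades long-sentence quality.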