Questions
- NLP transduction can encode and decode text representations. (True/False)
- Natural Language Understanding (NLU) is a subset of Natural Language Processing (NLP). (True/False)
- Language modeling algorithms generate probable sequences of words based on input sequences. (True/False)
- A transformer is a customized LSTM with a CNN layer. (True/False)
- A transformer does not contain LSTM or CNN layers. (True/False)
- Attention examines all the tokens in a sequence, not just the last one. (True/False)
- A transformer uses a positional vector, not positional encoding (see the positional-encoding sketch after this list). (True/False)
- A transformer contains a feedforward network. (True/False)
- The masked multi-headed attention component of a transformer's decoder prevents the model, while it processes a given position, from seeing the positions that come after it in the sequence (see the masking sketch after this list). (True/False)
- Transformers can analyze long-distance dependencies better than LSTMs. (True/False)
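The positional-encoding question above can be checked against a small worked example. The following is a minimal sketch of the sinusoidal positional encoding from the original transformer architecture, in which sine and cosine values at different frequencies are added to the token embeddings; the function name and the toy dimensions are illustrative assumptions, not code from this chapter.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # pos: column of positions 0..seq_len-1; i: row of embedding dims.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    # Each pair of dimensions (2k, 2k+1) shares the frequency
    # 1 / 10000**(2k / d_model).
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])   # even dimensions: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])   # odd dimensions: cosine
    return pe

# Toy example: 4 positions, model dimension 8.
print(positional_encoding(4, 8).round(2))
```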
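Likewise, the attention and masking questions can be made concrete with a sketch. The code below implements single-head scaled dot-product self-attention with a causal (look-ahead) mask, assuming NumPy and toy dimensions: every position attends to all earlier positions and itself, while the mask zeroes out attention to later positions. All names and sizes are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def masked_self_attention(x, w_q, w_k, w_v):
    """Single-head attention: each position attends to every earlier
    position (and itself), but never to later ones."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)          # (seq_len, seq_len)
    # Causal mask: -inf above the diagonal so future positions
    # receive zero weight after the softmax.
    seq_len = scores.shape[0]
    mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ v, weights

# Toy example: 4 tokens, model dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w = [rng.normal(size=(8, 8)) for _ in range(3)]
out, weights = masked_self_attention(x, *w)
print(np.round(weights, 2))  # upper triangle is all zeros
```

Note that each row of `weights` spans every unmasked token in the sequence, not just the last one, which is the contrast with a recurrent model that the attention question is probing.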