Questions
- Machine translation has now exceeded human baselines. (True/False)
- Machine translation requires large datasets. (True/False)
- There is no need to compare transformer models using the same datasets. (True/False)
- BLEU is the French word for blue and is the acronym of an NLP metric. (True/False)
- Smoothing techniques enhance BERT. (True/False)
- German-English is the same as English-German for machine translation. (True/False)
- The original Transformer multi-head attention sub-layer has 2 heads. (True/False)
- The original Transformer encoder has 6 layers. (True/False)
- The original Transformer encoder has 6 layers but only 2 decoder layers. (True/False)
- You can train transformers without decoders. (True/False)
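For the BLEU and smoothing questions above, a minimal, illustrative sentence-level BLEU sketch may help. This is a simplified teaching version (add-one smoothing on the n-gram precisions, single reference), not the full corpus-level metric of Papineni et al. or any particular library's implementation; the function and variable names are our own.

```python
from collections import Counter
from math import exp, log

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(reference, candidate, max_n=4):
    """Sentence-level BLEU with add-one smoothing on n-gram precisions.

    Note: smoothing adjusts the metric (BLEU), not the model (BERT) --
    without it, one unmatched n-gram order would zero out the score.
    """
    log_precisions = []
    for n in range(1, max_n + 1):
        cand = ngrams(candidate, n)
        ref = ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        # add-one smoothing keeps every precision strictly positive
        log_precisions.append(log((overlap + 1) / (total + 1)))
    # brevity penalty: punish candidates shorter than the reference
    bp = min(1.0, exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * exp(sum(log_precisions) / max_n)

reference = "the cat sat on the mat".split()
candidate = "the cat is on the mat".split()
print(sentence_bleu(reference, candidate))  # a score between 0 and 1
```

An identical candidate and reference score 1.0; each mismatched n-gram lowers the geometric mean of the precisions.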