Questions
- Reformer transformer models don’t contain encoders. (True/False)
- Reformer transformer models don’t contain decoders. (True/False)
- The inputs are stored layer by layer in Reformer models. (True/False)
- DeBERTa transformer models disentangle content and positions. (True/False)
- It is necessary to test hundreds of pretrained transformer models before choosing one for a project. (True/False)
- The latest transformer model is always the best. (True/False)
- It is better to have one transformer model per NLP task than one multi-task transformer model. (True/False)
- A transformer model always needs to be fine-tuned. (True/False)
- OpenAI GPT-3 engines can perform a wide range of NLP tasks without fine-tuning. (True/False)
- It is always better to implement an AI algorithm on a local server. (True/False)