Understanding Detectron2’s solvers
Two obvious fine-tuning techniques are changing the backbone network and increasing the batch size. As indicated in the Detectron2 Model Zoo (introduced in Chapter 3), the backbone we selected in Chapter 5 (ResNet50) is the simplest one, with a relatively low mAP@0.5 on the pre-trained dataset. Because it is lightweight and fast to train and run inference with, we have been using it for our experiments on a free Google Colab plan. If more computation resources are available, selecting a more powerful backbone from the Detectron2 Model Zoo, such as X101FPN, would be beneficial. In this section, however, we will keep this simple backbone and experiment instead with the types of optimizers and their settings. These are standard hyperparameters that apply to deep learning in general, not just to Detectron2.
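Before diving into Detectron2's configuration parameters, it helps to recall the update rule behind its default solver, SGD with momentum. The following is a minimal, pure-Python sketch of that rule on a toy one-dimensional problem; the function name, learning rate, and momentum values here are illustrative choices, not Detectron2 code:

```python
# Sketch of SGD with momentum, the update rule used by Detectron2's
# default solver. Illustrative only; real training updates tensors
# of model weights, not a single scalar.

def sgd_momentum_step(w, grad, velocity, lr=0.05, momentum=0.9):
    """One update: v <- momentum * v - lr * grad; w <- w + v."""
    new_velocity = momentum * velocity - lr * grad
    return w + new_velocity, new_velocity

# Toy example: minimize f(w) = w**2, whose gradient is 2 * w.
w, v = 5.0, 0.0
for _ in range(200):
    w, v = sgd_momentum_step(w, grad=2.0 * w, velocity=v)

# After 200 steps, w has converged close to the minimum at 0.
```

The two knobs in this sketch, the learning rate and the momentum coefficient, correspond directly to solver hyperparameters that Detectron2 exposes through its configuration system.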
Understanding the available optimizers and their hyperparameters is essential for making sense of the configuration parameters Detectron2 offers for fine-tuning models. This section...