We have already learned about fully connected layers in Chapter 2, Dive Deep into Deep Neural Networks. A fully connected layer simply means that every node in one layer is connected to every node in the adjacent layer. In a classification network, the output of the final fully connected layer is a vector of class probabilities, where each class is assigned a probability and all probabilities sum to 1. The activation function applied at the output of this layer is called the softmax function.
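The softmax computation described above can be sketched as follows. This is a minimal illustration, assuming NumPy is available; the function name and example scores are ours, not from the chapter:

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability;
    # softmax is invariant to shifting all inputs by the same constant.
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    # Each output is in (0, 1), and the outputs sum to 1.
    return exps / np.sum(exps)

# Hypothetical raw scores (logits) from a fully connected layer
scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)        # one probability per class
print(probs.sum())  # the probabilities sum to 1
```

Note that the class with the largest raw score also receives the largest probability, so softmax preserves the ranking of the logits while converting them into a valid probability distribution.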