As we've seen, a single outlier is enough to break least-squares regression. Such instability is a manifestation of overfitting. Methods that help prevent models from overfitting are generally referred to as regularization techniques. Usually, regularization is achieved by imposing additional constraints on the model: an extra term in the loss function, noise injection, or something else. We've already implemented one such technique, in Chapter 3, K-Nearest Neighbors Classifier: the locality constraint w in the DTW algorithm is essentially a way to regularize the result. In the case of linear regression, regularization imposes constraints on the values of the weights vector.
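As a minimal sketch of what such a constraint looks like in practice (this is an illustrative Python example with NumPy, not the book's own code), consider ridge regression: adding an L2 penalty on the weights to the least-squares loss shrinks the solution and makes it far less sensitive to a single outlier. The `alpha` parameter and the toy data below are assumptions chosen for illustration:

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: minimize ||Xw - y||^2 + alpha * ||w||^2.

    The L2 penalty term alpha * ||w||^2 constrains the weights vector,
    which keeps one outlier from blowing up the solution the way
    plain least squares allows.
    """
    n_features = X.shape[1]
    # Normal equations with the penalty: (X^T X + alpha * I) w = X^T y
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Toy one-feature dataset where the last point is a gross outlier
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([1.0, 2.0, 3.0, 40.0])

w_ols = ridge_fit(X, y, alpha=0.0)     # alpha = 0 reduces to ordinary least squares
w_ridge = ridge_fit(X, y, alpha=10.0)  # the penalty shrinks the slope
```

Setting `alpha=0.0` recovers the unregularized least-squares fit, so the comparison shows directly how the penalty pulls the weight toward zero when an outlier is present.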