A novel hybrid PSO- and GS-based hyperparameter optimization algorithm for support vector regression
Date
2023
Authors
Journal Title
Journal ISSN
Volume Title
Publisher
Springer London Ltd
Access Rights
info:eu-repo/semantics/closedAccess
Abstract
Hyperparameter optimization is vital in improving the prediction accuracy of support vector regression (SVR), as in all machine learning algorithms. This study introduces a new hybrid optimization algorithm, PSOGS, which consolidates two strong and widely used algorithms: particle swarm optimization (PSO) and grid search (GS). The hybrid algorithm was evaluated on five benchmark datasets. The speed and prediction accuracy of PSOGS-optimized SVR models (PSOGS-SVR) were compared to those of its constituent algorithms (PSO and GS) and to another hybrid optimization algorithm (PSOGSA), which combines PSO with the gravitational search algorithm (GSA). Prediction accuracies were evaluated and compared in terms of root mean square error and mean absolute percentage error. For reliability, the experimental results were obtained by performing 10-fold cross-validation over 30 runs. The results showed that PSOGS-SVR yields prediction accuracy comparable to GS-SVR while running much faster, and provides better results with less execution time than PSO-SVR. Moreover, PSOGS-SVR outperforms PSOGSA-SVR in both prediction accuracy and execution time. Overall, this study demonstrated that PSOGS is a fast, stable, efficient, and reliable algorithm for optimizing the hyperparameters of SVR.
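The abstract describes combining PSO's global exploration with GS's local refinement for SVR hyperparameters. The sketch below illustrates that general idea, not the paper's exact PSOGS update rules (which the abstract does not specify): a basic PSO searches log-scaled `C` and `gamma` for scikit-learn's `SVR` using cross-validated RMSE as fitness, then a small local grid around the PSO optimum refines the result. All parameter ranges, swarm settings, and the synthetic dataset are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

# Illustrative sketch only: the exact PSOGS algorithm from the paper is
# not reproduced; this shows a generic PSO-then-local-grid-search hybrid.
rng = np.random.default_rng(0)
X, y = make_regression(n_samples=120, n_features=5, noise=10.0, random_state=0)

def fitness(params):
    """Cross-validated RMSE (lower is better) for SVR(C, gamma)."""
    C, gamma = params
    score = cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3,
                            scoring="neg_root_mean_squared_error").mean()
    return -score  # negate so we minimize RMSE

# Search in log10 space: log10(C) in [-2, 3], log10(gamma) in [-4, 1]
low, high = np.array([-2.0, -4.0]), np.array([3.0, 1.0])
n_particles, n_iters = 10, 15          # assumed swarm settings
pos = rng.uniform(low, high, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(10.0 ** p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration weights
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, low, high)
    vals = np.array([fitness(10.0 ** p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

# Grid-search refinement: a small local grid around the PSO optimum
grid = [gbest + np.array([dc, dg])
        for dc in (-0.25, 0.0, 0.25) for dg in (-0.25, 0.0, 0.25)]
best = min(grid, key=lambda p: fitness(10.0 ** np.clip(p, low, high)))
C_best, gamma_best = 10.0 ** np.clip(best, low, high)
print(f"best C={C_best:.4g}, gamma={gamma_best:.4g}")
```

The log-scale search space reflects common practice for SVR, where useful values of `C` and `gamma` span several orders of magnitude; the local grid step of 0.25 in log space is likewise an arbitrary illustrative choice.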
Description
Keywords
Support vector regression, Hyperparameter optimization, Grid search, Particle swarm optimization
Source
Neural Computing & Applications
WoS Quartile
Q2
Scopus Quartile
Q1
Volume
35
Issue
27