Linear Programming-Based Sparse Kernel Regression with L1-Norm Minimization for Nonlinear System Modeling
2024
Xiaoyong Liu | Genglong Yan | Fabin Zhang | Chengbin Zeng | Peng Tian
This paper integrates L1-norm structural risk minimization with L1-norm approximation error to develop a new optimization framework for solving the parameters of sparse kernel regression models, addressing the challenges posed by complex model structures, over-fitting, and limited modeling accuracy in traditional nonlinear system modeling. The first L1-norm regulates the complexity of the model structure to keep it sparse, while the second L1-norm ensures modeling accuracy. In the optimization of support vector regression (SVR), the L2-norm structural risk is converted to an L1-norm framework through the condition of non-negative Lagrange multipliers. Furthermore, L1-norm optimization of modeling accuracy is attained by minimizing the maximum approximation error. Combining the L1-norms of structural risk and approximation error yields a new, simplified optimization problem that is solved using linear programming (LP) instead of the more complex quadratic programming (QP). The proposed sparse kernel regression model has three notable features: (1) it is solved through relatively simple LP; (2) it effectively balances the trade-off between model complexity and modeling accuracy; and (3) its solution is globally optimal rather than merely locally optimal. In our three experiments, the sparsity metrics (SVs%) were 2.67%, 1.40%, and 0.8%, with test RMSE values of 0.0667, 0.0701, 0.0614 (sinusoidal signal), and 0.0431 (step signal), respectively. This demonstrates the balance between sparsity and modeling accuracy.
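The formulation described above can be sketched as a standard LP: minimize the L1-norm of the kernel weights plus a penalty C on the maximum approximation error, with the weights and bias split into non-negative parts. This is a minimal illustration under assumed choices (RBF kernel, `gamma`, `C`, and all function/variable names are hypothetical, not taken from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)."""
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq)

def fit_sparse_kernel_lp(X, y, gamma=1.0, C=100.0):
    """LP sketch of the idea in the abstract:
        minimize ||alpha||_1 + C * eps
        s.t.  -eps <= y_j - sum_i alpha_i K(x_j, x_i) - b <= eps
    Splitting alpha = ap - am and b = bp - bm keeps all variables >= 0,
    so the L1 objective becomes linear and the problem is a plain LP."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    ones = np.ones((n, 1))
    # decision vector z = [ap (n), am (n), bp, bm, eps]
    c = np.concatenate([np.ones(2 * n), [0.0, 0.0, C]])
    # two one-sided constraints per sample:
    #  y - K(ap-am) - (bp-bm) <= eps   and   K(ap-am) + (bp-bm) - y <= eps
    A_ub = np.vstack([
        np.hstack([-K, K, -ones, ones, -ones]),
        np.hstack([K, -K, ones, -ones, -ones]),
    ])
    b_ub = np.concatenate([-y, y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * n + 3), method="highs")
    ap, am = res.x[:n], res.x[n:2 * n]
    return ap - am, res.x[2 * n] - res.x[2 * n + 1], res.x[-1]

# usage: fit a noiseless sinusoid and predict on the training inputs
X = np.linspace(0.0, 2 * np.pi, 30).reshape(-1, 1)
y = np.sin(X).ravel()
alpha, b, eps = fit_sparse_kernel_lp(X, y)
pred = rbf_kernel(X, X) @ alpha + b
```

By construction, every training residual is bounded by the optimized `eps`, which is the "minimize the maximum approximation error" part; the L1 term on `alpha` is what drives most coefficients to zero and produces the small SVs% figures reported in the abstract.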
Bibliographic information
This bibliographic record was provided by Multidisciplinary Digital Publishing Institute