WANG Kuaini, CAO Jinde, LIU Qingshan. An Exponential Laplace Loss Function Based Robust ELM for Regression Estimation[J]. Applied Mathematics and Mechanics, 2019, 40(11): 1169-1178. doi: 10.21656/1000-0887.400240

An Exponential Laplace Loss Function Based Robust ELM for Regression Estimation

doi: 10.21656/1000-0887.400240
Funds: The National Natural Science Foundation of China (61833005; 61907033); the China Postdoctoral Science Foundation (2018M642129)
  • Received Date: 2019-08-20
  • Publish Date: 2019-11-01
  • Datasets in many practical applications are often contaminated by various types of noise. When dealing with such datasets, the classical extreme learning machine (ELM) suffers from poor prediction accuracy and large fluctuations in its prediction results. To overcome this drawback, an exponential Laplace loss function was proposed to weaken the influence of noise. The proposed loss function is built on the Gaussian kernel function; it is differentiable, non-convex and bounded, and it approximates the Laplace loss function. This loss function was then introduced into the ELM to build a robust ELM model for regression estimation, and the resulting optimization problem was solved with an iteratively re-weighted algorithm. In each iteration, the training samples contaminated by noise receive smaller weights, which effectively improves the prediction accuracy. Experiments on real-world datasets show that the proposed model achieves better learning performance and robustness.
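The abstract describes training a robust ELM by repeatedly solving a weighted least-squares problem and assigning smaller weights to noisy samples. The following Python sketch illustrates that general scheme under explicit assumptions: the sigmoid hidden layer, the ridge parameter `lam`, and the re-weighting rule w_i = exp(-|r_i|/σ) are illustrative choices only, not the exact exponential Laplace loss or the update rule derived in the paper.

```python
# Minimal, illustrative sketch of a robust ELM trained with an iteratively
# re-weighted least-squares scheme. The weighting rule w_i = exp(-|r_i|/sigma)
# is an assumption chosen to mimic a bounded, Laplace-like loss that
# down-weights noisy samples; the paper's exact loss and update may differ.
import numpy as np


def robust_elm_fit(X, y, n_hidden=100, lam=1e-2, sigma=1.0, n_iter=20,
                   tol=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    # Standard ELM: random input weights and biases stay fixed after sampling.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden-layer outputs

    w = np.ones(len(y))                       # initial sample weights
    beta = np.zeros(n_hidden)
    for _ in range(n_iter):
        # Weighted ridge regression for the output weights beta.
        Hw = H * w[:, None]
        beta_new = np.linalg.solve(Hw.T @ H + lam * np.eye(n_hidden), Hw.T @ y)
        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta = beta_new
        # Re-weight: samples with large residuals (likely noise) receive
        # smaller weights in the next iteration (assumed exponential rule).
        r = y - H @ beta
        w = np.exp(-np.abs(r) / sigma)
    return W, b, beta


def robust_elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

The outer loop mirrors the iteratively re-weighted procedure in the abstract: each pass fits the output weights on re-weighted data, so samples with large residuals contribute less to the next fit.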