Personal Information
Associate Research Fellow
Admissions disciplines:
Power Engineering and Engineering Thermophysics -- [Master's students] -- College of Energy and Power Engineering
Aeronautical and Astronautical Science and Technology -- [Doctoral and master's students] -- College of Energy and Power Engineering
Energy and Power -- [Doctoral and master's students] -- College of Energy and Power Engineering
Education: Nanjing University of Aeronautics and Astronautics
Degree: Doctor of Engineering
Affiliation: College of Energy and Power Engineering
Email:
Gram-Schmidt process based incremental extreme learning machine
Affiliation: College of Energy and Power Engineering
Journal: Neurocomputing
Keywords: Extreme learning machine; Incremental learning; QR decomposition; Gram-Schmidt process
Abstract: To compact the architecture of the extreme learning machine (ELM), two incremental learning algorithms are proposed in this paper. Previous incremental learning algorithms for ELM recruit hidden nodes randomly, which is equivalent to selecting at random from a candidate set of infinite size. Good hidden nodes are therefore unlikely to be recruited, so these algorithms usually require more hidden nodes than traditional neural networks to achieve comparable performance. To improve the quality of the recruited hidden nodes, an incremental learning algorithm for ELM based on the Gram-Schmidt process (GSI-ELM) is presented, which at each learning step recruits the best hidden node from a random subset of fixed size according to a defined evaluation criterion. However, GSI-ELM suffers from the "nesting effect": a hidden node, once recruited, can never be discarded later. To address this problem, an improved GSI-ELM (IGSI-ELM) is developed with an elimination mechanism: at each learning step, IGSI-ELM eliminates the worst hidden node from the already-recruited group if it is not the newly recruited one. Finally, to verify the efficacy and feasibility of the proposed algorithms, GSI-ELM and IGSI-ELM, experiments on regression and classification benchmark data sets are conducted. (C) 2017 Elsevier B.V. All rights reserved.
ISSN: 0925-2312
Translated version: No
Publication date: 2017-06-07
Co-authors: Li Zhiqiang, Xi, Peng-Peng, Liang, Dong, Sun, Liguo, Chen, Ting-Hao
Corresponding author: Zhao Yongping