Chen Songcan (陈松灿)
Professor
Alma Mater:Hangzhou University / Shanghai Jiao Tong University / Nanjing University of Aeronautics and Astronautics
Education Level:Nanjing University of Aeronautics and Astronautics
Degree:Doctoral Degree in Engineering
School/Department:College of Computer Science and Technology
Affiliation of Author(s):College of Computer Science and Technology / College of Artificial Intelligence / College of Software
Journal:Jisuanji Yanjiu yu Fazhan (Journal of Computer Research and Development)
Abstract:Non-stationary multivariate time series (NSMTS) forecasting remains a challenging problem. Existing deep learning models based on recurrent neural networks (RNNs), especially long short-term memory (LSTM) and gated recurrent unit (GRU) networks, have achieved impressive prediction performance. Although the LSTM architecture is relatively complex, it does not always dominate in performance. Recent research shows that the minimal gated unit (MGU), with its simpler gating structure, can not only simplify the network architecture but also improve training efficiency in computer vision and some sequence problems. Most importantly, our experiments show that this kind of unit can be effectively applied to NSMTS prediction and achieves results comparable with LSTM and GRU networks. However, none of the three gated-unit-based networks consistently dominates across all NSMTS. Therefore, in this paper we propose a novel linear MIX gated unit (MIXGU). This gated unit dynamically adjusts the importance weights of GRU and MGU during training, so that each MIXGU in the network achieves a better hybrid structure. The experimental results show that the MIXGU network yields higher prediction performance than other state-of-the-art single-gated-unit network models. © 2019, Science Press. All rights reserved.
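The mixing idea in the abstract can be illustrated with a minimal numpy sketch: a GRU cell, an MGU cell, and a MIXGU-style cell that convexly combines their candidate states with a learned weight. The per-dimension mixing parameter `alpha` and all layer sizes here are illustrative assumptions, not the paper's actual parameterization.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
d_in, d_h = 4, 3  # illustrative input / hidden sizes

def init(shape):
    return rng.standard_normal(shape) * 0.1

class GRUCell:
    """Standard GRU: update gate z, reset gate r, candidate state."""
    def __init__(self):
        self.Wz, self.Uz = init((d_h, d_in)), init((d_h, d_h))
        self.Wr, self.Ur = init((d_h, d_in)), init((d_h, d_h))
        self.Wh, self.Uh = init((d_h, d_in)), init((d_h, d_h))
    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)
        r = sigmoid(self.Wr @ x + self.Ur @ h)
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_cand

class MGUCell:
    """MGU: a single forget gate f plays both gating roles."""
    def __init__(self):
        self.Wf, self.Uf = init((d_h, d_in)), init((d_h, d_h))
        self.Wh, self.Uh = init((d_h, d_in)), init((d_h, d_h))
    def step(self, x, h):
        f = sigmoid(self.Wf @ x + self.Uf @ h)
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (f * h))
        return (1 - f) * h + f * h_cand

class MIXGUCell:
    """MIXGU sketch: mix the two cells' outputs with a learned
    per-dimension weight g = sigmoid(alpha), trained jointly."""
    def __init__(self):
        self.gru, self.mgu = GRUCell(), MGUCell()
        self.alpha = init((d_h,))  # hypothetical mixing parameter
    def step(self, x, h):
        g = sigmoid(self.alpha)
        return g * self.gru.step(x, h) + (1 - g) * self.mgu.step(x, h)

cell = MIXGUCell()
h = np.zeros(d_h)
for t in range(5):  # run a short random input sequence
    h = cell.step(rng.standard_normal(d_in), h)
print(h.shape)  # (3,)
```

Because both sub-cells update the state as a convex combination of the previous state and a tanh-bounded candidate, and MIXGU mixes them convexly, the hidden state stays in (-1, 1); training would adjust `alpha` so each unit settles on its own GRU/MGU balance.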
ISSN No.:1000-1239
Translation or Not:no
Date of Publication:2019-08-01
Co-author:Liu, Jiexi
Corresponding Author:csc