Personal Homepage

Personal Information


Degree: Doctoral Degree in Science
School/Department: College of Science

Wang Liping (王丽平)


Gender: Female

Alma Mater: Academy of Mathematics and Systems Science, Chinese Academy of Sciences

Paper Publications

Hierarchical Gaussian Processes model for multi-task learning
Date of Publication: 2018-02-01

Affiliation of Author(s): College of Computer Science and Technology / College of Artificial Intelligence / College of Software
Journal: PATTERN RECOGNITION
Key Words: GP-LVM; multi-task learning; feature learning; hierarchical model
Abstract: Multi-task learning (MTL) has been shown to improve the performance of individual tasks by learning multiple related tasks together. Recently, nonparametric Bayesian Gaussian process (GP) models have also been adapted to MTL; their nonparametric nature gives them the flexibility to avoid assumptions about the probability distributions of the variables. To date, two approaches have been proposed to implement GP-based MTL: cross-covariance-based methods and joint feature learning methods. Although successfully applied in scenarios such as face verification and collaborative filtering, each has its drawbacks: the cross-covariance-based method scales poorly because of the large covariance matrix involved, while the joint feature learning method incorporates relations between tasks only implicitly, and therefore cannot explicitly exploit prior knowledge such as inter-task correlation, which is crucial for further improving MTL. To address both issues, this paper establishes a two-layer unified framework, the Hierarchical Gaussian Process Multi-task Learning (HGPMT) method, which jointly learns the latent features shared among tasks and a multi-task model. Furthermore, since HGPMT does not involve the cross-covariance, its computational complexity is much lower. Finally, experimental results on both a toy multi-task regression dataset and real datasets demonstrate that it outperforms recently proposed approaches. (C) 2017 Elsevier Ltd. All rights reserved.
ISSN No.: 0031-3203
Translation or Not: no
Co-author: csc
Corresponding Author: WLP
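The two-layer idea in the abstract — a shared latent function learned jointly across tasks, with a per-task model on top — can be illustrated with a toy sketch. The code below is a hypothetical, much-simplified analogue (not the paper's HGPMT model): the "shared layer" is an ordinary GP-regression smoother applied to responses pooled across two correlated tasks, and the "task layer" is a scalar head per task fit by least squares. All function and variable names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_posterior_mean(X, y, Xs, noise=1e-2):
    """Standard GP-regression posterior mean evaluated at test inputs Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    return rbf_kernel(Xs, X) @ np.linalg.solve(K, y)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))
latent = np.sin(X[:, 0])                               # shared latent function
y1 = 1.0 * latent + 0.05 * rng.standard_normal(40)     # task 1
y2 = 0.5 * latent + 0.05 * rng.standard_normal(40)     # task 2 (correlated)

# Layer 1 (shared): GP smoothing of the task-averaged responses gives a
# single latent feature that both tasks contribute to.
y_pooled = 0.5 * (y1 + y2)
h_tr = gp_posterior_mean(X, y_pooled, X)

# Layer 2 (per task): a scalar head rescales the shared feature for each task.
a1 = float(h_tr @ y1 / (h_tr @ h_tr))
a2 = float(h_tr @ y2 / (h_tr @ h_tr))

mse1 = float(np.mean((a1 * h_tr - y1) ** 2))
mse2 = float(np.mean((a2 * h_tr - y2) ** 2))
```

Because the related tasks are regressed through one shared smoothed feature rather than a stacked cross-covariance matrix over all tasks, the per-task heads stay cheap — a rough intuition for why avoiding the cross-covariance lowers the computational cost.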