Professor, Doctoral Supervisor
Admissions disciplines:
Computer Science and Technology -- [Master's students] -- School of Computer Science and Technology
Electronic Information -- [Master's students] -- School of Computer Science and Technology
Applied Statistics -- [Master's students] -- School of Mathematics
Mathematics -- [Doctoral and Master's students] -- School of Mathematics
Gender: Female
Alma mater: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Education: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Degree: Doctor of Science (Ph.D.)
Department: School of Mathematics
Office: Room 372, Science Building
Affiliation: School of Computer Science and Technology / School of Artificial Intelligence / School of Software
Journal: PATTERN RECOGNITION
Keywords: GP-LVM; Multi-task learning; Feature learning; Hierarchical model
Abstract: Multi-task learning (MTL) has been shown to improve the performance of individual tasks by learning multiple related tasks together. Recently, nonparametric Bayesian Gaussian process (GP) models have also been adapted to MTL; owing to their nonparametric nature they are highly flexible and free of assumptions about the probability distributions of the variables. To date, two approaches have been proposed to implement GP-based MTL: cross-covariance-based methods and joint feature learning methods. Although successfully applied in scenarios such as face verification and collaborative filtering, each has its own drawbacks: the cross-covariance-based method scales poorly because of the large covariance matrix involved, while the joint feature learning method incorporates inter-task relations only implicitly, and therefore cannot explicitly exploit prior knowledge such as correlation between tasks, which is crucial for further improving MTL. To address both issues, this paper establishes a two-layer unified framework, the Hierarchical Gaussian Process Multi-task Learning (HGPMT) method, which jointly learns the latent features shared among tasks and a multi-task model. Furthermore, since HGPMT does not involve the cross-covariance, its computational complexity is much lower. Finally, experimental results on both a toy multi-task regression dataset and real datasets demonstrate its superiority over recently proposed multi-task learning approaches. (C) 2017 Elsevier Ltd. All rights reserved.
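The scalability contrast drawn in the abstract can be illustrated with a toy sketch. The snippet below is NOT the paper's HGPMT; it is a minimal assumption-laden stand-in in which all tasks share one latent feature map (here a fixed linear projection, standing in for features that HGPMT would learn), and each task then fits its own standard GP in the shared latent space. Because no single cross-task covariance matrix is formed, the cost stays at roughly O(sum_t n_t^3) rather than O((sum_t n_t)^3). All function names and the synthetic data are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential kernel between the rows of A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Standard single-task GP regression posterior mean."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_test, X_train)
    return Ks @ np.linalg.solve(K, y_train)

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))            # shared latent projection (5-D inputs -> 2-D features)

# Two related tasks: same smooth latent function, different noise realisations.
X = rng.normal(size=(40, 5))
Z = X @ W                              # shared latent features for all tasks
f = np.sin(Z[:, 0]) + 0.5 * Z[:, 1]    # common underlying function
tasks = [f + 0.05 * rng.normal(size=40) for _ in range(2)]

X_test = rng.normal(size=(10, 5))
Z_test = X_test @ W
f_test = np.sin(Z_test[:, 0]) + 0.5 * Z_test[:, 1]

# Each task solves its own 40x40 system in the shared space; no 80x80
# cross-task covariance is ever built.
preds = [gp_predict(Z, y, Z_test) for y in tasks]
```

In the paper's actual method the shared layer is itself a Gaussian process (GP-LVM-style) and is learned jointly with the task models; the fixed projection `W` above only mimics the structural idea that task relatedness enters through shared features rather than through a cross-covariance matrix.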
ISSN: 0031-3203
Translation: No
Publication date: 2018-02-01
Co-author: 陈松灿
Corresponding author: 王丽平