Affiliation of Author(s):College of Computer Science and Technology / College of Artificial Intelligence / College of Software
Journal:Proc. ACM SIGKDD Int. Conf. Knowl. Discov. Data Min.
Abstract:Deep convolutional neural networks have achieved great success in various applications. However, training an effective DNN model for a specific task is rather challenging, because it requires prior knowledge or experience to design the network architecture, a repeated trial-and-error process to tune the parameters, and a large set of labeled data to train the model. In this paper, we propose to overcome these challenges by actively adapting a pre-trained model to a new task with fewer labeled examples. Specifically, the pre-trained model is iteratively fine-tuned based on the most useful examples. The examples are actively selected based on a novel criterion, which jointly estimates the potential contribution of an instance to optimizing the feature representation as well as to improving the classification model for the target task. On one hand, the pre-trained model brings plentiful information from its original task, avoiding redesign of the network architecture or training from scratch; on the other hand, the labeling cost can be significantly reduced by active label querying. Experiments on multiple datasets and different pre-trained models demonstrate that the proposed approach can achieve cost-effective training of DNNs. © 2018 Association for Computing Machinery.
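The adaptation loop described in the abstract can be sketched as follows. This is a hypothetical simplification, not the paper's method: a logistic-regression model stands in for the pre-trained network, and prediction entropy stands in for the paper's joint criterion (which also scores an instance's contribution to feature learning). All function names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(w, X):
    return sigmoid(X @ w)

def fine_tune(w, X, y, lr=0.5, epochs=50):
    # Plain gradient descent on the logistic loss, standing in for
    # fine-tuning the network on the currently labeled set.
    for _ in range(epochs):
        grad = X.T @ (predict_proba(w, X) - y) / len(y)
        w = w - lr * grad
    return w

def select_most_uncertain(w, X_pool, batch_size):
    # Simplified acquisition score: prediction entropy only.
    # The paper's criterion additionally estimates each instance's
    # contribution to optimizing the feature representation.
    p = predict_proba(w, X_pool)
    entropy = -(p * np.log(p + 1e-12) + (1 - p) * np.log(1 - p + 1e-12))
    return np.argsort(-entropy)[:batch_size]

# Synthetic unlabeled pool: two Gaussian blobs with a linear true boundary.
shift = np.where(rng.random(400) < 0.5, -1.5, 1.5)[:, None]
X = rng.normal(size=(400, 2)) + shift
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w = rng.normal(size=2) * 0.01          # "pre-trained" weights (illustrative)
labeled = np.zeros(len(X), dtype=bool)

for _ in range(5):                      # five active-querying rounds
    pool_idx = np.flatnonzero(~labeled)
    picked = pool_idx[select_most_uncertain(w, X[pool_idx], 20)]
    labeled[picked] = True              # query labels for the selected batch
    w = fine_tune(w, X[labeled], y[labeled])

acc = np.mean((predict_proba(w, X) > 0.5) == y)
print(f"accuracy after active adaptation: {acc:.2f}")
```

The loop labels only 100 of the 400 pool instances, illustrating how iterative fine-tuning on actively selected examples can reduce labeling cost.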
Translation or Not:no
Date of Publication:2018-07-19
Co-author:Zhao, Jia-Wei; Liu, Zhao-Yang
Corresponding Author:Sheng-Jun Huang
Sheng-Jun Huang
Gender:Male
Education Level:Nanjing University
Alma Mater:Nanjing University
Paper Publications
Cost-effective training of deep CNNs with active model adaptation
Date of Publication:2018-07-19