
Transfer learning from multiple source domains via consensus regularization

Published: 26 October 2008

Abstract

Recent years have witnessed an increased interest in transfer learning. Despite the vast amount of research in this field, challenges remain in applying knowledge learnt from multiple source domains to a target domain. First, data from multiple source domains can be semantically related yet have different distributions, and it is not clear how to exploit these distribution differences among the source domains to boost learning performance in the target domain. Second, many real-world applications demand that this transfer learning be performed in a distributed manner. To meet these challenges, we propose a consensus regularization framework for transfer learning from multiple source domains to a target domain. In this framework, a local classifier is trained by considering both the local data available in a source domain and the prediction consensus with the classifiers from the other source domains. Moreover, the training algorithm can be implemented in a distributed manner, in which all the source domains are treated as slave nodes and the target domain is used as the master node. To combine the training results from the multiple source domains, only some statistical data, rather than the full contents of the labeled source data, needs to be shared. This modestly relieves privacy concerns and avoids the need to upload all data to a central location. Finally, our experimental results show the effectiveness of the proposed consensus regularization learning.
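To make the framework described above concrete, the following is a minimal Python sketch of consensus-regularized training. It assumes logistic-regression base classifiers and an entropy-style consensus penalty on the averaged target-domain predictions; the function names, penalty form, learning rates, and optimization loop are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_consensus(source_data, X_target, lam=1.0, lr=0.01, iters=500):
    """Illustrative sketch: train one logistic-regression classifier per
    source domain while penalizing disagreement (high entropy) of the
    classifiers' averaged predictions on the unlabeled target data."""
    n_src = len(source_data)
    d = X_target.shape[1]
    weights = [np.zeros(d) for _ in range(n_src)]
    eps = 1e-12
    for _ in range(iters):
        # "Master" step: average the source classifiers' target-domain
        # predictions. Only these per-example statistics (not the labeled
        # source data) would need to be exchanged in a distributed setting.
        p_avg = np.mean([sigmoid(X_target @ w) for w in weights], axis=0)
        for k, (X, y) in enumerate(source_data):
            # "Slave" step: local negative log-likelihood gradient ...
            p = sigmoid(X @ weights[k])
            grad = X.T @ (p - y)
            # ... plus the gradient of the consensus (entropy) penalty,
            # which pushes this classifier's target predictions toward
            # confident agreement with the other classifiers.
            p_t = sigmoid(X_target @ weights[k])
            dH_dpavg = -(np.log(p_avg + eps) - np.log(1.0 - p_avg + eps))
            dpavg_dwk = p_t * (1.0 - p_t) / n_src
            grad += lam * (X_target.T @ (dH_dpavg * dpavg_dwk))
            weights[k] -= lr * grad
    return weights

# Toy usage: two synthetic source domains, one unlabeled target domain.
rng = np.random.default_rng(0)
sources = [(rng.normal(size=(50, 3)), rng.integers(0, 2, size=50)) for _ in range(2)]
X_tgt = rng.normal(size=(30, 3))
ws = train_consensus(sources, X_tgt)
# Target-domain prediction: average of the per-source classifiers.
p_target = np.mean([sigmoid(X_tgt @ w) for w in ws], axis=0)

In this sketch, only the averaged target-domain predictions cross domain boundaries, mirroring the point above that the source domains share summary statistics rather than their labeled data.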





    Published In

    CIKM '08: Proceedings of the 17th ACM conference on Information and knowledge management
    October 2008
    1562 pages
    ISBN:9781595939913
    DOI:10.1145/1458082

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 26 October 2008


    Author Tags

    1. classification
    2. consensus regularization
    3. transfer learning

    Qualifiers

    • Research-article

    Conference

    CIKM '08: Conference on Information and Knowledge Management
    October 26 - 30, 2008
    Napa Valley, California, USA

    Acceptance Rates

    Overall Acceptance Rate 1,861 of 8,427 submissions, 22%




    Article Metrics

    • Downloads (Last 12 months): 33
    • Downloads (Last 6 weeks): 2
    Reflects downloads up to 01 Jan 2025

    Cited By

    • (2024) Simultaneous Selection and Adaptation of Source Data via Four-Level Optimization. Transactions of the Association for Computational Linguistics, 12, 449-466. DOI: 10.1162/tacl_a_00658. Online publication date: 3-May-2024
    • (2024) Fast Implementation of Object Detection Algorithm Based on Homomorphic Model Transformation. Neurocomputing, 127313. DOI: 10.1016/j.neucom.2024.127313. Online publication date: Feb-2024
    • (2024) A comprehensive survey of federated transfer learning: challenges, methods and applications. Frontiers of Computer Science, 18:6. DOI: 10.1007/s11704-024-40065-x. Online publication date: 23-Jul-2024
    • (2023) Adaptive Domain Generalization Via Online Disagreement Minimization. IEEE Transactions on Image Processing, 32, 4247-4258. DOI: 10.1109/TIP.2023.3295739. Online publication date: 2023
    • (2023) Deep Representation Learning: Fundamentals, Technologies, Applications, and Open Challenges. IEEE Access, 11, 137621-137659. DOI: 10.1109/ACCESS.2023.3335196. Online publication date: 2023
    • (2023) Improve the performance of CT-based pneumonia classification via source data reweighting. Scientific Reports, 13:1. DOI: 10.1038/s41598-023-35938-3. Online publication date: 9-Jun-2023
    • (2023) Integrating transformer and autoencoder techniques with spectral graph algorithms for the prediction of scarcely labeled molecular data. Computers in Biology and Medicine, 153, 106479. DOI: 10.1016/j.compbiomed.2022.106479. Online publication date: Feb-2023
    • (2022) PADA: Example-based Prompt Learning for on-the-fly Adaptation to Unseen Domains. Transactions of the Association for Computational Linguistics, 10, 414-433. DOI: 10.1162/tacl_a_00468. Online publication date: 11-Apr-2022
    • (2022) Influence of Transfer Learning on Machine Learning Systems Robustness to Data Quality Degradation. 2022 International Joint Conference on Neural Networks (IJCNN), 1-8. DOI: 10.1109/IJCNN55064.2022.9892247. Online publication date: 18-Jul-2022
    • (2022) Task-specific Inconsistency Alignment for Domain Adaptive Object Detection. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 14197-14206. DOI: 10.1109/CVPR52688.2022.01382. Online publication date: Jun-2022
