
Fuzzy Shared Representation Learning for Multistream Classification

Published: 04 July 2024

Abstract

Multistream classification aims to predict the target stream by transferring knowledge from labeled source streams amid nonstationary processes with concept drifts. While existing methods address label scarcity, covariate shift, and asynchronous concept drift, they focus solely on the original feature space, neglecting the influence of redundant or low-quality features with uncertainties. Therefore, the advancement of this task is still challenged by how to: 1) ensure guaranteed joint representations of different streams, 2) grapple with uncertainty and interpretability during knowledge transfer, and 3) track and adapt the asynchronous drifts in each stream. To address these challenges, we propose an interpretable fuzzy shared representation learning (FSRL) method based on the Takagi–Sugeno–Kang (TSK) fuzzy system. Specifically, FSRL accomplishes the nonlinear transformation of individual streams by learning the fuzzy mapping with the antecedents of the TSK fuzzy system, thereby effectively preserving discriminative information for each original stream in an interpretable way. Then, a multistream joint distribution adaptation algorithm is proposed to optimize the consequent part of the TSK fuzzy system, which learns the final fuzzy shared representations for different streams. Hence, this method concurrently investigates both the commonalities across streams and the distinctive information within each stream. Following that, window-based and GMM-based online adaptation strategies are designed to address the asynchronous drifts over time. The former can directly demonstrate the effectiveness of FSRL in knowledge transfer across multiple streams, while the GMM-based method offers an informed way to overcome the asynchronous drift problem by integrating drift detection and adaptation. Finally, extensive experiments on several synthetic and real-world benchmarks with concept drift demonstrate the proposed method's effectiveness and efficiency.
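The antecedent-based fuzzy mapping described above can be sketched in a few lines. The following is an illustrative, simplified rendering only, not the paper's implementation: it assumes Gaussian membership functions whose centers and widths would come from clustering a stream window, computes normalized rule firing strengths, and expands each input into per-rule weighted copies (the representation on which a consequent part would then be optimized, e.g., by the proposed joint distribution adaptation). All names and the rule count are hypothetical.

```python
import numpy as np

def tsk_fuzzy_mapping(X, centers, sigmas):
    """Sketch of a TSK-style antecedent mapping.

    X:       (n, d) raw inputs from one stream.
    centers: (k, d) rule centers (e.g., from k-means/GMM on a window).
    sigmas:  (k, d) rule widths.
    Returns the fuzzy representation of shape (n, k * (d + 1)).
    """
    n, d = X.shape
    k = centers.shape[0]
    # Gaussian membership per rule: product over dimensions,
    # computed in log-space for numerical stability.
    diff = X[:, None, :] - centers[None, :, :]                   # (n, k, d)
    log_mu = -0.5 * np.sum((diff / sigmas[None]) ** 2, axis=2)   # (n, k)
    mu = np.exp(log_mu - log_mu.max(axis=1, keepdims=True))
    w = mu / mu.sum(axis=1, keepdims=True)   # normalized firing strengths
    # Augment with a bias term, then weight each rule's copy of the input.
    Xa = np.hstack([X, np.ones((n, 1))])                         # (n, d+1)
    Z = (w[:, :, None] * Xa[:, None, :]).reshape(n, k * (d + 1))
    return Z

rng = np.random.RandomState(0)
Z = tsk_fuzzy_mapping(rng.randn(5, 3), rng.randn(4, 3), np.ones((4, 3)))
print(Z.shape)  # (5, 16): 4 rules x (3 features + bias)
```

Because the firing strengths are normalized, the bias positions of each sample's representation sum to one across rules, which is what makes the mapping interpretable as a soft partition of the input space.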



Published In

IEEE Transactions on Fuzzy Systems, Volume 32, Issue 10
Oct. 2024
597 pages

Publisher

IEEE Press


Qualifiers

  • Research-article
