DOI: 10.5555/3001335.3001389
Article

Fuzzy interpretation of induction results

Published: 20 August 1995

Abstract

When applying rules induced from training examples to a test example, there are three possible cases which demand different actions: (1) no match, (2) single match, and (3) multiple match. Existing techniques for dealing with the first and third cases are exclusively based on probability estimation. However, when there are continuous attributes in the example space, and if these attributes have been discretised into intervals before induction, fuzzy interpretation of the discretised intervals at deduction time could be very valuable. This paper introduces the idea of using fuzzy borders for interpretation of discretised intervals at deduction time, and outlines the results we have obtained with the HCV (Version 2.0) software.
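The idea of fuzzy borders can be illustrated with a short sketch. The code below is not the HCV (Version 2.0) implementation: the trapezoidal membership function, the spread parameter, the min-style conjunction over rule conditions, and the attribute names are all assumptions made for illustration. It only shows how a discretised interval, once given soft borders at deduction time, turns a crisp "no match" into a graded degree of match.

# Illustrative sketch only (not the HCV 2.0 method): a discretised
# interval [low, high) is given fuzzy borders so that a test value just
# outside the interval still receives a partial degree of membership.

def fuzzy_membership(value, low, high, spread=0.1):
    """Trapezoidal membership for the interval [low, high).

    `spread` widens each border by a fraction of the interval length;
    it is a hypothetical tuning parameter, not taken from the paper.
    """
    width = high - low
    margin = spread * width
    if low <= value < high:
        return 1.0                                  # inside the crisp interval
    if low - margin <= value < low:
        return (value - (low - margin)) / margin    # rising left border
    if high <= value < high + margin:
        return ((high + margin) - value) / margin   # falling right border
    return 0.0                                      # clearly outside

def rule_match_degree(example, rule):
    """Degree to which `example` (attribute -> value) satisfies `rule`
    (attribute -> (low, high) interval).

    Taking the minimum over conditions is one common fuzzy conjunction;
    the paper may combine memberships differently.
    """
    return min(
        fuzzy_membership(example[attr], low, high)
        for attr, (low, high) in rule.items()
    )

# With crisp interval borders this example would be a "no match":
rule = {"temperature": (20.0, 30.0), "humidity": (40.0, 60.0)}
example = {"temperature": 19.5, "humidity": 45.0}
print(rule_match_degree(example, rule))  # 0.5 here, rather than 0

A graded score of this kind could, for instance, be used to pick the closest rule in the no-match case or to rank competing rules in the multiple-match case; the procedure actually used in HCV (Version 2.0) and the experimental results are given in the full paper.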


Published In

KDD'95: Proceedings of the First International Conference on Knowledge Discovery and Data Mining
August 1995
342 pages

Sponsors

  • AAAI: American Association for Artificial Intelligence

Publisher

AAAI Press
