
Modal Regression via Direct Log-Density Derivative Estimation

  • Conference paper
Neural Information Processing (ICONIP 2016)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9948)


Abstract

Regression aims to estimate the conditional expectation of the output given the input, which is suitable for analyzing a functional relation between input and output. On the other hand, when the conditional density has multiple modes, modal regression comes in handy. Partial mean shift (PMS) is a promising method for modal regression, which updates data points toward conditional modes by gradient ascent. In its implementation, PMS first obtains an estimate of the joint density by kernel density estimation and then computes its derivative for gradient ascent. However, this two-step approach can be unreliable, because a good density estimator does not necessarily yield a good density-derivative estimator. In this paper, we propose a novel method for modal regression based on direct estimation of the log-density derivative without density estimation. Experiments show the superiority of our direct method over PMS.
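To make the baseline concrete, the following is a minimal sketch of the partial mean shift update with a product Gaussian kernel: the input x is held fixed and an output value is repeatedly moved toward a conditional mode of p(y | x). This illustrates the two-step PMS baseline described in the abstract, not the authors' direct log-density derivative estimator; the bandwidths hx and hy, the toy data, and the function name partial_mean_shift are assumptions made purely for illustration.

```python
import numpy as np

def partial_mean_shift(x, X, Y, hx=0.5, hy=0.5, y_init=0.0, n_iter=200, tol=1e-6):
    """Move y_init toward a conditional mode of p(y | x) for a fixed input x.

    With a product Gaussian kernel, the mean-shift fixed-point update applied
    only to the output variable is a weighted average of the training outputs.
    (A sketch of the PMS baseline, not the paper's proposed direct method.)
    """
    y = y_init
    for _ in range(n_iter):
        wx = np.exp(-0.5 * np.sum(((x - X) / hx) ** 2, axis=1))  # kernel weights in x (fixed)
        wy = np.exp(-0.5 * ((y - Y) / hy) ** 2)                   # kernel weights in current y
        w = wx * wy
        y_new = np.sum(w * Y) / np.sum(w)                         # mean-shift step in y only
        if abs(y_new - y) < tol:
            break
        y = y_new
    return y

# Toy bimodal data: for any x, p(y | x) has modes near -1 and +1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
Y = np.where(rng.random(200) < 0.5, 1.0, -1.0) + 0.1 * rng.standard_normal(200)

# Starting from different initial values recovers the two conditional modes at x = 0.
modes = [partial_mean_shift(np.array([0.0]), X, Y, y_init=y0) for y0 in (-1.5, 1.5)]
```

Starting the iteration from each training output (rather than two hand-picked values, as above) is the usual way PMS traces out the full set of conditional modes.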


Notes

  1. The intervals of \(\sigma _{\mathrm {x}}\) and \(h_{\mathrm {x}}\) (or \(\sigma _{\mathrm {y}}\) and \(h_{\mathrm {y}}\)) were further scaled by multiplying them by the median value of \(|x_i^{(j)}-x_k^{(j)}|\) over \(i, j, k\) (or of \(|y_i-y_k|\) over \(i, k\)); a sketch of this median-based rescaling is given after these notes.

  2. The datasets were downloaded from http://www.blackwellpublishing.com/rss.
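Below is a minimal sketch of the median-based rescaling mentioned in note 1: candidate bandwidth values are multiplied by the median absolute componentwise pairwise difference of the inputs (or of the outputs). The base grid of candidate values and the function name median_scaled_candidates are assumptions, since the note does not specify the raw candidate interval.

```python
import numpy as np

def median_scaled_candidates(Z, base_grid=(0.1, 0.3, 1.0, 3.0)):
    """Scale a base grid of bandwidth candidates by the median absolute
    componentwise pairwise difference of Z (inputs or outputs)."""
    Z = np.asarray(Z, dtype=float)
    if Z.ndim == 1:
        Z = Z[:, None]
    diffs = np.abs(Z[:, None, :] - Z[None, :, :])  # |z_i^(j) - z_k^(j)| for all i, k, j
    med = np.median(diffs[diffs > 0])              # ignore the zero i == k entries
    return med * np.asarray(base_grid)

# Hypothetical usage: candidate bandwidths for the inputs X and the outputs Y.
# hx_candidates = median_scaled_candidates(X)
# hy_candidates = median_scaled_candidates(Y)
```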


Acknowledgments

HS was supported by KAKENHI 15H06103 and MS was supported by KAKENHI 25700022.

Author information

Corresponding author

Correspondence to Hiroaki Sasaki.



Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Sasaki, H., Ono, Y., Sugiyama, M. (2016). Modal Regression via Direct Log-Density Derivative Estimation. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol. 9948. Springer, Cham. https://doi.org/10.1007/978-3-319-46672-9_13


  • DOI: https://doi.org/10.1007/978-3-319-46672-9_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46671-2

  • Online ISBN: 978-3-319-46672-9

  • eBook Packages: Computer Science, Computer Science (R0)
