Abstract
Non-negative least squares fitting is a basic building block in many applications. Following the classical active-set method of Lawson and Hanson (Solving Least Squares Problems, Prentice-Hall, 1974), much research has been directed toward improving that algorithm. In this paper we present a new method that produces an initial setting for this classical algorithm. The initialization exploits the relationship between projection-based methods and active-set methods. Two quantitative measures are introduced to evaluate the quality of an initial setting for the active-set method. Experimental results indicate that the proposed initialization provides a good jump-start for the active-set method, reducing the number of iterations it requires.
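The abstract does not spell out the initialization procedure, but the general idea of seeding an active-set solver from a projection-based method can be illustrated with a minimal sketch. The sketch below assumes the warm start comes from a few gradient-projection steps on min ||Ax - b||^2 subject to x >= 0, whose nonzero pattern is taken as the initial passive set; the function name, step-size rule, and iteration count are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def projected_gradient_warm_start(A, b, iters=20, tol=1e-12):
    """Illustrative sketch (not the paper's exact procedure):
    run a few projected-gradient steps on min 0.5*||Ax - b||^2
    subject to x >= 0, then read off a guess of the passive
    (nonzero) set to seed an active-set solver."""
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(iters):
        g = A.T @ (A @ x - b)              # gradient of 0.5*||Ax - b||^2
        d = -g                             # steepest-descent direction
        Ad = A @ d
        denom = Ad @ Ad
        if denom < tol:
            break
        alpha = (g @ g) / denom            # exact step for the quadratic model
        x = np.maximum(x + alpha * d, 0.0) # project back onto x >= 0
    passive = x > 0                        # estimated passive (free) set
    return x, passive

# Hypothetical usage: seed a Lawson-Hanson-style active-set solver.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)
    x0, passive = projected_gradient_warm_start(A, b)
    # An active-set solver started from (x0, passive) would be expected
    # to need fewer iterations than one started from x = 0, passive = {}.
    print("estimated passive set:", np.flatnonzero(passive))
```

The quality of such a warm start can then be judged, for example, by how much of the final passive set it already contains and how many active-set iterations remain after the jump-start, which is the kind of quantitative evaluation the paper proposes.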
References
Bertsekas, D.P.: On the Goldstein-Levitin-Poljak gradient projection method. IEEE Trans. Autom. Control 21, 174–184 (1976)
Bertsekas, D.P.: Projected Newton methods for optimization problems with simple constraints. SIAM J. Control Optim. 20(2), 221–246 (1982)
Bierlaire, M., Toint, P.L., Tuyttens, D.: On iterative algorithms for linear least squares problems with bound constraints. Linear Algebra Appl. 143, 111–143 (1991)
Boutsidis, C., Gallopoulos, E.: SVD based initialization: a head start for nonnegative matrix factorization. Pattern Recognit. 41(4), 1350–1362 (2009)
Bro, R., de Jong, S.: A fast non-negativity-constrained least squares algorithm. J. Chemom. 11(5), 393–401 (1997)
Chen, D., Plemmons, R.J.: Nonnegativity constraints in numerical analysis. In: Bultheel, A., Cools, R. (eds.) Symposium on the Birth of Numerical Analysis, pp. 109–140. Katholieke Universiteit Leuven, Belgium (2007)
Drineas, P., Boutsidis, C.: Random projections for the nonnegative least-squares problem. Linear Algebra Appl. 431(5–7), 760–771 (2009)
Franc, V., Hlaváč, V., Navara, M.: Sequential coordinate-wise algorithm for non-negative least squares problem. Research report ctu-cmp-2005-06, Center for Machine Perception, Czech Technical University (2005)
Guerrero-García, P., Santos-Palomo, A.: A sparse implementation of Lawson and Hanson’s convex NNLS method. Technical report ma-05-03, University of Málaga (2005)
Kim, D., Sra, S., Dhillon, I.S.: Fast Newton-type methods for the least squares nonnegative matrix approximation problem. In: Proceedings of the 2007 SIAM International Conference on Data Mining, pp. 343–354 (2007)
Kim, D., Sra, S., Dhillon, I.S.: Fast projection-based methods for the least squares nonnegative matrix approximation problem. Stat. Anal. Data Min. 1(1), 38–51 (2008)
Kim, J., Park, H.: Toward faster nonnegative matrix factorization: A new algorithm and comparisons. In: Proceedings of the IEEE International Conference on Data Mining, pp. 353–362 (2008)
Kuhn, H.W., Tucker, A.W.: Nonlinear programming. In: Neyman, J. (ed.) Proceedings of the Second Berkeley Symposium on Mathematical Statistics and Probability, pp. 481–492. University of California Press, Berkeley (1951)
Lawson, C., Hanson, R.: Solving Least Squares Problems. Prentice-Hall, New York (1974)
Lin, C.J.: Projected gradient methods for nonnegative matrix factorization. Neural Comput. 19(10), 2756–2779 (2007)
Luo, Y., Duraiswami, R.: Efficient parallel nonnegative least squares on multicore architectures. SIAM J. Sci. Comput. 33, 2848 (2011)
Portugal, L.F., Judice, J.J., Vicente, L.N.: A comparison of block pivoting and interior-point algorithms for linear least squares problems with nonnegative variables. Math. Comput. 63, 625–643 (1994)
Timotheou, S.: Nonnegative least squares learning for the random neural network. In: ICANN, pp. 195–204 (2008)
Van Benthem, M.H., Keenan, M.R.: Fast algorithm for the solution of large-scale non-negativity-constrained least squares problems. J. Chemom. 18, 441–450 (2004)
Vo, N., Moran, B., Challa, S.: Nonnegative-least-square classifier for face recognition. In: Advances in Neural Networks, pp. 449–456 (2009)
Acknowledgements
This work is supported by the National High-Tech Research and Development Plan of China under Grant No. 2010AA012302 and the National Basic Research Program of China (973) under Grant No. 2006CB605102. The authors would like to thank Professor Ahmed Sameh at Purdue University for helpful discussions and support from the beginning of this study. Thanks also go to Professor Stratis Gallopoulos at the University of Patras for his comments and pointers to helpful information during the study of the method and the writing of the article. The authors would also like to thank Professor Guangwen Yang and Wei Xue for providing a supportive environment for this study.
Copyright information
© 2012 Springer-Verlag London Limited
About this chapter
Cite this chapter
Wang, M., Wang, X. (2012). A Jump-Start of Non-negative Least Squares Solvers. In: Berry, M., et al. High-Performance Scientific Computing. Springer, London. https://doi.org/10.1007/978-1-4471-2437-5_15
DOI: https://doi.org/10.1007/978-1-4471-2437-5_15
Publisher Name: Springer, London
Print ISBN: 978-1-4471-2436-8
Online ISBN: 978-1-4471-2437-5