Abstract
Building on the advantages of the basic non-negative sparse coding (NNSC) model, and taking the prior class constraint of image features into account, a novel NNSC model is discussed here. In this model, the sparseness criterion is a two-parameter density estimation model, and the dispersion ratio of within-class to between-class scatter is used as the class constraint. With this NNSC model, image features can be extracted successfully, and the subsequent recognition task can be carried out with different classifiers. Simulation results show that the proposed NNSC model is effective for both image feature extraction and recognition.
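As a point of reference, the baseline NNSC model the paper extends can be sketched as follows. This is only a minimal illustration of Hoyer-style NNSC (squared reconstruction error plus an L1 sparseness penalty, with non-negativity on both factors); the paper's two-parameter density sparseness criterion and the within-/between-class dispersion-ratio constraint are not implemented here, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def nnsc(X, n_components, lam=0.1, n_iter=200, lr=0.01, seed=0):
    """Baseline non-negative sparse coding (sketch, not the paper's model).

    Approximately minimizes ||X - W H||_F^2 + lam * sum(H)
    subject to W >= 0, H >= 0, with unit-norm basis columns in W.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, n_components))
    W /= np.linalg.norm(W, axis=0, keepdims=True)
    H = rng.random((n_components, n))
    for _ in range(n_iter):
        # Multiplicative update for the sparse codes H (stays non-negative
        # because numerator and denominator are non-negative).
        H *= (W.T @ X) / (W.T @ W @ H + lam + 1e-12)
        # Projected gradient step for the basis W, then clip to
        # non-negative values and renormalize each column.
        W -= lr * (W @ H - X) @ H.T
        W = np.maximum(W, 1e-12)
        W /= np.linalg.norm(W, axis=0, keepdims=True)
    return W, H
```

In use, each column of `X` is a (non-negative) image feature vector, `W` holds the learned basis images, and the sparse columns of `H` serve as the extracted features that would then be fed to a classifier (e.g., ELM or SVM, as in the references).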
Acknowledgments
This work was supported by the National Natural Science Foundation of China (Nos. 61373098, 61370109).
Cite this article
Wang, X., Wang, C., Shang, L. et al. Dispersion Constraint Based Non-negative Sparse Coding Model. Neural Process Lett 43, 603–609 (2016). https://doi.org/10.1007/s11063-015-9432-7