Partial Histogram Bayes Learning Algorithm for Classification Applications

  • Authors

    • Haider O. Lawend
    • Anuar M. Muad
    • Aini Hussain
    https://doi.org/10.14419/ijet.v7i4.11.20787
  • Keywords

    Classification, Histogram noise estimation and reduction, Histogram probability distribution, Naïve Bayes, Supervised learning.
  • Abstract

    This paper presents a proposed supervised classification technique, the partial histogram Bayes (PHBayes) learning algorithm. Conventional classifiers based on the Gaussian function are limited when dealing with different probability distribution functions and require large memory when the number of instances is large. Alternatively, histogram-based classifiers are flexible across different probability density functions. The aims of PHBayes are to handle datasets with a large number of instances using less memory, and to be fast in both the training and testing phases. PHBayes depends on the portion of the observed histogram that is similar to the probability density function. PHBayes was analyzed using synthetic and real data, and several factors affecting classification accuracy were considered. Compared with other established classifiers, PHBayes demonstrated more accurate classification, required less memory even when dealing with a large number of instances, and was faster in both the training and testing phases.
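
    The abstract does not reproduce the paper's exact partial-histogram rule, but the idea lends itself to a short illustration. Below is a minimal sketch of a histogram-based naive Bayes classifier in the spirit of PHBayes: each class-conditional density is estimated by a per-feature histogram, and a "partial" histogram is formed by discarding low-count bins as noise. The noise_quantile cutoff, the bin count, and the class structure here are illustrative assumptions, not the authors' exact method.

    # Minimal sketch of a histogram-based naive Bayes classifier in the
    # spirit of PHBayes. The "partial histogram" step keeps only bins whose
    # counts exceed a noise cutoff; the paper's exact truncation rule is not
    # given in the abstract, so this quantile cutoff is an assumption.
    import numpy as np

    class HistogramBayes:
        def __init__(self, n_bins=20, noise_quantile=0.1):
            self.n_bins = n_bins
            self.noise_quantile = noise_quantile  # assumed noise-rejection rule

        def fit(self, X, y):
            X, y = np.asarray(X, float), np.asarray(y)
            self.classes_ = np.unique(y)
            self.priors_ = np.array([(y == c).mean() for c in self.classes_])
            self.edges_ = [np.histogram_bin_edges(X[:, j], bins=self.n_bins)
                           for j in range(X.shape[1])]
            # One histogram per (class, feature); only bin counts are stored,
            # so memory does not grow with the number of training instances.
            self.hists_ = np.zeros((len(self.classes_), X.shape[1], self.n_bins))
            for i, c in enumerate(self.classes_):
                Xc = X[y == c]
                for j in range(X.shape[1]):
                    counts, _ = np.histogram(Xc[:, j], bins=self.edges_[j])
                    # "Partial" histogram: zero out low-count bins as noise.
                    cutoff = np.quantile(counts, self.noise_quantile)
                    counts = np.where(counts > cutoff, counts, 0)
                    # Laplace smoothing keeps every bin probability positive.
                    self.hists_[i, j] = (counts + 1) / (counts.sum() + self.n_bins)
            return self

        def predict(self, X):
            X = np.asarray(X, float)
            # Start from the log prior of each class, one row per sample.
            logp = np.log(self.priors_)[None, :].repeat(len(X), axis=0)
            for j in range(X.shape[1]):
                # Map each feature value to its histogram bin index.
                idx = np.clip(np.searchsorted(self.edges_[j], X[:, j]) - 1,
                              0, self.n_bins - 1)
                logp += np.log(self.hists_[:, j, idx]).T
            return self.classes_[np.argmax(logp, axis=1)]

    For example, HistogramBayes(n_bins=20).fit(X_train, y_train).predict(X_test) classifies new samples; because only bin counts are retained after training, memory is independent of the number of training instances, which matches the abstract's motivation.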



  • How to Cite

    Lawend, H. O., Muad, A. M., & Hussain, A. (2018). Partial Histogram Bayes Learning Algorithm for Classification Applications. International Journal of Engineering & Technology, 7(4.11), 126-132. https://doi.org/10.14419/ijet.v7i4.11.20787

    Received date: 2018-10-02

    Accepted date: 2018-10-02

    Published date: 2018-10-02