An Analysis of Large Data Classification using Ensemble Neural Network

  • Abstract

    In this paper, the operational and complexity characteristics of a proposed ensemble model of multiple Artificial Neural Network (ANN) classifiers are investigated. The main idea is to employ multiple classifiers to obtain more accurate predictions and to enhance classification capability on larger data. Classification results are compared between a single classifier and multiple classifiers, followed by estimates of the upper bounds of the converged functional error under partitioning of the benchmark dataset. The estimates, derived using the Apriori method, show that the proposed ensemble ANN algorithm is feasible: problems with a high number of inputs and classes can be solved in time O(n^k) for some constant k, i.e., in polynomial time. This result is in line with the significant performance gains achieved by applying the diversity rule through a reordering technique. In conclusion, an ensemble of heterogeneous ANN classifiers is practical, and relevant to both the theoretical and experimental study of combiners for ensemble ANN classifier systems on large datasets.
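    To illustrate the kind of output combiner such an ensemble relies on, the sketch below shows plurality (majority) voting over the label predictions of several base classifiers. This is a minimal, generic example, not the paper's specific combiner or reordering algorithm; the base predictions `p1`–`p3` are hypothetical stand-ins for trained ANN outputs.

    ```python
    from collections import Counter

    def majority_vote(predictions):
        """Combine per-classifier label predictions by plurality voting.

        predictions: list of lists, where predictions[i][j] is the label
        that base classifier i assigns to sample j. Ties are broken in
        favour of the label encountered first.
        """
        n_samples = len(predictions[0])
        combined = []
        for j in range(n_samples):
            # Count the votes that all base classifiers cast for sample j.
            votes = Counter(clf_preds[j] for clf_preds in predictions)
            combined.append(votes.most_common(1)[0][0])
        return combined

    # Three hypothetical base ANN outputs over five samples:
    p1 = [0, 1, 1, 2, 0]
    p2 = [0, 1, 2, 2, 0]
    p3 = [1, 1, 1, 2, 2]
    print(majority_vote([p1, p2, p3]))  # -> [0, 1, 1, 2, 0]
    ```

    Because each base classifier votes independently, the combiner's cost is linear in the number of classifiers and samples, so it does not change the polynomial overall complexity discussed above.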



  • Keywords

    Classification; Complexity Approximation; Ensemble Neural Network; Large Data Neural Network Classifier

  • References

      [1] Ciresan, D.C., et al., “Deep big simple neural nets excel on handwritten digit recognition”. Neural Computation, (2010). 22(12): 3207 - 3220.

      [2] Bishop, C.M., Neural networks for pattern recognition. (1995): Oxford: Clarendon.

      [3] Egmont-Petersen, M., D.d. Ridder, and H. Handels, Image processing with neural networks—a review. The Journal of Pattern Recognition Society, (2002). 35: 2279–2301.

      [4] Stahl, F., M. Gaber, and M. Bramer, Scaling up data mining techniques to large datasets using parallel and distributed processing, in Business intelligence and performance management, P. Rausch, A.F. Sheta, and A. Ayesh, Editors. (2013), Springer London. 243-259.

      [5] Windeatt, T., Accuracy diversity and ensemble MLP classifier design. IEEE Transactions on Neural Networks, (2006). 17(5): 1194-1211.

      [6] Sospedra, J.T., Ensembles of artificial neural networks: Analysis and development of design methods, in Department of Computer Science and Engineering. (2011), Universitat Jaume I: Castellon.

      [7] Ceamanosa, X., et al., A classifier ensemble based on fusion of support vector machines for classifying hyperspectral data. International Journal of Image and Data Fusion, (2010). 1(3): 293–307.

      [8] Zang, W., et al., Comparative study between incremental and ensemble learning on data streams: Case study. Journal of Big Data, (2014). 1(1): 5.

      [9] Kuncheva, L. and J. Rodríguez, A weighted voting framework for classifiers ensembles. Knowledge and Information Systems, (2014). 38(2): 259-275.

      [10] Torres-Sospedra, J., C. Hernández-Espinosa, and M. Fernández-Redondo, Introducing reordering algorithms to classic well-known ensembles to improve their performance, in Neural information processing, B.-L. Lu, L. Zhang, and J. Kwok, Editors. (2011), Springer Berlin Heidelberg. 572-579.

      [11] Woźniak, M., M. Graña, and E. Corchado, A survey of multiple classifier systems as hybrid systems. Information Fusion, (2014). 16(0): 3-17.

      [12] Shields, M.W. and M.C. Casey, A theoretical framework for multiple neural network systems. Neurocomputing, (2008). 71(7–9): 1462-1476.

      [13] Mohamad, M., M.Y.M. Saman, and M.S. Hitam, The use of output combiners in enhancing the performance of large data for ANNs. IAENG International Journal of Computer Science, (2014). 41(1): 38-47.

      [14] Zinkevich, M., et al. Parallelized stochastic gradient descent. in Advances in neural information processing systems, (2010), 2595-2603.

      [15] Babii, S. Performance evaluation for training a distributed backpropagation implementation. in 4th International Symposium on Applied Computational Intelligence and Informatics. Timisoara IEEE, (2007), 273-278.

      [16] Yu, L., S. Wang, and K.K. Lai, Credit risk assessment with a multistage neural network ensemble learning approach. Expert Systems with Applications, (2008). 34(2): 1434-1444.

      [17] Engel, A., Complexity of learning in artificial neural networks. Theoretical computer science, (2001). 265(1–2): 285-306.

      [18] Sharma, K. and D. Garg, Complexity analysis in heterogeneous system. Computer and Information Science, (2009). 2(1): 48.

      [19] Alizadeh, H., M.-B. Behrouz, and H. Parvin, To improve the quality of cluster ensembles by selecting a subset of base clusters. Journal of Experimental & Theoretical Artificial Intelligence, (2014). 26(1): 127-150.

      [20] Ghaemi, R., et al., A survey: Clustering ensembles techniques. World Academy of Science, Engineering and Technology, (2009).




Article ID: 11155
DOI: 10.14419/ijet.v7i2.14.11155

Copyright © 2012-2015 Science Publishing Corporation Inc. All rights reserved.