Analysis of AI algorithms for foreseeing university student’s academic and co-curricular performance

  • Authors

    • Soheli Farhana Universiti Kuala Lumpur
    • Adidah Lajis Universiti Kuala Lumpur
    • Zalizah Awang Long Universiti Kuala Lumpur
    • Haidawati Nasir Universiti Kuala Lumpur
    Published: 2019-07-24
    DOI: https://doi.org/10.14419/ijet.v7i4.29241
  • Keywords: Performance, Student Academic, AI Algorithms.
  • Abstract

    Evaluating student performance has become more difficult because of the huge volume of information held in educational databases. At present, universities lack a framework for analysing and monitoring student progress and performance, and this gap remains unaddressed for three main reasons. First, research on existing prediction methods is still insufficient to identify the techniques best suited to intelligent prediction within such a system. Second, there are too few studies on the factors that influence students’ achievement in particular courses within the university system. Third, the correlation between academic and co-curricular activities is not available. Accordingly, a systematic literature review on predicting student performance using different artificial intelligence (AI) algorithms is carried out with the aim of improving performance outcomes. This paper briefly discusses and analyses these AI algorithms for predicting the performance of university students by correlating academic and co-curricular values. Finally, by comparing the results of this analysis, an optimal algorithm is proposed for building the performance-analysis system; the proposed algorithm achieves an accuracy of 95.38%. The approach can bring benefits to students, instructors and academic institutions.
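    The comparison the abstract describes — evaluating several supervised-learning algorithms on academic and co-curricular attributes — can be illustrated with a short sketch. The synthetic data, the feature names (CGPA, attendance, co-curricular hours) and the scikit-learn models below are assumptions for illustration only, not the authors' dataset, proposed algorithm or reported 95.38% result.

```python
# A minimal sketch (not the authors' implementation): comparing several
# common classifiers on a synthetic table of academic and co-curricular
# features. Feature names, data and models are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500

# Hypothetical student attributes: CGPA, attendance rate, assessment average,
# weekly co-curricular hours and an activity/leadership score.
X = np.column_stack([
    rng.uniform(2.0, 4.0, n),    # CGPA
    rng.uniform(0.5, 1.0, n),    # attendance rate
    rng.uniform(40, 100, n),     # assessment average (%)
    rng.uniform(0, 10, n),       # co-curricular hours per week
    rng.integers(0, 5, n),       # leadership/activity score
])

# Toy target: "good overall performance" when a weighted mix of academic and
# co-curricular signals crosses a threshold (purely for demonstration).
score = 0.5 * X[:, 0] / 4 + 0.3 * X[:, 2] / 100 + 0.2 * X[:, 3] / 10
y = (score > 0.70).astype(int)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes":   GaussianNB(),
    "k-NN":          make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM (RBF)":     make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale")),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# 5-fold cross-validated accuracy for each candidate algorithm.
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name:13s} mean accuracy = {acc:.4f}")
```

    Reporting a cross-validated accuracy per model mirrors how such comparisons are typically tabulated before selecting the best-performing algorithm.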


  • References

    [1] Kuncel, N.R., Hezlett, S.A. and Ones, D.S., 2001. A comprehensive meta-analysis of the predictive validity of the graduate record examinations: implications for graduate student selection and performance. Psychological bulletin, 127(1), p.162. https://doi.org/10.1037//0033-2909.127.1.162.

      [2] Hunsaker, S.L., Finley, V.S. and Frank, E.L., 1997. An analysis of teacher nominations and student performance in gifted programs. Gifted Child Quarterly, 41(2), pp.19-24. https://doi.org/10.1177/001698629704100203.

      [3] Anderson, J.R., 2013. Analysis of student performance with the LISP tutor. In Diagnostic monitoring of skill and knowledge acquisition (pp. 45-68). Routledge.

      [4] Smith, J. and Naylor, R., 2001. Determinants of degree performance in UK universities: a statistical analysis of the 1993 student cohort. Oxford Bulletin of Economics and Statistics, 63(1), pp.29-60. https://doi.org/10.1111/1468-0084.00208.

      [5] Topor, D.R., Keane, S.P., Shelton, T.L. and Calkins, S.D., 2010. Parent involvement and student academic performance: A multiple mediational analysis. Journal of prevention & intervention in the community, 38(3), pp.183-197. https://doi.org/10.1080/10852352.2010.486297.

      [6] Pandey, M. and Sharma, V.K., 2013. A decision tree algorithm pertaining to the student performance analysis and prediction. International Journal of Computer Applications, 61(13). https://doi.org/10.5120/9985-4822.

      [7] Wang, A.Y., Newlin, M.H. and Tucker, T.L., 2001. A discourse analysis of online classroom chats: Predictors of cyber-student performance. Teaching of Psychology, 28(3), pp.222-226. https://doi.org/10.1207/S15328023TOP2803_09.

      [8] Angeline, D.M.D., 2013. Association rule generation for student performance analysis using apriori algorithm. The SIJ Transactions on Computer Science Engineering & its Applications (CSEA), 1(1), pp.12-16.

      [9] Hanushek, E.A. and Raymond, M.E., 2005. Does school accountability lead to improved student performance? Journal of Policy Analysis and Management: The Journal of the Association for Public Policy Analysis and Management, 24(2), pp.297-327. https://doi.org/10.1002/pam.20091.

      [10] Shahiri, A.M. and Husain, W., 2015. A review on predicting student's performance using data mining techniques. Procedia Computer Science, 72, pp.414-422. https://doi.org/10.1016/j.procs.2015.12.157.

      [11] Lucas, S., Tuncel, A., Bensalah, K., Zeltser, I., Jenkins, A., Pearle, M. and Cadeddu, J., 2008. Virtual reality training improves simulated laparoscopic surgery performance in laparoscopy naive medical students. Journal of endourology, 22(5), pp.1047-1052. https://doi.org/10.1089/end.2007.0366.

      [12] Osmanbegovic, E. and Suljic, M., 2012. Data mining approach for predicting student performance. Economic Review: Journal of Economics and Business, 10(1), pp.3-12.

      [13] Ayinde, A.Q., Adetunji, A.B., Bello, M. and Odeniyi, O.A., 2013. Performance Evaluation of Naive Bayes and Decision Stump Algorithms in Mining Students' Educational Data. International Journal of Computer Science Issues (IJCSI), 10(4), p.147.

      [14] Schommer, M., 1993. Epistemological development and academic performance among secondary students. Journal of educational psychology, 85(3), p.406. https://doi.org/10.1037//0022-0663.85.3.406.

      [15] Cai, J., 1995. A cognitive analysis of US and Chinese students' mathematical performance on tasks involving computation, simple problem solving, and complex problem solving. Journal for Research in Mathematics Education. Monograph, pp. i-151. https://doi.org/10.2307/749940.

      [16] Richardson, D.R., 2000. Comparison of naive and experienced students of elementary physiology on performance in an advanced course. Advances in Physiology Education, 23(1), pp. S91-95. https://doi.org/10.1152/advances.2000.23.1.S91.

      [17] Troussas, C., Virvou, M., Espinosa, K.J., Llaguno, K. and Caro, J., 2013, July. Sentiment analysis of Facebook statuses using Naive Bayes classifier for language learning. In IISA 2013 (pp. 1-6). IEEE. https://doi.org/10.1109/IISA.2013.6623713.

      [18] Graham, S. and Harris, K.R., 1989. Components analysis of cognitive strategy instruction: Effects on learning disabled students' compositions and self-efficacy. Journal of educational Psychology, 81(3), p.353. https://doi.org/10.1037//0022-0663.81.3.353.

      [19] Sabitha, A.S., Mehrotra, D., Bansal, A. and Sharma, B.K., 2016. A naive bayes approach for converging learning objects with open educational resources. Education and Information Technologies, 21(6), pp.1753-1767. https://doi.org/10.1007/s10639-015-9416-2.

      [20] Thanh Noi, P. and Kappas, M., 2018. Comparison of random forest, k-nearest neighbor, and support vector machine classifiers for land cover classification using Sentinel-2 imagery. Sensors, 18(1), p.18. https://doi.org/10.3390/s18010018.

      [21] Manavalan, B., Shin, T.H. and Lee, G., 2018. DHSpred: support-vector-machine-based human DNase I hypersensitive sites prediction using the optimal features selected by random forest. Oncotarget, 9(2), p.1944. https://doi.org/10.18632/oncotarget.23099.

      [22] Tharwat, A. and Hassanien, A.E., 2018. Chaotic antlion algorithm for parameter optimization of support vector machine. Applied Intelligence, 48(3), pp.670-686. https://doi.org/10.1007/s10489-017-0994-0.

  • How to Cite

    Farhana, S., Lajis, A., Awang Long, Z., & Nasir, H. (2019). Analysis of AI algorithms for foreseeing university student’s academic and co-curricular performance. International Journal of Engineering & Technology, 8(2), 59-62. https://doi.org/10.14419/ijet.v7i4.29241

    Received date: 2019-05-13

    Accepted date: 2019-05-19

    Published date: 2019-07-24