Design of testing framework for code smell detection (OOPS) using BFO algorithm

  • Authors

    • Pratiksha Sharma, Chandigarh University
    • Er. Arshpreet Kaur, Chandigarh University
    2018-08-06
    https://doi.org/10.14419/ijet.v7i2.27.14635
  • Keywords: Software Metrics, Code Smell Detection, BFOA Method, God Class, Lazy Class.
  • Abstract

    Code smell detection refers to identifying indications in program source code that may signal deeper problems for software maintenance and evolution. Detecting smells is a major challenge for software developers, and their informal classification has led to the design of various detection methods and tools. This work appraises four code smell detection tools: inFusion, JDeodorant, PMD, and JSpIRIT. Object-oriented software metrics (OOSMs) are used to analyze the source code, and a plug-in reports the position in the source code where a bad smell appears so that refactoring can then take place. Code smells are classified into types such as Long Method, Parallel Inheritance Hierarchy (PIH), Long Parameter List (LPL), Lazy Class (LC), Shotgun Surgery (SS), and God Class. Detecting a smell and applying the correct corrective steps when required is essential for improving the quality of the code. Various tools have been proposed for code smell detection, each characterized by particular properties; this work describes the proposed method alongside these tools, identifies the major differences between them, and reports the differing results obtained. The major drawback of current work is its focus on one particular language, which restricts each tool to one kind of program; such tools also fail to detect smelly code when any change in environment is encountered. The base paper compares the most popular code smell detection tools on factors such as accuracy and false positive rate, which gives a clear picture of the functionality these tools possess.
    In this paper, a technique is designed to identify code smells. For this purpose, various object-oriented programming (OOP) metrics together with their maintainability index (MI) are used. Code refactoring and optimization are then applied to reduce the maintainability index, and the proposed scheme is evaluated, achieving satisfactory results. The Bacterial Foraging Optimization Algorithm (BFOA) test showed that the Lazy Class smell caused framework defects in DLS, DR, and SE, whereas the Long Parameter List caused no framework defects whatsoever. The association rules test found that the Lazy Class code smell (LCCS) caused structural defects in DE and DLS, which corresponded to the results of the BFOA test.
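    The abstract does not give the exact maintainability-index formulation used; a common choice is the classic formula built from Halstead volume, cyclomatic complexity, and lines of code (under that convention, a higher MI means easier maintenance, so refactoring is expected to raise it). A minimal sketch with illustrative metric values, not figures from the paper:

    ```python
    import math

    def maintainability_index(halstead_volume: float,
                              cyclomatic_complexity: float,
                              loc: int) -> float:
        """Classic (unnormalized) maintainability index:
        MI = 171 - 5.2*ln(HV) - 0.23*CC - 16.2*ln(LOC).
        Under this convention, higher MI means easier maintenance."""
        return (171
                - 5.2 * math.log(halstead_volume)
                - 0.23 * cyclomatic_complexity
                - 16.2 * math.log(loc))

    # Illustrative values only: a long, complex method scores lower
    # than the same logic after refactoring into smaller units.
    before = maintainability_index(halstead_volume=3200,
                                   cyclomatic_complexity=25, loc=400)
    after = maintainability_index(halstead_volume=700,
                                  cyclomatic_complexity=6, loc=60)
    assert after > before  # refactoring improves the index
    ```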

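    Metric-based detection of the smells named above typically reduces to comparing per-class metrics against thresholds. The sketch below is illustrative only: the metric set, the `ClassMetrics` record, and the cut-off values are assumptions, not the thresholds used in the paper.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    # Illustrative thresholds; the paper does not publish its cut-offs.
    MAX_PARAMS = 5    # above this -> Long Parameter List (LPL)
    MIN_METHODS = 2   # below this (with small size) -> Lazy Class (LC)
    MIN_LOC = 10

    @dataclass
    class ClassMetrics:
        name: str
        loc: int
        method_param_counts: List[int] = field(default_factory=list)

    def detect_smells(cls: ClassMetrics) -> List[str]:
        """Flag LC and LPL by simple threshold checks on class metrics."""
        smells = []
        if len(cls.method_param_counts) < MIN_METHODS and cls.loc < MIN_LOC:
            smells.append("Lazy Class")
        if any(p > MAX_PARAMS for p in cls.method_param_counts):
            smells.append("Long Parameter List")
        return smells

    print(detect_smells(ClassMetrics("Helper", loc=6, method_param_counts=[0])))
    # -> ['Lazy Class']
    ```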
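    The abstract does not specify the paper's BFOA configuration; the sketch below shows only the chemotaxis (tumble-and-swim) step of bacterial foraging optimization on a toy objective standing in for a metric-based fitness function. The function name and parameters are hypothetical, and full BFOA adds reproduction and elimination-dispersal phases, omitted here.

    ```python
    import random

    def bfoa_minimize(f, dim, bounds, n_bacteria=20, chem_steps=50,
                      swim_len=4, step_size=0.1, seed=42):
        """Chemotaxis-only BFO sketch: each bacterium tumbles in a random
        direction, then keeps swimming that way while fitness improves."""
        rng = random.Random(seed)
        lo, hi = bounds
        pop = [[rng.uniform(lo, hi) for _ in range(dim)]
               for _ in range(n_bacteria)]
        best, best_cost = None, float("inf")
        for _ in range(chem_steps):
            for b in pop:
                cost = f(b)
                # tumble: pick a random unit direction
                d = [rng.uniform(-1, 1) for _ in range(dim)]
                norm = sum(x * x for x in d) ** 0.5 or 1.0
                d = [x / norm for x in d]
                for _ in range(swim_len):  # swim while improving
                    cand = [min(hi, max(lo, x + step_size * dx))
                            for x, dx in zip(b, d)]
                    c = f(cand)
                    if c < cost:
                        b[:], cost = cand, c
                    else:
                        break
                if cost < best_cost:
                    best, best_cost = list(b), cost
        return best, best_cost

    # Toy fitness: sphere function (stand-in for a maintainability objective).
    best, cost = bfoa_minimize(lambda v: sum(x * x for x in v),
                               dim=2, bounds=(-5, 5))
    ```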
  • How to Cite

    Sharma, P., & Kaur, A. (2018). Design of testing framework for code smell detection (OOPS) using BFO algorithm. International Journal of Engineering & Technology, 7(2.27), 161-166. https://doi.org/10.14419/ijet.v7i2.27.14635

    Received date: 2018-06-23

    Accepted date: 2018-07-29

    Published date: 2018-08-06