An Efficient IDS Based on Fuzzy Firefly Optimization and Fast Learning Network

  • Authors

    • Bh Dasaradha Ram
    • B. V. Subba Rao
    Published: 2018-12-09
    DOI: https://doi.org/10.14419/ijet.v7i4.36.24137
  • Keywords: Fast learning network, IDS, Fuzzy Firefly, ANN.
  • Abstract

    A supervised Intrusion Detection System (IDS) is a system that can learn from examples of past attacks in order to recognize new ones. ANN-based intrusion detection is promising for reducing the number of false negatives and false positives, because an ANN can learn from real examples. In this article, a learning model for the Fast Learning Network (FLN) based on fuzzy firefly optimization (FFO) is proposed and named FF-FLN. The model is applied to the intrusion detection problem and validated on the well-known KDD99 dataset. The proposed method is compared against a broad range of meta-heuristic algorithms for training ELM and FLN classifiers, and FF-FLN outperforms the other learning approaches in testing accuracy.
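
    A minimal Python sketch of the general idea is given below for illustration only; it is not the authors' code. A plain firefly search tunes the FLN's random input weights and hidden biases, the FLN output weights are then solved in closed form by least squares over both the hidden activations and the raw inputs, and fitness is training accuracy. Synthetic two-class data stands in for KDD99, every name in the snippet is assumed rather than taken from the paper, and the fuzzy adaptation of the firefly parameters (beta0, gamma, alpha) is omitted.

```python
# Illustrative sketch only (assumed names, synthetic data): a firefly-optimized FLN classifier.
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for KDD99: two classes ("attack" vs. "normal"), 10 numeric features.
n, d, h = 400, 10, 15                      # samples, input features, hidden neurons
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.5).astype(float)
Xtr, ytr, Xte, yte = X[:300], y[:300], X[300:], y[300:]

def unpack(vec):
    """Split a flat firefly position into FLN input weights and hidden biases."""
    return vec[:d * h].reshape(d, h), vec[d * h:].reshape(1, h)

def design_matrix(Xs, Win, b):
    """FLN design matrix: hidden activations concatenated with the raw inputs and a bias column."""
    H = np.tanh(Xs @ Win + b)
    return np.hstack([H, Xs, np.ones((len(Xs), 1))])

def output_weights(Xs, ys, Win, b):
    """FLN output weights have a closed-form least-squares solution."""
    G = design_matrix(Xs, Win, b)
    Wout, *_ = np.linalg.lstsq(G, ys, rcond=None)
    return Wout

def accuracy(Xs, ys, Win, b, Wout):
    return np.mean((design_matrix(Xs, Win, b) @ Wout > 0.5) == ys)

def fitness(vec):
    Win, b = unpack(vec)
    return accuracy(Xtr, ytr, Win, b, output_weights(Xtr, ytr, Win, b))

# Plain firefly search over the FLN's input-side parameters.
dim, n_ff, iters = d * h + h, 20, 30
beta0, gamma, alpha = 1.0, 1.0, 0.2        # FF-FLN would adapt these with fuzzy rules
swarm = rng.uniform(-1, 1, size=(n_ff, dim))
light = np.array([fitness(f) for f in swarm])

for _ in range(iters):
    for i in range(n_ff):
        for j in range(n_ff):
            if light[j] > light[i]:        # move dimmer firefly i toward brighter firefly j
                beta = beta0 * np.exp(-gamma * np.sum((swarm[i] - swarm[j]) ** 2))
                swarm[i] += beta * (swarm[j] - swarm[i]) + alpha * rng.uniform(-0.5, 0.5, dim)
                light[i] = fitness(swarm[i])

Win, b = unpack(swarm[np.argmax(light)])
Wout = output_weights(Xtr, ytr, Win, b)
print("held-out accuracy:", accuracy(Xte, yte, Win, b, Wout))
```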

  • How to Cite

    Dasaradha Ram, B., & V. Subba Rao, B. (2018). An Efficient IDS Based on Fuzzy Firefly Optimization and Fast Learning Network. International Journal of Engineering & Technology, 7(4.36), 557-561. https://doi.org/10.14419/ijet.v7i4.36.24137

    Received date: 2018-12-16

    Accepted date: 2018-12-16

    Published date: 2018-12-09