Cold LOGIK and RDHX Solution for Data Center Energy Optimization

  • Authors

    • T Suresh
    • Dr A. Murugan
    2018-06-25
    https://doi.org/10.14419/ijet.v7i3.4.16757
  • Keywords: Data Center Optimization, Cold Logik, RDHx, Rear Door Heat Exchanger, energy saving, power saving, green computing.
  • Abstract: In every type of data center, maintaining the right temperature at low cost and with minimal energy is a key objective, since energy saving is crucial in an increasingly data-driven industry. Energy saving is a global focus for all industries. In information technology, more than 60% of energy is consumed by data centers, which must remain up and running around the clock. According to the Avocent data center issues study, more than 54% of data centers worldwide are being redesigned to improve efficiency and reduce operational cost and energy consumption. A major challenge for data center managers and operators is maintaining server temperatures with less power and energy. As data center power densities approach 5 kilowatts (kW) per cabinet, organizations are looking for ways to manage the heat with newer technologies. Power usage per square foot can be reduced by incorporating liquid-cooling devices instead of increasing airflow volume; this is especially important in a data center with a typical under-floor cooling system. This paper applies the Rear-Door Heat eXchanger (RDHx) and Cold Logik solutions to reduce energy consumption. It presents the results of implementing the Cold Logik RDHx solution in a data center and shows how it saves energy and power. By implementing RDHx technology, the data center optimized space, cooling, power, and operational cost, enabling it to add more servers without increasing floor space while reducing cooling and power costs. It also protects the data center space from the heat dissipated by servers.
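    As an illustration of the kind of saving the abstract describes, the cooling overhead of a conventional under-floor air-cooled room can be compared against an RDHx-equipped room with a back-of-the-envelope estimate. The per-rack density of ~5 kW comes from the abstract; the cooling-overhead fractions below are assumed values chosen only for illustration, not measurements from the paper.

    ```python
    # Sketch: annual cooling energy for conventional CRAC/under-floor cooling
    # versus a rear-door heat exchanger (RDHx). The overhead fractions are
    # assumptions for illustration, not figures reported in the paper.

    def annual_cooling_kwh(it_load_kw: float, cooling_overhead: float,
                           hours: float = 8760) -> float:
        """Cooling energy per year, modeled as a fraction of the IT load."""
        return it_load_kw * cooling_overhead * hours

    racks = 20
    kw_per_rack = 5.0              # the abstract's ~5 kW/cabinet density figure
    it_load = racks * kw_per_rack  # total IT load in kW

    crac_overhead = 0.50           # assumed: ~0.5 W of cooling per IT watt (air)
    rdhx_overhead = 0.20           # assumed: ~0.2 W of cooling per IT watt (RDHx)

    crac_kwh = annual_cooling_kwh(it_load, crac_overhead)
    rdhx_kwh = annual_cooling_kwh(it_load, rdhx_overhead)

    print(f"IT load:              {it_load:.0f} kW")
    print(f"CRAC cooling energy:  {crac_kwh:,.0f} kWh/yr (PUE ~ {1 + crac_overhead:.2f})")
    print(f"RDHx cooling energy:  {rdhx_kwh:,.0f} kWh/yr (PUE ~ {1 + rdhx_overhead:.2f})")
    print(f"Estimated saving:     {crac_kwh - rdhx_kwh:,.0f} kWh/yr")
    ```

    Under these assumptions a 100 kW room would save roughly 260 MWh of cooling energy per year; the real figures depend on climate, water temperatures, and rack load, which is what the paper's implementation measures.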


  • References

    [1] "Telecommunications Infrastructure Standard for Data Centers". ihs.com. 2005-04-12. Retrieved 2017-02-28.

      [2] Pruhs, K. "Green Computing Algorithmics". In Proceedings of the 2011 IEEE 52nd Annual Symposium on Foundations of Computer Science (FOCS), Palm Springs, CA, USA, 22–25 October 2011; pp. 3–4.

      [3] 2008 ASHRAE Environmental Guidelines for Datacom Equipment, ASHRAE white paper.

      [4] Luca Parolini, Bruno Sinopoli, Bruce H. Krogh, "Reducing Data Center Energy Consumption via Coordinated Cooling and Load Management", USENIX, The Advanced Computing Systems Association. (https://www.usenix.org)

      [5] Brandon Heller, Srini Seetharaman, Priya Mahadevan, Yiannis Yiakoumis, Puneet Sharma, Sujata Banerjee, Nick McKeown, "ElasticTree: Saving Energy in Data Center Networks". (https://www.usenix.org/legacy/event/nsdi10/tech/full_papers/heller.pdf)

      [6] M. Dayarathna, Y. Wen and R. Fan, "Data Center Energy Consumption Modeling: A Survey," in IEEE Communications Surveys & Tutorials, vol. 18, no. 1, pp. 732-794, Firstquarter 2016.

      [7] Liang Liu, Hao Wang, Xue Liu, Xing Jin, Wen Bo He, Qing Bo Wang, Ying Chen, "GreenCloud: A New Architecture for Green Data Center". ICAC-INDST '09: Proceedings of the 6th International Conference Industry Session on Autonomic Computing and Communications.

      [8] Felipe Abaunza, Ari-Pekka Hameri, Tapio Niemi (2018), "EEUI: a new measure to monitor and manage energy efficiency in data centers", International Journal of Productivity and Performance Management, Vol. 67, Issue 1, pp. 111-127.

      [9] Sahana S., Bose R., Sarddar D. (2018) Server Utilization-Based Smart Temperature Monitoring System for Cloud Data Center. In: Bhattacharyya S., Sen S., Dutta M., Biswas P., Chattopadhyay H. (eds) Industry Interactive Innovations in Science, Engineering and Technology. Lecture Notes in Networks and Systems, vol 11. Springer, Singapore

  • How to Cite

    Suresh, T., & A. Murugan, D. (2018). Cold LOGIK and RDHX Solution for Data Center Energy Optimization. International Journal of Engineering & Technology, 7(3.4), 113-117. https://doi.org/10.14419/ijet.v7i3.4.16757