Data centers are a long-term investment with a slow payback, and – as specialists attest with numerous real-world cases – they show a strong reluctance to adopt innovative technologies.

The best example in this respect is liquid cooling. Although it delivers significant gains and savings, it is unlikely ever to completely replace the air cooling system of a data center that was designed from the outset around air cooling. The reasons are many, ranging from structural and technological limitations to the financial effort of the replacement process and the risk of unplanned downtime. Small-scale experiments and pilot projects can of course be carried out, but the probability of a complete replacement is very low.

It is a justified “attitude”, specialists say, given that the main purpose of a data center is to provide high levels of availability and security for data and applications – and, to the extent that existing capacity allows, scalability, flexibility, advanced disaster recovery capabilities, and so on.

Even if innovation is not a requirement expressly listed in the “factsheet” of a data center, this does not mean its hardware and software architectures remain unchanged throughout its life. Modernization processes and changes are inevitable – they are simply more subtle, or at least less obvious than replacing a cooling system.

Hyper-convergence through Software-Defined Infrastructure

According to experts, a growing number of large data centers currently sit at the border between the post-virtualization stage and the stage generically called Software-Defined Infrastructure. We are witnessing a fundamental change in data center management: a slow but steady move away from the hardware-centric approach toward a focus on applications, virtualization, and resource utilization levels. In this new stage, the definition of policies and rules at the virtual machine level has a major influence on the degree of automation, with effects visible at the operational and administrative levels. Greater automation significantly reduces configuration and reconfiguration effort, and the immediate results are increased agility, fewer errors, and lower operational costs. These changes also have a direct impact on IT specialists, who need to develop new skills and abilities.
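The shift from hand-configured hardware to policy-driven automation can be sketched in a few lines. The example below is illustrative only – the metric names, thresholds, and actions are assumptions, not any specific vendor's API – but it shows the basic idea: an administrator declares rules at the virtual machine level, and an engine applies them automatically instead of someone reconfiguring by hand.

```python
# Minimal sketch (illustrative, not a real product API) of policy-driven
# automation at the VM level: each policy maps a metric threshold to an
# action, and an engine evaluates the rules against current VM metrics.
from dataclasses import dataclass

@dataclass
class Policy:
    metric: str       # e.g. "cpu_load", a utilization ratio in 0.0-1.0
    threshold: float  # trigger level
    action: str       # e.g. "add_vcpu" (hypothetical action name)

def evaluate(policies, vm_metrics):
    """Return the actions triggered for one VM's current metrics."""
    return [p.action for p in policies
            if vm_metrics.get(p.metric, 0.0) > p.threshold]

policies = [
    Policy("cpu_load", 0.85, "add_vcpu"),
    Policy("mem_usage", 0.90, "migrate_to_larger_host"),
]

# A VM running hot on CPU but comfortable on memory:
actions = evaluate(policies, {"cpu_load": 0.92, "mem_usage": 0.40})
print(actions)  # ['add_vcpu']
```

The point of the sketch is the inversion of control: operational knowledge lives in declarative rules rather than in manual reconfiguration steps, which is what reduces effort and errors as automation increases.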

The adoption of Software-Defined Infrastructure architectures responds to a requirement data centers increasingly face: hyper-convergence. It is driven by the need to simplify and unify management, and especially by the need to make optimal use of the heterogeneous, modular, and/or aging resources they already own.


Availability prevails

A second significant change highlighted by specialists is that data backup and disaster recovery are becoming increasingly prominent in customer requirements. Although these have traditionally been two distinct categories, virtualization and the development of Cloud services are leading to a new, convergent approach. It will be characterized by significantly lower recovery and restoration times for data and applications (with new compression and deduplication technologies also contributing), by reduced costs, and by a higher level of availability.


The effect of the Cloud model

The rapid adoption of Cloud services in the enterprise environment inevitably influences the IT “consumption” model in organizations, and a growing number of vendors are aligning with the new requirements, coming up with offers designed to provide a level of scalability similar to that of the Cloud. This is because more and more companies require an IT environment able to expand and contract quickly, in line with changing business requirements. It remains to be seen, however, whether organizations can achieve cost and availability levels similar to those offered by Cloud service providers.
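The “expand and contract” behavior at the heart of this model can be sketched as a simple capacity rule. The thresholds, names, and doubling/halving strategy below are assumptions chosen for illustration – real autoscalers are driven by richer signals – but the sketch captures the elasticity that the Cloud consumption model made the norm.

```python
# Hypothetical sketch of elastic capacity: scale out when utilization is
# high, scale in when it is low, and clamp the result to floor/ceiling
# limits. All thresholds and names are illustrative assumptions.
def target_capacity(current, utilization, lo=0.30, hi=0.75,
                    min_cap=1, max_cap=32):
    """Return the next instance count given current count and utilization."""
    if utilization > hi:
        current *= 2                     # expand quickly under load
    elif utilization < lo:
        current = max(current // 2, 1)   # contract gradually when idle
    return max(min_cap, min(current, max_cap))

print(target_capacity(4, 0.90))  # 8  (busy -> expand)
print(target_capacity(8, 0.10))  # 4  (idle -> contract)
print(target_capacity(1, 0.50))  # 1  (steady -> unchanged)
```

The open question the article raises maps directly onto this sketch: matching Cloud providers on cost depends on how tightly an on-premises environment can track such a curve without stranding capacity.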


The Open Source attack

In 2015, Open Source technologies will significantly strengthen their position in the software architecture of data centers. True, this is a prediction that returns periodically, but this year experts say the number of data centers moving from the experimental phase to production – developing projects based on technologies such as KVM, OpenStack, and OpenFlow – will become truly significant.

It is a prediction confirmed by Gartner analysts, who estimate that by 2016, 95% of IT organizations will use Open Source technologies to provide key services to customers. This is a huge step, considering that only five years ago the adoption level of Open Source solutions was just 20%.

Data centers are the foundation of the new as-a-service model of IT consumption. However, the pace at which new trends are adopted depends on the return on investment achievable in each vertical.
