Tech titans are making it clear that there is plenty of room for innovation within the data center. Microsoft sank a data center to the bottom of the ocean to boost internet speeds and improve the sustainability of its data center operations. Google is using AI to control the cooling systems in some of its server spaces. Buzzy industry trends like automation, edge computing, and AI are all making their way into the data center, and with the biggest tech companies continually innovating in this space, it can seem impossible for companies that aren’t juggernauts to keep up.
In an environment where automation, edge computing, and IoT are constant topics of conversation among industry leaders, why does innovating in the data center seem so challenging for the traditional enterprise? Because of several barriers that have existed for years, preoccupying the enterprise with the challenges of maintaining a brick-and-mortar data center operation. Forget underwater data centers or AI-controlled cooling. Before organizations pursue the futuristic trends they are seeing from goliaths like Google, Microsoft, and Amazon, they first need to get a handle on the more immediate challenges of cloud migration and gathering basic data about their infrastructure. For companies that want to get serious about improving their data center operations, it is important to take a close look at the following factors that could be keeping them from a successful cloud migration.
A Lack of Data and Modern Physical Systems Is Crippling
Small to mid-size enterprises often lack even basic data-gathering capabilities for their data center facilities and operations, and that gap holds them back from making critical decisions and capturing potential savings. Basic data points like power use, server locations, and capacity are preconditions for innovation. Enterprises understand that migrating applications to the cloud can reduce infrastructure costs, but if they can’t answer basic questions about their true needs, making that migration is a major challenge.
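To make the point concrete, here is a minimal sketch of the kind of analysis those basic data points enable; the site names, power figures, and 30-percent threshold are all invented for illustration, not drawn from any real inventory.

```python
# Hypothetical sketch: even simple inventory data -- measured power draw
# and rated capacity per site -- is enough to spot facilities worth
# reviewing as cloud-migration candidates. All figures are invented.

sites = [
    {"name": "DC-East", "power_kw": 120.0, "capacity_kw": 500.0},
    {"name": "DC-West", "power_kw": 430.0, "capacity_kw": 500.0},
    {"name": "Colo-1",  "power_kw": 40.0,  "capacity_kw": 250.0},
]

def utilization(site):
    """Fraction of rated power capacity actually in use."""
    return site["power_kw"] / site["capacity_kw"]

# Sites running well under capacity are the first places to ask whether
# workloads could move to the cloud instead of staying on-premises.
for site in sites:
    u = utilization(site)
    verdict = "review for migration" if u < 0.30 else "keep on-premises"
    print(f"{site['name']}: {u:.0%} utilized -> {verdict}")
```

Without those first two columns of data, even this trivial calculation is impossible, which is the author's point: the barrier is not exotic technology but missing measurements.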
For companies likely to remain on-premises or in a hybrid environment, infrastructure must also be modernized and secured before further innovation can occur. Increasingly, cooling and electrical systems are being connected to the internet, allowing greater control and automation in the server facility. While this connectivity is a step toward modernization, it is also a risk: without an airtight security operation, failures like power outages can open the door for attackers. Before adding more future-focused technology and systems, enterprises must make sure their basic infrastructure is accounted for and secure.
The Investment in the Legacy Data Center Is High
Another factor keeping some companies from starting a cloud migration is the investment they already have in legacy infrastructure. They have staff able to run it and, likely, a facility they have spent a considerable amount to build out and maintain. But companies often forget that a cloud migration doesn’t mean they have to abandon the data center altogether. Increasingly, companies are turning to managed service providers to step in and consult on their capacity. For instance, if a company needs only 30 percent of its on-premises infrastructure, the provider can look to bring in other companies that might be able to use the rest.
Companies that maintain their own data centers should also keep in mind the control they are relinquishing with a migration to the cloud. Yes, a cloud migration allows businesses to save money, work more efficiently, and scale more quickly. However, they should also consider that they are taking something that used to sit in their own building, under their control, and putting it in a space open to users outside their business. When migrating to the cloud from an on-premises data center, it is essential that an enterprise do its due diligence to vet the vendors its cloud provider will be working with.
Regulation Is Increasingly Limiting
Some companies find it extremely difficult to cut ties with their own data center because of new and increased regulation. The General Data Protection Regulation (GDPR), for example, makes many companies wary of relinquishing any kind of control over their data because of how costly mistakes can be. A single serious violation carries a fine of up to €20 million (more than $20 million), or four percent of the company’s worldwide annual revenue, whichever is higher. Industry-specific regulation is a factor, too. Especially in highly regulated industries like healthcare or finance, companies often simply feel more secure managing all the data themselves.
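The "whichever is higher" structure of GDPR's top-tier cap is what makes the exposure scale with company size. A quick sketch of that rule, using illustrative revenue figures (the function name and inputs are invented for this example):

```python
# Sketch of GDPR's top-tier fine cap: the greater of EUR 20 million or
# 4% of worldwide annual turnover. Revenue figures are illustrative.

FLAT_CAP_EUR = 20_000_000
TURNOVER_RATE = 0.04  # 4% of global annual turnover

def max_gdpr_fine(annual_turnover_eur):
    """Upper bound of a top-tier GDPR fine for a given turnover."""
    return max(FLAT_CAP_EUR, TURNOVER_RATE * annual_turnover_eur)

# For a smaller firm the flat EUR 20M cap dominates; for a large
# enterprise the 4% term takes over and keeps growing with revenue.
print(max_gdpr_fine(100_000_000))    # flat cap applies
print(max_gdpr_fine(5_000_000_000))  # percentage term applies
```

For a large enterprise, that percentage term, not the flat cap, is what drives the wariness described above.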
Innovating the operations of the data center is possible for enterprises, but it’s critical that they overcome these barriers that have plagued data centers for years. Doing so opens the door to incorporating IoT, AI, and other systems that can save time and money. While they won’t be putting servers underwater anytime soon, companies that can overcome these challenges now will be well positioned to meet the challenges of the future.
About the author
Dave Eastman is the VP of InCommand at ServerFarm, where he oversees the development and delivery of all InCommand services. For the previous 15 years (1999–2014), Eastman was the Sr. Manager, Global Data Centers for a multinational Fortune 300 enterprise, where he was responsible for the IT infrastructure and capacity management of 50 company-owned and co-located data centers in the US, EMEA, and APAC. In this role, Eastman was also responsible for data center global strategies, designs, capacities, projects, and construction, including co-location negotiations and contracts. One of Eastman’s notable achievements in this capacity was the development and global deployment of Data Center Infrastructure Management (DCIM) software and disciplines.