Exploring what’s next in tech – Insights, information, and ideas for today’s IT and business leaders

Micro data centers: Picking up the data load in a decentralized world

They may be small, but they're enabling massive compute power at the edge.

Cloud adoption, in all its iterations, is spurring enterprises to rethink their approaches, especially when it comes to data center deployment. Data centers at the edge, in retail locations and businesses with remote offices, are becoming the norm. These micro data centers, or "data centers in a box" as they've been termed, give businesses the ability to process data where it is generated.

This means the massive volume of data collected can be evaluated where it's happening and decisions can be made on the spot. "Customers looking to capture the value of their data are focusing heavily on deploying IT infrastructure on premises to analyze it. Additionally, with the continued adoption of AI and ML and limitations around connectivity, edge adoption continues to increase," says Aaron Carman, senior innovation program manager in the Office of the CTO at Hewlett Packard Enterprise. The processing of information has moved from where it was created to the data center and back again, but at a much higher volume, he says.

A 2020 Gartner report, "Your Data Center May Not Be Dead, but It's Morphing," predicts that by 2025, 85 percent of infrastructure strategies will integrate on-premises, colocation, cloud, and the edge—up from 20 percent in 2020.


Micro data centers emerged around eight years ago in the U.S. and Europe as leading vendors recognized that enterprise infrastructure struggled with deployments at edge locations, which could be in remote areas with harsh, hostile environments and little infrastructure support. With flexible form factors ranging from 1U wall mounts to 40-foot containers, complete with self-contained power, HVAC, and security, micro data centers address a range of customer demands.

The solutions can be turnkey, simplifying deployment of the entire infrastructure and providing standardization across multiple locations. This approach for delivering compute and storage at the edge has become critical for minimizing edge operational complexity. Turnkey solution prices range anywhere from $1,000 for the smallest form factors all the way up to several million dollars, depending on the requirements of the customer.

From computer room to micro data center

Around 2016, Carman says HPE started seeing the shift toward supporting enterprise compute installations at the edge in a more turnkey way to fix customer problems. He explains that customers had their enterprise data centers and their computer rooms, with the definition of the two dependent on the customer.

Typically, it was difficult enough for enterprise IT to get a handle on its core data centers, let alone manage a localized computer room serving a business unit owner. These computer rooms, often no more than a single server stuck in a closet or under a desk, were off IT's radar. Such computer rooms morphed into a new enterprise edge, delivering different types of services and workloads depending on the enterprise's market vertical, whether in campus computer rooms or on manufacturing plant floors.


When GDPR was enacted, enterprises took a hard look at computer rooms and realized they were noncompliant and constituted immense security threats, among other problems. Basically, they were struggling with how to handle a decentralized infrastructure and data management model. Moving the data center to the edge gave businesses the flexibility to meet new security demands while still delivering a good user experience.

Micro data center design considerations

It's essential to understand a site's environmental factors when looking for a solution for edge infrastructure deployments. The methodology for designing a solution changes because you need to factor in these site conditions and determine if you will be able to support the necessary workload and standard enterprise infrastructure. Leveraging micro data centers at an edge site is an alternative to building a dedicated data center facility, especially if space and power limitations exist.

Standardized infrastructure solutions like a micro data center address these environmental constraints. Leveraging them enables enterprise IT to manage familiar infrastructure and minimizes some of the operational impact of decentralized IT operations, says Carman. In addition, the solutions need to be centrally manageable, including remote control of the infrastructure, so that edge deployments are easier to handle.

Who benefits from an edge data center?

Almost every industry and company embracing data analytics as a market differentiator will be looking to adopt edge infrastructure deployments to overcome connectivity challenges and provide real-time insights and action. Carman says the first wave of workload decentralization was the adoption of public cloud and the second wave is placing infrastructure at a customer-defined edge.

He points out that businesses are still struggling to manage the hybrid nature of workloads being hosted by multiple different providers and physical locations. "So your current enterprise data centers, you really can't call them core anymore. You now have all of your workloads hosted by cloud providers, some within your data centers and now at all these vast edge sites … and you have to manage them at an ever-changing scale," he says. These complexities are vast, and it's only through standardization and management automation that enterprise IT can maintain control of their decentralized environments. It requires a completely new operational model to fold the business and service providers into a shared IT operating model.


Healthcare is a good example of where micro data centers make sense. Hospitals are among the most expensive real estate in the world. A hospital computer room that requires hundreds of square feet to provide IT services is taking up space that could be generating revenue. Installing a modular micro data center that claims just a small footprint on a loading dock or in an indoor ambulance parking area frees up space for beds and patient care.

On the other hand, if you have a very small computing footprint and use an extremely ruggedized solution, you may not need a micro data center at all. One example is military-spec servers built for extreme industrial or combat settings. These hardened systems tend to have limited CPU, memory, and storage capacity, but they can be deployed in most harsh situations with little environmental consideration. The main drawback is their limited processing capability for workloads.

Edge deployment considerations

Understanding edge infrastructure deployment based on workload requirements is critical in selecting the proper micro data center. There are several additional factors that need to be considered to support edge workloads over time and at scale. Those include infrastructure ruggedness, the connectivity profile, and infrastructure scalability.

Infrastructure ruggedness

Understanding the target site helps define the requirements of the infrastructure, which is key. Typically, many micro data centers provide protected power and cooling at a certain capacity, but overall site power capacity, quality, and availability need to be understood. It is key to identify proper placement of the micro data center, paying attention to existing power infrastructure distribution to help control costs. Other items to understand are industry regulatory requirements, site conditions like particulate, physical security, installation pathways, and physical cabling locations.
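As a rough illustration of that power analysis, a simple budget check like the one below can flag placement problems early. This is a minimal sketch, not a vendor sizing tool: the cooling overhead and headroom fractions are illustrative assumptions, and a real assessment would also cover power quality and availability.

```python
# Hedged sketch: rough site power budget check for a micro data center.
# All numbers are illustrative assumptions, not vendor specifications.

def site_power_headroom(site_capacity_kw, it_load_kw, cooling_overhead=0.4,
                        headroom=0.2):
    """Return remaining kW after IT load, cooling, and a safety margin.

    cooling_overhead: extra draw for HVAC as a fraction of IT load
                      (0.4 corresponds to an assumed PUE of ~1.4).
    headroom: fraction of site capacity held back for faults and growth.
    """
    total_draw = it_load_kw * (1 + cooling_overhead)
    usable = site_capacity_kw * (1 - headroom)
    return usable - total_draw

# Example: a 50 kW site feed hosting 25 kW of IT load.
remaining = site_power_headroom(site_capacity_kw=50, it_load_kw=25)
print(f"Remaining capacity: {remaining:.1f} kW")  # 40 usable - 35 drawn = 5.0 kW
```

A negative result is an early signal that the site needs electrical work, or a smaller form factor, before the enclosure is ordered.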

Connectivity profile

Connectivity dependencies are often overlooked as enterprise architects work through workload requirements. Each site has unique connectivity characteristics, some of which are not fully understood, leading to workload performance and management issues.

Infrastructure scalability

Failing to understand a site's multiyear workload-hosting requirements is a common problem. Business units' strategic plans must account for enough site infrastructure to support workloads over time. With any infrastructure deployment, address capacity planning based on business unit use case requirements at scale, so that a foundational deployment like a micro data center can support several years of IT infrastructure growth.
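A quick projection makes the multiyear question concrete. The sketch below, under assumed figures for current load, enclosure capacity, and growth rate, counts how many years of compounded growth an enclosure can absorb; real planning would use the business unit's own forecasts.

```python
# Hedged sketch: project IT load growth against a micro data center's
# capacity. The growth rate and kW figures are illustrative assumptions.

def years_until_exhausted(current_kw, capacity_kw, annual_growth=0.15):
    """Count whole years before compounded load exceeds capacity."""
    years, load = 0, current_kw
    while load * (1 + annual_growth) <= capacity_kw:
        load *= 1 + annual_growth
        years += 1
    return years

# Example: 10 kW of IT load today in a 20 kW enclosure, growing 15% a year.
print(years_until_exhausted(10, 20))  # 4 years of headroom
```

If the answer comes back shorter than the planned refresh cycle, the site needs a larger enclosure or a plan for adding a second unit.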

Consider manageability

Once the environment is up, how will you remotely monitor and control it? The infrastructure needs to be intelligent enough to be monitored and controlled from afar. You can use a software platform such as a data center infrastructure management (DCIM) solution, or go with a complete managed offering designed to work specifically with the infrastructure purchased.

Most edge sites will not have personnel who can act as "remote hands" to intercede on behalf of IT. Remote control of power and cooling is key so that there is not a complete dependency on vendor or site staff to remediate the most minor infrastructure issues.
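The shape of that remote monitoring is simple to sketch. In the example below, `read_sensors` is a hypothetical stand-in for querying an enclosure's management controller (for instance over a standard interface such as Redfish), and the thresholds are illustrative assumptions, not recommended limits.

```python
# Hedged sketch of a remote-monitoring check for an unattended edge site.
# read_sensors() is a hypothetical placeholder for a real DCIM or
# management-controller API; thresholds are illustrative only.

def read_sensors():
    # Placeholder telemetry; a real deployment would query the
    # enclosure's management controller over the network.
    return {"inlet_temp_c": 24.0, "ups_load_pct": 62.0, "door_open": False}

THRESHOLDS = {"inlet_temp_c": 35.0, "ups_load_pct": 80.0}

def check(telemetry):
    """Return a list of alert strings for out-of-range readings."""
    alerts = [f"{key} high: {telemetry[key]}"
              for key, limit in THRESHOLDS.items() if telemetry[key] > limit]
    if telemetry.get("door_open"):
        alerts.append("enclosure door open")
    return alerts

alerts = check(read_sensors())
print(alerts or "all readings nominal")
```

In practice a loop like this runs centrally against every edge site, feeding alerts into the same console IT already uses, which is what keeps dozens of unstaffed locations manageable.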

On-demand consumption options are available for container as a service, platform as a service, and VM as a service, and can include everything from the hardware to support services and management. You can also find options where you think of it from a workload perspective: AI and ML as a service, for example, where infrastructure is extended and managed remotely down to the actual workload.

So you want it online yesterday?

Deployment isn't quite as fast as ordering lunch, but close. Some of the smallest data centers that hang on a wall can be up and running in an afternoon. Other, higher density ones take more time because deployment depends on how much site prep is required. HPE's Carman says, "It could take three days to install and probably about two months of site prep prior to that installation to plan and prepare the site. The range is probably going to be three to four months for deployment." One thing is certain: Deploying a micro data center is faster than creating your own traditional compute facility.

The future is now

Gartner's Hype Cycle for midsize enterprises predicts that the micro data center trend will start to plateau this year. Carman agrees, pointing out that from a hype cycle, it's all about catching up to demand. He believes the micro data center segment, as it's defined today, has enough solutions to handle what's going on with AI and ML and that it has a long product lifecycle.

"Businesses are finding out that data is gravity, and so much data is being created that trying to justify pushing a lot of that through the network is not giving them a return on investment at that point," he says. "It's becoming too cumbersome, and therefore more of these remote edge deployments will become more autonomous and run by themselves. They'll be very dependent upon what types of connectivity they can get and what kind of environment they're being installed into."

Edge data centers are an enabling technology for getting AI apps to retailers, manufacturers, and other businesses that need compute and storage on site. The capabilities a micro data center adds to business computing will provide a clear path for growth over the next seven to 10 years as everything moves out to the edge.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.