What's the edge really about? We asked our experts
Merriam-Webster defines the edge as "the line where an object or area begins." For 21st-century enterprises, the edge is where innovation and inspiration begin.
An increasing amount of computing power is moving away from data centers in the cloud and closer to the physical world. IDC predicts that by 2025 there will be an estimated 56 billion connected IoT devices on the planet and that 75 percent of all enterprise data will be generated at the edge.
But what is edge computing exactly? At Hewlett Packard Enterprise, we broadly define the edge as a place where people, places, things, and their data intersect. A more precise answer depends on how you apply the edge to your specific business needs.
Edge devices run the gamut from simple embedded sensors that keep your offices cool and energy efficient to complex industrial robots that are building the next Mars rover. They include the automated checkout machine you use at the grocery store, the intelligent cameras that keep a watchful eye on city streets, and the AI-powered MRI machine that helps your doctor interpret your latest scans.
To further refine what edge computing is and how organizations can use it to their advantage, we've assembled a dream team of subject matter experts:
- Lin Nease is chief technologist for IoT in HPE's advisory and professional services group, as well as an HPE Fellow.
- Partha Narasimhan is CTO at Aruba, an HPE company and a leading provider of next-gen network access technology.
- Dr. Eng Lim Goh is CTO for high-performance computing and artificial intelligence at HPE.
We asked these technology leaders to address some basic questions about the edge and what it means for enterprises.
When people talk about the edge, what does that mean to you?
Nease: The edge is the tentacles of the octopus. It's everything that resides outside those massive multipurpose air-conditioned data centers, which we now think of as the data center core. It's incredibly diverse and includes everything from small data rooms to smartphones to sensors, along with the networking infrastructure that connects them. Anything that requires immersion in the physical world, or a digital rendition of it, has to happen at the edge.
Narasimhan: At HPE, we begin by defining the edge as any infrastructure that's not inside the data center. But it's also about the places we find ourselves in and the experiences we want to have. It's about digitizing the physical world and transforming that data into insights that provide more visibility into the spaces we're operating in. In an industrial environment, for example, data collected at the edge can be used to improve operations, enhance safety, reduce costs, or increase profitability.
Goh: The edge is the point where data is first collected—say, an IP camera connected to a server in a retail store, a smart vibration sensor on a machine inside a factory, or an electron microscope. A second characteristic of edge devices is that, on their network side, they have much more data flowing out than coming in. A third defining feature is that these devices are constrained in some way: by the amount of power you can provide to them, the kind of environment they must operate in, or the amount of bandwidth available.
Why is the edge suddenly such a hot topic? What's driving this?
Nease: We've reached the point with Moore's Law where we're able to bring compute and data processing into the physical world in the form of IoT devices. And that enables a wide range of new use cases. For example, we can build facial recognition logic into cameras, so they can identify people immediately. When we're driving and roaming between cell sites, our car's navigation system still has to work; that requires processing data at the edge. If you're operating a natural gas pipeline across a remote stretch of land, you might have at best a low-bandwidth satellite connection. The logic that's controlling the pumps has to happen at the edge.
Narasimhan: The number of personal and IoT devices is growing, which means data is continually increasing. Processing all of that data in a centralized location or the cloud is expensive. So organizations end up just dropping that data on the floor and not doing anything with it. But if you can process that data at the source, the costs become much lower and more use cases become viable. It's all about bringing a cloud-like experience to the edge and enabling new low-latency services like augmented and virtual reality.
Goh: Sensors are continually becoming more sensitive, and there are more and more of them. That means they're generating more data than ever. But the cost of bandwidth isn't decreasing at anywhere near the same rate, so it becomes very expensive to backhaul all that data. Collecting and processing it locally can be more cost efficient. At the same time, the number of applications requiring low latency is increasing. When you don't have time to send data up to the cloud and wait for an answer to come back, that's when you need processing power at the edge.
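The bandwidth-versus-local-processing tradeoff Goh describes can be sketched with back-of-envelope arithmetic. Every figure below (camera bitrate, fleet size, per-gigabyte backhaul cost, event-data ratio) is an illustrative assumption, not a number from the interview:

```python
# Back-of-envelope comparison: backhauling raw sensor data vs. processing it locally.
# All figures are illustrative assumptions.

CAMERA_MBPS = 4              # assumed bitrate of one IP camera stream (megabits/s)
NUM_CAMERAS = 200            # assumed fleet size at one site
SECONDS_PER_MONTH = 30 * 24 * 3600
COST_PER_GB_BACKHAUL = 0.05  # assumed $/GB to move data to the cloud

# Raw data generated per month, converted megabits -> megabytes -> gigabytes
raw_gb = CAMERA_MBPS * NUM_CAMERAS * SECONDS_PER_MONTH / 8 / 1000

backhaul_cost = raw_gb * COST_PER_GB_BACKHAUL

# If local inference reduces each stream to event metadata (say, 0.1% of raw
# volume), only that residue needs to leave the site.
events_gb = raw_gb * 0.001
edge_cost = events_gb * COST_PER_GB_BACKHAUL

print(f"raw data/month:       {raw_gb:,.0f} GB")
print(f"backhaul everything:  ${backhaul_cost:,.0f}/month")
print(f"backhaul events only: ${edge_cost:,.2f}/month")
```

Even with these made-up numbers, the shape of the argument holds: raw volume scales with sensor count and sensitivity, while the decision-relevant residue is a small fraction of it, so processing at the source changes the economics.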
How important are technologies like cloud-native computing and AI to making edge computing work?
Nease: Extremely important. To take one example, HPE's own servers are built using AI at the edge. In our factory in the Czech Republic, the system performs 80 visual checks of a motherboard in less than 90 seconds. In the past, humans did those inspections, and they'd occasionally miss defects. With AI, the number of quality control issues has dropped dramatically. If this pattern recognition were performed in the cloud, it would probably take 10 minutes or more.
Narasimhan: Cloud-native technologies like microservices, containers, and orchestration are important for agility and scale. Nothing stays the same for very long, and you need the ability to keep pace as the technology changes. You may also want to run some applications on the edge and others in the cloud, then shift between them depending on the needs of the customer or the environment. Using cloud-native services at the edge gives us the flexibility to do that.
Goh: Nearly every edge computing scenario uses AI or rule-based machine learning models to some degree. Sensors upload data to the cloud, which is used to train a machine learning model. That model is pushed back down to edge devices using containers. These devices then use that model to make decisions independently. Today, edge devices are making inferences based on cloud-generated models; in the future, they may become smart enough to learn locally on their own.
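The train-in-the-cloud, infer-at-the-edge loop Goh outlines can be sketched in a few lines. The "model" here is just a statistical threshold standing in for a real machine learning model, and all function names and readings are hypothetical; in practice the trained model would be packaged in a container and pushed down to the device:

```python
# Minimal sketch of the pattern described above: the cloud aggregates sensor
# data and produces a model; edge devices apply that model locally, with no
# round trip per decision. Names and numbers are illustrative.

from statistics import mean, stdev

def cloud_train(readings):
    """Cloud side: derive an anomaly threshold from historical sensor data."""
    mu, sigma = mean(readings), stdev(readings)
    return {"threshold": mu + 3 * sigma}   # flag readings > 3 sigma above mean

def edge_infer(model, reading):
    """Edge side: decide locally using the pushed-down model."""
    return reading > model["threshold"]

# Historical vibration readings uploaded to the cloud for training
history = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05, 1.0, 0.98]
model = cloud_train(history)             # in practice, shipped in a container

# The device now classifies new readings on its own
print(edge_infer(model, 1.02))   # normal reading -> False
print(edge_infer(model, 5.0))    # anomalous reading -> True
```

The split mirrors the constraint Goh identifies: training is data- and compute-hungry, so it stays in the cloud; inference is a cheap local operation, so it can run on a constrained device.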
What are the barriers to wide-scale deployment on the edge?
Nease: The biggest barrier is complexity. The number of systems we need to deploy, manage, and monitor multiplies dramatically at the edge. As a result, the attack surface becomes huge. How many times have we seen "the bad guys" round up a bunch of video cameras or HVAC systems and generate denial-of-service attacks on mundane infrastructure that still uses default admin passwords? Security at the edge is a big problem.
Narasimhan: It's really mindset. You can't think about the edge the same way. We're not shipping storage and infrastructure to customers and having them deploy and manage it. It needs to be centrally managed and orchestrated as a service. That's going to face a lot of resistance. The other big issues are security and privacy. How do we provide the right tools to manage data on behalf of the customer? The only way to gain their confidence is to be open about who has access to PII [personally identifiable information] and other data and how that process is managed.
Goh: It comes back to the energy and environmental constraints I mentioned earlier. If you're building an edge device to go into a car, for example, it has to be able to run off a 12-volt battery. You can't use a fan for cooling because it will quickly become clogged with dust. If it's operating next to people, it can't be too loud. And so on. HPE built a high-performance computer for the International Space Station—the literal edge. One of the constraints was that it had to draw less than 500 watts. [Editor's note: Supercomputers like the HPE-Cray Aurora require 60,000 times more than that.] Another was that it had to survive the vibration of liftoff and radiation levels 10 to 100 times higher than on Earth's surface. Those all had to be factored into the design.
What else should enterprises be thinking about as they begin to deploy edge technologies?
Nease: The biggest constraints of the edge aren't on the technology side. The constraints are in process and adoption patterns. Companies that embrace the edge faster than others will enjoy big advantages, and new businesses will emerge as a result.
Narasimhan: There's been such a mad rush to digitize and automate the physical world that we're underestimating the number of vulnerabilities it creates. I remember being at an Aruba user conference in 2017 when AWS went down and we couldn't run any of our demos. It was both amusing and alarming to see the number of people complaining that they couldn't open their garage doors or operate other devices in their homes. We just assume these things will be available when we need them. The level of attention cybersecurity requires is not keeping pace with the rate of edge adoption.
Goh: Customers need to be clear on why they're considering edge technologies. Are they thinking about it for technical reasons or strategic ones? If they have an unsolved technical problem that's holding the business back and they think the intelligent edge will help, we approach it one way. But if they're saying, "We have this vision to pivot our company and we believe the edge will be a great way to start pivoting early," then that's a very different conversation.
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.