Computing at the edge

7 Reasons Why We Need to Compute at the Edge

From household appliances and fitness trackers to drones and medical devices, the “things” that fuel the Internet of Things can now collect data and send alerts, for example warning that a machine needs maintenance before it fails.

These things aren’t as inert as they might appear, though. Rather than treating them merely as conduits to data systems or the cloud, some argue that these so-called edge devices need an intelligence boost. This is the thinking behind Hewlett Packard Enterprise’s new Edgeline Systems, which bring computing power to such edge devices.

As Dr. Tom Bradicich, an HPE VP and GM of Servers and IoT Systems, recently explained, there are seven good reasons why beefing up edge systems will be critical to the future of the industry:

1. Latency. What good is an Internet-connected car if there’s a lag between when a child appears in front of the car and when the system actually tells the car to stop? Ideally, there should be no latency at all, but there usually is. Even worse, there’s a chance that the connection can be lost entirely. (Ever experienced a dropped call on your cell phone?) For some mission-critical functions, latency is intolerable and you must compute on the edge. This is true even as speeds increase. When 5G rolls out commercially in 2018, for instance, it will be an improvement over current latency but still not as fail-safe as edge computing is today.

2. Bandwidth. Sending data from edge devices to the cloud or a data center (Bradicich points out that the difference is academic because “A cloud is just a data center that no one is supposed to know where it is”) can use a tremendous amount of bandwidth. Fearing that such devices will be a drag on the system, some have proposed creating a separate network for the IoT. You can greatly curtail that drag by eliminating the need to send data back and forth. Many companies simply can’t handle the bandwidth needs of IoT right now.

3. Compliance. There are laws or policies in certain countries governing the regional transfer of data. Companies that embrace IoT often run up against such compliance issues.

4. Security. If you are going to send data all over the place, it will be vulnerable to attacks and breaches. Already, hackers have found ways to breach everything from cars to baby monitors that are connected to the Internet.

5. Cost. Extra bandwidth and extra security will inevitably cost extra as well. Since companies are often motivated to save money by realizing efficiencies via IoT, keeping costs down is of prime concern.

6. Duplication. If you collect data and send it to the cloud, there will inevitably be some duplication. It might not reach 100 percent, but if you collect 10 TB of data at the edge and then send it all to the cloud, you have duplicated 10 TB.

7. Data corruption. Even without any nefarious activity from hackers, data will be corrupted on its own. Retries, drops and missed connections will plague edge-to-data-center communications. Obviously, that’s a bigger deal for mission-critical applications.
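The bandwidth and duplication arguments above can be sketched in code. The snippet below is a minimal, hypothetical illustration (not an HPE Edgeline API) of “report by exception”: an edge node forwards a sensor reading upstream only when it differs meaningfully from the last value it sent, so a mostly steady sensor generates far less traffic and far fewer near-duplicate records in the cloud.

```python
def filter_at_edge(readings, threshold=0.5):
    """Return only the readings worth forwarding upstream.

    A reading is forwarded when it differs from the last forwarded
    value by more than `threshold`; everything else stays at the edge.
    """
    forwarded = []
    last_sent = None
    for value in readings:
        if last_sent is None or abs(value - last_sent) > threshold:
            forwarded.append(value)
            last_sent = value
    return forwarded

# A steady temperature sensor produces mostly redundant samples;
# only the first reading and meaningful changes leave the edge.
samples = [20.0, 20.1, 20.0, 20.2, 23.5, 23.6, 23.4, 20.0]
sent = filter_at_edge(samples, threshold=0.5)
print(sent)  # [20.0, 23.5, 20.0] -- 3 of 8 samples forwarded
```

The threshold is a trade-off knob: a looser threshold saves more bandwidth and cloud storage, at the cost of coarser data upstream, which is exactly the kind of decision that requires compute at the edge.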

For those reasons, HPE is offering products for what Bradicich dubs “really deep edge computing.” He added that tech upgrades like 5G may help some aspects of traditional IoT computing, but won’t remedy the seven he listed. “And there might be an eighth, ninth and tenth that I haven’t thought of,” he said.

Perhaps that’s why IDC is predicting that by 2018, some 40 percent of IoT computing will be “stored, processed, analyzed and acted upon close to, or at the edge of the network.”

Still, HPE is currently nearly alone in exploiting the potential of deep edge computing. If the opportunity is so large, why are so many others holding off? “Somebody’s got to be first,” said Bradicich. “It’s the right season for a leader like us to be first.”

 

CONNECT WITH DR. TOM:

twitter.com/tombradicichphd
linkedin.com/in/tombradicichphd

 

Click here to read about the latest news in IoT from HPE’s Discover 2016 conference in Las Vegas.