The future of analytics is real time

The amount of data you are gathering can be staggering, so how do you derive the most value from the edge?

Michael Lee Sherwood wants tourists to know that every time they step out onto the Las Vegas Strip or cruise to a stop at an intersection near a major hotel, he's thinking of them. Or at least his technology is.

As chief innovation officer for the entertainment capital of the world, Sherwood has spent the past few years piloting an edge computing project to test ways of getting people around town as quickly and safely as possible. Key to the project, called Blackjack, is analytics technology that pulls and inspects data from more than 100 sensors in traffic cameras, stoplights, and self-driving cars.

Still in its infancy, the project could someday make it possible to replace timed signals, which drive impatient and hurried people nuts, with smart systems that constantly monitor traffic patterns and adjust in a heartbeat to minimize congestion and inconvenience.

"We're looking at how we can use data to make all of our intersections more intelligent, and to do that, you need edge computing," says Sherwood. "You need to be able to take all sorts of data, run calculations against it, and in a matter of milliseconds produce a result. That has to happen at the edge."

Sherwood is among a growing number of tech leaders who recognize that as Internet of Things (IoT) devices find their way into billions of objects around the world, there is a huge opportunity to collect, aggregate, evaluate, and make use of the information gleaned from those objects.

Dave McCarthy, research vice president and industry analyst at IDC, says that while that may seem like a "no duh" practice, for the first decade of the IoT's advancement, organizations were more focused on deploying devices than on getting value from all the data those gadgets were generating. That was partly because, until the rise of edge computing, they weren't sure how to assess all those bits and bytes quickly enough to do something meaningful with them, he says. The latency, connectivity, and cost challenges of streaming information to and from central data repositories were too significant.


"The star of the IoT show was the data being generated at all these locations, but organizations didn't really know what to do with it or how to utilize it," McCarthy says. "So it became like a rainy-day fund where CIOs said, 'We'll just take this information, store it, and maybe go back and look at it someday.'"

The time for looking has apparently arrived. IDC predicts that more than half of new enterprise infrastructure will be at the edge by 2023 and global spending on edge technology will reach $250 billion by 2024.

Despite this trend, many IT and business leaders are still in the early stages of defining their edge analytics strategies, according to research by Hewlett Packard Enterprise. They know their businesses need edge analytics to achieve better operational efficiency and competitive advantage. But key technologies that would make processing and analyzing data at the edge more effective—like artificial intelligence (AI) and machine learning (ML)—are themselves still maturing. And many IT organizations lack the technical depth or expertise to get their edge programs to the next level.

Turning vision into reality

Glyn Bowden, chief architect of the AI and data science practice at HPE Pointnext Services, says the edge computing vision becomes reality when billions of IoT devices in the wild no longer have to connect to central data repositories. Instead, they operate independently, with local ML models dictating how devices or objects operate based on their observable behavior.

For example, in a smart factory, semiconductors traveling down a conveyor belt might be expected to have a certain color or shape. If connected cameras capture images that show the silicon chips are positioned incorrectly or have the wrong shape, size, or color, it could suggest an equipment malfunction. AI algorithms could then recommend or automatically implement machinery repairs without humans having to get involved, or the model could be adjusted if it turns out the anomaly wasn't a big deal. Similarly, models could be refined for individual machines based on a variety of situational factors, such as plant lighting, temperatures, floor height, equipment models, and so forth.
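To make that loop concrete, here is a minimal Python sketch of the kind of check an edge node might run on each camera frame. The color-deviation test is a deliberately simple stand-in for a trained vision model, and the expected values, tolerance, and "flag_for_repair" handoff are all hypothetical:

```python
import numpy as np

# Expected appearance of a correctly seated chip, learned offline.
# These values are illustrative placeholders, not real calibration data.
EXPECTED_MEAN_COLOR = np.array([112.0, 98.0, 87.0])  # mean BGR of a good chip
COLOR_TOLERANCE = 18.0                                # max allowed deviation

def chip_looks_normal(frame: np.ndarray) -> bool:
    """Compare a frame's mean color against the expected profile.

    A stand-in for a trained vision model: a real line would run an ML
    classifier or anomaly detector here instead of a simple color check.
    """
    mean_color = frame.reshape(-1, 3).mean(axis=0)
    deviation = np.linalg.norm(mean_color - EXPECTED_MEAN_COLOR)
    return deviation <= COLOR_TOLERANCE

def inspect(frame: np.ndarray) -> str:
    # Decided locally on the edge node, with no round trip to a central
    # data center, so a fault can be flagged within milliseconds.
    return "ok" if chip_looks_normal(frame) else "flag_for_repair"
```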


Bowden calls all of this "inference at the edge," which refers to a process of applying mathematical logic and rules to local knowledge bases to reach actionable conclusions—for example, fix that machine or lower the heat in a particular room. While AI and ML combined with analytics solutions help enable this, Bowden notes setup and ongoing management can be complex and difficult. For that reason, he recommends offloading such work to a professional services organization.
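As a rough illustration of that idea, the Python sketch below applies a couple of hand-written rules to a snapshot of local sensor readings and emits actions such as "lower_heat." The rules, field names, and thresholds are invented for illustration; a real deployment would combine rules like these with trained models:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """One local inference rule: a condition over readings plus an action."""
    condition: Callable[[dict], bool]
    action: str

# Hypothetical rules for a single room or machine; thresholds are invented.
RULES = [
    Rule(lambda r: r["temp_c"] > 27.0, "lower_heat"),
    Rule(lambda r: r["vibration_rms"] > 0.8, "schedule_machine_repair"),
]

def infer(readings: dict) -> list[str]:
    """Apply every rule to the latest local readings and return actions."""
    return [rule.action for rule in RULES if rule.condition(readings)]

# A sensor snapshot held at the edge node, never shipped to the core.
print(infer({"temp_c": 28.4, "vibration_rms": 0.2}))  # ['lower_heat']
```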

"If you're technically able to make 10,000 decisions a second but your business can't react in real time because you're not properly set up for it, there's not much point in doing it," Bowden says. "By bringing in outside help, you're able to turn over that work to experts who will set you up correctly, manage it all cost effectively, and ensure everything is done as securely as possible."

Bowden notes that getting the edge analytics setup right can be a matter of safety. If a company is using facial recognition to determine whether people trying to enter a building are really employees, for example, the system needs to make a split-second inference about whether to grant a person access. Any delay could lead to a guard losing patience and letting the wrong person in, or to the system bogging down and hampering worker productivity. He also points out that this is the type of situation where you need to consider whether AI is the appropriate solution or whether you would be better served by a simpler automated process, or even just better training for your security guards.

And, of course, when self-driving cars become more commonplace, they too will rely on edge analytics to process data from LiDAR systems and determine how close the vehicle is to nearby objects. Even the tiniest error or delay could spell catastrophe, making local, on-vehicle processing critical.
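A stripped-down version of that proximity check might look like the following Python sketch, which treats the LiDAR point cloud as an array of x, y, z returns in the vehicle's frame. The safe-distance threshold and the brake/proceed decision are illustrative assumptions, not a real control system:

```python
import numpy as np

SAFE_DISTANCE_M = 2.5  # illustrative threshold, not a real safety standard

def nearest_obstacle_m(point_cloud: np.ndarray) -> float:
    """Distance from the vehicle (at the origin) to the closest LiDAR return.

    point_cloud: an (N, 3) array of x, y, z points in meters, vehicle frame.
    """
    return float(np.linalg.norm(point_cloud, axis=1).min())

def decide(point_cloud: np.ndarray) -> str:
    # Evaluated on the vehicle itself: a round trip to a data center would
    # add latency the braking decision cannot afford.
    if nearest_obstacle_m(point_cloud) < SAFE_DISTANCE_M:
        return "brake"
    return "proceed"
```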

"A lot of inference is going to happen at the edge rather than in core data centers for these types of reasons," Bowden says. "This is absolutely where we're headed."

It's a learning process

In Las Vegas, Sherwood is already getting there with the city's Blackjack project. For instance, the city has been testing autonomous vehicles along the Strip for years. But with edge analytics, it's now able to automatically send commands, based on analysis of local conditions, to improve safe operations. "If it's raining," Sherwood says, "we can communicate new speeds directly to the vehicles."
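A message like the one Sherwood describes could be as simple as the following Python sketch, which maps a weather condition to an advisory speed and packages it for nearby vehicles. The speed values are placeholders, and the publish step is left as a comment because the actual message bus depends on the deployment:

```python
import json

# Illustrative advisory speeds in mph; real values would come from traffic
# engineers, and "rain" detection from the intersection's own sensors.
ADVISORY_MPH = {"clear": 35, "rain": 25, "fog": 20}

def speed_advisory(condition: str) -> bytes:
    """Build a speed-advisory message for vehicles near an intersection."""
    mph = ADVISORY_MPH.get(condition, ADVISORY_MPH["clear"])
    return json.dumps({"type": "speed_advisory", "mph": mph}).encode()

# Hypothetical publish step: speed_advisory("rain") would be pushed to
# subscribed vehicles over whatever message bus the roadside unit runs
# (MQTT, DDS, cellular V2X, etc.).
```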

He says the Blackjack project has uncovered unanticipated challenges, such as the difficulty of training models to ask the right questions and apply the appropriate answers. To help address those challenges, the city has consulted vendors and service companies with expertise in the area and is establishing its own data analytics operation as well.

"This is a whole new area for us, and we're still learning," he says. "But it's worth it, and most organizations, especially municipalities, should have some sort of edge analytics program. It's definitely a driver for the future. For organizations that already have a wealth of data, it's just a matter of getting their programs together."

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.