How to help your company get started in deep learning

Deep learning is revolutionizing many aspects of our lives, and it's impacting the enterprise world as well, from improving the customer experience to managing operations and infrastructure. But while the concept is clear, the details can be foggy. Here’s what you need to know to get started with deep learning.

Artificial intelligence is taking the consumer technology world by storm, with AI-driven applications such as digital assistants and voice interfaces shaping the digital lives of users. The enterprise technology world, however, is still trying to shape its AI-driven future. One AI technique that has found early adoption in the enterprise is deep learning, a sophisticated shift for machines from task-specific programming to learning representations directly from data. “Hadoop may have sparked a revolution in analytics, but the most recent revolution is deep learning,” according to Forrester in a recent Forbes article.

As with big data, the potential business payoffs of deep learning are huge, even if AI never reaches a state of true cognitive computing. Mastering deep learning requires a fast, agile, and highly efficient infrastructure; access to massive data stores; and a lot of patience, since machines need time and experience to learn, just as humans do. But before you can reach the point of ROI, you have to get started.

Where to start your deep learning journey

For an enterprise, the big data and deep learning journeys are tightly linked.

“Not surprisingly, early adoption of deep learning has happened in companies with access to massive amounts of data which can be analyzed to solve their problems," says Pankaj Goyal, vice president of Hewlett Packard Enterprise's artificial intelligence business. "This data might be in the form of text, image, voice, or video. These problems could be automation, new customer experience, or new product innovation.”

In short, if you’ve got access to huge datasets, deep learning can deliver increased efficiencies and additional insights that humans may never have thought to seek.

Perhaps not surprisingly, the most natural and logical place to start is by identifying which data you most need in order to reap more insights.

You can also take note of where deep learning is currently being used successfully inside or outside of your company and do something similar—for example, natural language recognition and digital assistants.

“It’s best to start with a small, relatively bounded project rather than ‘boil the ocean.’ You’ll get results quicker, or fail faster,” advises Bill Mannel, vice president and general manager at HPE. “Know your goals for the project and what you think should be your measure of success. Start with a low-investment threshold, to keep expectations mild.”

Instead of trying to mine secrets from all your data at once, start with a specific project you understand well and can measure easily, and from which you don’t expect fast results. That’s because deep learning technology is dumb as a rock in the beginning; it takes time for it to learn enough to accomplish much of any use to you.

A successful project might be related to customer experience management, personalized marketing, regional or personalized pricing, or even industrial robot performance. Practically any project is a good starting point if you have sufficient data and the time to allow deep learning to take root and grow to its full potential.

It’s important to explain that an initial lag in performance is to be expected, given the time a machine needs for its learning to progress. Otherwise, business executives, who have come to expect the immediate results they see with data streaming and real-time analytics, may kill the project before it gets out of the starting blocks. Tamping down inflated expectations is key; otherwise, even early successes can feel like a letdown or a shortfall. “There’s a lot of hype in the market today, very similar to the big data market from years past,” warns Mannel.

The most important thing to remember is that deep learning is an immature, fast-changing industry. Even so, the technology is evolving so quickly that companies cannot afford to wait to deploy it.

Already, there are deep learning libraries and algorithms that are growing and evolving at an incredible pace, points out Mike Gualtieri, an analyst at Forrester. “The most popular are multilayered convolutional networks with back propagation, ideal for image and voice recognition, and recurrent neural networks, ideal for natural language processing,” Gualtieri says in his blog.
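
To make those two families concrete, here is a minimal sketch using the Keras API that ships with TensorFlow (one of the open source libraries discussed later in this article). The layer sizes, input shapes, and tasks are illustrative assumptions, not recommendations from Gualtieri or HPE.

```python
# Minimal sketch of the two network families Gualtieri mentions, using the
# Keras API bundled with TensorFlow. Sizes are illustrative placeholders.
import tensorflow as tf

# A small convolutional network for image classification
# (e.g., 28x28 grayscale images, 10 classes).
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# A small recurrent network for text classification
# (sequences of word indices from a 10,000-word vocabulary).
rnn = tf.keras.Sequential([
    tf.keras.layers.Embedding(10000, 64),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Compiling wires in backpropagation-based training via the optimizer.
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
rnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Either model trains via backpropagation once it is fit to labeled data; the point is how little code the frameworks now require, not that these particular architectures suit your problem.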

Waiting, therefore, isn’t an option.

If you still find there are too many directions to take to start a deep learning project, narrow down the list with an eye toward reduced complexity. In other words, choose projects wherein the deep learning aspect is the challenging part while every other component is familiar.

“Companies should begin by developing deep learning offerings for the vertical market use cases the companies already know best, because these offerings will often require domain- and problem-specific knowledge,” says Steve Conway, senior vice president of high-performance computing research at Hyperion Research (formerly the IDC HPC team). “In the medical healthcare market, for example, the talk might be more about cancer detections per second rather than input/output operations per second.”

Because deep learning training often requires billions of mathematical vector operations, technical staff and vendors alike tend to talk in terms of the affordable supercomputers needed to do that work, and the IOPS and other specs related to them.

Companies are well-advised to choose deep learning projects that solve specific problems in which processes are inundated with repetitive manual tasks and the datasets are too large for humans to sort and analyze effectively.

According to a recent Grand View Research report, use cases include:

  • Aerospace and defense: remote sensing, object detection and localization, spectrogram analysis, network anomaly identification, and malware detection. Deep learning is also being applied to manual tasks and pilot operations in the cockpit, as well as wearable computing for soldiers.
  • Autonomous and connected car systems. For example, the report says, “Audi uses deep learning algorithms in its camera-based technology to recognize traffic signals by their characters and shapes.”

Where to start if internal datasets aren’t large enough to point the way

“The requirement of a large amount of data to train neural networks (deep learning) is expected to pose a challenge to the industry growth,” according to the Grand View Research report. That challenge may be swiftly overcome given the explosive rate at which data is growing everywhere. The report also finds that “increased [U.S.] government support is expected to provide a positive impact on the industry growth. The establishments of subcommittees on artificial intelligence and machine learning within the federal government are providing traction for the industry growth.”

Major players in the deep learning space are open sourcing frameworks and libraries to further spur development, such as Google’s TensorFlow, Facebook’s open source AI hardware, and the nonprofit group OpenAI. Forrester’s Gualtieri rated open source deep-learning libraries such as Caffe, Deeplearning4j (DL4J), MXNet, TensorFlow, and Theano as the most popular.

Meanwhile, companies that lack access to huge data stores can expect to see deep learning as a service appear on the horizon. In other words, if your company’s datasets are too small to point the way to deep learning projects, don’t conclude that you can’t or shouldn’t take any on. While it is true that deep learning depends heavily on a sufficient amount of data to arrive at valid conclusions, you only need access to big data; you don’t have to own it. You can tap data from outside your company or share data with other parties to create large datasets.

“Unlike the social media/Internet giants, companies in many major markets don't yet have the massive data volumes needed to use deep learning effectively in production mode. But they're working to amass enough data, sometimes by pooling depersonalized data with major competitors,” says Hyperion Research's Conway.

Why would a company pool data with its competitors? “Companies willing to pool data with competitors or other parties know that to stay in the game, they will need to deploy deep learning in a few years,” says Conway, and they need the data to begin on that path now.

How big must data stores be? According to the open source deep learning project DL4J, “The minimums vary with the complexity of the problem, but 100,000 instances in total, across all categories, is a good place to start.” Look to increase data gathering efforts, up to and including pooling data with other entities, be they partners or competitors. Often, data pooling itself helps you identify your first deep learning projects: either the reasons your company joins a data pool will reveal potential projects, or the problems common to all the pooling parties will.
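
As a toy illustration of that rule of thumb, the snippet below pools labeled-instance counts from two hypothetical partners and checks the total against the 100,000-instance starting point DL4J cites; the counts and category names are invented for illustration.

```python
# Toy check of a pooled dataset against the 100,000-instance rule of thumb
# quoted from DL4J. The per-partner counts are hypothetical.
from collections import Counter

MINIMUM_INSTANCES = 100_000  # DL4J's suggested starting point, all categories combined

# Hypothetical labeled-instance counts per category from two pooling partners.
partner_a = Counter({"fraudulent": 4_000, "legitimate": 38_000})
partner_b = Counter({"fraudulent": 9_000, "legitimate": 61_000})

pooled = partner_a + partner_b
total = sum(pooled.values())

status = "meets" if total >= MINIMUM_INSTANCES else "falls below"
print(f"Pooled instances by category: {dict(pooled)}")
print(f"Total of {total:,} {status} the {MINIMUM_INSTANCES:,} starting point")
```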

Technical considerations for a robust infrastructure

Big data and deep learning require specialized technologies and an expanded infrastructure.

“Companies should become familiar with high-performance computing if they aren't already. HPC has moved to the forefront of deep learning R&D, mainly because HPC vendors know how to build computers with the large memories and ultrafast data rates needed to make deep learning solutions happen in near real time,” says Conway. “It's no accident that the most economically important deep learning use cases, such as autonomous vehicle research and precision medicine, already rely on HPC.”

One advantage of big data arriving on the scene first is that it fueled the advent and maturation of several supporting technologies, which in turn drove efficiencies and decreased some costs. For example, Conway reports that HPC systems start at under $50,000 today.

That is not to say that all of the technologies involved have matured. “Deep learning is an emerging area and still fairly immature across the entire stack from a variety of frameworks to middleware and too few applications. Those apps that do exist are custom,” says Vineeth Ram, vice president, global marketing for AI, HPC, and data analytics, at HPE. “Other areas, such as databases and analytics, are mature though.”

It is from this mix of emerging and mature technologies that companies are forging their deep learning projects and experiments. Many find the projects themselves challenging enough and are seeking third parties to help optimize the inference engine and infrastructure and to sort out other technology issues.

Three steps to get started

“Deep learning is fascinating, but not easy," says Goyal. "We hear the success stories but forget tens of years of effort behind it. Because deep learning is so challenging, it is best to follow a three-step, graduating process.” They are:

  1. Explore one or two use cases where a business need exists and sufficient data is available to apply deep learning techniques.
  2. Experiment with deep learning at a limited scale. Run a proof of concept to understand infrastructure and performance, and get to know your data infrastructure and models (see the sketch after this list).
  3. Expand and optimize deep learning once you get good results from your experiment. Scale up your infrastructure, business processes, and data management.
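
As a rough illustration of step 2, the sketch below trains a small model on a limited sample and records the two numbers a proof of concept should surface: accuracy and throughput. The synthetic data stands in for a sample of your own dataset, and TensorFlow is an assumption; substitute whichever framework you are evaluating.

```python
# Hedged sketch of a step-2 proof of concept: train a small model on a
# limited sample, then record accuracy and throughput so you can judge both
# the model and the infrastructure before scaling up. The data is synthetic;
# substitute a sample of your own dataset.
import time
import numpy as np
import tensorflow as tf

# Placeholder sample: 10,000 records with 20 features and a binary label.
rng = np.random.default_rng(0)
features = rng.normal(size=(10_000, 20)).astype("float32")
labels = (features[:, 0] + features[:, 1] > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

start = time.time()
history = model.fit(features, labels, epochs=5, validation_split=0.2, verbose=2)
elapsed = time.time() - start

# The numbers a proof of concept should report: model quality and speed.
print(f"Validation accuracy: {history.history['val_accuracy'][-1]:.3f}")
print(f"Throughput: {len(features) * 5 / elapsed:.0f} examples/second")
```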

“Deep learning is science but also an art," Goyal says. "You have to set the right expectations to get the benefit out of it.”

