HPE Is Making Artificial Intelligence Accessible and Practical
October 25, 2017 • Blog Post • Pankaj Goyal, Vice President, Artificial Intelligence Business
IN THIS ARTICLE
- A.I. is accelerating innovation in hundreds of fields thanks to a technique known as deep learning
- HPE is introducing new tools and techniques like our integrated A.I. solution for rapid software development and our Deep Learning Cookbook to bring deep learning to every company
New tools and partnerships bring Deep Learning techniques to customers who are at various stages of their Artificial Intelligence adoption journey
For decades, we've confined Artificial Intelligence (A.I.) to the realm of science fiction. Whether aspirational, like Commander Data of Star Trek: The Next Generation, or dystopian, like Skynet of The Terminator series of films, we've thought of A.I. as just beyond our reach.
However, A.I. jumped from science fiction to science years ago. From genomic sequencing analysis to climate research, medical science, autonomous driving, and robotics, A.I. is accelerating innovation in hundreds of fields thanks to a technique known as deep learning, which is best implemented in a high-performance computing (HPC) environment. A global leader in HPC, supplying nearly 30 percent of the world's supercomputers, HPE is making A.I. real for its customers. We are introducing new tools and techniques, like our integrated A.I. solution for rapid software development and our Deep Learning Cookbook, to bring deep learning to every company, no matter where they are in their A.I. adoption journey.
To understand how A.I. is helping businesses connect with customers, grow their companies and drive innovation, we first need to understand what A.I. is and what goes into making a machine intelligent.
Digging into Deep Learning
Think of deep learning as your brain firing at full capacity to absorb and analyze every bit of information that comes its way with the goal of solving a single problem. Data can be numbers, text, images, sounds and more. The key is capturing raw information and applying massive processing power to classify and organize each bit and byte while scanning for patterns.
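To make that concrete, here is a minimal sketch of the pattern deep learning follows: raw data goes in, layered transformations process it, and repeated adjustment continues until the model learns the pattern. This toy example is purely illustrative, not an HPE tool; all names and parameters are assumptions chosen for clarity, and the network learns the classic XOR function.

```python
import numpy as np

# Illustrative only: a tiny one-hidden-layer network learning XOR.
rng = np.random.default_rng(0)

# Raw "information": four input patterns and the XOR label for each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 units; real "deep" networks simply stack more layers.
W1 = rng.normal(size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))
b2 = np.zeros(1)

losses = []
lr = 1.0
for step in range(5000):
    # Forward pass: transform raw inputs through the layers.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # predicted probability
    loss = np.mean((out - y) ** 2)  # mean squared error
    losses.append(loss)

    # Backward pass: chain rule through both layers, then adjust weights.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0)
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The "massive processing power" mentioned above comes in when the same loop runs over billions of parameters and terabytes of data rather than four rows and a handful of weights.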
If that sounds difficult, it is. Data scientists require knowledge of the algorithmic "recipes" needed to cook up intelligence from vast seas of information. Getting IT involved early is crucial, but it's also specialist work that usually requires a fair amount of software design and systems integration to test for and understand computational limits. No one wants to fix a network broken by an overly ambitious algorithm.
The Path to A.I. Adoption
So, how can enterprises experiment with and exploit A.I. and deep learning without taking on too much, too fast? We are helping customers navigate their unique journeys to A.I. adoption through three phases:
Phase 1: Explore: This phase is about figuring out what is out there and what your business actually needs. Every major technical breakthrough has followed a hype cycle, and going all-in on A.I. too early can be destructive for the unthinking leader who sees salvation in machines that promise to do the thinking for him or her. Hype and unmet hopes lay waste to early adopters who invest too much, too fast. We help companies understand their options and devise a plan for adopting A.I. into their business models, starting at the infrastructure level and working with partners to help dreams become realities.
Phase 2: Experiment: We help customers test the waters with A.I. by applying pre-determined experiments to a company's unique problems, first within our Centers of Excellence, which are designed to assist IT departments and data scientists looking to accelerate their deep learning applications and realize better ROI from their deployments in the near term, and eventually within the company itself. With HPE Pointnext, scaling deep learning and A.I. is as easy as turning the dial for more capacity, without losing control of the underlying architecture. We offer customers flexible consumption services for HPE infrastructure, which avoid over-provisioning, increase cost savings, and scale up and down as needed to accommodate the needs of deep learning deployments.
Phase 3: Expand: Once an organization has completed a series of proof-of-concept experiments with A.I. and established a reasonable expectation of returns, we provide the team and technology for developing an A.I. infrastructure that's built to scale over time. And that's important: A.I. is an exponential technology that grows hungrier for capacity as it chews up data and provides insight. No A.I. strategy is complete without a plan for growing HPC capabilities efficiently and affordably.
Cognitive computing that's powered by deep learning techniques is driving major scientific, technological, and economic advancements all over the world. And yet we can do more. We can make deep learning a universally accessible tool. That starts with demystifying the processes and tools, which Hewlett Packard Labs is already doing with the Deep Learning Cookbook. I encourage you to start experimenting today.