How HPC could invigorate medical records management

Between regulatory requirements, organizational consolidation, and the ever-growing volume of data being accumulated, the medical records piece of the healthcare process is constantly being challenged to keep up with all the moving parts. Can high-performance computing give EHRs the power they need?

The healthcare industry’s computing infrastructure could soon find itself on life support.

Long known for their budget-minded approach to IT, medical establishments have been running everything from imaging to diagnostics and electronic health records (EHRs) on old x86 architecture pioneered more than 35 years ago. These systems have served them well, for the most part.

But with more medical establishments merging and digitally transforming, as well as adding complicated applications like virtual reality, augmented reality, and artificial intelligence, existing back-end systems are starting to sputter. In the process, doctors, nurses, and patients are often frustrated with what they see as increasingly slow and annoying technology.

“Current systems are doing a good job at what they were built for,” says Jerry Power, who runs the Institute for Communications Technology Management at the University of Southern California Marshall School of Business. “[But] they were never designed for the volume of data that's being fed into them today or the types of analytics researchers now wish to do. And they weren’t built to run things like AI. There is so much we need now that wasn’t in those original designs. The world has advanced quite a lot since then.”

Energizing EHRs

Nowhere is this more apparent than with EHRs, which were always intended to serve as a “golden record” of sorts for a patient’s medical history. In theory, if you could put every single record of an appointment, blood pressure or temperature reading, inoculation, X-ray, MRI image, CAT scan, or DNA record in one place, any doctor or nurse could use it to create better diagnoses and treatment plans for all patients. That’s a key reason 96 percent of hospitals now use EHRs compared with just 9 percent in 2008.
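To make the golden-record idea concrete, here is a minimal sketch, in Python, of a consolidated patient record that any clinician could query by record type. Everything here, from the field names to the ClinicalEvent class, is a hypothetical illustration; real EHR schemas (such as HL7 FHIR) are vastly richer.

```python
# Toy illustration of the "golden record" concept: every kind of patient
# record collected in one queryable structure. All names and fields are
# hypothetical -- real EHR standards such as HL7 FHIR are far more detailed.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ClinicalEvent:
    when: date
    kind: str      # e.g., "blood_pressure", "x_ray", "inoculation"
    detail: str

@dataclass
class PatientRecord:
    patient_id: str
    events: list = field(default_factory=list)

    def history(self, kind):
        """Pull the full, date-ordered history of one record type."""
        return sorted((e for e in self.events if e.kind == kind),
                      key=lambda e: e.when)

record = PatientRecord("patient-001", [
    ClinicalEvent(date(2017, 3, 1), "blood_pressure", "128/82"),
    ClinicalEvent(date(2018, 6, 9), "x_ray", "chest, no findings"),
    ClinicalEvent(date(2018, 6, 9), "blood_pressure", "121/79"),
])
for event in record.history("blood_pressure"):
    print(event.when, event.detail)
```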

While the idea is noble in theory, in practice medical staff charged with capturing and digitizing every single patient detail feel they are spending more time inputting data than delivering quality healthcare. As a result, there’s been something of a backlash lately against EHRs. Indeed, one study found that doctors and nurses are split over whether EHRs improve the quality of care, with more physicians seeing them as detrimental and more nurses saying they help. Another survey found 61 percent of doctors believe EHRs have a negative effect on efficiency and productivity, and 54 percent think they damage patient-provider relationships.

None of this is to suggest all doctors would kick EHRs to the curb. Rather, most surveys seem to indicate they would like to see the systems improve. And some observers say the real culprit is the growth of medical data (by some estimates 48 percent per year) hitting outdated server and storage systems.

Overcoming conventional computing

“The main roadblock on this journey to a better world lies in the limitations of our conventional computing solutions,” wrote Antonio Neri, president and CEO of Hewlett Packard Enterprise earlier this year. “There is a growing data deluge as our ambitions grow faster than our computers can improve. Every two years, we create more data than has ever been created before, with the majority of it originating at the edge, or the periphery of a network.”

Brian Murphy, a healthcare industry analyst with Chilmark Research, agrees.

“There’s been a huge emphasis in recent years on measuring things that weren’t previously measured—in particular, quality of care,” Murphy says. “That triggered a wave of investments in analytics software, and organizations had to then buy more servers and storage to accommodate that. The other thing that’s happened is there’s been a staggering quantity of new data types—like genetic engineering research, proteomics, and genomics—coming into these systems. Right now, many healthcare organizations are trying to figure out what to do with all of this.”

Shining a light on HPC

One solution some organizations are exploring right now involves sticking with an x86 infrastructure but moving up to newer, more powerful high-performance computing (HPC) systems. While these supercomputers won’t cure every ill associated with modern medical records management, HPC is poised to address many performance and scalability issues plaguing the healthcare industry.

“I do think HPC can make a huge difference for healthcare providers, especially when they start utilizing more sophisticated technology like AI and the volume of data they’re handling significantly increases,” says Power. “Organizing medical data can be very time consuming for doctors, and depending on the power of their computers, it can be a very slow process. That’s where HPC can come in handy because doctors can’t be waiting 15 minutes or more for a primitive computer to sort through a humongous database to find something.”

Traditional HPC systems use parallel processing to run advanced application programs faster, more efficiently, and more reliably. They’re great not only at handling massive amounts of data but also at rapidly solving the kinds of complex computational problems that would take medical researchers weeks, months, or even years to work out on their own. As such, they’re considered an attractive successor to current server systems.
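As a rough sketch of what parallel processing buys you, the following Python snippet splits one large computation across several worker processes instead of running it serially. The workload (summing squares) is just a stand-in; a real HPC job would distribute work across many nodes with far more sophisticated scheduling.

```python
# Minimal sketch of the parallel-processing idea behind HPC: divide a large
# computation into chunks, run the chunks on separate worker processes, and
# combine the partial results. The workload itself is only illustrative.
from multiprocessing import Pool

def partial_sum_of_squares(chunk):
    # Each worker handles one slice of the data independently.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = range(10_000_000)
    n_workers = 4
    chunk_size = len(data) // n_workers
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    with Pool(processes=n_workers) as pool:
        # The chunks run concurrently; the partial sums are combined here.
        total = sum(pool.map(partial_sum_of_squares, chunks))
    print(total)
```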

But some think even modern HPC itself is ready for a refresh. In fact, there’s a relatively new architecture that promises to handle even greater volumes of data and complexity.

Memory-Driven Computing emerges

Known as Memory-Driven Computing, this approach sets itself apart by giving every processor in a system access to a giant shared pool of memory. In earlier computing systems, for one processor to access data not held in its own memory, it basically had to request the information from the processor where it was stored; if the data lived on a disk somewhere, the same slow round trip applied. All that information moving between tiers of memory and storage caused tremendous inefficiency. With Memory-Driven Computing, you can hold hundreds of terabytes or even petabytes in memory, allowing you to solve problems thousands of times faster than on a conventional computer.

This doesn’t necessarily involve just one type of memory, by the way. With Memory-Driven Computing, you’re more likely to have different kinds to balance out price, performance, and persistence. But it’s all stitched together uniformly through a memory fabric, of sorts.
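For a sense of the difference a shared memory pool makes, here is a small single-machine analogy in Python (3.8+): several worker processes attach to one block of shared memory by name and analyze their own slices of it directly, with no copies passed between them. This uses the standard multiprocessing.shared_memory module plus NumPy; it is an approximation of the idea, not an actual memory-fabric implementation.

```python
# Single-machine analogy for Memory-Driven Computing: every worker attaches
# to the same in-memory pool and reads it directly, rather than requesting a
# copy from whichever process "owns" the data.
from multiprocessing import Process, shared_memory
import numpy as np

def worker(shm_name, shape, start, stop):
    # Attach to the existing pool by name -- no data is copied.
    shm = shared_memory.SharedMemory(name=shm_name)
    pool = np.ndarray(shape, dtype=np.float64, buffer=shm.buf)
    print(f"rows {start}-{stop}: mean = {pool[start:stop].mean():.4f}")
    shm.close()

if __name__ == "__main__":
    records = np.random.rand(1_000_000)   # stand-in for a large dataset
    shm = shared_memory.SharedMemory(create=True, size=records.nbytes)
    np.ndarray(records.shape, dtype=records.dtype, buffer=shm.buf)[:] = records

    # Two workers analyze different slices of the same pool concurrently.
    half = len(records) // 2
    procs = [
        Process(target=worker, args=(shm.name, records.shape, 0, half)),
        Process(target=worker, args=(shm.name, records.shape, half, len(records))),
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    shm.close()
    shm.unlink()   # free the shared block once everyone is done
```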

Memory-driven systems would ostensibly enable healthcare providers to keep pace with the amount of data coming their way and, at the same time, add enough processing juice to accommodate growth in every functional area of an organization.

Specifically, it could allow doctors to address some of the most insidious scourges of human health, such as cancer, heart disease, Alzheimer’s, Parkinson’s, and HIV. The value Memory-Driven Computing brings to healthcare and life sciences research is the ability to iterate on computational studies much faster; early work has already shown 100-fold improvements in study throughput. It could also help identify and cure diseases that are exceedingly rare and difficult to diagnose. And with more robust predictive analytics, Memory-Driven Computing could even help prevent major illnesses before they occur. What’s more, it could enable better integration of EHRs, addressing current frustrations with recording and accessing medical data.

Cybersecurity innovation

Of course, with any newer data-driven system, cybersecurity and government privacy regulations are a major concern for most healthcare organizations. In the past five years, more than 187.2 million medical records have been breached in the U.S., according to Privacy Rights Clearinghouse. FortiGuard Labs, a security protection firm, reported healthcare saw an average of nearly 32,000 network intrusions per day per organization in 2017. Healthcare data breach costs are reportedly the highest of any industry at $408 per record.

As part of early research into Memory-Driven Computing, at least one vendor, Hewlett Packard Enterprise, is exploring ways to embed cybersecurity countermeasures into HPC hardware, specifically silicon. Traditionally, security is bolted onto systems through firewalls, antivirus software, and other such add-ons. However, with new computer designs like Memory-Driven Computing come new opportunities to embed security throughout software and hardware.

The struggles healthcare organizations face with collecting, analyzing, and gleaning insights from data are not insurmountable. Powerful HPC technology like Memory-Driven Computing could soon play a major part in revitalizing the industry’s aged networks, boosting the ability of physicians to deliver even higher levels of quality care.

HPC and EHR: Lessons for leaders

  • The goal is to remove stumbling blocks that make users perceive EHRs as an impediment rather than a benefit.
  • Better access to historical patient data will result in better outcomes for the patient.
  • Memory-Driven Computing can be the game changer.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.