Drug development today is a lengthy, expensive process, and to a certain extent, a shot in the dark. Only one out of thousands of potential compounds makes it through the research, trial, and review pipeline to be approved by the U.S. Food and Drug Administration (FDA) and put into large-scale manufacturing. The cost of developing a single drug is staggering: a study by the Tufts Center for the Study of Drug Development found that it costs more than $2.5 billion to develop a drug. Other estimates put the figure at $4 billion, or even as high as $11 billion.
And things may be getting worse, not better. We’ve all become accustomed to the idea that things in technology and science are on an inevitable march toward more efficiency and lower costs. The prime example is Moore’s Law, which essentially says that computing power doubles approximately every 18 months. But the reverse seems to be happening when it comes to drug development.
“There's something called Eroom's Law in drug discovery, and it’s the exact opposite of Moore's Law,” says Alex Madama, chief technologist at Hewlett Packard Enterprise. “It says that every nine years, the throughput and productivity of getting drugs through the approval cycle goes down by half.”
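The contrast between those two trends can be made concrete with a little arithmetic. The sketch below assumes clean doubling and halving rates purely for illustration; real-world figures are noisier.

```python
# Toy comparison of the two trends described above.
# Rates are idealized assumptions, not measured values.

def moores_law(years: float, base: float = 1.0) -> float:
    """Computing power doubles roughly every 18 months (1.5 years)."""
    return base * 2 ** (years / 1.5)

def erooms_law(years: float, base: float = 1.0) -> float:
    """Drug-development productivity halves roughly every 9 years."""
    return base * 0.5 ** (years / 9)

# Over an 18-year span, compute power grows about 4,000-fold while
# drug-development productivity falls to a quarter of its starting value.
print(moores_law(18))  # 4096.0
print(erooms_law(18))  # 0.25
```

That widening gap is exactly why the industry hopes cheap computing can be turned against the cost curve rather than merely outpacing it.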
There are a number of reasons for that, including stricter safety and approval guidelines. It's a problem the pharmaceutical industry has long struggled with, and it hurts not only the industry but also patients hoping for new drugs that can help with their medical conditions.
The industry is hoping that developments in computing—particularly artificial intelligence, cloud computing, the Internet of Things, and big data—will help reverse that course. It’s no sure bet at this point, although there are promising signs.
“For several years, we’ve seen those kinds of technologies being taken up by the industry, and they’ve already made a difference,” says Kevin Julian, Accenture senior managing director for Life Sciences and Accelerated R&D Services, North America. “The cloud, for example, has become almost universally accepted and adopted, and companies have moved significant chunks of their research and development operations into it.”
These technologies are targeted at cutting development time and reducing costs at every step of the drug development chain, from initial research to clinical trials. That chain starts with the drug discovery stage, in which researchers identify compounds that have the potential to be developed into useful drugs. Traditionally, this stage relies on what Madama calls “shots in the dark”: a great deal of lab work and trial and error. He estimates that only one out of 10,000 compounds initially identified as promising eventually becomes an FDA-approved drug that goes into manufacturing and distribution. That takes a tremendous amount of time and money.
Big data analytics and related technologies can help improve the process. One way is by helping scientists perform retrospective analyses of drugs and compounds they have previously studied and identify purposes for them beyond their original ones. For example, the drug Rogaine, initially developed as a blood pressure medicine, was eventually found to grow hair and is now prescribed to stimulate hair growth. Data analytics can speed up finding such new uses for compounds that were initially studied for other purposes.
There are other ways computing technologies can be used in this phase as well. Many drugs are antibodies, large proteins that work by binding in a specific way to antigens, the telltale molecules on disease-causing agents such as bacteria and viruses. The binding sites and proteins have extremely complex shapes that need to fit together precisely, and it can take an enormous amount of lab time and work to test thousands of drugs and see whether they bind to their antigens properly. But Madama says, “You can use computing power to simulate the binding sites and the way proteins fold, to see if they will bind properly. A computer can do that analysis millions of times faster than you can do it in the lab.”
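The screening loop Madama describes can be sketched in miniature. Real docking and protein-folding software models 3-D geometry, chemistry, and physics; the toy version below substitutes simple feature vectors and a made-up complementarity score (all names and numbers are hypothetical) just to show the shape of computational screening: score every candidate against the target, then rank.

```python
# Minimal sketch of in-silico screening: rank hypothetical compounds
# by how well they "fit" a target binding site. The score here is a
# toy stand-in for a real docking score.
import math

def binding_score(site: list[float], compound: list[float]) -> float:
    """Toy complementarity score: closer feature vectors score higher."""
    return 1.0 / (1.0 + math.dist(site, compound))

# Hypothetical binding-site profile and candidate compounds.
site = [0.9, 0.1, 0.5]
candidates = {
    "cmpd-A": [0.8, 0.2, 0.4],
    "cmpd-B": [0.1, 0.9, 0.9],
    "cmpd-C": [0.9, 0.1, 0.6],
}

# Score and rank every candidate, best fit first.
ranked = sorted(candidates, key=lambda n: binding_score(site, candidates[n]),
                reverse=True)
print(ranked)  # ['cmpd-C', 'cmpd-A', 'cmpd-B']
```

The point is the loop itself: a computer can evaluate millions of candidates this way before a single one reaches a wet lab.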
Julian adds that he expects artificial intelligence will eventually help in this stage as well. “You could use artificial intelligence to go through large amounts of genetics data to determine a correlation between a particular DNA sequence and a disease,” he says. “That will help identify potentially useful drugs.”
Once that is done, “artificial intelligence could be used on electronic medical record data to determine the target population for a potential drug, and enable the industry to very quickly set up and put the drugs through trials, helping bring it to market much more quickly,” he adds.
After the most promising drugs are identified, they are put through multiple phases of clinical trials, which are lengthy, costly, and complex to manage. Data analytics, IoT, and cloud computing are already offering benefits here, and they promise to bring even more in the future.
Clinical trials gather as much data as possible about whether the drugs are working, how effective they are, what kinds of side effects they have, and other information. Traditionally, participants in a trial visit a doctor’s office on a regular schedule to have their vital signs checked and blood drawn and analyzed, and patients can also self-report some of that information. All this takes a good deal of time, and it doesn’t always yield a tremendous amount of data.
The use of wearable and implantable IoT devices, though, can gather tremendous amounts of information, providing constant, ongoing updates rather than just once-weekly snapshots. The sensors and devices can automatically gather the data and send it back to researchers.
This can have multiple benefits. In the earliest trial phases, it can identify the proper dosing amounts of a drug as well as its potential toxicity. In later phases, the data can better determine the effectiveness of the drugs and potential side effects. “The more data you can gather, the more effective the clinical trials will be,” says Madama.
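The difference in data volume is easy to quantify. The sketch below assumes a hypothetical wearable reporting one heart-rate reading per hour (the field names, cadence, and synthetic values are all illustrative) and compares a week of continuous monitoring with the single snapshot an office visit would capture.

```python
# Continuous wearable readings vs. a single weekly office visit.
# The readings are synthetic; a real device would stream measured data.
from statistics import mean

# One week of hourly heart-rate readings from a hypothetical wearable.
hourly_readings = [70 + (h % 24 - 12) * 0.5 for h in range(24 * 7)]

weekly_snapshot = hourly_readings[0]     # what one office visit captures
continuous_mean = mean(hourly_readings)  # what constant monitoring reveals
continuous_peak = max(hourly_readings)   # extremes a snapshot would miss

print(len(hourly_readings))  # 168 data points instead of 1
```

A single visit catches one point on the curve; continuous monitoring captures the average, the peaks, and the daily pattern, which is the kind of signal that helps pin down dosing and side effects earlier.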
Of course, with all that data comes problems as well, including how to store it and analyze it. That’s where cloud computing and data analytics come in. The cloud makes vast amounts of cheap storage available and gives researchers access to less-expensive, higher-powered computing, which is required to perform the data analytics.
Clinical trials are expensive to perform and require many people in multiple locations to work together and have access to specific data at the right times. These are the same kinds of problems that enterprises face, so the cloud offers researchers performing clinical trials the same kinds of benefits enterprises look for: better, less-expensive collaboration and an overall increase in productivity and efficiency.
Because of that, the cloud pays big dividends in clinical trials by reducing costs and speeding them up, says Tarek Sherif, co-founder and CEO of Medidata, which offers a cloud platform for clinical trials. Sherif told the BBC, "We were able to save one of our clients about 30 percent on the cost of running a trial.” And Julian told the BBC that with the use of the cloud, "we've seen overall savings of 50 percent—in some cases up to 75 percent—on the historically labor-intensive parts of the drug development process."
All this is well and good, but researchers have hopes that they’ll be able to magnify those benefits many-fold as computing power increases, machine learning and artificial intelligence make breakthroughs, and vast amounts of additional data are gathered. They’re hoping eventually to build what some call the Holy Grail: use of computers to simulate the functioning of the human cell, down to the subatomic level. That would allow researchers to perform simulations of how different biological compounds work in the human body, dramatically reducing the time and cost required to identify drugs for clinical trials.
Madama thinks that’s doable, although it's not quite around the corner.
“I think in the next five years, we’ll be able to start modeling organelles within a cell, and hopefully in 10 years, we’ll be able to model an entire cell down to the subatomic level,” he says. “That would be huge, that would be a revolution for the industry. And once we can do that, then hopefully we would be able to model an entire organism.”
If that could be done, he says, “that would take the pre-clinical trial phase from five or six years down to less than a year, and would save between $1 billion and $2 billion in the drug discovery process per drug.”
It won’t be easy to get there. But Madama says, “I’m hopeful that it will happen.”
And if it does, not only will drug costs be reduced and drugs made available more quickly, but patients could benefit from new cures not even imagined today.
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.