The biopharma industry must become more efficient. Whether this comes from successive iterations, a revelation born of pandemic-enabled efficiencies or the consequences of the Inflation Reduction Act of 2022, efficiency has become a biopharma industry imperative.
Following approval of a drug, a fully optimized quality by design (QbD) approach can decouple much of the process from the molecule itself, leading to significant efficiencies.
This is already the case for chemically synthesized small molecules. But, “For large molecules like biologics, which are much, much harder to make, the situation is different. The regulations are inflexible,” Davy Petit, senior director, global pharmaceutical and biomedical research business at Waters Corporation, told BioSpace.
“For biologics, you don’t file the molecule (with regulators). You file the process. So, once you have locked down the process, by following it you guarantee you are making the right drug. Later on, a manufacturer may come up with an improved way of making the biologic and improve yields. But under the current rules, the incentive to change isn’t there. To make even the slightest improvements to that process (later) will involve more clinical trials and be very expensive,” Petit said.
“With the sophisticated analytical techniques we have today to prove product comparability, decoupling the process from the molecule makes sense and can drive down the cost of biologics.”
Measure on the Manufacturing Floor
Good outcomes begin during upstream process development through carefully characterizing the process, then measuring critical quality attributes of the drug substance, Petit noted.
“If you know the molecule you want to make very well, and the critical processing parameters you must control – and measure both the molecule and the process continuously – you can predict the outcome, guarantee the product is correct and that it meets all quality criteria.
“Often, however, the technologies that are applied to process development, namely liquid chromatography or mass spectrometry, cannot yet be used as they are on the manufacturing floor,” Petit admitted. Consequently, samples must be sent to centralized labs for analysis by specialists, and test results can take weeks or even months to come back.
“There’s a lot of innovation still needed,” he said, but the ability to eliminate those steps is a huge gain and makes a sound case for at-line, in-line or online analysis.
“If you want to decouple, you want to make decisions at the point of measurement in the production process, where you need them at the time of change,” Petit noted. “This could be simple measurements that are already done today like temperature, pH and the stirring speed of a bioreactor. In the future, that could include critical quality attributes of the drug substance.”
Measuring processing parameters on the manufacturing floor helps manufacturers scale up or down, detect errors early on and control parameters so the impact of any changes can be detected immediately.
With this approach, Petit said, “You’re not running blind for a couple of weeks,” waiting for a sample to be tested. Instead, “You can keep the quality high, the yield at a maximum and you can release within a certain timeframe.”
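The point-of-measurement decisions Petit describes can be pictured as a simple control check: each incoming reading is compared against predefined limits so a deviation surfaces immediately rather than weeks later. The sketch below is purely illustrative; the parameter names and control limits are invented for the example and do not reflect any Waters or Sartorius system.

```python
# Illustrative sketch: flag bioreactor readings that drift outside
# predefined control limits at the point of measurement.
# Parameter names and limit values are hypothetical examples.

CONTROL_LIMITS = {
    "temperature_c": (36.5, 37.5),  # illustrative setpoint band
    "ph": (6.8, 7.2),
    "stir_rpm": (80, 120),
}

def check_reading(reading: dict) -> list[str]:
    """Return the names of any parameters outside their control limits."""
    out_of_spec = []
    for name, (low, high) in CONTROL_LIMITS.items():
        if not (low <= reading[name] <= high):
            out_of_spec.append(name)
    return out_of_spec

# A reading with pH drifting low is flagged on the spot:
alerts = check_reading({"temperature_c": 37.0, "ph": 6.5, "stir_rpm": 100})
# alerts == ["ph"]
```

The same pattern extends naturally from simple process parameters to critical quality attributes once in-line analytics can supply those measurements continuously.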
Simultaneously Optimize Media and Cell Selection
Reliance on in-line, at-line or online process analytics also lets manufacturers run batches simultaneously, he said. Take biologics, for example.
“You start with cell lines, make a clone and do clone selection to identify the best clone that produces the right quality and quantity of your drug substance. That clone lives in an ecosystem – your bioreactor, which is fed with a certain media and atmosphere. So, you also need to optimize the media. These are two different processes.
“With our technologies we can combine both, optimizing the media and the cell selection in one run,” Petit said. “Every cell culture run will take two weeks, so combining the two can save months of time. For a blockbuster, that can have a huge impact.”
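The time saving Petit describes is straightforward arithmetic: if each cell culture run takes about two weeks, folding media optimization into the clone-selection runs halves the number of serial rounds. The numbers of rounds below are illustrative assumptions, not figures from Waters.

```python
# Back-of-the-envelope timeline comparison implied by the quote above.
# Round counts are hypothetical; each cell culture run takes ~2 weeks.

RUN_WEEKS = 2

def sequential_weeks(clone_rounds: int, media_rounds: int) -> int:
    """Clone selection first, then media optimization, run serially."""
    return (clone_rounds + media_rounds) * RUN_WEEKS

def combined_weeks(rounds: int) -> int:
    """Clones and media conditions varied together in each round."""
    return rounds * RUN_WEEKS

# e.g. 3 rounds of clone selection plus 3 of media work = 12 weeks
# sequentially, versus 6 weeks when each round screens both together.
saved = sequential_weeks(3, 3) - combined_weeks(3)
# saved == 6 (weeks)
```

Even under these modest assumptions the combined approach recovers over a month of development time, consistent with Petit's "save months" framing for larger campaigns.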
Waters and partner Sartorius are working to embed some of the current technologies – like liquid chromatography and mass spectrometry – into the bioreactor during upstream process development.
“That’s a key,” Petit said. “It’s not about the measurement itself… but about doing the measurement in-line,” which enables aseptic samples to be pushed to analytical equipment in a sterile environment.
New technology helps. For example, charge detection mass spectrometry (CDMS), which Waters is working on, can quickly and accurately measure whether a capsid is full or empty – a clear need in the cell and gene therapy area.
Decoupling the process from the molecule harkens back to QbD principles. In drug development, QbD defines and maps certain designs so you know where the waste is and can therefore improve the performance of the process, Petit explained. “The means of doing this is by advanced analytics…which gives you insight into the process…so you can optimize it completely.”
Leveraging AI/ML During Discovery
The path to enhancing the efficiency of drug development really begins during the discovery process.
“AI and, in particular, deep learning, has had lots of success in recent years, but it’s been around longer than people realize,” said David Kita, Ph.D., co-founder and VP of R&D at Verseon. “It only became reliable, however, once a certain amount of data was accumulated that it could train on.
“Reliably predicting what will bind, bind selectively or be potent is a challenging physics problem (especially when searching for truly novel compounds). There isn’t enough data,” Kita said. This is despite the vast quantities of compounds identified through high throughput screening.
To that point, companies “are working at the genetic level to provide an understanding of the molecular drivers of disease,” said Mike Klein, CEO of Genomenon. “This is a critical component of target identification, around which drug molecules are tested.”
This provides a much deeper understanding of the genomic drivers of disease, accelerates target identification and helps de-risk clinical trials by identifying the patients most likely to benefit.
“The industry is not yet at a point where it can simply plug in the genetics of a disease and run it across any drug molecule, but genetic drivers will be a key component in the advancement of AI drug discovery solutions,” Klein said.
Consequently, molecular modeling alone isn’t the solution to enhancing efficiency.
“Molecular modeling has a complexity problem,” Kita pointed out. “The number of degrees of freedom associated with a protein, the small molecules and the water molecules make this very complicated. There’s always a tradeoff between time versus accuracy.”
At Verseon, “We’ve focused on how to use AI in combination with molecular modeling to get more reliable predictions of good molecules…to synthesize in the lab,” he continued. Validation studies of early prototypes conducted in the early 2000s found the combination was “quite more reliable” in solving the docking and scoring problems than either method alone, he said.
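Verseon's actual pipeline is proprietary, but the general idea Kita describes, combining physics-based docking scores with a learned model, can be sketched as a simple blended rescoring step. Everything below is a toy: the scoring functions, feature weights and the blending parameter are invented for illustration and involve no real docking or machine learning library.

```python
# Toy illustration of blending a physics-based docking score with a
# stand-in "trained" model. All functions and numbers are hypothetical.

def physics_score(pose_energy: float) -> float:
    """Lower docking energy (kcal/mol) -> higher score; crude rescaling."""
    return max(0.0, -pose_energy / 10.0)

def ml_score(features: list[float], weights: list[float]) -> float:
    """Stand-in for a trained model: a fixed linear scorer."""
    return sum(f * w for f, w in zip(features, weights))

def combined_score(pose_energy: float, features: list[float],
                   weights: list[float], alpha: float = 0.5) -> float:
    """Blend physics and learned scores; alpha balances the two."""
    return alpha * physics_score(pose_energy) + (1 - alpha) * ml_score(features, weights)

# Rank two hypothetical compounds by the blended score:
weights = [0.4, 0.6]
a = combined_score(-9.0, [0.8, 0.7], weights)  # strong energy, strong features
b = combined_score(-3.0, [0.2, 0.1], weights)  # weaker on both
# a ranks above b
```

The design point is the one Kita makes: neither signal alone is reliable enough, so the ensemble ranks candidates better than either the physics term or the learned term would on its own.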