In silico techniques – a JPAG event
European Pharmaceutical Review attended a JPAG event, exploring the plethora of uses for in silico techniques in the pharmaceutical industry.
The broad spectrum of uses for in silico technologies was explored at an event held in London this month.
The Joint Pharmaceutical Analysis Group’s (JPAG) ‘In silico techniques’ conference saw speakers from the Cambridge Crystallographic Data Centre (CCDC), Shimadzu and others share their latest developments for utilising computational models.
The second talk of the day, presented by Dr Edward Close from Process Systems Enterprise (PSE), described the industry’s progress towards systems-based pharmaceutics, which he explained is a holistic approach aimed at increasing R&D efficiency for drug delivery and manufacture.
Close said that this kind of system combines drug substance manufacture, drug product manufacture and product performance into a single system.
“We are seeing more and more hybrid models, rather than just mechanistic and statistical models,” he said, highlighting that the industry is beginning to adapt to the latest in silico analytical techniques.
Focusing on dissolution and stability testing, Close explained that traditionally, computer models have been used to simulate specific points of the manufacturing and synthesis stages. Parameters such as particle size, tablet mass and environmental inputs are fixed in advance to yield particular results.
However, he emphasised that the pharmaceutical industry needs to move away from this and start adopting models that capture global system behaviour, in line with systems-based pharmaceutics: “Rather than looking at points in one decision space, we need to look at an entire design space.”
Instead, a programme that takes outputs of one analysis and uses them as inputs in the next stage of the process is required, which can be provided by a systems-based method. Close explained that this is achieved by running a base model hundreds of thousands of times, combining high-performance computing and big data analytics. Significantly, this can be used to quantify uncertainty and improve the overall quality of drugs.
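The approach Close described can be illustrated with a Monte Carlo sketch: instead of evaluating one operating point, a base model is run many times over sampled inputs so that uncertainty in the output can be quantified across the whole design space. The model function, parameter names and ranges below are hypothetical placeholders, not PSE's actual software.

```python
import random
import statistics

def tablet_dissolution(particle_size_um, tablet_mass_mg):
    """Hypothetical base model: fraction of drug dissolved at 30 minutes.
    A stand-in for a mechanistic simulation, purely for illustration."""
    return min(1.0, 0.9 * (100 / particle_size_um) ** 0.5
                       * (tablet_mass_mg / 250) ** -0.2)

random.seed(1)

# Sample the design space rather than a single operating point, running the
# base model many times (10,000 here; Close cited runs in the hundreds of
# thousands, combining high-performance computing and big data analytics).
results = []
for _ in range(10_000):
    size = random.uniform(50, 150)   # particle size, micrometres
    mass = random.uniform(200, 300)  # tablet mass, milligrams
    results.append(tablet_dissolution(size, mass))

# The spread of outcomes quantifies the uncertainty in product performance.
print(f"mean dissolved fraction: {statistics.mean(results):.3f}")
print(f"standard deviation:      {statistics.stdev(results):.3f}")
```

In a systems-based setting, the outputs of one such stage (drug substance manufacture, say) would feed in as the sampled inputs of the next.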
In the third presentation of the day, Dr Patrick Wray from Bristol-Myers Squibb (BMS) discussed methods for tracking the physical stability of amorphous active pharmaceutical ingredients (APIs) using data modelling.
Wray explained that during formulation development, it is difficult to perform stability studies manually as changing conditions can cause errors. Therefore, employing an in silico technique can resolve this issue.
“We want to automate the work as much as possible to diminish the manual burden, a process which also takes a long time,” said Wray.
He said that code used at BMS was designed to crawl through data to find the files and folders relevant to a given timepoint and set of storage conditions. It then compares those spectra with spectra from previous timepoints, and the data is compiled into a single result.
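The crawling step described above might look something like the following sketch. The directory layout, file naming and timepoint pattern are assumptions for illustration only, not BMS's actual code.

```python
import os
import re
from collections import defaultdict

# Assumed layout: spectra saved under condition/timepoint folders, e.g.
#   stability/40C_75RH/t03_months/sample.csv
# The pattern and folder names are hypothetical.
TIMEPOINT_RE = re.compile(r"t(\d+)_months")

def crawl_spectra(root):
    """Walk the directory tree, grouping spectrum files by storage
    condition and timepoint so successive timepoints can be compared."""
    found = defaultdict(dict)  # condition -> {timepoint_months: [paths]}
    for dirpath, _dirnames, filenames in os.walk(root):
        match = TIMEPOINT_RE.search(dirpath)
        if not match:
            continue
        months = int(match.group(1))
        condition = os.path.basename(os.path.dirname(dirpath))
        paths = [os.path.join(dirpath, f)
                 for f in filenames if f.endswith(".csv")]
        if paths:
            found[condition].setdefault(months, []).extend(paths)
    return found

def compile_report(found):
    """Flatten the crawl into one chronological summary per condition."""
    lines = []
    for condition in sorted(found):
        for months in sorted(found[condition]):
            n = len(found[condition][months])
            lines.append(f"{condition}: t={months} months, {n} spectra")
    return "\n".join(lines)
```

The actual spectral comparison between timepoints would sit between these two steps; as Wray noted, the analysis itself takes seconds.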
However, Wray highlighted one potential issue: pulling the report is time-consuming. Although the model analyses the data in seconds, formatting the results into a single document is the lengthiest part of the process.
Other presentations from the day saw Dr James Mann from AstraZeneca discuss the company’s Dissolution Universal Strategy Tool (DUST) and Dr Garry Scrivens from Pfizer speak about models to predict the effect of drug load and surface area on the stability of multi-composite tablet blends.
The next JPAG event, focusing on data integrity, will be held on 19 March.