
Implementation of high-throughput quality control processes within compound management

Posted: 31 August 2011


EPR Screening Supplement 2013

The constant growth of compound collections, combined with screening efforts on more challenging targets, is creating an increasing demand for quality control in order to ensure the integrity of the compound solutions being tested. This is true throughout the early drug discovery pipeline, from hit identification to lead nomination.

Novartis has recently updated its QC capabilities by creating two High Throughput Quality Control (QC) laboratories, enabling systematic QC at three critical stages:

  1. Production of DMSO stock solutions used in High Throughput Screening (HTS)
  2. Creation of dilution series for Hit-Validation
  3. Production of DMSO solutions prepared in the Novartis ‘Compound Hubs’ for the Hit-to-Lead optimisation stage of projects

High-throughput LC-MS methods were developed to confirm the identity and the purity of ca. 500,000 samples/year in a highly automated fashion.

The identity of the substances is determined by mass spectrometry (MS) while the purity is measured by UV absorbance and Evaporative Light Scattering Detection (ELSD). In addition, the concentration of the Compound Hub solutions is assessed by chemiluminescent nitrogen detection (CLND) and/or ELSD.

The large volume of analytical data is communicated directly to the relevant biology and chemistry laboratories in tabular form in order to support their immediate research work and decision making. Additionally, the same tabular data is deposited in the global Novartis data warehouse and all raw data is stored in a Waters Analytical Workflow Manager (AWM) database for review, if required.

In addition to providing project level feedback, this systematic analytical QC approach gives Compound Management (CM) snapshots of the quality of the compound collection at multiple stages in the process. Comparing these snapshots over time enables the evaluation of the collection’s evolution from compound purchase to library replenishment and new synthesis, as well as a systematic evaluation of the compound management process itself.

New challenges for Compound Management

Pharmaceutical research Compound Management teams face multiple challenges, including the ever-increasing size of the collections of powders and solutions they maintain. In addition, evolving screening technologies require changes in plate contents and formats.

The compound management processes are highly complex and cover the registration of incoming compounds (from both internal and external sources), the preparation of new DMSO solutions, the cherry-picking of powder or solution samples and the preparation and registration of single tubes or plates (single concentration or dilutions series). Each of these steps is highly automated; however, process failures may still occur due to unavoidable human, hardware or software errors.

It has therefore become vital to implement quality control filters at critical stages in the compound management process in order to ensure the delivery of the right samples to the right place with the expected quality.

Identification of the needs

Three main critical stages were identified at Novartis (Figure 1).

  • The production of 10 mM solutions for HTS. These solutions are produced from powders (either ‘historical’ or new substances) on fully or semi-automated platforms (ASP1,2: Automated Solution Production). The process is extremely resource-intensive since several hundred thousand new solutions are produced each year during replenishment campaigns
  • The cherry-picking of 2 mM solutions for the preparation of dose-response plates used for Hit-Validation. The analytical QC data must be available for evaluation in parallel with the screening results, i.e. one to two weeks after the receipt of the QC well plates
  • The preparation of 10 mM solutions at the Compound Hub3 level to support the compound profiling and lead optimisation efforts. This process requires short turnaround times, i.e. analytical data delivery between 24 and 36 hours after sample receipt by the analytical laboratory
  • The overall workload requires the analysis of about half a million samples per year for the main Novartis compound management sites located in Basel and Cambridge. Therefore, new high throughput QC laboratories were created in both locations to support the different quality control processes

Technology

The most commonly used and robust technology for high-throughput quality control has for many years4 been, and remains, Liquid Chromatography coupled to Mass Spectrometry (LC-MS). Due to the expected workload, it was decided to equip both laboratories with ‘ultra performance liquid chromatography’ systems (Waters Acquity SQD) on which fast methods were developed. These methods combine electrospray ionisation (ESI) MS in both positive and negative mode for the identity check with UV-DAD and ELSD detection for the determination of the purity. Some instruments were additionally equipped with a CLND detector in order to assess the concentration of the Compound Hub solutions (Figure 2).

Hardware

Several minor yet critical hardware challenges relating to the system autosampler had to be resolved. Since the quality control samples can be delivered to the laboratory in a variety of formats (e.g. 96 or 384 well-plates or Matrix tube racks), the system needs to reliably perform injections from all of the containers from limited sample volumes. After some adjustments and careful calibration, the system autosampler was able to accommodate the different sample formats (typically 5 to 15 μL of solution, significantly below the manufacturer’s specifications for the injector).

Due to the implementation of the CLND detectors, acetonitrile had to be excluded as a chromatography solvent. A short gradient was developed with water and methanol as the eluent. Incorporating 0.05 per cent formic acid as a modifier enabled both positive and negative ionisation modes. A short and robust column type was chosen (Acquity BEH C18, 30 x 2.1 millimetres, 1.7 μm particle size) to balance the needs for chromatographic separation and speed. This column allows the use of a high flow rate (ca. one ml/minute) without going beyond an acceptable temperature (60°C) and pressure (ca. 800 bar) during the runs, and was typically good for at least 10,000 injections. A total cycle time of less than two minutes could be achieved, allowing about 700 LC-MS analyses/day on each instrument while still allowing most significant impurities to be readily separated.
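
As a quick back-of-the-envelope check on this throughput figure, the fragment below translates the cycle time into daily capacity; unattended round-the-clock operation with a fully loaded plate hotel is assumed for illustration.

```python
# Back-of-the-envelope throughput estimate (assumes unattended 24 h operation
# with a fully loaded plate hotel; real utilisation will be somewhat lower).
cycle_time_min = 2.0                                   # upper bound on the cycle time per injection
injections_per_day = 24 * 60 / cycle_time_min          # minutes per day / minutes per injection
print(f"~{injections_per_day:.0f} injections/day per instrument")  # ~720, consistent with ~700 quoted
```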

Software

The main technical challenges in the QC process occurred on the software side. The MassLynx LC-MS control software is able to cope with high-throughput analytics, but it needs to be fed with the appropriate data. Several applications had to be developed to support the different QC steps:

  • A web-based application was created to retrieve the sample information from the corresponding logistic database. The analyst simply needs to scan the plate or rack barcodes into the application, which retrieves all the relevant data including sample position, substance name, molecular formula, structure etc. and generates a text-based data file (a simplified sketch of this step follows the list)
  • The data file is imported into an Analytical Workflow Manager (AWM) database. Dedicated so-called ‘agents’ generate work lists in a text format that is compatible with the LC-MS software
  • Reading the 1D barcodes at the LC-MS instrument automatically triggers the analytical runs. It is possible to load as many plates as the plate hotel can contain (typically 20)
  • The analytical data is automatically processed by the OpenLynx software and summarised in a so-called ‘.rpt’ file which contains the essential analytical data such as chromatograms and mass spectra
  • The commercially available software (OpenLynx Browser) only allows data review. If there was any error in the automated data interpretation, the whole dataset would need to be reprocessed. Consequently, a custom data review tool was developed internally to not only read the ‘.rpt’ files but also modify or annotate the analytical data and generate summary reports (Table 1)
  • All the raw data and metadata are automatically imported to the AWM database by dedicated agents after data review. A reporting tool can extract all the relevant metadata in order to create analytical reports that are sent to the requestors, chemists or biologists. The analytical data is simultaneously published in a global data warehouse (Avalon)
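
To make the first two steps more tangible, the Python sketch below shows how a scanned plate barcode and the metadata retrieved from a logistics database might be turned into a tab-separated work list for the LC-MS control software. Every name in it (the fetch_plate_contents helper, the field names and the file layout) is hypothetical and serves only to illustrate the data flow; the actual applications and formats used at Novartis are not reproduced here.

```python
import csv
from datetime import datetime

def fetch_plate_contents(barcode):
    """Hypothetical stand-in for the web application that queries the
    logistics database and returns one record per well (position,
    substance name, molecular formula)."""
    # In reality this would be a database or web-service call keyed on the barcode.
    return [
        {"position": "A1", "substance": "EXAMPLE-0001", "formula": "C21H23N5O2"},
        {"position": "A2", "substance": "EXAMPLE-0002", "formula": "C18H20ClN3O"},
    ]

def write_worklist(barcode, path):
    """Write a tab-separated work list in the kind of plain-text format an
    LC-MS control software can import (the column layout is illustrative only)."""
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter="\t")
        writer.writerow(["PLATE_BARCODE", "WELL", "SAMPLE_ID", "FORMULA", "TIMESTAMP"])
        for sample in fetch_plate_contents(barcode):
            writer.writerow([barcode, sample["position"], sample["substance"],
                             sample["formula"],
                             datetime.now().isoformat(timespec="seconds")])

if __name__ == "__main__":
    # The analyst would scan the barcode; here it is hard-coded for the example.
    write_worklist("PLT0012345", "PLT0012345_worklist.txt")
```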

Challenges

High-throughput hyphenated LC-MS has been an enabling technology in most analytical laboratories for many years5. The main challenge was therefore not the technology itself but its integration within an existing and complex compound management and screening process. This includes retrieving the necessary compound metadata prior to analysis using the 1D or 2D barcodes on the compound plates and racks, as well as the need to link the analytical results to the corresponding solutions and screening results via a unique identifier named ‘Solution ID’.
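
The role of the ‘Solution ID’ can be pictured as a join key shared by the QC records and the screening records. The sketch below illustrates this idea with in-memory dictionaries; the identifiers, field names and values are invented for the example and do not reflect the actual database schema.

```python
# Invented QC and screening records keyed by the same Solution ID, so that
# identity/purity information can be joined to activity data downstream.
qc_results = {
    "SOL-0001234": {"identity_confirmed": True, "uv_purity_pct": 97.3},
    "SOL-0001235": {"identity_confirmed": False, "uv_purity_pct": 41.0},
}
screening_results = {
    "SOL-0001234": {"assay": "KINASE-X", "ic50_uM": 0.8},
    "SOL-0001235": {"assay": "KINASE-X", "ic50_uM": 12.0},
}

# Join on Solution ID so a biologist sees the QC status next to the activity value.
for solution_id, screen in screening_results.items():
    qc = qc_results.get(solution_id, {})
    flag = "OK" if qc.get("identity_confirmed") else "CHECK QC"
    print(f"{solution_id}: IC50 = {screen['ic50_uM']} uM, "
          f"UV purity = {qc.get('uv_purity_pct', 'n/a')}% [{flag}]")
```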

Implementation of new processes

The most critical step was the sequential implementation of new test and productive QC processes without compromising the previously established ones. The QC of Dose Response plates, for example, had been established for years on a different type of equipment and using other software packages. This QC protocol had to keep running productively without being affected while the new Solution Production and the Compound Hub QC processes were being tested and implemented. There was about a year of overlap between old and new analytical infrastructures that had to be transparent for the end users but required a high level of flexibility from the analytical laboratory associates. The deployment of interim software tools for process tracking and data analysis/reporting was necessary to ensure a smooth transition from the existing to the new QC environment. Eventually, the Analytical QC of dose response plates was also migrated to the new instrumentation and software infrastructure.

Purity and concentration determination

There is to date no universal auxiliary detector for LC-MS. The most commonly used detectors in high-throughput LC-MS rely on UV absorbance and ELS. While the MS is used to assess the identity of the compound in solution, both auxiliary detectors are used to assess the purity of the compounds. UV purity is primarily determined by calculating peak area percent at 214 nanometres, and the ELSD peak area percent purity is used as a back-up in the event that a specific compound lacks a chromophore or co-elutes with DMSO, which absorbs strongly at 214 nanometres.
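
As a minimal illustration of the peak-area-percent calculation, the sketch below derives a purity value from integrated peak areas, excluding a solvent (DMSO) peak from the total. The peak data structure and the exclusion rule are simplifying assumptions made for this example; in production the calculation is carried out by the instrument vendor’s processing software.

```python
def peak_area_percent(peaks, target_rt, rt_tolerance=0.05, solvent_labels=("DMSO",)):
    """Peak-area-percent purity: area of the peak(s) at the expected retention
    time divided by the total integrated area, in per cent. Peaks labelled as
    solvent (e.g. DMSO) are excluded from the total -- an illustrative rule,
    not the vendor's actual integration logic."""
    relevant = [p for p in peaks if p.get("label") not in solvent_labels]
    total_area = sum(p["area"] for p in relevant)
    if total_area == 0:
        return 0.0
    main_area = sum(p["area"] for p in relevant
                    if abs(p["rt"] - target_rt) <= rt_tolerance)
    return 100.0 * main_area / total_area

# Example: one main peak at 0.82 min, a DMSO peak and a small impurity.
uv_peaks = [
    {"rt": 0.35, "area": 1.2e4, "label": "DMSO"},
    {"rt": 0.82, "area": 9.1e5},
    {"rt": 1.05, "area": 4.3e4},
]
print(f"UV purity at 214 nm: {peak_area_percent(uv_peaks, target_rt=0.82):.1f}%")
```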

Assessment of the test solution concentration is even more challenging. We have opted to use the CLND detector, since the literature6 suggests it is a powerful tool to quantify nitrogen-containing molecules, despite the structure-dependent response reported by Yan et al.7 Since a fair percentage of the compound collection consists of non-nitrogen-containing molecules, we decided to use the ELSD detector as a secondary tool for quantification purposes8, despite its limited linearity.
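
To make the CLND quantification principle concrete, the sketch below estimates a solution concentration from a CLND peak area, assuming the detector response is, to a first approximation, proportional to the molar amount of nitrogen injected and has been calibrated against a nitrogen-containing standard. The calibration factor, injection volume and perfect linearity are assumptions for illustration; as noted above, the response is in practice partly structure-dependent.

```python
def clnd_concentration_mM(peak_area, n_nitrogen, response_per_nmol_N, injection_volume_uL):
    """Estimate a solution concentration from a CLND peak, assuming an
    (approximately) equimolar detector response per nitrogen atom.

    peak_area           -- integrated CLND peak area (arbitrary units)
    n_nitrogen          -- number of nitrogen atoms in the molecular formula
    response_per_nmol_N -- calibration factor: area per nmol of nitrogen
    injection_volume_uL -- injected volume in microlitres
    """
    if n_nitrogen == 0:
        raise ValueError("CLND cannot quantify nitrogen-free compounds; use ELSD instead")
    nmol_nitrogen = peak_area / response_per_nmol_N   # nmol of nitrogen injected
    nmol_compound = nmol_nitrogen / n_nitrogen        # nmol of compound injected
    return nmol_compound / injection_volume_uL        # nmol/uL is numerically equal to mM

# Example: a compound with five nitrogen atoms, 1 uL injected.
print(f"Estimated concentration: "
      f"{clnd_concentration_mM(2.4e5, n_nitrogen=5, response_per_nmol_N=5.0e3, injection_volume_uL=1.0):.1f} mM")
```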

Compound Hub Quality Control

The most complex process to implement was the Quality Control of the solutions centrally produced in the different Compound Hubs. This process has many inputs (100s), since compounds originate from all the medicinal chemistry laboratories within the company, and many outputs (10s), since solutions are sent to multiple biology laboratories worldwide. The solutions may be produced from newly synthesised compounds to support lead optimisation efforts or from the historical collection for structure-activity relationship (SAR) studies. The complexity of the process and its short turnaround times (on average, the solutions are delivered to the destination laboratory within 36 hours after order) required high-throughput methods. This is not necessarily to deal with enormous numbers of samples (the QC laboratories handle a few tens to a few hundreds of samples daily) but to ensure that the analytical information will be available soon enough (24-36 hours from receipt of samples). This helps avoid the misinterpretation of SAR or the publication of false or biased biological results to the data warehouse.

High quality data in a timely manner

In a high-throughput screening environment, a low failure rate (~one per cent, for example) in the method is usually considered acceptable. In the Hub QC process, the requirement for fast data turnaround necessitates a high level of automation, while the need for high-quality QC requires off-line data review by experienced laboratory associates. It is essential to ensure correct data interpretation and efficient communication of sometimes complex results to the end users. This tension originates from the fact that while each QC laboratory analyses hundreds of samples originating from hundreds of different chemists, there is usually only a small number of compounds from each individual chemist/project. Misinterpretation of the biological data from this small number of compounds could have a disproportionate effect on the progression of the lead optimisation work. Human data review is therefore essential to correct false positives or negatives and immediately inform the stakeholders (chemists, biologists, CM associates) about any findings, such as a wrong structure for example. In addition to the individualised information flow, all the data is communicated to the requestors in a daily report and stored in a global data warehouse.

First observations

The different quality control processes, especially the Compound Hub QC, have confirmed both the high quality of the new compounds entering the compound management system and the efficiency of the automated sample production (Figure 3). However, the QC processes have also identified the rare instances in which structures in our databases were wrong because of data entry errors (either in the structure itself or in the code of the new molecule). Additionally, human errors or instrumental failures resulting in switched vials could also be identified (i.e. substance A in Vial B and vice-versa). The most common failure modes of solutions in the QC process were either the degradation of substances that started out pure, or low-concentration solutions owing to poor compound solubility (Figure 4).

Conclusion

High-throughput LC-MS-based QC is quite widespread in the pharmaceutical research world, but its extension throughout the Compound Management processes within a year was a major challenge from the software, hardware and human points of view. We were able to smoothly establish new quality control steps thanks to the commitment of the CM and IT teams and to the cooperation of the biology and chemistry laboratories who kindly agreed to participate in test and pilot phases. The initiative was very well received by all the stakeholders, who recognised the need for an exhaustive set of QC processes making the human and instrumental operations more efficient and less error-prone. However, the successful growth of the analytical QC processes from 150,000 to 500,000 LC-MS analyses/year is only the beginning. We now need to make the best use of the data we have acquired, e.g. by tracking the evolution of the quality of the compound collection over time and discarding degraded samples. A future step will be the implementation of complementary analytical technologies such as high-throughput, automated NMR, among others.

References

1. U. Schopfer, F. Hoehn, M. Hueber, European Pharmaceutical Review, 2005, Issue 1

2. U. Schopfer, F. Hoehn, M. Hueber, M. Girod, C. Engeloch, M. Popov and I. Muckenschnabel, J. Biomol. Screen. 2007, 12, 724

3. U. Schopfer, M.R. Andreae, M. Heuber, A. Saur, M.O. Kummer, M. Girod, D. Fox, T. Steiner, M. Popov and R. Smith, Comb. Chem. High Throughput Screen. 2007, 10(4), 283-287

4. D. Yurek, D. Branch and M.-S Kuo, J. Comb. Chem. 2002, 4, 138-148

5. J. Kyranos, H. Lee, W. Goetzinger, L. Li, J. Comb. Chem 2004, 6, 796-804

6. X. Liang, H. Patel, J. Young, P. Shah, T. Raglione, J. Pharm. Biomed. Anal. 2008, 47, 723-730

7. B. Yan, J. Zhao, K. Leopold, B. Zhang and G. Jiang, Anal. Chem. 2007, 79, 718-726

8. L. Fang, M. Wan, M. Pennacchio and J. Pan, J. Comb. Chem. 2000, 2, 254-257


Dr. Jerome Giovannoni received a Master of Physics and Chemistry degree from the School of Industrial Physics and Chemistry (ESPCI-ParisTech). He then moved to the University of Montpellier (France) where he obtained his Ph.D. in organic chemistry with Professor Jean Martinez. Subsequently, he joined the company Chemspeed Technologies as an Application Chemist. In addition to his customer support activities he worked on the development of an automated peptide synthesizer. In 2005, Jerome Giovannoni moved to the agrochemical company Syngenta as a laboratory head and GLP study director in analytical development, with a focus on seed treatment products. He joined Novartis in 2009 where he is currently leading a high-throughput LC-MS quality control laboratory within the Compound Management department. email: [email protected]

Dr. John Peltier is engaged in the application of high throughput mass spectrometric methods for screening compounds for activity against drug targets and for characterising those compounds for mechanisms of action and basic compound integrity. Dr. Peltier has over 20 years of experience in mass spectrometry and analytical methods development, applications of MS to high throughput screening, characterisation of lipid and carbohydrate components of bacterial cell surface antigens, characterisation of post-transcriptionally modified RNA, qualitative and quantitative analysis of peptides and proteins, protein interaction analysis and characterisation of the metabolism of small molecules for drug development. Prior to joining Novartis, Dr. Peltier conducted research and development at Correlogic Systems for diagnostic applications of MS. Previous research at Prolexys Pharmaceuticals included MS-based target discovery and high throughput assay development for kinases and other molecular targets. He also ran Prolexys’ discovery ADME/PK program. In an earlier position at Applied Biosystems, Dr. Peltier was involved in the development of MS platforms, integrated analysis systems, software tools and reagent tools for the analysis of proteins and small molecules. Dr. Peltier received his BS and PhD degrees in Chemistry from McMaster University in Canada. He conducted his postdoctoral research at the University of Utah, in the laboratory of Dr. James McCloskey, where he investigated the structure of modified ribonucleic acids. email: [email protected]
