
Future trends in drug discovery technology

Posted: 18 December 2012

The average cost to a major pharmaceutical company of developing a new drug is over USD 6 billion1. Herper1 observes that the pharmaceutical industry is gripped by rising failure rates and costs, and suggests that the cost of new drugs will be reduced by new technologies and deeper understanding of biology. While the objectives of drug discovery don’t change, the methods and techniques by which pharmaceutical companies, biotechs and academia discover new drugs are evolving at a significant pace – and they need to.


Drug discovery scientists are all aiming to identify compounds and candidate drugs with ‘good’ properties that are safe and efficacious, as quickly and cheaply as possible. The standard approach of the last 20 years has been to identify a single molecule disease target, and then to identify a compound that interacts with and modulates this target with high specificity. However, there is now a growing realisation that this ‘one target – one drug’ approach doesn’t work well, and that screening huge libraries of compounds against one particular property of an isolated target is an inefficient way to discover potential drugs. Much of the innovation currently seen in drug discovery methodologies seeks to access and integrate more information – about targets, compounds, and disease phenotypes – to enable a more comprehensive and holistic approach to discovering ‘good’ drug candidates. This article does not try to crystal ball-gaze deep into the future, but rather to identify those trends in the adoption of new technologies and approaches that are gaining traction now, and that can be expected to become more prevalent in the next two to three years.

Research papers do not necessarily reflect the latest trends and thinking in technology adoption, due to the time required for manuscript preparation, submission, review, and publication. I have therefore reviewed information reported at significant recent scientific conferences in order to identify some of the key technology trends in the field. These conferences included SLAS 2012, ELRIG’s Drug Discovery 2012 and the 2012 MipTec conference. I have also examined the programme for the January 2013 SLAS conference, and a next-generation sequencing conference (also in January 2013: Hanson Wade’s NGS-Pharma). From these, three technology areas stand out as being both particularly active and genuinely useful to drug discovery scientists.

These are:

  • increasing the throughput of key, but relatively slow, technologies
  • wider adoption of label-free techniques, and the introduction of new ones
  • better model systems for screening assays.

In addition to these three areas, two other striking features emerge from these recent conferences. One is that difficult target areas are receiving more attention from researchers, and this is driving innovation in technology. The second is the growing realisation that prior knowledge of specific molecular targets is not only unnecessary, but may actually hamper discovery efforts. Phenotypic drug discovery (PDD) is becoming more widely used (partly driven by improved model systems and the means to interrogate them) and has been shown to be an efficient method for discovering ‘first-in-class’ drugs2.

Increasing throughput of key technologies

Mass spectrometry (MS) has been routinely used to assay the metabolism of compounds in lead optimisation and preclinical development for 20 or so years; fewer drugs now fail as a result of pharmacokinetic issues thanks to MS. The increasing size of chemical libraries, high-throughput screening (HTS) technologies that enable thousands or millions of compounds to be screened, and the concomitant increase in compounds requiring optimisation over this period have increased the demand for bioanalysis of metabolites and the need for faster turnaround. In addition to new analytical techniques for MS that give more sensitivity and require less sample preparation, there has also been a drive to increase the throughput of MS. As examples, Agilent’s RapidFire high-throughput MS system has been used both to read a cytochrome P450 inhibition assay using human liver microsomes at Bristol-Myers Squibb (A Weston, SLAS 2013), and to detect inhibitors of histone demethylases in human macrophages at GSK3 (M Leveridge, Drug Discovery 2012). RapidFire requires no offline sample preparation, and permits sample times of 6 to 10 seconds. San Diego-based NextVal (www.nextval.com) has introduced a high-throughput MS system that generates spectra from acoustically printed arrays with a matrix-free direct surface ionisation technology, avoiding significant barriers associated with the use of matrix-based ionisation methods in screening. NextVal’s technology is currently being evaluated by Bristol-Myers Squibb.
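
To put those cycle times in context, the back-of-the-envelope sketch below estimates how long a primary screen might take on such a system. Only the 6 to 10 second cycle time comes from the reports above; the library size and plate-handling overhead are hypothetical assumptions.

```python
# Illustrative throughput estimate for a high-throughput MS readout.
# The 6-10 s cycle time is from the conference reports; the library size
# and overhead below are assumptions made purely for the calculation.

SECONDS_PER_SAMPLE = 8          # mid-range of the quoted 6-10 s cycle time
LIBRARY_SIZE = 100_000          # hypothetical screening library
PLATE_CHANGE_OVERHEAD = 0.10    # assumed 10% overhead for plate handling

total_seconds = LIBRARY_SIZE * SECONDS_PER_SAMPLE * (1 + PLATE_CHANGE_OVERHEAD)
print(f"Estimated acquisition time: {total_seconds / 86_400:.1f} days")
# roughly 10 days of continuous acquisition for 100,000 samples at 8 s each
```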

Flow cytometry is a well-established cellular analysis technique whose value has been demonstrated in clinical laboratories all over the world. One of its most useful attributes is the ability to collect multiparametric molecular and functional data from individual cells – it is a very high-content technique. Imaging high-content screening (HCS) has already proven its worth in compound screening and drug discovery, but throughput issues have limited the impact of flow cytometry. Advances in instrumentation, such as Beckman’s MoFlo instruments and Amnis’ imaging flow cytometers, have now increased the throughput and information content available. High-throughput flow cytometers are beginning to become useful tools in compound screening, and have been used by GSK for cell-based screening of histone demethylase inhibitors (R Jepras, SLAS 2013, HyperCyt high-throughput sampler and Accuri cytometer) and for a screening assay in haematopoietic target identification at the Czech Institute of Molecular Genetics (P Bartumek, SLAS 2013). Purdue University’s cytometry guru, Paul Robinson, believes that the availability of appropriate analytical tools to process and interrogate high-throughput flow cytometry data is becoming a rate-limiting step in its widespread adoption (SLAS 2013). Nevertheless, high-throughput flow cytometry has the power to establish functional relationships between cellular phenotypes, key activator molecules and their modulation states, and compound activity, all within single cells, and it is now being applied to drug screening.
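
As a rough illustration of the analytical burden Robinson describes, the sketch below reduces simulated per-cell multiparametric data to a single per-well readout. The channel layout, gate thresholds and random data are all hypothetical; a real pipeline would parse FCS files, apply compensation and use far more sophisticated gating.

```python
# Minimal sketch: reducing per-cell multiparametric flow cytometry data to a
# per-well readout. Channels, gates and data are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def well_readout(cells: np.ndarray) -> float:
    """cells: (n_cells, 3) array of [forward_scatter, side_scatter, marker]."""
    fsc, ssc = cells[:, 0], cells[:, 1]
    gated = cells[(fsc > 0.2) & (ssc < 0.8)]   # crude scatter gate (assumed thresholds)
    if len(gated) == 0:
        return float("nan")
    return float(np.mean(gated[:, 2] > 0.5))   # fraction of gated cells marker-positive

# Simulated 384-well plate: each well holds ~5,000 cells x 3 channels
plate = [rng.random((5_000, 3)) for _ in range(384)]
scores = [well_readout(well) for well in plate]
print(f"Median marker-positive fraction across plate: {np.median(scores):.3f}")
```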

HCS is also being applied to high-throughput primary screening by some companies, usually with either multi-detector HCS imagers like PerkinElmer’s OPERA, or fast, lower-resolution imaging cytometers such as TTP Labtech’s Acumen eX3. While HCS is no longer a novel technology, its application earlier in the drug discovery workflow is a more recent phenomenon. Use of HCS in primary screening reflects the growing realisation that multiplexed cell-based assays are more likely to uncover useful information about compounds than single-output biochemical or cell-based assays, and that high-throughput HCS can provide this information earlier in the discovery process.

Microfluidics is another technology that has been around for some time, but is now benefiting from developments that increase system throughput. Greater multiplexing onto microfluidic devices is enabling thousands of molecular interactions to be measured on a single array (E Maerkl, MipTec 2012), and Maerkl’s system has identified a compound that inhibits hepatitis C virus assembly. The same lab has also developed a microfluidic system to support more than 1,100 microbial cultures, and is using the platform to identify genes involved in bacterial resistance to drugs (MipTec 2012). Microfluidic single-cell analysis, both by imaging and ‘digital’ PCR, has been used to describe the complex system regulation exerted by NF-kappaB, and how this is modified in pathogen-host interactions and in tissue inflammation (S Tay, MipTec 2012).

Label-free technologies

I have already mentioned the importance of MS, but a growing number of label-free technologies are either already improving, or have the potential to improve, drug discovery. A recent conference dedicated to label-free technologies (www.labelfreetech.com) highlighted some technologies that are poised to become established drug discovery tools. These include:

  • More sensitive evanescent waveguide interferometry (e.g. www.creoptix.com)
  • Label-free and tether-free homogeneous assays based on back-scattering interferometry (BSI), which enables interaction assays with nanomolar levels of target protein (Molecular Sensing Inc, www.molsense.com). BSI is being used for the label-free detection and measurement of ligand interactions with GPCRs in membrane fragments (S Weinberger, SLAS 2013)
  • Microscale thermophoresis. This is another tether-free, label-free technique which uses an infrared laser to set up localised microscale temperature gradients within capillaries (NanoTemper Technologies GmbH, www.nanotemper.de). Intrinsic fluorescence is then used to track the thermophoretic movement of molecules, which is dependent on molecular interactions
  • High-density microarrays for label-free biomolecular reaction kinetics, based on ellipsometry4. This technology has been used to establish real-time binding curves for a probe to 10,000 immobilised targets on a functionalised glass slide. This gives a much higher throughput than other label-free technologies, and is a significant improvement on fluorescence detection, which provides only end-point analysis of binding. A minimal binding-curve fitting sketch follows this list.
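
All of the techniques above ultimately deliver binding data from which affinities or kinetics are extracted. The sketch below fits a simple 1:1 equilibrium binding isotherm to a simulated titration to recover a dissociation constant; the ‘measured’ signal, the Kd and the concentration series are invented for illustration, and real instruments supply their own analysis software.

```python
# Minimal sketch of extracting a dissociation constant (Kd) from equilibrium
# binding data of the kind produced by the label-free techniques above.
# The titration data are simulated with a hypothetical 'true' Kd of 50 nM.
import numpy as np
from scipy.optimize import curve_fit

def one_site_binding(conc_nM, bmax, kd_nM):
    """Simple 1:1 equilibrium binding isotherm."""
    return bmax * conc_nM / (kd_nM + conc_nM)

conc = 10_000 / 3 ** np.arange(12)   # 12-point, 3-fold dilution series (nM)
rng = np.random.default_rng(1)
signal = one_site_binding(conc, 1.0, 50.0) + rng.normal(0, 0.02, conc.size)

(bmax_fit, kd_fit), _ = curve_fit(one_site_binding, conc, signal, p0=[1.0, 100.0])
print(f"Fitted Kd ~ {kd_fit:.0f} nM (Bmax ~ {bmax_fit:.2f})")
```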

More relevant model systems

Among the most significant causes of late-stage drug failure are lack of efficacy in phase II and phase III trials, and safety issues. Both of these factors are, in part, attributed to the inadequacy of animal models for testing drugs for human disease. The pharmaceutical industry has been seeking better, more physiologically-relevant cellular models for many years. Human-derived primary cells are often cited as the ‘gold standard’ cellular model, but there are significant technical and ethical issues with collecting and using primary cells. It is also extremely difficult to obtain sufficient quantities of primary human cells to make a significant contribution to drug discovery programmes.

The 2012 Nobel Prize in Physiology or Medicine was awarded to two pioneers of stem cell technology: Sir John Gurdon and Shinya Yamanaka. Their work is now regularly being applied in drug discovery, driven by the demand for more physiologically relevant cellular models for screening and optimisation assays. Induced pluripotent stem cells (iPSCs) and human embryonic stem cells (hES) provide large numbers of well-differentiated, human-derived cells for a wide range of experiments within drug discovery. Cardiovascular drug discovery is already showing active uptake of the technology, with hES-derived cardiomyocytes being used for cytotoxicity analysis at AstraZeneca (J Sidaway, Drug Discovery 2012). At Imperial College, London, MAP4K4 is being validated as a modulator of diverse triggers of heart failure by RNA interference in human iPSC-derived cardiomyocytes (M Schneider, Drug Discovery 2012). Finally, human iPSC-derived cardiomyocytes have also been shown to be a more physiological model of human cardiac bioenergetics than the commonly used rodent-derived H9C2 cardiomyocytes (Cellular Dynamics Inc, B Anson, SLAS 2013). Neuroscience is also embracing stem cell technologies to gain access to otherwise difficult-to-isolate cell populations. Pfizer’s Neusentis research unit is using hES to create sensory neurones that show many molecular and phenotypic characteristics of natural sensory neurones (J Bilsland, Drug Discovery 2012). iPSCs from Down’s Syndrome and familial Alzheimer’s patients have been differentiated into excitatory cortical projection neurones that show Alzheimer’s-like pathologies, potentially enabling the screening of compounds for Alzheimer’s treatments (R Livesey, University of Cambridge, Drug Discovery 2012). It is highly likely that stem cells will be used ever more widely in the drug discovery process, and several companies now market cells of many phenotypes derived from stem cells (e.g. GE Healthcare and Cellular Dynamics Inc.).

Aside from the source of cells, the other major issue with in vitro cell culture is that two-dimensional culture systems do not reflect the environment that cells experience within a three-dimensional organism. It is now well accepted that cells grown in 2D culture behave significantly differently from the same cell type grown either in 3D culture or in organisms5, and there is a growing trend to utilise 3D model systems for at least some aspects of the drug discovery process. Cancer is probably the therapeutic area with the most demand for 3D models, since tumour biology is influenced significantly by the cellular hypoxia induced by the size and shape of growing tumours. Three-dimensional systems for hepatocyte culture are also highly sought after, but are challenging to provide commercially.

Several issues must be addressed when establishing 3D cultures. These include: selecting the matrix, scaffold, or growth conditions that will provide the mechanical support for cells; optimising the time it takes for the culture to develop prior to its use; and utilising appropriate assay technologies that will not suffer interference from the matrix or scaffold. Despite these challenges, 3D cell culture is becoming more widely adopted as products become available to make 3D culture easier. A significant area of application for 3D culture is in stem cell assays, where the combination of well-defined human-derived cells with more physiological culture conditions seems particularly attractive to researchers.

Bayer Pharma is developing 3D spheroids as the basis for high-content assays for drug screening in oncology (P Steigemann, Drug Discovery 2012), and Leiden University has an active programme in developing 3D HCS assays for comprehensive phenotypic profiling of cancer cells in 384-well microplates (L Price, SLAS 2013). Leiden has also reported an automated microinjection system for rapid generation of matrix-embedded spheroids that are compatible with HCS (E Danen, MipTec 2012). Harvard Medical School has developed a biomechanical system seeded with iPSC-derived endothelial cells that recreates vascular haemodynamics (W Adams, SLAS 2012). This system enables analysis of the relationship between patients’ genetic background and the endothelial phenotype. Commercial activity for 3D cell culture includes a wide range of matrix proteins and mixtures, as well as systems for the rapid formation of 3D cultures in collagen gels (TAP Biosystems’ RAFT, www.tapbiosystems.com), and ready-to-use organotypic 3D microtissues grown in a ‘hanging drop’ format (InSphero AG, www.insphero.com).

Next-generation sequencing

The impact and benefits of next-generation sequencing (NGS) technologies in uncovering disease aetiology, and the potential role for NGS in personalised medicine, are widely known and discussed. In the context of this review, it is the impact of NGS on target identification and validation that is most relevant. Although the concept of ‘one target – one drug’ is losing its predominance in drug discovery, it is still important to characterise the molecular target of a drug, and to understand the interactions that the target has with other molecules. NGS plays a key role in molecular analysis, particularly in the study of gene expression and RNA structure and function, and Illumina is developing methods to translate this potential into quantitative assays for drug discovery (G Schroth, SLAS 2013). Bristol-Myers Squibb is using NGS tools to assay genome-wide RNAi screens in its search for targets and biomarkers (J Feder, NGS-Pharma 2013), while AstraZeneca is evaluating NGS RNA sequencing against microarray technologies for transcriptome analysis (J Bradford, NGS-Pharma 2013).
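
To illustrate the kind of quantitative comparison that underlies transcriptome analysis, the sketch below applies counts-per-million normalisation and computes a per-gene log2 fold change from a toy count table. The gene names and counts are invented; production analyses use dedicated statistical packages (e.g. DESeq2 or edgeR) with biological replicates and proper hypothesis testing.

```python
# Toy RNA-seq comparison: counts-per-million (CPM) normalisation and log2
# fold change per gene. Gene names and counts are hypothetical.
import math

raw_counts = {                 # gene: (treated_counts, control_counts)
    "GENE_A": (1500, 300),
    "GENE_B": (80, 75),
    "GENE_C": (10, 400),
}
treated_total = sum(t for t, _ in raw_counts.values())
control_total = sum(c for _, c in raw_counts.values())

for gene, (treated, control) in raw_counts.items():
    cpm_treated = 1e6 * treated / treated_total
    cpm_control = 1e6 * control / control_total
    log2fc = math.log2((cpm_treated + 1) / (cpm_control + 1))   # +1 pseudocount
    print(f"{gene}: log2 fold change = {log2fc:+.2f}")
```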

Technology drivers for innovation

It is said that the reduction in new drug registrations during the first decade of the century is partly attributable to the fact that all the ‘low-hanging fruit’ of drug targets had already been harvested (e.g. GPCRs and other membrane proteins), and that those left were the more difficult targets. New insights into the root causes of many diseases point to targets that have traditionally been difficult for small-molecule drugs, such as transcription factors and protein-protein interactions (PPIs). Industry and academia are now looking at these more intractable targets in greater detail, and the inherent difficulty of this work is driving the development of new techniques and approaches to drug discovery. One of the key areas, because of its profound effects on phenotype and disease, is epigenetics.

In epigenetics, Biofocus is applying computational compound selection to define targeted compound and fragment libraries for screening against a histone methyltransferase (HMT) and a histone demethylase, with detection using mobility-shift assays (D Hafenbradl, Drug Discovery 2012). Epizyme (www.epizyme.com) is overcoming the challenges posed by the low turnover of HMTs and their high binding affinity for their substrates and products by developing assays sensitive enough to detect changes in HMT activity (M Porter-Scott, Drug Discovery 2012). Cellzome, now part of GSK, has developed a quantitative proteomics strategy that uses tethered small-molecule probes to detect drug binding to the huge, megadalton protein complexes that constitute chromatin-modifying structures (G Berganini, Drug Discovery 2012).

In PPIs, one of the significant problems is that the interacting surfaces between proteins are relatively large, featureless areas with apparently little opportunity for modulation by small molecules. Allosteric sites are one possible route for drug modulation of PPIs, but robust, automated assays are required to screen for activity. Janssen is developing such cell-based assays on a microscopy platform, using fluorescence-based ‘two-hybrid’ or proximity ligation assay technologies (E Krausz, MipTec 2012).

Phenotypic drug discovery

Chris Lipinski, the author of medicinal chemistry’s ‘Rule of five’, has observed that it is often simply a matter of luck that the ‘one target – one drug’ approach to drug discovery works at all (Drug Discovery 2012). Lipinski goes on to say that the fact that this approach frequently doesn’t work is due to the robustness of biological networks, which results from the redundancy that is built into signalling pathways. This redundancy means that completely blocking a signalling pathway at one point often elicits no phenotypic response, because the signal is transmitted via another route; blocking two points in a pathway is much more likely to induce a phenotypic change. Growing numbers of scientists are reaching the conclusion that single-target screening is not the best way to search for new drugs, especially for ‘first-in-class’ drugs. This is borne out by Swinney & Anthony’s research2 showing that, in the 10 years to 2008, more drugs with novel mechanisms of action were discovered with PDD strategies than with target-based ones, at a time when the major focus of the pharmaceutical industry was on target-based discovery.
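
Lipinski’s redundancy argument can be captured in a toy Boolean model: the sketch below wires a phenotypic output to two parallel signalling branches, so blocking either branch alone leaves the output unchanged while blocking both switches it off. The wiring is invented purely to illustrate the reasoning and does not represent any real pathway.

```python
# Toy Boolean model of pathway redundancy: the output fires if either of two
# parallel branches still transmits the signal (OR logic). Purely illustrative.
from itertools import combinations

NODES = ["branch_A", "branch_B"]

def phenotype(blocked: set) -> bool:
    """True if the downstream output still fires despite the blocked nodes."""
    branch_a = "branch_A" not in blocked
    branch_b = "branch_B" not in blocked
    return branch_a or branch_b

# Blocking a single point: the parallel branch carries the signal, no change
for node in NODES:
    print(f"block {node:9s} -> output = {phenotype({node})}")

# Blocking two points in the pathway is what switches the phenotype off
for pair in combinations(NODES, 2):
    print(f"block {pair} -> output = {phenotype(set(pair))}")
```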

Phenotypic screening used to depend on manual assays with, for example, rows of isolated rat hearts or lengths of intestine in perfusion systems, in which drug effects were observed and measured as phenotypic changes such as contraction rates or electrical activity. Modern high-content assays, using human-derived cells, permit a new form of phenotypic assay that not only gives great insight into mechanisms of action, but also allows high-throughput, target-agnostic screening for compounds that affect the phenotype of interest. As an example, AstraZeneca is applying a range of high-content assays to PDD at a number of stages across drug discovery, using co-cultures of human cells for both efficacy and toxicity testing (B Isherwood, Drug Discovery 2012). The Broad Institute has used PDD to identify novel scaffolds from its compound collection that regulate a protease (PCSK9) involved in cholesterol homeostasis (M Palmer, SLAS 2013), while the Czech Institute of Molecular Genetics is using high-throughput flow cytometry as the basis for its haematopoietic disease PDD programme (P Bartumek, SLAS 2013).

Closing remarks

I believe that future drug discovery and development will continue to involve the same basic science disciplines that have been used for the past 60 years: structural biology to examine the physical characteristics of biomolecular targets; synthetic chemistry to provide drug-like compounds; and pharmacology to probe the mechanisms of interaction between compound, target, and organism. The differences will be in how scientists gain deeper understanding of human disease from genomics and proteomics, and how they screen for new drugs using complex, human-derived models of tissue and organ function in multiparametric assay systems that monitor multiple potential targets. New techniques to monitor drug target interactions with greater ease and sensitivity will support these approaches. Novel data access and sharing tools will be required to allow researchers to use these new insights to further their work. The recent, and imminent, conference presentations reviewed here indicate that we have already embarked on this new chapter in drug discovery.

References

  1. Herper M (2012) The Truly Staggering Cost of Inventing New Drugs, Forbes.com
  2. Swinney & Anthony (2011) Nat. Rev. Drug Discov. 10, 507-519
  3. Kruidenier et al (2012) Nature 488, 404-408
  4. Landry et al (2012) Assay & Drug Development Technologies 10, 250-259
  5. Nirmalanandhan et al (2010) Assay & Drug Development Technologies 8, 581-590

About the author

Dr Terry McCann runs TJM Consultancy (www.tjmconsultancy.co.uk), a business support consultancy for the life sciences. TJM Consultancy has been providing strategic and tactical business development and marketing support to life science companies for seven years, and includes leading multinationals among its clients. Recent projects include a market share survey of leading vendors of HCS imagers – based on their own data – and an analysis of the European microscopy market. Prior to establishing TJM Consultancy, Dr McCann worked at Life Science Resources and then PerkinElmer, where he held several positions managing confocal microscopy and proteomics businesses, ultimately managing their cellular sciences business. Dr McCann has a doctorate from the University of Cambridge, and has conducted postdoctoral research at the Babraham Institute and Guy’s Medical School, University of London.