Utilisation of secondary screening

Posted: 19 June 2008 | Miroslav Cik, Johnson & Johnson, Anthony Davies, Trinity College Dublin, Marc Bickle, Max Planck Institute, Holger Erfle, BIOQUANT Centre, University of Heidelberg, Keith R. Olson, DiscoveRX Corporation

European Pharmaceutical Review has brought together four individuals from different sides of the scientific palette to discuss current and future issues surrounding secondary screening and maximising its potential.


Roundtable Participants:

Miroslav Cik, Johnson & Johnson
Anthony Davies, Trinity College Dublin
Marc Bickle, Max Planck Institute
Holger Erfle, BIOQUANT Centre, University of Heidelberg
Keith R. Olson, DiscoveRX Corporation

In your opinion, which area of R&D benefits the most from utilising secondary screening methods?

Cik: I think the application of secondary screening benefits the hit-to-lead process in all pharmaceutical domains. Also, regarding high content screening, the first tests and assays developed were directly applicable in oncology, making it the first domain to fully experience its benefits.

But I believe in the near future, other domains like neuroscience and ADMETox will also be able to enjoy the benefits of high content screening.

Davies: I am certain that in the fields of compound screening and large-scale target identification, secondary screens are hugely important. Secondary screens provide us with the opportunity to further qualify the importance and validity of any hits generated.

Erfle: From my point of view, cell biology, systems biology and, as a consequence, drug development will benefit most from secondary screening methods.

Olson: Primary screening has become more and more prolific in the generation of hit compounds, so the primary benefit afforded by secondary screening is in the qualification of those hits, and the assessment of biological relevance. Secondary screens, especially when they are cell-based, provide critical added information on compound behaviour that eventually leads to better decision making as hits transition to leads and move from HTS back to therapeutic areas.

Bickle: Drug discovery profits the most if the assays are correctly designed and are rich in information. Early toxicity and ADME properties can be deduced from cellular secondary screens.

Over the last few years, huge emphasis has been placed on implementing secondary screening technology. What do you see as the main driving forces behind this?

Cik: Before answering the question, I would like to remind everyone that secondary screens are not new; they have always been used to some extent. But it is true that in the past ten years they started to be applied on a much greater scale than ever before. Today, they have become common practice. This evolution happened in part because many new technologies became available which make secondary screens more useful. High content screening is one example: it was initially picked up approximately ten years ago and has been growing steadily every year since. It was successful because it provided much more data on compounds than we could achieve before. But also important are the new functional cellular assays, like the beta-arrestin recruitment assays that have been developed in recent years.

Now coming back to the question: all those new technologies were developed in order to maximise primary screening results, eliminate false positives and increase speed and efficiency in generating new drugs; that has always been the main driving force. Failures at a later stage of a compound's development are so costly that the cost of implementing secondary screens is easily recouped.

Davies: The main driving force, in my opinion, would be the benefits of obtaining more biologically informative data earlier in the discovery process. The overall benefit would be seen in a better and more focused utilisation of resources, thereby increasing efficiency and driving down research costs.

Erfle: The scientific and pharmacological community realised that, with the knowledge gained from secondary screens, more effective and specific treatments for diverse diseases will become possible in the future. In addition, collecting in-depth knowledge is nowadays possible at high scale and in a reproducible manner by using automated sample preparation, data acquisition, data evaluation and data mining.

Olson: The emergence of new technologies, methods and suppliers for cell-based assay solutions suitable for screening has, I think, been the driving force behind the increase in assays run in a secondary mode. Many early cell-based screening approaches, such as high content analysis, proved the value inherent in cell-based assays, but came with certain limitations. As new technologies have entered the market, assay approaches like the PathHunter™ family of products have reduced the complexity of cell-based assays, and made it possible to run simple, biologically relevant assays at both the primary and secondary level. Continued expansion of such products, and the emergence of new approaches, will continue to drive this trend toward cellular screening solutions.

Bickle: The growing gap between the cost of drug development and the rate of drug approval has forced the pharmaceutical industry to reduce the cost of the drug discovery process, as enhancing the rate of successful drug discovery is very difficult. It has been recognised that too many molecules which should have been stopped much earlier reach late stages in drug discovery, and that it is therefore important to design secondary screens that can filter out candidates that would later be dropped due to poor ADME, low efficacy or high toxicity.

Do you think secondary screening can give early insights into the mode of action?

Erfle: In general, the knowledge gained from secondary screens can help to place proteins upstream and downstream of each other within pathways, and so to deduce functional networks. By doing this, early insights into the mode of action will be created. Combined with the multiplexing possibilities of various microscopes, detailed time-resolved information about these networks will become available.

Davies: Yes; a well-designed and properly focused secondary screen, or suite of screens, can yield valuable information with regard to biological interactions and potential mode(s) of action.

Olson: Yes, I think that is one of the primary benefits of increasing the number or extent of secondary campaigns following hit identification. For example, you may have a GPCR target for which you found a hit in a reporter gene assay or a second messenger assay. By adding additional assays either upstream or along alternate signalling routes in the cell, you gain a more complete understanding of how your potential lead compound is behaving. In the case of the PathHunter™ GPCR ß-Arrestin technology, you can take such a series of hits and determine in more detail whether or not your compound signals in a unique manner, or in an extreme case as a biased ligand showing differential effects that could be therapeutically relevant down the line. In addition, if you are able to add high-value assays such as protein-protein interaction biosensors, you can potentially dissect in great detail how a compound is behaving and which specific target molecules are involved in the response.

Bickle: This depends on the assay used. The more information collected, the better the understanding of the mode of action. Using cellular systems combined with microscopic technologies can yield a tremendous amount of data, allowing a good appreciation of the underlying process.

What future developments are needed to complement and maximise the potential of secondary screening?

Cik: It has become a highly complex and time-consuming task to extract useful information from multi-parametric, multi-compound sets of data. I think the most important need is for more efficient data mining tools that can simplify and speed up this task, while at the same time clarifying the information. They must present information in such a way that it facilitates making the correct decision in a short time.

In addition, I believe that the robotics that were developed for large-scale primary screenings need to be adapted to deal with small-scale secondary screenings. This will make it possible to focus on a smaller set of compounds more quickly and more efficiently.

The third evolution that I find paramount is not in the technological realm, but on the organisational level. There is a strong need for improving coordination between all of the various groups of people involved in the development of compounds. This includes bio-informaticians, therapeutic experts and chemists, among many others.

Davies: More biologically relevant cell-based assays, which rely upon functional outputs. To achieve this goal I think it is essential that relevant cell types are utilised. This may necessitate the use of primary cells maintained in physiologically relevant conditions.

Erfle: Statistical analysis and data exploration are still not sufficiently integrated fields, and the same is true of proper assay development. In addition, many assays still suffer from poor stability and reproducibility of results.

Results from secondary screens should also be openly accessible in appropriate databases to maximise the output.

Olson: I think the primary need from customers is the availability of a broad range of assays that are cost effective and straightforward to implement. The PathHunter product line is an ideal example, where we have generated a very broad panel of assays in the GPCR target area numbering over 150 targets. These assays are available both as commercially available cell lines that can be run in-house, and they can be accessed as a screening/profiling service from DiscoveRx. When implemented in-house, they can be run on virtually any microplate reader using a robust chemiluminescent signal output. In addition, many of our GPCR targets are now being offered as consumable kit products using cells as reagents. These one-time-use cells offer convenient, low-cost access to our PathHunter GPCR ß-arrestin technology and, very soon, our PathHunter NHR assay technology.

Bickle: Cell-based systems that accurately predict the principal causes of failure of drug candidates are still lacking. Libraries of failed candidates should be assembled and studied thoroughly to try and understand the mechanisms underlying their failure. Drug failure markers should then be derived from such studies.

Which technologies do you feel are being most effectively applied to secondary drug screening?

Cik: I would place functional cellular assays in first place, since they provide the researcher with insight into the cellular context of the compound while at the same time giving an indication of its toxicity.

Primary cells are also being increasingly applied to secondary screening.

We certainly cannot ignore label-free assays, such as SPR-based and impedance-based assays, which are interesting as well.

Davies: High content screening; the power of this technology is only now being truly recognised. Over time we are seeing adoption of this technology in both industry and academia, with an ever-increasing range of research applications.

Erfle: Specific and evaluated gene perturbations; high-resolution, time-lapse microscopy with automated data evaluation and appropriate data mining; and finally in vivo experiments are most effectively applied to secondary screening. In addition, biological network creation will benefit from the generated data.

Olson: Again, I think in general cell-based assays are providing the most value at the secondary screening level due to the added information that they provide. The goal of secondary screening is to generate added information that helps transition the relatively large number of hits generated at the primary screening level to a more manageable number of higher quality compounds that can hopefully become lead candidates.

Bickle: Imaging technologies such as high content screening are currently experiencing a high level of popularity. These technologies have adequate throughput and yield a large quantity of information. The full exploitation of these data has not yet been achieved, but it promises to give insights into the mode of action and potential success of a drug candidate early in the process.

How do you see screening as a whole evolving over the next three to five years?

Cik: The evolution from large, multi-functional platforms towards smaller, more flexible platforms is already well underway and I expect it to continue in the coming years. There will be more focused screens on smaller numbers of compounds. I expect this to lead to a growing integration of primary and secondary screenings, a process that may ultimately even erase the difference between the two.

Another evolution I foresee is the rising integration of bio-informatics to enable more efficient handling of the complexity of screening.

Finally, I see ADME and toxicology screenings becoming more automated. Some of these developments have already been initiated.

Davies: I would guess that there would be a greater emphasis on multi-parametric live cell and functional outputs (for example those obtained using HCS), integrated with more traditional research methodologies, such as genomics and proteomics.

Erfle: There will also be a move towards analysing hits from primary screens in different cell types and in more physiological systems such as primary cells. From my point of view, secondary screening will also move from pure knowledge creation to supporting model creation and confirmation.

Olson: Cell-based assays will continue to increase in prevalence both in the secondary and primary screening arenas. As more and more technologies become available that can meet the demands of high throughput screening, I think you will see a clear shift away from biochemical assays and towards cellular assays. There are still many assays that are simpler and more cost effective to run in a biochemical format, such as kinase assays for example, but new approaches will become available that will allow a wider range of targets to be addressed with cellular assays. I think a key limitation of any biochemical assay is that it cannot reflect what is actually going on inside a cell, much less inside a tissue, so the closer we can come to the true biology, the better off we will be in selecting high-quality lead compounds.

Bickle: Screening is going to become a more and more multidisciplinary process involving chemists, biologists, statisticians and computer scientists.

Do you already use cell-based assays in secondary screening, and how important do you rank them?

Davies: Yes, they are very important. I think that observing biological processes at the cellular level will almost always lead to a clearer understanding of physiological and pathophysiological processes at the organ and organism level.

Erfle: Yes, we already use secondary screening for hit confirmation, consolidation and creation of additional in-depth knowledge, like protein dynamics and protein-protein interactions.

Bickle: Cell-based assays are, in my opinion, crucial, as they determine cell permeability and give early indications of a compound's toxicity and specificity. This implies that these cell-based assays should be in place before the start of the primary screen. They should be part of the target validation process.

With regard to screening, which product has excited you the most and why?

Cik: Beta-arrestin recruitment technology is a very valuable tool that has emerged on the market in recent years. It can be monitored with different technologies such as Transfluor, BRET, CypHer or the beta-galactosidase complementation assay. It allows a novel and broad look at G-protein coupled receptor activity and provides a universal, high throughput and high content method of conducting GPCR assays.

Davies: High content screening technologies, because they offer flexibility in both application and scale.

Erfle: I was most excited by the coming together of three techniques which bear very high potential for systems biology: recombinant GFP-tagged proteins, RNAi and automated microscopy. Together they offer the chance to gain more in-depth knowledge about proteins and biological networks.

Olson: GPCR targets continue to be the most prevalent targets in screening at most companies, and in our experience the PathHunter Arrestin technology has generated the most excitement in the drug discovery community due to the potential it offers for identification of new classes of compounds against what are already well-established and well-validated targets. As GPCRs continue to dominate in drug discovery, more and more is learned about how they function and how different compounds show efficacy at the therapeutic level. Increasingly, people are searching for more unique and biologically relevant ways to probe GPCRs. The emergence of ß-arrestin recruitment by activated receptors as an indicator of GPCR function has already provided valuable insights into how compounds function, and there is growing evidence for the existence of biased ligands that could in the future provide drugs that only signal in a desirable manner. As there is also increasing interest in looking for new indications for established drugs, an arrestin-based detection method for GPCR function, such as that provided by the PathHunter technology, can also provide value in drug repositioning applications based on its novel readout.

Bickle: The birth of high content screening has been very exciting to me, as this technology promises to solve many problems of the screening world. Because it is a cell-based assay, many cellular questions such as cell permeability, toxicity, metabolic processing and possible side effects can be answered.

What are the major obstacles of getting secondary screening technology into laboratories?

Cik: Purchasing the right equipment and technology for executing secondary screens involves a considerable financial investment. Furthermore, you have to acquire the right expertise and experience to know how to use this equipment and technology. High content screening in particular requires expertise that is not easily obtained, but once you have it, the benefits are high. The only way is to invest heavily in training; you will also need the right specialists in bioinformatics.

Davies: From an academic perspective, I think a combination of high purchase price of instrumentation along with expensive repeat consumable costs will ultimately impede the adoption of such technologies for general use.

Erfle: The major obstacles to getting screening technology into laboratories are expensive equipment, such as automated liquid handling and microscopes, and access to resources such as genome-wide libraries of gene-perturbing reagents.

Bickle: Secondary screening brings together a variety of expertise; building and coordinating multidisciplinary teams is very challenging.

In your opinion, how could these obstacles be minimised?

Cik: In my opinion, the most important factor is to start in time. Scientists should constantly follow all the new tools and technologies that appear on the market and be ready to bring them into the laboratory once they have developed into something really useful. Such an approach enables the step-by-step build-up of expertise and advance planning of investments.

If you ‘miss the first train’, you will have to make very large investments in a short time to catch up. If that happens, it can be useful to work with external companies that provide all-in-one solutions, but that, of course, is the most expensive option.

Davies: As universities and colleges are for the main part the primary training ground for scientists, I feel it is important that there should be a stronger and better structured relationship between technology providers and academia. Such relationships can only benefit all.

Erfle: The establishment of core facilities giving access to expensive equipment and highly educated personnel, as well as to genome-wide sets of gene-perturbing reagents such as validated siRNAs, GFP-tagged cDNAs, antibodies and small molecules.

Bickle: Every expert should receive some training in the other fields in order to understand the challenges faced across the entire process. This implies a lot of time spent on education, which is often neglected.

Looking to the future, do you think secondary screening could become so standardised that commercial opportunities arise?

Davies: The answer is probably yes. I can only refer to the boom in the use of information technology over the last 20 years or so, where the implementation of industry standards led to a huge rise in the use of such technologies and, as a consequence, numerous commercial spin-offs.

Erfle: Sure, but a certain set of standards, defined by leading scientists in the field, would have to be followed.

Bickle: Yes, in my opinion cellular assays will be developed that predict poor efficacy, poor ADME or high toxicity of compounds very well, irrespective of their therapeutic use. Such assays should be marketable, just as primary assays are marketed.