High Content Analysis Roundtable

Posted: 29 May 2009 | EPR



Anthony Davies
Director of High Content Research Facility, Department of Clinical Medicine, Trinity College Dublin

Sarah Payne
Product Manager, TTP LabTech Limited

Khuong Truong
European Product Manager, BD Biosciences

Jeremy Simpson
Professor of Cell Biology, University College Dublin (UCD)

Oscar ‘Joe’ Trask
Head of Cellular Imaging Technologies, Duke University

Peter Simpson
Associate Director of Cancer Bioscience, AstraZeneca

1. How significantly do you feel the Drug Discovery Process has benefited from the application of High Content Analysis techniques?

Anthony Davies: Since the mid-1990s, high-content analysis (HCA) has primarily been used in the later stages of the pre-clinical drug discovery process. However, as HCA techniques have developed and evolved, so has the role of this technology within the discovery process. Today HCA is integrated earlier in drug discovery; for example, it is now widely utilised in target validation, where contextual information obtained from cell-based assays allows for better characterisation of biological mechanisms.

Overall, High Content Analysis technologies offer a highly flexible research platform that lends itself well both to relatively simple cell-based discovery programmes, such as those employed in primary screens, and to more detailed and complex mechanistic downstream studies.

Sarah Payne: In my opinion, the drug discovery process within Pharma and Biotech has benefited greatly from the application of high content techniques, both directly, through inclusion of this screening practice in the drug discovery process, and indirectly, via those academic research projects which help to fuel our ever-increasing knowledge of how biological systems actually work. From conception to acceptance, it has taken around 10 years for high content analysis to be widely regarded as a beneficial screening method rather than a ‘nice-to-have’ technology. Due to the length of time required for a drug to pass through the drug discovery process, the number of compounds that have been identified using HCA techniques is limited. However, the success of these techniques may be reflected by their popularity and acceptance.

You only have to look at the equipment used in HTS and/or secondary screening facilities today to see that most major Pharma and Biotech companies are running larger scale screens with HCA technology, rather than using it solely within therapeutic research groups. Interestingly, HCA is also very much represented within the majority of academic research/screening centres, further strengthening the argument that HCA has turned out to be a beneficial technique within our industry as a whole. The technique gets so much air time at the larger conferences, with entire tracks allocated to the area of HCA or cell systems biology. Finally, almost all major technology providers have HCA systems in their portfolio and every year we are seeing new instruments being developed, which would seem to indicate these types of technologies are benefitting drug development programs and are here to stay.

A cell systems biology approach to drug discovery is coming into vogue, and HCA is strongly placed to be an important tool in this area of research. Gathering a “fingerprint” of disease-related biomarkers and a phenotypic response is as critical at the target validation end of the drug discovery process as it is pre-clinically. Potential compounds identified through screening campaigns are now rigorously tested for any adverse effects (as well as on-target effects) much earlier in the drug discovery process, and many of these tests are performed using HCA technology. There also seems to be a resurgence in the phenotypic screen, whereby the exact target(s)/mechanisms of compound action may not be known, but the desired cell-based phenotype, e.g. inhibition of cell proliferation, is achieved. Such screens help to ‘weed out’ the problem of system redundancy, which can be a problem with a single-target approach to screening. Also, the desired beneficial effects may be achieved via “polypharmacology” and as such might not have been identified through traditional target-based biochemical screens until much later in the drug discovery process.

Khuong Truong: In my opinion, High Content Analysis has enabled researchers to work on particular cell types of interest and model organisms to understand more about their targets. In particular, the analysis of phenotypic readouts on a cell-by-cell basis has allowed them to quantify effects on cellular models using techniques such as siRNA knockdown screens. In general, one should distinguish roughly two types of user in the drug discovery process: researchers in HTS labs and researchers using High Content Analysis in research labs. In the first case, HCA has made it possible to achieve higher throughput while maintaining the ability to obtain large data sets. Thus, some assays are moved forward in the screening process, as safety issues can be addressed earlier in development and drug candidates allowed to fail earlier. In the second case, the measurement of multiple parameters per cell, combined with throughput, has led to higher acceptance of HCA in the screening world. In addition, the automation of genotoxicity assays (e.g. the micronucleus assay), which tend to replace animal testing, increases the benefits even further, as these assays potentially help to reduce costs compared with manual scoring.

Jeremy Simpson: It is perhaps still a little too early to appreciate the full impact of HCA on the drug discovery process; nevertheless, as HCA is now being routinely employed at multiple stages along the pipeline, it will undoubtedly bring benefits. For example, if HCA is used in the primary stages of compound screening, not only does this provide a quantitative and visual readout of a particular response under study, but it immediately gives us information on the range of that response across a cell population. This is a huge advantage over the whole-well readouts used in more traditional screening approaches, and should result in potentially useful compounds being identified more rapidly and, equally importantly, less suitable compounds being disregarded at this early stage.
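That advantage of a per-cell readout over a whole-well average can be sketched with a toy calculation (the numbers are invented and purely illustrative): two wells with identical averages can hide very different single-cell behaviour, which only a per-cell analysis exposes.

```python
# Illustrative sketch (hypothetical numbers): a whole-well average cannot
# distinguish a uniform modest response from a strongly responding
# subpopulation, but per-cell statistics can.
import statistics

# Simulated single-cell intensities (arbitrary units) for two wells.
well_a = [10, 11, 9, 10, 11, 9, 10, 10]   # uniform modest response
well_b = [5, 5, 5, 5, 15, 15, 15, 15]     # bimodal: half respond strongly

mean_a = statistics.mean(well_a)
mean_b = statistics.mean(well_b)
print(mean_a, mean_b)  # both equal 10: a plate reader would call these identical

# Per-cell analysis separates them, e.g. by spread or responder fraction.
sd_a = statistics.pstdev(well_a)
sd_b = statistics.pstdev(well_b)
responders_b = sum(1 for x in well_b if x > 12) / len(well_b)
print(round(sd_a, 2), round(sd_b, 2), responders_b)
```

A conventional whole-well readout reports only the first pair of numbers; the population spread and responder fraction are exactly the information Simpson describes as the HCA advantage.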

Oscar ‘Joe’ Trask: Greatly. This technology has provided researchers with an extraordinary tool that is just beginning to showcase its utility in the broader scientific community, as evidenced by journal papers, symposia and conferences. At the same time, the full impact and benefits of the technology are still unknown, even though it is more than 10 years old, mainly as a result of the lengthy process of identifying compounds from primary screens, validating the disease target, and re-optimising chemical structures for solubility, absorption and reduced toxicity prior to first human dose.

Keep in mind that, with the exception of a couple of early adopters of this technology, many biotechnology and pharmaceutical companies, and more recently academic institutions, using high content imaging have only been doing so for the past five to six years. Clearly there have been early landmark papers, such as that from Tim Mitchison at Harvard University, who used this technology in a well-orchestrated secondary phenotypic screening approach to identify a small molecule called Monastrol, whose derivatives proceeded to clinical trials. Scientists from both the biopharmaceutical industry and academic institutions recognised this approach and are now engaged in phenotypic screening campaigns. High content imaging is an excellent tool being used in many in-vitro and ex-vivo disease model systems in discovery, from primary screen to clinical trials.

The promise of the technology 10 years ago was primarily focused on higher throughput and reduced time to push compounds through the drug discovery bottleneck; however, as we now know, this did not completely transpire as planned. Rather, we saw the conversion of many biochemical assays to cell-based imaging assays, with greater knowledge of the mechanistic profile or phenotype of compound-target interaction in cells using small molecules, peptides and RNA interference (RNAi), many times in lower-throughput, focused approaches. Researchers in pharmaceutical screening centres initially used this technology in comparison with existing validated approaches, such as flow cytometry and other well-established fluorescent or colorimetric screening technologies, and for that reason only a single output data feature per well of a microtiter plate from the high content imager was used to determine whether a compound was active in a screening campaign.
This clearly is not “high content”, since only one data point was measured; however, this quickly changed once the industry better understood the benefits of looking at additional high content metadata and accepted the technology as a legitimate drug discovery tool. For example, the key output feature in a protein translocation assay is the ratio or difference of fluorescence between two cellular compartments. But there is much more data to consider in these multiparametric assays: cell number, which is critical to determine whether compound treatment is toxic or affects cell adherence; the intensity of fluorescent dye in the cell nucleus, which can give an indication of whether the cell cycle is altered during compound exposure; and other features such as size and shape morphology, intensity and dynamics, all of which are key high content measurements.
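A translocation readout of the kind described above can be sketched in a few lines. This is a hand-rolled toy with invented pixel values, not any vendor's algorithm: given pixel intensities assigned to a cell and a compartment by segmentation, it computes a nuclear/cytoplasmic ratio per cell alongside cell count and mean nuclear intensity.

```python
# Toy sketch (invented data) of per-cell feature extraction for a protein
# translocation assay: each pixel carries (cell id, compartment, intensity);
# we compute mean intensities per compartment and a nuclear/cytoplasmic ratio.
from collections import defaultdict

pixels = [
    (1, "nucleus", 200), (1, "nucleus", 220), (1, "cytoplasm", 50), (1, "cytoplasm", 70),
    (2, "nucleus", 90),  (2, "nucleus", 110), (2, "cytoplasm", 95), (2, "cytoplasm", 105),
]

sums = defaultdict(lambda: [0, 0])          # (cell, compartment) -> [total, count]
for cell, comp, value in pixels:
    sums[(cell, comp)][0] += value
    sums[(cell, comp)][1] += 1

cells = sorted({cell for cell, _, _ in pixels})
features = {}
for cell in cells:
    nuc = sums[(cell, "nucleus")][0] / sums[(cell, "nucleus")][1]
    cyt = sums[(cell, "cytoplasm")][0] / sums[(cell, "cytoplasm")][1]
    features[cell] = {"nuc_mean": nuc, "nuc_cyt_ratio": nuc / cyt}

print(len(cells), features)
# Cell 1 shows strong nuclear translocation (ratio 3.5); cell 2 does not (1.0).
```

The same loop structure naturally yields the other multiparametric features mentioned: the cell count falls out of `len(cells)`, and nuclear intensity per cell is already captured in `nuc_mean`.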

Lastly, I think many in the pharmaceutical industry would agree the technology has provided early evidence of whether a compound exhibits a toxicity profile. In that case, the compound or compound series is dropped from further testing; or the compound undergoes an SAR campaign in an attempt to make it less toxic in in-vitro or in-vivo models; or the compound is further validated using additional high content imaging; or complementary assays are developed to understand MOA, saving the industry time and money. It is evident the scientific community in academia, biotechnology and the pharmaceutical industry has embraced this technology and has benefited from early detection of compound toxicity, identification of novel compounds, and greater knowledge of MOA and target validation and identification. The final chapter of the high content imaging story, and of the contributions the technology has made to the scientific community, is not yet complete. The jury is still deliberating, and “wow”, what a wonderful time to be involved in high content imaging as we make more exciting discoveries.

Peter Simpson: When it was first being introduced into the pharmaceutical industry, some saw HCS as the future of screening. It was envisaged that this technology might open up new avenues in which pathway based screening and cellular profiling would rapidly become routine, and that it would replace all biochemical and other reductionist techniques. Others, particularly those who were sanctioning the capital purchase process at the time, saw HCS as potentially an expensive luxury that would never add sufficient value to justify the expense of the capital and revenue costs. Ten years or more down the line, it seems clear to me that the truth lies somewhere in between. Using HCS, imaging based endpoints are now widely accepted as part of our routine drug discovery processes, and more complex readouts have become quite widely implemented. Pathway screening and cellular profiling have added some value to drug discovery, however there are few, if any, drugs on the market which have been developed directly as a consequence of this type of work yet. The use of two, or in a smaller number of cases three, simultaneous endpoints has become commonly used on cell imaging platforms, and this relatively modest degree of multiplexing has increased our ability to interpret Structure-Activity Relationship anomalies. Now that the hype around HCS has settled, it is clear that HCS is one, very useful, weapon in an armoury of cell screening options, sitting comfortably alongside conventional cell assays with a single readout such as ELISAs, which are still quick to develop and adequate for many purposes. Biochemical assays have retained utility on the screening front line, particularly for enzyme targets in which off target effects in a cell can confound interpretation of more complex phenotypic readouts.

2. Currently what specific areas of research are benefiting most from application of this type of technique?

Davies: HCA has allowed us, for the first time, to conduct extremely detailed, high-throughput morphometry studies on adherent cells. Prior to HCA, flow cytometry was the only reliable means available to perform rapid analysis of cell morphology. The main disadvantage of that approach is that cells must first be detached from their growth substrate and disaggregated prior to analysis; additionally, these systems are only capable of limited morphological measurements. It seems logical that a detailed appraisal of an adherent cell's structural and morphological characteristics can only truly be achieved while the cell is fully attached to a growth surface. The microscope-based optical systems found in many HCA platforms allow for the rapid imaging of cells in an attached and undisturbed state. Morphological information captured in these images can then be extracted using specialised software-based morphology analysis algorithms. I am unaware of any other technology currently available that can offer the same detail, throughput and reproducibility as HCA in this area of research.

Payne: Oncology was obviously the first specific research area to embrace HCA technology, and since then many other therapeutic areas have also found this technique useful. Right now, for a screening HCA system like the Acumen® eX3, there seems to be huge interest within the areas of infectious diseases, stem cell therapy and genome-wide RNAi screening.

Truong: In general, any kind of cell-based assays requiring automation and multiplexing would benefit from this technique. More and more whole model organisms are analysed as well for the same reasons. The typical biological areas benefiting from this technique are neurobiology (where morphology changes can be easily measured), cancer research (angiogenesis assays, cell cycle studies, cell proliferation, and apoptosis), stem cell biology (for marker detection of the differentiation status) and toxicology (micronucleus assay, cell death assays).

J Simpson: The sequencing of the human genome remains a landmark event in the biological sciences; however, the development of technologies and approaches to exploit this information inevitably follows. HCA is a well-placed technology in this regard, and is particularly powerful when combined with gene knock-down methods such as RNA interference. We are now seeing an exponential increase in the use of this combination, providing us with an unprecedented opportunity to analyse basic cellular processes – such as cell division, cell migration, and cell transport – in a truly systematic way. A more comprehensive understanding of basic cell biology is a prerequisite to improved drug design, and therefore HCA has a key role to play from both perspectives.

Trask: The groundwork was laid by cell biologists in cancer therapeutic areas. It was apparent this was a seamless transition for cancer biologists, who could use available biological reagent kits, mainly phospho-specific antibodies that bind to the target protein of interest, to study signal transduction pathways, apoptosis, cell cycle and migration. Many of these types of experiments were being conducted by other methods, and high content imaging is in many ways a complementary approach that obtains similar results, with the exceptions of the high-throughput aspect, morphological cell measurements such as angiogenesis and dendrite extension in neurons, and multiplexing capabilities. Many of the early investigators measured the amount of receptor internalisation, protein transcription or protein phosphorylation in the nucleus, cytoplasm or cell membrane, or the amount of protein shuttling between different compartments within cells, following challenge by an agonist or antagonist stimulus, or in combination with a small molecule compound to inhibit or promote protein function.

There are times when antibodies do not address all of the questions posed by the biology under study, or become too expensive to use in high-throughput drug discovery programmes, and there are times when other methodologies, such as fluorescent proteins, turn out to be useful. For example, Eli Lilly developed a transgene cell model encoding a fluorescent protein reporter for p38 MAPK-k2 that was ultimately developed into a medium-throughput small molecule screen that identified novel structures. A company called Bioimage was founded on similar ideas, to create target-ready fluorescent protein-tagged cell lines for use in cell imaging technologies. Today researchers across the spectrum are using similar assays, mainly in cancer biology, but with the twist of using RNAi to knock down or partially block a signal transduction pathway to help understand MOA. Huge RNAi screening centres at private companies, biopharma and universities are looking for novel targets. At the Duke University Centre for Drug Discovery we have taken advantage of fluorescent proteins in combination with neurodegenerative disease genes to transfect live explant brain slice tissue or primary neurons, followed by small molecule treatment. We use high content imaging as a tool to determine cell survival as well as morphological characteristics, to identify drug candidates that rescue neurons from certain death or show a defined phenotype.

High content imaging is an exceptional tool for neurobiologists, and they have benefited tremendously, mainly because without an imaging approach the detection of dendrites or axons is not possible. Neurobiologists are using high content imaging to study the morphological characteristics of dendrites and axons in cell culture models and in ex-vivo models from diseased or stressed tissue in wild-type and transgenic animal model systems, with impressive results when done correctly.

Toxicologists have found a niche using this technology and continue to change the way things were once done in big pharmaceutical firms, providing novel analytical tools which are being cross-validated and implemented in some organisations as a first-tier toxicity panel. All biologists, including developmental biologists, are benefiting by using whole organisms such as the zebrafish (family Cyprinidae), the fruit fly Drosophila, and the nematode Caenorhabditis elegans. Imaging whole organisms is a more challenging approach to perform in a high-throughput mode, but some have achieved this in zebrafish by anaesthetising the animals and physically orientating them for reproducible imaging, while others work at lower throughput, sometimes without a commercial high content imager, which is not always required.

P Simpson: The major impact has been in research areas that did not previously have satisfactory cellular assays that could be readily performed at sufficient throughput. Here are three examples:

a) Complex neuronal differentiation and synaptogenesis can be measured to better understand the effects of molecules which could be potential therapeutics for neurodegenerative conditions

b) The differentiation of stem cells along multiple lineages can be tracked using fluorescent proteins or lineage-specific antibodies, enabling the simultaneous study of multiple cell types descended from a single progenitor type as separate subpopulations

c) In the cancer field, the ability of imaging quantitation algorithms to study the complexity of blood vessel morphology enables angiogenesis studies in a microtitre well that are relevant to the disease state.

3. What technological developments do you feel have had the most significant impact upon advancing research in molecular and cellular biology?

Davies: Image analysis – Historically, conducting quantitative studies by optical microscopy was often labour-intensive and relatively imprecise. With the advent of image analysis software tools, many problems in extracting numerical information from captured images have, to a great extent, been solved. Indeed, these software tools are now so effective that the challenge is how to interpret the vast amount of data produced from cellular images. In general, the precision and throughput offered by automated image analysis technologies make possible the simultaneous assessment of a multitude of biological processes in the context of an intact cellular system.
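The kind of numerical extraction Davies describes can be illustrated with a minimal, hand-rolled sketch (a toy 4×8 labelled mask with invented values, not any commercial package's algorithm): after segmentation has assigned each pixel a cell label, per-cell morphology features such as area and bounding-box extent fall out of a single scan.

```python
# Toy sketch of post-segmentation morphology extraction: a labelled mask
# (0 = background, 1 and 2 = two segmented cells) is scanned once and
# simple per-cell features are accumulated.
mask = [
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0, 2, 0, 0],
    [0, 1, 1, 0, 0, 2, 0, 0],
    [0, 0, 0, 0, 0, 2, 2, 0],
]

features = {}
for r, row in enumerate(mask):
    for c, label in enumerate(row):
        if label == 0:
            continue
        f = features.setdefault(
            label, {"area": 0, "rmin": r, "rmax": r, "cmin": c, "cmax": c}
        )
        f["area"] += 1
        f["rmin"], f["rmax"] = min(f["rmin"], r), max(f["rmax"], r)
        f["cmin"], f["cmax"] = min(f["cmin"], c), max(f["cmax"], c)

for f in features.values():
    bbox_area = (f["rmax"] - f["rmin"] + 1) * (f["cmax"] - f["cmin"] + 1)
    # Extent: fraction of the bounding box the object fills - a crude
    # compactness cue (1.0 for a filled rectangle, lower for ragged shapes).
    f["extent"] = f["area"] / bbox_area

print({k: (v["area"], round(v["extent"], 2)) for k, v in features.items()})
```

Real HCA packages compute dozens of such features per cell (intensity, texture, shape descriptors), which is precisely why interpreting the resulting data volume has become the new challenge.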

Cellular labels, stains and probes – Without these tools much of what we call High Content Analysis would not be possible. Over the last few years we have seen a dramatic increase in the availability of dyes, labels and probes for a multitude of cellular and sub-cellular targets and processes. Many of these labels have been formulated for use in living cells, which can only extend the scope of research conducted in this area.

Payne: I would say that the most significant advances have been in the area of automation, which has enabled us to exponentially increase the amount of molecular and cellular biology data that can be gathered. Only 10-15 years ago, screening plates were being prepared by hand in small batches. Until relatively recently, there was nothing available in the marketplace for large-scale cell transfection, large-scale cell preparation or cell plating, or indeed the robotics necessary to screen plates unattended. Furthermore, we have moved from screening in predominantly biochemical formats using plate reader technology to the inclusion of screens that utilise automated cellular imaging systems.

Truong: The automation and miniaturisation of cell-based assays have been key to research in this area. Both have had an impact on the cost and reproducibility of assays; miniaturised assays can be found on multi-well plates (typically down to 1536-well plates) but also on microscope slides, such as cell arrays. Automated sample preparation using robotics, and analysis with high-content readers, have had a significant impact on data quality. In particular, live-cell analysis and the use of confocality have also contributed to obtaining high-quality data.

J Simpson: Again I would point to the impact of RNA interference in combination with quantitative fluorescence microscopy as a partnership that is revolutionising our understanding of cells. It is important to note that the success of this combination in the HCA arena has only become a reality through imaging technology developments. Perhaps most significant has been the introduction of completely automated microscopy platforms with critical features such as high precision stages, autofocus capabilities and improved light sources. HCA and RNAi screening is heavily reliant on rapid image acquisition in a truly robust manner if data are to be quantitative, and therefore such systems are essential.

Trask: It is apparent from the literature and scientific meetings that RNAi, or post-transcriptional gene silencing, including the subtypes microRNA (miRNA) and small interfering RNA (siRNA), has had an impact on the high content imaging arena. Most major university research centres and biopharma companies have screened some type of RNAi library in the hope of identifying on-target, as well as off-target, gene effects. It will be years before we really know the full impact and benefits of the RNAi approach, because the number of follow-up leads researchers are undertaking is massive. Initially, though, RNAi has proved to be a great tool to “knock down” a target or signalling pathway of interest to study mechanisms of action in cells; combined with compound treatment, we are able to determine whether a drug is altering a pathway on-target and/or off-target. And the promise of stem cells, once fully validated, in combination with RNAi will be a tremendous arsenal for developmental biologists, allowing them to determine the condition and ultimate fate of cells as they differentiate into end-stage lineages. I think the roundtable discussion in a previous issue covered the benefits of stem cells and the topics which should be considered when using them in high content imaging research.

The use of fluorescent proteins is not a new idea, but the experimental design approaches in combination with newer molecular biology, such as knock-in or gene-regulated model systems in-vivo, ex-vivo or in-vitro, are very exciting. At Harvard University, Jeff Lichtman's group used the Cre/lox recombination system to switch on the expression of three or more fluorescent proteins to visualise synaptic circuits in mice, which they called Brainbow. At Duke University we are using a similar approach in combination with a drug-induced regulator to turn on a neurodegenerative gene or genes in a mature mixed ex-vivo culture system.

P Simpson: I would say that, in an HCS context, the greatest single impact has come from GFP and the subsequent explosion of tags for specific cellular proteins which can be used in living and fixed cells. The range of applications of these fluorescent probes for cellular imaging is enormous. The use of multiple labels of different colours has enabled us to understand how proteins interact and move from one compartment to another. Studying protein movement in real time, in “kinetic high content screening”, provides much clearer information for scientists to interpret compared with an endpoint antibody-based study. Moving forward, for example, mRNA fluorogenic tagging is opening up new doors in cellular analysis.

4. What developments do you believe are needed in order to further advancements within this field?

Davies: Progress in the HCA field is at present limited by a knowledge gap between the skilled HCA practitioner and the biologist at the bench. The main area where usability needs to be improved is image and data analysis. Typically these software packages can be difficult to use, resulting in users focusing on the technology rather than the biology.

To reach its full potential as a research tool, HCA must become more user friendly and intuitive with commonality across vendor packages.

The second area that needs to be addressed is HCA image format. At present most of the large vendors use their own proprietary image formats, which renders image analysis across different vendor platforms difficult. Currently the only way to overcome this problem is to purchase expensive file conversion packages (if available). The standardisation of image formats, metadata and so on is currently one of the most pressing needs within the field of HCA.

Finally, the implementation of experimental standards, such as Minimum Information About a Cellular Assay (MIACA), will become increasingly important as the requirement for research groups to share HCA data increases. Implementation of these standards will not only allow for better comparison of data sets between studies, but will also inform the development of better-described experimental procedures and working practices.


Payne:

a) More fluorescent reagents for a greater degree of multiplexing are still required to generate better phenotype-based screening assays.

b) Further simplification of HCA technology, to shorten the learning curve and reduce the infrastructure required to support it, without trading off the quality of the information delivered.

c) The ability to ask very complex biological questions using multiparameter/multiplexed assays without plate read-times having an inhibitory effect on throughput.

d) For HCA to become truly successful within the drug discovery process, further dramatic advances in instruments and the bioinformatics tools (required to manage and mine the heaps of data we can now generate) are not the only concern. Instead perhaps we should use the tools in hand to figure out the relevance of our current screening models. In part, has the decrease in our ability to develop new therapeutics been due to too many trade-offs being made for convenience and to achieve increased throughputs, e.g. the use of cell lines such as CHO and HEK, instead of more relevant primary cell lines? Do primary cell lines actually still function in a similar enough way in culture as they would in their natural environment to be meaningful? Is plastic, or glass a suitable substrate for growing cells on, or will basement membrane mimics be better?

Truong: The main benefit of HCA is that it generates data on a cell-by-cell basis. The downside of this technique, however, is that large data sets are generated, and the most important bottleneck in the field these days is the ability to analyse and mine them. There is a clear need to develop bioinformatics tools which would allow scientists to find the specific cellular information they are looking for within these large data sets. In addition, the development of tools allowing the establishment of predictive models based on HCA data would be of great relevance.

From a software perspective, the development of additional features is also needed, in particular for the analysis of live-cell assays, which seems to be of increasing interest as data on kinetics can be gathered.

J Simpson: Image analysis software has come a long way in recent years, however there is still much to do in this field. A key problem is that cellular morphology is not uniform across a population and this makes quantification of subtle phenotypes at the subcellular level difficult for automated systems. We have imaging platforms capable of producing thousands of images per day, but precise quantitative analysis of these data remains a bottleneck.

Trask: High content imaging technology platforms encompass at least three primary disciplines:

  1. Hardware components of the instrument which are typically a microscope, camera, light source to excite fluorescent probes, computer, and so on
  2. Computer software for analysis of images
  3. Informatics to manage, archive, mine, and determine meaningful results of the experiment.

Each of these components is usually adaptable for upgrade to further optimise the overall productivity of the technology. End-users should embrace new developments and continuously provide feedback on every aspect of these scientific disciplines, so that manufacturers can continually update and implement new tools to advance the technology.

One of the many problems scientists face is the question of when to use high content imaging. Automated high content screening and high content analysis may not be the answer to every biological imaging question; there are times when other techniques are required, including manually counting cells without automation. Remember that scientists are dependent on a computer-generated image and its analysis to formulate an answer. In most cases this approach is acceptable, but where two- and three-dimensional morphology or stereological visualisation by the human eye is used to reach a conclusion, the automated high content imager is not well suited; at the same time, the human eye and brain can draw false conclusions. This is clearly an awe-inspiring challenge for the hardware and software components from HCI vendors, but there is a tremendous need to evolve three-dimensional analysis so that scientists can make better, unbiased decisions without manual imaging approaches.

Without a doubt, informatics is the most influential component of this technology, judging by the questions from investigators trying to better understand the meaning of instrument-generated data. The enormous amount of metadata and numerical data generated from the analysis of a captured image can be mind-boggling to analyse without appropriate informatics tools.
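One simple way to picture this informatics challenge: each imaged well yields many numerical features, and even a basic similarity measure against an untreated control can help rank compounds by phenotypic strength. The feature names and values in this Python sketch are invented for illustration.

```python
# Sketch of mining multiparametric HCA output: rank treated wells by
# how far their feature profile sits from an untreated control.
# Feature names and numbers are hypothetical.
import math

control = {"nuc_area": 120.0, "intensity": 0.30, "cell_count": 450}
wells = {
    "cmpd_A": {"nuc_area": 118.0, "intensity": 0.31, "cell_count": 440},
    "cmpd_B": {"nuc_area": 60.0,  "intensity": 0.90, "cell_count": 120},
}

def distance(profile, reference):
    """Euclidean distance over features, each scaled by the control value."""
    return math.sqrt(sum(
        ((profile[k] - reference[k]) / reference[k]) ** 2
        for k in reference))

# Strongest phenotype (largest deviation from control) first
ranked = sorted(wells, key=lambda w: distance(wells[w], control),
                reverse=True)
print(ranked)  # -> ['cmpd_B', 'cmpd_A']
```

Real profiling pipelines use hundreds of features and more robust normalisation, but the principle of reducing a feature vector to an interpretable ranking is the same.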

Assay development and the optimisation of assay conditions for high content image-based drug discovery screens continue to be an important hurdle. The systematic development of assay models, kits, and new reagents, together with creative scientists thinking outside the box, is needed to advance this technology and to enhance our understanding of the science. The development of new instrumentation components, such as more sensitive cameras, better light sources, and improved optical paths, could also benefit researchers in the future. This is an exciting time.

P Simpson: HCS has now become relatively mature and has found a settled niche. In the current financial climate, it is more important than ever to reserve expensive, resource-intensive approaches like HCS for applications where they are business critical and add substantial value. I think that at the moment the available HCS hardware is adequate for many purposes. The algorithms required for more complex assays can still, in some cases, be relatively laborious to set up for a new application, but HCS groups often have dedicated IS support. So perhaps the greatest remaining constraint is that virtually all HCS assays still rely solely upon fluorescence intensity, with its inherent limitations in sensitivity and in discrimination between different labels and locations.

5. In your opinion what are the driving factors moving the industry forward and where do you envisage the major growth within this market occurring over the next few years?

Davies: Smaller and cheaper cell-based assays for HCA. To sustain the viability of large-scale research programmes in academia and industry, especially given the current global economic climate, it will be necessary to reduce costs further. With the advent of micron-resolution robotics and nanolitre-capable liquid handlers, large-scale, automated assay miniaturisation is now possible. The advantages of miniaturisation are clear when one considers the savings in reagents and experimental materials. As such, I would expect to see growth in miniaturised assay systems such as lab-on-chip, e.g. cell-based arrays.
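The reagent savings from miniaturisation can be sketched with back-of-the-envelope arithmetic. The screen size and per-well working volumes below are illustrative assumptions, not figures from the discussion.

```python
# Back-of-the-envelope reagent savings from assay miniaturisation.
# Screen size and per-well working volumes are assumed figures.
wells_total = 100_000  # hypothetical screen size (number of wells)

# Typical working volume per well, in microlitres, for each format
ul_per_well = {"96-well": 100.0, "384-well": 25.0, "1536-well": 5.0}

# Total assay volume, in litres, needed for the whole screen
litres = {fmt: wells_total * ul / 1e6 for fmt, ul in ul_per_well.items()}

for fmt, vol in litres.items():
    print(f"{fmt}: {vol:.1f} L of assay volume")
# -> 96-well: 10.0 L, 384-well: 2.5 L, 1536-well: 0.5 L
```

A twenty-fold reduction in total volume translates directly into reagent cost, which is the economic argument behind assay miniaturisation.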

Payne: The first driving factor seems to be understanding the disease in greater depth. Large-scale genome-wide studies to identify gene function, in combination with genotype analysis to identify disease-relevant/predictive-outcome biomarkers, are crucial areas of research.

The desire to perform high throughput, target-based or phenotypic screens using disease relevant cellular models is also an area of focus right now. Secondary follow-up with in-depth mechanism of action profiling is proving popular for the prediction of both on-, and equally importantly, off-target effects at an earlier stage of the drug discovery cascade. Profiling techniques against many different signalling cascades may also help to develop drug regimes that involve more than one pharmaceutical. Disease states are multi-factorial and the specific factors involved can change with different population subsets. The ability to profile treatment responses in greater depth will therefore help develop novel regimes that can be focussed towards patient subsets.

To keep up with the market driving factors mentioned above, I envisage advances in cellular screening technologies for yes/no decision making over the next few years. These will include developments in instrumentation, but also plate/chamber-based technologies developed to represent a more biologically relevant environment.

Currently there is a trade-off between throughput and the quality of biological information gained at the beginning of the drug discovery process. Recent history may suggest that the decrease in the number of drugs reaching the market place could, in part, result from this balance being tipped too far in the wrong direction. As such, I believe that the ability to rapidly perform large-scale profiling screens on focussed compound sets, perhaps against many different cell lines/primary cells, without compromising throughput too heavily, is where we will see some activity. Additionally, any screening tools that address the concern of biological relevance, such as advances in microchambers, would be beneficial.

Truong: As mentioned before, the quality of the information and data obtained with HCA will be the most important factor for acceptance in the industry. This will largely depend on the software solutions provided to researchers for demonstrating the significance of their data. A cost/benefit analysis will then determine whether the technology grows further. In the screening world, it remains to be seen whether HCS will complement more traditional approaches, such as biochemical screens, or partially or completely replace them.

J Simpson: In these difficult economic times both industry and academia need to maximise their output from the more limited resources available. Obtaining more information from each individual experiment is becoming increasingly important, and so I envisage more routine use of multiplexing and multi-phenotypic analyses using HCA approaches. There will also be growth in the application of live cell time-lapse assays over end-point assays, as this significantly enhances the richness of data obtained relative to only modest increases in cost.

Trask: Certainly the manufacturers play a role in making highly reliable, sophisticated instruments, but it is the end-users of the technology who ultimately push the scientific envelope, breaking through new barriers to showcase it. One example is a new instrument, Cell Voyager, just introduced by Yokogawa, which will be useful for addressing scientists' high throughput live-cell kinetic imaging questions. In this case the manufacturer, Yokogawa, determined the driving factors through a diligent tour around the world before producing the new instrument; in other cases, companies not mentioned here have adopted new tools based on end-user feedback. If we compare high content imaging technology to another well-established technology such as flow cytometry, then the future is brilliant.

For this technology to build on its early successes, the scientific community will need to cultivate a society in which investigators can contribute, showcase, and exchange ideas at annual meetings and in journal publications. Does the name "Journal of High Content Imaging" sound appealing? I expect to see either the formation of a new society, or an existing one such as the International Society for Analytical Cytometry (ISAC) or the Society for Biomolecular Sciences (SBS) embracing all aspects of the technology. In the past, both ISAC and SBS have reached out to these scientists, since many are members of one or both societies or attend the meetings they sponsor. Also expect to see the manufacturers of these instruments adopt industry-wide standards for image file formats; although not trivial to accomplish, this is something that has been lacking for years. A few scientists are discussing the use of standards to check the performance of the instruments, since none currently exists, so expect to hear more about this in the future.

I speculate that major growth in the HCI market will come in developmental biology, and that the field will gain even more exposure with the continued expansion towards the use of stem cells, so long as we fully understand and validate the differentiated cell types, as previously mentioned. Furthermore, ex-vivo, explant tissue, and whole invertebrate organism imaging will be explored. We will likely see glimpses of in-vivo imaging adapting some high content imaging components to streamline higher throughput approaches, but ultimately growth in information technology will be needed to simplify the process of understanding multiparametric high content information, so expect an expansion of improved informatics and the release of newer software tools.

As a side note to the readers, I recently set up a website on Google Groups for users of high content imaging to exchange ideas and communicate, which is open to all at

P Simpson: To a large extent, the direction this goes in will be up to the vendors. If they are satisfied with the instrumentation they are currently producing, the growth will be at the bottom end, in greater use of HCS within the academic community as the costs become manageable and most of the required algorithms become available off-the-shelf. Higher-end growth in the pharmaceutical industry, in contrast, can only be enabled by genuine technological innovation. The next leap forward may come from a vendor looking to innovate rather than produce yet another microscope in a box.

Opportunities to deliver a package that makes kinetic high content screening a routine tool have been around for a while but have not yet been fully realised. There are other opportunities for vendors to innovate too: new reagents and better detection technologies would facilitate greater use of other types of fluorescence, such as FLIM, in routine instrumentation and assay applications.