HCS (High Content Screening) - Articles and news items
Issue 3 2011 / 20 June 2011 / Willem G.E.J. Schoonen, Walter M.A. Westerink, Femke M. van de Water and G. Jean Horbach, Department of Toxicology & Drug Disposition, Merck Sharp & Dohme
The application of High Content Screening to in vitro toxicity testing is a relatively new approach in the preclinical research phase of drug development. A battery of tests has been developed for screening general parameters such as cytotoxicity, while more dedicated assays are available for the identification of genotoxicity, phospholipidosis, steatosis and cholestasis. All these tests are highly valuable within the pharmaceutical industry, both for the selection of appropriate candidates for drug development and for reduction of the attrition rate.
High content screening (HCS) is quickly growing in popularity within the field of in vitro toxicity testing. The maturity of HCS equipment and software has made HCS accessible to many technicians and scientists working in cellular and molecular biology. Although the technique was introduced in the mid-1990s, the simplification of the software, the growth in computer storage capacity and the improved resolution of digital microscope cameras have greatly increased the accessibility of this equipment. At the start of HCS technology, many scientists were sceptical about the technique, as image-based mathematical algorithms had to be written for the analysis. (more…)
Issue 6 2010 / 16 December 2010 / Carl A.K. Borrebaeck and Christer Wingren, Department of Immunotechnology and CREATE Health, Lund University
Deciphering crude proteomes in the quest for candidate biomarker signatures for disease diagnostics, prognostics and classification has proven to be challenging using conventional proteomic technologies. In this context, affinity protein microarrays, and in particular recombinant antibody microarrays, have recently been established as a promising approach within high-throughput (disease) proteomics1-3. The technology provides miniaturised set-ups capable of profiling numerous protein analytes in a sensitive, selective and multiplexed manner. (more…)
Issue 5 2010 / 1 November 2010 / Andreas Vogt, Department of Pharmacology and Chemical Biology and the University of Pittsburgh Drug Discovery Institute, University of Pittsburgh
Cell motility plays an important role in many human diseases and normal cellular processes. Cell migration is critical for wound healing as cells of the inflammatory system and fibroblasts populate the wound and initiate re-epithelialisation1. On the other hand, unregulated cell migration contributes to cancer cell invasion and metastasis2. Agents that affect cell motility, either positively or negatively, could therefore find applications as promoters of wound healing or as antimetastatic drugs. Cell migration in a biological context is an extremely complex process and the understanding of genetic and biochemical determinants remains incomplete. (more…)
Issue 4 2010 / 19 August 2010 / Karol Kozak, Angela Bauch, Gabor Csucs, Tomasz Pylak & Bernd Rinn, ETH Zurich
As High Content Screening (HCS) has moved into the mainstream for biological and pharmaceutical investigations, the lack of well-integrated pipelines for automated acquisition, management and analysis of HCS results has become a bottleneck for fully leveraging the wealth of information contained in a screen and for moving to higher throughput. For many applications, monolithic pipelines cannot deliver the flexibility and versatility needed. Laboratories and scientific service providers instead usually look to integrate components from both the open-source and the commercial software worlds into best-of-breed data pipelines. In this article, we present two open-source components that can be used as flexible and powerful building blocks for such a pipeline. (more…)
Issue 3 2010 / 25 June 2010 / Peter Alcock, Colin Bath, Carolyn Blackett & Peter B. Simpson, Screening & Assay Sciences, Cancer Bioscience, AstraZeneca Alderley Park
Over the last 15 years, vendors have offered microscope-based instruments capable of producing images of fluorescently labelled components of cells grown in microtitre plates. These instruments are typically bundled with analysis software capable of defining the relative distribution of several fluorescent markers on a cell-by-cell basis1,2. As the readers have improved and image acquisition and analysis times have shortened, the potential for screening larger compound libraries has presented itself. High Content Screening (HCS), i.e. the generation of multiparameter data from a single well, has thus become an important tool in the High-Throughput Screening (HTS) laboratory. (more…)
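As a minimal, hypothetical illustration of what "multiparameter data from a single well" means in practice (the field names below are invented for this sketch, not taken from any vendor's software), cell-level measurements from the analysis software are typically reduced to well-level statistics before screening decisions are made:

```python
import statistics

# Hypothetical per-cell measurements from one well: each dict holds
# several fluorescence-derived parameters for a single segmented cell.
cells = [
    {"nucleus_area": 120.0, "marker_intensity": 0.8},
    {"nucleus_area": 95.0,  "marker_intensity": 0.4},
    {"nucleus_area": 110.0, "marker_intensity": 0.9},
]

def summarise_well(cells, parameter):
    """Reduce cell-level values for one parameter to a well-level
    mean and standard deviation."""
    values = [c[parameter] for c in cells]
    return {"mean": statistics.mean(values),
            "stdev": statistics.stdev(values)}

print(summarise_well(cells, "nucleus_area"))
```

Because every well yields many such parameters at once, the same reduction is simply repeated per parameter, which is what distinguishes an HCS readout from a single-endpoint HTS readout.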
Issue 1 2010 / 22 February 2010 /
The World Cancer Report (2008) predicts a 50% worldwide increase in cancer incidence by 2030, with 75 million people living within five years of a cancer diagnosis1. This increase is partially fuelled by significant medical advances in developed countries ensuring people live longer. However, it is also attributable to developing countries adopting habits linked to cancer risk, such as increased uptake of smoking and the adoption of western diets. In 2007, cancer caused approximately 7.6 million deaths, or 13% of all human deaths2. The cancers associated with the greatest mortality are, in order, lung, stomach, colorectal, liver and breast cancer. There are modifiable risk factors common to many malignancies, including tobacco use, overweight or obesity, physical inactivity, dietary factors, alcohol, sunlight exposure and chronic infection. Effective prevention will reduce the risk of cancer, and efficient screening will enable many to be successfully treated for their disease.
Issue 6 2009 / 12 December 2009 /
Automated high content screening platforms are capable of producing thousands of images per day. The challenge is to use appropriate analysis methods to extract the maximum amount of biologically-relevant information from these images. In this article we summarise the basic concepts of image analysis and highlight examples of both open-source and commercial software that are available for use with image data sets generated using high-throughput methods. (more…)
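One of the basic image-analysis concepts the article refers to is segmentation followed by object counting: threshold the image, then label connected bright regions. The sketch below is a deliberately minimal illustration of that idea on a synthetic image, using NumPy and SciPy; the threshold value and the notion of "objects" as contiguous bright pixels are simplifying assumptions, and real HCS pipelines add illumination correction, watershed splitting of touching cells and per-object feature extraction.

```python
import numpy as np
from scipy import ndimage

def count_objects(image, threshold):
    """Count connected bright regions (e.g. stained nuclei) whose
    pixel intensity exceeds a global threshold."""
    mask = image > threshold          # binary segmentation
    _labels, n = ndimage.label(mask)  # connected-component labelling
    return n

# Synthetic "image": two bright spots on a dark background
img = np.zeros((20, 20))
img[2:5, 2:5] = 1.0
img[10:14, 10:14] = 1.0
print(count_objects(img, 0.5))  # → 2
```

Both the open-source packages (e.g. CellProfiler, ImageJ) and the commercial tools discussed in the article build on this same segment-then-measure pattern, differing mainly in how robustly they handle real microscopy artefacts.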
Issue 4 2009 / 30 July 2009 /
Understanding the properties of any biological system requires a detailed and quantitative analysis of its parts and their interactions. As different processes within a system occur at defined points in space and time, each process calls for its own optimal observation and investigation technique. One of the most powerful tools for analysing biological samples quantitatively is fluorescence microscopy. Comprehensive studies of diverse biological processes have recently been performed using fluorescence screening microscopy, which has expanded considerably over the last decade1,2,3. (more…)
Issue 6 2008 / 3 December 2008 /
Approximately 45% of all deaths and 50% of all hospitalisations in the western world are a direct result of cardiovascular disease. Cardiomyocyte hypertrophy is a mechanism by which myocardial mass is increased to compensate for elevated physical demands placed upon the heart, thus ensuring that adequate perfusion of body tissues is maintained during these periods. However, if the hypertrophic response persists, the heart undergoes a critical transition from a compensatory to a pathophysiological decompensatory state, which eventually leads to heart failure.
The development of new therapeutic tools for the treatment of this condition has in many cases been hampered by the lack of biologically relevant experimental models on which new treatments can be tested.
With the advent of laboratory automation technologies, it is now possible to screen libraries comprising hundreds of thousands of potential therapeutic agents. These technological innovations have increased the demand for cell-based experimental models that can be used in conjunction with research platforms such as High Content Screening technologies. (more…)
Issue 5 2008 / 29 September 2008 /
Traditional drug discovery screening assays tend to employ simplistic endpoint readouts that often monitor the activity of a single target. While these approaches are amenable to high-throughput screening, they provide limited information on how candidate drugs influence the complex biological systems that exist in vivo. Such limitations contribute to the high attrition rate of drugs that fail in clinical trials through poor efficacy.
Issue 4 2008 / 2 August 2008 /
Cell-based assays are essential for drug discovery and development as they increase the quality of lead compounds due to their physiological relevance. Toxicological data can be gathered during the early phases of hit selection and verification, reducing costs and attrition rates during clinical trials.
Issue 3 2008 / 19 June 2008 /
Phenotypic drug discovery (PDD) has come of age – again. Using a microscope to observe a cell is one of the oldest techniques available to a cellular biologist, dating back to the 17th-century studies of Antony van Leeuwenhoek and his characterisation of ‘animalcules’. These early analyses, which simply described the appearance of a cell or group of cells, are the basis for today’s phenotypic assays. Although this early work might seem irrelevant when compared to the powerful array of tools that modern science brings to bear on a problem, the use of cellular phenotype as a method of scientific investigation evolved over time into the primary method of drug discovery up through the 1970s. Cellular phenotypes, and the phenotypic changes induced upon compound treatment, were commonly followed in both basic science and drug discovery1-3, and even led to the development of early forms of automated data acquisition and analysis.
Although early phenotypic discovery was successful in bringing drugs to market, in the 1980s the pharmaceutical industry widely switched its focus to a target-based approach in which large chemical libraries were screened against individual enzymatic targets in vitro. This target-based approach to drug discovery (TDD), used in conjunction with high-throughput screening and combinatorial chemistry, has remained the industry standard for several decades. Recently, however, productivity in pharmaceutical research and development, as measured by the number of new molecular entities (NMEs), has been declining, and it is clear that methods complementary to TDD must be employed5,6. Aided by advances in instrumentation, informatics and chemical libraries, PDD is now used as an additional route to bring novel therapies to market. PDD is an ideal complement to TDD, which focuses on a compound’s effect on a single target in vitro but ignores potential effects on other targets, both positive and negative. (more…)