
The changing role of automation in High Throughput Screening

Posted: 10 January 2009 | Ulrich Schopfer, Director and Head of Biochemical Screening, Novartis Institutes for BioMedical Research

Among the challenges for the pharmaceutical industry, declining research productivity and increasing research costs take a prominent position. This is often put in the context of efforts in the pharmaceutical industry to automate and “industrialise” research activities, combinatorial chemistry and High Throughput Screening being the most prominent examples. An argument is being put forward that the industry replaced scientists with robots and scientists’ ingenuity with mindless screening. It is then concluded that the investments into automation were misguided and led to a decline in research productivity.

On the other hand, many academic institutions are building screening centres of their own, and chemical biology and chemical genetics are well-established fields in academic research. Obviously, there is another side to automation, one concerned with advancing science rather than driving down costs. In this contribution we try to resolve the apparent contradiction and outline the future role of automation in drug discovery.

The development of High Throughput Screening

High Throughput Screening (HTS) emerged in the early 1990s as a concept that linked the ability of combinatorial chemistry to generate large compound libraries with automated screening to provide cost-effective, high-throughput methods of testing. In the two decades since then, an impressive array of automation and assay technologies has been developed. From the early definition of microtiter plate standards, through the development of multi-parallel liquid handling robots and homogeneous assay formats, to today’s uHTS factories with throughputs of 100,000 wells per day and nanoliter dispensing technologies, the field has matured. Efforts to “industrialise” HTS were directed at achieving throughput, reducing assay volume and keeping costs at a reasonable level as libraries grew into the millions of compounds. An intense focus on reducing variability at all stages of the process has eliminated many of the earlier weaknesses in High Throughput Screening, so that high data quality can be achieved today.
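The data quality mentioned above is conventionally quantified with the Z′-factor (Zhang et al., 1999), which compares the separation of positive and negative control signals to their variability. As a minimal illustration (the function name and control values are our own, not from the article):

```python
import statistics

def z_prime(pos_controls, neg_controls):
    """Z'-factor: a standard QC metric for HTS assays.

    Compares the 3-sigma bands of the positive and negative controls to
    their dynamic range. Values above ~0.5 are generally considered to
    indicate a screening-quality assay.
    """
    mu_p = statistics.mean(pos_controls)
    mu_n = statistics.mean(neg_controls)
    sd_p = statistics.stdev(pos_controls)
    sd_n = statistics.stdev(neg_controls)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Tight controls with a wide separation give a Z' close to 1:
quality = z_prime([95, 98, 97, 96], [5, 6, 4, 7])
```

Minimising well-to-well and run-to-run variability, as described above, raises this score directly, which is why process control mattered so much as libraries grew.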

Process-driven screening

With the success of automation measured in terms of increased throughput, a process-driven screening paradigm developed that is mainly characterised by the capabilities of large, automated screening factories. The mindset that tried to “industrialise” screening led to a focus on process optimisation, borrowing process-improvement methodologies such as Six Sigma from other industries to reduce failure rates and drive down costs. However, the impressive successes came at the price of inflexible systems that were optimised to perform a limited set of tasks. Not only does it take a long time to build such systems, it also takes considerable time to adapt an assay to the capabilities and processes of an HTS system.

Hypothesis-driven screening

However, increased productivity in research does not necessarily equate to reduced operational costs. There is a fundamental misunderstanding about the role of automation in science. Automation is not used to replace experimental design with mindless screening; it simply allows experiments to be done faster and with less variation. It thus allows scientists to probe a scientific question more comprehensively and to test more variables – but it is the scientific hypothesis and the validity of the experimental design that determine the relevance of the resulting data. Automation relieves scientists of repetitive, manual tasks and enables experiments on a scale that is not otherwise feasible. Automation is therefore a prerequisite for scientific breakthroughs in systems biology, in the elucidation of signalling pathways or in proteomics, to name just a few.

HTS practitioners are shifting their focus from technology development to the skilful application of screening technologies to investigate biological problems. A well-defined scientific question leads to the formulation of a robust experimental strategy designed to answer that specific question. The results, regardless of whether they support or reject the hypothesis, in turn help to define the next round of experiments. Hypothesis-driven screening combines input from virtual screening, structural biology and other approaches to develop a powerful strategy for both target and lead discovery.

Next generation automation systems

Logistics, automation and data handling systems will have to be adapted to support the new paradigm. HTS systems will have to be designed with a flexibility that supports experimentation, rather than implementing an industrialised process. Next generation screening systems will have to be smaller and faster to build, because user needs will change faster than the current project implementation timelines allow. Given that technologies often have a lifecycle as short as three to five years, project timelines of more than one year from planning to implementation are no longer affordable. Systems will have to be modular, allowing an easy exchange of components or re-design of processes. This flexibility will have to be supported by standardised, plug-and-play interfaces and intuitive user interfaces. Designs that allow an easy and quick setup and easy data transfer should reduce the need for internal automation groups as more “off-the-shelf” technology replaces systems that require extensive customisation.
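The modularity and plug-and-play interfaces called for above can be sketched in software terms. The sketch below is purely illustrative – the `Instrument` interface and the device classes are hypothetical, not part of any real scheduling product – but it shows the design idea: if every module speaks the same minimal interface, components can be exchanged or re-ordered without rebuilding the system.

```python
from abc import ABC, abstractmethod

class Instrument(ABC):
    """Hypothetical plug-and-play interface: any device that implements
    process() can be slotted into a workflow without custom integration."""

    @abstractmethod
    def process(self, plate_id: str) -> str:
        ...

class Dispenser(Instrument):
    """Stand-in for a liquid handling module."""
    def process(self, plate_id: str) -> str:
        return f"{plate_id}: dispensed"

class Reader(Instrument):
    """Stand-in for a plate reader module."""
    def process(self, plate_id: str) -> str:
        return f"{plate_id}: read"

def run_workflow(plate_id: str, modules: list[Instrument]) -> list[str]:
    """Pass a plate through an arbitrary, re-orderable chain of modules."""
    return [m.process(plate_id) for m in modules]

# Swapping or re-ordering modules changes the process, not the system:
log = run_workflow("Plate-001", [Dispenser(), Reader()])
```

In this picture, replacing a dispenser with a newer model means writing one small adapter class, not re-engineering the screening line – the software analogue of the standardised hardware interfaces the article calls for.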

Conclusion

Rather than being part of the problem, automation will continue to play an important role in increasing research productivity. Rather than replacing scientific thinking, automation will allow scientists to do smarter, more comprehensive experiments and to ask complex questions on a larger scale. To achieve this, next generation automation systems will have to be adaptable to the changing needs of experimentation, rather than forcing scientists to adapt their experiments to the capabilities of the automation.