
Trends in laboratory automation: From speed and simplicity to flexibility and information content

Posted: 24 June 2010

The pharmaceutical industry has significantly influenced laboratory automation trends over the past two decades. The need to screen large collections of chemical entities in a short time, with minimised consumption of reagents, has driven strong demand for parallelisation, automation, simplification and miniaturisation solutions from suppliers of instruments, labware and assay technologies. Currently, the levels of automation and miniaturisation seem to have reached a plateau, and the new paradigms are flexibility and information content.

Automation in the Pharmaceutical Industry


Since the rise of high throughput screening (HTS) as the central process in early drug discovery during the 1990s, laboratory automation has undergone huge transformations in response to successive paradigms. The following sections trace the evolution of different aspects of automation up to the current situation.

The microtitre plate

Originally proposed by a Hungarian physician in 1951, it was not until the mid-1960s that the first commercial 96-well plates appeared on the market, used especially for applications requiring serial dilutions. With the blooming of ELISA (1970s) and PCR (1980s) techniques, microtitre plate usage increased exponentially and the first bespoke liquid handlers and readers were developed. HTS then came into play, and drug discovery organisations liaised with technology suppliers to develop standardised plate layouts of higher well densities. By multiplying the number of wells by four within the same footprint, 384-well and then 1536-well plates were made available for a wide range of applications, and HTS labs gradually increased their capacity and throughput by transitioning to these new formats in sequence. Higher-density formats (such as 3456 wells) have also been developed, although their usage has been less widespread. Currently, most HTS facilities run their screens preferentially in 1536-well format, while 384-well plates remain the option in cases where further miniaturisation is not feasible. Further miniaturisation is now viewed as too challenging and/or not required, so the industry seems to have reached a plateau in this respect. An alternative to microtitre plates is the microfluidic device (chip): these devices enable greater miniaturisation as they circumvent issues related to evaporation and surface-to-volume ratio. However, the cost-benefit balance is not viewed as favourable by every organisation, and they also present intrinsic problems such as channel clogging.
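To make the four-fold scaling concrete, the minimal Python sketch below lays out how well count, layout and well pitch change across the 96, 384 and 1536 formats on the standard ANSI/SLAS microplate footprint; the working-volume ranges are rough, indicative assumptions rather than vendor specifications.

```python
# Illustrative sketch of the 4x density progression on the standard
# ANSI/SLAS microplate footprint (127.76 mm x 85.48 mm).
# Working-volume ranges are rough, indicative assumptions, not vendor specs.

FOOTPRINT_MM = (127.76, 85.48)  # standard microplate footprint, same for all formats

FORMATS = [
    # wells, (rows, cols), nominal well pitch (mm), typical working volume (uL)
    (96,   (8, 12),  9.0,  "100-300"),
    (384,  (16, 24), 4.5,  "20-100"),
    (1536, (32, 48), 2.25, "2-10"),
]

def describe(fmt):
    wells, (rows, cols), pitch, volume = fmt
    assert wells == rows * cols  # sanity check: layout matches well count
    return (f"{wells:>5} wells  {rows:>2} x {cols:<2}  "
            f"pitch {pitch:>4} mm  ~{volume} uL/well")

if __name__ == "__main__":
    print(f"Footprint: {FOOTPRINT_MM[0]} x {FOOTPRINT_MM[1]} mm")
    for fmt in FORMATS:
        print(describe(fmt))
```

The key point is that the footprint never changes: each density step simply halves the well pitch in both directions, so liquid handling and detection, rather than the plate itself, become the limiting factors.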

The liquid handler

Several evolutions have taken place in this field. In the parallelisation arena, there has been a ramp from multi-channel pipettes up to 1536-channel pipetting heads; currently, the most common formats are 384 channels for pipetting heads and eight channels for dispensing units or ‘broadcasters’. The most impressive advances, however, have been made in the field of non-contact dispensing. These techniques allow the addition of solutions into wells without any part of the liquid handler ever contacting the contents of the wells. For reagent dispensing applications, peristaltic pumps remain a popular choice because of their low cost and speed of operation, and syringe-driven micro-solenoid valve instruments have also proven useful for these kinds of applications. However, stamping tiny amounts of collection compounds onto assay plates is a more challenging task, for which exotic technologies from other fields have been adopted. Piezoelectric-driven pumps, used for decades in inkjet printers, have been successfully adapted to dispensing biological reagents and pipetting compounds in low-nanolitre amounts. Even more striking was the recent application of sonic energy to propel tiny amounts (a few nanolitres) of compound stock solutions onto assay plates with great accuracy and precision.
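To give a feel for the arithmetic behind acoustic transfer, here is a minimal sketch assuming a fixed droplet increment of 2.5 nL (a typical value for acoustic dispensers); the function and the example numbers are illustrative assumptions, not taken from any particular instrument.

```python
# Hypothetical sketch of acoustic-transfer arithmetic.
# Assumes a fixed 2.5 nL droplet increment; all names and numbers are illustrative.
import math

DROPLET_NL = 2.5  # assumed droplet size delivered by the acoustic dispenser

def acoustic_transfer(requested_nl, stock_mM, assay_volume_uL):
    """Round the requested volume up to whole droplets and report what
    actually ends up in the assay well."""
    droplets = math.ceil(requested_nl / DROPLET_NL)
    transferred_nl = droplets * DROPLET_NL
    assay_volume_nl = assay_volume_uL * 1000.0
    final_uM = stock_mM * 1000.0 * transferred_nl / (assay_volume_nl + transferred_nl)
    dmso_percent = 100.0 * transferred_nl / (assay_volume_nl + transferred_nl)
    return droplets, transferred_nl, final_uM, dmso_percent

# Example: 30 nL of a 10 mM DMSO stock into a 5 uL assay well (1536-well scale)
droplets, vol_nl, conc_uM, dmso = acoustic_transfer(30, 10, 5)
print(f"{droplets} droplets = {vol_nl} nL -> {conc_uM:.1f} uM final, {dmso:.2f}% DMSO")
```

Run as written, the example reports 12 droplets (30 nL) giving roughly 60 µM compound and about 0.6 per cent DMSO in a 5 µL well, which illustrates why direct transfer of nanolitre amounts of stock solution keeps solvent levels low even in miniaturised formats.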

The microplate reader

The first microplate readers were colorimeters that used the same detection systems as classical readers in biochemistry laboratories, differing only in the sample container and holder, which was no longer a cuvette but a microtitre plate. Creative thinking has since allowed configuration of versatile readers with optics capable of reading absorbance, fluorescence in its numerous flavours, and luminescence. The greatest revolution in terms of reading speed was the replacement of one-well-at-a-time detectors with charge-coupled device (CCD) cameras, which can take an image of the whole plate in almost any reading mode using telecentric lenses. These readers can read any plate format in a short time, enabling high throughput, although photomultiplier tubes and photodiodes are still used as they offer some advantages over CCD cameras. High-content assays are becoming more popular, and these require instruments with great spatial resolution, which can be achieved using confocal optics or line-scanning devices.
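A rough, hypothetical comparison helps to show why whole-plate imaging changed reading speed: a point detector scales linearly with the number of wells, while a camera captures the plate in one or a few exposures. The per-well and per-exposure timings below are invented for illustration and are not specifications of any reader.

```python
# Rough, hypothetical comparison of sequential (PMT) vs imaging (CCD) plate reads.
# Timings are illustrative assumptions only, not instrument specifications.

def sequential_read_s(wells, seconds_per_well):
    """Point detector visiting one well at a time."""
    return wells * seconds_per_well

def imaging_read_s(exposures, seconds_per_exposure, overhead_s):
    """Camera capturing the whole plate in one or a few exposures."""
    return exposures * seconds_per_exposure + overhead_s

for wells in (96, 384, 1536):
    pmt = sequential_read_s(wells, seconds_per_well=0.1)                 # assumed 0.1 s/well
    ccd = imaging_read_s(exposures=1, seconds_per_exposure=2.0, overhead_s=3.0)
    print(f"{wells:>5} wells: sequential ~{pmt:5.0f} s, imaging ~{ccd:.0f} s")
```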

The automation suite

From the early 1990s up to the mid-2000s, the trend was clearly towards as much automation and throughput as possible. This led to the ‘industrialisation’ of HTS, i.e. the implementation of fully integrated robotic platforms, process quality assurance and round-the-clock operation. This evolution required a radical change in mindset and the enrichment of staff with engineering, physics, statistical and informatics profiles. Large investments were made in screening ‘factories’, and the pressure was then exerted upon assay technology suppliers: industrial operation requires simple and robust assays. A wide variety of homogeneous assays were made available to drug discovery scientists, using creative and innovative ways to measure molecular interactions and biochemical reactions in a fast and simple manner. Needless to say, as with all simplifications, screening hits then need more complex and slower analyses to discriminate false from true effectors of the biological process of interest and to characterise the latter. In more recent years, automation suites have been built with less focus on throughput and more attention to flexibility and information content. A combination of bench-top instruments, workstations and fully integrated but easily reconfigured robotic systems is now the fashion. Moreover, homogeneous and simple-output assays are no longer a sacred cow, and high-information-content assays are increasingly considered as primary assays for HTS even if their throughput is not as high as that of simpler alternatives. The rationale behind this change is the quest for the best possible compromise between biological relevance of the results and speed.

Data analysis tools

The massive wealth of information generated each day in a discovery organisation clearly requires tailored solutions for automation of data acquisition, analysis of results, quality assurance and data integrity. Additionally, advanced statistical and data mining tools become indispensable. The screening scientist deals with a variety of software applications that ensure on-line analysis, process control and data integration in relational databases, as well as exploration of data sets and generation of reports. The presence of strong IT groups developing, implementing and supporting these applications is crucial, and hybrid professional profiles that combine biological or chemical and informatics skills are becoming increasingly common.
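As one concrete example of the routine statistics such tools automate, the sketch below computes percent inhibition and the widely used Z′-factor plate-quality metric from control wells; the toy control values are invented for illustration, while the formulas themselves are standard.

```python
# Illustrative per-plate QC: percent inhibition and Z'-factor from control wells.
# Control values are invented toy data; the formulas are the standard ones.
from statistics import mean, stdev

def percent_inhibition(signal, high_mean, low_mean):
    """Normalise a raw signal to 0-100 % effect between the plate controls."""
    return 100.0 * (high_mean - signal) / (high_mean - low_mean)

def z_prime(high_controls, low_controls):
    """Z' = 1 - 3*(sd_high + sd_low) / |mean_high - mean_low|."""
    return 1.0 - 3.0 * (stdev(high_controls) + stdev(low_controls)) / abs(
        mean(high_controls) - mean(low_controls))

# Toy control values (arbitrary units) standing in for one plate's controls
high = [1000, 980, 1020, 995, 1010]   # uninhibited signal
low = [105, 95, 100, 98, 102]         # fully inhibited signal

print(f"Z' = {z_prime(high, low):.2f}")
print(f"Sample well at 550 -> {percent_inhibition(550, mean(high), mean(low)):.1f} % inhibition")
```

In practice such calculations run automatically on every plate, flagging those whose Z′ falls below an agreed threshold before the data reach the corporate database.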

High-content assays and the advantage of failing early

With focus put on the patient, the real objective is to increase the success rate of discovery activities as a whole. Innovative thinking applied to drug discovery will likely reduce the number of projects that have to be terminated due to lack of target validation, intractability of hits, or poor safety and efficacy of promising compounds. Unfortunately, though, the high attrition rate that has characterised drug discovery efforts is likely to persist for some years. The intelligent approach, once we accept that attrition cannot be completely avoided, is to reduce its impact on operational costs. One important factor in achieving this goal is to shift attrition to the earliest possible point in the lifespan of discovery programmes. This minimises the cost of dead-end efforts and frees up resources sooner. Deployment of these freed-up resources to other projects also enhances the overall success rate of the discovery organisation.
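A back-of-the-envelope sketch, with purely hypothetical stage costs, makes the argument tangible: the sunk cost of a terminated project is the sum of the stages it passed through, so pulling the point of failure back by even one stage releases that stage's budget for other programmes.

```python
# Hypothetical illustration of why earlier attrition is cheaper.
# Stage names and costs are invented for the example; they are not real figures.

STAGES = [
    ("target validation",   1.0),  # relative cost units per stage (hypothetical)
    ("hit identification",  2.0),
    ("lead optimisation",   5.0),
    ("candidate selection", 8.0),
]

def sunk_cost(failed_after_stage):
    """Total spend on a project terminated after the named stage."""
    total = 0.0
    for name, cost in STAGES:
        total += cost
        if name == failed_after_stage:
            return total
    raise ValueError(f"unknown stage: {failed_after_stage}")

early = sunk_cost("hit identification")
late = sunk_cost("lead optimisation")
print(f"Fail after hit ID: {early} units; fail after lead opt: {late} units "
      f"({late - early} units freed by failing one stage earlier)")
```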

Implementation of more complex technologies in the first phases of the drug discovery process is expected not only to reduce the time to succeed, but also to shorten the time to fail. Using assay systems that closely resemble the actual biological processes we want to modulate may not be the fastest way to obtain a list of hits, but it likely enriches that list with compounds really worth pursuing. There are several ways to approach this goal. One of them is the use of imaging assays in which mammalian cells (or potentially pseudo-tissues and pseudo-organs) can be observed with great spatial resolution at reasonable speed. This technique allows not only observation of the biological process of interest in its natural environment, but also measurement of compounds’ effects on sub-cellular distribution and trafficking. Another means of making screening assays more biologically relevant is to use the most native biological material possible. Ideally, one would consider using cells from human tissues for screening. However, a large single batch of such primary cells is rarely available, and those cells are difficult to maintain in culture as they are not immortalised. Fortunately, recent advances in stem cell biology are providing screening scientists with the tools to differentiate cells in a controlled and synchronised manner. This enables screening of compound collections against a scenario very close to the one in which the desired drug actually has to exert its action.

So what is next?

While it is difficult to anticipate what the next paradigm in laboratory automation for drug discovery will be, it seems clear that flexibility and versatility of research organisations will continue to be important as emerging discovery strategies are assimilated as complements to existing ones. At the technological level, flexibility and versatility are achieved through the implementation of plug-and-play interfaces, the provision of open-architecture hardware and software, and the usage of generic (i.e. non-manufacturer-specific) solutions. Computing power will play an increasingly important role, first to cope with the information-rich outputs from more complex assays and ultimately (in several decades?) as a consequence of our eventual ability to perform fully in silico discovery by realistically simulating drug behaviour outside the laboratory.

About the Author

Fernando A. Ramon Olayo

Fernando A. Ramon Olayo is currently Manager, Platform Technology and Science at GlaxoSmithKline. He is responsible for the development of biochemical assays for the discovery and characterisation of biologically active compounds. He has 18 years of research experience in both academia (Universidad Complutense de Madrid, UCM) and industry (GlaxoSmithKline, GSK), with particular focus on enzymology, assay development, assay adaptation to robotics, and HTS for all major therapeutic areas. For a significant period, he was heavily involved in laboratory automation and participated in the specification, development, implementation and operation of screening systems with unprecedented miniaturisation capabilities, in liaison with instrument manufacturers. Since 1998, he has also been an Associate Professor of Biochemistry at UCM.