The volume of data generated in modern medical research centres is growing exponentially and becoming more diverse as advancements in automation and biotechnology transform the basic operations of these laboratories and clinics. Patient care and laboratory instrumentation generate data at a rate that rapidly outpaces the ability to track and process information with traditional (manual) methods.
We found that a robust electronic information management system is essential to maintaining control over operations in a dependable and compliant fashion. Over the last seven years, we have developed and implemented a Laboratory Information Management System (LIMS) in our academic translational research laboratory and have since expanded it to related research and clinical manufacturing operations. We describe how the LIMS was developed and implemented, how workflows were streamlined, and how the time and labour required for routine data collection and management were reduced, all while ensuring compliance with federal and state regulations.
Our original intent was to implement a system that could manage an active translational cell therapy laboratory and would aid in organising and reporting on preclinical development and clinical manufacturing activities, including (but not limited to) the capture, storage and tracking of compliance documentation, employee training records, raw material inventories, equipment calibration schedules, product testing results and samples for clinical use. Because we were developing data to support Phase I clinical trials in gene therapy, we required a system that would maintain compliance with Good Manufacturing Practices (GMP), Good Laboratory Practice (GLP) and Good Tissue Practices (GTP), known collectively as GXPs1-4.
After creating specifications for data capture, including the various sources, formats, quantity of data, relationships between diverse data sets and the structure of the reports generated, we reviewed and screened products from four vendors, as previously described5. We ultimately selected the system that best met our programmatic needs, could leverage existing equipment (see hardware considerations box) and did not exceed our budget. To minimise ongoing costs, and in anticipation of future system modifications, we trained an existing member of our laboratory to develop and configure the system, eliminating significant post-implementation consulting costs. We also involved the targeted end users (laboratory personnel) in the design and initial testing to ensure acceptance upon implementation. User training consisted of a general introduction followed by hands-on sessions, and most laboratory personnel were able to use the system fully, with minimal errors, within a week. Once deployed, laboratory personnel readily transitioned to the new system and found it user friendly and helpful.
Implementation did not require the purchase of additional hardware or special equipment. By utilising clustered servers and virtual machines (VMs), we were able to use the existing infrastructure, with redundancies for high system availability and a lower cost of implementation. Utilising VMs also has the advantage of easily bringing up multiple instances for the development and testing of new features. We also utilised external vendor applications for label printing, barcoding and image file processing (zipping and thumbnail creation), some of which were open source or freely available (for example, 7-Zip, www.7-zip.org, and ImageMagick, www.imagemagick.org).
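As a sketch of how this kind of delegation to external tools can work, the following Python fragment shells out to ImageMagick for thumbnail creation and to 7-Zip for archiving; the file paths, thumbnail size and wrapper functions are illustrative assumptions rather than our actual integration code.

```python
# Minimal sketch: delegate thumbnail creation to ImageMagick and archiving to
# 7-Zip via their command-line interfaces (both assumed to be on PATH).
import subprocess
from pathlib import Path

def make_thumbnail(image: Path, thumb_dir: Path, size: str = "128x128") -> Path:
    """Create a PNG thumbnail of an image with ImageMagick's `convert` tool."""
    thumb_dir.mkdir(parents=True, exist_ok=True)
    thumb = thumb_dir / f"{image.stem}_thumb.png"
    subprocess.run(["convert", str(image), "-thumbnail", size, str(thumb)], check=True)
    return thumb

def archive_files(files: list[Path], archive: Path) -> None:
    """Compress a set of instrument files into a single 7-Zip archive."""
    subprocess.run(["7z", "a", str(archive), *map(str, files)], check=True)
```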
The first LIMS module (implemented in 2005) involved external data collection for the recruitment of healthy donors for our preclinical development protocols. The system provides significant help in recording and tracking subject and sample information, including donor eligibility, in-process and analytical test results, raw material usage and final product disposition within the lab. As of 2013, use of the LIMS has expanded to three GMP facilities and various research and clinical labs on campus for the purpose of information management, including sample, material and equipment storage and tracking; instrument and external data import and integration; and GMP compliance functions. For instance, the storage location inventory tracks products in cold storage by identifying available spaces for incoming samples (contiguous or dispersed) and providing user-entered sample attributes (metadata). Equipment entering a facility is immediately placed on a calibration/preventive maintenance schedule, and raw materials and supplies are quarantined, inspected and released for use in GMP manufacturing.
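The slot-finding logic behind the storage location inventory can be illustrated with a simple routine such as the one below; the linear box layout, function name and parameters are hypothetical and are shown only to clarify the contiguous-versus-dispersed allocation described above.

```python
# Illustrative slot-allocation logic for a storage box: find either a
# contiguous run of free positions or any set of dispersed free positions.

def find_slots(occupied: set[int], box_size: int, needed: int, contiguous: bool) -> list[int]:
    free = [i for i in range(box_size) if i not in occupied]
    if not contiguous:
        return free[:needed] if len(free) >= needed else []
    # Look for a run of `needed` consecutive free positions.
    run: list[int] = []
    for pos in free:
        if run and pos != run[-1] + 1:
            run = []
        run.append(pos)
        if len(run) == needed:
            return run
    return []

# e.g. a 9x9 box stored as 81 linear positions
print(find_slots(occupied={0, 1, 5}, box_size=81, needed=3, contiguous=True))  # [2, 3, 4]
```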
An important aspect of GXP management is the control and documentation of the systems used to generate and release manufactured products. Before we implemented a LIMS-based document management system, batch record distribution, collection of records and document revisions were performed manually, requiring significant labour and time to complete. Now, when batch records are assigned to manufacturing and quality control personnel, training records and referenced documents are cross-checked by the system. Automatic updates can also be sent out by the system when documents are newly implemented or revised, ensuring proper training procedures are followed.
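The cross-check performed at batch record assignment can be pictured as in the following sketch, which compares an operator's training records against the revisions of the referenced documents; the data structures and document IDs are assumptions, not the vendor's schema.

```python
# Sketch: verify that an operator's training covers every document referenced
# by a batch record, at its current revision (illustrative data model only).
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    revision: int

def missing_training(operator_training: dict[str, int], batch_refs: list[Document]) -> list[str]:
    """Return referenced documents the operator is not trained on, or is
    trained on only at an older revision."""
    return [d.doc_id for d in batch_refs
            if operator_training.get(d.doc_id, -1) < d.revision]

refs = [Document("SOP-014", 3), Document("BR-0102", 7)]    # hypothetical document IDs
training = {"SOP-014": 3, "BR-0102": 6}
print(missing_training(training, refs))  # ['BR-0102'] -> block assignment, trigger retraining
```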
With the implementation of the LIMS document management system (Document Manager), we are able to control the generation, review, approval, release, revision and tracking of all documentation relevant to the production and release of clinical products. Document Manager is currently used to track standard operating procedures (SOPs), production batch records, raw materials, product specifications, testing results, equipment calibration and maintenance, out-of-specification (OOS) investigations, corrective and preventive actions (CAPA) and employee training. To link documents with manufacturing and regulatory investigations, we developed a series of templates for CAPA, SOP deviations, OOS and Quality Management Reports (QMR) that are designed to fully capture events that impact, or have the potential to impact, the quality, safety or efficiency of production. We have used this system to manage these activities and to generate summary regulatory documents to ensure compliance with GXP requirements. Additionally, we have been able to automate the management of processes and to link disparate pieces of information on the same product (i.e. donor ID, in-process test results, release test results, raw materials used, training and reference documents). Electronic records and documents related to product manufacturing and release testing are maintained in an orderly fashion and are readily available for review and inspection.
When adverse events occur inside or outside the manufacture of clinical products, QMR findings have led to process streamlining, including the development and implementation of a new SOP for root cause analysis (RCA); the development of new forms, including a GMP equipment approval form, a project proposal form and a customer satisfaction survey; revisions to the SOP for CAPA; and retraining of manufacturing personnel to prevent repeat occurrences. The QMR references other items within the system, including personnel, instruments, materials, samples and batch records, allowing personnel or instruments, for example, to be queried for the number of investigations in which they appear. Instruments that are part of more than a specified number of QMRs impacting product quality may be recommended for replacement, or specific personnel recommended for retraining.
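A query of this kind can be sketched as below, where instruments appearing in more product-quality QMRs than a chosen threshold are flagged; the record layout and threshold value are assumptions made for illustration.

```python
# Illustrative query: count product-quality QMRs per instrument and flag
# those referenced more often than a defined threshold.
from collections import Counter

def flag_instruments(qmrs: list[dict], threshold: int = 3) -> list[str]:
    counts = Counter(inst
                     for q in qmrs if q["impacts_quality"]
                     for inst in q["instruments"])
    return [inst for inst, n in counts.items() if n > threshold]

qmrs = [{"impacts_quality": True, "instruments": ["INC-02"]} for _ in range(4)]
print(flag_instruments(qmrs))  # ['INC-02'] -> candidate for replacement
```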
The LIMS was readily able to connect to laboratory instrumentation to automatically retrieve and parse data. Interfacing with simpler instruments such as balances or pH meters was straightforward and reduced the potential for transcription error when processing a large number of samples. More sophisticated instrumentation run by user-operated software applications allows the export of text (*.txt) or comma-separated values (*.csv) files, which were imported into the LIMS. We have imported data directly from instruments such as microplate readers, cell counters and PCR machines, as well as data from third-party software used to analyse large instrument-generated data sets (e.g. flow cytometry and confocal imaging data). Utilising automatic import of instrument data, we are able to archive large file sets with a significant reduction in labour. Previous archiving methods required staff to dedicate about one week per month to perform this task and to catalogue the files stored on each disk. Our system reduced this time to a few hours every three months and has improved our search capabilities. File searches are performed using date/time stamps, file names and owners, metadata where needed (run parameters) and thumbnail viewing.
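As an example of the kind of parsing involved, the following sketch reads a hypothetical plate reader *.csv export and attaches basic file metadata (name and modification time) for later searching; the column names are invented for illustration.

```python
# Sketch: parse an instrument's *.csv export prior to LIMS import; the
# column names ('Well', 'OD450') are hypothetical.
import csv
from pathlib import Path

def parse_plate_export(path: Path) -> list[dict]:
    """Read each row, convert the reading to a float, and keep run metadata
    (file name and modification time) alongside the results."""
    run_meta = {"source_file": path.name, "modified": path.stat().st_mtime}
    with path.open(newline="") as fh:
        return [{"well": row["Well"], "od450": float(row["OD450"]), **run_meta}
                for row in csv.DictReader(fh)]
```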
Our LIMS also interfaces with and retrieves information from local enterprise systems using technologies such as web services, HL7 messaging and ETL (Extract, Transform, Load) processes. The automatic import of data saves a significant amount of manual labour and virtually eliminates transcription errors. By utilising a scheduled data transfer from the Enterprise Purchase Requisition System, the LIMS application stores material and reagent information upon receipt. System users are only required to enter information on items received: expiration date, lot number and inspection information such as received condition (ice/cold packs, light-sensitive requirements). By automatically retrieving data from external systems, large amounts of data may be retrieved, searched, stored and connected with existing elements within the system. Additionally, storage of this data allows information to be entered only once. For example, on receipt of quality certificates (Certificate of Analysis, Certificate of Quality, etc.), materials received from the same manufacturer with the same lot number require less processing time, as the certificate is already on file and re-entry is not needed. For manufactured products, certificates are archived and easily retrieved for manufacturing batch records.
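A scheduled transfer of this type could be structured as in the sketch below, which extracts newly received items from a hypothetical purchase requisition endpoint, keeps only the fields needed at receipt and loads them into a staging table; the URL, field names and SQLite target are assumptions, not the actual enterprise interface.

```python
# Sketch of a scheduled extract-transform-load step pulling purchase
# requisition data into a LIMS staging table (endpoint and schema assumed).
import json
import sqlite3
import urllib.request

def etl_purchase_requisitions(endpoint: str, db_path: str) -> int:
    # Extract: fetch newly received items from the enterprise system.
    with urllib.request.urlopen(endpoint) as resp:
        items = json.load(resp)
    # Transform: keep only the fields the LIMS needs at receipt.
    rows = [(i["po_number"], i["catalog_no"], i["vendor"], i["qty_received"]) for i in items]
    # Load: insert into the LIMS staging table.
    with sqlite3.connect(db_path) as db:
        db.execute("""CREATE TABLE IF NOT EXISTS received_materials
                      (po_number TEXT, catalog_no TEXT, vendor TEXT, qty INTEGER)""")
        db.executemany("INSERT INTO received_materials VALUES (?, ?, ?, ?)", rows)
    return len(rows)
```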
The implementation of LIMS into our laboratory workflows has been a positive experience. We have been able to improve performance by reducing the time required for repetitive data entry tasks, and sample/material locations have been efficiently organised for ease of retrieval and storage. By reducing the rate of transcription errors, the value of the data produced and archived has also increased, allowing us to stay compliant with regulatory requirements (FDA and HIPAA). Utilising the system, we are also able to link raw materials and certificates to manufactured clinical products. We have also implemented a storage location management module that can be configured for any hierarchy of locations, so entire boxes, racks or even freezers can be tracked in a coordinated fashion. When storing samples and reagents, freezer space usage is maximised by allowing personnel to rapidly identify and fill available spaces without performing manual searches of boxes. This also reduces the amount of time cryopreserved samples are exposed to room-temperature conditions and thus improves the long-term storage of critical samples.
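One simple way to represent such an arbitrary location hierarchy is sketched below; the class, location names and path lookup are illustrative only.

```python
# Illustrative representation of an arbitrary storage hierarchy
# (e.g. freezer > rack > box); names are examples only.
from dataclasses import dataclass, field

@dataclass
class Location:
    name: str
    children: list["Location"] = field(default_factory=list)

    def path_to(self, target: str, trail: tuple[str, ...] = ()) -> tuple[str, ...] | None:
        """Return the full path to a named location, e.g. for a pick list."""
        trail = trail + (self.name,)
        if self.name == target:
            return trail
        for child in self.children:
            found = child.path_to(target, trail)
            if found:
                return found
        return None

freezer = Location("Freezer-A", [Location("Rack-3", [Location("Box-12")])])
print(freezer.path_to("Box-12"))  # ('Freezer-A', 'Rack-3', 'Box-12')
```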
The ability to review and edit datasets is controlled through user-defined security access groups. Audit trails allow all changes made in the system to be tracked, along with when they were made and by whom. Samples are also tracked when removed for specific protocols or projects, and sub-samples can be created (especially useful when tumour paraffin blocks or bulk tissue are sectioned), allowing for traceability throughout the sample's lifecycle. In addition, the storage of raw material data in the LIMS facilitates the rapid identification and location of non-compliant materials during product recalls and out-of-specification investigations; associated materials and manufactured products can be identified for recall or further impact analysis. Import of external information reduces the amount of manual data entry and improves our ability to track and trend. Vendors who consistently fail to adhere to required shipping conditions or do not meet certificate requirements are easily identified and removed from the preferred and/or approved vendor lists. Captured purchase order data may be used to compare raw material pricing, delivery consistency and other parameters that support lean manufacturing efficiencies.
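An audit trail of this kind can be pictured as an append-only log of field-level changes, as in the following sketch; the field set and user ID are assumptions about typical audit requirements rather than our system's actual schema.

```python
# Sketch: an append-only audit trail entry recording what changed, when and
# by whom (illustrative field set).
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    record_id: str
    field: str
    old_value: str
    new_value: str
    changed_by: str
    changed_at: str

audit_log: list[AuditEntry] = []

def record_change(record_id: str, field: str, old: str, new: str, user: str) -> None:
    """Append an immutable entry; edits never overwrite history."""
    audit_log.append(AuditEntry(record_id, field, old, new, user,
                                datetime.now(timezone.utc).isoformat()))

record_change("SAMPLE-0042", "location", "Box-12/A3", "Box-15/C1", "jdoe")  # hypothetical IDs
print(asdict(audit_log[-1]))
```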
To streamline and improve incoming materials processing, we implemented the printing of barcode labels for all received materials. This has improved material identification and helped maintain the readability of labels on products requiring long-term cold storage. Label materials were researched to ensure integrity and durability in a -80°C/LN2 environment, and an external barcode labelling application, selected for ease of integration with the LIMS, was used to design the labels. We are currently implementing the use of barcoded materials with common handheld devices for point-of-use tracking. For GMP manufacturing, barcoded reagents, supplies and disposables will be scanned into the system and recorded as part of the batch history record, where product traceability is essential. Out-of-calibration equipment or expired materials will be immediately flagged for replacement as personnel scan items at the beginning of a manufacturing run. Additionally, mobile devices will be used to remove samples from storage at the freezer location for quick and accurate logging and tracking of samples.
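For illustration, a simple Code 128 label could be composed in ZPL (a widely used thermal-printer language) as sketched below; the actual label design application and fields used in our system differ, and the material ID, lot and expiry shown are invented.

```python
# Illustrative generation of a Code 128 material label in ZPL.

def material_label_zpl(material_id: str, lot: str, expiry: str) -> str:
    """Compose a simple label: a barcode of the material ID plus lot/expiry text."""
    return (
        "^XA"
        f"^FO30,30^BCN,80,Y,N,N^FD{material_id}^FS"        # Code 128 barcode
        f"^FO30,140^A0N,28,28^FDLot: {lot}  Exp: {expiry}^FS"  # human-readable line
        "^XZ"
    )

print(material_label_zpl("MAT-000123", "A2345", "2025-12-31"))
```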
In summary, we are entering our eighth year of using the LIMS and our overall experience continues to be a positive one. Use of the system has expanded to City of Hope's three GMP facilities, the Pathology Core Bio-Specimen Repository and a Research Light Microscopy Core facility. The system can be customised to meet our data storage and retrieval demands while maintaining compliance with the FDA's Code of Federal Regulations 21CFR§11 and 21CFR§820 guidelines with regard to electronic records, records maintenance and quality management, as well as compliance with the Health Insurance Portability and Accountability Act (HIPAA) patient privacy and security rules6-8. Implementation of the system has led to significant improvements in production efficiency, product safety and data security, and to a reduction in operating costs.
Biography
Diana Russom (Department of Information Technology Systems, Beckman Research Institute of the City of Hope)
Diana Russom is a certified Business Analyst with over 10 years of experience in biotechnology, laboratory systems and database development. She has designed and implemented LIMS systems for basic and translational domains, including solutions for the management of large files and file sets, GMP quality assurance, mouse preclinical research studies and clinical sample bio-repositories. Diana's previous experience includes GMP manufacturing of plasmid DNA, process development of T-cell therapies and quality assurance systems development. She is currently a Senior Systems Analyst in the Department of Information Technology Services at the City of Hope.
References
1. 21 CFR 210. Current Good Manufacturing Practice in Manufacturing, Processing, Packing, or Holding of Drugs; General (2012)
2. 21 CFR 211. Current Good Manufacturing Practice for Finished Pharmaceuticals (2012)
3. 21 CFR 1271. Human Cells, Tissues, and Cellular and Tissue-Based Products (2012)
4. 21 CFR 58. Good Laboratory Practice for Nonclinical Laboratory Studies (2012)
5. Russom D, Ahmed A, Gonzalez N, Alvarnas J, DiGiusto D. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing. Cytotherapy. 2012 Jan;14(1):114-21. PubMed PMID: 21973024