Continued process verification – a challenge for the pharmaceutical industry?
Posted: 10 March 2015
Nowadays, professional quality and process data trending is key for science-based pharmaceutical development and manufacturing. Recently, the U.S. Food and Drug Administration (FDA) and European Medicines Agency (EMA) issued revised process validation guidance to enforce recurring data analysis as a regulatory core requirement1,2: Periodic product and process monitoring, also known as “Process Verification”, is considered an integral part of process validation, with the aim being to demonstrate product compliance and process robustness during the whole life cycle.
Although this new guidance represents a consistent evolution of established processes, it has the potential to revolutionise the daily business of the industry. Besides the traditional Annual Process Review/Product Quality Review (APR/PQR) monitoring of released attributes, the new PV guidance (Process Validation guidance) enforces monitoring of critical in-process parameters and material attributes throughout the product life cycle. This review describes the requirements and challenges arising from the implementation of the new approach, as well as potential pitfalls and emerging solutions based on the current level of understanding, enabling a process verification that is both lean and efficient.
In the new PV guidance, the authorities emphasise the great impact of the manufacturing process on a product’s quality and highlight effective process control as essential for product safety and quality. Both the FDA and EMA documents state that the requirements can be met by a risk-based approach that evaluates each influence on the process and identifies its cause, or rather the causal parameter. Although these are not revolutionary new ideas, the request for an effective demonstration of broad process knowledge and process control represents a change of paradigm in day-to-day process validation business. The traditional APR process is replaced by a new approach requiring continued data trending through the entire lifecycle of a product. The guidance divides the product lifecycle into three stages: ‘process design’, ‘process qualification’ and the new, additional ‘continued process verification’. Furthermore, it provides a detailed description of the expected results of each phase.
The new CPV concept
The first phase, process design, follows a classical approach. It is used to achieve deep knowledge of the process through development activities, including Quality by Design methods. At the end of the design phase, all parameters with a potential impact on the process and product quality are identified and well understood. This assessment is used as the basis for classifying the parameters as either critical (the so-called critical quality attributes (CQAs) and critical process parameters (CPPs)) or non-critical.
The second phase, process qualification, evaluates all identified CPPs and product quality attributes during the scale-up of the process. The outcome is a condensed list of confirmed relevant parameters: the ‘control strategy summary’. Additional objectives of the process qualification phase are the demonstration of process capability and robustness based on scientific methods and the closely linked definition of the specification limits. Upon completion of the second phase, all prerequisites are defined to achieve consistently high product quality.
The last and final phase is specified as continued process verification and represents the enhancement of the classical approach to process validation. It is characterised by the new interpretation of process validation as continued monitoring of the identified critical parameters at given limits through all stations of the product lifecycle. The FDA defines continued process verification as “the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product.” This interpretation of process validation is in contrast to the traditional ‘three batches’ approach and raises two new topics: process validation as a recurring data analysis through the complete product lifecycle, starting with the late-stage development phase and extending to the decommissioning of a product, and the constant verification of process robustness and capability by scientific methods. Any implementation of the CPV concept will be measured against the realisation of these core requirements, which represent the greatest challenges.
It is quite evident that all this is beyond the capabilities of a regular laboratory worker in the quality department, so an ‘integrated team’ representing all involved parties, such as qualified persons, QA representatives and statisticians, should be established as a standing working group. Its major objectives are the review of the continued monitoring on a periodic schedule and the adaptation of the control strategy summary where necessary.
Emphasis on scientific methods
The basis for the justification and classification of the process is scientific analysis performed by experts familiar with the accurate application of statistical methods, such as statisticians or scientists. The relevant literature contains a wealth of fitting solutions for every situation. Nonetheless, accurately implementing all of these methods is unrealistic: their application would be too complicated in daily business. Even though the guidance is not very specific about the statistical methods to apply, a viable selection has to be made. It is good practice to categorise the identified critical parameters and assign applicable analysis methods as part of the Control Strategy Summary. Examples of feasible solutions are simple visualisation of the data or the application of generalised mathematical analyses. All these activities should follow a risk-based approach to data evaluation.
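To make this concrete, the sketch below shows one possible way to record such a categorisation in machine-readable form. The parameter names, criticality classes and method assignments are purely illustrative and not taken from any specific product:

```python
# Hypothetical excerpt of a Control Strategy Summary: each monitored
# parameter is categorised by criticality and assigned the analysis
# methods to be applied during continued process verification.
CONTROL_STRATEGY = {
    "assay":            {"class": "CQA", "methods": ["trend plot", "Cpk"]},
    "dissolution":      {"class": "CQA", "methods": ["trend plot", "control chart"]},
    "granulation time": {"class": "CPP", "methods": ["control chart"]},
    "ambient humidity": {"class": "non-critical", "methods": ["visual review"]},
}

# Select everything that needs a control chart in the periodic review.
charted = [p for p, v in CONTROL_STRATEGY.items() if "control chart" in v["methods"]]
print(charted)  # ['dissolution', 'granulation time']
```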
Data evaluation
The simplest method of evaluating a process is the visualisation of the data in the form of a line or scatter plot. Displaying the specification limits and the historical values of former batches in addition allows a very good first assessment of the process regarding capability and robustness in its historical context. A good supplement to the visualisation of data is the calculation of the Cpk value, an indicator of process capability. Under the assumption that the values are normally distributed, the Cpk value can be calculated with a simplified formula: the distance from the process mean to the nearest specification limit, divided by three standard deviations. The Cpk value is in the majority of cases a very good indicator of process capability, but it should be used as an additional reference rather than as a black-and-white criterion for a process.
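A minimal sketch of that simplified formula in Python; the batch values and specification limits are hypothetical:

```python
import statistics

def cpk(values, lsl, usl):
    # Simplified Cpk, assuming normally distributed values: distance from
    # the mean to the nearest specification limit in units of 3 sigma.
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)  # sample standard deviation
    return min(usl - mean, mean - lsl) / (3 * sd)

# Hypothetical assay results (% of label claim), specification 95.0-105.0
batches = [99.2, 100.1, 99.7, 100.4, 99.9, 100.6, 99.5, 100.2]
print(f"Cpk = {cpk(batches, lsl=95.0, usl=105.0):.2f}")
```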
An alternative way of describing the capability or robustness of a process is with statistical parameters calculated from the measured values in the observation period, such as the average, minimum, maximum, standard deviation or coefficient of variation. These values give a good impression of the location and dispersion of a process without comprehensive analysis or visualisation.
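Such a summary is straightforward to compute, as the short sketch below shows (again with hypothetical values):

```python
import statistics

def summarise(values):
    # Location and dispersion summary for one observation period.
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return {
        "mean": mean,
        "min": min(values),
        "max": max(values),
        "std dev": sd,
        "CV %": 100 * sd / mean,  # coefficient of variation
    }

print(summarise([99.2, 100.1, 99.7, 100.4, 99.9, 100.6, 99.5, 100.2]))
```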
Comprehensive data analysis
Statistics postulates that a process running in a ‘controlled’ state will show a random distribution of the measured values around the process average (Gaussian distribution). Conversely, a deviation from this pattern indicates an uncontrolled process. An established testing method to detect and visualise these non-obvious ‘out-of-control’ conditions is the application of decision rules such as the Western Electric rules. Based on the location of the observations relative to the limits or the centreline of a control chart, these rules can trigger an investigation of possible assignable causes. Even though the decision rules are good practice, they are just a reference and do not classify a process in terms of capability or robustness. Control charts (also known as Shewhart charts) depict the actual values against the control limits and support a classification of the process based on the detection of shifts and trends according to the aforementioned interpretation rules. Thus a control chart supports the early detection of a potentially uncontrolled process in a very simple way.
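A minimal sketch of two of these run rules applied to a series of batch results follows; the centreline, sigma and assay values are hypothetical, and only a subset of the Western Electric rules is implemented:

```python
import numpy as np

def western_electric_flags(values, center, sigma):
    # Flags observations violating two classic Western Electric run rules:
    #   rule 1: a single point more than 3 sigma from the centreline
    #   rule 4: eight consecutive points on the same side of the centreline
    x = np.asarray(values, dtype=float)
    flags = []
    for i in range(len(x)):
        if abs(x[i] - center) > 3 * sigma:
            flags.append((i, "rule 1: point beyond 3 sigma"))
        if i >= 7:
            window = x[i - 7:i + 1] - center
            if np.all(window > 0) or np.all(window < 0):
                flags.append((i, "rule 4: 8 points on one side"))
    return flags

# A slow upward drift: no single point leaves the 3-sigma band, but
# rule 4 fires once eight batches in a row sit above the centreline.
assay = [99.8, 100.1, 99.9, 100.0, 100.2, 100.3, 100.2, 100.4,
         100.5, 100.4, 100.6, 100.5]
for index, rule in western_electric_flags(assay, center=100.0, sigma=0.3):
    print(f"batch {index}: {rule}")  # -> batch 11: rule 4: 8 points on one side
```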
Process knowledge and process control
A core requirement of the new guidance is the demonstration of broad process understanding and evidence of constant process control. A common rule of thumb for fulfilling these requirements is the creation of reports covering roughly 25-30 batches. This observation period enables preventive action to be taken, instead of a purely reactive process, and can help to mitigate the risk of specification violations. Although each of the methods mentioned above is easy to apply on an individual basis, the implementation is very difficult and challenging by virtue of the multiplicity of required data sources and analysis reports. It can easily overstrain a data analysis team without corresponding technical support, meaning the implementation of a suitable IT system.
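A minimal sketch of such a periodic review over the most recent batches; the alert threshold of 1.33 is a commonly used capability benchmark, but here it is simply an assumed, hypothetical limit:

```python
import statistics

def review_last_batches(results, lsl, usl, window=30):
    # Periodic review over the most recent `window` batches: recompute
    # Cpk on the rolling window and flag the process for investigation
    # when capability drops below the (hypothetical) alert limit of 1.33.
    recent = results[-window:]
    mean = statistics.fmean(recent)
    sd = statistics.stdev(recent)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return {"n": len(recent), "cpk": round(cpk, 2), "investigate": cpk < 1.33}
```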
Technical requirement
All analysis methods described in this article share the requirement of capturing a large volume of data from different sources in order to achieve trustworthy results and informative reports. The IT landscape in most companies is very heterogeneous due to the historical evolution of the systems; there is a need for customisable middleware and interfaces to all incorporated systems or, in the case of joint projects, even companies. In order to achieve a complete quality overview of a specific product manufactured on several sites, as required by the product lifecycle approach defined by the FDA and EMA, it might be necessary to harmonise and link data from several sources. In principle there are two core functions: data collection and data analysis. The data collection component of the system must assemble all required data, aggregate it into readable data tables or reports and support the second component, the statistical analysis and visualisation described above.
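The sketch below illustrates the harmonisation step of such a data collection component, assuming pandas is available; the site systems, column names and values are purely hypothetical:

```python
import pandas as pd

# Results for the same product arrive from two hypothetical site systems
# with different column names and must be harmonised into one table.
lims_a = pd.DataFrame({"batch": ["A1", "A2"], "assay_pct": [99.8, 100.2]})
lims_b = pd.DataFrame({"lot": ["B7", "B8"], "content": [100.1, 99.6]})

harmonised = pd.concat(
    [
        lims_a.rename(columns={"batch": "batch_id", "assay_pct": "assay"}),
        lims_b.rename(columns={"lot": "batch_id", "content": "assay"}),
    ],
    ignore_index=True,
)
print(harmonised)
```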
Technical solution
In the majority of cases an out-of-the-box implementation is not currently available, so given the release date of the guidance and the resulting expectation to implement the newly introduced concept in the near future, there is a particular need to develop a proprietary system. Basically there are two competing approaches: a real-time solution and a batch-processing data warehouse solution. The more fashionable one is the real-time solution. Its application requires high data quality and a very good performance of the underlying IT infrastructure regarding global data aggregation and continuous availability of the data. Normally it is the preferred solution for companies with unified and centralised IT systems. Under these prerequisites, master data management and the linkage of data can be set up simply and quickly. The effort for data mapping and transformation processes can be mitigated or even eliminated. The result is a lean and high-performing application.
Besides the fact that such a landscape is often not the reality, the great drawback is that all data are stored in the originating systems and aggregated in virtual views calculated on demand. After decommissioning of the originating system, the data are no longer accessible for comprehensive analysis. Migration to a suitable archiving system is necessary to avoid the loss of data and the violation of data integrity rules. In reality, this may imply great investment and effort for the migration of old data into a new environment at the end of the system’s lifetime.
The more old-fashioned but no less robust approach is the batch-processed data warehouse solution. Usually it is the preferred application in highly diverse companies maintaining numerous systems for the same purpose at different sites, or for companies in a network. A data warehouse solution provides the best conditions for maintaining data integrity processes, enables persistent data storage and offers a good interface to commercial statistical analysis software. Both approaches have their benefits and drawbacks, but at the end of the day they must be compliant with the guidelines and the applicable laws and regulations regarding electronic records. Therefore they must include mechanisms to ensure the traceability, integrity and reproducibility of the data essential for the subsequent analysis.
Conclusion
Although the new concept of continued process verification represents a consistent evolution of established processes, its implementation is a challenging and laborious task. On the other hand, the use of smart supporting tools in combination with a risk-based approach can lead to an improvement of product quality and process control.
References
1. FDA: Guidance for Industry – Process Validation: General Principles and Practices (2011). http://www.fda.gov/downloads/Drugs/Guidances/UCM070336.pdf
Dr. Michael Rommerskirchen obtained his PhD in Chemistry at the University of Cologne. After several years as a project manager for the implementation of LIMS, in 2008 he accepted a position in the pharmaceutical industry and is now head of the Process Database Team, responsible for setting up and organising a globally distributed and automated data analysis system.