Drug purity is crucial for ensuring the safety and effectiveness of pharmaceuticals. An important component of ensuring purity is identifying contaminants in a drug product and determining whether they occur at sufficiently low concentrations. Contamination can occur at any point of the drug manufacturing process, from the raw materials to the final product.
In this blog, we’ll look at how impurities are characterized using an extractables and leachables (E&L) study.
A Brief History of the Regulation of Contaminants
In the US, the Food and Drug Administration (FDA) has regulatory authority over the approval of new drugs. Like many other executive agencies, legislation gives the FDA authority to issue rules in its domain. These executive agency rules are collected and published in the Code of Federal Regulations (CFR).
Contamination in pharmaceutical development plays such an important role that the very first edition of the CFR, published in 1938, included FDA rules related to contamination from packaging. These first contamination-related rules borrowed language directly from the 1938 Federal Food, Drug, and Cosmetic Act, which considers a drug adulterated if its container is “composed, in whole or in part, of any poisonous or deleterious substance which may render the contents injurious to health” (21 U.S.C. 351(a)(3)).
Decades later, the FDA tightened this rule. In September 1978, a sentence was added to the Code of Federal Regulations (21 CFR 211.94) stating that drug packaging can’t be “reactive, additive, or absorptive so as to alter the safety, identity, strength, quality, or purity of the drug.”
While the principles behind these rules are laudable, their lack of specificity led to confusion and wasted time and effort. The issue eventually came to a head: emerging studies on the use of propellants in metered dose inhalers led the FDA to release a guidance document for the drug manufacturing industry in May 1999. This 56-page non-binding document laid out the FDA’s thinking on evaluating contamination and established the groundwork for modern-day E&L studies.
What are Extractables and Leachables?
Contaminants can come from any surface that the drug comes into contact with during manufacturing, the container that stores the final drug product, components associated with the drug (such as a dosing cup or spoon), or the external packaging (including adhesives, shrink wrap, and even the ink used on the container). While the manufacturing process typically occurs under controlled conditions, the final drug can spend extended periods of time in its container-closure system. Consequently, safety guidelines increasingly focus on determining the potential contaminants derived from the container.
Contaminants can be classified as extractables, leachables, or both.
An extractable contaminant is a chemical that can be released under aggressive conditions from manufacturing equipment, container, or packaging components. The test uses exaggerated, but controlled, conditions that include factors such as high temperature, high pressure, solvents (such as water or strong acids/bases), excessive contact, and excessive time. Multiple types of extractable studies exist, including a Controlled Extraction Study (CES), an Accelerated Extractable Study (AES), and a Simulated Leachable Study (SLS). The compounds identified by an extractable study represent a “worst-case scenario” of what can be expected to be released by the materials associated with a drug.
In contrast, a leachable contaminant is a chemical that can be released under normal conditions. While an extractable test will analyze the manufacturing and packaging materials, a leachable test will examine the final drug product. The manufacturing and packaging process will proceed using production-level conditions, and a leachable study will identify any impurities in the final drug product.
Relationship Between Extractables and Leachables
It’s tempting to think that every leachable compound would also be an extractable. After all, extractables include the compounds that can be released from the materials associated with a drug under both normal and aggressive conditions. In fact, most leachables are extractables; these are known as primary leachables. But some leachables are not extractables. These secondary leachables arise from chemical reactions between the drug product and a primary leachable.
Identification of Extractables and Leachables
Various analytical approaches are used to identify the resulting compounds from E&L studies. While the specific assays used depend on the material being assayed and the experimental conditions, most approaches use a form of chromatography (liquid, gas, or ion) that can be coupled with mass spectrometry (MS).
Extractables and leachables can arise from all steps of manufacturing. Incomplete chemical reactions can lead to precursors of active ingredients in the final drug product. Even substances such as equipment lubricants and mold release agents can be introduced. A comprehensive E&L study will identify all possible extractables, check if they have leached into the final drug product, and may establish safe levels that will be monitored and enforced as quality control from batch to batch.
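To make the extractable-to-leachable check concrete, here is a minimal sketch of how a leachables screen might cross-reference LC-MS peaks from a final drug product against compounds found in an earlier extractables study. The compound names, masses, and matching tolerance are illustrative assumptions, not data from any specific study or instrument.

```python
# Hypothetical sketch: flagging leachables by matching observed LC-MS
# peaks from a final drug product against a library of compounds found
# in an earlier extractables study. All values are illustrative.

EXTRACTABLES_DB = {
    # compound name -> reference mass (Da) from the extractables study
    "Irgafos 168": 646.46,   # a common polymer antioxidant
    "erucamide":   337.33,   # a common slip agent in plastics
    "bisphenol A": 228.12,
}

def match_peaks(observed_masses, tolerance_da=0.01):
    """Return (compound, observed mass) pairs where an observed peak
    falls within the mass tolerance of a known extractable."""
    hits = []
    for mass in observed_masses:
        for name, ref_mass in EXTRACTABLES_DB.items():
            if abs(mass - ref_mass) <= tolerance_da:
                hits.append((name, mass))
    return hits

# Peaks observed in the drug product (illustrative values)
print(match_peaks([337.33, 512.20]))  # [('erucamide', 337.33)]
```

A real workflow would, of course, use vendor software and richer spectral matching (retention time, fragmentation patterns), but the core idea is the same: a leachable that matches a known extractable is a primary leachable, while an unmatched peak may point to a secondary leachable.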
The Role of CMOs/CDMOs in E&L Testing
Many pharmaceutical companies opt to use the services of third-party providers, known as pharmaceutical contract manufacturing organizations (CMOs) and contract development and manufacturing organizations (CDMOs). These third-party companies can specialize in a single process or may offer services covering every aspect of drug development and manufacturing. Outsourcing E&L processes can be attractive to a pharmaceutical company due to guarantees of consistency and timeliness. Pharmaceutical companies may be especially interested in outsourcing E&L testing due to the extensive equipment and expertise needed to design, implement, and interpret pharmaceutical testing. In addition, the use of a third party can help mitigate pharmaceutical risks. Adding a new extractable test mandated by a regulatory agency would require significant capital expenditure and ongoing recurring costs. Larger third-party companies can scale and reallocate existing capital equipment, leading to shorter lead times and more predictable costs.
Regulatory Considerations of Extractables and Leachables
In addition to the CFR and FDA guidance document cited earlier, various guidelines and recommendations exist describing the requirement of E&L studies for new treatments. While these guidelines and mandates communicate overall expectations, they do not specify how E&L testing should be performed. To fill this gap, industry leaders have formed consortia that develop guidelines establishing best practices. The Product Quality Research Institute (PQRI) Leachables and Extractables Working Group developed lists of recommendations for oral and nasal drug development in 2008 and parenteral and ophthalmic drug development in 2013. These industry-developed recommendations have become the de facto gold standard for E&L testing procedures.
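One practical contribution of these recommendations is the idea of converting a safety concern threshold (SCT, in µg/day) into an analytical evaluation threshold (AET) above which a leachable must be identified and assessed. The sketch below illustrates the general arithmetic; the function name and example numbers are assumptions for illustration, not values taken verbatim from the PQRI documents.

```python
# Hypothetical sketch: turning a daily safety concern threshold (SCT)
# into an analytical evaluation threshold (AET) per container, in the
# spirit of the PQRI approach. Example values are illustrative.

def estimate_aet_ug_per_container(sct_ug_per_day: float,
                                  doses_per_day: int,
                                  doses_per_container: int) -> float:
    """Estimated AET in micrograms per container.

    Any leachable detected above this level would need to be
    identified and toxicologically assessed."""
    # Spread the daily allowance across the doses taken per day,
    # then scale up to the whole container.
    return sct_ug_per_day / doses_per_day * doses_per_container

# Example: SCT of 0.15 ug/day, 8 actuations/day, 240 per container
aet = estimate_aet_ug_per_container(0.15, 8, 240)
print(f"Estimated AET: {aet:.2f} ug/container")  # Estimated AET: 4.50 ug/container
```

In practice the AET is further adjusted for analytical uncertainty and expressed in units matched to the assay (e.g., µg/g of material), but the dose-based scaling shown here is the core of the calculation.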
The Future of E&L Testing
Pharmaceutical production has been shifting toward disposable or single-use systems (SUS). The wide use of plastics has overtaken the traditional stainless-steel equipment and components that required cleaning and validation for multiple uses. Vendors of SUS tightly control the materials used in fabrication and can provide not only the disposable product but also the associated extractable test results. The increasing use of SUS has led to the adoption of modified guidelines that attempt to mitigate risk and support interchangeability. However, SUS come with challenges as well. Plastics can introduce additional impurities into the final drug product and are not always amenable to standard extractable studies, which can use aggressive solvents. Custom testing protocols can be tailored to work around these sensitivities but may require additional regulatory approval.
In addition to the growing use of SUS, biologics are quickly becoming a more significant proportion of the pharmaceutical development environment. In contrast to small molecules, biologics are larger, more complex, and more sensitive to environmental conditions. The impurities that find their way into the final drug product of a biologic are more likely to have a deleterious effect on the active ingredient, thereby decreasing stability and efficacy. Impurities upstream in the biopharmaceutical manufacturing process can also significantly impact biologics. For example, a leachable in a bioreactor or fermenter may negatively affect cell viability, causing decreased yields. Consequently, performing and interpreting the results of E&L studies with biopharmaceutical manufacturing can be significantly more complex.
That’s a Wrap
In summary, the FDA has prioritized the characterization and minimization of unsafe contaminants in drug treatments for nearly a century. Over the years, the focus has shifted to more complex testing that identifies and attempts to characterize the risk of leachables and extractables that are found. While the absence of all impurities would be ideal, minimizing their presence and ensuring that they do not negatively impact the performance or effectiveness of a treatment is an achievable goal. However, the increasing use of single-use products and the growth of biologics will require industry and regulatory research to continue to develop practical guidelines.
Sign up for our monthly Bolt from the Blue Newsletter to stay up to date with Cobalt blogs and other news.