However fast labs develop new therapies, the complex route to patients exacerbates shortages that compromise health outcomes. Connecting quality, regulatory, and manufacturing data is key to improving supply chain agility, explain Vicki Cookson and Sofia Lange.
Lack of data transparency across the supply chain is a core challenge for regulators and suppliers of all sizes. It prevents insights into supply fluctuations and their root causes. Decreased access to global compliance and quality data for the active pharmaceutical ingredients (APIs) and finished drugs manufactured offshore has only compounded supply risks. Between 2020 and 2022, the US Food and Drug Administration’s five-year inspection backlog for offshore API facilities grew from 30% to 80%, prompting the agency to pivot to remote and other inspection formats.
Regulators, governments, patient advocates and industry groups are actively working to solve today’s supply problems. By listing the generics most vulnerable to shortages, the European Medicines Agency has taken an essential first step, which should help guide future efforts. So far, discussions of next steps have emphasised the need for economic incentives to help generics manufacturers modernise, and for better approaches to supply chain data collection, monitoring and analytics.
Meanwhile, at the micro level of the individual facility and company, there is a new focus on establishing data transparency and connecting data across functions. Data-driven approaches are helping more companies reduce the risk of shortages by improving the efficiency of compliance and quality operations.
Connecting data for agile change control
One area of focus is improving post-approval process change control, a time- and labour-intensive behind-the-scenes process that often leads to supply delays. Using traditional approaches, with disconnected data, separate IT systems and manual processes, a single change control process can take from six months to two years to complete. Depending on the regulatory agencies and regional requirements involved, the work required can delay a drug’s availability by up to five years. Today, a typical large biopharma company manages 40,000 of these applications each year, with up to 200 for a single product.
Imagine the EMA approved a manufacturer’s new therapy two years ago. Since then, the company has developed a safer manufacturing process that reduces product costs. Its leaders also plan to use more sustainable packaging to reduce carbon footprint, and to shift from laboratory-based quality control to real-time batch release.
Each of these continuous improvements may require separate regulatory agency approval.
Gathering the data required for each change takes months. First, regulatory teams must determine the impact of each change, and which countries and internal documents will be affected. Supply chain teams must then do the same for individual product lots.
According to Lange, the quality department must then ensure all change impact assessments are completed, identify and make sure affected documents and processes are up to date, incorporate changes into new training programmes, and even qualify potential new suppliers. The team will also have to manage and keep track of these actions and estimate the potential risks of making each change.
Currently, at many companies, regulatory and quality teams use different electronic systems for each of these steps and communicate by email and phone. Delayed communication and errors can result in noncompliance and regulatory warning letters.
But that is only the beginning. After compiling, publishing and submitting the applications, regulatory teams must optimise ongoing communication with regulatory agencies. In the end, regulators from each affected country can still decide that they need to reinspect the facility to re-approve the new and improved product, triggering additional delays in product availability.
Connecting manual, disconnected processes
Unified approaches to quality and regulatory data management bring different software together onto a single platform, which can help streamline and simplify change control. They make it easier for users to meet regulatory requirements, and to spot and address problems faster.
Integrating quality, regulatory and supply chain data, documents and processes can enable even greater agility. It is now possible, for example, to connect regulatory and quality data and content, particularly product documentation, with a corporate enterprise resource planning (ERP) system.
A growing number of companies of all sizes and types are unifying quality and/or regulatory data management. Some are connecting regulatory and quality operations to facilitate cross-functional collaboration, while others are connecting regulatory and quality data with their ERPs, which can potentially reduce batch-release timelines by up to 30%. Functional and cross-functional teams expect greater data transparency to make compliance and real-time information exchange easier, which would mitigate the risk of drug supply gaps.
Every day that a drug isn’t available costs a manufacturer hundreds of thousands to millions of dollars, whether for highly specialised drugs or everyday over-the-counter medicines. Unified approaches that improve data visibility, centralise access to real-time information, and automate workflows are already proving that they can speed patient access to the treatments they need.
Tracking the time and cost of using traditional approaches and technologies can reveal surprising insights into the total cost of ownership and operation. Consider the savings and improvements from the following:
- Reorienting highly trained and qualified people away from manual, administrative tasks to focus on priority efforts, such as interaction with regulators.
- Reducing the time spent on disconnected, one-off email and telephone communications, measured by the number of hours each employee spends on these exchanges each day, within teams and across functions.
- Minimising risk of errors and duplication of tasks resulting from disconnected, manual information exchange.
- Strengthening patients’ and healthcare providers’ trust in access to critical treatments. Although this cannot be measured directly, quantifying missed product release deadlines over time could offer insight into performance gaps and trends.
- Avoiding the intangible, but significant reputational costs of having a drug supply problem.
Disconnected data not visible across systems can affect name-brand, generic and biopharma manufacturers alike. Change control is only one of several behind-the-scenes operations that drain time and resources and delay patient access to treatments. Chemistry, manufacturing and controls (CMC) submissions and submissions publishing are two other examples, both of which are undergoing significant change.
As the current drug supply situation has reminded us, we are all patients, and the industry’s supply chain issues affect us all. Solutions already exist to help automate more behind-the-scenes processes and maximise access to connected, real-time data. However, they can only work from a foundation of connected and transparent data, at the individual plant level and beyond.
Vicki Cookson and Sofia Lange are Veeva Systems strategy directors, respectively for Vault RIM, Enterprise and Quality & Manufacturing