If regulatory agencies had to review every data point at every regulated manufacturing facility, few products would actually get to market. This is why agencies like the U.S. Food and Drug Administration (FDA) rely on manufacturers to provide complete and accurate information in their submissions. Naturally, the FDA tends to get agitated when companies try to pass off partial, fabricated or manipulated data. The agency even threatened to pursue criminal charges when Novartis submitted manipulated data in the application for its $2.1 million gene therapy.(1)
The FDA's patience is further tried when follow-up inspections or visits to a company's other sites result in repeated warning letters for the same infractions. Despite data integrity sitting squarely in the regulatory spotlight, it remains the most common reason for warning letters. A Deloitte report found that data integrity violations account for over 70 percent of the warning letters issued globally.(2)
This is a sign that life sciences companies petitioning for a regulatory thumbs-up on their products should plan on devoting more time, effort and even technology to data integrity.
Actual warning letters document a wide range of such observations.
Regulatory organizations provide various resources with guidelines and instructions for complying with data integrity requirements. Even so, misconceptions about the concept persist.
Consumers don’t have the option of reviewing active pharmaceutical ingredient (API) data, Certificates of Analysis (CoA), clinical study information or any other information involved in a product’s development. They count on the FDA to cover those bases and ensure drug products meet the requirements for quality, safety and efficacy.
To be confident that no corners were cut and no data was falsified in the development of a drug product, the FDA expects manufacturers to ensure all data meets the principles summarized by the ALCOA acronym: attributable, legible, contemporaneous, original and accurate. This includes all metadata (data about the data) and data history, as spelled out in the current good manufacturing practice (CGMP) requirements.
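To make those principles concrete, here is a minimal sketch, in Python, of how a single quality control measurement might be modeled so that each ALCOA attribute maps to an explicit field. Everything here (MeasurementRecord, capture, the analyst and instrument IDs) is hypothetical, an illustration of the idea rather than any particular system.

```python
# Hypothetical sketch: one way to model the ALCOA attributes of a single
# measurement as an immutable record. All names are invented for illustration.
import hashlib
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: the original record cannot be altered in place
class MeasurementRecord:
    analyst_id: str     # Attributable: who produced the data
    instrument_id: str  # Attributable: which equipment produced it
    parameter: str      # Legible: named, structured fields rather than free text
    value: float
    unit: str
    recorded_at: str    # Contemporaneous: stamped at the moment of capture
    checksum: str = field(init=False, default="")  # Original: tamper evidence

    def __post_init__(self):
        payload = "|".join([self.analyst_id, self.instrument_id, self.parameter,
                            str(self.value), self.unit, self.recorded_at])
        object.__setattr__(self, "checksum",
                           hashlib.sha256(payload.encode()).hexdigest())

def capture(analyst_id, instrument_id, parameter, value, unit, spec_range):
    """Create a record with a contemporaneous UTC timestamp."""
    lo, hi = spec_range
    if not lo <= value <= hi:  # Accurate: reject values outside specification
        raise ValueError(f"{parameter}={value}{unit} outside spec {spec_range}")
    return MeasurementRecord(analyst_id, instrument_id, parameter, value, unit,
                             datetime.now(timezone.utc).isoformat())

record = capture("analyst-042", "HPLC-7", "assay_purity", 99.2, "%", (95.0, 105.0))
print(record.checksum)  # any later edit to the fields would no longer match this hash
```

Freezing the record and hashing its contents at capture is one simple way to preserve the "original" attribute: any later alteration no longer matches the stored checksum.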
In a statement addressing issues with the quality of generic drugs, former FDA Commissioner Scott Gottlieb underscored the agency's efforts to ensure product safety.(3)
Given that stance, companies that are unable to demonstrate good data integrity practices can count on experiencing delays when seeking regulatory approval.
Pharmaceutical manufacturing environments have a lot of moving parts, numerous personnel and a considerable amount of data throughout an extensive supply chain. In these environments, employees are expected to multitask and produce high levels of output in a short amount of time with a very narrow margin of error.
There are several components that either individually or collectively undermine an organization’s ability to effectively manage data. Human error, improperly calibrated or maintained equipment, and the lack of clear procedures for detecting and reporting issues are just a few of the culprits. Ensuring data integrity under these circumstances is a tall order.
Still, because of the high stakes involved with drug products, all data must be recorded, stored and remain traceable throughout a product’s life cycle. However, data is commonly gathered or created by multiple people using different processes. In these cases, the data is often in varying formats and spread out across several locations such as spreadsheets, paper documents and department-specific databases.
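As a hedged illustration of the problem, the Python sketch below normalizes records from two hypothetical sources, a spreadsheet export and a department-specific database, into a single schema while attaching provenance to every value. All field names and sources here are invented for the example.

```python
# Hypothetical sketch: consolidating records from heterogeneous sources into
# one schema while preserving where each value came from.
from datetime import datetime

COMMON_SCHEMA = ("batch_id", "parameter", "value", "recorded_at", "source")

def from_spreadsheet(row):
    # e.g. a row exported from a departmental spreadsheet
    return {"batch_id": row["Batch"], "parameter": row["Test"],
            "value": float(row["Result"]),
            "recorded_at": datetime.strptime(row["Date"], "%m/%d/%Y"),
            "source": "spreadsheet:qc_results.xlsx"}

def from_lims(row):
    # e.g. a row from a department-specific database
    return {"batch_id": row["batch"], "parameter": row["param"],
            "value": row["val"], "recorded_at": row["ts"],
            "source": "lims:stability_db"}

records = [
    from_spreadsheet({"Batch": "B-1021", "Test": "pH", "Result": "6.8",
                      "Date": "03/14/2023"}),
    from_lims({"batch": "B-1021", "param": "pH", "val": 6.8,
               "ts": datetime(2023, 3, 14, 9, 5)}),
]

# With provenance attached, every value can be traced back to its origin.
for r in records:
    assert set(r) == set(COMMON_SCHEMA)
    print(r["batch_id"], r["parameter"], r["value"], "from", r["source"])
```

Keeping the source alongside each value is what preserves traceability once the data leaves its original home.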
Compiling data for reporting or audit purposes often involves tracking down logbooks, sifting through binders and file folders, or deciphering data that was jotted on scratch paper. If there are any gaps along the way, retracing steps to identify and resolve issues can cause major delays. This scenario is a recipe for lost files, inaccurate or incomplete data sets and a plethora of data integrity violations.
Data management is already painstaking, and it gets more complicated as products become more complex. In today’s pharmaceutical manufacturing landscape, a lot more data is available, but it’s useless without the right tools to organize and analyze it.
While there are many factors involved in an organization's data integrity issues, a major contributor is human error. For example, many laboratory processes for determining product quality rely on a significant amount of human input and subjective assays to produce quality control data. It's not that analysts lack knowledge or skill; it's that many manual processes simply leave room for errors to occur.
The intent of advancing technology for CGMP purposes is to simplify processes, improve productivity and reduce opportunities for mistakes. Production timelines are always tight, and delays are extremely costly. In pharmaceutical manufacturing, automating as many tasks as possible alleviates the bottlenecks and setbacks caused by data management errors.
With so many areas of the supply chain to oversee and so many opportunities for data integrity violations, companies that automate manufacturing and data management processes gain tighter control of operations and can more confidently demonstrate data integrity compliance. The sketch below illustrates one such technique.
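As one hedged example, assuming a hypothetical Python system rather than any specific commercial product, an append-only, hash-chained audit trail is a common way automated systems make captured records tamper-evident:

```python
# Hypothetical sketch: an append-only, hash-chained audit trail. Each entry's
# hash folds in its predecessor's, so any edit or deletion breaks the chain.
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    def __init__(self):
        self._entries = []

    def append(self, actor, action, detail):
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        entry = {
            "actor": actor,    # attributable: person or instrument acting
            "action": action,
            "detail": detail,
            "at": datetime.now(timezone.utc).isoformat(),  # contemporaneous
            "prev": prev_hash, # chains each entry to the one before it
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self._entries.append(entry)

    def verify(self):
        """Recompute the chain; any edited or deleted entry breaks it."""
        prev = "0" * 64
        for e in self._entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

trail = AuditTrail()
trail.append("scale-03", "weight_captured", {"batch": "B-1021", "kg": 12.5})
trail.append("analyst-042", "result_reviewed", {"batch": "B-1021"})
print(trail.verify())  # True; tampering with any entry flips this to False
```

Because each entry's hash incorporates its predecessor's, an auditor can verify the entire history by recomputation alone; editing or deleting any entry invalidates every hash that follows.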
Digitization is critical for any organization. It augments data management best practices, gives stakeholders greater visibility into and control over the entire supply chain, and allows companies to make faster, more confident strategic decisions. Most of all, it unifies people, production equipment and systems.
References