There are several steps you can take today to save money and adapt to the FDA’s revised 21 CFR Part 11.
Question: In February, you published a strategy article, Virtualization and Validation, that argued the Food & Drug Administration’s regulation at Title 21 of the Code of Federal Regulations, Part 11, is on a collision course with current computing technology if the agency doesn’t release its long-promised revision soon. What prompted your article?
John Avellanet: There were really two sparks that started the article. One was a question from an attendee of my seminar last year, Understanding and Implementing the Revised FDA Part 11 and EU Annex 11. The second was a discussion I had with several subscribers to my quality systems compliance newsletter on how to save money when it comes to IT but still be in compliance with current FDA expectations.
Question: You write in your article about the status of the agency’s efforts to revise Part 11. How did you learn about this?
John Avellanet: One of the key components of my SmarterCompliance newsletter is compliance intelligence – it’s not really different from business intelligence. I survey a broad range of information sources: the agency’s own publications and presentations, warning letters, 483s, establishment inspection reports, daily FDA news articles, other government agency publications, blog postings, private conversations with colleagues in the FDA, and so on. It takes quite a bit of time just to gather and synthesize all the information. I then analyze it, look for subtexts, trends and themes, and assess probabilities. From all this I make recommendations for my clients.
Question: Given what you’ve learned about the revised Part 11, how can companies take advantage of where the agency seems to be heading?
John Avellanet: One approach is to leverage a technology concept called “computer virtualization.” I won’t go into all the details, but essentially, there are two types of virtualization to think about:
- running multiple software applications on the same computer, with each application thinking it has the computer to itself (in other words, your production line monitoring software thinks it’s the only software on the computer); and
- spreading your software across many different computers connected across many different locations in the world – in other words, “cloud computing.”
In the latter model, your production line monitoring software runs and stores data on multiple computers, either in a contracted data center or in data centers spread across sites in India, the US and Sweden. From a cost perspective, technology analysts have shown that companies can save between 30% and 80% of their IT budgets with virtualization. So the return on investment is there.
For a big company like AstraZeneca, outsourcing your systems to an IT vendor that uses virtualization is going to drive down costs. Where the revised scope of Part 11 comes into play is around validation. Under the old rubric of “validate everything,” virtualization was impossible. But because so much of the agency’s current Part 11 thinking is centered on data integrity – as opposed to software code validation, for instance – companies can focus their compliance efforts on data controls and leave the software and hardware largely to the technology vendors.
Question: Can you give some examples?
John Avellanet: From the big picture standpoint, you need to do three things:
- Do your homework first – In the Virtualization and Validation article, I gave several suggestions on how to find a technology virtualization provider that fits your company, and folks can walk through those on their own. The key is to do your homework and not just pick a firm that made a splash with last month’s press release; inspectors will want to understand the logic behind your decisions.
- Conduct risk-based due diligence – If the systems (and data) you’re going to outsource are low risk in terms of your quality system, product safety or efficacy, you may be able to forgo a full on-site audit and rely on a so-called “paper audit” instead. The intensity of your due diligence should be based on the criticality of the records contained within the systems you plan to virtualize.
- Craft a quality or technical agreement with reasonable expectations and sharp teeth – Clearly identify your minimum level of expectations. I suggest you research what typical levels are in the industry for each category of system you want to virtualize and outsource. For instance, I’m sure MasterControl can provide typical uptime statistics for any of its solutions; I’d then find similar numbers for other technologies and take the median to arrive at a benchmark uptime expectation – say 98.4%, for example. That percentage would then be expressly written into any quality or technical agreement I signed. The teeth of the contract might be financial penalties assessed if average uptime dropped below the 98.4% level for a defined period.
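As a small illustration of the uptime benchmarking described above, the following Python sketch takes the median of several entirely hypothetical vendor uptime figures and checks an observed figure against the resulting contractual threshold (the vendor numbers and function names are illustrative, not drawn from any real vendor’s statistics):

```python
import statistics

# Hypothetical uptime percentages gathered from several vendors'
# published statistics; these figures are illustrative only.
vendor_uptimes = [99.2, 98.4, 97.9, 98.6, 97.5]

# Take the median so one outlier vendor doesn't skew the benchmark.
benchmark = statistics.median(vendor_uptimes)  # 98.4

def sla_breached(observed_uptime: float, threshold: float) -> bool:
    """True when observed uptime over the review period falls below
    the contractually agreed threshold, triggering any penalty clause."""
    return observed_uptime < threshold

print(f"Benchmark uptime: {benchmark}%")  # Benchmark uptime: 98.4%
print(sla_breached(97.8, benchmark))      # True -> penalty clause applies
```

The same threshold check could then feed whatever penalty schedule the quality agreement defines.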
When we narrow down to the details, the focus has to be on controls – and verification thereof – around electronic record integrity. If nothing else, you want the ability to conduct independent verifications of the vendor’s controls on your data. This is where working with someone independent to conduct a mock Part 11 audit can help ensure you clarify reasonable controls and thresholds (plus some “stretch” goals), and then help you push back against any vendor objections.
Whomever you involve, make sure they have both IT compliance expertise and records management experience; one without the other will leave you vulnerable. I’ll address the records management side, since therein lie the most common weaknesses and gaps I uncover when I conduct audits. Electronic data is most at risk when it’s sitting in storage – either on a computer or backed up on tape. Remember, your data can sit there for a long time – five, ten, even twenty years in some cases, depending on the regulation involved. What inspectors want to look at is not really the document you produced yesterday, but the one that supports a process undertaken six months ago or a clinical trial conducted six years ago. And if you’re sued, it’s all your stored information for which the litigators are going to file discovery motions so they can get their hands on it. So making sure you understand the records management controls and implications around your electronic data integrity is crucial.
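One basic control of this kind – and one an independent auditor can verify – is a cryptographic checksum taken when a record enters long-term storage and recomputed on retrieval. The Python sketch below is a minimal illustration of that idea, not any vendor’s actual mechanism; the function names are hypothetical:

```python
import hashlib
from pathlib import Path

def record_fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a stored record; comparing this
    against a digest taken at archive time detects any alteration."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large archived records don't load into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_record(path: Path, expected_digest: str) -> bool:
    """Recompute the digest years later; a mismatch flags a
    record-integrity failure worth escalating to the vendor."""
    return record_fingerprint(path) == expected_digest
```

In practice the archive-time digests would live in a separate, access-controlled manifest so the stored record and its fingerprint cannot be altered together unnoticed.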
Question: When you advise clients and subscribers on saving money with virtualization and staying Part 11 compliant, what’s the one thing you want them to keep in mind?
John Avellanet: If you do nothing else, make sure to identify – and execute – a strategy based on reasonable risk mitigation, focusing primarily on controlling risk to your record integrity. This will allow you to take advantage of where the FDA seems to be driving Part 11 while you take advantage of new technology to save money. Keeping the 20th-century “validate everything” Part 11 mindset while trying to leverage today’s 21st-century technology is a recipe for noncompliance and budget breakdown.
Question: Based on what the expected revisions to Part 11 seem to indicate, do you see any advantages in maintaining an electronic quality management system over a paper-based system?
John Avellanet: For companies that go the virtualization route – and particularly those that choose to outsource their IT systems – an electronic quality management system would be more efficient, allow greater tie-ins and monitoring of record integrity parameters, and likely serve up more cost savings over the long run. Given the lengthy timelines of new drug, biologic and device time to market, long-term planning for compliance goes hand-in-hand with fiscal responsibility.

Originally published March 2009 in GXP Lifeline