DATA INTEGRITY - ALCOA+


In 2013, the US-FDA reported that laboratory processes and deficiencies associated with laboratory controls ranked among the top three most frequent causes of observations following US-FDA inspections. The same report also cited a 50% increase in warning letters related to data integrity.

During the annual meeting of the International Society for Pharmaceutical Engineering (ISPE) held in 2014 at Las Vegas, it was reported that the FDA had identified a dozen Indian pharmaceutical manufacturers with data integrity problems at their facilities. That number is significant, since India and China account for 80% of API production.

Other regulatory bodies, including the European Medicines Agency, have made similar observations, and this trend is expected to continue.
Data integrity is currently one of the most frequently cited areas in regulatory observations, yet data integrity is not a new requirement.

For years the basic principles have been described in international GMP guidelines. Here I will highlight the meaning and principles of Data Integrity.

Definitions:

DATA:
Data is the information derived or obtained from ‘Raw data’.

RAW DATA:
Original records and documentation retained in the format in which they were originally generated (Paper or electronic) or as a ‘True copy’.

META DATA:
Meta data is data that describes the attributes of other data and provides context and meaning.

Examples:

1.  Analyst-A reported the impurity-A result as 0.05% from an HPLC chromatogram.
Raw data: HPLC Chromatogram
Data: 0.05%
Meta Data: Analyst-A, Impurity-A

2.  Operator-A recorded in the BPCR that the Reactor-A temperature was raised to 20°C and maintained for 1 hr.
Raw data: BPCR
Data: 20°C, 1 hr
Meta Data: Operator-A, Reactor-A

DATA INTEGRITY:
  1.  Data integrity is a policy of the firm which assures that all data are accurate, complete, intact and maintained within their original context, including their relationship to other data records, throughout the data life cycle.
  2. In short, data integrity aims to prevent unintended changes to data or information, i.e. ensuring data integrity means protecting original data from accidental or intentional modification, alteration, malicious intent (fraud) or even data deletion (data loss).

ALCOA+ Principle:

  1. As per regulatory expectations, data should meet certain fundamental elements of quality, listed below, whether they are recorded on paper or electronically.
  2. ALCOA is a commonly used acronym for "Accurate, Legible, Contemporaneous, Original and Attributable".
  3. Later, Complete, Consistent, Enduring and Available were added to the ALCOA principle, which is then termed ALCOA+.
  4. As per the ALCOA+ principle, data should be Accurate, Legible, Contemporaneous, Original, Attributable, Complete, Consistent, Enduring and Available.


ACCURATE:
The term Accurate means data are correct, truthful, valid and reliable. This means an honest, accurate and thorough representation of the facts describing the conduct of the study.
Example-1:
A manufacturing instruction states as follows:
1.    Take 25 grams of RM1 and add to 100 L of water.
2.    Mix for 20 min. Check for complete dissolution.
3.    Heat to 70°C.
Now, while the solution was being heated, for whatever reason the temperature rose to 72°C.
Is it a deviation? Obviously yes! So what does one do? Report it? Ideally, yes. And what happens then? Investigation, risk assessment, CAPA, massive documentation and the probability of auditors' comments.
So, is there an easier remedy? Simply write 70°C in the BPCR instead of 72°C?

Example-2:
The impurity result from an HPLC chromatogram is out of specification at 0.11% against a limit of 0.10%. Ideally: OOS initiation, investigation, impact assessment, CAPA and training.
So, is there an easier remedy? Simply adjust the integration parameters and report the impurity result as 0.09% instead of 0.11%?

Other examples of inaccuracy:
·      Equipment or instruments used that were not, or were inadequately, qualified/calibrated/maintained.
·      Methods or processes used that were not, or were inadequately, validated.
·      Investigation of OOS results and deviations not done, or of doubtful quality.

      So, never compromise accuracy in any situation; record the actual, accurate details.

There will be times when source documents are incomplete, inconsistent, or wrong. If changes need to be made, modifying a document always needs to be done in a compliant manner. When the source is electronic, audit trails provide transparency and prevent data from being altered in a way that is difficult to detect.

Finally, Data must correctly reflect the action / observation made.

LEGIBLE:
Data should be readable and understandable, and it must be possible to interpret the data after it is recorded.
Example-1:
A typo in a date was identified in a document: 20/09/2016 instead of 19/09/2016. During the correction, good documentation practices were not followed, so the old entries are no longer readable or understandable, i.e. an illegible document was generated.

Example-2:
During issuance of the BPCR, it was noticed that the photocopy (true copy) of the master BPCR was not legible due to a printer problem, but it was issued anyway, i.e. an illegible document was generated.

CONTEMPORANEOUS:
Data must be recorded at the time it was generated or observed. The documentation should serve as an accurate attestation of what was done, what was decided and why, i.e. what influenced the decision at that time.

Example-1:
A manufacturing instruction states as follows:
1.    Take 25 grams of RM1 and add to 100 L of water.
2.    Mix for 20 min. Check for complete dissolution.
3.    Heat to 70°C and maintain for 15 min.

But while filling in the BPCR, the operator simply records 70°C without checking the actual online temperature at the time of the operation.

Example-2:
During HPLC analysis, the online entries (updating of the balance usage and pH meter usage logbooks) were not made; all entries were made after completion of the analysis.

ORIGINAL:

An original record is the first source capture of data or information. If corrections or revisions need to be made to an original record, the changes should not obscure the prior entries.

Example:
In the case of HPLC, the first source data is the electronic copy of the chromatograms; in the case of a balance, the first source data is the printed weight slip. Both come under original data.

ATTRIBUTABLE:

Attributable means information relating to the originator of the data, i.e. when documenting data on paper, every written element needs to be traceable back to the authorized individual who is responsible for recording it. This requires a signature and date.

The audit trail in an electronic system makes it very obvious who created a record, when it was created, who made a change, when the change was made and the reason for the change. A compliant system will automatically track this information and enable electronic signatures. Data is attributable to a unique user with a secure password and role-based permissions.

COMPLETE:

Complete data means all relevant data are present and available, i.e. the record contains everything required.

Example:
In the case of a BPCR, the BPCR alone is only data, not complete data; the complete data includes raw material issuance slips, on-demand slips, in-process analysis reports, labels, etc.

CONSISTENT:

All elements of the record, such as the sequence of events, follow on from one another and are dated or time-stamped in the expected sequence, i.e. consistent practices, such as good documentation practices, are to be followed.
For example, the correction of wrong entries should be done in the same manner in all documents.

AVAILABLE:

Data/documents should be readily available for review, audit or inspection over the lifetime of the document.

Records must be available for review at any time during the required retention period, accessible in a readable format to all applicable persons who are responsible for their review, whether for routine release decisions, investigations, trending, annual reports, audits or inspections.






CLEANING VALIDATION – BRACKETING – WORST CASE RATING

BRACKETING APPROACH

The cleaning processes of multi-product equipment are subject to the requirements of cleaning validation. The validation effort could be huge. In order to minimize the amount of validation required, a worst-case approach to the validation can be used.

Cleaning procedures for products or processes which are very similar do not need to be individually validated. A single validation study, under consideration of the worst case, can then be carried out which takes account of the relevant criteria used for worst-case selection.

The bracketing approach may be considered acceptable for similar products and/or equipment provided appropriate justification, based on a sound and scientific rationale, is given.
The company should document the objective of bracketing and the scientific rationale for its worst-case rating of the substances in the cleaning validation programme.

Approach:

By means of the bracketing procedure, the substances/products/equipment are grouped and then sub-grouped as applicable.
A worst-case rating procedure is used to select the worst case in each group/sub-group as applicable.
Validation of the worst-case situation then takes place. However, it is of utmost importance that a documented scientific rationale for the chosen worst case exists.

Grouping by Equipment Train:

For example, suppose a multipurpose site manufactures a number of organic substances using a number of equipment trains, as given below.

Train A – 9 substances can be produced which have the same cleaning procedure.

Train B – 9 substances can be produced which have the same cleaning procedure.

Train C – 8 substances can be produced with two different cleaning procedures: 4 substances have cleaning procedure A and the other 4 have a different cleaning procedure, B.

Train D – 8 substances can be produced which have the same cleaning procedure.

Train E – 10 substances can be produced which have the same cleaning procedure.

Train F – 11 substances can be produced with two different cleaning procedures: 6 substances have cleaning procedure C and the other 5 have a different cleaning procedure, D.

Without bracketing and worst-case rating, a cleaning validation study would be required for each of the 55 substances.

The substances are first grouped based on equipment train; hence 6 groups are formed as per the above data. The groups are then sub-grouped based on cleaning procedure; hence 2 sub-groups are formed within each of the Train C and Train F groups.

Finally, the company would have 8 groups for cleaning validation purposes, as follows:

Train A – 1 Group
Train B – 1 Group 
Train C – 2 Groups
Train D – 1 Group
Train E – 1 Group 
Train F – 2 Groups

Once the product groups have been established, the next step is to determine the so-called 'worst case' representative of each group and carry out cleaning validation on it.

By using the bracketing approach, only 8 of the 55 substances need to be validated.
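To make the bookkeeping above concrete, here is a minimal Python sketch (with invented substance names) that groups substances by the pair of equipment train and cleaning procedure and counts the resulting validation groups. It is only an illustration of the grouping logic, not part of any guideline.

```python
from collections import defaultdict

# Hypothetical inventory: (substance, equipment train, cleaning procedure)
substances = (
    [(f"A{i}", "Train A", "CP-1") for i in range(9)] +
    [(f"B{i}", "Train B", "CP-2") for i in range(9)] +
    [(f"C{i}", "Train C", "CP-A" if i < 4 else "CP-B") for i in range(8)] +
    [(f"D{i}", "Train D", "CP-3") for i in range(8)] +
    [(f"E{i}", "Train E", "CP-4") for i in range(10)] +
    [(f"F{i}", "Train F", "CP-C" if i < 6 else "CP-D") for i in range(11)]
)

# Group by (train, cleaning procedure): each group gets one worst-case validation study
groups = defaultdict(list)
for name, train, procedure in substances:
    groups[(train, procedure)].append(name)

print(f"{len(substances)} substances -> {len(groups)} validation groups")
for key, members in sorted(groups.items()):
    print(key, len(members), "substances")
```

Running this prints "55 substances -> 8 validation groups", matching the tally above.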


Grouping by Substances: 

Substances can be grouped as follows

Substances produced in the same train with the same cleaning procedure.

Substances produced in the same train with a very low therapeutic dose and/or low batch sizes (sub-groups are then formed based on the cleaning process).

Substances produced in the same train with a very high therapeutic dose and/or large batch sizes (sub-groups are then formed based on the cleaning process).

Substances produced in the same train with a very low ADE (sub-groups are then formed based on the cleaning process).

Substances produced in the same train with a very high ADE (sub-groups are then formed based on the cleaning process).


Once the product groups have been established, the next step is to determine the so-called 'worst case' representative of each group and carry out cleaning validation on it.

Grouping by Product: 

1. The most common basis for grouping is by product. The grouping is usually based on the formulation or dosage form of the product. When this approach is used, products are divided into groups according to dosage form and then according to formulation.


For example 

A company might have 10 tableted products, 6 ointment products and 4 liquid products. In this case, the first evaluation would be that the products fall naturally into 3 broad groups.


However, if 6 of the tableted products were manufactured by a wet granulation process, whereas 4 were manufactured by a dry, direct compression method, this would be a basis for subdividing the tablets into 2 sub-groups. Likewise, if 2 of the liquid products were suspensions and the other 2 were true solutions, this would also create 2 sub-groups for that group.

The company would then have 5 groups of products for cleaning validation purposes:

Tableted products – 2 groups
Ointment products – 1 group
Liquid products – 2 groups.

Once the product groups have been established, the next step is to determine the so-called 'worst case' representative of each group.

2. Another example would be a group composed of several products of similar potency. In this case, the worst-case selection might be made on the basis of solubility.

3. A third example might be a group composed of several products having the same API and differing only in the concentration of the API. In this case it would be reasonable to select the product having the highest concentration as the worst case.


It is unlikely that a single worst-case product could apply to an entire line of products having significantly different formulations and dosage forms.

A substance/product which does not fall within the bracketing approach must be validated individually.


WORST CASE RATING

Worst-case rating will generally depend on the following points:

a) Hardest to clean, Experience from production 
b) Solubility in used solvent 
c) Lowest acceptable daily exposure 
d) Lowest therapeutic dose

Hardest to clean, experience from production:
One criterion which can be used is experience from production with regard to how difficult a substance is to clean out. This study is recommended to be carried out in the form of interviews with operators and supervisors.

Difficulty of cleaning could be rated according to the three categories suggested below.

Category: 

1 = Easy
2 = Medium
3 = Difficult

Solubility in used solvent:

Solubility rating should be carried out as follows.


Acceptable daily exposure (ADE):

The acceptable daily exposure (ADE) defines a limit at which a patient may be exposed every day for a lifetime with acceptable risk related to adverse health effects.

ADE rating should be carried out as follows.


If ADE data are not available, other pharmacological data (dose), OEL or toxicity data (LD50) may be used.


Therapeutic dose:

Rating based on therapeutic dose can be given as follows.


If dose data are not available, other pharmacological data, OEL or toxicity data (LD50) may be used.


Rating Procedure:

The worst-case rating should be executed according to an issued protocol in which the methods and procedures for rating are identified, and a formal rating matrix is filled in as follows.

For example, suppose a group is formed from 9 substances (Esubstance, Fsubstance, Csubstance, Lsubstance, Osubstance, Msubstance, Psubstance, Rsubstance and Tsubstance) which can be produced on the same equipment train. Of the 9 substances, 6 have one cleaning procedure whereas the other 3 have a different cleaning procedure.

All categories are introduced as columns in the matrix to identify the worst case based on the rating.


For the products in this train, two cleaning procedures (Class I and Class III) are used.

Therefore two groups have to be validated. 


The worst-case product (for the validation study) for Class III is Osubstance (solubility rating 2 and hardest-to-clean rating 2.8).

The worst-case product (for the validation study) for Class I is Rsubstance (solubility rating 2 and hardest-to-clean rating 2.6).

In both cases, the limit should be calculated with the most active substance (ADE rating 4); if ADE data are not available, the limit should be calculated with the most active substance by therapeutic dose (therapeutic dose rating 4).

If the limit calculated with ADE rating 4 or therapeutic dose rating 4 is achievable for all products, this limit can be chosen for both groups. If the limit is too low and not achievable, Esubstance and Fsubstance should be considered as a separate group or produced in dedicated equipment.

The limit for the remaining group should then be calculated with the next most active substance (i.e. ADE rating 3 or, if ADE data are not available, the corresponding therapeutic dose rating).
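The following is a minimal sketch, with invented scores, of how such a rating matrix might be evaluated in software: each substance carries its cleaning-procedure class plus the hardest-to-clean (1–3) and solubility ratings discussed above, and the worst case per class is the substance with the highest combined rating. Only Osubstance and Rsubstance reuse values mentioned above; the remaining names and scores are hypothetical.

```python
# Hypothetical worst-case rating matrix; higher rating = worse case
# (substance: cleaning class, hardest-to-clean rating, solubility rating)
matrix = {
    "Osubstance": ("Class III", 2.8, 2),
    "Msubstance": ("Class III", 2.0, 1),
    "Psubstance": ("Class III", 1.5, 2),
    "Rsubstance": ("Class I",   2.6, 2),
    "Tsubstance": ("Class I",   1.8, 1),
    "Csubstance": ("Class I",   2.2, 1),
    # ... remaining substances of the train
}

def worst_case(cleaning_class):
    """Return the substance with the highest (hardest-to-clean, solubility) rating in a class."""
    candidates = {name: v for name, v in matrix.items() if v[0] == cleaning_class}
    return max(candidates, key=lambda name: (candidates[name][1], candidates[name][2]))

for cls in ("Class I", "Class III"):
    print(cls, "worst case:", worst_case(cls))
```

With these assumed ratings the sketch picks Osubstance for Class III and Rsubstance for Class I, mirroring the selection described above; the acceptance limit itself is still calculated from the most active substance, as explained in the text.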

CLEANING VALIDATION ACCEPTANCE LIMITS


1984 - Samuel Harder Article: ‘Validation of Cleaning procedures’

Concerning the setting of acceptable limits, Harder wrote that limits "must be practical and achievable by a reasonable cleaning procedure… must be verifiable by analytical methodology existing in the company… must be safe and acceptable".

1989 – Doug Mendenhall Article: "Cleaning Validation"

Mendenhall expanded upon the ideas presented by Harder, adding concepts such as using a matrix approach, testing for cleaning agents and placebo batches, and, most interestingly, pointing out the potential use of visual inspection. Mendenhall also proposed that limits for surface residue levels be calculated based on the smallest batch size / maximum dose combination.

Surprisingly, these two industry articles laid the foundation from which most cleaning validation acceptance criteria were derived and are the origin of many cleaning validation activities. Shortly after these publications, a major event began to unfold that shaped the direction of cleaning validation and industry practices.

1989-1992 – The US-FDA Vs Barr Laboratories and the Wolin decision

From 1989 to 1992, the US-FDA inspected several Barr Laboratories facilities and issued multiple FDA Form 483s with an increasing number of observations. The FDA finally sued Barr Laboratories, with district judge Alfred M. Wolin presiding over the trial court. The trial ended in February 1993 with Judge Wolin's decision and an injunction against Barr Laboratories.

The Barr Laboratories case was closely followed by the pharmaceutical industry, and the PMA (Pharmaceutical Manufacturers Association) conducted a survey asking many questions on what companies were doing regarding cleaning validation and how they were setting acceptance limits. In the survey results, the PMA listed 44 unique acceptance criteria from the responses, which were inconsistent from company to company and in many cases arbitrarily (randomly) selected.

Concurrent with the Barr Laboratories trial and the PMA survey, the foundation for acceptance limits was further expanded by another publication well known in the industry: the Fourman and Mullen article.

1993 – Fourman and Mullen Article:

At the time of the Barr Laboratories trial, another company, Eli Lilly, was also involved in a number of issues with the FDA over cleaning validation, specifically on the setting of acceptable limits. In 1993, Gary Fourman and Dr. Mike Mullen published an article in which they suggested that carryover of product residue meet the following criteria:


1. Not more than 0.001 of the dose of any product will appear in the maximum daily dose of another product.

2. Not more than 10 ppm of a product will appear in another product.

3. No quantity of residue will be visible on the equipment after cleaning procedures are performed.

Although the authors provided an explanation for the 0.001 and 10 ppm values, it was not scientific and no regulatory reference was provided.

However, at that time this article was a landmark in the world of cleaning validation, as it was the first publication to lay out specific criteria for determining cleaning validation acceptance limits.

Aftereffects of the Barr Laboratories Decision:

During the course of the trial, Judge Wolin observed that the GMP regulations were vague and not very detailed, certainly not detailed enough to expect companies to easily understand what the FDA's interpretation and expectations were. Judge Wolin criticized the GMPs for their lack of detail and clarity.


The FDA inspectors in the Mid-Atlantic region put together a guide clarifying their expectations for cleaning validation. This guide is very detailed and specific. One year later, the guide developed by the Mid-Atlantic region inspectors was adopted at the national level for use by all FDA inspectors.


The guide states that

1. The firm's rationale for the residue limits established should be logical, based on the manufacturer's knowledge of the materials involved, and be practical, achievable and verifiable.

2. It is important to define the sensitivity of the analytical methods in order to set reasonable limits.

3. It is clearly stated that companies must put thought and analysis into the selection of their cleaning validation acceptance limits. Simple adoption of the three Fourman and Mullen criteria is not satisfactory without a scientific justification for using those limits.

In this guide there is a short section on concerns about detergents used in the cleaning process.

“If a detergent or soap is used for cleaning, determine and consider the difficulty that may arise when attempting to test for residues. A common problem associated with detergent use is its composition. Many detergent suppliers will not provide specific composition, which makes it difficult for the user to evaluate residues. As with product residues, it is important and expected that the manufacturer evaluate the efficiency of the cleaning process for the removal of residues.”

Moreover, the FDA made it clear that they expected companies to test for detergent residues, not just API residues.

Current Cleaning Validation Approach: Acceptance Criteria

After the Barr Laboratories decision, the concept of cleaning validation changed drastically year on year, and a number of regulatory agencies have guided the pharma industry on cleaning validation with detailed guidelines.


The subject of cleaning validation has continued to receive a large amount of attention from regulators, companies and customers alike. The integration of cleaning validation within an effective quality system, supported by a quality risk management process, highlights the significance of cleaning validation.


Companies must demonstrate during cleaning validation that the cleaning procedure routinely employed for a piece of equipment limits potential carryover to an acceptable level. The limit established must be calculated based on a sound scientific rationale.


Cleaning validation should give assurance that the manufacturing operations are performed in such a way that the risks to patients related to cleaning are understood, assessed for impact and mitigated as necessary.

The acceptance criteria for equipment cleaning should be based on a 'visually clean in dry condition' check and an analytical limit.

Methods for calculating Acceptance criteria:

1) Based on Health based Data

MACO should be based upon health-based data when such data are available.

MACO = (ADE (or PDE) previous × MBS next) / TDD next

MACO – Maximum Allowable carryover (mg)

ADE – Acceptable Daily exposure of previous product (mg/day)

PDE - Permitted daily exposure of previous product (mg/day)

MBS – Minimum batch size of the next product (mg)

TDD – Therapeutic Daily dose for the next product (mg/day) 

ADE = (NOAEL × BW) / (F1 × F2 × F3 × F4 × F5)

NOAEL – No observed adverse effect level

BW – Weight of an average adult (e.g. 70 kg)

F1 – A factor (value between 2 and 12) to account for extrapolation between species:

F1 = 5 for extrapolation from rats to humans

F1 = 12 for extrapolation from mice to humans

F1 = 2 for extrapolation from dogs to humans

F1 = 2.5 for extrapolation from rabbits to humans

F1 = 3 for extrapolation from monkeys to humans

F1 = 10 for extrapolation from other animals to humans

F2 – A factor of 10 to account for variability between individuals

F3 – A variable factor (1–10) to account for repeat-dose toxicity studies of short-term exposure:

F3 = 1 for studies that last at least one half-lifetime
(1 year for rodents or rabbits; 7 years for cats, dogs and monkeys)

F3 = 1 for reproductive studies in which the whole period of organogenesis is covered.

F3 = 2 for a 6-month study in rodents or a 3.5-year study in non-rodents

F3 = 5 for a 3-month study in rodents or a 2-year study in non-rodents

F3 = 10 for studies of shorter duration

F4 – A factor (1–10) that may be applied in cases of severe toxicity:

F4 = 1 for fetal toxicity associated with maternal toxicity

F4 = 5 for fetal toxicity without maternal toxicity

F4 = 5 for a teratogenic effect with maternal toxicity

F4 = 10 for a teratogenic effect without maternal toxicity

F5 – A variable factor that may be applied if the no-effect level was not established. When only a LOEL is available, a factor of up to 10 could be used depending on the severity of the toxicity.

PDE = (NOAEL × BW) / (UFc × MF × PK)

UFc - Composite Uncertainty Factor: combination of factors which reflects the interindividual variability, interspecies differences, sub-chronic-to-chronic extrapolation, LOEL-to-NOEL extrapolation, database completeness.

MF - Modifying Factor: a factor to address uncertainties not covered by the other factors

PK - Pharmacokinetic adjustment
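
Below is a worked numerical sketch of the health-based approach, combining the ADE and MACO formulae above. All the figures (NOAEL, batch size, daily dose, choice of F factors) are invented for illustration; a real calculation would use toxicologist-derived values.

```python
# Health-based MACO sketch (all numbers are illustrative assumptions)

# Step 1: ADE of the previous product from an assumed NOAEL
noael = 5.0     # mg/kg/day, assumed NOAEL from a rat study
bw = 70.0       # kg, average adult body weight
f1 = 5          # rat-to-human extrapolation
f2 = 10         # variability between individuals
f3 = 2          # assumed 6-month rodent study
f4 = 1          # no severe toxicity assumed
f5 = 1          # NOAEL (not LOEL) available
ade = (noael * bw) / (f1 * f2 * f3 * f4 * f5)        # = 3.5 mg/day

# Step 2: MACO = (ADE_previous x MBS_next) / TDD_next
mbs_next = 50_000_000.0   # mg, assumed minimum batch size of the next product (50 kg)
tdd_next = 500.0          # mg/day, assumed maximum therapeutic daily dose of the next product
maco = (ade * mbs_next) / tdd_next                   # = 350,000 mg

print(f"ADE  = {ade} mg/day")
print(f"MACO = {maco:.0f} mg")
```

If a health-based MACO comes out unacceptably high, the general-limit approach described under method 4 below is often applied instead.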


2) Based on Therapeutic daily dose 

When limited toxicity data are available and the TDD is known, MACO should be calculated using the following formula. It is typically used for a product changeover from API process A to API process B.

MACO = (TDD previous × MBS next) / (SF × TDD next)

SF – Safety factor (generally 1000 is used for calculations)
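
A quick arithmetic sketch of this formula with invented numbers follows; the safety factor of 1000 mirrors the note above.

```python
# Dose-based MACO sketch (all numbers are illustrative assumptions)
tdd_prev = 10.0            # mg/day, smallest therapeutic daily dose of the previous product
mbs_next = 200_000_000.0   # mg, minimum batch size of the next product (200 kg)
tdd_next = 500.0           # mg/day, maximum therapeutic daily dose of the next product
sf = 1000                  # safety factor

maco = (tdd_prev * mbs_next) / (sf * tdd_next)   # = 4000 mg
print(f"MACO = {maco:.0f} mg")
```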


3) Based on LD50 

In cases where no other data are available and only LD50 data exist, use the following formulae.

NOEL = (LD50 × BW) / 2000

MACO = (NOEL previous × MBS next) / (SF next × TDD next)

LD50 – Lethal dose 50 (mg/kg) for the animal concerned. The identification of the animal (mouse, rat, etc.) and the route of administration is important.

(LD50 is the amount of a toxic agent that is sufficient to kill 50% of a population of animals, usually within a certain time.)

BW – Weight of an average adult (e.g. 70 kg)

SF Next – Safety Factor

(For topical products 10–100, for oral products 100–1000, for parenterals 1000–10000)
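
A short worked sketch of the LD50 route, again with invented figures (an assumed oral rat LD50 of 300 mg/kg and an oral next product, hence a safety factor of 1000):

```python
# LD50-based MACO sketch (all numbers are illustrative assumptions)
ld50 = 300.0               # mg/kg, assumed oral LD50 in rats for the previous substance
bw = 70.0                  # kg, average adult body weight
noel = (ld50 * bw) / 2000  # = 10.5 mg/day

mbs_next = 100_000_000.0   # mg, minimum batch size of the next product (100 kg)
tdd_next = 250.0           # mg/day, therapeutic daily dose of the next product
sf_next = 1000             # safety factor for an oral product

maco = (noel * mbs_next) / (sf_next * tdd_next)   # = 4200 mg
print(f"NOEL = {noel} mg/day, MACO = {maco:.0f} mg")
```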


4) General Limit as Acceptance criteria:

If MACO values are unacceptably high, if the carryover figures are irrelevant, or if toxicological data for intermediates are not known, the general-limit approach may be suitable.

MACOppm = MAXCONC × MBS next

MAXCONC = Maximum allowed concentration (kg / kg or ppm) of previous substance in the next batch


A general upper limit for the maximum concentration of a contaminating substance in a subsequent batch is often set at 5–500 ppm (100 ppm for APIs is very frequent) of the previous product in the next product, depending on the nature of the products produced by the individual company.
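
For completeness, a one-line arithmetic sketch of the general limit, assuming the frequently used 100 ppm figure and an invented 100 kg minimum batch size:

```python
# General-limit MACO sketch (illustrative assumptions)
maxconc_ppm = 100.0   # ppm (mg/kg), assumed maximum allowed concentration of the previous product
mbs_next_kg = 100.0   # kg, minimum batch size of the next product

maco_mg = maxconc_ppm * mbs_next_kg   # ppm (mg/kg) x kg = mg -> 10,000 mg
print(f"MACO = {maco_mg:.0f} mg")
```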

CLEANING VALIDATION - ORIGIN


Cleaning validation is a program which demonstrates that the cleaning procedures used are adequate to eliminate or control potential cross contamination.


The Resin story: 

Awareness of cross contamination came to light in 1988 due to the recall of a finished drug product, Cholestyramine Resin USP, because of inadequate cleaning procedures.

Reason for recall:

The bulk pharmaceutical chemical used to produce the product had become contaminated with low levels of intermediates and degradants from the production of agricultural pesticides.



The main reason for the cross contamination was that solvent recovery storage drums were reused without proper cleaning. Drums that had been used to store recovered solvents from a pesticide production process were later used to store recovered solvents used for the resin manufacturing process.

The firm did not have adequate controls over these solvent drums and did not have validated cleaning procedures for the drums.

This event increased the FDA's awareness of cleaning validation and of potential cross contamination due to inadequate procedures.

FDA Expectations:
1. FDA expects firms to have written procedures detailing the cleaning processes used for the various pieces of equipment.

If a firm has one cleaning process for cleaning between different batches of the same product and uses a different process for cleaning between product changes, the written procedures are expected to address these different scenarios.

2. FDA expects firms to have written general procedures on how cleaning processes will be validated.

3. FDA expects the general validation procedures to address who is responsible for performing and approving the validation study, the acceptance criteria and when re-validation will be required.

4. FDA expects firms to prepare specific written validation protocols in advance for the studies to be performed on each manufacturing system or piece of equipment, addressing such issues as sampling procedures and the analytical methods to be used, including the sensitivity of those methods.

5. FDA expects firms to conduct the validation studies in accordance with the protocols and to document the results of the studies.

6. FDA expects a final validation report, approved by management, which states whether or not the cleaning process is valid. The data should support a conclusion that residues have been reduced to an acceptable level.

So, as part of cleaning validation, firms should focus on the following:

Acceptance criteria
Levels of cleaning
Control of the cleaning process
Bracketing and worst case rating
Determination of Amount of residue