Stability Data Evaluation Guide
Contents
13.1 Data Evaluation and Trending . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
13.1.1 Evaluation of Raw Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
13.1.2 Evaluation of Stability Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
13.2 Investigation of Out-of-Specification (OOS) Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
13.2.1 Phase I – Laboratory Investigation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 270
13.2.2 Phase II – Full-Scale OOS Investigation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
13.2.3 Outlier Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
13.2.4 When the OOS Result Is Confirmed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
13.2.5 Trending OOS Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
13.3 Setting Specifications and Stability Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
13.3.1 Refinement of Specifications Using Data from Stability Studies . . . . . . . . . . . . 275
13.3.2 Expiry Dating of Clinical Materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
13.3.3 Commercial Specifications and Extension of Expiration Dating . . . . . . . . . . . . 277
13.4 Preparation of Stability Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
13.4.1 GMP Requirements for Records and Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
13.4.2 Elements of a Stability Data Sheet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278
13.4.3 Anatomy of a Stability Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
13.4.4 Requirements for Stability Section in the CMC . . . . . . . . . . . . . . . . . . . . . . . . . . 280
13.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
Abstract This chapter discusses the evaluation of stability data. It follows the
stability study information from the point at which raw data are generated in the
laboratory, through the calculations that produce test results and the entry of those
results into the stability summary sheets, to the final compilation of the data into a
stability report for submission purposes. This chapter also includes a summary of
the data evaluation approach of ICH Q1E and a discussion of Out-of-Specification
(OOS) and Out-of-Trend (OOT) results.
N. Subbarao (B)
Biologics Consulting Group, Inc., 3 Serina Drive, Plainsboro, NJ 08536, USA
e-mail: nsubbarao@bcg-usa.com
[Fig. 13.1 Flowchart for handling aberrant data/OOS results: aberrant data or an
OOS result triggers a laboratory investigation; if no lab error is found, a full-scale
investigation follows. If the data are from an accelerated condition, intermediate
condition sample testing is initiated. Results are added to the stability summary
table and reported, and a Field Alert Report (FAR) is issued if a commercial lot is
affected.]
Stability samples are tested against specifications for identity, strength, quality, and
purity. Results for tests such as appearance and package integrity are evaluated
directly against the specification.
ICH Q1A(R2) defines significant changes for stability samples; these are summarized
in Chapter 3, Table 3.4. Additional information regarding physical testing
is discussed in Chapter 10. For other tests such as purity by chromatography, the
raw data must be examined for changes such as new or growing impurity peaks. It
is important that any significant changes or aberrant observations be noted imme-
diately, and investigated promptly, at a time when the original unexpired sample
solutions and reagents are still available.
The evaluation of the raw data can be effectively performed only if the analyst
has access to the stability specifications as well as to the results and chromatograms
from the previous time points of the stability study. Chromatograms of excipient lots
and of the drug substance lot used for manufacture of the drug product lot are also
useful. Designation of an appropriate person in the laboratory to evaluate data and
act promptly if an OOS is found is invaluable for prompt and meaningful laboratory
investigations of aberrant raw data. Any OOS results found must be investigated
promptly and the procedures for Laboratory Investigations and OOS Investigations
are described in Section 13.2.
OOT identification procedures therefore depend on the availability of data to define the norm. During
early development stability studies, where little information about the product or
formulation is known, the test results from earlier time points are set as the norm
for later time points. Any significant deviation from this earlier result is identified as
an OOT incident and action is taken as appropriate. Where a significant amount of
stability data is available, a lot or packaging configuration is identified as behaving
OOT if its rate of degradation is different from the normal degradation rate for that
formulation or package type.
The trend identification can be qualitative and performed by graphing the sta-
bility data or could be performed by statistical analysis of the collated data. In
both cases, the site OOT Standard Operating Procedure (SOP) defines criteria for
designating a data deviation from the norm as an OOT incident. The OOT criteria
must be set in such a way that all significant OOT incidences are identified, ideally
without false positives.
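Where sufficient multi-lot data are available, the degradation-rate comparison described above can be sketched in Python. This is an illustration only; the function names, the k = 3 criterion, and the example data are assumptions, not from this chapter. Each lot's least-squares slope is compared with the slopes of the other lots:

```python
import math

def fit_slope(times, results):
    """Least-squares degradation slope (units per month) for one lot."""
    n = len(times)
    mt = sum(times) / n
    mr = sum(results) / n
    num = sum((t - mt) * (r - mr) for t, r in zip(times, results))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

def flag_oot_lots(lots, k=3.0):
    """Flag lots whose slope deviates from the mean slope of the
    OTHER lots by more than k standard deviations of those slopes."""
    slopes = {lot: fit_slope(t, r) for lot, (t, r) in lots.items()}
    flagged = []
    for lot, s in slopes.items():
        others = [v for name, v in slopes.items() if name != lot]
        mean = sum(others) / len(others)
        sd = math.sqrt(sum((o - mean) ** 2 for o in others)
                       / (len(others) - 1))
        if abs(s - mean) > k * sd:
            flagged.append(lot)
    return flagged

# Hypothetical assay results (% label claim) at 0, 3, 6, 9, 12 months
months = [0, 3, 6, 9, 12]
lots = {
    "Lot 2": (months, [100.0, 99.70, 99.40, 99.10, 98.80]),  # ~ -0.10/month
    "Lot 3": (months, [100.0, 99.67, 99.34, 99.01, 98.68]),  # ~ -0.11/month
    "Lot 4": (months, [100.0, 99.10, 98.20, 97.30, 96.40]),  # ~ -0.30/month
    "Lot 5": (months, [100.0, 99.73, 99.46, 99.19, 98.92]),  # ~ -0.09/month
    "Lot 6": (months, [100.2, 99.90, 99.60, 99.30, 99.00]),  # ~ -0.10/month
}
```

Here `flag_oot_lots(lots)` flags only Lot 4, whose loss rate is roughly three times that of the other lots.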
[Fig. 13.2 Stability results for a single lot over time (y-axis 90–101%, 0–27 months).
The 9-month result can be considered OOT.]
13 Evaluation of Stability Data 267
[Fig. 13.3 Stability results for 5 lots (lots 2–6) over time (y-axis 90–101%, 0–30
months). Here, the rate of change of lot 4 is compared to that of the other lots and
appears to be OOT.]
Regression Control Chart Method
In this approach, a regression line is fitted to the historical stability data, and control
limits at each time point are set at the predicted value ± k × s, where k is a multiplier
chosen from a table of normal quantiles to give the desired protection level and s is
the square root of the mean square error from the regression. The choice of the k
value allows control of the confidence level and thus the rate of false alarms. This
approach depends on the data being normally distributed and independent and is
applicable only to data with a common linear slope for all batches.
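The calculation above can be sketched in Python. This is an illustration only; the function name, the k = 3 default, and the example data are assumptions, not values from this chapter:

```python
import math

def regression_control_limits(times, results, t_new, k=3.0):
    """Fit a least-squares line to historical stability results and
    return (lower, upper) control limits at time t_new, computed as
    the predicted value +/- k * s, where s is the square root of the
    mean square error of the regression."""
    n = len(times)
    mt = sum(times) / n
    my = sum(results) / n
    sxx = sum((t - mt) ** 2 for t in times)
    sxy = sum((t - mt) * (y - my) for t, y in zip(times, results))
    slope = sxy / sxx
    intercept = my - slope * mt
    # Mean square error with n - 2 degrees of freedom
    mse = sum((y - (intercept + slope * t)) ** 2
              for t, y in zip(times, results)) / (n - 2)
    s = math.sqrt(mse)
    predicted = intercept + slope * t_new
    return predicted - k * s, predicted + k * s

# Hypothetical assay results (% label claim) from earlier time points
lo, hi = regression_control_limits([0, 3, 6, 9, 12],
                                   [100.1, 99.6, 99.2, 98.7, 98.3],
                                   t_new=18)
```

A new 18-month result falling outside the interval (lo, hi) would be flagged as OOT.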
268 N. Subbarao and K. Huynh-Ba
By-Time-Point Method
In this approach, historical data is used to compute a tolerance interval for each
stability time point. The tolerance interval can be calculated for the stability results
themselves or for the difference between the result and the initial stability result.
The interval at a certain time point can be calculated as the mean of the historical
results at that time point ± k × s, where k is the multiplier chosen from a table of
normal quantiles to give the desired protection and s is the standard deviation at the
time point.
Any result outside the tolerance interval is considered OOT. This approach
depends on the data being normally distributed and independent and does not require
any assumptions about the shape of the degradation curve.
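The by-time-point calculation can be sketched as follows (a minimal illustration; the function name, the k = 3 default, and the example results are assumptions, not from this chapter):

```python
import statistics

def by_time_point_limits(historical, k=3.0):
    """Tolerance interval (mean +/- k * s) at each stability time point,
    computed from historical results of several lots.  `historical`
    maps a time point (months) to a list of results."""
    limits = {}
    for t, values in historical.items():
        mean = statistics.fmean(values)
        s = statistics.stdev(values)
        limits[t] = (mean - k * s, mean + k * s)
    return limits

# Hypothetical assay results (% label claim) from four historical lots
historical = {
    0: [100.2, 99.9, 100.1, 100.0],
    3: [99.7, 99.5, 99.8, 99.4],
    6: [99.1, 99.0, 99.3, 98.9],
}
limits = by_time_point_limits(historical)
lo6, hi6 = limits[6]
oot = not (lo6 <= 98.2 <= hi6)
```

A new 6-month result of 98.2 falls outside the 6-month interval and would be flagged OOT.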
Impurity results are often near LOQ levels, and this can be a concern for those
impurities with specification limits close to the LOQ.
Some common procedures for evaluating OOT close to the LOQ are given below:
• If all test result values are above the LOQ, the distribution is normal, and the
variance is constant, the regression control chart method or the slope control
chart method may be applied to the data.
• If all test results are below the LOQ, any test result which appears above the LOQ
may be considered OOT. However, this identification procedure will produce
false positives if the impurity level is normally just below the LOQ and its
appearance above the LOQ is due to test method variability.
• If some test values are below the LOQ, one strategy is to treat all results that are
<LOQ as either the LOQ or 1/2 LOQ for purposes of the statistical calculation
used to identify the OOT. This approach is affected by the distortion of
information introduced by the approximation for peaks <LOQ.
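The substitution strategy from the last bullet can be implemented as a simple pre-processing step. In this sketch (illustrative only; recording below-LOQ results as None and the LOQ/2 default are assumptions of this example, not conventions from this chapter), below-LOQ values are replaced before the statistical OOT evaluation:

```python
def substitute_below_loq(results, loq, strategy="half"):
    """Replace below-LOQ observations (recorded here as None) with the
    LOQ or LOQ/2 before a statistical OOT evaluation.  As noted above,
    the substitution distorts the data near the LOQ, so OOT calls based
    on substituted values deserve extra scrutiny."""
    fill = loq / 2 if strategy == "half" else loq
    return [fill if r is None else r for r in results]

# Hypothetical impurity results (%); None means below the 0.05% LOQ
raw = [None, None, 0.06, None, 0.07]
prepared = substitute_below_loq(raw, loq=0.05)
```

The choice between LOQ and 1/2 LOQ substitution shifts the computed mean and standard deviation, so the site SOP should fix one strategy and apply it consistently.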
OOT identification for impurity test results may also point to the need to identify
unknown impurities that are increasing in level. A growing unknown impurity
should trigger identification of the impurity and validation of the impurity
analytical method before the peak reaches levels at which the guidance documents
require its identification.
The first phase of the investigation occurs in the lab and is focused on the possible
identification of assignable laboratory errors. The responsibilities of the supervisor
and the analyst during this phase are listed below.
Analysts are responsible for:
• Ensuring that the equipment used is calibrated and meets the required acceptance
criteria.
• Reporting data only if the required system suitability tests pass acceptance
criteria.
• Checking the data for compliance to specifications before discarding any test
solutions.
• Informing the supervisor if any unexpected results are obtained.
• Stopping testing if an obvious error occurs; analysts should not knowingly
continue testing that they expect to invalidate later for an assignable cause, i.e.,
testing must not be continued for the sole purpose of seeing what results are
obtained when obvious errors are known.
The supervisor is responsible for:
• Performing an objective and timely assessment.
• Confirming the analyst's knowledge and performance of correct procedures.
• Examining the raw data and identifying anomalous or suspect information.
• Confirming the performance of the instruments.
• Examining the solutions, reagents, and standards to confirm that they were
appropriate for use during testing.
• Evaluating the performance of the test method.
• Documenting and preserving evidence of the assessment.
Prompt initiation of the investigation is essential for several reasons. Test solu-
tions, reagents, and standard solutions will still be available and may be re-analyzed
if necessary. The analyst’s memory of all stages of the testing will be clearest on
the day of the test, and equipment is more likely to be in the configuration used for
testing and can therefore be checked for errors.
Fig. 13.4 Example checklist for a laboratory investigation
Sample
  Sample ID and condition satisfactory? y/n
  Packaging satisfactory? y/n
Reagent
  Correct reagent used? y/n
  Within expiry date? y/n
Glassware/supplies
  Correct glassware type used? y/n
  Clean glassware used? y/n
  Solvent-washed/dried glassware used? y/n
  Correct volume (volumetric) glassware used? y/n
Equipment
  Equipment qualified for intended purpose? y/n
  Equipment within calibration period? y/n
  Equipment settings appropriate? y/n
Chromatography column
  Correct column used as per analytical method? y/n
  Column wash steps completed prior to injection? y/n
Analyst training
  Trained on use of equipment? y/n
  Trained on analytical method? y/n
SOP steps
  Weights in correct range? y/n
  Dilutions performed per analytical method? y/n
  All steps performed as per analytical method? y/n
Calculations
  Software qualified? y/n
  All calculations checked and found correct? y/n
Other y/n
The investigation must be documented; a checklist (see Fig. 13.4 for an example) is
often used to help review all the relevant facts and to speed up the review process.
If the review does not reveal the root cause of the anomalous results, there may
be a need to test the final prepared solution, retained samples from earlier steps of
the sample preparation or tablet grinds to identify the root cause. The procedures
for such testing must be defined in an SOP and the testing must be supervised and
approved by a supervisor, with a review of the results at each stage before proceed-
ing to the next.
If the anomalous result can be unequivocally assigned to laboratory error, the
result may be invalidated. Marking the notebook entry as invalid and retaining all
related instrument outputs will be invaluable during future audits, to account for the
raw data and results which are retained in the instrument electronic database.
The OOS guidance document indicates that laboratory or analyst errors should be
relatively rare, and frequent occurrence can be an indication of inadequate training
of analysts, poorly calibrated/maintained equipment, or careless work. It should not
be assumed that the failing result is attributable to analytical error without
performing and documenting an investigation. When a laboratory error is
confirmed, the company must determine the source of the error, take appropriate
corrective actions, and prevent recurrence of the incident.
13.2.2.1 Retesting
Retesting is performed using the same homogeneous material as the original sample.
The concept of retesting as described in the OOS guidance does not apply to some
tests such as content uniformity and dissolution.
Companies must have a written procedure that specifies the maximum number of
retests. The SOP must define how retesting will be performed. It is understood that
the investigation procedure cannot be fully pre-defined and depends on the prob-
lem and product. Instead, each testing step must be approved and supervised by a
responsible person in the Quality unit. It is important that the retesting be performed
by an analyst other than the one who performed the original test.
[Investigation form excerpt: analyst(s); attached lab investigation report including
checklist and summary of all repeated testing; re-sampling authorization (lot
number, stability study number, condition, time point); investigation report
including root cause and corrective/preventive action attached? Y/N]
If additional containers must be pulled for the investigation, the sample from the
next available time point is pulled and the results are designated as such. For example,
if in a study the 6-month sample test results are under investigation, and additional
containers at the 7-month time point are tested as part of the investigation, the results
are reported as belonging to the 7-month time point. The investigation may conclude
that either the original test result or the original sample tested was not representative
of the lot and may therefore be invalidated.
When faced with insufficient samples for testing of stability OOS investigations,
some companies may consider taking samples from other programs such as reten-
tion programs. However, such practices are not advisable as the storage conditions
of the stability and retention programs may differ significantly.
The guidance for preparing specifications for drug substances and drug products is
provided in ICH Q6A [6] with additional guidance in ICH Q6B [7] for biologics.
The discussion below for considering stability data in specifications is applicable
only to drug products. The upper and lower acceptance criteria limits in the regu-
latory specification (shelf-life specification in the EU) are usually set based on the
potency and/or impurity levels of the clinical lots and safety and efficacy consid-
erations. The extent of degradation or change in the attributes during the shelf-life
of the product is factored in to determine the in-house release acceptance criteria
(lot release specification in the EU) to ensure that the product meets the regulatory
specification at the end of shelf-life.
The acceptance criteria for some attributes such as package integrity or sterility
must not differ between lot release limits and regulatory acceptance criteria, and test
results for these attributes must not change over the shelf-life of the product. How-
ever, results for other attributes such as potency and impurity profile could change
significantly over the shelf-life of the product. Stability data are used in deriving the
regulatory specification limits for these attributes.
[Fig. 13.6 Decision tree for establishing the acceptance criterion for a specified
impurity in a new drug substance.]
[Fig. 13.7 Decision tree for establishing the acceptance criterion for a degradation
product in a new drug product: if the maximum likely level is not greater than the
qualified level, the acceptance criterion is set to A or B (as appropriate).]
The mathematical model thus provides the procedure for including the degrada-
tion slope in the calculation of the specification acceptance criteria.
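The idea can be illustrated with a simple worked sketch. The function, parameter names, and numbers below are illustrative assumptions, not values from this chapter: if potency falls linearly at a known rate, the in-house release lower limit is the regulatory lower limit plus the loss expected over the shelf life, plus any allowance for analytical variability.

```python
def internal_release_lower_limit(regulatory_lower, loss_per_month,
                                 shelf_life_months,
                                 analytical_allowance=0.0):
    """In-house release lower limit needed so that a lot losing
    `loss_per_month` (% label claim per month) still meets the
    regulatory lower limit at the end of its shelf life."""
    return (regulatory_lower
            + loss_per_month * shelf_life_months
            + analytical_allowance)

# Hypothetical: regulatory lower limit 95.0% label claim, observed
# degradation 0.1%/month, 24-month shelf life, 1.0% analytical allowance
release_lo = internal_release_lower_limit(95.0, 0.1, 24, 1.0)
```

With these assumed numbers the release lower limit works out to 98.4%: a lot released at or above that level and degrading at 0.1%/month will still meet the 95.0% regulatory limit at 24 months.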
The drug product and drug substance shelf-life and expiry periods may be extended
after product approval when satisfactory data from three stability lots has been
obtained. It may also be possible to propose excluding or replacing certain specifica-
tion tests originally included in the new drug application from the commercial drug
product specification. For example, degradation product testing may be reduced or
eliminated if it has been conclusively proven that a certain impurity is not formed
in the specific formulation and under the specific storage condition proposed in the
new drug application. Any reduction in testing must be approved by the regulatory
authorities with which the product has been filed.
21 CFR Part 211 Subpart J indicates that records and reports must be reviewed at
least annually and be available for inspection at any time. Laboratory records
include complete data derived from all tests performed.
13.4.2 Elements of a Stability Data Sheet
A stability data sheet includes lot information such as manufacturing date,
packaging date, and expiration date, and site information such as manufacturing
site, packaging site, and testing site.
[Diagram: the stability report is assembled from stability protocols, primary
stability data, secondary stability data, stability commitment, statistical evaluation,
and summary and evaluation.]
The stability portion of the Chemistry, Manufacturing, and Controls (CMC) dossier
contains the sections of the stability report described above. The requirements for
the CMC sections can be found in 21 CFR Part 312 for IND applications and Part
314 for NDAs and Abbreviated New Drug Applications (ANDAs). Tables 13.1
and 13.2 provide the requirements for the CMC, and the locations of the
stability-related documents within the CMC are highlighted in Table 13.2.
In September 2002, the ICH issued guideline M4, Organization of the Common
Technical Document (CTD) for the Registration of Pharmaceuticals for Human Use.
Each CTD contains 5 modules: Module 1, regional administrative information;
Module 2, summaries; Module 3, quality; Module 4, nonclinical study reports; and
Module 5, clinical study reports.
13.5 Conclusions
Stability raw data and results must be reviewed and evaluated promptly after the
analysis. The analyst must also review the stability profile of the batch, as well as
stability data of the product after each data point is generated. Many companies
have implemented Laboratory Information Management Systems (LIMS) to help
make reporting and evaluating stability data more efficient. The stability report is
an important segment of the CMC document package.
Every company must have an OOS and OOT SOP. If a laboratory error cannot be
shown to be the root cause of an OOS or OOT incident, then a cross-functional
investigation must be initiated. OOS and OOT investigations are important, as they
continue to be one of the leading causes of warning letters.
References
1. International Conference on Harmonization (2003) Q1A(R2): Stability testing of new drug
substances and products (second revision).
2. International Conference on Harmonization (2003) Q1E: Evaluation of stability data.
3. PhRMA CMC Statistics and Stability Expert Teams (April 2003) Identification of out-of-trend
stability results. Pharm Technol pp 38–52.
4. FDA (1998) Guide to inspection of quality control laboratories.
5. FDA/CDER (October 2006) Guidance for industry: Investigating out-of-specification (OOS)
test results for pharmaceutical production.
6. International Conference on Harmonization (1999) Q6A: Specifications: test procedures and
acceptance criteria for new drug substances and new drug products: chemical substances.
7. International Conference on Harmonization (1999) Q6B: Specifications: test procedures and
acceptance criteria for biotechnological/biological products.