Primary & Secondary Standard Definition, Use and Complete Guidelines

 

                       PRIMARY AND SECONDARY STANDARD



Any chemical analysis can be considered valid only if the method of analysis is validated before adoption and the results are reported against internationally recognized standard reference materials. Only then can consumers, as well as other laboratories across the world, rely on the results of such analysis.

Standards are primarily used for the following types of analytical studies:

  • Assay
  • Identification tests
  • Limit tests for related substances
  • Analytical method validation
  • System suitability testing of analytical techniques, in particular spectroscopic and chromatographic analysis

It is necessary to understand the subtle differences between the categories of standards used for chemical analysis. Standards are commonly divided into two groups:

  • Primary standards, also called reference standards or certified reference materials
  • Secondary standards or working standards

Primary Standard or Reference Standard

A primary standard is a reagent whose composition is exactly known and which can be weighed easily and accurately. It is stable, has no water of hydration, and has a high molecular weight.
A reference standard or reference material (RM) is a “material or substance one or more of whose property values are sufficiently homogeneous and well established to be used for the calibration of an apparatus, the assessment of a measurement method, or for assigning values to materials”, whereas a certified reference material (CRM) is a “reference material accompanied by a certificate issued by a certifying body, one or more of whose property values are certified by a technically valid procedure which establishes its traceability to an accurate realization of the unit in which the property values are expressed, and for which each certified value is accompanied by an uncertainty at a stated level of confidence.” According to these definitions, CRMs form a subgroup of RMs, namely, those RMs that possess additional characteristics: a certificate and traceable assigned values with an uncertainty statement.

A primary standard reference material is an ultra high purity grade compound used in analysis involving assay, identification or purity tests. It can be a single compound or a mixture having the analyte of interest in a specified and certified amount.

The impurities, if any, should be identified and controlled for use in assay studies. The material selected as a primary standard should be highly stable, free from water of hydration, and bear traceability to a national or international standards body.

In many cases it may not be possible to procure a reference material from such sources, for the following reasons:

  • A new molecule encountered in R&D activity, for which a reference may not yet be available from standards bodies. In such cases a laboratory can establish the purity of a reference compound and use it routinely.
  • In manufacturing industries, an in-house primary reference material can be selected from a particular batch. After its batch number is assigned, its characteristic properties are documented for reference purposes and for comparison with future production lots.

Examples

Sodium carbonate Na2CO3, sodium borate Na2B4O7, potassium hydrogen iodate KH(IO3)2 – used in acid–base titrations
Pure metals and their salts like Zn, Mg, Cu, Mn, Ag, AgNO3, NaCl, KCl, KBr – used in precipitation and complexometric titrations
K2Cr2O7, KBrO3, KIO3, KI(IO3)2, Na2C2O4, As2O3, pure iron – used in redox titrations
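The use of such a primary standard in titrimetry can be illustrated with a short calculation: standardizing an HCl solution against sodium carbonate. This is a minimal sketch; the masses and volumes are illustrative assumptions, not values from the article.

```python
# Sketch: standardizing an HCl solution against primary-standard Na2CO3.
# Reaction: Na2CO3 + 2 HCl -> 2 NaCl + H2O + CO2 (1:2 stoichiometry).
# The mass and titre volume below are illustrative assumptions.

M_NA2CO3 = 105.99  # g/mol, anhydrous sodium carbonate

def hcl_molarity(mass_na2co3_g: float, titre_ml: float) -> float:
    """Molarity of HCl from the mass of Na2CO3 titrated and the titre volume."""
    moles_na2co3 = mass_na2co3_g / M_NA2CO3
    moles_hcl = 2 * moles_na2co3          # 2 mol HCl per mol Na2CO3
    return moles_hcl / (titre_ml / 1000)  # mol/L

print(hcl_molarity(0.1060, 20.00))
```

The accurately known purity and stoichiometric reaction of the primary standard are what make the calculated molarity trustworthy.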

Eligibility criteria for a primary standard

A primary standard should satisfy the following conditions

  1. should be very pure
  2. should neither be deliquescent (absorbing moisture) nor efflorescent (losing water of hydration)
  3. should have a high molecular weight so that weighing errors are minimized
  4. should be chemically stable
  5. should be readily soluble under the given conditions
  6. should react stoichiometrically
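Criterion 3 can be made concrete with a quick calculation: for a fixed number of moles, a heavier compound gives a larger mass on the balance, so a fixed balance uncertainty becomes a smaller relative error. The 0.1 mg uncertainty and the comparison compounds are illustrative assumptions.

```python
# Sketch: why a high molecular weight minimizes weighing error.
# A fixed balance uncertainty (assumed +/- 0.1 mg) is a smaller fraction
# of the mass weighed when the compound's molar mass is higher.

BALANCE_UNCERTAINTY_G = 0.0001  # +/- 0.1 mg, typical analytical balance

def relative_weighing_error(molar_mass: float, moles: float) -> float:
    """Relative weighing error (%) for weighing `moles` of a compound."""
    mass_g = molar_mass * moles
    return BALANCE_UNCERTAINTY_G / mass_g * 100

moles = 0.001  # 1 mmol of each compound
for name, mm in [("NaOH (low M, poor choice)", 40.00),
                 ("Na2CO3", 105.99),
                 ("KH(IO3)2 (high M)", 389.91)]:
    print(f"{name}: {relative_weighing_error(mm, moles):.3f}% error")
```

Weighing 1 mmol of KH(IO3)2 incurs roughly a tenth of the relative error of weighing 1 mmol of NaOH, which is one reason the heavier iodate is preferred as a primary standard.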


When to use a primary standard?

In pharmaceutical QC, the use of reference standards to calibrate the analytical procedure is mandatory when measurements are performed with relative methods such as HPLC in combination with a UV or MS detector. These measurements need to be traceable to a primary standard. This requirement is realized either by using the primary standard directly for the calibration purposes, or by using a secondary standard which is compared to the primary one.
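The traceability chain described above ends in a simple ratio calculation: with a relative method such as HPLC-UV, the sample result is traced back to the reference standard via the ratio of peak areas. A minimal sketch of a single-point external-standard assay follows; all numbers are illustrative assumptions.

```python
# Sketch: single-point external-standard assay, as used with relative
# methods such as HPLC-UV. The analyte result is traceable to the
# reference standard through the ratio of peak areas.
# Areas, concentrations, and purity below are illustrative assumptions.

def assay_percent(area_sample: float, area_std: float,
                  conc_std: float, conc_sample: float,
                  std_purity: float) -> float:
    """Assay (%) of the sample by comparison with a reference standard."""
    return (area_sample / area_std) * (conc_std / conc_sample) * std_purity

# Standard at 0.50 mg/mL (99.8% pure), sample nominally at 0.50 mg/mL
print(round(assay_percent(152340, 153100, 0.50, 0.50, 99.8), 2))
```

Note that the certified purity of the standard enters the formula directly: an error in the standard's assigned value propagates one-to-one into every sample result.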

ICH guideline Q7 states:

11.17 Primary reference standards should be obtained as appropriate for the manufacture of APIs. The source of each primary reference standard should be documented. Records should be maintained of each primary reference standard’s storage and use in accordance with the supplier’s recommendations. Primary reference standards obtained from an officially recognized source are normally used without testing if stored under conditions consistent with the supplier’s recommendations.

Q7, however, does not specify which sources count as officially recognized. The FDA mentions sources in its ‘Guidance for Industry on Analytical Procedures and Methods Validation for Drugs and Biologics’ but, interestingly, does not present these institutions as an official or definitive list:

Reference standards can often be obtained from the USP and may also be available through the European Pharmacopoeia, Japanese Pharmacopoeia, World Health Organization, or National Institute of Standards and Technology.

Instead, the FDA states that “reference materials from other sources should be characterized by procedures including routine and beyond routine release testing” and that producers “should consider orthogonal methods for reference material characterization”.

For primary RSs, both the ICH guideline and FDA guidance allow other sources than the “officially recognized sources”. Independent manufacturers can provide such primary standards, ideally characterized by processes like those outlined in the general text 5.12. of the European Pharmacopoeia (Ph.Eur).

Correct use of primary reference standards: what to keep in mind?

In essence, a primary RS needs to be fit for its intended purpose. A pharmacopoeial RS has been shown to be fit for its compendial purpose, but has not been demonstrated to be fit for any other purpose; this needs to be proven by the user. Consequently, challenges to the use of compendial standards for non-compendial purposes have been reported in regulatory inspections. Other primary standards with fully documented CoAs can be used for most applications, provided they have been characterized appropriately. If a primary RS is used to establish a secondary standard, then the secondary RS can only be used for the same purpose as the primary one.

Secondary or working standard

A secondary standard is a chemical that has been standardized against a primary standard for use in a specific analysis. It is usually prepared in the laboratory and is commonly used to calibrate analytical methods.

Primary standards come with a certificate of analysis and bear traceability to a globally recognized standards body. Their cost is often too high even for milligram-range quantities. Secondary standards, or working standards, are also high-purity materials; they are quantified in relation to primary standards and put to routine use in laboratories. Such working standards are assigned a limited validity depending on the stability of the material, and fresh working standards should be prepared before expiry. It is important to realize that if an expired working standard is used in analysis, no credibility can be placed on the reported results.
These compounds are second-line materials: in each instance a material is compared against the primary material and then used in its place.
It does not matter whether the secondary standard is compared against a pharmacopoeial primary standard, or against a primary standard obtained in-house or from a third source. A secondary standard can only be used for the same purposes as the primary standard. Thus if a primary standard was established solely for a qualitative purpose (e.g. identification via IR, a system suitability test, or peak identification), then using the corresponding secondary standard for quantitative purposes is not valid. For example, a large number of Ph.Eur. reference standards for APIs have been set up for IR comparisons only, and should not be used as a basis for quantitative secondary standards.
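The comparison against a primary standard can be sketched as a potency-assignment calculation, e.g. by HPLC. The formula is the commonly used weight-and-area ratio; the figures are illustrative assumptions, and each laboratory defines its own procedure in an SOP.

```python
# Sketch: assigning a potency to a working (secondary) standard by
# comparison against a primary standard, e.g. by HPLC.
# Areas, weights, and the primary potency below are illustrative.

def working_standard_potency(area_ws: float, area_primary: float,
                             wt_primary_mg: float, wt_ws_mg: float,
                             primary_potency: float) -> float:
    """Potency (%) assigned to the working standard."""
    return (area_ws / area_primary) * (wt_primary_mg / wt_ws_mg) * primary_potency

print(round(working_standard_potency(149870, 150210, 25.12, 25.05, 99.9), 2))
```

The assigned potency inherits the uncertainty of the primary standard's certified value, which is why the secondary standard can never be used for a purpose beyond that of the primary.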

Handling and storage of standards

Standards play a crucial role in analysis and must be stored under specified conditions so that their integrity is preserved over the prescribed storage period.

Following recommendations can prove useful:

  • Store in amber-coloured glass vials or bottles which are properly sealed or capped, under controlled humidity conditions.
  • Reseal vial or cap the bottle securely after use
  • Temperature-sensitive standards should be stored in cooling chambers or dedicated refrigerators between 2–8 °C
  • Ensure that the standard is within its validity before use
  • Do not return dispensed standard to the original container, to prevent contamination of the original stock.

This article has emphasized the significant contribution of standards to any chemical analysis. It is indeed difficult to imagine a laboratory functioning without standards and reference materials.

Preparation of Working Standard 

Select an approved batch; the quality attributes of the selected batch are reviewed critically, with special emphasis on its assay and related substances. The batch shall be reviewed against pharmaceutical standards, and the selected batch shall have the highest assay/purity and the lowest related substances.

Analyse the selected batch against the reference standard, using the logbook as per Annexure – VI, and follow the stipulated control procedure, which includes description, identification, moisture content, and assay; perform the assay and moisture content analysis in triplicate. The QC manager shall allocate the material abbreviation and assign a unique number to each working standard as below.
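The triplicate analysis above is typically summarized as a mean assay (which becomes the working standard's assigned value) with a relative standard deviation checked against an acceptance limit. A minimal sketch, with illustrative assay values and an assumed 2.0% RSD limit:

```python
# Sketch: evaluating triplicate assay results for a working standard.
# The working standard is assigned the mean assay; the RSD is checked
# against an acceptance limit (assumed 2.0% here). Values illustrative.

from statistics import mean, stdev

triplicate = [99.6, 99.8, 99.5]  # illustrative assay results (%)

assay_mean = mean(triplicate)
rsd = stdev(triplicate) / assay_mean * 100

print(f"assigned assay: {assay_mean:.2f}%, RSD: {rsd:.2f}%")
assert rsd <= 2.0, "triplicate RSD exceeds acceptance limit"
```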

 



 

Noise and drift

In HPLC we deal with a time-dependent process. The appearance of a component eluting from the column into the detector is represented by the deflection of the recorder pen from the baseline. The problem is distinguishing an actual component from an artifact caused by pressure fluctuations, bubbles, compositional fluctuations, etc. If the peaks are fairly large, there is no problem in distinguishing them; the smaller the peaks, however, the more important it is that the baseline be smooth, free of noise and drift.

Baseline noise is the short-time variation of the baseline from a straight line, caused by electric signal fluctuations, lamp instability, temperature fluctuations, and other factors. Noise usually has a much higher frequency than an actual chromatographic peak. Noise is normally measured "peak-to-peak", i.e., the distance from the top of one small noise peak to the bottom of the next. Sometimes noise is averaged over a specified period of time. Noise is the factor which limits detector sensitivity. In trace analysis, the operator must be able to distinguish between noise spikes and component peaks. A practical limit for this is a 3× signal-to-noise ratio, but only for qualitative purposes. A practical quantitative detection limit is better chosen as a 10× signal-to-noise ratio, which ensures correct quantification of trace amounts with less than 2% variance. The figure below illustrates this, indicating the noise level of a baseline (measured at the highest detector sensitivity) and the smallest peak which can be unequivocally detected.
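The peak-to-peak noise measurement and the 3× and 10× signal-to-noise thresholds can be sketched numerically. The baseline readings and peak height below are illustrative assumptions.

```python
# Sketch: peak-to-peak baseline noise and signal-to-noise checks against
# the 3x (detection) and 10x (quantification) thresholds.
# The synthetic baseline and peak height are illustrative assumptions.

baseline = [0.12, -0.08, 0.10, -0.11, 0.09, -0.07, 0.13, -0.10]  # mAU
peak_height = 1.5  # mAU, measured from the baseline midpoint

noise_pp = max(baseline) - min(baseline)   # peak-to-peak noise
snr = peak_height / noise_pp

print(f"noise (p-p): {noise_pp:.2f} mAU, S/N: {snr:.1f}")
print("detectable (S/N >= 3):", snr >= 3)
print("quantifiable (S/N >= 10):", snr >= 10)
```

In this example the peak clears the 3× detection threshold but not the 10× quantification threshold, so it could be reported as present but not reliably quantified.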


Figure: definition of noise, drift, and smallest detectable peak.

Another parameter related to detector signal fluctuation is drift. Whereas noise is a short-time characteristic of a detector, drift describes how far the baseline deviates from a horizontal line over a longer, specified time, e.g., half an hour or one hour. Drift is usually associated with detector warm-up in the first hour after power-on. The figure also illustrates the meaning of drift.
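Drift over a specified interval can be estimated as the slope of a least-squares line fitted to baseline readings. A minimal sketch, assuming one reading per minute over 30 minutes; the readings are illustrative.

```python
# Sketch: drift estimated as the least-squares slope of baseline
# readings over a fixed interval, expressed in mAU per hour.
# The sampling interval and readings are illustrative assumptions.

def drift_per_hour(readings_mau, interval_min=1.0):
    """Least-squares slope of the baseline, in mAU/hour."""
    n = len(readings_mau)
    t = [i * interval_min for i in range(n)]
    t_mean = sum(t) / n
    y_mean = sum(readings_mau) / n
    slope_per_min = (sum((ti - t_mean) * (yi - y_mean)
                         for ti, yi in zip(t, readings_mau))
                     / sum((ti - t_mean) ** 2 for ti in t))
    return slope_per_min * 60

# Baseline rising roughly 0.01 mAU per minute during warm-up
readings = [0.01 * i for i in range(31)]
print(f"drift: {drift_per_hour(readings):.2f} mAU/h")
```

Fitting a line rather than taking endpoint differences makes the estimate less sensitive to the short-time noise discussed above.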




