
USP 23 Fifth Supplement for ultrapure water


Add the following:


Total Organic Carbon (TOC) is an indirect measure of organic molecules present in pharmaceutical waters, measured as carbon. Organic molecules are introduced into the water from the source water, from purification and distribution system materials, and from biofilm growing in the system. TOC can also be used as a process control attribute to monitor the performance of unit operations comprising the purification and distribution system.

A number of acceptable methods exist for analyzing TOC. This chapter does not limit or prevent alternative technologies from being used, but provides guidance on how to qualify these analytical technologies for use, as well as guidance on how to interpret instrument results for use as a limit test. The Standard Solution is a theoretically easy-to-oxidize substance that gives an instrument response at the attribute limit. The analytical technology is qualified by challenging the capability of the instrument with a theoretically difficult-to-oxidize substance in the system suitability portion of the method.

Analytical technologies utilized to measure TOC share the objective of completely oxidizing the organic molecules in an aliquot of sample water to carbon dioxide (CO2), measuring the resultant CO2 levels, and expressing this response as carbon concentration. All technologies must discriminate between the inorganic carbon, which may be present in the water from sources such as dissolved CO2 and bicarbonate, and the CO2 generated from the oxidation of organic molecules in the sample.

Two general approaches are used to measure TOC. One approach determines TOC by subtracting the measured inorganic carbon (IC) from the measured total carbon (TC), which is the sum of organic carbon and inorganic carbon:

TOC = TC - IC

The other approach first purges the IC from the sample before any carbon measurement is performed. However, this IC purging step also purges some of the organic molecules, which can be retrapped, oxidized to CO2, and quantitated as purgeable organic carbon (POC). The remaining organic matter in the sample is also oxidized to CO2 and quantitated as nonpurgeable organic carbon (NPOC). In this approach, TOC is the sum of POC and NPOC:

TOC = POC + NPOC

In pharmaceutical waters, the amount of POC is negligible and can be discounted. Therefore, for the purpose of this methodology, NPOC is equivalent to TOC.
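The two measurement approaches above reduce to simple arithmetic. A minimal sketch, with illustrative (not measured) concentrations in mg of carbon per liter:

```python
# Two equivalent routes to TOC, in mg of carbon per liter.
# All numeric values below are illustrative, not measured data.

# Approach 1: subtract inorganic carbon from total carbon.
tc = 0.62   # total carbon (TC)
ic = 0.14   # inorganic carbon (IC): dissolved CO2, bicarbonate
toc_subtraction = round(tc - ic, 2)   # TOC = TC - IC

# Approach 2: purge IC first, then measure the two organic fractions.
poc = 0.00   # purgeable organic carbon; negligible in pharmaceutical waters
npoc = 0.48  # nonpurgeable organic carbon
toc_purge = round(poc + npoc, 2)      # TOC = POC + NPOC

print(toc_subtraction)  # 0.48
print(toc_purge)        # 0.48
```

Because POC is negligible in pharmaceutical waters, the two routes converge on the same TOC value, which is why NPOC is treated as equivalent to TOC in this methodology.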

Apparatus Requirements-This test method is performed either as an on-line test or as an off-line laboratory test using a calibrated instrument. The suitability of the apparatus must be periodically demonstrated as described below. In addition, it must have a manufacturer’s specified limit of detection of 0.05 mg of carbon per liter (0.05 ppm of carbon) or lower.

USP Reference Standards (11) - USP 1,4-Benzoquinone RS. USP Sucrose RS.

Reagent Water-Use High-purity Water as defined under Containers (661), having, in addition, a TOC level of not more than 0.25 mg per liter.

Glassware Preparation-Organic contamination of glassware results in higher TOC values. Therefore, use glassware and sample containers that have been scrupulously cleaned of organic residues. Any method that is effective in removing organic matter can be used (see Cleaning Glass Apparatus (1051)). Use Reagent Water for the final rinse.

Standard Solution-Dissolve in the Reagent Water an accurately weighed quantity of USP Sucrose RS, previously dried at 105 for 2 hours, to obtain a solution having a concentration of 1.19 mg of sucrose per liter (0.50 mg of carbon per liter).

Test Solution-[Note-Use extreme caution when obtaining samples for TOC analysis. Water samples can be easily contaminated during the process of sampling and transportation to a testing facility.] Collect the Test Solution in a tight container with minimal head space, and test in a timely manner to minimize the impact of organic contamination from the closure and container.

System Suitability Solution-Dissolve in Reagent Water an accurately weighed quantity of USP 1,4-Benzoquinone RS to obtain a solution having a concentration of 0.75 mg per liter (0.50 mg of carbon per liter).

Reagent Water Control-Use a suitable quantity of Reagent Water obtained at the same time as that used in the preparation of the Standard Solution and the System Suitability Solution.

Other Control Solutions-Prepare appropriate reagent blank solutions or other specified solutions needed for establishing the apparatus baseline or for calibration adjustments, following the manufacturer's instructions, and run the appropriate blanks to zero the instrument.

System Suitability-Test the Reagent Water Control in the apparatus, and record the response, rw. Repeat the test using the Standard Solution, and record the response, rs. Calculate the corrected Standard Solution response, which is also the limit response, by subtracting the Reagent Water Control response from the Standard Solution response: the theoretical limit of 0.50 mg of carbon per liter is equal to the corrected Standard Solution response, rs - rw. Test the System Suitability Solution in the apparatus, and record the response, rss. Calculate the corrected System Suitability Solution response by subtracting the Reagent Water Control response from the System Suitability Solution response, rss - rw. Calculate the response efficiency for the System Suitability Solution by the formula:

100 [(rss - rw) / (rs - rw)]

The system is suitable if the response efficiency is not less than 85% and not more than 115% of the theoretical response.

Procedure-Perform the test on the Test Solution, and record the response, ru. The Test Solution meets the requirements if ru is not more than the limit response, rs - rw. This method can alternatively be performed using on-line instrumentation that has been appropriately calibrated and standardized and has demonstrated acceptable system suitability. The acceptability of such on-line instrumentation for quality attribute testing is dependent on its location(s) in the water system. These instrument location(s) and responses must reflect the quality of the water used.
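The system suitability calculation and the limit test can be sketched as follows. This is a minimal sketch: the function names are ours, and the instrument responses are illustrative readings in arbitrary units, not data from the chapter.

```python
def response_efficiency(rss, rs, rw):
    """Percent response efficiency: 100 * (rss - rw) / (rs - rw)."""
    return 100.0 * (rss - rw) / (rs - rw)

def system_suitable(rss, rs, rw):
    """Suitable if the response efficiency lies between 85% and 115%."""
    return 85.0 <= response_efficiency(rss, rs, rw) <= 115.0

def meets_toc_limit(ru, rs, rw):
    """Test Solution passes if ru does not exceed the limit response rs - rw."""
    return ru <= rs - rw

# Illustrative instrument responses (arbitrary units):
rw, rs, rss, ru = 0.10, 0.60, 0.55, 0.45

print(round(response_efficiency(rss, rs, rw), 1))  # 90.0
print(system_suitable(rss, rs, rw))                # True
print(meets_toc_limit(ru, rs, rw))                 # True
```

Note that both the suitability window and the limit test use blank-corrected responses, so the Reagent Water Control reading rw is subtracted everywhere before any comparison.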

 Add the following:


Electrical conductivity in water is a measure of the ion-facilitated electron flow through it. Water molecules dissociate into ions as a function of pH and temperature and result in a very predictable conductivity. Some gases, most notably carbon dioxide, readily dissolve in water and interact to form ions, which predictably affect conductivity as well as pH. For the purpose of this discussion, these ions and their resulting conductivity can be considered intrinsic to the water.

Water conductivity is also affected by the presence of extraneous ions. The extraneous ions used in modeling the conductivity specifications described below are the chloride and ammonium ions. The conductivity of the ubiquitous chloride ion (at the theoretical endpoint concentration of 0.47 ppm, when it was a required attribute test in USP XXII and earlier versions) and the ammonium ion at the limit of 0.3 ppm represents a major portion of the allowed water impurity level. A balancing quantity of cations, such as the sodium ion, is included in this allowed impurity level to maintain electroneutrality. Extraneous ions such as these may have a significant impact on the water's chemical purity and suitability for use in pharmaceutical applications. The combined conductivities of the intrinsic and extraneous ions vary as a function of pH and are the basis for the conductivity specifications described in the accompanying table and used when performing Stage 3 of the test method. Two preliminary stages are included in the test method. If the test conditions and conductivity limits are met at either of these preliminary stages, the water meets the requirements of this test, and proceeding to the third stage is unnecessary. Only in the event of failure at the final test stage is the sample judged noncompliant with the requirements of the test.

 Instrument Specifications and Operating Parameters

Water conductivity must be measured accurately using calibrated instrumentation. The conductivity cell constant, a factor used as a multiplier for the scale reading from the meter, must be known within ±2%. The cell constant can be verified directly by using a solution of known conductivity, or indirectly by comparing the instrument reading taken with the cell in question to readings from a cell of known or certified cell constant.

Meter calibration is accomplished by replacing the conductivity cell with NIST-traceable precision resistors (accurate to ±0.1% of the stated value) or an equivalently accurate adjustable resistance device, such as a Wheatstone bridge, to give a predicted instrument response. Each scale on the meter may require separate calibration prior to use. The frequency of recalibration is a function of instrument design, degree of use, etc. However, because some multiple-scale instruments have a single calibration adjustment, recalibration may be required between each use of a different scale. The instrument must have a minimum resolution of 0.1 µS/cm* on the lowest range. Excluding the cell accuracy, the instrument accuracy must be ±0.1 µS/cm.

Because temperature has a substantial impact on conductivity readings of specimens at high and low temperatures, many instruments automatically correct the actual reading to display the value that theoretically would be observed at the nominal temperature of 25°. This is done using a temperature sensor in the conductivity cell probe and an algorithm in the instrument's circuitry. This temperature compensation algorithm may not be accurate. Conductivity values used in this method are non-temperature-compensated measurements.

The procedure described below is designed for measuring the conductivity of Purified Water and Water for Injection using a conductivity meter equipped with a "dip" type conductivity cell. For water having a conductivity below 1.0 µS/cm, a "flow through" cell may function better. Stage 1 of the procedure below may alternatively be performed (with the appropriate modifications to Step 1) using on-line instrumentation that has been appropriately calibrated, whose cell constants have been accurately determined, and whose temperature compensation function has been disabled. The suitability of such on-line instrumentation for quality control testing is also dependent on its location(s) in the water system. The selected instrument location(s) must reflect the quality of the water used.


 Stage 1

1. Determine the temperature of the water and the conductivity of the water using a non-temperature-compensated conductivity reading. The measurement may be performed in a suitable container or as an on-line measurement.

2. Using the Stage 1-Temperature and Conductivity Requirements table, find the temperature value that is not greater than the measured temperature. The corresponding conductivity value is the limit at that temperature.

3. If the measured conductivity is not greater than the table value, the water meets the requirements of the test for conductivity. If the conductivity is higher than the table value, proceed with Stage 2.
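The Stage 1 lookup ("find the temperature value that is not greater than the measured temperature", i.e. round the temperature down to a table entry) can be sketched as follows. The table subset here is illustrative; the full Stage 1 table in the chapter governs.

```python
# Stage 1: round the measured temperature DOWN to the nearest tabulated
# temperature, then compare the conductivity to that row's limit.
# Illustrative subset of the table: degrees C -> limit in microsiemens/cm.
STAGE1_LIMITS = {20: 1.1, 25: 1.3, 30: 1.4}

def stage1_passes(temp_c, conductivity_us_cm, limits=STAGE1_LIMITS):
    eligible = [t for t in limits if t <= temp_c]
    if not eligible:
        raise ValueError("measured temperature below the table range")
    limit = limits[max(eligible)]  # highest entry not above the measurement
    return conductivity_us_cm <= limit

print(stage1_passes(23.4, 1.0))  # True: limit at the 20-degree row is 1.1
print(stage1_passes(23.4, 1.2))  # False: 1.2 exceeds the 1.1 limit
```

Rounding the temperature down is conservative: a lower tabulated temperature carries a lower (stricter) conductivity limit, so the sample is never judged against a limit more lenient than its actual temperature warrants.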

Stage 2

4. Transfer a sufficient amount of water (100 mL or more) to a suitable container, and stir the test specimen. Adjust the temperature, if necessary, and, while maintaining it at 25 ± 1°, begin vigorously agitating the test specimen while periodically observing the conductivity. When the change in conductivity (due to the uptake of atmospheric carbon dioxide) is less than a net of 0.1 µS/cm per 5 minutes, note the conductivity.

5. If the conductivity is not greater than 2.1 µS/cm, the water meets the requirements of the test for conductivity. If the conductivity is greater than 2.1 µS/cm, proceed with Stage 3.

*µS/cm (microsiemens per centimeter) = µmho/cm = reciprocal of megohm-cm.

Stage 1-Temperature and Conductivity Requirements

(for non-temperature-compensated conductivity measurements only)

Temperature     Conductivity Requirement (µS/cm)*
0°              0.6
5°              0.8
10°             0.9
15°             1.0
20°             1.1
25°             1.3
30°             1.4
35°             1.5
40°             1.7
45°             1.8
50°             1.9
55°             2.1
60°             2.2
65°             2.4
70°             2.5
75°             2.7
80°             2.7
85°             2.7
90°             2.7
95°             2.9
100°            3.1

*µS/cm (microsiemens per centimeter) = µmho/cm = reciprocal of megohm-cm.

Stage 3-pH and Conductivity Requirements

(for atmosphere- and temperature-equilibrated samples only)

pH      Conductivity Requirement (µS/cm)*
5.0     4.7
5.1     4.1
5.2     3.6
5.3     3.3
5.4     3.0
5.5     2.8
5.6     2.6
5.7     2.5
5.8     2.4
5.9     2.4
6.0     2.4
6.1     2.4
6.2     2.5
6.3     2.4
6.4     2.3
6.5     2.2
6.6     2.1
6.7     2.6
6.8     3.1
6.9     3.8
7.0     4.6

*µS/cm (microsiemens per centimeter) = µmho/cm = reciprocal of megohm-cm.

 Stage 3

6. Perform this test within approximately 5 minutes of the conductivity determination in Step 5, while maintaining the sample temperature at 25 ± 1°. Add a saturated potassium chloride solution to the same water sample (0.3 mL per 100 mL of the test specimen), and determine the pH to the nearest 0.1 pH unit, as directed under pH (791).

7. Referring to the Stage 3-pH and Conductivity Requirements table, determine the conductivity limit at the measured pH value. If the measured conductivity in Step 4 is not greater than the conductivity requirements for the pH determined in Step 6, the water meets the requirements of the test for conductivity. If either the measured conductivity is greater than this value or the pH is outside of the range of 5.0 to 7.0, the water does not meet the requirements of the test for conductivity.
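The Stage 2 and Stage 3 decisions can be sketched as follows. This is a minimal sketch: the pH-to-limit mapping shown is only an illustrative subset of the Stage 3 table, and a pH outside 5.0 to 7.0 fails outright.

```python
# Stage 2/Stage 3 decision logic for an equilibrated sample at 25 +/- 1 degrees.
# STAGE3_LIMITS is an illustrative subset of the pH table
# (pH -> conductivity limit in microsiemens/cm); the chapter's table governs.
STAGE3_LIMITS = {5.8: 2.4, 5.9: 2.4, 6.0: 2.4, 6.1: 2.4}

def stage2_passes(conductivity_us_cm):
    # Stage 2: pass at or below 2.1 microS/cm; otherwise proceed to Stage 3.
    return conductivity_us_cm <= 2.1

def stage3_passes(conductivity_us_cm, ph, limits=STAGE3_LIMITS):
    # Stage 3: pH must lie within 5.0-7.0, and the Stage 2 conductivity
    # reading must not exceed the limit tabulated for the measured pH.
    if not (5.0 <= ph <= 7.0):
        return False
    key = round(ph, 1)  # pH is read to the nearest 0.1 unit
    if key not in limits:
        raise KeyError("pH not covered by this illustrative subset")
    return conductivity_us_cm <= limits[key]

print(stage2_passes(1.9))       # True: passes already at Stage 2
print(stage3_passes(2.3, 6.0))  # True: 2.3 does not exceed the limit at pH 6.0
print(stage3_passes(2.3, 4.8))  # False: pH outside the 5.0-7.0 range
```

Note that Stage 3 reuses the conductivity noted during Stage 2; only the pH measurement is new, which is why the chapter requires Step 6 to follow Step 5 within about 5 minutes.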
