Water purification systems can control water quality, which decreases manufacturing downtime and increases the accuracy of test results.
IVD tests serve many purposes today. For example, several types of biological techniques have been adopted for performing sensitive IVDs, including biochemistry, microbiology, immunoassays, immunohistochemistry, and molecular biology. Such IVD tests are used in both clinical and industrial settings by hospital laboratories, biomedical research laboratories, blood banks, transfusion centers, and physician office laboratories.
The growth of preventive medicine and, in particular, personalized medicine has driven the continuing development of molecular IVD testing. Such development has been possible due to the sequencing of the human genome, recent advances in pathology and genetic research, and improvements in the assays themselves.
IVD testing is also now available for many conditions, including cardiovascular disease, cancer, metabolic diseases (e.g., diabetes and obesity), infectious diseases, allergies, and neurodegenerative diseases. For several years, newborn screening programs have been detecting congenital diseases. For example, the Newborn Screening Program in the Division of Genetic Disorders at the New York State Department of Health's Wadsworth Center (Albany, NY) performs more than 11 million tests annually for more than 40 congenital diseases and HIV.1
More recently, the introduction of microarrays and other related molecular biology techniques in scientific laboratories has made it possible to study gene expression levels, characterize chromosomal restructuring, detect new mutations, and identify single-nucleotide polymorphisms (SNPs). All these advanced techniques have proven critical in the detection of infectious diseases and in predictive oncology. Overall, molecular biology IVD tests can detect a predisposition to a hereditary disease, confirm a diagnosis, and aid in predicting the development of a disease in people who are at risk.
Because IVDs are used in many areas by a variety of people with very different backgrounds and skills, they must meet the criteria of reliability, accuracy, and ease of use. A test's reliability and the quality of its results are the responsibility of the IVD manufacturers, the reagent manufacturers, the medical laboratories, and the test users.
From patient testing to reagent manufacturing, IVD manufacturers must consider the accuracy and reproducibility of new assay reagents, along with their stability and ease of adaptation to multiple platforms. Manufacturers of devices and reagents must also validate the systems used during the manufacturing process and establish quality controls to ensure that each assay will perform according to the specifications. Medical laboratories must comply with standards and good practices to ensure that all assays are performed under conditions defined by the specifications. Finally, test users must be trained and certified to perform the analyses.
Since pure water is used not only to prepare most IVD devices and many reagents but also to run the assays, water purification systems should be validated to ensure consistent water quality. The water quality needed at different stages and for various tasks during the manufacturing process can vary. For example, while biochemical assays tolerate total organic contamination up to 500 ppb, immunoassays demand higher water purity, sterility, and freedom from immunoreactive molecules. Molecular biology tests often have the most stringent water purity requirements because such assays typically need nuclease-free water.
The problem is that many laboratory personnel assume their current water purification systems produce water of a quality that meets all their needs. As a result, even as standards change and IVD testing becomes more sophisticated, the water system remains unchanged, despite water being the most important component in multiple testing platforms. Water systems are part of the laboratory's infrastructure, and unless the facilities have recently been renovated, systems can age and become problematic over time. Most routine system maintenance, consisting merely of filter changes, fails to prevent ever-increasing biofilm buildup and the resulting problems.
Water Quality Standards
The requirements for water quality in medical laboratories are covered in a guideline issued by the Clinical and Laboratory Standards Institute (CLSI; Wayne, PA).2 The guideline replaces the type I, II, and III nomenclature with water purity types, and provides recommendations for purified water usage in clinical laboratories. The guideline defines clinical laboratory reagent water (CLRW) as the minimum water quality for performing chemistry and biochemistry assays. CLRW replaces type I and II water for most applications.3 The definition of quality is based on the measurement of the following parameters: ionic purity (resistivity greater than 10 MΩ∙cm), organic purity (total organic carbon (TOC) less than 500 ppb), bacteria levels (less than 10 cfu/ml), and particulate level.
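The CLRW limits above lend themselves to a simple conformance check. The following is a minimal sketch in Python; the field names and the example readings are illustrative assumptions, not part of the CLSI guideline, which defines only the limits themselves.

```python
# Hypothetical CLRW conformance check based on the CLSI limits cited above.
# The parameter names and sample values are illustrative assumptions.

CLRW_LIMITS = {
    "resistivity_mohm_cm": ("min", 10.0),   # ionic purity: greater than 10 MΩ·cm
    "toc_ppb": ("max", 500.0),              # organic purity: less than 500 ppb TOC
    "bacteria_cfu_ml": ("max", 10.0),       # bacteria: less than 10 cfu/ml
}

def clrw_failures(sample: dict) -> list:
    """Return the list of parameters that fall outside the CLRW limits."""
    failures = []
    for name, (kind, limit) in CLRW_LIMITS.items():
        value = sample[name]
        if (kind == "min" and value < limit) or (kind == "max" and value > limit):
            failures.append(name)
    return failures

sample = {"resistivity_mohm_cm": 14.2, "toc_ppb": 620.0, "bacteria_cfu_ml": 2.0}
print(clrw_failures(sample))  # only the TOC reading exceeds its limit
```

A real monitoring setup would also track particulate levels, which the guideline specifies but which are omitted here for brevity.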
Another new water quality designation, instrument feed water (IFW), allows IVD instrument manufacturers to clarify specifications for their particular methods. Autoclave and wash water correspond to the previously classified type III quality.
The new water quality definitions also include parameters that were not previously specified. For example, special reagent water (SRW) may be specified when additional parameters are needed to ensure that the water quality is suitable for specific applications (e.g., immunoassays and molecular biology-based assays). A complete review of the CLSI document should be done when considering new IVD applications to ensure that the contaminants found in the source water do not become an issue. The SRW grade often must be defined by the users based on their specific needs.
Poor water quality can affect any assay and carries inherent risk, especially when physicians rely solely on the results to make medical decisions. Different IVD tests require different qualities of water, so a relevant risk assessment should be established.
Molecular Biology Testing
Polymerase chain reaction (PCR) and other related methods, enzymatic assays, and nucleic acid-binding assays can become contaminated with bacteria, bacterial by-products, charged organics, and ions. Bacteria release various enzymes and ions that mimic the behavior of the targeted enzymes used in the assay methods. Some contaminating ions are used as enzyme cofactors, while others act as inhibitors. The high-purity water used in such experiments must be nuclease-free, its ionic purity must be high (resistivity of 18.2 MΩ∙cm), and its level of organic compounds must be very low (TOC less than 10 ppb).4 Water meeting these specifications can be delivered by purification units and used as is, with no diethyl pyrocarbonate treatment needed to inactivate ribonucleases, before running reverse transcriptase or PCR experiments.5
Figure 1. Left two panels: High levels of organics in water result in inconsistent array hybridization and misleading spot analysis. Right two panels: After improving water quality and removing organic contamination, microarrays exhibit consistent signals.
Microarrays are also affected by water quality. The presence of organics and ions can affect the hybridization and rinsing steps, and can also interfere with fluorescence detection. The water quality for microarrays must meet stringent specifications: high resistivity (18.2 MΩ∙cm), low organic content (TOC less than 5 ppb), no nucleases, and no bacteria. Examples of the effect of water quality on DNA microarrays are highlighted in Figure 1. The presence of high levels of organics in water can lead to a haze on the plates and inconsistent hybridization, preventing a correct spot analysis.
While these guidelines were intended for use in clinical laboratories, similar parameters should be considered by manufacturers of reagents and IVD devices.
The proliferation of automated immunoassays and new assay development has been accompanied by the demand for smaller sample volumes and reaction vessels that are subjected to harsher reaction conditions. All these factors increase assay sensitivity to water quality. Analytical factors that are linked to water quality need to be controlled and optimized to reduce the number of test failures. Drifting calibrations, high blanks, and patient values trending toward the high or low end of the assay can stem from poor water quality, which contributes to erroneous test results.
Figure 2. Top panel: Immunoassay blank before removal of contaminants by ultrafiltration. Bottom panel: Immunoassay blank following removal of contaminants by ultrafiltration.
The typical impact of poor water quality on blanks from an immunoassay method is shown in the top panel of Figure 2. After ultrafiltration was used to remove the bacterial alkaline phosphatase released by decaying bacteria trapped behind the 0.22-µm membrane used to filter the blank, the detected signal was much more stable and reliable (see Figure 2, bottom panel).
Reagent and Device Manufacturing
Manufacturing sites that develop and produce reagents used by medical laboratories adhere to standards other than CLSI's guidelines. However, similar water quality standards also apply to IVD manufacturers. Standards require that reagents be manufactured in a stable and accurate manner; pure water is therefore required to prepare them. Reagent and device manufacturers need greater volumes of water than medical laboratories and use larger water purification systems, although these are still based on similar technologies.
In addition to the water quality specifications (e.g., resistivity, TOC, bacteria, and particulate levels), IVD manufacturing sites have to comply with standards and recommendations issued by various regulatory bodies, such as the American Society for Testing and Materials (ASTM), the International Organization for Standardization (ISO), GMP, and FDA. Such standards require validation of most equipment used during the production of reagents and devices. The ability to validate and qualify the water purification units must be a key criterion in selecting a pure water provider.
Water Purification Technologies
Millipore Corp. (Billerica, MA) has developed high-purity water systems that use a combination of purification technologies designed specifically for clinical laboratories and IVD manufacturing facilities. The purification systems reduce contaminant levels and ensure that the water produced has consistent quality. For example, the Elix Clinical systems combine different technologies to build a complete water purification system. Typical water purification technologies include general filtration to reduce the incoming particle load, and reverse osmosis (RO) to decrease the load of ions, organics, colloids, and particulates. However, RO water is susceptible to the daily and seasonal variations in the quality of tap water.
To eliminate such variations and ensure a more constant quality, electrodeionization (EDI) is included in the water purification process. This technology uses selective anion and cation semipermeable membranes and ion exchange resins that are regenerated with a small electrical current. To reach CLRW standards, additional water purification requires ion exchange resins that reduce ionic contamination to a very low level.
Moreover, bacteria removal and control are very important in any high-purity water installation. These processes are accomplished by various means, including 0.22-µm screen membrane filtration, germicidal UV (254 nm) treatment, and chemical sanitization. Typically, a 0.22-µm membrane filter is used at the water outlet to control the release of bacteria from the purification system.
More recently, as a result of Millipore's research, ultrafiltration (UF) has been proposed as another method for eliminating bacterial by-products (e.g., alkaline phosphatase, endotoxins) from the final water product. The UF membranes used in Millipore's water purification solutions are hollow fibers that provide a large surface in a small volume and therefore supply the required flow rate with minimum space occupancy. The UF cartridges used in Millipore's water purification systems have also been validated for the efficient removal of pyrogens and nucleases (RNase, DNase). In addition, the UF membranes retain most colloids on their surface. The UF membranes can retain soluble contaminating compounds with a molecular weight as low as 3000 Da, thereby ensuring high-quality water for immunoassay production and testing.
Figure 3. Resistivity and TOC values at the outlet of a Milli-Q Advantage water purification system over time.
To provide water suitable for molecular diagnostics, additional purification units can be connected after the RO-EDI steps. Such units complete the water purification process by removing ions, organics, and bacterial by-products to trace levels. These so-called polishing units, such as the Milli-Q water purification systems, provide high-resistivity, low-TOC, nuclease-free, and bacteria-free water for molecular biology testing. They combine various technologies, including UV photooxidation and activated carbon to reduce the level of organics, ion exchange resins to remove ions, and UF to remove nucleases. The water quality consistently meets standards of resistivity and freedom from nuclease contamination (see Figures 3 and 4).
Figure 4. Filtration of RNase A solution by an ultrafiltration cartridge used at the point of use of a Milli-Q water purification system.
After installation, the maintenance and upkeep of the water purification system are crucial to IVD manufacturing and assay testing. Any potential risks must be considered. For example, what are the potential consequences if the water quality is below standard and this water is used for calibration curves, dilutions, wash cycles, and cuvette cleaning? How water quality affects assay production and testing will ultimately affect the quality of the end product.
A typical water purification system warns end-users with real-time alerts and alarms. Water system parameters should be monitored daily for trends that might indicate a failing system. Action limits ensure that the water system remains under control and therefore need to be defined. Any action limit will depend on the overall water purification system, any further processing of the final water product, and its use. When action limits are exceeded, users must investigate the cause of the problem, take corrective action, assess the impact of the contamination on products manufactured with the water, and document the results of the investigation. By recognizing when a parameter is deteriorating, the operator is alerted to the need for system maintenance. Monitoring logs should also be reviewed frequently so that problems with the system and equipment can be evaluated.
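Daily trend monitoring of this kind can be sketched in a few lines. The following is an illustrative Python example, assuming a rising-trend heuristic against an upper action limit; the 7-day window, the 10% proximity threshold, and the sample TOC readings are all assumptions, not values from the CLSI guideline.

```python
# Illustrative sketch of daily trend monitoring against an action limit.
# The window length, 10% proximity threshold, and readings are assumptions.

def trending_toward_limit(readings, action_limit, window=7):
    """Flag a parameter whose recent readings rise steadily toward an
    upper action limit (e.g., TOC in ppb). Returns True when every
    reading in the window increases and the latest one is within 10%
    of the limit."""
    recent = readings[-window:]
    rising = all(b > a for a, b in zip(recent, recent[1:]))
    near_limit = recent[-1] >= 0.9 * action_limit
    return rising and near_limit

toc_history = [180, 190, 210, 260, 320, 390, 470]  # daily TOC readings, ppb
print(trending_toward_limit(toc_history, action_limit=500))  # prints True
```

In practice such a check would run against each monitored parameter (resistivity, TOC, bacteria counts) and feed the review log described above, so that deterioration is caught before an action limit is actually exceeded.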
The new CLSI water quality standards state that “the water purification system must be validated.”2 Chapter 4.1.1 in the guideline adds that, “The system producing purified water must be validated to meet the user requirement specification (see Section 5.3).”
Any manufacturer of a water purification system should provide a written plan stating how to conduct validation or qualification, including test parameters, product specifications, and acceptance criteria. This validation establishes a high degree of confidence that the equipment, as installed, is consistent with the users' requirements and specifications. Validation can only confirm that a process has been properly developed and is under control; ideally, any late-stage development activity should be finalized by a validation phase.
Adequate validation may benefit IVD assay manufacturers in the following ways:
• Deepens the understanding of processes, decreases the risks of processing problems, and assures the smooth running of the process.
• Decreases the risks of defect costs.
• Decreases the risks of regulatory noncompliance.
• May reduce in-process control and end-product testing requirements once the process is fully validated.
Stéphane Mabic, PhD, is a worldwide application manager at Millipore Corp. (Saint-Quentin-Yvelines, France). He can be reached at stephane_mabic
All water quality parameters, from feed-water properties to high-purity water production, need to be monitored regularly by IVD test users and manufacturers. The water quality delivered for assay testing and production is as important as any other reagent. Controlling bacteria and their by-products with advanced water purification technologies and filters provides high-quality water for developing assays that are sensitive to such contaminants. Controlling water quality also reduces the need for frequent decontamination and lowers costs, thereby optimizing performance and reducing downtime that can be costly to IVD manufacturers. Moreover, as the challenges of developing and commercializing molecular testing move from research to clinical laboratories, consistent water quality in production and testing will ensure more accurate patient results.
1. “Division of Genetics,” Wadsworth Center Web site (Albany, NY [cited 7 May 2009]); available from Internet: www.wadsworth.org/divisions/gendis.htm.
2. “Preparation and Testing of Reagent Water in the Clinical Laboratory; Approved Guideline, Fourth Edition,” Clinical and Laboratory Standards Institute Web site (Wayne, PA [cited 7 May 2009]); available from Internet: www.clsi.org/Content/NavigationMenu/Shop/CurrentVersionsofCLSIDocuments/C03A4.htm.
3. S Mabic, “Maintaining Water Quality in Clinical Chemistry,” Advance for Administrators of the Laboratory 15, no. 8 (2006): 83.
4. S Mabic and I Kano, “Impact of Purified Water Quality on Molecular Biology Experiments,” Clinical Chemistry 41 (2003): 486-91.
5. J Plurad and S Mabic, “RNase Undetectable in Water after Ultrafiltration,” Bioscience Technology 11 (2004): 27-28.
Copyright ©2009 IVD Technology