
Newer Regulations Clarify Food Microbiology Parameters for Labs

By Jacob Bowland

Accuracy and validity of food test results hinge on purified water and regular water testing.

Laboratory-grade water is well documented in literature from the large life science water manufacturers. Levels of resistivity, total organic carbon (TOC), particles and bacteria classify water into Type 1, 2 or 3, with Type 1 having the most stringent requirements. Each type is useful for a different application depending on the procedure:1,2,3

  • Type 3. Generic applications where water will not come into contact with analytes during the procedure
  • Type 2. Standard applications such as media and buffers
  • Type 1. Critical applications such as GC, MS, HPLC analyzers4
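As a rough illustration, the tiered classification above can be sketched as a check of measured values against per-type limits. The numeric thresholds below are illustrative only, loosely modeled on ASTM D1193-style limits; the actual limits depend on which standard (ISO, CLSI, ASTM or USP) a lab works under.

```python
# Illustrative water-type check. Thresholds are examples only, not
# normative values -- consult the standard your lab is accredited under.

def classify_water(resistivity_mohm_cm: float, toc_ppb: float) -> str:
    """Return the most stringent type the measurements satisfy."""
    if resistivity_mohm_cm >= 18.0 and toc_ppb <= 50:
        return "Type 1"   # critical applications (GC, MS, HPLC)
    if resistivity_mohm_cm >= 1.0 and toc_ppb <= 50:
        return "Type 2"   # standard applications (media, buffers)
    return "Type 3"       # generic, non-contact applications

print(classify_water(18.2, 10))   # freshly polished DI water -> Type 1
print(classify_water(0.5, 300))   # aging house DI loop -> Type 3
```

A check like this makes the article's point concrete: a house DI loop that once produced Type 1 water can quietly drift down to Type 3 as resistivity falls and TOC climbs.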

Achieving high-quality water requires purification through a polishing step such as deionization (DI), reverse osmosis (RO), ultraviolet light (UV), filtration or distillation, which removes specific impurities.3,5

This classification system gets muddled, because different agencies publish their own standards, each examining different end-point analyses and limits:

  • ISO (International Organization for Standardization)
  • CLSI (Clinical and Laboratory Standards Institute)
  • ASTM (American Society for Testing and Materials)
  • USP (United States Pharmacopoeia)2,5

With all these standards and tests in place, many labs assume that their installed DI water supply is clean; in reality, the water is often closer to Type 3 than the Type 1 that critical applications require.

The problem with using lower-quality water in food testing labs is that the accuracy and validity of tests will be compromised. Many of the analyzers requiring Type 1 water will register contamination from lower-quality water, making it difficult to identify actual contamination or yielding false positives. False positives can result when microorganism contamination in the water is amplified through the testing procedure. In addition, dirty water can damage expensive machinery: instruments designed for a high-purity water supply can malfunction when less-pure water is used. For example, a system with microfilters can clog rapidly on lower-quality water and, if left unnoticed, burst its tubing and flood the lab.

Newer standards, ISO 11133:2014 together with ISO/IEC 17025:2005, provide clarity on food microbiology water parameters for the laboratory. ISO 11133:2014, “Microbiology of food, animal feed and water – Preparation, production, storage and performance testing of culture media,” describes how water for culture media must be purified. The recommended purification is distillation, demineralization, DI or RO, with storage in an inert container. To verify purity, labs must regularly test the water to ensure that microbial contamination is kept to a minimum. Under ISO/IEC 17025:2005, the general standard for laboratory accreditation as applied to food microbiology, the laboratory’s water source should be tested on a recurring schedule to verify the required quality: daily testing examines the water’s resistivity; monthly testing examines chlorine levels and aerobic plate counts; yearly testing examines heavy metals. Accuracy and validity of food test results therefore depend on producing purified water and testing it regularly.

1. Veolia. (n.d.). Water Quality. Retrieved from: http://www.elgalabwater.com/water-quality-en-us
2. Puretec Industrial Water. (n.d.). Laboratory Water Quality Standards. Retrieved from: http://puretecwater.com/laboratory-water-quality-standards.html
3. Millipore. (n.d.). Water in the Laboratory. Retrieved from: http://www.emdmillipore.com/US/en/water-purification/learning-centers/tutorial/OPab.qB.IxUAAAE_MkoRHe3J,nav
4. Denoncourt, J. (2010). Pure Water. Retrieved from: http://www.labmanager.com/lab-design-and-furnishings/2010/09/pure-water?fw1pk=2#.VRrT7fnF-Cn
5. National Institutes of Health. (2013). Laboratory Water: Its Importance and Application. Retrieved from: http://orf.od.nih.gov/PoliciesAndGuidelines/Documents/DTR%20White%20Papers/Laboratory%20Water-Its%20Importance%20and%20Application-March-2013_508.pdf

Jacob Bowland is Product Manager at Heateflex and Steven Hausle is Vice President of Sales and Marketing at Heateflex.



High False-Negative Rates for Pathogen Food Safety Testing

By Thomas R. Weschler
Thomas R. Weschler, Founder and President, Strategic Consulting, Inc (SCI)

This article looks at proficiency testing (PT) for pathogen analysis, and the recent finding by the American Proficiency Institute (API) of a 6.6 percent false-negative rate on food safety PT samples (a 14-year average for the 1999–2012 period).

While at IAFP this year, I met with Heather Jordan, who directs food PT programs at API. The proficiency testing programs are used at many food labs in conjunction with lab accreditation programs. Proficiency testing is done at food plant labs (FPLs) and corporate labs, as well as at food contract testing labs (FCLs) as a way to demonstrate quality results in their food micro and chemistry testing.

More proficiency testing but less proficiency?

In fact, the use of PTs is increasing in food labs, which is probably tied in part to the push for lab accreditation by FSMA and non-government groups like GFSI. Yet it seems to me that the current use of PTs doesn’t go far enough to enable an FPL or FCL to demonstrate overall laboratory competency, and gain or maintain accreditation (ISO 17025).

In most labs, PTs are done just a few times a year. And really, they test the competency of the lab technician and protocols used in analyzing the PT samples. They are not a holistic measure of the lab and its ability to consistently generate quality results on every test run by every operator in the lab.

In a previous life I ran a group of environmental testing labs, which also are required to run PT samples during the year. From this experience, I know that lab personnel are aware that PTs are in-house: The sample-receiving group logs them in, and then alerts management. As a result, the best operators usually are assigned to run the PTs. This kid-glove treatment is not representative of day-to-day practices and processes. If we really want to validate and accredit the proficiency of an entire lab, shouldn’t every operator be tested on all protocols in use?

Plus, if labs know when they are running PT samples, and likely have their best operators running them, shouldn’t there be few, if any, false-negative or false-positive results? Surprisingly, that’s not what the API research found…

API study: Performance accuracy for food pathogens remains problematic

In a retrospective study, “Pathogen Detection in Food Microbiology Laboratories: An Analysis of Proficiency Test Performance,” API analyzed the results from 39,500 food proficiency tests conducted between 1999 and 2012 to see how U.S. labs are doing in detecting or ruling out contamination of four common food pathogens.

Over the 14-year period, “False negative results ranged from 3.3 percent to 14.0 percent for E. coli O157:H7; 1.9 percent to 10.6 percent for Salmonella spp; 3.4 percent to 11.0 percent for L. monocytogenes; and 0 percent to 19.8 percent for Campylobacter spp.” Most concerning is that while both false positive and false negative rates were down in the last year of the study, the cumulative false negative rate for the 14-year period was 6.6 percent.

As we know, false positive results (in which a sample that does not contain pathogens is incorrectly shown as positive) are a nuisance. But false negative test results—which fail to detect true pathogenic organisms in the sample—are unacceptable.


The cumulative average false positive rate was 3.1 percent, less than half of the false negative rate for the same period.
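To make those percentages concrete, here is a back-of-the-envelope calculation. The 6.6 and 3.1 percent rates are from the API study; the sample counts below are a made-up illustration, since the split of the 39,500 tests into pathogen-positive and pathogen-negative samples isn't given here.

```python
# Back-of-the-envelope view of the API figures quoted above.
false_negative_rate = 0.066   # cumulative, 1999-2012, per the study
false_positive_rate = 0.031

# Hypothetical split -- the study's per-category counts aren't cited here.
positive_samples = 10_000     # PT samples that truly contain the pathogen
negative_samples = 10_000     # PT samples that are truly clean

missed_positives = false_negative_rate * positive_samples  # pathogens not detected
spurious_hits = false_positive_rate * negative_samples     # clean samples flagged

print(f"Missed positives: {missed_positives:.0f}")   # 660
print(f"Spurious hits: {spurious_hits:.0f}")         # 310
```

At any realistic volume, a 6.6 percent false-negative rate means hundreds of contaminated samples passing as clean, which is why the author treats it as the more serious of the two error types.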

The objective of the study—and, I would think, of proficiency testing in general—is to demonstrate improvement in lab performance year over year. The API report, however, concluded otherwise: “Performance accuracy for food pathogens remains problematic with the recent cumulative trend showing a slight decrease for false positive and false negative results.”

Clearly if false negatives happen in proficiency programs, they happen in the course of regular testing at food labs. I’m told that many FCLs and FPLs rely on other parts of their QA systems to make sure testing is being conducted properly. Even so, the documentation of ongoing and unacceptably high false negative rates in PT testing is a big concern for everyone. It also points to a number of follow-on questions:

  • Would the false negative and false positive results be even higher if every technician, rather than the best operator, performed the analysis?
  • PT samples are created in only a couple of sample matrices. Would results be even worse if performed on the myriad of sample matrices present in the food industry?
  • What are the performance results among all of the pathogen methods available? Are some methods better than others when measured in real world conditions? Do the more complex protocols of some pathogen diagnostic systems result in poorer PT performance results?
  • Would PT results and, even more important, lab proficiency improve if the frequency of PTs increased, and were required of every technician involved with real food samples?
  • How can proficiency testing be used to isolate problem areas, whether in the pathogen diagnostic method or the competency of lab operators and processes?
  • And finally, is the performance data different between food contract labs and food plant labs? And are all FCLs equal, or are some more able to deliver quality results?