
The Validation Conversation

By Joy Dell’Aringa

Our industry is in a perpetual food safety discussion. We are constantly mulling over the finer points of hazards, risk, preventive controls, training, sanitation, and regulations. Validation is also a key component of the food safety dialog. Here we will explore common themes industry professionals discuss in regard to validation in this era of food safety.

Definitions

In any good conversation, terms must be set and semantics agreed upon. It is helpful to start off with a simplistic definition of validation and verification that can be applied across industries and applications. We often return to these reductive definitions throughout conversations to recalibrate and ensure that all parties are on the same page.

  • Validation:  Are we using the correct system / method?
  • Verification: Are we using the system / method correctly?

From there, we narrow our focus. Against the FSMA backdrop, the FDA's "Draft Guidance for Industry: Control of Listeria monocytogenes in Ready-To-Eat Foods" provides the following definitions:

Validation: Obtaining and evaluating scientific and technical evidence that a control measure, combination of control measures, or the food safety plan as a whole, when properly implemented, is capable of effectively controlling the identified hazards.

Verification: The application of methods, procedures, tests and other evaluations, in addition to monitoring, to determine whether a control measure or combination of control measures is or has been operating as intended and to establish the validity of the food safety plan.

Validation and Verification: Semantics Matter.

Definitions for validation and verification are available from various standards organizations and regulatory bodies. What is most important, however, is that in this conversation there is a clear distinction between validation and verification—both in activities and objectives. These are not interchangeable terms. Further, validation and verification can be discussed from two general perspectives in the food safety landscape. Process validation addresses manufacturing activities and controls to prevent product hazard and contamination. Method validation addresses the analytical methods used to verify the physical, chemical or microbiological properties of a product.

Process Validation

Our industry comprises a variety of categorical segments. Each segment faces unique processing challenges, risks and requirements that must be addressed in the validation and verification conversation.

Some segments, such as the dairy industry, have long-standing processes in place that have a robust scientific backbone and leave little room for guesswork, experimentation or modification. "Milk processes were validated years ago and are part of the Pasteurized Milk Ordinance (PMO). The science is there," states Janet Raddatz, vice president of quality & food safety systems at Sargento Foods, Inc. "It is well established that when you pasteurize the product for the time and temperature that has been validated, then you simply verify the pasteurizer is working to the validated specifications."
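
Raddatz's point captures the division of labor: the science (validation) is settled, so the plant's day-to-day job is checking records against the validated specification (verification). As a minimal sketch of that logic, here is what record-checking looks like; the HTST minimums (161 °F for 15 seconds) are the PMO's continuous-flow values, and the log data are hypothetical:

    # Verification as record-checking against a validated spec.
    # Real pasteurizer verification relies on sealed, regulatory-tested
    # controls and recorder charts, not a script; this only shows the logic.
    SPEC_TEMP_F, SPEC_HOLD_S = 161.0, 15.0  # PMO HTST continuous-flow minimums

    records = [(161.5, 16.2), (162.0, 15.8), (160.4, 15.1)]  # hypothetical logs
    for temp_f, hold_s in records:
        ok = temp_f >= SPEC_TEMP_F and hold_s >= SPEC_HOLD_S
        print(f"{temp_f} F / {hold_s} s -> {'PASS' if ok else 'DEVIATION'}")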

However, process validation challenges arise when novel applications, ingredients and processes are employed. Even in an established industry, reformulations of products such as sauces and dressings require a fresh validation perspective and risk assessment. "You must assess the risk anytime there is a change. Properties such as pH, salt and water are critical variables to the safety and microbial stability of a product. Novel processing techniques aimed at 'all natural' or 'minimal processing' consumer demands should also be challenged." Raddatz suggests conducting a full assessment to identify potential areas of risk. A challenge study may also be critical to validate that a certain process or formulation is appropriate.

To help the food industry understand, design and apply good validation and verification practices, the Institute for Food Safety and Health (IFSH) published "Validation and Verification: A Practical, Industry-driven Framework Developed to Support the Requirement of the Food Safety Modernization Act (FSMA) of 2011." This insightful document provides various definitions, guidance and practical advice, and offers several dos and don'ts on validation and verification activities.

Do:

  • Divide validation and verification into separate tasks
  • Think of validation as your scientific evidence and proof the system controls the hazards
  • Use science-based information to support the initial validation
  • Have management participate in validation development and verification operations
  • Use lessons from “near-misses” and corrections to adjust and improve the food safety system

Don’t:

  • Confuse the activities of verification with those of routine monitoring
  • Rely on literature or studies that are unlike your process/product to prove controls are valid
  • Conduct audit processes and then not review the results
  • Perform corrective actions without determining if a system change may be needed to fix the problem
  • Forget that reanalysis is done every three years, or sooner if new information or problems suggest the need

Method Validation

Analytical methods used to verify a validated food process must also be validated for the specific product and conditions under which they will be conducted. For example, a manufacturer that has its laboratory test a product for Salmonella to verify that a kill step in the manufacturing process worked must ensure that the method the laboratory uses is both validated for that product and verified as appropriate for use in that laboratory. Three general considerations should be discussed with the laboratory:

  • Is the method validated for the product (matrix)?
    • Often, the method will carry several matrix validations that were previously conducted by the diagnostic provider, an industry organization or as a reference method.
    • If the matrix to be tested is not validated, the laboratory should conduct a validation study before proceeding.
  • Has the laboratory verified this method on the product (matrix)?
    • The laboratory should demonstrate that they can indeed perform the validated method appropriately.
    • Verification activities typically involve a matrix-specific spiked recovery (see the sketch after this list).
  • Are there any modifications made to the validated method?
    • All method modifications should be validated and verified. Additionally, modifications should be noted on the laboratory report or Certificate of Analysis issued.
    • Method modifications may include time and temperature alterations, media changes and sample preparation factors.
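
In practice, a spiked-recovery verification boils down to simple bookkeeping: spike replicates of your matrix at a low level, run the method, and compare detections. A minimal sketch follows; the replicate count, spike level and results are hypothetical, and the statistics an accredited protocol requires will be more involved:

    # Spiked-recovery bookkeeping for one matrix (illustrative only).
    def percent_recovery(detected: int, spiked: int) -> float:
        """Percent of spiked replicates the method detected."""
        return 100.0 * detected / spiked

    # Hypothetical: 20 replicates spiked at a low level (~1-5 CFU/portion)
    candidate = percent_recovery(detected=17, spiked=20)   # 85%
    reference = percent_recovery(detected=18, spiked=20)   # 90%
    print(f"candidate: {candidate:.0f}%  reference: {reference:.0f}%")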

AOAC International is an organization that certifies the validation of methods to a specific prescribed standard. "Diagnostic companies seek AOAC approval, which entails a rigorous validation protocol with the selected matrices," says Ronald Johnson, Ph.D., president of AOAC International and associate director of validation for bioMérieux, describing the importance of commercial standardization. "The AOAC validation scheme ensures that the method is robust, rugged, inclusive and exclusive, stable and meets the sensitivity presented." Standards such as these provide confidence to the user that the method is fit-for-purpose, a critical first step in method selection.

While many diagnostic companies will perform standardized validation as described above, how a laboratory validates and verifies a method is incredibly nuanced in the food industry. Currently, there is no standardized approach to study design and execution. Even ISO 17025-accredited laboratories are only required to have a validation and verification protocol—the standard does not dictate what that protocol should look like.

“Currently, there is a lot of variation in the industry around [method] validation,” says Patrick Bird, microbiology R&D laboratory supervisor at Q Laboratories. Bird is a method validation expert who is on the U.S. ISO TAG TC34/SC9 working group 3 for the new ISO validation and verification standards, including ISO/DIS 16140-4 guidelines, “Microbiology of the food chain – Method Validation – Part 4: Protocol for single-laboratory (in-house) method validation.”

"Variables such as number of replicates, spike levels, and even acceptance criteria vary widely from lab to lab—both in manufacturing laboratories and contract testing laboratories. We hope the ISO guidelines will standardize that," says Bird. He goes on to discuss the importance of good laboratory stewardship in the industry. "While some look at validations as a proprietary or competitive advantage, the testing industry must realize that without standardization, poor validation and verification practices by a few can tarnish the great science done by the many, and ultimately jeopardize the safety of our food supply." He stresses the importance of quality operations and open communication with laboratories, whether in-house or third party. "Now that validation is highlighted as a required area in FSMA Preventive Controls, more and more companies are paying attention to the methods and associated validation/verification data their labs can provide."


How One Company Eliminated Listeria Using Chlorine Dioxide Gas

By Kevin Lorcheim

The previous article discussed the various decontamination options available to eliminate Listeria. It was explained why the physical properties of gaseous chlorine dioxide make it so effective. This article focuses on one company’s use of chlorine dioxide gas decontamination for both contamination response and for preventive control.

The summer of 2015 saw multiple ice cream manufacturers affected by Listeria monocytogenes. The ice cream facility detailed in this article never had a supply outage, but it ceased production for a short time to investigate and correct its contamination. After a plant-wide review of procedures, workflows, equipment design and product testing, multiple corrective actions were put into place to eliminate Listeria from the facility and help prevent it from returning. One such corrective action was to decontaminate the production area and cold storage rooms using chlorine dioxide gas. This step was scheduled after the rest of the corrective actions, so that the entire facility would be decontaminated immediately before production resumed.

Responsive Decontamination

The initial decontamination was in response to the Listeria monocytogenes found at various locations throughout the facility. A food safety investigation and microbiological review took place to find the source of the contamination within the facility and to put a corrective action plan in place. Listeria was found in a number of locations, including the dairy brick flooring that ran throughout the production area. A decision was made to replace the flooring, among other equipment upgrades and procedural changes, in order to provide a safer food manufacturing environment once production resumed. Once the lengthy repair and upgrade list was completed, the chlorine dioxide gas decontamination was initiated.

The facility in question was approximately 620,000 cubic feet in volume, spanning multiple rooms as well as a tank alley located on a different floor. The timeline to complete the decontamination was 2.5 days. The first half-day consisted of safety training, a plant orientation tour, a meeting with plant supervisors, and the unpacking of equipment. The second day involved the setup of all equipment, which included chlorine dioxide gas generators, air distribution blowers, and a chlorine dioxide gas concentration monitor. Gas injection tubing was run from the generators to approximately 30 locations within the production area, with the injection points spaced apart from one another to aid the gas's natural distribution. Gas sample tubing was run to various points away from the injection locations, where concentrations would otherwise read highest, so that the monitor captured the areas farthest from injection. Sample locations were also placed in spots known to be positive for Listeria monocytogenes to provide a more complete record of treatment for those locations. In total, 14 sample locations were selected between plant supervisors and the decontamination team. Throughout the entire decontamination, the gas concentration monitor continuously pulled samples from those locations to track the concentration of chlorine dioxide gas and ensure that the proper dosage was reached.
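
Gaseous decontamination dosage is commonly tracked as concentration multiplied by exposure time (ppm-hours), accumulated from the monitor's periodic samples. A minimal sketch of that accounting; the readings, sampling interval and any target dose here are hypothetical, since the article does not give the facility's actual cycle parameters:

    # Accumulate exposure (ppm-hrs) from periodic concentration samples.
    def accumulated_dose(samples_ppm, interval_minutes):
        """Each reading is assumed to hold for one sampling interval."""
        return sum(c * interval_minutes / 60.0 for c in samples_ppm)

    readings = [0, 150, 290, 310, 305, 300, 298]  # ppm, every 15 minutes
    print(f"{accumulated_dose(readings, 15):.0f} ppm-hrs so far")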

As a final means of process control, 61 biological indicators were brought in to validate that the decontamination process was effective at achieving a 6-log sporicidal reduction. Sixty would be placed at various challenging locations within the facility, while one would be randomly selected to act as a positive control that would not be exposed to chlorine dioxide gas. Biological indicators provide a reliable method to validate decontamination, as they are produced in a laboratory to be highly consistent and contain more than a million bacterial spores impregnated on a paper substrate and wrapped in a Tyvek pouch. Bacterial spores are considered the hardest microorganisms to kill, so validating that the process killed all million spores on an indicator in effect also proves the process was able to eliminate Listeria from surfaces. The biological indicators were placed at locations known to be positive for Listeria, as well as other hard-to-reach locations such as the interior of production equipment, underneath equipment and inside some piping systems.
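
The "6-log" criterion the indicators verify can be written out directly. A worked example, using the article's figure of roughly 10^6 spores per indicator:

    \text{log reduction} = \log_{10}\!\left(\frac{N_0}{N}\right), \qquad \log_{10}\!\left(\frac{10^6}{1}\right) = 6

An indicator that returns no growth (fewer than one survivor from N_0 of about 10^6) therefore demonstrates at least a 6-log kill at that location.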

In order to prepare the facility for decontamination, all doors, air handling systems, and penetrations into the space were sealed off to keep the gas within the production area. After a safety sweep for personnel, the decontamination was performed to eliminate Listeria from all locations within the production area.


Food Genomics

Microbiomes: A Versatile Tool for FSMA Validation and Verification

By Douglas Marshall, Ph.D., Gregory Siragusa
No Comments

Genomics tools are valuable additions for companies seeking to meet and exceed validation and verification requirements for FSMA compliance (21 CFR 117.3). In this installment of Food Genomics, we present reasons why microbiome analyses are powerful tools for FSMA requirements, both now and certainly in the future.

Recall that in the first installment of Food Genomics we defined a microbiome as the community of microorganisms that inhabit a particular environment or sample. For example, a food plant's microbiome includes all the microorganisms that colonize the plant's surfaces and internal passages. This can be a targeted (amplicon sequencing-based) or a metagenomic (whole shotgun metagenome-based) microbiome. Microbiome analysis can be carried out on processing plant environmental samples, on raw ingredients, during shelf-life or challenge studies, and in cases of overt spoilage.

As a refresher of FSMA requirements, here is a brief overview. Validation activities include obtaining and evaluating scientific and technical evidence that a control measure, combination of control measures, or the food safety plan as a whole, when properly implemented, is capable of effectively controlling the identified microbial hazards. In other words, can the food safety plan, when implemented, actually control the identified hazards? Verification activities include the application of methods, procedures, tests and other evaluations, in addition to monitoring, to determine whether a control measure or combination of control measures is or has been operating as intended, and to establish the validity of the food safety plan. Verification ensures that the controls in the food safety plan are actually being properly implemented in a way to control the hazards.

Validation establishes the scientific basis for food safety plan process preventive controls. Examples include using scientific principles and data such as routine indicator microbiology, using expert opinions, conducting in-plant observations or tests, and challenging the process at the limits of its operating controls by conducting challenge studies. FSMA requires initial validation before the food safety plan is implemented (ideally), within the first 90 calendar days of production, or within a reasonable timeframe with written justification by the preventive controls qualified individual. Additional validation efforts must occur when a change in control measure(s) could impact efficacy or when reanalysis indicates the need.

FSMA requirements stipulate that validation is not required for food allergen preventive controls, sanitation preventive controls, supply-chain program, or recall plan effectiveness. Other preventive controls also may not require validation with written justification. Despite the lack of regulatory expectation, prudent processors may wish to validate these controls in the course of developing their food safety plan. For example, validating sanitation-related controls for pathogen and allergen controls of complex equipment and for how long a processing line can run between cleaning are obvious needs.

There are many routine verification activities expected of FSMA-compliant companies. Examples of process verification include validation of effectiveness, checking equipment calibration, records review, and targeted sampling and testing. Food allergen control verification includes label review and visual inspection of equipment; however, prudent manufacturers using equipment for both allergen-containing and allergen-free foods should consider targeted sampling and testing for allergens. Sanitation verification includes visual inspection of equipment, with environmental monitoring as needed for RTE foods exposed to the environment after processing and before packaging. Supply-chain verification should include second- and third-party audits and targeted sampling and testing. Additional verification activities include system verification, food safety plan reanalysis, third-party audits and internal audits.

Verification procedures should be designed to demonstrate that the food safety plan is consistently being implemented as written. Such procedures are required as appropriate to the food, facility and nature of the preventive control, and can include calibration of process monitoring and verification instruments, and targeted product and environmental monitoring testing.

Safe Food: A Product of a Clean Environment

By Gina R. Nicholson-Kramer

Recently we have seen an increase in foodborne illness outbreaks, from Listeria to Salmonella to Norovirus to E. coli, many of which are a result of post-lethal contamination of processed foods. This is often a direct result of a gap in the sanitation programs in place at the processing facilities. Every facility should conduct a sanitation gap analysis on an annual basis. In order to receive unbiased feedback, this activity is best performed by a third party that is not a chemical provider.

Developing and implementing a sound environmental hygiene program at a food processing facility is essential to its success in producing safe food for consumer consumption. There are fundamental basics of sanitation that every plant must follow in developing a strong program, starting with sanitation standard operating procedures (SSOPs) for floors and drains, walls, ceilings, equipment and utensils, and employees. SSOPs must also cover perimeter control, foot traffic control into food preparation areas, zoning, and environmental sampling procedures.


When developing SSOPs, using the proper risk reduction formula will lead to sanitation success. To determine the best risk reduction formula, I sought the advice of sanitation expert Jeff Mitchell, vice president of food safety at Chemstar. Before joining Chemstar, Mitchell was the Command Food Safety Officer for the U.S. Department of Defense (DOD). More than 20 years of service with the DOD gave him the opportunity to visit thousands of processing facilities all over the world, to see the best and the worst, and to assist in finding the root cause of contamination issues and negative environmental sampling results. In this article, I share Mitchell's risk reduction formula for sanitation success and how to use it to build a solid and successful sanitation program.

Foundational Science

"An understanding of the difference between transient and persistent (or resident) pathogens is a key part in the foundational science of sanitation solutions," explained Mitchell as we discussed the details of the risk reduction formula. Transient pathogens are those introduced to the processing facility from the external environment. Entrance occurs via deliveries on transportation vehicles and pallets, food and non-food products and their packaging, employees and visitors, pests and rodents, along with leaks in the roof or improperly cleaned drains, which are known reservoirs.

"Persistent pathogens are those pathogens that establish residency within the processing facility. Most bacteria will aggregate within a biofilm, allowing them to live in communities. A biofilm is a survival mode for the bacteria; it protects them from sanitizer penetration. The biofilm layers actually mask them from sampling detection. You could swab a surface or an area and not get a positive pathogen test result, because the biofilm is masking it," Mitchell stated. He continued to explain that most contamination risks are likely from established populations. Four things need to exist for resident populations to form: pathogen introduction, water, trace organics, and a niche area for attachment and growth. Food processing facilities should be most concerned about these populations, as they have been traced to many recent outbreaks and recalls.

In his experience, Mitchell has found that sanitation efforts should focus on areas within the processing facility where moisture and nutrients collect; both are needed for biofilm formation. Disruption of these niche areas containing biofilm can result in direct (food contact) and indirect (non-food contact) contamination if the biofilm is not completely penetrated or removed. This can occur through active and passive dispersal of pathogens. Active dispersal refers to mechanisms initiated by the bacteria themselves, whereby they naturally eject from the biofilm and land on other surfaces. Passive dispersal refers to biofilm cell detachment mediated by external forces that shear the biofilm, causing it to move and further spread. This can be caused by fluid shear, abrasion and/or vibration due to power washing, equipment vibration, or deep cleaning/scrubbing that does not penetrate and remove all the aggregate layers of biofilm. In other words, the biofilm and pathogens are just smeared around the facility, like cleaning a mirror with a greasy wiping cloth.

Chemistry and Application

The cleaning matrix must be considered to properly remove soils that house both transient and persistent pathogens. This is done by combining proper cleaning and sanitizing agent concentration (ppm), adequate exposure time, proper temperature and mechanical action (agitation), or good old elbow grease. If one factor in the matrix is decreased, the others must be increased to compensate. My years working in industry have taught me that the most expensive quadrant of the cleaning matrix is agitation, because it requires manual labor. Reduction of labor is one of the first ways companies build in efficiencies to increase profit margins. That means a solution must be built that focuses on temperature, concentration and proper contact time to produce the sanitation results necessary to prevent persistent pathogens from establishing residency within processing facilities.
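
The compensation logic is easiest to see as a toy model. The sketch below is purely illustrative: real cleaning efficacy is not a simple product of four normalized factors, and any validated recipe comes from the chemical supplier's data, not arithmetic like this.

    from dataclasses import dataclass

    @dataclass
    class CleaningMatrix:
        concentration_ppm: float
        contact_minutes: float
        temperature_f: float
        agitation: float  # 0 (none) to 1 (vigorous scrubbing)

        def index(self, base: "CleaningMatrix") -> float:
            """Toy relative cleaning effect vs. a validated baseline recipe."""
            return ((self.concentration_ppm / base.concentration_ppm)
                    * (self.contact_minutes / base.contact_minutes)
                    * (self.temperature_f / base.temperature_f)
                    * (self.agitation / base.agitation))

    base = CleaningMatrix(200, 10, 130, 0.8)
    reduced = CleaningMatrix(200, 10, 130, 0.4)   # halve the labor...
    fixed = CleaningMatrix(200, 20, 130, 0.4)     # ...so double contact time
    print(reduced.index(base), fixed.index(base))  # 0.5 1.0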

Temperature should be dictated by the type of soils that need to be removed. High-fat soils need a higher temperature of about 140 °F. However, when removing high-protein soils, the temperature needs to be reduced so that the protein is not baked onto the surface. Baked proteins that are not removed become nutrients for bacteria to aggregate and reside. High temperature does not work in every food processing plant, Mitchell explained.

Proper balance of detergent and sanitizer is necessary to remove and destroy both transient and persistent pathogens. The detergent needs the right formulation and contact time to break down soils and biofilms, followed by the right concentration and contact time of sanitizer to kill the exposed pathogens. Without the right balance in place, conditions become a perfect storm for spread and contamination within the processing facility.

Validation

Do your homework. Research is the most valuable tool when validating the effectiveness of a cleaning process. Private research is good, but it should not be the only form of validation on which to base a business decision. I have found that peer-reviewed published research is best for validating all quadrants of the cleaning matrix. Academic research based on sound science with practical application results is worth the investment to make sound business decisions.

Many products have been developed to penetrate and destroy the biofilm layers within which bacteria aggregate. Again, do your homework. Choose a product that also provides a pathogen kill once the biofilm has been penetrated. I cannot stress enough that the SSOPs must follow the manufacturer's validated processes and that the sanitation team must follow the SSOPs' directions.

Solution

Applying the desired solution requires dividing the processing facility into zones to designate specific sanitation requirements. This will assist in the development of specific SSOPs that apply the right solution in the right zone throughout the site.

Mitchell also gave great advice about cleaning tools and cleaning chemical basics. He explained that a facility should color-code its cleaning tools according to zone and only use them in the designated zone area. This prevents cross contamination, because cleaning tools can be vehicles of contamination transfer. Utilize foam detergents and foam sanitizers: They are more forgiving, they increase contact time, and the sanitation crew can see where the chemicals have been applied. Use the Ross-Miles foam test for stability: Foam should last more than three minutes before breaking and turning into a liquid solution that runs down the drain, costing a site money and opening up the potential for introducing pathogens into production rooms.

Mitchell advised the development of sanitation procedures that focus on daily thorough cleaning of everything from the knees down in Zones 1–3. "You want to knock everything down and keep it down. The objective is to keep bacterial creep from occurring," he said. "Creep is where bacteria are moved by processes like water spray, splash and aerosolization, causing the bacteria to move from one area (it usually develops on the floor) up walls, the legs of equipment, etc., eventually causing contamination of food during production and packaging." Obviously, all food contact surfaces in Zone 1 need specialized SSOPs according to the equipment, food processing shifts per day, and type of foods being processed.

Mitchell stressed that perimeter and foot traffic control entry programs should incorporate a good foam sanitizer that stands up to the Ross-Miles test, with an optimal duration of five minutes. The foam should cover a large enough area that employee foot traffic and equipment must travel through it to achieve contact, controlling transient pathogen entrance into Zones 1–3. Concentration levels in these areas should be at least double the food contact area strength to achieve the log kill needed for control.

Environmental monitoring procedures should follow the zoning process set up for sanitation. "Swabbing for adenosine triphosphate (ATP) and/or aerobic plate count (APC) are tools that can be used to help identify biofilm locations. One thing to note is that the bacteria located under the biofilm are in a modified dormant state, requiring less energy and making less ATP available for detection. With that said, ATP and APC swabbing are still both viable tools to use in sanitation verification," said Mitchell. If you only test for general risk pathogens in your facility, you may receive false negatives due to biofilm masking the pathogen in environmental testing. Utilizing pathogen testing, ATP and APC in concert is the best combination for a facility's environmental monitoring program. The goal is to seek and find, then destroy and verify.

I recently discovered a great visual biofilm detection test from Realzyme that verifies whether the sanitation system in place is working. It can also differentiate between protein build-up and biofilm formation. In my professional opinion, this visual detection test is essential to incorporate in a robust environmental testing system.

Safe Food: The End Product

Our responsibility as food safety/quality professionals is to provide the safest, most delicious food for our customers to enjoy. To ensure safe food in our end product, we need to develop a robust sanitation and environmental testing program that follows the risk reduction formula (Foundational Science + Chemistry & Application + Validation = Solution) and conduct an annual sanitation gap analysis by a third-party expert for continuous improvements.

Apply these steps to protect your food, protect your brand and protect your customers so that they Savor Safe Food in every bite!

Microbiological Method Validation: The Elephant in the Lab

By Evan Henke, PhD, MPH

Food quality managers, it is time we discussed the critical importance of validation studies in the quality lab. Although commonly overlooked, microbiological method validation studies are the linchpins of entire quality programs, and method validations done without rigor are crippling our industry’s ability to truly ensure the quality and safety of foods on a daily basis. This article discusses the purpose and importance of microbiological method validation studies and why the food industry should insist on validation study designs of maximum rigor and validity.

What is a microbiological method, and what exactly is a validation study?

A microbiological method, for the purposes of this discussion, is any microbiology test or assay used in the food industry. It may be a test for indicator organisms such as coliforms or yeast and mold, pathogens such as Salmonella or E. coli O157, or toxins secreted by microorganisms such as Staphylococcal enterotoxin.

A validation study is a one-time study that food safety risk managers complete in order to assure themselves that a new microbiological method produces accurate results that will enable them to effectively measure and manage food safety risk. A validation study is conducted in the actual lab where testing will be performed, with current laboratory analysts, with the specific formulations of foods that are tested regularly.

Food industry regulators and certifying bodies such as SQF expect food producers to use microbiological test methods that are proven fit for use on specific foods. If we are to draw inferences about the fitness of a new test method on specific foods, then we must study how that new test method compares to an accepted reference method, or "gold standard" method. Reference methods are those written in the Food and Drug Administration's Bacteriological Analytical Manual, the United States Department of Agriculture's Microbiological Laboratory Guidebook, or ISO methods. Regulators and experts agree that these methods represent the standard to which all other tests should measure up. Methods certified by AOAC International are not considered reference methods and must be validated as fit for use on foods that are appreciably different from the matrices studied. Likewise, AOAC Performance Tested Method (PTM) and Official Methods of Analysis (OMA) certificates are not substitutes for internal validation studies in any given food plant.

In my experience working with quality labs across the United States, I have seen several different validation study designs used to evaluate alternative, more rapid and cost-effective microbiological methods. Some common validation study designs are shown in Table 1. Multiple alternative tests are available; however, an internal validation study is needed regardless of the test kit manufacturer. Rarely does a validation study include a comparison to agar plates, which are required for almost every microbiological reference method. Material costs, labor costs, and emergency situations typically prohibit food labs from conducting a rigorous validation study that can speak to the performance of a new method in relation to the current gold standard.

Table 1: Scientific Questions Inherent in Food Microbiology Method Validation Study Designs

  • Design: Test positive or spiked samples side by side on the reference method and the new test.
    Inherent scientific question: Does the test perform comparably to the reference method on my food?
    Does the study explain the performance of the new test? Yes.
  • Design: Test positive or spiked samples on the new test only.
    Inherent scientific question: Regardless of accuracy, can the test detect specific bacteria in my food?
    Does the study explain the performance of the new test? No, but it may be useful to understand workflow.
  • Design: Test any samples side by side on the current AOAC certified method and the new test.
    Inherent scientific question: How do the new test's results compare to my current AOAC certified method on my food?
    Does the study explain the performance of the new test? No, but it may be useful to understand workflow.
  • Design: Test any positive or negative samples on the new test.
    Inherent scientific question: Will the new test's workflow improve my lab's efficiency?
    Does the study explain the performance of the new test? No, but it may be useful to understand workflow.

This table presents several validation study designs common in the food industry and the scientific question addressed by each design.
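
Only the first design in Table 1 compares the new method against the reference on the same samples. AOAC and ISO protocols prescribe their own formal acceptance statistics; the sketch below only shows the basic bookkeeping of such a paired comparison, using hypothetical results:

    # Paired results on spiked samples: (reference_detected, new_detected).
    # Counts are hypothetical; real protocols define the required statistics.
    pairs = [(True, True)] * 46 + [(True, False)] * 3 + [(False, True)] * 1

    ref_hits = sum(r for r, _ in pairs)
    new_hits = sum(n for _, n in pairs)
    discordant = sum(r != n for r, n in pairs)

    print(f"reference: {ref_hits}/{len(pairs)}  new: {new_hits}/{len(pairs)}")
    print(f"discordant pairs: {discordant}")  # high discordance = disagreement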

It is in the best interests of food producers and the public’s health to conduct rigorous validation studies that give food safety risk managers good information to make correct risk management decisions. In theory, some percentage of unsolved epidemiological foodborne illness clusters must be due to incorrect risk management decisions that allowed contaminated products to reach the market. At the same time, some percentage of all food lot rejections and recalls must be made incorrectly. A portion of these events must be related to food matrix interference that yielded incorrect microbiological results and caused the wrong risk management decision. As they say, “Garbage in, garbage out.”

In addition, including a comparison to agar reference methods in your microbiological method validation study is critical, as it reduces your chances of making an incorrect risk management decision.

Look at things this way: Plants certified under a GFSI-accredited quality scheme have already put in the effort to ensure analytical equipment such as thermometers and scales is calibrated. Validating microbiological methods against a reference method is equally important, if not more so. Finished product microbiology results inform decisions made every day that affect your profits and losses, and those results are likely a primary metric you use to study the effectiveness of your prerequisite programs and preventive controls.

Consider a quality lab that is using an alternative microbiological method that has not been rigorously validated with the plant's specific foods. Unknown to the lab, its results are twice as variable day to day as the reference agar method and are frequently inaccurate relative to the plant's product specifications. A rigorous method validation would demonstrate that results on the current method vary widely while results for the same samples on a reference method are consistent. This well-intentioned plant is unknowingly making incorrect risk management decisions not just multiple times per year but multiple times per week, either accidentally releasing contaminated product, reworking product that is acceptable, or disposing of perfectly good product. For the millions of dollars the food producer invests in prerequisite programs, preventive controls, quality personnel, and testing, the plant is unable to optimize its food safety risk management simply due to an unknown and overlooked incompatibility of the microbiological method with the plant's product.
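
The "twice as variable" scenario is exactly what a side-by-side quantitative comparison exposes. A minimal sketch, assuming paired counts (log10 CFU/g) on split samples; the data are hypothetical, and real validation protocols specify their own statistics:

    import statistics

    # Hypothetical split-sample counts (log10 CFU/g) on the same 10 lots.
    reference  = [3.1, 3.0, 3.2, 3.1, 2.9, 3.0, 3.1, 3.2, 3.0, 3.1]
    new_method = [3.3, 2.9, 3.4, 2.9, 3.2, 2.9, 3.3, 3.0, 3.3, 2.8]

    ref_sd = statistics.stdev(reference)    # ~0.09 log
    new_sd = statistics.stdev(new_method)   # ~0.22 log, over twice as wide
    print(f"reference SD: {ref_sd:.2f}  new method SD: {new_sd:.2f}")
    # Similar means but a much wider spread on the new method is the hidden
    # variability described above, invisible without the comparison.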

In my estimation, the costs of rigorously validating a microbiological method on all of your food products are outweighed by the potential hidden costs of method incompatibility. The business case justifying the costs of a validation study is strong and compelling. And learning how to apply current microbiological methods to your specific foods is not as hard as you might think, considering the large host of test manufacturers, third-party labs, consultants, food safety extension services, and industry groups available to provide study design education and services.