
Raj Rajagopal, 3M Food Safety
In the Food Lab

Pathogen Detection Guidance in 2020

By Raj Rajagopal

Food production managers have a critical role in ensuring that the products they make are safe and uncontaminated with dangerous pathogens. Health and wellness are in sharp focus for consumers in every aspect of their lives right now, and food safety is no exception. As food safety becomes a continually greater focus for consumers and regulators, the technologies used to monitor for and detect pathogens in a production plant have become more advanced.

It’s no secret that pathogen testing is performed for numerous reasons: to confirm the adequacy of process controls and to verify that foods and beverages have been properly stored or cooked, to name a few. Accomplishing these objectives can call for very different approaches, and depending on their situations, processors rely on different tools offering varying degrees of simplicity, speed, cost, efficiency and accuracy. It’s common today to leverage multiple pathogen diagnostics, ranging from traditional culture-based methods to molecular technologies.

Unfortunately, pathogen detection involves more than subjecting finished products to examination. It has become increasingly clear to the industry that the environment in which food is processed can cross-contaminate products, requiring food manufacturers to be ever-vigilant in cleaning, sanitizing, sampling and testing their sites.

For these reasons and others, it’s important to have an understanding and appreciation for the newer tests and techniques used in the fight against deadly pathogens, and where and how they might be fit for purpose throughout the operation. This article sheds light on the key features of one fast-growing DNA-based technology that detects pathogens and explains how culture methods for index and indicator organisms continue to play crucial roles in executing broad-based pathogen management programs.

LAMP’s Emergence in Molecular Pathogen Detection

Molecular pathogen detection has been a staple technology for food producers since the adoption of polymerase chain reaction (PCR) tests decades ago. However, the USDA FSIS revised its Microbiology Laboratory Guidebook, the official guide to the preferred methods the agency uses when testing samples collected from audits and inspections, last year to include new technologies that utilize loop-mediated isothermal amplification (LAMP) methods for Salmonella and Listeria detection.

LAMP methods differ from traditional PCR-based testing methods in four noteworthy ways.

First, LAMP eliminates the need for thermal cycling. PCR tests fundamentally require thermocyclers that alter the temperature of a sample to drive the reaction. The thermocyclers used for real-time PCR tests, which allow detection in closed tubes, can be expensive and include multiple moving parts that require regular maintenance and calibration. For every food, beverage or environmental surface sample tested, a PCR system runs multiple cycles of heating to about 95 °C to separate the DNA strands and cooling to about 60 °C to extend the new DNA chain. All of this temperature variation adds run time, and the enzyme used in PCR, Taq polymerase, is susceptible to interference from inhibiting substances that are native to a sample and co-extracted with the DNA.

LAMP, by contrast, amplifies DNA isothermally at a steady, stable temperature of around 60 °C. Its Bst polymerase amplifies continuously and better tolerates the sample matrix inhibitors known to trip up PCR. The detection schemes used for LAMP also free its instrumentation from the constraints of numerous moving parts.

Second, it increases the number of DNA primers. Traditional PCR tests recognize two separate regions of the target genetic material, relying on two primers to anneal to the subject’s separated DNA strands and copy and amplify that target DNA.

By contrast, LAMP technology uses four to six primers, which can recognize six to eight distinct regions of the sample’s DNA. The primers and polymerase not only displace the DNA strands, they loop the ends of the strands together before amplification begins. This unique looped structure both accelerates the reaction and increases test sensitivity by allowing an exponential accumulation of target DNA.
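Both PCR and LAMP derive their sensitivity from exponential accumulation of target DNA; in idealized PCR, each thermal cycle doubles the target. A minimal sketch of that arithmetic, for illustration only (the function name and efficiency parameter are hypothetical, and real reactions amplify below the ideal):

```python
# Illustrative only: ideal PCR doubles the target each thermal cycle,
# so after n cycles one copy becomes 2**n copies. Real efficiency is lower.
def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    """Expected copy number after `cycles` rounds of amplification.

    efficiency=1.0 models perfect doubling; real reactions run below that.
    """
    return initial_copies * (1.0 + efficiency) ** cycles

print(pcr_copies(1, 30))        # ideal doubling: 2**30 = 1,073,741,824 copies
print(pcr_copies(1, 30, 0.9))   # at 90% efficiency, substantially fewer
```

Exponential growth of this kind is why even a handful of starting copies becomes detectable after enrichment and roughly 30 cycles, and why LAMP’s looped structure, which sustains continuous amplification, can reach detectable levels quickly at a single temperature.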

Third, it removes steps from the workflow. Before any genetic amplification can happen, technicians must enrich their samples to deliberately grow microorganisms to detectable levels. Technicians using PCR tests then have to pre-dispense lysis buffers or reagent mixes and take other careful steps to extract and purify the DNA.

Commercialized LAMP assay kits, on the other hand, offer more of a ready-to-use approach, with a ready-to-use lysis buffer and a simplified DNA preparation workflow. By requiring only two transfer steps, they can significantly reduce the risk of false negatives caused by erroneous laboratory preparation.

Finally, it simplifies multiple test protocols into one. Food safety lab professionals using PCR technology have historically been required to perform a different test protocol for each individual pathogen, whether that be Salmonella, Listeria, E. coli O157:H7 or others. Not surprisingly, this can increase the chances of error. Labs are often resource-challenged, high-pressure environments, and having to keep multiple testing protocols straight has proven to be a recipe for trouble.

LAMP brings the benefit of a single assay protocol for all pathogen tests. This streamlined workflow, involving minimal steps, simplifies the process and reduces the risk of human error.

Index and Indicator Testing

LAMP technology has streamlined and advanced pathogen detection, but it is impractical for producers to molecularly test every single product they make and every nook and cranny of their production environments. This is where a growing number of companies are using index and indicator tests as part of more comprehensive pathogen environmental monitoring programs. Rather than testing for specific pathogenic organisms, these tools give a microbiological warning sign that conditions may be conducive to undesirable food safety or quality outcomes.

Index tests are culture-based tests that detect microorganisms whose presence (or detection above a threshold) suggests an increased risk for the presence of an ecologically similar pathogen. Listeria spp. is the best-known index organism, as its presence can also mark the presence of the deadly pathogen Listeria monocytogenes. However, there is considerable skepticism among many in the research community about whether any organisms outside of Listeria spp. can be given this classification.

Indicator tests, on the other hand, detect the presence of organisms reflecting the general microbiological condition of a food or the environment. The presence of indicator organisms cannot confirm the presence or absence of a specific pathogen or support an assessment of public health risk, but levels above acceptable limits can indicate insufficient cleaning and sanitation or poor operating conditions.

Should indicator test results exceed established control limits, facilities are expected to take appropriate corrective action and to document the actions taken and the results obtained. Using cost-effective, fast indicator tests as a benchmark to catch problem areas can signal when more precise molecular methods are needed to verify that products are uncontaminated.
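The corrective-action logic described above can be sketched as a simple threshold check. This is a hypothetical illustration, not a regulatory formula; the function name, example limit and disposition strings are invented, and each facility sets its own control limits:

```python
# Hypothetical sketch of indicator-test disposition logic. The control limit
# is supplied by the facility's own program; values here are illustrative.
def evaluate_indicator(count_cfu: float, control_limit_cfu: float) -> str:
    """Return a disposition for an indicator organism count."""
    if count_cfu <= control_limit_cfu:
        return "acceptable"
    # Above the limit: take corrective action, document it, and consider
    # following up with a more precise, pathogen-specific molecular test.
    return "corrective action: re-clean, document, consider molecular follow-up"

print(evaluate_indicator(80, 100))    # acceptable
print(evaluate_indicator(250, 100))   # triggers corrective action
```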

Process Matters

As discussed, technology plays a large role in pathogen detection, and advances like LAMP molecular detection methods combined with strategic use of index and indicator tests can provide food producers with powerful tools to safeguard their consumers from foodborne illnesses. However, whether a producer is testing environmental samples, ingredients or finished product, a test is only as useful as the comprehensive pathogen management plan around it.

The entire food industry is striving to meet the highest safety standards, and the best course of action is to adopt a solution that combines the best available technologies with best practices in process, from sample collection and preparation to monitoring and detection.

Gregory Siragusa, Eurofins
Food Genomics

GenomeTrakr: What Do You Know and What Should You Know?

By Gregory Siragusa, Douglas Marshall, Ph.D.

This month we are happy to welcome our guest co-authors and interviewees Eric Brown, Ph.D. and Marc Allard, Ph.D. of CFSAN as we explore the FDA’s GenomeTrakr program in a two-part Food Genomics column. Many of our readers have heard of GenomeTrakr, but are likely to have several questions regarding its core purpose and how it will impact food producers and processors in the United States and globally. In Part I we explore some technical aspects of the topic followed by Part II dealing with practical questions.

Part I: The basics of GenomeTrakr

Greg Siragusa/Doug Marshall: Thank you Dr. Allard and Dr. Brown for joining us in our monthly series, Food Genomics, to inform our readers about GenomeTrakr. Will you begin by telling us about yourselves and your team?

Eric Brown/Marc Allard: Hello, I am Eric, the director of the Division of Microbiology at the U.S. Food and Drug Administration’s Center for Food Safety and Applied Nutrition. Our team is made up of two branches: one specializes in developing and validating methods for recovering foodborne pathogens from many different food matrices, while the other conducts numerous tests to subtype and characterize foodborne pathogens. The GenomeTrakr program sits in the subtyping branch, as whole genome sequencing (WGS) is the ultimate genomic subtyping tool for characterizing a foodborne pathogen at the DNA level.

Hello, my name is Marc, I am a senior biomedical research services officer and a senior advisor in Eric’s division. We are part of the group that conceived, evaluated and deployed the GenomeTrakr database and network.

Siragusa/Marshall: Drs. Allard and Brown, imagine yourself with a group of food safety professionals ranging from vice president for food safety to director, manager and technologists. Would you please give us the ‘elevator speech’ on GenomeTrakr?

Brown/Allard: GenomeTrakr is a first-of-its-kind distributed network for rapidly characterizing bacterial foodborne pathogens using whole genome sequences (WGS). This genomic data can help FDA with many applications, including trace-back to determine the root cause of an outbreak, as well as providing one workflow for rapidly characterizing all of the pathogens for which the agency has responsibility. These same methods are also very helpful for antimicrobial resistance monitoring and characterization.

Siragusa/Marshall: From the FDA website, GenomeTrakr is described as “a distributed network of labs to utilize whole genome sequencing for pathogen identification.” We of course have very time-proven methods of microbial identification and subtyping, so why do we need GenomeTrakr for identification and subtyping of microorganisms?

Brown/Allard: If all you want to know is the species identification, then you are correct: existing methods can do this. For some applications, however, you need full characterization through subtyping (i.e., below the level of species, down to the actual strain) with WGS. WGS of pathogens provides all of the genetic information about an organism, as well as any mobile elements, such as phages and plasmids, that may be associated with these foodborne pathogens. The GenomeTrakr network and database compile a large amount of new DNA sequence data to more fully characterize foodborne pathogens.

GenomeTrakr and WGS are a means to track bacteria based on knowing the sequence of all DNA that comprises that specific bacterium’s genome. It can be called the “ultimate identifier” in that it will show relationships at a very deep level of accuracy.

Siragusa/Marshall: Is it accurate to say that GenomeTrakr can be considered the new iteration of PulseNet and pulsed-field gel electrophoresis (PFGE)? Will PulseNet and PFGE disappear, or will PulseNet and GenomeTrakr merge into a single entity?

Brown/Allard: PulseNet is a network of public health labs run by the CDC, with USDA and FDA as active participants. The network is alive and well and will continue subtyping pathogens for public health. The current and historical subtyping tool used by PulseNet for more than 20 years is PFGE. It is expected that CDC, USDA and FDA’s PFGE data collection will be replaced by WGS data and methods; that transformation has already begun. GenomeTrakr is a network of public health labs run by the FDA to support FDA public health and regulatory activities using WGS methods. Started in 2012, the network is relatively new and is currently focused on using WGS for trace-back to support outbreak investigations and FDA regulatory actions. CDC PulseNet has used WGS data on Listeria and collects draft genomes (i.e., unfinished versions of a final genome, used for quicker assembly) of other foodborne pathogens as well, and USDA FSIS has used WGS for the pathogens found on the foods that it regulates. All of the data from GenomeTrakr and PulseNet are shared at the NCBI Pathogen Detection website (see Figure 1).

Figure 1

Siragusa/Marshall: Does an organism have to be classified to the species level before submitting to GenomeTrakr?

Brown/Allard: Yes, species-level identification is part of the minimal metadata (all of the descriptors related to a sample, such as geographic origin, lot number, sources, ingredients, etc.) required to deposit data in the GenomeTrakr database. This allows initial QA/QC metrics to determine whether the new genome is labeled properly.

Siragusa/Marshall: After an isolate is identified to the species level, would you describe to the reader what the basic process is going from an isolated and speciated bacterial colony on an agar plate to a usable whole genome sequence deposited in the GenomeTrakr database?

Brown/Allard: The FDA has a branch of scientists who specialize in ways to isolate foodborne pathogens from food. The detailed methods they develop ultimately end up in the Bacteriological Analytical Manual (BAM) of approved and validated methods. Once a pathogen is in pure culture, DNA is extracted from the bacterial cells. The DNA is then prepared as a sequencing library, a step that modifies the DNA so sequencing reactions can attach and run, depending on the specific sequencing vendor used. The sequence data is downloaded from the sequencing equipment and then uploaded to the National Center for Biotechnology Information (NCBI) Pathogen Detection website. The database is publicly open, allowing anyone with foodborne pathogen isolates to upload their data and compare their sequences to what is already in the database.

Siragusa/Marshall: Suppose a specific sequence type of a foodborne bacterial pathogen is found and identified in a processing plant, but the plant has never had a positive assay result for that pathogen in its entire history of product production and ultimate consumption. If an outbreak occurred somewhere in the world and that same specific sequence type were identified as the causative agent, would the company be in any way liable? Could one even make an association between two isolates with the same sequence type isolated at great distances from one another?

Brown/Allard: The genetic evidence from WGS supports the hypothesis that the two isolates shared a recent common ancestor. If, for example, the isolate from the processing plant and the outbreak sample were genetically identical across the entire genome, the prediction is that the two samples are connected in some way that is not yet understood. Genetic matches guide the FDA and help point investigators toward possible connections. This might include additional inspection of the processing plant, as well as linking the match to the typical epidemiological exposure data. Sometimes, owing to the indirect routes by which pathogens circulate through the farm-to-fork continuum and the complexities of trade, no connection is made. More commonly, these investigative leads from genetic matches help the FDA establish direct links between the two bacterial isolates through a shared ingredient, processing step, distribution channel or packaging process. The genetic information and clustering help the FDA discover new ways that pathogens move from farm to fork. We are unaware of any example in which identical genomes arose independently and were unrelated; that would run counter to molecular evolutionary theory. Genetic identity equals genetic relatedness, and the closer two isolates are genetically, the more recently they shared a common ancestor. With regard to liability, this topic is beyond the scope of our group, but genomic data does not by itself prove a direct linkage, which is why additional investigations must follow any close matches.

Siragusa/Marshall: We know that SNPs (Single Nucleotide Polymorphisms or single base pair differences in the same location in a genome) are commonly used to distinguish clonality of bacteria with highly similar genomes. Are there criteria used by GenomeTrakr bioinformaticists that are set to help define what is similar, different or the same?

Brown/Allard: As the database grows with more examples of diverse serotypes and kinds of foodborne pathogens, the FDA WGS group is observing common patterns that can be used as guidance to define what is the same or different. For example, closely related Salmonella and E. coli isolates usually differ by five or fewer SNPs, and closely related Listeria isolates by 20 or fewer SNPs, using the current FDA-validated bioinformatics pipeline. These values are not set in stone; they should be considered guidance based on what FDA and GenomeTrakr have observed in earlier case studies. Often, a greater number of SNP differences (e.g., 21–50) has been observed between strains isolated in some outbreaks. Any close match might support or direct an outbreak investigation if there is evidence suggesting that a particular outbreak most closely resembles an earlier case from a specific geographic location. WGS data helps investigators focus their efforts toward an international versus domestic exposure or a possible country of origin. Even more divergent WGS linkages, where SNP differences exceed 50–100, often connect to different foods or different geographic locations and would lead investigators away from a candidate source, as the data provide exclusivity as well as inclusivity.

Conversely, if two strains that differ by more than 50–100 SNPs are treated as linked, different food or geographic sources can be incorrectly connected, and investigators may pursue an incorrect source.
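The SNP-distance guidance described above can be summarized as a small lookup sketch. The thresholds come from the interview (five or fewer SNPs for Salmonella and E. coli, 20 or fewer for Listeria, 21–50 observed within some outbreaks, beyond 50–100 divergent); the function, key names and wording are illustrative, not an FDA tool:

```python
# Sketch of the FDA-observed SNP-distance guidance quoted in the interview.
# These are observed patterns used as guidance, not fixed rules.
CLOSE_SNP_THRESHOLD = {"salmonella": 5, "e_coli": 5, "listeria": 20}

def interpret_snp_distance(organism: str, snps: int) -> str:
    """Map a pairwise SNP distance to a rough interpretive bucket."""
    close = CLOSE_SNP_THRESHOLD.get(organism.lower())
    if close is None:
        raise ValueError(f"no guidance threshold recorded for {organism!r}")
    if snps <= close:
        return "closely related; investigate a possible common source"
    if snps <= 50:
        return "possible link; distances in this range seen within some outbreaks"
    return "divergent; tends to exclude a direct link"

print(interpret_snp_distance("Salmonella", 3))
print(interpret_snp_distance("Listeria", 35))
```

Note that the buckets only prioritize investigative leads; as the interviewees stress, a genetic match by itself does not prove a direct linkage.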

Siragusa/Marshall: Can SNPs be identified from different agar-plate clones of the same strain (i.e., different colonies on the same plate)?

Brown/Allard: Since understanding the natural genetic variation present in foodborne pathogens is the basis for understanding relatedness, the FDA conducted validation experiments on growing and then sequencing colonies from the same plate and colonies from frozen inocula after thawing and plating, as well as running the same DNA preparations on different instruments and with different sequencing technicians. The FDA’s work with Salmonella enterica Montevideo sequencing, along with ongoing proficiency testing among laboratories, shows that the same isolate most often has no differences, although some samples show one or two SNP differences. Isolates collected by FDA inspectors in connection with a common outbreak generally show more genetic differences, in a range of 0–5 SNPs; this appears to depend on the nature of the facility, the length of time the pathogen has been resident there, and the selective pressure to which it was exposed.

Siragusa/Marshall: Regarding the use of WGS to track strains in a particular processing plant, is it possible that, within that closed microenvironment, strains will evolve sufficiently to become unique to that source?

Brown/Allard: Yes, we have discovered multiple examples of strains that have evolved in a unique way such that they appear to be specific to that source. Hospitals use the same practice to understand hospital-acquired infections and the routes of transmission within a hospital’s intensive care unit or surgical suite. Food industry laboratories, as well as FDA investigators, could use WGS data in a similar way to determine the root cause of contamination by combining WGS data with inspection and surveillance. The FDA Office of Compliance uses WGS as one piece of evidence to ask the question: Have we seen this pathogen before?

Siragusa/Marshall: The number of sequences in the GenomeTrakr database is approaching 120,000 (~4,000 per month are added). Are the sequences in the GenomeTrakr database all generated by GenomeTrakr Network labs?

Brown/Allard: The sequences labeled as GenomeTrakr isolates at the NCBI biosample and bioproject databases come from the WGS efforts supported by the U.S. FDA and USDA FSIS. GenomeTrakr is a label identifying the FDA’s, USDA FSIS’s and collaborating partners’ efforts to sequence food and environmental isolates. Additional laboratories, independent of formal membership in the GenomeTrakr network, upload WGS data to the NCBI Pathogen Detection website, of which GenomeTrakr is one part. CDC shares WGS data primarily on clinical PulseNet isolates, and USDA FSIS shares WGS data for foodborne pathogens from the foods that it regulates. Numerous international public health laboratories also upload WGS data to NCBI. The NCBI Pathogen Detection website includes all publicly released WGS data for the species it analyzes, which may include additional research or public health data. The point of contact for who submitted the data is listed in the biosample data sheet.

Siragusa/Marshall: Once sequences are deposited into the GenomeTrakr database, are they also part of GenBank?

Brown/Allard: The majority of the GenomeTrakr database is part of the NCBI SRA (sequence read archive) database, which is a less finished version of the data in GenBank. GenBank data is assembled and annotated, which takes more time and analysis to complete. Once automated software is optimized and validated, NCBI likely will place all of the GenomeTrakr data into GenBank. Currently, only the published WGS data from GenomeTrakr is available in GenBank. All of the GenomeTrakr data is available in SRA both at GenomeTrakr bioprojects and in the NCBI pathogen detection website.

Readers, look for Part II of this column, where we continue our exploration with Drs. Brown and Allard and ask some general questions about the logistics surrounding GenomeTrakr. As always, please contact either Greg Siragusa or Doug Marshall with comments, questions or ideas for future Food Genomics columns.

About the Interviewees

Marc W. Allard, Ph.D.

Marc Allard, Ph.D. is a senior biomedical research services officer specializing in both phylogenetic analysis and the biochemical laboratory methods that generate the genetic information in the GenomeTrakr database, which is part of the NCBI Pathogen Detection website. Allard joined the Division of Microbiology in FDA’s Office of Regulatory Science in 2008, where he uses whole genome sequencing of foodborne pathogens to identify and characterize outbreaks of bacterial strains, particularly Salmonella, E. coli and Listeria. He obtained a B.A. from the University of Vermont, an M.S. from Texas A&M University and his Ph.D. in biology from Harvard University. Allard was the Louis Weintraub Associate Professor of Biology at George Washington University for 14 years, from 1994 to 2008. He is a Fellow of the American Academy of Microbiology.

Eric W. Brown, Ph.D.

Eric W. Brown, Ph.D. currently serves as director of the Division of Microbiology in the Office of Regulatory Science. He oversees a group of 50 researchers and support scientists engaged in a multi-parameter research program to develop and apply microbiological and molecular genetic strategies for detecting, identifying and differentiating bacterial foodborne pathogens such as Salmonella and Shiga toxin-producing E. coli. Brown received his Ph.D. in microbial genetics from the Genetics Program in the Department of Biological Sciences at The George Washington University. He has conducted research in microbial evolution and microbial ecology as a research fellow at the National Cancer Institute and the U.S. Department of Agriculture, and as a tenure-track professor of microbiology at Loyola University Chicago. Brown came to the Food and Drug Administration in 1999 and has since carried out numerous experiments relating to the detection, identification and discrimination of foodborne pathogens.

Robin Stombler, Auburn Health Strategies
In the Food Lab

Five Questions Food Facilities Should Ask About Testing

By Robin Stombler

The FDA issued the first of several final regulations aimed at modernizing the food safety system through the use of hazard analysis and risk-based preventive controls. Inherent in this system are a number of requirements that eligible food facilities must follow, such as developing a written food safety plan, monitoring, corrective actions and verification. Laboratory testing is an essential component as well.

Robin Stombler presented “Laboratory Oversight and FSMA: Why and When” at the Food Labs Conference in Atlanta, GA, March 7–8, 2016.

So, what should food facilities know about laboratory testing within the context of the preventive controls for human food final rule? First and foremost, the final rule states, “facilities have a responsibility to choose testing laboratories that will produce reliable and accurate test results.” While a future regulation is expected to address the need for accredited laboratories and model laboratory standards, the preventive controls rule adopts other requirements pertaining to testing. Here are five questions that food facilities should ask about testing and the preventive controls rule.

1. What is the difference between pathogens and microorganisms?

The final rule defines “pathogen” to mean a microorganism that is of public health significance. A microorganism is defined as “yeasts, molds, bacteria, viruses, protozoa and microscopic parasites, and includes species that are pathogens.” Microorganisms that are of public health significance and subject food to decomposition or indicate that the food is adulterated or is contaminated with filth are considered “undesirable.”

2. How must food facilities account for pathogens?

Food facilities must prepare and implement a written food safety plan. One component of the food safety plan must include a written hazard analysis. This analysis must identify known or reasonably foreseeable hazards. These hazards may be biological, which includes parasites, environmental pathogens and other pathogens.

In another example, the food safety plan must include written verification procedures. This is to demonstrate that the facility is verifying that its preventive controls are implemented consistently and are significantly minimizing or preventing the hazards. These verification procedures are intended to be appropriate to the particular food facility, the food in question, and the nature of the preventive control and its role within the facility’s food safety system. With this in mind, facilities must conduct activities such as product testing for a pathogen or an appropriate indicator organism or other hazard, and environmental monitoring.

3. Are there written procedures specific to product testing?

Yes. Procedures for product testing must be scientifically valid and must identify the test microorganisms or other analytes. The procedures for identifying samples, including their relationship to specific lots of products, must be written and implemented. The procedures for sampling, including the number of samples and the sampling frequency, must be outlined. The facility must identify the laboratory conducting the testing, as well as describe the tests that are performed and the analytical methods used. Corrective action steps must also be included.

4. What are the procedures for environmental monitoring?

Similar to product testing, these procedures must be scientifically valid, identify the test microorganisms, and be put in writing. For routine environmental monitoring, the location from which the samples are collected and the number of sites that are tested must be stated. The final rule indicates that the “number and location of sampling sites must be adequate to determine whether preventive controls are effective.”  Written procedures must also identify the timing and frequency for collecting and testing samples. Again, similar to product testing, the laboratory conducting the testing and the tests and analytical methods used must be divulged. Corrective action procedures must also be included.

5. How does the supply-chain program incorporate testing?

A receiving facility is required to document a written supply chain program in its records. A component of that program includes documentation of sampling and testing performed as a supplier verification activity. The documentation must include identification of the raw material or other ingredient (including, if appropriate, lot number) and the number of samples tested. It also means that the tests conducted and the analytical methods used must be identified. The date the test is conducted as well as the date of the test report must be provided, and the identity of the laboratory performing the testing must be revealed. Any corrective actions that were taken in response to a hazard detection must also be reported.
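As a hypothetical sketch, the documentation items the rule names for supplier-verification testing could be captured in a record structure like the following; the class, field names and example values are all invented for illustration, not prescribed by the rule:

```python
# Hypothetical record sketch: fields mirror the documentation items listed
# above for supplier-verification testing. Names and values are illustrative.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SupplierTestRecord:
    raw_material: str                 # raw material or other ingredient
    lot_number: Optional[str]         # included "if appropriate"
    samples_tested: int               # number of samples tested
    tests_conducted: List[str]        # tests performed
    analytical_methods: List[str]     # analytical methods used
    test_date: str                    # date the test was conducted
    report_date: str                  # date of the test report
    laboratory: str                   # identity of the lab performing testing
    corrective_actions: List[str] = field(default_factory=list)

record = SupplierTestRecord(
    raw_material="nonfat dry milk",
    lot_number="LOT-0421",
    samples_tested=5,
    tests_conducted=["Salmonella detection"],
    analytical_methods=["validated culture method"],
    test_date="2016-03-01",
    report_date="2016-03-04",
    laboratory="Example Contract Lab",
)
print(record.raw_material, record.laboratory)
```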

This Q&A provides a glimpse into how the preventive controls final rule for human food incorporates laboratory testing. For more details, access the final rule.

Using ATP-based Methods for Cleaning and Sanitation Verification

By Camila Gadotti, M.S., Michael Hughes

There are several factors that must be considered when selecting a reliable and accurate system for detecting adenosine triphosphate.

A common way to assess the effectiveness of cleaning and sanitation programs in food manufacturing facilities is through the use of methods that detect adenosine triphosphate (ATP). Methods based on ATP detection are inexpensive and rapid, and can be performed onsite in real time. There are several manufacturers of ATP-based methods, but choosing the most reliable one can be a daunting task. This article discusses how these methods work and which factors should be considered to make an informed purchasing decision.

ATP is the universal energy currency of all living cells. It is present in all viable microorganisms (with the exception of viruses) and in foodstuffs. High amounts of ATP can be found in some fresh foods like vegetables, while other foods, especially highly processed foods such as fats, oils or sugar, contain very low amounts of this molecule. It is also important to know that ATP can persist in the environment in its free form for hours after a cell has died.1 An ATP bioluminescence assay operates on the principle that ATP in food residues and microorganisms, in the presence of a luciferin/luciferase complex, leads to light emission. This light can be measured quantitatively by a luminometer (a light-detecting instrument), with results available in 10–40 seconds. The amount of light emitted is proportional to the amount of ATP on a surface, and hence reflects its cleanliness. The light emitted is typically measured in relative light units (RLUs), calibrated for each make of instrument and set of reagents. Readings obtained when assessing the cleaning of a food manufacturing facility therefore need to be compared with baseline data representing acceptable clean values.
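Because RLU scales are calibrated per instrument and reagent set, a reading is interpreted against a facility’s own baseline clean values rather than on an absolute scale. A hedged sketch of such a comparison follows; the function name, threshold factors and disposition strings are made up, and real pass/caution/fail limits come from a baseline study on the specific system in use:

```python
# Illustrative only: judging a surface RLU reading against a clean baseline.
# The 2x/4x factors are invented for this sketch, not vendor or regulatory values.
def judge_surface(rlu: float, clean_baseline: float,
                  caution_factor: float = 2.0, fail_factor: float = 4.0) -> str:
    """Compare a reading to the baseline 'clean' value for the same system."""
    if rlu <= clean_baseline * caution_factor:
        return "pass"
    if rlu <= clean_baseline * fail_factor:
        return "caution: re-clean and retest"
    return "fail: re-clean, re-sanitize and retest"

print(judge_surface(150, 100))   # pass
print(judge_surface(250, 100))   # caution
print(judge_surface(900, 100))   # fail
```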

Varying Optical Components

Luminometers have evolved over the years from large, cumbersome instruments to small handheld models that can be used anywhere within a manufacturing facility. Although several components are housed inside these instruments, the optical component is the most important part of a luminometer. Used to detect light coming from the ATP/luciferin/luciferase reaction, it is the defining factor in a luminometer's reliability, sensitivity and repeatability. Good luminometers use a photomultiplier tube (PMT) in the light detection system; however, as part of the drive toward cheaper and smaller instruments, some manufacturers have replaced PMTs with less sensitive photodiode-based systems. When photodiodes are used, the swab chemistry must be adapted to produce more intense light. This results in a shorter burst of light, decreasing the time window in which the swab can be placed in the luminometer for an accurate read. A PMT, by contrast, multiplies the electrical current produced when light strikes it by millions of times, allowing the device to detect even a single photon, so the swab chemistry can emit light over a longer period of time. Although the weight of the system also depends on factors such as the battery, case and display screen, a luminometer built with a photodiode will generally weigh less than one built with a PMT, since the former is smaller than the latter.

Sensitivity Testing

When an ATP hygiene monitoring system has poor sensitivity or repeatability, there is substantial risk that the test result does not truly represent the hygienic status of the location tested. Therefore, it may provide false positives leading to unnecessary chemical and labor costs and production delays, or false negatives leading to the use of contaminated pieces of equipment. A system that is sensitive to low-level contamination of a surface by microorganisms and/or food residues allows sanitarians to more accurately understand the status of a test point. The ability of a system to repeat results gives one peace of mind that the result is reliable and the actions taken are appropriate. To test different ATP systems for sensitivity, one can run the following simple test using at least eight swabs per system:

•    Make at least four serial dilutions of a microbial culture and a food product in a sterile phosphate buffer solution.
•    Using an accurate pipette, dispense 20 μl of these dilutions carefully onto the tip of the swabs of each ATP system and read the swabs in the respective luminometer, following the manufacturer’s instructions.
•    Use caution when dispensing the inoculum onto the swab head to prevent any sample loss or spillage. In addition, it is very important that each swab be inoculated immediately prior to reading; inoculate the swabs one at a time and read each in its respective luminometer. Repeat this process for all the swabs.

To test different ATP systems for sensitivity, one can run a simple test using at least eight swabs per system. Photo courtesy of 3M

The most sensitive system will be the one that returns the most "fail" results (using each manufacturer's recommended pass/caution/fail limits).
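Ranking the systems then amounts to tallying "fail" verdicts across the same set of inoculated swabs. A minimal sketch, with made-up system names and verdicts (each verdict would come from that manufacturer's own pass/caution/fail limits):

```python
# Verdicts from eight identically inoculated swabs per system (illustrative).
readings = {
    "System A": ["pass", "fail", "fail", "caution", "fail", "fail", "pass", "fail"],
    "System B": ["pass", "pass", "caution", "pass", "fail", "pass", "pass", "caution"],
}

# Count how many swabs each system flagged as "fail".
fail_counts = {name: verdicts.count("fail") for name, verdicts in readings.items()}

# The most sensitive system flags the most fails on identical inocula.
most_sensitive = max(fail_counts, key=fail_counts.get)
print(fail_counts)                      # {'System A': 5, 'System B': 1}
print("Most sensitive:", most_sensitive)
```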

One can also test different ATP systems for repeatability by the following test:

•    Prepare a dilution of a standard ATP positive control or a food product such as fluid milk in a sterile phosphate buffer. If using a standard ATP positive control, follow the manufacturer's directions to prepare the dilution. If using fluid milk, add 1 ml of milk into 99 ml of phosphate buffer.
•    Using an accurate pipette, dispense 20 μl of this standard onto the tip of the swabs of each ATP system and read these swabs in their respective luminometer, following the manufacturer’s instructions.
•    Prepare and read at least 10 swabs for each system you are evaluating, and capture the results on a digital spreadsheet.
•    Once all 10 swab results (for each system) are in the spreadsheet, calculate the mean (average) and standard deviation for each system's data set. Divide the standard deviation by the mean and convert the result to a percentage; this value is called the coefficient of variation percentage (CV%).

The system with the lowest CV% is the most repeatable and will provide the most reliable information to help make the correct decisions for a food manufacturing facility.
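The CV% arithmetic described above is straightforward to reproduce in a few lines of Python instead of a spreadsheet. The replicate RLU readings below are illustrative numbers, not real instrument data:

```python
import statistics

# Ten replicate RLU readings per system from identically prepared swabs.
replicates = {
    "System A": [520, 540, 515, 530, 525, 535, 545, 510, 528, 532],
    "System B": [480, 610, 390, 550, 470, 700, 420, 580, 505, 455],
}

cv_pct = {}
for name, rlus in replicates.items():
    mean = statistics.mean(rlus)
    stdev = statistics.stdev(rlus)          # sample standard deviation
    cv_pct[name] = stdev / mean * 100       # coefficient of variation, %
    print(f"{name}: mean={mean:.1f} RLU, CV%={cv_pct[name]:.1f}")

# The system with the lowest CV% is the most repeatable.
most_repeatable = min(cv_pct, key=cv_pct.get)
print("Most repeatable:", most_repeatable)
```

Note that `statistics.stdev` computes the sample standard deviation (n − 1 denominator), matching the STDEV function in common spreadsheet software.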

Choosing the Right ATP System

There are many ATP systems available on the market to support cleaning and sanitation verification in manufacturing plants. Some systems are more reliable than others and will provide results that are meaningful, accurate and repeatable. Be sure, therefore, not to choose a system solely based on its price. Check for the quality of the instrument, ask the sales representative what kind of optical device is used in the construction of the instrument and, moreover, perform an evaluation running tests for both sensitivity and repeatability. It is also important to consider the functionality and usability of the software provided with the system to ensure that the software can be used to customize sample plans, store test results and produce reports and graphs.

Reference

  1. Jay, J. M., Loessner, M. J., & Golden, D. A. (2008). Modern Food Microbiology.


About the Author:

Camila Gadotti, M.S., is a field technical service professional and Michael Hughes is a technical service professional with 3M Food Safety.