Tag Archives: PCR

Food Fraud Quick Bites

The Straw that Broke the Camel’s Back

By Susanne Kuehne
Find records of fraud such as those discussed in this column and more in the Food Fraud Database. Image credit: Susanne Kuehne

Due to its health benefits, camel meat is gaining popularity among consumers, and unfortunately also among fraudsters seeking economic gain. Polymerase chain reaction (PCR) technologies allow quick and accurate detection of specific meat types, including in processed and cooked products. A newly developed PCR-lateral flow immunoassay found adulteration of camel meat with beef in 10% of the 20 samples investigated in this Chinese study.

Resource

  1. Zhao, L., et al. (July 30, 2020). “Identification of camel species in food products by a polymerase chain reaction-lateral flow immunoassay.” Food Chemistry, Vol. 319.
Food Fraud Quick Bites

A New Way to Spot a Fake

By Susanne Kuehne
Find records of fraud such as those discussed in this column and more in the Food Fraud Database. Image credit: Susanne Kuehne

The common cuttlefish (Sepia officinalis) is a popular food source, and it is often adulterated with other cephalopod and Sepia species. A new, low-cost, real-time polymerase chain reaction (PCR) method can be used on fresh, cooked, grilled, frozen and canned preparations of Sepia officinalis, producing quick and highly reliable results. In this study, 25% of the samples turned out to be cephalopod species other than Sepia officinalis.

Resource

  1. Velasco, A., Ramilo-Fernandez, G. and Sotelo, C. G. (March 4, 2020). “A Real-Time PCR Method for the Authentication of Common Cuttlefish (Sepia officinalis) in Food Products.” Instituto de Investigaciones Marinas (IIM-CSIC), Eduardo Cabello 6, 36208 Vigo (Pontevedra), Spain. This study is part of the SEATRACES project (www.seatraces.eu).
In the Food Lab

Pathogen Detection Guidance in 2020

By Raj Rajagopal

Food production managers have a critical role in ensuring that the products they make are safe and uncontaminated with dangerous pathogens. Health and wellness are in sharp focus for consumers in every aspect of their lives right now, and food safety is no exception. As food safety becomes a continually greater focus for consumers and regulators, the technologies used to monitor for and detect pathogens in a production plant have become more advanced.

It’s no secret that pathogen testing is performed for numerous reasons: To confirm the adequacy of process controls and to ensure foods and beverages have been properly stored or cooked, to name a few. Accomplishing these objectives can require very different approaches, and depending on their situations, processors rely on different tools offering varying degrees of testing simplicity, speed, cost, efficiency and accuracy. It’s common today to leverage multiple pathogen diagnostics, ranging from traditional culture-based methods to molecular technologies.

Unfortunately, pathogen detection involves more than just subjecting finished products to examination. It has become increasingly clear to the industry that the environment in which food is processed can cross-contaminate products, requiring food manufacturers to be ever-vigilant in cleaning, sanitizing, sampling and testing their sites.

For these reasons and others, it’s important to have an understanding and appreciation for the newer tests and techniques used in the fight against deadly pathogens, and where and how they might be fit for purpose throughout the operation. This article sheds light on the key features of one fast-growing DNA-based technology that detects pathogens and explains how culture methods for index and indicator organisms continue to play crucial roles in executing broad-based pathogen management programs.

LAMP’s Emergence in Molecular Pathogen Detection

Molecular pathogen detection has been a staple technology for food producers since the adoption of polymerase chain reaction (PCR) tests decades ago. However, the USDA FSIS revised its Microbiology Laboratory Guidebook, the official guide to the preferred methods the agency uses when testing samples collected from audits and inspections, last year to include new technologies that utilize loop-mediated isothermal amplification (LAMP) methods for Salmonella and Listeria detection.

LAMP methods differ from traditional PCR-based testing methods in four noteworthy ways.

First, LAMP eliminates the need for thermal cycling. Fundamentally, PCR tests require thermocyclers that alter the temperature of a sample to drive the reaction. The thermocyclers used for real-time PCR tests, which allow detection in closed tubes, can be expensive and include multiple moving parts that require regular maintenance and calibration. For every food, beverage or environmental surface sample tested, a PCR system runs through repeated cycles of heating to 95 °C to separate the DNA strands and cooling to 60 °C to extend the new DNA chain. All of this temperature variation generally requires more run time, and the enzyme used in PCR, Taq polymerase, is susceptible to interference from inhibiting substances that are native to a sample and co-extracted with the DNA.
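To make the amplification arithmetic concrete, here is a minimal Python sketch (the starting copy number and per-cycle efficiency are hypothetical, not from any assay): target copies grow roughly as N = N0 x (1 + E)^n over n thermocycles.

    # Toy model of PCR amplification (illustrative only).
    # N_n = N_0 * (1 + E)^n, where E is the per-cycle efficiency
    # (E = 1.0 would be perfect doubling each cycle).

    def pcr_copies(initial_copies: float, cycles: int, efficiency: float = 0.95) -> float:
        """Expected number of target copies after a given number of thermocycles."""
        return initial_copies * (1.0 + efficiency) ** cycles

    # Starting from a single copy, across a typical 40-cycle run:
    for n in (10, 20, 30, 40):
        print(f"cycle {n:2d}: ~{pcr_copies(1, n):.2e} copies")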

LAMP amplifies DNA isothermally, at a steady temperature of around 60 °C. Its Bst polymerase allows continuous amplification and better tolerates the sample matrix inhibitors known to trip up PCR. The detection schemes used for LAMP also free its instrumentation from the constraints of numerous moving parts.

Second, it at least doubles the number of DNA primers. Traditional PCR tests recognize two separate regions of the target genetic material, relying on two primers to anneal to the subject’s separated DNA strands and copy and amplify that target DNA.

By contrast, LAMP technology uses four to six primers, which can recognize six to eight distinct regions of the sample’s DNA. The primers and polymerase not only displace the DNA strands, they loop the ends of the strands together before amplification cycling begins. This unique looped structure both accelerates the reaction and increases test sensitivity by allowing an exponential accumulation of target DNA.

Third, it removes steps from the workflow. Before any genetic amplification can happen, technicians must enrich their samples to deliberately grow microorganisms to detectable levels. Technicians using PCR tests then have to pre-dispense lysis buffers or reagent mixes and take other careful actions to extract and purify their DNA samples.

Commercialized LAMP assay kits, on the other hand, take more of a ready-to-use approach, offering ready-to-use lysis buffers and a simplified workflow for preparing DNA samples. By requiring only two transfer steps, they significantly reduce the risk of false negatives caused by erroneous laboratory preparation.

Finally, it simplifies multiple test protocols into one. Food safety lab professionals using PCR technology have historically had to perform a different test protocol for each individual pathogen, whether Salmonella, Listeria, E. coli O157:H7 or others. Not surprisingly, this can increase the chances of error. Labs are often resource-challenged, high-pressure environments, and having to keep multiple testing protocols straight at all times has proven to be a recipe for trouble.

LAMP brings the benefit of a single assay protocol for all pathogen tests. This streamlined workflow, involving minimal steps, simplifies the process and reduces the risk of human error.

Index and Indicator Testing

LAMP technology has streamlined and advanced pathogen detection, but it is impractical for producers to molecularly test every single product they make and every nook and cranny of their production environments. This is where an increasing number of companies are utilizing index and indicator tests as part of more comprehensive pathogen environmental monitoring programs. Rather than testing for specific pathogenic organisms, these tools give a microbiological warning sign that conditions may be leading to undesirable food safety or quality outcomes.

Index tests are culture-based tests that detect microorganisms whose presence (or detection above a threshold) suggests an increased risk for the presence of an ecologically similar pathogen. Listeria spp. is the best-known index organism, as its presence can also mark the presence of the deadly pathogen Listeria monocytogenes. However, there is considerable skepticism in the research community as to whether any organisms outside of Listeria spp. can be given this classification.

Indicator tests, on the other hand, detect the presence of organisms reflecting the general microbiological condition of a food or the environment. The presence of indicator organisms cannot provide information on the presence or absence of a specific pathogen, nor an assessment of potential public health risk, but levels above acceptable limits can indicate insufficient cleaning and sanitation or poor operating conditions.

Should indicator test results exceed the established control limits, facilities are expected to take appropriate corrective action and to document the actions taken and the results obtained. Fast, cost-effective indicator tests used as a benchmark to catch and identify problem areas can signal when more precise molecular methods are needed to verify that products are uncontaminated.
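As a simple illustration of that corrective-action logic (the limits, site names and counts below are hypothetical, not from any regulation or standard), a routine indicator result might be screened like this:

    # Hypothetical screening of indicator-organism counts against control limits.
    # Counts are illustrative CFU values per sampled site; both limits are invented.

    ALERT_LIMIT = 10.0    # above this: re-clean the site and re-test
    ACTION_LIMIT = 100.0  # above this: corrective action plus molecular follow-up

    def assess_indicator(cfu: float) -> str:
        if cfu > ACTION_LIMIT:
            return "corrective action: re-sanitize, document, verify with a molecular test"
        if cfu > ALERT_LIMIT:
            return "alert: re-clean and re-test"
        return "within control limits"

    for site, cfu in {"drain 3": 250.0, "belt 1": 12.0, "table 2": 0.0}.items():
        print(f"{site}: {assess_indicator(cfu)}")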

Process Matters

As discussed, technology plays a large role in pathogen detection, and advances like LAMP molecular detection methods combined with strategic use of index and indicator tests can provide food producers with powerful tools to safeguard their consumers from foodborne illnesses. However, whether a producer is testing environmental samples, ingredients or finished product, a test is only as useful as the comprehensive pathogen management plan around it.

The entire food industry is striving to meet the highest safety standards, and the best course of action is to adopt a solution that combines the best available technologies with best practices in process as well, from sample collection and preparation to monitoring and detection.

Food Fraud Quick Bites

Finding the Root Cause for Starch Fraud

By Susanne Kuehne
Find records of fraud such as those discussed in this column and more in the Food Fraud Database. Image credit: Susanne Kuehne

Due to its lower cost, cassava starch is a common adulterant in higher-priced starches, such as potato and wheat starch. Tests with droplet digital polymerase chain reaction (ddPCR) in China uncovered that over 30% of sweet potato starch samples, 25% of cornstarch samples and 40% of potato starch samples were adulterated with cassava starch. Besides the economic impact, this kind of fraud also poses a risk to consumers allergic to cassava.

Resource

  1. Chen, J., et al. (February 26, 2020). “Identification and quantification of cassava starch adulteration in different food starches by droplet digital PCR.” PLOS ONE.
In the Food Lab

Intelligent Imaging and the Future of Food Safety

By Michael Bartholomeusz, Ph.D.

Traditional approaches to food safety no longer make the grade. Stories of contaminated produce and foodborne illness dominate the headlines increasingly often. Some of the current safeguards put in place to protect consumers and ensure that companies provide the freshest, safest food possible continue to fail across the world. Poorly regulated supply chains and food quality assurance breakdowns often sicken customers and result in recalls or lawsuits that cost money and damage reputations. The question is: What can be done to prevent these problems from occurring?

While outdated machinery and human vigilance continue to be the go-to solutions for these problems, cutting-edge intelligent imaging technology promises to eliminate the issues caused by old-fashioned processes that jeopardize consumer safety. This next generation of imaging will increase safety and quality by quickly and accurately detecting problems with food throughout the supply chain.

How Intelligent Imaging Works

In broad terms, intelligent imaging is hyperspectral imaging that uses cutting-edge hardware and software to help users establish better quality assurance markers. The hardware captures the image, and the software processes it to provide actionable data for users by combining the power of conventional spectroscopy with digital imaging.

Conventional machine vision systems generally lack the ability to capture and relay fine details and nuances to users. In contrast, intelligent imaging technology offers superior capabilities in two major areas: Spectral and spatial resolution. Essentially, intelligent imaging systems employ a level of detail far beyond current industry-standard machinery. For example, an RGB camera can see only three colors: Red, green and blue. Hyperspectral imaging can detect between 300 and 600 real colors—that’s 100–200 times more colors than a standard RGB camera detects.

Intelligent imaging can also be extended into the ultraviolet or infrared spectrum, providing additional details of the chemical and structural composition of food not observable in the visible spectrum. Hyperspectral imaging cameras do this by generating “data cubes.” These are pixels collected within an image that show subtle reflected color differences not observable by humans or conventional cameras. Once generated, these data cubes are classified, labeled and optimized using machine learning to better process information in the future.
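As a rough sketch of what such a data cube looks like computationally (the dimensions, random spectra and class labels below are invented for illustration, not drawn from any real camera), a hyperspectral frame can be treated as a 3-D array and each pixel classified against reference spectra:

    import numpy as np

    # Hypothetical data cube: a 4 x 4 pixel image with 300 spectral bands per pixel.
    rng = np.random.default_rng(0)
    cube = rng.random((4, 4, 300))

    # Invented reference spectra, e.g. "sound" vs. "bruised" tissue.
    references = {"sound": rng.random(300), "bruised": rng.random(300)}

    def classify_pixel(spectrum):
        """Label a pixel by its nearest reference spectrum (Euclidean distance)."""
        return min(references, key=lambda name: np.linalg.norm(spectrum - references[name]))

    labels = [[classify_pixel(cube[i, j]) for j in range(4)] for i in range(4)]
    for row in labels:
        print(row)

In practice the classifier would be a trained machine learning model rather than a nearest-spectrum lookup, but the data structure is the same.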

Beyond spectral and spatial data, other rudimentary quality assurance systems pose their own distinct limitations. X-rays can be prohibitively expensive and are only focused on catching foreign objects. They are also difficult to calibrate and maintain. Metal detectors are more affordable, but generally only catch metals with strong magnetic fields like iron. Metals including copper and aluminum can slip through, as well as non-metal objects like plastics, wood and feces.

Finally, current quality assurance systems have a weakness that can change day-to-day: Human subjectivity. The people put in charge of monitoring in-line quality and food safety are indeed doing their best. However, the naked eye and human brain can be notoriously inconsistent. Perhaps a tired person at the end of a long shift misses a contaminant, or those working two separate shifts judge quality in slightly different ways, leading to divergent standards unbeknownst to both the food processor and the public.

Hyperspectral imaging can immediately provide tangible benefits for users, especially within the following quality assurance categories in the food supply chain:

Pathogen Detection

Pathogen detection is perhaps the biggest concern for both consumers and the food industry overall. Identifying and eliminating Salmonella, Listeria and E. coli throughout the supply chain is a necessity. Obviously, failure to detect pathogens seriously compromises consumer safety. It also gravely damages the reputations of food brands while leading to recalls and lawsuits.

Current pathogen detection processes, including polymerase chain reaction (PCR), immunoassays and plating, involve complicated and costly sample preparation techniques that can take days to complete and create bottlenecks in the supply chain. These delays adversely impact operating cycles and increase inventory management costs. This is particularly significant for products with a short shelf life. Intelligent imaging technology provides a quick and accurate alternative, saving time and money while keeping customers healthy.

Characterizing Food Freshness

Consumers expect freshness, quality and consistency in their foods. As supply chains lengthen and become more complicated around the world, food spoilage has more opportunity to occur at any point throughout the production process, manifesting in reduced nutrient content and an overall loss of food freshness. Tainted meat products may also sicken consumers. All of these factors significantly affect market prices.

Sensory evaluation, chromatography and spectroscopy have all been used to assess food freshness. However, many spatial and spectral anomalies are missed by conventional tristimulus filter-based systems, and each of these approaches has severe limitations in reliability, cost or speed. Additionally, none is capable of providing an economical inline measurement of freshness, and financial pressure to reduce costs can result in cut corners where these systems are in place. By harnessing detailed data and providing real-time analysis, hyperspectral imaging mitigates or erases these limitations, simultaneously evaluating color, moisture (dehydration) levels, fat content and protein levels, and providing reliable standardization of these measures.

Foreign Object Detection

The presence of plastics, metals, stones, allergens, glass, rubber, fecal matter, rodents, insect infestation and other foreign objects is a big quality assurance challenge for food processors. Failure to identify foreign objects can lead to major added costs including recalls, litigation and brand damage. As detailed above, automated options like X-rays and metal detectors can only identify certain foreign objects, leaving the rest to pass through untouched. Using superior spectral and spatial recognition capabilities, intelligent imaging technology can catch these objects and alert the appropriate employees or kickstart automated processes to fix the issue.

Mechanical Damage

Though it may not be put on the same level as pathogen detection, food freshness and foreign object detection, consumers put a premium on food uniformity, demanding high levels of consistency in everything from their apples to their zucchini. This can be especially difficult to ensure with agricultural products, where 10–40% of produce undergoes mechanical damage during processing. Increasingly complicated supply chains and progressively more automated production environments make delivering consistent quality more complicated than ever before.

Historically, machine vision systems and spectroscopy have been implemented to assist with damage detection, including bruising and cuts, in sorting facilities. However, these systems lack the spectral differentiation to effectively evaluate food and agricultural products in the stringent manner customers expect. Methods like spot spectroscopy require over-sampling to ensure that any detected aberrations are representative of the whole item. It’s a time-consuming process.

Intelligent imaging uses superior technology and machine learning to identify mechanical damage that’s not visible to humans or conventional machinery. For example, a potato may appear fine on the outside, but have extensive bruising beneath its skin. Hyperspectral imaging can find this bruising and decide whether the potato is too compromised to sell or within the parameters of acceptability.

Intelligent imaging can “see” what humans and older technology simply cannot. With the ability to be deployed at a number of locations within the food supply chain, it’s an adaptable technology with far-reaching applications. From drones measuring crop health in the field to inline or end-of-line positioning in processing facilities, there is the potential to take this beyond factory floors.

In the world of quality assurance, where a misdiagnosis can literally result in death, the additional spectral and spatial information provided by hyperspectral imaging can be utilized by food processors to provide important details regarding chemical and structural composition previously not discernible with rudimentary systems. When companies begin using intelligent imaging, it will yield important insights and add value as the food industry searches for reliable solutions to its most serious challenges. Intelligent imaging removes the subjectivity from food quality assurance, turning it into an objective endeavor.

In the Food Lab

Revolutionary Rapid Testing for Listeria monocytogenes and Salmonella

By Benjamin A. Katchman, Ph.D., Michael E. Hogan, Ph.D., Nathan Libbey, Patrick M. Bird

The Golden Age of Bacteriology: Discovering the Unknown in a Farm-to-Market Food Supply.

The last quarter of the 19th Century was both horrific and exciting. The world had just emerged from four decades of epidemics of cholera, typhoid fever and other enteric diseases for which no cause was known, and the great scientific minds of Europe sought understanding. Robert Koch integrated Pasteur’s germ theory of 1861 with the high technology of the day: Mathematical optics and the first industrialized compound microscopes (Siebert, Leiss, 1877), heterocycle chemistry and high-purity solvents (i.e., formaldehyde), engineered glass suitable for microscope slides and precision-molded parts such as tubes and plates in 1877, and industrialized agar production from seaweed in Japan in 1860. The enduring fruit of Koch’s technology integration tour de force is well known: Dye staining of bacteria for sub-micron microscopy, the invention of the 13 cm x 1 cm culture tube and the invention of the “Petri” dish coupled to agar-enriched culture media. Those technologies not only launched “The Golden Age of Bacteriology” but also guided the entire field of analytical microbiology for two lifetimes, becoming the bedrock of 20th Century food safety regulation (the Federal Food, Drug and Cosmetic Act of 1938) and remaining so well into the 21st Century with FSMA.

Blockchain Microbiology: Managing the Known in an International Food Supply Chain.

If Koch were to reappear in 2020 and be presented with a manual of technical microbiology, he would have little difficulty recognizing the current practice of cell fixation, staining and microscopy, or the SOPs associated with fluid-phase enrichment culture and agar plate culture on glass dishes (still named after his lab assistant). The point is that the analytical plate culture technology developed by Koch was game-changing then, in the “farm-to-market” supply chain of Koch’s hometown of Berlin. But today, plate culture still takes about 24 to 72 hours for broad-class indicator identification and 48 to 96 hours for limited species-level identification of common pathogens. In 1880, life was slow, and that much time was needed to travel by train from Paris to Berlin. In 2020, that is the time needed to ship food to Berlin from any place on earth. While more rapid tests such as the ATP assay have been developed, they lack the speciation and analytical confidence necessary to provide actionable information to food safety professionals.

It can be argued that, leading up to 2020, there has been a significant paradigm shift in the understanding of microbiology (genetics and a systems-based understanding of microbial function), which can now be coupled to new Third Industrial Age technologies to make the 2020 international food supply chain safer.

We Are Not in 1880 Anymore: The Time has Come to Move Food Safety Testing into the 21st Century.

Each year, there are more than 48 million illnesses in the United States due to contaminated food.1 These illnesses place a heavy burden on consumers, food manufacturers, healthcare and other ancillary parties, resulting in more than $75 billion in costs for the United States alone.2 This figure, while seemingly staggering, may increase in future years as reporting continues to improve. For Salmonella-related illnesses alone, an estimated 97% of cases go unreported, and Listeria monocytogenes is estimated to cause about 1,600 illnesses each year in the United States, with more than 1,500 related hospitalizations and 260 related deaths.1,3 As reporting increases, food producers and regulatory bodies will feel an increased need to surveil all aspects of food production, from soil and air to final product and packaging. The current standards for agricultural and environmental pathogen testing (culture-based methods, qPCR and ATP assays) cannot deliver the speed, multiplexing and specificity required to meet the current and future demands of the industry.

At the DNA level, PCR, high-throughput sequencing and microarrays provide the ability to identify multiple microbes, down to the single-cell level, in less than 24 hours with high sensitivity and specificity (see Figure 1). With unique sample prep methods that obviate enrichment, DNA extraction and purification, these technologies will continue to reduce total test turnaround times toward single-digit hours while simultaneously bringing the cost per test within the economic window of the food safety testing world. There are still growing pains as the industry begins to accept these new molecular approaches to microbiology, such as the need for advanced training, novel technology and integrated software analysis.

It is easy to envision that the digital data obtained from DNA-based microbial testing could become the next-generation gold standard as a “system parameter” of the food supply chain. Imagine, for instance, that at the time a container ships, a data vector is produced (i.e., time stamp out, location out, invoice, Listeria speciation and/or serovar discrimination, Salmonella speciation and/or serovar discrimination; refer to Figure 1), where the added microbial data is treated as another important digital attribute of the load. Though it may seem far-fetched, early prototyping through the CDC and USDA has already begun at sites in the U.S. trucking industry, based on DNA microarray and sequencing-based microbial testing.

Given that “Third Industrial Revolution” technology can now be used to make microbial detection fast, digital, internet-enabled and culture-free, we argue here that molecular testing of the food chain (DNA- or protein-based) should be developed and validated to replace culture-based analysis as soon as possible.

Figure 1. Broad microbial detection. Current microbiological diagnostic technology is only able to provide broad species- or family-level identification of different pathogens. New and emerging molecular diagnostic technology offers highly multiplexed, rapid, sensitive and specific platforms at increasingly affordable prices. Graphic courtesy of PathogenDx.

References.

  1. Scallan, E., Hoekstra, R. M., Angulo, F. J., Tauxe, R. V., Widdowson, M. A., Roy, S. L., … Griffin, P. M. (2011). “Foodborne illness acquired in the United States–major pathogens.” Emerging Infectious Diseases, 17(1), 7–15. doi:10.3201/eid1701.p11101
  2. Scharff, R. L. (2012). “Economic burden from health losses due to foodborne illness in the United States.” Journal of Food Protection, 75(1), 123–131. doi:10.4315/0362-028X.JFP-11-058
  3. Mead, P. S., Slutsker, L., Dietz, V., McCaig, L. F., Bresee, J. S., Shapiro, C., … Tauxe, R. V. (1999). “Food-related illness and death in the United States.” Emerging Infectious Diseases, 5(5), 607–625. doi:10.3201/eid0505.990502
FST Soapbox

Beyond the Results: What Can Testing Teach Us?

By Sasan Amini

The microbiology lab will increasingly be understood as the gravitational center of big data in the food industry. Brands that understand how to leverage the data microbiology labs are producing in ever larger quantities will be in the best position to positively impact their bottom line—and even transform the lab from a cost center to a margin contributor.

The global rapid microbiology testing market continues to grow at a steady pace. The market is projected to reach $5.09 billion by 2023, up from $3.45 billion in 2018. Increased demand for food microbiology testing—and pathogen detection in particular—continues to drive the overall growth of this sector. The volume of food microbiology tests totaled 1.14 billion tests in 2016—up 15% from 2013. In 2018 that number is estimated to have risen to 1.3 billion tests, accounting for nearly half the overall volume of industrial microbiology tests performed worldwide.

The food industry is well aware that food safety testing programs are a necessary and worthwhile investment. Given the enormous human and financial costs of food recalls, a robust food safety testing system is the best insurance policy any food brand can buy.

We are going through a unique transition in which food safety tests are evolving from binary tests into data engines capable of generating orders of magnitude more information. This creates a unique opportunity: Big data collected from routine pathogen testing can do more than stop an outbreak. Paired with machine learning and other data platforms, these data can become valuable, actionable insights for the industry.

While some of these applications will have an impact on fundamental research, I expect that big data analytics and bioinformatics will have significant opportunity to push these tests from being merely diagnostic to being vehicles for driving actions and offering recommendations. Two examples of such transformations are product development and environmental testing.

Food-Safety Testing Data and Product Development

Next-generation sequencing (NGS) technologies demonstrate a great deal of potential for product development, particularly when it comes to better understanding shelf life and generating more accurate shelf-life estimates.

Storage conditions, packaging, pH, temperature and water activity, among other factors, influence food quality and shelf life. Shelf-life estimates, however, have traditionally been based on rudimentary statistical models incapable of accounting for the complexity of factors that impact food freshness; in particular, they cannot take into consideration the composition and quantity of all the microbial communities present on a food sample. These limitations have long been recognized by food scientists and have led them to look for cost-effective alternatives.

By using NGS technologies, scientists can gain a more complete picture of the microbial composition of foods and how those microbial communities are influenced by intrinsic and extrinsic factors.

It’s unlikely that analyzing the microbiome of every food product or unit of product will ever be a cost-effective strategy. However, over time, as individual manufacturers and the industry as a whole analyze more samples and generate more data, we should be able to develop increasingly accurate predictive models. The cost and logistics of data generation could be significantly streamlined if existing food safety tests evolve into broader vehicles that create insights on both the safety and the quality of a food product simultaneously. By comparing the observed (or expected) microbiome profile of a fresh product with the models we develop, we could greatly improve our estimates of a given product’s remaining shelf life.
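A minimal sketch of the kind of predictive model described here (the features, training values and linear model are all hypothetical; production models would be trained on large microbiome datasets with more sophisticated learners):

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical training data: each row describes a product sample by simple
    # microbiome-derived features (relative abundance of two spoilage-associated
    # taxa and total microbial load in log10 CFU/g). Values are invented.
    X = np.array([
        [0.10, 0.05, 3.0],
        [0.30, 0.10, 4.5],
        [0.50, 0.20, 5.0],
        [0.70, 0.40, 6.5],
    ])
    y = np.array([12.0, 9.0, 6.0, 2.0])  # observed remaining shelf life, in days

    model = LinearRegression().fit(X, y)

    # Predict remaining shelf life from a fresh sample's microbiome profile.
    fresh = np.array([[0.20, 0.08, 4.0]])
    print(f"estimated remaining shelf life: {model.predict(fresh)[0]:.1f} days")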

This will open a number of new opportunities for food producers and consumers. Better shelf-life estimates will create efficiencies up and down the food supply chain. The impact on product development can hardly be overestimated. As we better understand the precise variables that impact food freshness for particular products, we can devise food production and packaging technologies that enhance food safety and food quality.

As our predictive models improve, an entire market for these models will emerge, much as it has in other industries that rely on machine learning models to draw predictive insights from big data.

Data Visualization for Environmental Monitoring

In the past one to two years, NGS technologies have matured to the point that they can now be leveraged for high-volume pathogen and environmental testing.

Just as it has in other industries, big data coupled with data visualization approaches can play a mainstream role in food safety and quality applications.

Data visualization techniques are not new to food safety programs and have proven particularly useful when analyzing the results of environmental testing. The full potential of data visualization has yet to be realized, however. Visualizations can be used to better understand harborage sites, identify patterns that need attention, and show how specific strains of a pathogen are migrating through a facility.

Some of this is happening in food production facilities already, but it’s important to note that visualizations are only as useful as the underlying data is accurate. That’s where technologies like NGS come in. NGS provides the option for deeper characterization of pathogenic microorganisms when needed (down to the strain). The depth of information from NGS platforms enables more reliable and detailed characterization of pathogenic strains compared to existing methods.

Beyond basic identification, there are other potential use cases for environmental mapping, including tracking pathogens as they move through the supply chain. It’s my prediction that as the food industry more broadly adopts NGS technologies that unify testing and bioinformatics in a single platform, data visualization techniques will rapidly advance, so long as we keep asking ourselves: What can the data teach us?

The Food Data Revolution and Market Consolidation

Unlike most PCR- and immunoassay-based testing techniques, which in most cases can only generate binary answers, NGS platforms generate millions of data points per sample, for tens to hundreds of samples at a time. As NGS technologies are adopted and the data we collect increases exponentially, the food safety system will become the data engine upon which new products and technologies are built.

Just as we have seen in any number of industries, companies with access to data and the means to make sense of it will be in the best position to capitalize on new revenue opportunities and economies of scale.

Companies that have adopted NGS technologies for food safety testing will have an obvious advantage in this emerging market. And they won’t have had to radically alter their business model to get there. They’ll be running the same robust programs they have long had in place, but collecting a much larger volume of data in doing so. Companies with a vision of how to best leverage this data will have the greatest edge.

FST Soapbox

Technology Tools Improving Food Safety

By Megan Ray Nichols

To cap off a tumultuous year for foodborne illnesses, the end of 2018 saw a rather large E. coli outbreak that affected several different types of lettuce. In all, 62 people got sick in the United States, with another 29 affected in Canada. The outbreak was traced back to a farm in California thanks to a specific DNA fingerprint in the E. coli; it started in a water reservoir and spread to the nearby crops.

Unfortunately, the event was only one of two separate incidents involving romaine lettuce last year; another E. coli outbreak was traced back to a source in Arizona. Are these outbreaks more common than we realize? The CDC estimates that 48 million Americans fall ill each year from foodborne pathogens. Of those who get sick, 128,000 are hospitalized and about 3,000 die.

It’s clear that the industry as a whole needs to buckle down and find more effective solutions, not just for preventing outbreaks but also for mitigating damage when they happen. A new level of safety and management can be achieved with the help of many new, innovative technologies.

The following are some of the technology tools shaping the future of food safety and quality management fields.

Blockchain

As a result of the E. coli outbreak, Walmart implemented blockchain technology to track leafy greens and boost supply chain transparency. The systems and infrastructure are anticipated to be in place by the end of 2019.

Blockchain is a secure, digital ledger. It holds information about various transactions and data, all of which are carried out on the network. It’s called a blockchain because each data set within the network is a chunk or “block,” and they’re all linked to one another—hence the chain portion of the name. What this allows for is complete transparency throughout the supply chain, because you can track goods from their origin all the way to distribution and sale.

Each block is essentially a chunk of information, and when it’s entered into the chain, it cannot be altered, modified or manipulated. It’s simply there for public viewing. You cannot alter information contained within a single block without modifying the entire chain—which operates much like a peer-to-peer network and is split across many devices and servers.

This unique form of security establishes trust, accuracy and a clear representation of what’s happening. It allows a company to track contaminated foods along their journey, stopping them before they contaminate other goods or reach customers.
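A toy hash chain in Python illustrates the linking idea (a sketch of the concept, not any production blockchain): each block stores the previous block’s hash, so altering one record breaks every later link.

    import hashlib
    import json

    def block_hash(block: dict) -> str:
        """Hash a block's contents, including the previous block's hash."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    # Toy supply-chain ledger: origin -> distribution center -> retailer.
    chain = []
    prev = "0" * 64  # genesis marker
    for record in ("lettuce harvested, farm A", "shipped to DC 7", "received, store 12"):
        block = {"data": record, "prev_hash": prev}
        prev = block_hash(block)
        chain.append(block)

    # Tampering with an early block invalidates every later link.
    chain[0]["data"] = "lettuce harvested, farm B"
    print(block_hash(chain[0]) == chain[1]["prev_hash"])  # False: tamper detected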

Infrared Heating

Thanks to the rising popularity of ready-to-eat meals, the industry is under pressure to adopt new preservation and pasteurization methods. In particular, processors must be able to sanitize foods and package them with minimal exposure and bacteria levels. This practice allows foods to stay fresh longer and protects customers from potential foodborne illness.

Infrared heating is a method of surface pasteurization, and has been used for meats such as ham. Infrared lamps radiate heat at low temperatures, effectively killing surface bacteria and contaminants. The idea is to decontaminate or sanitize the surface of foods before final packaging occurs.

Industrial IoT and Smart Sensors

The food and beverage industry has a rather unique challenge with regard to supply chain operations. Food may be clean and correctly handled at the source with no traces of contamination, but it’s then passed on to a third party, which changes the game. Maybe a refrigerated transport breaks down, and the food within is thawed out. Perhaps a distributor doesn’t appropriately store perishable goods, resulting in serious contamination.

This transportation stage can be more effectively tracked and optimized with the help of modern IoT and smart, connected sensors. RFID tags, for instance, can be embedded in food packaging to track movements and various statistics. Additional sensors can monitor storage temperatures, travel times, unexpected exposure, package tears and more.

More importantly, they’re often connected to a central data processing system where AI and machine learning platforms or human laborers can identify problematic changes. This setup allows supply chain participants to take action sooner in order to remedy potential problems or even pull contaminated goods out of the supply.
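As an illustration of that kind of automated screening (the thresholds, reading interval and values are hypothetical), a temperature stream from a refrigerated shipment might be checked like this:

    # Hypothetical cold-chain readings: (hours into transit, temperature in Celsius),
    # logged every 4 hours by a connected sensor.
    readings = [(0, 2.1), (4, 3.0), (8, 7.5), (12, 8.2), (16, 3.4)]

    MAX_TEMP_C = 5.0      # illustrative safe-holding limit
    MAX_HOURS_ABOVE = 4   # illustrative tolerated time above the limit
    INTERVAL_H = 4        # hours between readings

    excursions = [(t, c) for t, c in readings if c > MAX_TEMP_C]
    if len(excursions) * INTERVAL_H > MAX_HOURS_ABOVE:
        print("flag shipment for inspection:", excursions)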

They can also help cut down on fraud or falsified records, which is a growing problem in the industry. Imagine an event where an employee says that a package was handled properly via forms or reporting tools, yet it was exposed to damaging elements. The implications of even simple fraud can be significant. Technology that automatically and consistently reports information—over manual entry—can help eliminate this possibility altogether.

Next-Generation Sequencing

NGS refers to a high-throughput DNA sequencing process that is now available to the food industry as a whole. It’s cheaper, more effective and much faster than earlier sequencing approaches, which means DNA and RNA sequencing is more accessible to food companies and suppliers than it has ever been.

NGS can be used to assess and sequence hundreds of different samples at a time, at rates of up to 25 million reads per experiment. That means monitoring teams can accurately identify foodborne pathogens and contamination at the speed of the modern market. It is also a highly capable form of food safety measurement and is quickly replacing older molecular methods like PCR.

Ultimately, NGS will lead to vastly improved testing and measurement processes, which can identify potential issues faster and in higher quantities than traditional methods. The food industry will be all the better and safer for it.

The Market Is Ever Evolving

While these technologies are certainly making a splash—and will shape the future of the food safety industry—they do not exist in a vacuum. There are dozens of other technologies and solutions being explored. It is important to understand that many new technologies could rise to the surface even within the next year.

The good news is that it’s all meant to improve the industry, particularly when it comes to the freshness, quality and health of the goods that consumers eat.

Allergen Alley

Method Acting: Comparing Different Analytical Methods for Allergen Testing and Verification

By Gabriela Lopez-Velasco, Ph.D.

Every day, food industries around the world work to comply with the food labeling directives and regulations in place to inform consumers about specific ingredients added to finished products. Of course, special attention has been placed on ensuring that product packaging clearly declares the presence of food allergens including milk, eggs, fish, crustacean shellfish, tree nuts, peanuts, wheat, soy, sesame and mustard. (Additional food allergens may also be included in other regions.)

But labeling only covers the ingredients deliberately added to foods and beverages. In reality, food manufacturers have two jobs when it comes to serving the needs of their allergic consumers:

  1. Fully understand and clearly declare the intentional presence of allergenic foods
  2. Prevent the unintended presence of allergenic foods in their products

Almost half of food recalls are the result of undeclared allergens, and often these at-fault allergens were not only undeclared but unintended. As such, the unintended presence of allergenic foods must be carefully considered when establishing an allergen control plan for a food processing facility.

How? It starts with a risk assessment process that evaluates the likelihood of unintentionally present allergens that could originate from raw materials, cross-contact contamination in equipment or tools, transport and more. Once the risks are identified, risk management strategies should then be established to control allergens in the processing plant environment.

It is necessary to validate these risk management strategies and procedures in order to demonstrate their effectiveness. After validation, those strategies and procedures should be periodically verified to show that the allergen control plan in place remains effective.

In several of these verification procedures it may be necessary to utilize an analytical test to determine the presence or absence of an allergenic food or to quantify its level, if present. Indeed, selecting an appropriate method to assess the presence or the level of an allergenic food is vitally important, as the information provided by the selected method will inform crucial decisions about the safety of an ingredient, equipment or product that is to be released for commercialization.

A cursory review of available methods can be daunting. There are several emerging methods and technologies for this application, including mass spectroscopy, surface plasmon resonance, biosensors and polymerase chain reaction (PCR). Each of these methods have made advancements, and some of them are already commercialized for food testing applications. However, for practical means, we will discuss those methods that are most commonly used in the food industry.

In general, there are two types of analytical methods used to determine the presence of allergenic foods: Specific and non-specific methods.

Specific tests

Specific methods detect target proteins from the allergenic food in a sample. These include immunoassays, in which specific antibodies recognize and bind to target proteins. The format of these assays can be quantitative, such as an enzyme-linked immunosorbent assay (ELISA), which can help determine the concentration of target proteins in a food sample. Or it can be qualitative, such as a lateral flow device, which within a few minutes and with minimum sample preparation shows whether a target protein is present. (Note: Some commercial formats of ELISA are also designed to produce a qualitative result.)
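To illustrate how a quantitative result is derived from an ELISA (all values below are invented, and real assay software typically fits a four-parameter logistic curve rather than this straight-line simplification), the unknown is interpolated from a standard curve:

    import numpy as np

    # Hypothetical standard curve: optical density (OD) vs. known concentration (ppm).
    standards_ppm = np.array([0.5, 1.0, 2.5, 5.0, 10.0])
    standards_od = np.array([0.12, 0.21, 0.48, 0.90, 1.65])

    # Fit OD as a linear function of log-concentration (a simplification; real
    # ELISA analysis usually uses a 4-parameter logistic fit).
    slope, intercept = np.polyfit(np.log10(standards_ppm), standards_od, 1)

    def od_to_ppm(od: float) -> float:
        """Interpolate an unknown sample's concentration from its measured OD."""
        return 10 ** ((od - intercept) / slope)

    print(f"sample at OD 0.60 ~ {od_to_ppm(0.60):.2f} ppm")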

To date, ELISA assays have become a method of choice for detection and quantification of proteins from food allergens by regulatory entities and inspection agencies. For the food industry, ELISA can also be used to test raw ingredients and final food products. In addition, ELISA is a valuable analytical tool to determine the concentration of proteins from allergenic foods during a cleaning validation process, as some commercial assay suppliers offer methods to determine the concentration of target proteins from swabs utilized to collect environmental samples, clean-in-place (CIP) final rinse water or purge materials utilized during dry cleaning.

ELISA methods often require laboratory equipment and technical skill to implement. Rapid specific methods, such as lateral flow immunoassays, also allow detection of specific target proteins. Given their minimal sample preparation and short time-to-result, they are valuable tools for cleaning validation and routine cleaning verification, with the advantage of a sensitivity similar to the lowest limit of quantification of an ELISA assay.

The use of a specific rapid immunoassay provides a presence/absence result that determines whether equipment, surfaces or utensils have been cleaned to the point where proteins from allergenic foods are undetectable at a certain limit of detection; the equipment can then be used to process a product that should not contain the food allergen. Some commercial rapid immunoassays offer protocols for testing raw materials and final product, allowing food producers to analyze foods and ingredients for the absence of a food allergen with minimal laboratory infrastructure and enabling in-house testing of these samples. This can be a useful rapid verification tool for analyzing final product processed shortly after the first production run following an equipment cleaning.

Non-Specific Tests

While non-specific testing isn’t typically the best option for a cleaning validation study, these tests may be used for routine cleaning verification. Examples of non-specific tests include total protein or ATP tests.

Tests that determine total protein are often based on a colorimetric reaction. For example, commercial products utilize a swab format that, after being used to survey a defined area, is placed in a solution that will result in a color change if protein is detected. The rationale is that if protein is not detected, it may be assumed that proteins from allergenic foods were removed during cleaning. However, when total protein is utilized for routine verification, it is important to consider that the sensitivity of protein swabs may differ from the sensitivity of specific immunoassays. Consequently, highly sensitive protein swabs should be selected when feasible.

ATP swab tests are also commonly utilized by the food industry as a non-specific tool for hygiene monitoring and cleaning verification. However, the correlation between ATP and protein is not always consistent. Because the ATP present in living somatic cells varies with the food type, ATP should not be considered as a direct marker to assess the removal of allergenic food residues after cleaning. Instead, an analytical test designed for the detection of proteins should be used alongside ATP swabs to assess hygiene and to assess removal of allergenic foods.

Factors for Using One Test Versus Another

For routine testing, the choice of using a specific or a non-specific analytical method will depend on various factors including the type of product, the number of allergenic ingredients utilized for one production line, whether a quantitative result is required for a particular sample or final product, and, possibly, the budget that is available for testing. In any case, it is important that when performing a cleaning validation study, the method used for routine testing also be included to demonstrate that it will effectively reflect the presence of an allergenic food residue.

Specific rapid methods for verification are preferable because they enable direct monitoring of the undesirable presence of allergenic foods. For example, they can be used in conjunction with a non-specific protein swab: Based on the sampling plan, specific tests can be used periodically (e.g., weekly) for sites identified as high-risk because they may be harder to clean than other surfaces, while non-specific protein swabs are used after every production changeover for all sites previously defined in the sampling plan. These and any other scenarios should be discussed while developing an allergen control plan, and the advantages and risks of selecting any method(s) should be evaluated.

As with all analytical methods, commercial suppliers will perform validation of the methods they offer to ensure the method is suitable for testing a particular analyte. However, given the great diversity of food products, different sanitizers and chemicals used in the food industry, and the various processes to which a food is subjected during manufacturing, it is unlikely that commercial methods have been exhaustively tested. Thus, it is always important to ensure that the method is fit-for-purpose and to verify that it will recover or detect the allergen residues of interest at a defined level.


PCR or LAMP: Food Safety Considerations when Choosing Molecular Detection Methods

By Joy Dell’Aringa, Vikrant Dutta, Ph.D.

Food microbiology pathogen detection technology is constantly evolving and improving for fast, efficient and accurate analysis. Thanks to the wide commercialization of easy-to-use diagnostic kits, the end-user no longer needs a deep understanding of the intricacies of diagnostic chemistries to perform the analysis. However, when navigating the selection process in search of the technology that is best fit-for-purpose, it is critical to understand the key differences in principle of detection and how they can impact both operations and risk. Here, we will explore the difference between two broad categories of molecular pathogen detection: PCR and isothermal technologies such as LAMP.

PCR & LAMP Detection Chemistries: An Overview

PCR detection chemistries have come a long way, from non-specific DNA-binding dyes like SYBR Green to highly precise sequence-specific molecular probes. The efficiency of the real-time PCR reaction today allows for a variety of detection probes, the most popular being dual-labeled fluorescent probes such as FRET, TaqMan and molecular beacon probes.1 The precision of these probes is showcased in their ability to distinguish allelic single-nucleotide polymorphisms (SNPs).2,3 The most prevalent isothermal chemistry, loop-mediated isothermal amplification (LAMP), typically does not use molecular probes, owing to the lack of structural and formational consistency in its amplified products. As a result, LAMP mostly relies on detection through non-specific signal generation, such as ATP bioluminescence or non-specific dyes; in theory, this signal could come from specific and non-specific amplification events alike. This also leaves LAMP unable to detect allelic polymorphisms, which in some cases are critical for detecting crucial variations, such as between closely related species or within serotypes. In the end, the detection chemistries are only as good as the amplified products. (A toy contrast between probe-based and non-specific detection follows the takeaways below.)

Key Takeaways:

  • PCR technology has improved greatly in detection efficiencies via target specific probes
  • LAMP technology typically does not utilize specific molecular probes, but instead relies on indirect signal generation
  • Target-specific probes ensure signal from specific amplification events only
  • Indirect signal can come from specific and non-specific amplification events, which can lead to a reduced specificity and inability to detect in certain cases
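To make the contrast concrete, here is a toy Python sketch (the sequences are invented; real probe chemistry is far more complex): a sequence-specific probe only signals on its target, while a non-specific readout signals on any amplified product, specific or not.

    # Toy contrast between probe-based and non-specific detection (illustrative only).

    def probe_signal(amplicon: str, probe: str) -> bool:
        """Sequence-specific probe: signal only if the probe sequence is present."""
        return probe in amplicon

    def nonspecific_signal(amplicons: list) -> bool:
        """Non-specific dye/bioluminescence: any amplified product gives signal."""
        return len(amplicons) > 0

    target = "ACGGTTCAGTAC"   # invented target amplicon
    nsa = "TTTTGGGGCCAA"      # invented non-specific amplification product
    probe = "GGTTCA"          # matches the target only

    print(probe_signal(target, probe), probe_signal(nsa, probe))  # True False
    print(nonspecific_signal([nsa]))  # True: NSA alone still produces a signal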

PCR & LAMP: Amplification Strategies

Food safety pathogen detection protocols aim to find a single cell of a target organism lurking in a relatively large sample. To achieve detection, molecular technologies use amplification strategies to increase the concentration of target DNA to a detectable level. Nucleic acid amplification in both PCR and isothermal technologies starts by making a variety of amplified products, including non-specific amplifications (NSA) and specific (target) amplifications.4,5,6,7 Ideally, the concentration of the desired target product increases over time to levels above the NSA, at which point the detection chemistries can provide a detectable signal from the desired (target) product. Various reaction components, such as target DNA concentration, polymerase, buffers and primers, play a defining role in maintaining the progressive amplification dynamics, and thereby act as core contributors to the robustness of the reaction. However, none contributes more crucially to the success of a reaction than temperature. Herein lies a key difference between the fundamentals of PCR and isothermal amplification technologies. (A toy simulation of this amplification race follows the takeaways below.)

Key Takeaways:

  • PCR and LAMP both make a variety of amplification products: Non-Specific (NSA) and Specific (target)
  • Ideally, target products increase above the levels of NSA to reach a reliable detectable signal
  • A variety of factors contribute to the overall robustness of the reaction
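The toy simulation below (the growth rates and threshold are invented for illustration) shows why keeping target amplification ahead of NSA matters: detection is declared only once the target signal rises well above the non-specific background.

    # Toy amplification race between target product and non-specific products (NSA).
    # Growth factors and the detection threshold are illustrative, not measured.

    target, nsa = 1.0, 1.0
    THRESHOLD_RATIO = 100.0  # hypothetical signal-to-background needed for detection

    for cycle in range(1, 41):
        target *= 1.9  # efficient, near-doubling amplification of the target
        nsa *= 1.3     # slower accumulation of non-specific products
        if target / nsa >= THRESHOLD_RATIO:
            print(f"target detectable above background at cycle {cycle}")
            break
    else:
        print("no detection within 40 cycles")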

What Is the Difference between PCR and Isothermal Detection Technologies?

A key foundational difference between the two technologies lies in the utilization of thermal profiles. PCR uses thermocycling, while isothermal methods do not. This difference shapes how the two amplification chemistries work. In PCR, the cyclical denaturation of DNA during thermocycling separates all dimers (specific and non-specific). As the reaction progresses, this leads to frequent correction of the amplification dynamics away from the NSA and favors amplification of the desired target. Isothermal chemistries cannot correct the NSA through thermocycling, so they must rely on alternate mechanisms to achieve the same result. For example, LAMP utilizes “nested” primers, where primer sequences outside the target region create early amplification products that are subsequently used as templates for the desired target amplifications. The presence of these extra primers, along with the diverse amplified structures formed during the LAMP reaction, creates many more opportunities for NSA production.5,8,9 This causes a less controlled, less efficient amplification, and is perhaps why preheating the DNA prior to LAMP has been shown to increase LAMP’s sensitivity.10,11 To the end user, this inefficiency can manifest in various ways, such as restricted multiplexing, lack of an internal amplification control, complex assay design, tedious sample prep methods and an increased chance of inaccurate results (i.e., false positives and false negatives).12 Scientific literature does provide a fair amount of evidence that, under controlled conditions, an isothermal amplification reaction can provide results equivalent to PCR. Isothermal chemistries also usually require simpler instruments and can thereby present interesting opportunities in non-conventional test environments with simple and predictable matrices. This likely explains the early footing of isothermal technologies in the clinical test environment as a “point of care test” (POCT) alternative. However, it must also be noted that PCR has recently been adapted and successfully commercialized for the POCT format as well.13,14

Key Takeaways:

  • PCR utilizes thermocycling, Isothermal does not
  • In PCR, thermocycling allows for the reaction to favor the target amplification over the NSA
  • LAMP must rely on alternate mechanisms to correct for NSA and these mechanisms lead to a less controlled and therefore inefficient amplification
  • Under controlled conditions, isothermal technology can provide equivalent results to PCR
  • Low instrumentation requirements make isothermal technologies interesting for non-conventional test environments (i.e. POCT); however, PCR has also been recently adapted as a POCT

Internal Amplification Controls in Molecular Pathogen Detection Technologies: The Value & The Challenges

The purpose of an internal amplification control (IAC) is to provide an indication of the efficacy of the test reaction chemistry. The closer the IAC is to the target DNA sequence, the better the view into the inner workings of each reaction. For food microbiology testing, the role of the IAC is more important now than ever. Driven by regulations, industry self-accountability and brand protection initiatives, more food laboratories are testing diverse product types with novel and innovative formulations and ingredients. IAC capability not only helps with troubleshooting, it also allows for more confident adoption of the technology for new and diverse food and environmental matrices.

Over the years, PCR has progressively developed into a robust and efficient technology that can provide a dynamic IAC, giving the end user a direct look into the compatibility of the test matrix with the PCR reaction. From a single reaction, we can now make a qualitative assessment of whether the crude DNA prep from a matrix undergoing testing is working with the PCR or inhibiting the reaction. With legacy technologies, including older-generation PCRs, we were limited to an “it-did-not-work” scenario, leaving the end user blind to any insight into the reason. Since isothermal chemistries typically do not have an IAC, the end user is vulnerable to false results. Even when isothermal chemistries such as the nicking enzyme amplification reaction (NEAR) can provide an IAC, it typically does not mimic the target reaction and is therefore not a direct indicator of the reaction dynamics, returning the end user to the “it-did-not-work” scenario. LAMP technology attempts to mitigate the absence of an IAC by performing a separate, external reaction with each test matrix. This strategy leaves the final result vulnerable to a number of factors that are otherwise non-existent with an IAC: Sampling variations, reagent and machine anomalies, and user error. External control approaches also place a notable burden on the end user, as demonstrating fit-for-purpose of the method for even the smallest matrix composition change increases both validation and verification activities, which can have a notable financial impact on the laboratory.
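The decision value of an IAC can be summarized in a few lines of Python (a conceptual sketch, not any vendor’s algorithm): without the IAC signal, a negative target result cannot be distinguished from a failed, inhibited reaction.

    # Conceptual interpretation of a molecular test that carries an IAC.

    def interpret(target_signal: bool, iac_signal: bool) -> str:
        if target_signal:
            return "presumptive positive for the target pathogen"
        if iac_signal:
            return "negative: IAC confirms the reaction chemistry worked"
        return "invalid: possible matrix inhibition, retest required"

    print(interpret(True, True))    # presumptive positive
    print(interpret(False, True))   # trustworthy negative
    print(interpret(False, False))  # without an IAC this would be misread as negative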

There are a few reasons why IAC incorporation is not always plausible for isothermal technologies such as LAMP. First, inefficient, less-controlled amplification reactions leave little room for reliable and meaningful supplementary reactions, like those required for an IAC. Second, the lack of consistent amplified products makes it much more difficult to pinpoint a DNA structure that can dependably serve as an IAC. Third, the lack of specific detection mechanisms makes it hard to distinguish the signal of the target from that of the IAC reaction.

Key Takeaways:

  • Internal amplification controls (IAC) are critical for the food industry due to complex and ever-changing matrix formulations
  • IAC is useful for troubleshooting, optimizing assay performance, and adapting test for novel matrices
  • PCR has evolved to provide dynamic IAC, leading to increased confidence in results
  • LAMP is not able to utilize IAC due to the nature of the amplification products, reaction efficiency, and lack of specific detection mechanisms
