Tag Archives: DNA

Raj Rajagopal, 3M Food Safety
In the Food Lab

Pathogen Detection Guidance in 2020

By Raj Rajagopal
No Comments
Raj Rajagopal, 3M Food Safety

Food production managers have a critical role in ensuring that the products they make are safe and uncontaminated with dangerous pathogens. Health and wellness are in sharp focus for consumers in every aspect of their lives right now, and food safety is no exception. As food safety becomes a continually greater focus for consumers and regulators, the technologies used to monitor for and detect pathogens in a production plant have become more advanced.

It’s no secret that pathogen testing is performed for numerous reasons: to confirm the adequacy of process controls and to verify that foods and beverages have been properly stored or cooked, to name a few. Accomplishing these objectives can require very different approaches, and depending on their situations, processors rely on different tools to provide varying degrees of testing simplicity, speed, cost, efficiency and accuracy. It’s common today to leverage multiple pathogen diagnostics, ranging from traditional culture-based methods to molecular technologies.

Unfortunately, pathogen detection involves more than just subjecting finished products to examination. It’s become increasingly clear to the industry that the environment in which food is processed can cross-contaminate products, requiring food manufacturers to be ever-vigilant in cleaning, sanitizing, sampling and testing their sites.

For these reasons and others, it’s important to have an understanding and appreciation for the newer tests and techniques used in the fight against deadly pathogens, and where and how they might be fit for purpose throughout the operation. This article sheds light on the key features of one fast-growing DNA-based technology that detects pathogens and explains how culture methods for index and indicator organisms continue to play crucial roles in executing broad-based pathogen management programs.

LAMP’s Emergence in Molecular Pathogen Detection

Molecular pathogen detection has been a staple technology for food producers since the adoption of polymerase chain reaction (PCR) tests decades ago. However, the USDA FSIS revised its Microbiology Laboratory Guidebook, the official guide to the preferred methods the agency uses when testing samples collected from audits and inspections, last year to include new technologies that utilize loop-mediated isothermal amplification (LAMP) methods for Salmonella and Listeria detection.

LAMP methods differ from traditional PCR-based testing methods in four noteworthy ways.

First, LAMP eliminates the need for thermal cycling. Fundamentally, PCR tests require thermocyclers that can alter the temperature of a sample to drive the reaction. The thermocyclers used for real-time PCR tests that allow detection in closed tubes can be expensive and include multiple moving parts that require regular maintenance and calibration. For every food, beverage or environmental surface sample tested, PCR systems undergo repeated cycles of heating to about 95°C to separate the DNA strands and cooling to about 60°C to extend the new DNA chain. All of this temperature cycling generally adds run time, and the enzyme used in PCR, Taq polymerase, is subject to interference from inhibiting substances that are native to a sample and co-extracted with the DNA.
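The practical effect of those heat-and-cool cycles is exponential copying: each cycle at best doubles the target DNA. A minimal sketch of that arithmetic (illustrative only; the `pcr_copies` function and its perfect-efficiency assumption are ours, not part of any commercial assay):

```python
# Idealized PCR amplification: each thermal cycle (denature at ~95 C,
# anneal/extend at ~60 C) at best doubles the number of target DNA copies.
def pcr_copies(start_copies: int, cycles: int, efficiency: float = 1.0) -> int:
    """Copies after `cycles` rounds, assuming a fixed per-cycle efficiency
    (1.0 = perfect doubling; real reactions fall somewhat short of this)."""
    copies = start_copies
    for _ in range(cycles):
        copies = int(copies * (1 + efficiency))
    return copies

# One target molecule after 35 ideal cycles: 2**35, roughly 34 billion copies.
print(pcr_copies(1, 35))  # -> 34359738368
```

Lowering `efficiency` below 1.0 shows why inhibitor-compromised reactions need more cycles, and therefore more run time, to reach a detectable signal.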

LAMP amplifies DNA isothermally at a steady and stable temperature, right around 60°C. The Bst polymerase it uses allows continuous amplification and better tolerates the sample matrix inhibitors known to trip up PCR. The detection schemes used for LAMP also free its instrumentation from the constraints of numerous moving parts.

Second, it at least doubles the number of DNA primers. Traditional PCR tests recognize two separate regions of the target genetic material, relying on two primers that anneal to the subject’s separated DNA strands to copy and amplify that target DNA.

By contrast, LAMP technology uses four to six primers, which can recognize six to eight distinct regions of the target DNA. These primers, together with a strand-displacing polymerase, not only separate the DNA strands but loop the ends of the strands together before amplification begins. This unique looped structure both accelerates the reaction and increases test sensitivity by allowing an exponential accumulation of target DNA.

Third, it removes steps from the workflow. Before any genetic amplification can happen, technicians must enrich their samples to deliberately grow microorganisms to detectable levels. Technicians using PCR tests then have to pre-dispense lysis buffers or reagent mixes and take other careful steps to extract and purify their DNA samples.

Commercialized LAMP assay kits, on the other hand, offer more of a ready-to-use approach, with pre-prepared lysis buffers and a simplified workflow for preparing DNA samples. By requiring only two transfer steps, they can significantly reduce the risk of false negatives caused by erroneous laboratory preparation.

Finally, it simplifies multiple test protocols into one. Food safety lab professionals using PCR technology have historically been required to perform a different test protocol for each individual pathogen, whether that be Salmonella, Listeria, E. coli O157:H7 or others. Not surprisingly, this can increase the chances of error. Labs are often resource-challenged, high-pressure environments, and having to keep multiple testing protocols straight all of the time has proven to be a recipe for trouble.

LAMP brings the benefit of a single assay protocol for all pathogen tests. This streamlined workflow with minimal steps simplifies the process and reduces the risk of human error.

Index and Indicator Testing

LAMP technology has streamlined and advanced pathogen detection, but it’s impractical for producers to molecularly test every single product they make and every nook and cranny of their production environments. This is where an increasing number of companies are utilizing index and indicator tests as part of more comprehensive pathogen environmental monitoring programs. Rather than testing for specific pathogenic organisms, these tools give a microbiological warning sign that conditions may be conducive to undesirable food safety or quality outcomes.

Index tests are culture-based tests that detect microorganisms whose presence (or detection above a threshold) suggests an increased risk for the presence of an ecologically similar pathogen. Listeria spp. is the best-known index organism, as its presence can also mark the presence of the deadly pathogen Listeria monocytogenes. However, there is considerable skepticism among many in the research community as to whether any organisms outside of Listeria spp. can be given this classification.

Indicator tests, on the other hand, detect the presence of organisms reflecting the general microbiological condition of a food or the environment. The presence of indicator organisms cannot provide any information on the potential presence or absence of a specific pathogen, or an assessment of potential public health risk, but levels above acceptable limits can indicate insufficient cleaning and sanitation or poor operating conditions.

Should indicator test results exceed established control limits, facilities are expected to take appropriate corrective action and to document the actions taken and results obtained. Utilizing cost-effective, fast indicator tests as a benchmark to catch and identify problem areas can signal that more precise molecular methods need to be used to verify that products are uncontaminated.

Process Matters

As discussed, technology plays a large role in pathogen detection, and advances like LAMP molecular detection methods combined with strategic use of index and indicator tests can provide food producers with powerful tools to safeguard their consumers from foodborne illnesses. However, whether a producer is testing environmental samples, ingredients or finished product, a test is only as useful as the comprehensive pathogen management plan around it.

The entire food industry is striving to meet the highest safety standards, and the best course of action is to adopt a solution that combines the best technologies available with best practices in terms of process as well, from sample collection and preparation to monitoring and detection.

Benjamin Katchman, PathogenDx
In the Food Lab

Revolutionary Rapid Testing for Listeria monocytogenes and Salmonella

By Benjamin A. Katchman, Ph.D., Michael E. Hogan, Ph.D., Nathan Libbey, Patrick M. Bird
No Comments
Benjamin Katchman, PathogenDx

The Golden Age of Bacteriology: Discovering the Unknown in a Farm-to-Market Food Supply.

The last quarter of the 19th century was both horrific and exciting. The world had just emerged from four decades of epidemics of cholera, typhoid fever and other enteric diseases for which no cause was known. Thus, the great scientific minds of Europe sought understanding. Robert Koch integrated Pasteur’s Germ Theory (1861) with the high technology of the day: Mathematical optics and the first industrialized compound microscopes (Siebert, Leiss, 1877), heterocycle chemistry, high-purity solvents (i.e., formaldehyde), the availability of engineered glass suitable as microscope slides and precision-molded parts such as tubes and plates in 1877, and industrialized agar production from seaweed in Japan in 1860. The enduring fruit of Koch’s technology integration tour de force is well known: Dye staining of bacteria for sub-micron microscopy, the invention of 13 cm x 1 cm culture tubes and the invention of the “Petri” dish coupled to agar-enriched culture media. Those technologies not only launched “The Golden Age of Bacteriology” but also guided the entire field of analytical microbiology for two lifetimes, becoming the bedrock of 20th-century food safety regulation (the Federal Food, Drug and Cosmetic Act in 1938) and remaining in use well into the 21st century with FSMA.

Blockchain Microbiology: Managing the Known in an International Food Supply Chain.

If Koch were to reappear in 2020 and were presented with a manual of technical microbiology, he would have little difficulty recognizing the current practice of cell fixation, staining and microscopy, or the SOPs associated with fluid phase enrichment culture and agar plate culture on glass dishes (still named after his lab assistant). The point to be made is that the analytical plate culture technology developed by Koch was game changing then, in the “farm-to-market” supply chain in Koch’s hometown of Berlin. But today, plate culture still takes about 24 to 72 hours for broad class indicator identification and 48 to 96 hours for limited species level identification of common pathogens. In 1880, life was slow and that much time was needed to travel by train from Paris to Berlin. In 2020, that is the time needed to ship food to Berlin from any place on earth. While more rapid tests have been developed such as the ATP assay, they lack the speciation and analytical confidence necessary to provide actionable information to food safety professionals.

It can be argued that leading up to 2020, there has been a significant paradigm shift in the understanding of microbiology (genetics, systems-based understanding of microbial function), which can now be coupled to new Third Industrial Age technologies to make the 2020 international food supply chain safer.

We Are Not in 1880 Anymore: The Time has Come to Move Food Safety Testing into the 21st Century.

Each year, there are more than 48 million illnesses in the United States due to contaminated food.1 These illnesses place a heavy burden on consumers, food manufacturers, healthcare, and other ancillary parties, resulting in more than $75 billion in cost for the United States alone.2 This figure, while seemingly staggering, may grow in future years as reporting continues to improve. For Salmonella-related illnesses alone, an estimated 97% of cases go unreported, and Listeria monocytogenes is estimated to cause about 1,600 illnesses each year in the United States, with more than 1,500 related hospitalizations and 260 related deaths.1,3 As reporting increases, food producers and regulatory bodies will feel an increased need to surveil all aspects of food production, from soil and air to final product and packaging. The current standards for pathogen-related agricultural and environmental testing (culture-based methods, qPCR and ATP assays) are not able to deliver the speed, multiplexing and specificity required to meet the current and future demands of the industry.

At the DNA level, technologies such as single-cell PCR, high-throughput sequencing and microarrays provide the ability to identify multiple microbes in less than 24 hours with high levels of sensitivity and specificity (see Figure 1). With unique sample prep methods that obviate enrichment, DNA extraction and purification, these technologies will continue to reduce total test turnaround times into the single-digit hours while simultaneously reducing the cost per test to within the economic window of the food safety testing world. There are still growing pains as the industry begins to accept these new molecular approaches to microbiology, which require advanced training, novel technology and integrated software analysis.

It is easy to envision that the digital data obtained from DNA-based microbial testing could become the next-generation gold standard as a “system parameter” of the food supply chain. Imagine, for instance, that at the time of shipping a container, a data vector would be produced (i.e., time stamp out, location out, invoice, Listeria speciation and/or serovar discrimination, Salmonella speciation and/or serovar discrimination; refer to Figure 1), where the added microbial data would be treated as another important digital attribute of the load. Though it may seem far-fetched, early prototyping through the CDC and USDA has already begun at sites in the U.S. trucking industry, based on DNA microarray- and sequencing-based microbial testing.

Given that “Third Industrial Revolution” technology can now be used to make microbial detection fast, digital, internet-enabled and culture-free, we argue here that molecular testing of the food chain (DNA- or protein-based) should, as soon as possible, be developed and validated to replace culture-based analysis.

Broad Microbial Detection
Current microbiological diagnostic technology is only able to provide broad species- or family-level identification of different pathogens. New and emerging molecular diagnostic technology offers highly multiplexed, rapid, sensitive and specific platforms at increasingly affordable prices. Graphic courtesy of PathogenDx.

References

  1. Scallan, E., Hoekstra, R. M., Angulo, F. J., Tauxe, R. V., Widdowson, M. A., Roy, S. L., … Griffin, P. M. (2011). Foodborne illness acquired in the United States–major pathogens. Emerging Infectious Diseases, 17(1), 7–15. doi:10.3201/eid1701.p11101
  2. Scharff, R. L. (2012). Economic burden from health losses due to foodborne illness in the United States. Journal of Food Protection, 75(1), 123–131. doi:10.4315/0362-028X.JFP-11-058
  3. Mead, P. S., Slutsker, L., Dietz, V., McCaig, L. F., Bresee, J. S., Shapiro, C., … Tauxe, R. V. (1999). Food-related illness and death in the United States. Emerging Infectious Diseases, 5(5), 607–625. doi:10.3201/eid0505.990502
Susanne Kuehne, Decernis
Food Fraud Quick Bites

Comparing Ceylon and Cassia

By Susanne Kuehne
No Comments
Susanne Kuehne, Decernis
Food fraud, cinnamon
Find records of fraud such as those discussed in this column and more in the Food Fraud Database. Image credit: Susanne Kuehne.

Cinnamon is in high demand worldwide, with Ceylon cinnamon, or true cinnamon (Cinnamomum verum), the most sought-after and higher-priced variety. It is therefore tempting to “cut” Ceylon cinnamon with cheaper cassia cinnamon. Previous detection methods for such adulteration included HPLC testing and DNA barcoding, which were time-consuming and could only be applied by experts. New FT-NIR (Fourier transform near-infrared) and FTIR (Fourier transform infrared) spectroscopic methods in combination with multivariate analysis enable quick detection of cinnamon adulteration.

Resources

  1. Yasmin, J., Ahmed, M. R., Lohumi, S., Wakholi, C., Lee, H., Mo, C. and Cho, B.-K. (2019). “Rapid authentication measurement of cinnamon powder using FT-NIR and FT-IR spectroscopic techniques”. Quality Assurance and Safety of Crops & Foods, 11(3), 257–267. Retrieved from Wageningen Academic Publishers, wageningenacademic.com
Lettuce

Romaine Lettuce Likely Source of Widespread E. Coli Outbreak

By Food Safety Tech Staff
No Comments
Lettuce

At least 35 people in 11 states have been infected with E. coli O157:H7, according to the CDC, and the FDA is investigating a likely link between these infections and chopped romaine lettuce from Yuma, Arizona. The reported illnesses occurred between March 22 and March 31, and 93% of the 28 people interviewed reported eating romaine lettuce (mainly from a restaurant) during the week that they became ill.

The FDA and CDC are advising consumers to ask restaurants and other food service establishments where they source their romaine lettuce from and to avoid any that came from Yuma, Arizona. In addition, they should not buy or eat it if they cannot confirm the source.

“Retailers, restaurants, and other food service operators should not sell or serve any chopped romaine lettuce from the winter growing areas in Yuma, Arizona. If you cannot determine the source of your chopped romaine lettuce, do not sell or serve it. The FDA currently does not have information to indicate that whole-head romaine lettuce or hearts of romaine have contributed to this outbreak.” – FDA

The agencies will continue to investigate this outbreak. The FDA emphasized that it is not related to the multistate outbreak involving leafy greens that occurred last November and December, as those infections had a different DNA fingerprint of the E. coli O157:H7 bacteria.

Sequencing pattern, pathogens

Build Stronger Food Safety Programs With Next-Generation Sequencing

By Akhila Vasan, Mahni Ghorashi
No Comments
Sequencing pattern, pathogens

According to a survey by retail consulting firm Daymon Worldwide, 50% of today’s consumers are more concerned about food safety and quality than they were five years ago. Their concerns are not unfounded. Recalls are on the rise, and consumer health is put at risk by undetected cases of food adulteration and contamination.

While consumers are concerned about the quality of the food they eat, buy and sell, the brands responsible for making and selling these products also face serious consequences if their food safety programs don’t safeguard against devastating recalls.

A key cause of recalls is food fraud: the deliberate and intentional substitution, addition, tampering or misrepresentation of food, food ingredients or food packaging. It continues to be an issue for the food safety industry. According to PricewaterhouseCoopers, food fraud is estimated to be a $10–15 billion a year problem.

Some of the more notorious examples include wood shavings in Parmesan cheese, the 2013 horsemeat scandal in the United Kingdom, and Oceana’s landmark 2013 study, which revealed that a whopping 33% of seafood sold in the United States is mislabeled. While international organizations like Interpol have stepped up to tackle food fraud, which is exacerbated by the complexity of globalization, academics estimate that 4% of all food is adulterated in some way.

High-profile outbreaks due to undetected pathogens are also a serious risk for consumers and the food industry alike. The United States’ economy alone loses about $55 billion each year due to foodborne illnesses. The World Health Organization estimates that nearly 1 in 10 people become ill every year from eating contaminated food. In 2016 alone, several high-profile outbreaks rocked the industry, harming consumers and brands alike. From the E. coli O26 outbreak at Chipotle to Salmonella in live poultry to Hepatitis A in raw scallops to the Listeria monocytogenes outbreak at Blue Bell ice cream, the food industry has dealt with many challenges on this front.

What’s Being Done?

Both food fraud and undetected contamination can cause massive, expensive and damaging recalls for brands. Each recall can cost a brand about $10 million in direct costs, and that doesn’t include the cost of brand damage and lost sales.

Frustratingly, more recalls due to food fraud and contamination are happening at a time when regulation and policy are stronger than ever. As the global food system evolves, regulatory agencies around the world are fine-tuning or overhauling their food safety systems, taking a more preventive approach.

At the core of these changes is HACCP, the long-implemented and well-understood method of evaluating and controlling food safety hazards. In the United States, while HACCP is still used in some sectors, the move to FSMA is apparent in others. In many ways, 2017 has been dubbed the year of FSMA compliance.

There is also the Global Food Safety Initiative (GFSI), a private industry conformance standard for certification, which was established proactively by industry to improve food safety throughout the supply chain. It is important to note that all regulatory drivers, be they public or private, work together to ensure the common goal of delivering safe food for consumers. However, more is needed to ensure that nothing slips through the food safety programs.

Now, bolstered by regulatory efforts, advancements in technology make it easier than ever to update food safety programs to better safeguard against food safety risks and recalls and to explore what’s next in food.

Powering the Food Safety Programs of Tomorrow

Today, food safety programs are being bolstered by new technologies as well, including genomic sequencing techniques like NGS. NGS, which stands for next-generation sequencing, is an automated DNA sequencing technology that generates and analyzes millions of sequences per run, allowing researchers to sequence, re-sequence and compare data at a rate previously not possible.

The traditional methods of polymerase chain reaction (PCR) are quickly being replaced by faster and more accurate solutions. The benefit of NGS over PCR is that PCR is targeted, meaning you have to know what you’re looking for. It is also conducted one target at a time, meaning that each target you wish to test requires a separate run. This is costly and does not scale.

Next-generation sequencing, by contrast, is universal. A single test exposes all potential threats, both expected and unexpected. From bacteria and fungi to the precise composition of ingredients in a given sample, a single NGS test helps ensure that hazards do not slip through your supply chain. In the not-too-distant future, the cost and speed of NGS will meet and then quickly surpass legacy technologies; you can expect the technology to be adopted with increasing speed the moment it becomes price-competitive with PCR.

Applications of NGS

Even today’s NGS technologies are deployment-ready for applications including food safety and supplier verification. With the bottom line protected, food brands are also able to leverage NGS to build the food chain of tomorrow, and focus funding and resources on research and development.

Safety Testing. Advances in NGS allow retailers and manufacturers to securely identify specific pathogens down to the strain level, test environmental samples, verify authenticity and ultimately reduce the risk of outbreaks or counterfeit incidents.

Compared to legacy PCR methods, brands leveraging NGS are able to test for multiple pathogens with a single test, at a lower cost and higher accuracy. This universality is key to protecting brands against all pathogens, not just the ones for which they know to look.

Supplier Verification. NGS technologies can be used to combat economically motivated food fraud and mislabeling, and verify supplier claims. Undeclared allergens are the number one reason for recalls.

As a result of FSMA, the FDA now requires food facilities to implement preventive controls to avoid food fraud, which today occurs in up to 10% of all food types. Traditional PCR-based tests cannot distinguish between closely related species and have high false-positive rates. NGS offers high-resolution, scalable testing so that you can verify suppliers and authenticate product claims, mitigating risk at every level.

R&D. NGS-based metagenomics analysis can be used in R&D and new product development to build the next-generation of health foods and nutritional products, as well as to perform competitive benchmarking and formulation consistency monitoring.

As the consumer takes more and more control over what goes into their food, brands have the opportunity to differentiate not only on transparency, but on personalization, novel approaches and better consistency.

A Brighter Future for Food Safety

With advances in genomic techniques and analysis, we are now better than ever equipped to safeguard against food safety risks, protect brands from having to issue costly recalls, and even explore the next frontier for food. As the technology gets better, faster and cheaper, we are going to experience a tectonic shift in the way we manage our food safety programs and supply chains at large.

Sanjay Singh, Eurofins
Food Genomics

How is DNA Sequenced?

By Sanjay K. Singh, Douglas Marshall, Ph.D., Gregory Siragusa, Ph.D.
No Comments
Sanjay Singh, Eurofins

Here is a prediction. Within the next few years, at some point in your daily work life as a food safety professional, you will be called upon either to use genomic tools or to understand and relay information based on them when making important decisions about food safety and quality. Molecular biologists love to use what often seems like a foreign or secret language. Rest assured, dear reader, this is mostly just vernacular and is easily understood once you get comfortable with a bit of the vocabulary. In this, the fourth installment of our column, we progress to give you another tool for your food genomics tool kit. We have called upon a colleague and sequencing expert, Dr. Sanjay Singh, to be a guest co-author for this topic on sequencing and to guide us through the genomics language barrier.

The first report of the annotated (labeled) sequence of the human genome came in 2003, 50 years after the discovery of the structure of DNA. This genome document provided all the genetic information required to create and sustain a human being. The discovery of the structure of DNA has provided a foundation for a deeper understanding of all life forms, with DNA as the core molecule of genetic information. Of course, that includes our food and our tiny friends of the microbial world. Further molecular technological advances in the fields of agriculture, food science, forensics, epidemiology, comparative genomics, medicine, diagnostics and therapeutics are providing stunning examples of the power of genomics in our daily lives. We are only now beginning to harvest the fruits of sequencing and to use that knowledge routinely in our respective professions.

In our first column we wrote, “DNA sequencing can be used to determine the names, types, and proportions of microorganisms, the component species in a food sample, and track foodborne diseases agents.” In this month’s column, we present a basic guide to how DNA sequencing chemistry works.

Image courtesy of US Human Genome Project Knowledge base

DNA sequencing is the process of determining the precise order of the four nucleotide bases, adenine (A), cytosine (C), guanine (G) and thymine (T), in a DNA molecule. By knowing the linear sequence of A, C, G and T in a DNA molecule, the genetic information carried in that particular DNA molecule can be determined.

DNA sequencing emerged from the intersection of different fields including biology, chemistry, mathematics and physics.1,2 The critical breakthrough came in 1953, when James Watson, Francis Crick, Maurice Wilkins and Rosalind Franklin resolved the now-familiar double helix structure of DNA.3 Each helical strand is a polynucleotide, consisting of repeating monomeric units called nucleotides. A nucleotide consists of a sugar (deoxyribose), a phosphate moiety, and one of the four nitrogenous bases: the aforementioned A, C, G and T. In the double helix, the strands run opposite to each other, commonly referred to as antiparallel. Repeating units of base pairs (bp), where A always pairs with T and C always pairs with G, are arranged within the double helix slightly offset from each other, like steps in a winding staircase. On paper, the double helix is often represented by scientists as a flat, ladder-like structure, where the base pairs form the rungs of the ladder while the sugar-phosphate backbones form the antiparallel rails (see Figure 1).

DNA Double Helix
Artistic representation of DNA Double Helix. Source: Eurofins

The two ends of each polynucleotide strand are called the 5′- and 3′-ends, a nomenclature that reflects the chemical structure of the deoxyribose sugar at each terminus. The lengths of single- and double-stranded DNA are often measured in bases (b) and base pairs (bp), respectively. The two polynucleotide strands can be readily unzipped by heating and, on cooling, the initial double-helix structure re-forms, or re-anneals. The ability to rezip the initial ladder-like structure can be attributed to the phenomenon of base pairing, which merits repetition: The base A always pairs with T and the base G always with C. This rather innocuous phenomenon of base pairing is the basis for the mechanism by which DNA is copied when cells divide, and is also the theoretical basis on which most traditional and modern DNA sequencing methodologies have been developed.
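As a simple illustration of why the strands rezip so predictably, base pairing can be modeled as a lookup table. This is a toy sketch; the `complement_strand` function name and example sequence are ours:

```python
# Watson-Crick base pairing: A pairs with T, G pairs with C.
PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement_strand(strand: str) -> str:
    """Return the antiparallel complementary strand, read 5'->3'.
    Complement each base, then reverse, because the two strands run
    antiparallel to each other."""
    return "".join(PAIR[base] for base in reversed(strand))

seq = "GATTACA"
print(complement_strand(seq))  # -> TGTAATC
# Re-annealing is deterministic: complementing twice returns the original.
print(complement_strand(complement_strand(seq)) == seq)  # -> True
```

The same deterministic pairing is what lets a primer find, and a polymerase copy, its target during primer extension and PCR.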

Other biological advancements also paved the way toward the development of sequencing technologies. Prominent amongst these was the discovery of enzymes that allow scientists to manipulate DNA. For example, restriction enzymes that recognize and cleave DNA at specific short nucleotide sequences can be used to fragment a long duplex strand of DNA.4 The DNA polymerase enzyme, in the presence of deoxyribonucleotide triphosphates (dNTPs: chemically reactive forms of the nucleotide monomers), can use a single DNA strand as a template to fill in the complementary bases and extend a shorter rail strand (primer extension) of a partial DNA ladder.5 A critical part of primer extension is the ‘primer’, a short single-stranded DNA piece (15 to 30 bases long) that is complementary to a segment of the target DNA. These primers are made using automated high-throughput synthesizer machines and today can be manufactured and delivered overnight. When the primer and the target DNA are combined through a process called annealing (heat and then cool), they form a structure with a ladder-like head and a long single-stranded tail. In 1983, Kary Mullis developed an enzyme-based process called the polymerase chain reaction (PCR). Using this protocol, one can take a single copy of DNA and amplify the same sequence an enormous number of times. One can think of PCR as a molecular photocopier in which a single piece of DNA is amplified into approximately 30 billion copies!

The other critical event that changed the course of DNA sequencing efforts was the publication of the ‘dideoxy chain termination’ method by Dr. Frederick Sanger in December 1977.6 This marked the beginning of the first generation of DNA sequencing techniques. Most next-generation sequencing methods are refinements of the chain termination, or “Sanger method” of sequencing.

Frederick Sanger chemically modified each base so that when it was incorporated into a growing DNA chain, the chain was forcibly terminated. By setting up a primer extension reaction in which a small quantity of one chemically modified 'inactive' base is mixed with the four active bases, Sanger obtained a series of DNA strands that, when separated by size, indicated the positions of that particular base in the DNA sequence. By analyzing the results from four such reactions run in parallel, each containing a different 'inactive' base, Sanger could piece together the complete sequence of the DNA. Subsequent modifications to the method allowed the sequence to be determined in a single reaction using dye-labeled termination bases. Since fewer than 1,000 bases can be determined from a single such reaction, the sequence of longer DNA molecules has to be pieced together from many such reads.
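A toy simulation makes the logic of the four parallel reactions concrete. The template below is an arbitrary example, and real reactions are stochastic rather than producing exactly one terminated fragment per position, but the principle of reading the sequence off a size separation is the same:

```python
# Toy dideoxy chain termination: in each of four reactions, a terminated
# fragment ends at every position where that reaction's 'inactive' base
# would be incorporated. Sorting all fragments by length and noting which
# reaction each came from recovers the sequence.
def sanger_read(synthesized_sequence: str) -> str:
    fragments = []  # (fragment_length, terminating_base)
    for ddbase in "ACGT":  # four parallel reactions, one inactive base each
        for pos, base in enumerate(synthesized_sequence, start=1):
            if base == ddbase:
                fragments.append((pos, ddbase))
    # size separation (gel or capillary electrophoresis) orders fragments
    return "".join(base for _, base in sorted(fragments))

sanger_read("GATTACA")  # recovers "GATTACA"
```

Each position appears in exactly one reaction's fragment ladder, so merging the four ladders by fragment length spells out the sequence base by base.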

Using technologies available in the mid-1990s, as many as 1 million bases of sequence could be determined per day. At that rate, determining the sequence of the 3 billion bp human genome required years of sequencing work. By analogy, this is equivalent to reading the Sunday issue of The New York Times, about 300,000 words, at a pace of 100 words per day. The cost of sequencing the human genome was a whopping $70 million. The human genome project clearly brought forth a need for technologies that could deliver fast, inexpensive and accurate genome sequences. In response, the field initially exploded with modifications to the Sanger method, driven by advances in enzymology, fluorescent detection dyes and capillary-array electrophoresis. Using the Sanger method, one can read up to ~1,000 bp in a single reaction, and either 96 or 384 such reactions (in a 96- or 384-well plate) can be performed in parallel on automated DNA sequencers. More recently, a new wave of sequencing technologies, termed NGS or next-generation sequencing, has been commercialized. NGS is fast, automated, massively parallel and highly reproducible. NGS platforms can read more than 4 billion DNA strands and generate about a terabyte of sequence data in about six days! The whole 3 billion base pairs of the human genome can now be sequenced and annotated in a month or less.
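The arithmetic behind those figures can be checked directly; the numbers below are the article's approximations, not instrument specifications:

```python
# Back-of-the-envelope check of the sequencing rates quoted above.
human_genome_bp = 3_000_000_000    # approximate size of the human genome
sanger_daily_bp = 1_000_000        # mid-1990s throughput cited above

days = human_genome_bp / sanger_daily_bp    # 3,000 days, roughly 8 years

# The newspaper analogy works out to the same duration:
words_in_sunday_times = 300_000
days_reading = words_in_sunday_times / 100  # also 3,000 days at 100 words/day
```

Both computations land on about 3,000 days, which is what makes the newspaper comparison apt.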


Gregory Siragusa, Eurofins
Food Genomics

Microbiomes Move Standard Plate Count One Step Forward

By Gregory Siragusa, Douglas Marshall, Ph.D.
No Comments

Last month we introduced several food genomics terms, including the microbiome. Recall that a microbiome is the community of microorganisms that inhabits a particular environment or sample, and that there are two broad types of microbiome analysis: targeted (e.g., bacterial or fungal) and metagenomic (in which all DNA in a sample is sequenced, not just specific targets such as bacteria or fungi). This month we would like to introduce the reader to uses of microbiomes and how they augment standard plate counts and move us into a new era in food microbiology. Before providing examples, it might be useful to review a diagram explaining the general flow of the process of determining a microbiome (see Figure 1).

Figure 1. General process for performing a targeted microbiome (bacterial or fungal)

By analogy, if one thinks of cultural microbiology and plate counts as a process of counting colonies of microbes that come from a food or environmental sample, microbiome analysis can be thought of as identifying and counting signature genes, such as the bacteria-specific 16S gene, from the microbes in a food or environmental sample. Plate counts have been and remain a food microbiologist's most powerful indicator tool; however, we know there are limitations to their use. One limitation is that not all bacterial or fungal cells are capable of outgrowth and colony formation on specific media under a given set of incubation conditions (temperature, time, media pH, storage atmosphere, etc.). Individual plate count methods cannot cover the nearly infinite number of variations in growth atmospheres and nutrients. Because of these limitations, microbiologists understand that we have cultured only a small fraction of the different types of bacteria on the planet, a realization that led to the term "The Great Plate Count Anomaly" (Staley & Konopka, 1985). Think of a holiday party where guests were handed nametags on which was printed "Hello, I grow on Standard Methods Agar" or "Hello, I grow at 15°C", etc. We can group the partygoers by their ability to grow on certain media, and we can count them, but they still do not have names. As effective as our selective and differential media have become, bacterial colonies still do not come with their own "Hello, My Name Is XYZ" name tags. Therefore, in the lab, once a plate is counted it is generally tossed into the autoclave bag, along with its unnamed colonies and all they represent. Microbiomes can provide a nametag of sorts, as well as the proportion of partygoers who share a certain name. For instance: "Hello, My Name Is Pseudomonas" or "Hello, My Name Is Lactobacillus". The host can then say, "Now we are going to count you; would all Pseudomonas please gather in this corner?" or "All Lactobacillus please meet at the punch bowl".

A somewhat oversimplified analogy, but it makes the point that microbiome technology gives names and proportions. Microbiomes, too, have limitations. First, with current technologies, a specific group must be present above a relatively large threshold, approximately 10³ organisms, to appear in the microbiome pie chart. In theory, a colony on a plate of agar medium can be derived from a single cell or colony-forming unit (CFU). Moreover, not all amplified genes in a microbiome are necessarily from viable cells (a topic that will be covered later in this series of articles), whereas forming a colony on an agar surface requires cell viability. Finally, the specificity of the name assigned to a group in a microbiome depends on the size of the sequenced amplicon (an amplicon is a segment of DNA, in this case 16S gene DNA, resulting from amplification by PCR before sequencing) and on how well our microbial databases cover different subtypes within a species. Targeted microbiomes can reliably name the genus of an organism; resolution to the species and subspecies level, however, is not guaranteed. (Later in this series we will discuss metagenomes and how they have the potential to identify organisms to the species or even subspecies level.) Readers can find very informative reviews on microbiome specificity in the following cited references: Bokulich, Lewis, Boundy-Mills, & Mills, 2016; de Boer et al., 2015; Ercolini, 2013; Kergourlay, Taminiau, Daube, & Champomier Vergès, 2015.
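In code, the microbiome "pie chart" is simply relative abundance computed over classified reads. The genus-level read counts below are hypothetical, chosen only to illustrate the calculation:

```python
# Hypothetical 16S read counts per genus from one sample; each taxon's
# slice of the pie chart is its share of the total classified reads.
from collections import Counter

reads = Counter({"Pseudomonas": 6200, "Lactobacillus": 2500, "Brochothrix": 1300})

def relative_abundance(counts: Counter) -> dict:
    total = sum(counts.values())
    return {taxon: n / total for taxon, n in counts.items()}

relative_abundance(reads)  # e.g., Pseudomonas accounts for 62% of reads
```

Note that these proportions are shares of sequenced reads, not absolute cell counts, which is one reason plate counts and microbiomes complement rather than replace each other.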

When the power of cultural microbiology as a quantitative, functional indicator of microbial quality is combined with microbiomic analysis, limitations and all on both sides, a door opens onto the vast and varied biosphere of our food's microbiology at a depth never before observed. This all sounds great, but how will we benefit from and use this information? We have constructed Table 1 with examples and links of microbiome applications to problems that would have required years to study by cultural microbiology techniques alone. Please note this is by no means an exhaustive list, but it serves to illustrate the very broad and deep potential of microbiomics in food microbiology. We encourage the reader to email the editors or authors with questions regarding any reference. Searching PubMed with the terms "Food AND microbiome" will provide abstracts for a large variety of applications of this technology.

Foodstuff Reference
Ale (Bokulich, Bamforth, & Mills, 2012)
Beef Burgers (Ferrocino et al., 2015)
Beefsteak (De Filippis, La Storia, Villani, & Ercolini, 2013)
Brewhouse and Ingredients (Bokulich et al., 2012)
Cheese (Wolfe, Button, Santarelli, & Dutton, 2014)
Cheese and Listeria growth (Callon, Retureau, Didienne, & Montel, 2014)
Cherries, Hydrostatic Pressure (del Árbol et al., n.d.)
Cocoa (Illeghems, De Vuyst, Papalexandratou, & Weckx, 2012)
Dairy Starters and Spoilage Bacteria (Stellato, De Filippis, La Storia, & Ercolini, 2015)
Drinking Water Biofilms (Chao, Mao, Wang, & Zhang, 2015)
Fermented Foods (Tamang, Watanabe, & Holzapfel, 2016)
Foodservice Surfaces (Stellato, La Storia, Cirillo, & Ercolini, 2015)
Fruit and Vegetables (Leff & Fierer, 2013)
Insect Protein (Garofalo et al., 2017)
Kitchen surfaces (Flores et al., 2013)
Lamb (Wang et al., 2016)
Lobster (Tirloni, Stella, Gennari, Colombo, & Bernardi, 2016)
Meat and storage atmosphere (Säde, Penttinen, Björkroth, & Hultman, 2017)
Meat spoilage and processing plant (Pothakos, Stellato, Ercolini, & Devlieghere, 2015)
Meat Spoilage Volatiles (Casaburi, Piombino, Nychas, Villani, & Ercolini, 2015)
Meat Stored in Different Atmospheres (Ercolini et al., 2011)
Milk (Quigley et al., 2011)
Milk and Cow Diet (Giello et al., n.d.)
Milk and Mastitis (Bhatt et al., 2012)
Milk and Teat Preparation (Doyle, Gleeson, O’Toole, & Cotter, 2016)
Natural starter cultures (Parente et al., 2016)
Olives (Abriouel, Benomar, Lucas, & Gálvez, 2011)
Pork Sausage (Benson et al., 2014)
Spores in complex foods (de Boer et al., 2015)
Tomato Plants (Ottesen et al., 2013)
Winemaking (Marzano et al., 2016)
Table 1. Examples of microbiome analysis of different foods and surfaces.


Next-Generation Sequencing Targets GMOs

By Maria Fontanazza
1 Comment

As the movement among consumers for more information about the products they’re purchasing and consuming continues to grow, the food industry will experience persistent pressure from both advocacy groups and the government on disclosure of product safety information and ingredients. Top of mind as of late has been the debate over GMOs. “Given all of the attention on GMOs on the legislative side, there is huge demand from consumers to have visibility and transparency into whether products have been genetically modified or not,” says Mahni Ghorashi, co-founder of Clear Labs.

Mahni Ghorashi, co-founder of Clear Labs

Today Clear Labs announced the availability of its comprehensive next-generation sequencing (NGS)-based GMO test. The release comes at an opportune time, as the GMO labeling bill, which was passed by the U.S. House of Representatives last week, heads to the desk of President Obama.

Clear Labs touts the technology as the first scalable, accurate and affordable GMO test. NGS makes it possible to screen for multiple genes simultaneously, which could save companies time and money. “The advantage and novelty of this new test or assay is the ability to screen for all possible GMO genes in a single universal test, which is a huge change from the way GMO testing is conducted today,” says Ghorashi.

The PCR test method is currently the industry standard for GMO screening, according to the Non-GMO Project. “PCR tests narrowly target an individual gene, and they’re extremely costly—between $150–$275 per gene, per sample,” says Ghorashi. “Next-generation sequencing is leaps and bounds above PCR testing.” Although he won’t specify the cost of the Clear Labs assay (the company uses a tiered pricing structure based on sample volume), Ghorashi says it’s a fraction of the cost of traditional PCR tests.

The new assay screens for 85% of approved GMOs worldwide and targets four major genes used in manufacturing GMOs (detection based on methods of trait introduction and selection, and detection based on common plant traits), allowing companies to determine the presence and amount of GMOs within products or ingredient samples. “We see this test as a definitive scientific validation,” says Ghorashi. The company’s tests integrate software analytics to enable customers to verify GMO-free claims, screen suppliers, and rank suppliers based on risk.

Screenshot of the Clear Labs GMO test, which is based on next-generation sequencing technology.

Clear Labs isn’t targeting food manufacturers of a specific size or sector within the food industry but anticipates that a growing number of leading brands will be investing in GMO testing technology. “We expect to see adoption across the board in terms of company size, related more to what their stance is on food transparency and making that information readily available to their end consumers,” says Ghorashi.

David Chambliss, IBM Research
In the Food Lab

Scientific Breakthrough May Change Food Safety Forever

By David Chambliss
No Comments

How safe is a raw diet? Could sterilizing our food actually make us more prone to sickness? Are vegans healthier than carnivores? In the last few decades, global food poisoning scares from beef to peanut butter have kept food scientists and researchers around the world asking these questions and searching for improved methods of handling and testing what we eat.

It’s been more than 150 years since Louis Pasteur introduced the idea of germ theory—that bacteria cause sickness—fundamentally changing the way we think about what makes our food safe to eat. While we’ve advanced in so many other industrial practices, we’re still using pasteurization as the standard for the global food industry today.

Although pasteurization effectively controls most organisms and keeps the food supply largely safe, we continue to have foodborne outbreaks despite additional testing and more sophisticated techniques. The potential health promise of genomics, including gut microbiome genetics and bacterial ecosystems, could be the key to the next frontier in food safety.

The scientific community is once again at the cusp of a new era with the advent of metagenomics and its application to food safety.

What is metagenomics? Metagenomics is the study of a bacterial community through its genetics, by examining the entire DNA content at once. Whole genome sequencing of a single bacterium tells us about the DNA of a specific organism, whereas metagenomic testing tells us about the interaction of all the DNA of all the organisms within a sample or an environment. Think of the vast quantity of genetic material in the soil of a rice paddy, a lettuce leaf, your hand, a chicken ready for cooking, or milk directly from a cow. All of them harbor thousands of bacteria that live together in a complex community called the microbiome, which may contain bacteria that are sometimes harmful to humans, and possibly also other bacteria that help keep the potentially harmful ones in check.

Metagenomics uses laboratory methods to break up cells and extract many millions of DNA molecular fragments, and sequencing instruments to measure the sequences of A’s, C’s, G’s and T’s that represent the genetic information in each of those fragments. Scientists then use computer programs to take the information from millions or billions of fragments and determine which bacteria they came from. The process is a little like mixing up many jigsaw puzzles, grabbing some pieces from the mix, and figuring out what was in the original pictures. The “pictures” are the genomes of bacteria, which in some cases carry enough unique information to associate a given bacterium with a previously seen colony of the same species.
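A minimal sketch of that jigsaw-matching step, assuming made-up stand-in "genomes" and a simple shared-k-mer score rather than a production read classifier:

```python
# Assign each sequenced fragment to the reference genome with which it
# shares the most k-mers (short subsequences of length k). The reference
# sequences below are illustrative stand-ins, not real genomes.
def kmers(seq: str, k: int = 4) -> set:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

references = {
    "genome_A": "ATGGCGTACCGATTAG",
    "genome_B": "TTCACGGATCCGTAAC",
}

def classify(fragment: str, k: int = 4) -> str:
    frag_kmers = kmers(fragment, k)
    # pick the reference with the largest k-mer overlap
    return max(references, key=lambda name: len(frag_kmers & kmers(references[name], k)))

classify("GCGTACCGAT")  # this fragment was drawn from genome_A
```

Real metagenomic classifiers work on the same principle but match against databases of thousands of genomes and billions of fragments, which is why the computational side of the process is as important as the sequencing itself.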

Genomics of single bacterial cultures, each from a single species, is well established as a way to connect samples of contaminated foods with reported cases of foodborne illnesses. With metagenomics, which essentially looks for all known species simultaneously, one hopes to do a better job of early detection and prevention. For example, if a machine malfunction causes pasteurization or cleaning to be incomplete, the metagenomics measurement will likely show compositional shifts in which bacterial phyla are abundant. This can make it possible to take remedial action even before there are signs of pathogens or spoilage that would have led to a costly recall.

Up until now, keeping food safe has meant limiting the amount of harmful bacteria in the community, using standard methods such as pasteurization, irradiation, sterilization, salting and cooking. To determine whether food is actually safe to eat, we test for the presence of a handful of specific dangerous organisms, including Listeria, E. coli and Salmonella, to name a few. But what about all the “good” bacteria that are killed along with the “bad” bacteria in the process of making our food safe?

Nutritionists, doctors and food scientists understand that the human gut is well equipped to thrive unless threatened by particularly dangerous contaminants. The ability to determine the entire genetic makeup within a food could mean being able to know with certainty whether it contains any unwanted or unknown microbial hazards. Metagenomic testing of the food supply would usher in an entirely new approach to food safety—one in which we could detect the presence of all microbes in food, including previously unknown dangers. It could even mean less food processing that leaves more of the healthful bacteria intact.

More than 150 years ago, Pasteur pointed us in the right direction. Now the world’s brightest scientific minds are primed to take the food industry the next leap toward a safer food supply.

Steven Guterman, InstantLabs
In the Food Lab

Save Seafood with Digital Tracking

By Steven Guterman, Sarah McMullin, Steve Phelan
No Comments

The combination of improved digital tracking along the food supply chain, as well as fast, accurate DNA testing will provide modern, state-of-the-art tools essential to guarantee accurate labeling for the ever-increasing quantities of foods and ingredients shipped globally.

The sheer scale of the international food supply chain creates opportunities for unscrupulous parties to substitute cheaper products under false labels. Fraud is clearly part of the problem: Some suppliers and distributors engage in economically motivated substitution.

It’s equally true, however, that some seafood misidentification is inadvertent. In fact, some species identification challenges are inevitable, particularly at the end of the chain after processing. We believe most providers want to act in an ethical manner.

Virtually all seafood fraud involves the falsification or manipulation of documents created to guarantee that the label on the outside of the box matches the seafood on the inside. Unfortunately, the documents are too often vague, misleading or deliberately fraudulent.

Oceana, an international non-profit focused solely on protecting oceans and ocean resources, has published extensively on seafood fraud and continues to educate the public and government through science-based campaigns.

Seafood fraud is not just an economic issue. If the product source is unknown, it is possible to introduce harmful contamination into the food supply. By deploying two actions simultaneously, we can help address this problem and reduce mistakes and mishandling:

  • Improved digital tracking technologies deployed along the supply chain
  • Faster, DNA-based in-house testing to generate results in hours

Strategic collaborations can help industry respond to broad challenges such as seafood fraud. We partner with the University of Guelph to develop DNA-based tests for quick and accurate species identification. The accuracy and portability produced by this partnership allow companies to deploy tests conveniently at many points in the supply chain and get accurate species identification results in hours.

Our new collaboration with SAP, the largest global enterprise digital partner, will help ensure that test results can be integrated with a company’s supply chain data for instant visibility and action throughout the enterprise. SAP provides enterprise-level software to customers who distribute 78% of the world’s food, and accordingly its supply chain validation features have earned global acceptance.

The food fraud and safety digital tracking innovations being developed by SAP will be critical in attacking fraud. Linking paper documents with definitive test results at every point in the supply chain is no longer a realistic solution; paper trails in use today do not go far enough, and product volume has rendered them unworkable. Frustrated retailers voice concerns that their customers believe they are doing more testing and validation than they can actually undertake.

We must generate more reliable data and make it available everywhere in seconds in order to protect and strengthen the global seafood supply chain.

Catfish will become the first seafood species to be covered by United States regulations as a result of recent Congressional legislation. This change will immediately test the supply chain’s capacity for accurate identification, and catfish are but one species among thousands.

Increasingly, researchers and academics in the food industry recognize fast and reliable in-house and on-site testing as the most effective method to resolve the challenges of seafood authentication.

DNA-based analyses have proven repeatedly to be the most effective process to ensure accurate species identification across all food products. Unfortunately, verifying a species using DNA sequencing techniques typically takes one to two weeks to go from sample to result. With many products, and especially with seafood, speed on the production line is essential. In many cases, waiting two weeks for results is just not an acceptable solution.

Furthermore, “dipstick” or lateral-flow tests may work on unprocessed food at the species level; DNA testing, however, provides the only accurate test method to differentiate species and sub-species in both raw and processed foods.

Polymerase chain reaction (PCR), which analyzes the sample DNA, can provide accurate results in two to three hours, which in turn enhances the confidence of producers, wholesalers and retailers in the products they sell and minimizes their risk of recalls and brand damage.

New technology eliminates multi-day delays for test results that slow down the process unnecessarily. Traditional testing options require sending samples to commercial laboratories that usually require weeks to return results. These delays can be expensive and cumbersome. Worse, they may prevent fast, accurate testing to monitor problems before they reach a retail environment, where brand and reputational risk are higher.

Rapid DNA-based testing conducted in-house, supported by sophisticated digital tracking technologies, will improve species identification within the seafood supply chain. This technological combination will strengthen our global food chain and allow us to do business with safety and confidence in the accuracy and reliability of seafood shipments.