
NGS in Food Safety: Seeing What Was Never Before Possible

By Sasan Amini

For the past year, Swedish food provider Dafgård has been using a single test to screen each batch of its food for allergens, missing ingredients, and even the unexpected – an unintended ingredient or pathogen. The company extracts DNA from food samples and sends it to a lab for end-to-end sequencing, processing, and analysis. Whether referring to a meatball at a European Ikea or a pre-made pizza at a local grocery store, Dafgård knows exactly what is in its food and can pinpoint potential trouble spots in its supply chains, immediately take steps to remedy issues, and predict future areas of concern.

The power behind the testing is next-generation sequencing (NGS). NGS platforms, like the one my company Clear Labs has developed, consist of the most modern parallel sequencers available in combination with advanced databases and technologies for rapid DNA analysis. These platforms have reduced the cost of DNA sequencing by orders of magnitude, putting the power to sequence genetic material in the hands of scientists and investigators across a range of research disciplines and industries. They have overtaken traditional, first-generation Sanger sequencing in clinical settings over the past several years and are now poised to supplement and likely replace PCR in food safety testing.

For Dafgård, one of the largest food providers in Europe, the switch to NGS has given it visibility that was previously impossible with PCR and other technologies. Although Dafgård still uses PCR in select cases, it has run thousands of NGS-based tests over the past year. One of the biggest improvements has been in understanding the supply chain for the spices in its prepared foods. Supply chains for spices can be long and can result in extra or missing ingredients, some of which can affect consumer health. With the NGS platform, Dafgård can trace ingredients back to the original supplier, getting an unparalleled look into its raw ingredients.

Dafgård hopes to soon switch to an entirely NGS-based platform, which will put the company at the forefront of food safety. Embracing this new technology within the broader food industry has been a decade-long process, one that will accelerate in the coming years, with an increased emphasis on food transparency both among consumers and regulators globally.

Transitioning technology

A decade ago, very few people in food safety were talking about NGS technologies. A 2008 paper in Analytical and Bioanalytical Chemistry1 gave an outlook for food safety technology that included nanotechnology, while a 2009 story in Food Safety Magazine2 discussed spectrometric or laser-based diagnostic technologies. Around the same time, Nature Methods named NGS its “Method of the Year” for 2007. A decade later, NGS is taking pathogen characterization and food authentication to the next level.

Over the last 30 years, multiple technology transitions have occurred to improve food safety. In the United States, for example, the Hazard Analysis and Critical Control Points (HACCP) system came online in the mid-1990s to reduce illness-causing microbial pathogens on raw products. The move came just a few years after a massive E. coli outbreak in the U.S. Pacific Northwest caused 400 illnesses and four deaths, and it was clear there was a need for change.

Before HACCP, food inspection was largely on the basis of sight, touch, and smell. It was time to take a more science-based approach to meat and poultry safety. This led to the use of PCR, among other technologies, to better measure and address pathogens in the food industry.

HACCP set the stage for modern-era food testing, and since then, efforts have only intensified to combat food-borne pathogens. In 2011, the Food Safety Modernization Act (FSMA) took effect, shifting the focus from responding to pathogens to preventing them. Data from 20153 showed a 30% drop in foodborne-related bacterial and parasitic infections from 2012 to 2014 compared to the same time period in 1996 to 1998.

But despite these vast improvements, work still remains: According to the CDC, foodborne pathogens in the United States alone cause 48 million illnesses and 3,000 fatalities every year. And every year, the food safety industry runs hundreds of millions of tests. These tests can mean the difference between potentially crippling business operations and a thriving business that customers trust. Food recalls cost an average of $10 million per incident and jeopardize public health. The best way to stay ahead of the regulatory curve and to protect consumers is to take advantage of the new technological tools we now have at our disposal.

Reducing Errors

About 60% of food safety tests currently use rapid methods, while 40% use traditional culturing. Although highly accurate, culturing can take up to five days to deliver results, while PCR and antigen-based tests can be quicker (one to two days) but have much lower accuracy. So, what about NGS?

NGS platforms have a turnaround of only one day and can reach a higher level of accuracy and specificity than other sequencing platforms. And unlike some PCR techniques that can only detect up to five targets in one sample at a time, the targets for NGS platforms are nearly unlimited, with up to 25 million reads per sample and 200 or more samples processed at the same time. This results in a major difference in the amount of information yielded.

In PCR, very small segments of DNA are amplified and compared with known pathogen sequences. With NGS tools, all of the DNA in a sample is cut into small fragments and sequenced, generating millions of reads and many redundant data points for comparing the genome against potential pathogens. This allows much deeper resolution for determining the exact strain of a pathogen.

Traditional techniques are also rife with false negatives and false positives. In 2015, a study from the American Proficiency Institute4 of about 18,000 Salmonella testing results from 1999 to 2013 found false negative rates between 2% and 10% and false positive rates between 2% and 6%. Some food service laboratories report false positive rates of 5% to 50%.

False positives can create a resource-intensive burden on food companies. Reducing false negatives is important for public health, as well as for isolating and decontaminating the offending species within a facility. Research has shown that with robust data analytics and sample preparation, an NGS platform can bring false negative and false positive rates down to close to zero for a pathogen test such as Salmonella, Listeria, or E. coli.

Expecting the Unexpected

NGS platforms using targeted-amplicon sequencing, also called DNA “barcoding,” represent the next wave of genomic analysis techniques. These barcoding techniques enable companies to match samples against a particular pathogen, allergen, or ingredient. When deeper identification and characterization of a sample is needed, non-targeted whole genome sequencing (WGS) is the best option.

Using NGS for WGS is much more efficient than PCR, for example, at identifying new strains that enter a facility. Many food manufacturing plants have databases, created through WGS, of resident pathogens and standard decontamination steps to handle those resident pathogens. But what happens if something unknown enters the facility?

By looking at all the genomic information in a given sample and comparing it to the resident pathogen database, NGS can rapidly identify strains the facility might not have even known to look for. Indeed, the beauty of these technologies is that you come to expect to find the unexpected.

That may sound overwhelming – like opening Pandora’s box – but I see it as the opposite: NGS offers an unprecedented opportunity to protect against likely threats in food, create the highest quality private databases, and customize internal reporting based on top-of-the-line science and business practices. Knowledge is power, and NGS technologies put that power directly in food companies’ hands. Brands that adopt NGS platforms can decide what to test for, and act on those decisions, more quickly and inexpensively – all the while providing their customers with the safest food possible.

Perhaps the best analogy for this advancement comes from Magnus Dafgård, owner and executive vice president at Gunnar Dafgård AB: “If you have poor eyesight and need glasses, you could be sitting at home surrounded by dirt and not even know it. Then when you get glasses, you will instantly see the dirt. So, do you throw away the glasses or get rid of the dirt?” NGS platforms provide the clarity to see and address problems directly, giving companies like Dafgård confidence that they are using the most modern, sophisticated food safety technologies available.

As NGS platforms continue to mature in the coming months and years, I look forward to participating in the next jump in food safety – ensuring a safe global food system.

Common Acronyms in Food Genomics and Safety

DNA Barcoding: These short, standardized DNA sequences can identify individual organisms, including those previously undescribed. Traditionally, these sequences can come from PCR or Sanger sequencing. With NGS, the barcoding can be developed in parallel and for all gene variants, producing a deeper level of specificity.

ELISA: Enzyme-linked immunosorbent assay. Developed in 1971, ELISA is a rapid substance detection method that can detect a specific protein, such as an allergen, in a sample by binding an antibody to a specific antigen and creating a color change. It is less effective in food testing for cooked products, in which the protein molecules may be broken down and the allergens thus no longer detectable.

FSMA: Food Safety Modernization Act. Passed in 2011 in the United States, FSMA requires comprehensive, science-based preventive controls across the food supply. Each section of the FSMA consists of specific procedures to prevent consumers from getting sick due to foodborne illness, such as a section to verify safety standards from foreign supply chains.

HACCP: Hazard analysis and critical control points. A food safety management system, HACCP is a preventative approach to quantifying and reducing risk in the food system. It was developed in the 1950s by the Pillsbury Company, the Natick Research Laboratories, and NASA, but its use did not become widespread until 1996, when the USDA issued its Pathogen Reduction/HACCP rule covering all raw meat and poultry products.

NGS: Next-generation sequencing. NGS is the most modern, parallel, high-throughput DNA sequencing available. It can sequence 200 to 300 samples at a time and generates up to 25 million reads per run. This level of information can identify pathogens at the strain level and can be used to perform WGS for samples with unknown pathogens or ingredients.

PCR: Polymerase chain reaction. First described in 1985, PCR is a technique to amplify a segment of DNA and generate copies of a DNA sequence. The DNA sequences generated from PCR must be compared to specific, known pathogens. While it can identify pathogens at the species level, PCR cannot provide the strain of a pathogen due to the limited amount of sequencing information generated.

WGS: Whole genome sequencing. WGS uses NGS platforms to look at the entire DNA of an organism. It is non-targeted, which means it is not necessary to know in advance what is being detected. In WGS, the entire genome is cut into small fragments, with adaptors attached to the fragments so that each piece can be sequenced in both directions. The generated sequences are then assembled into long contiguous pieces of the whole genome. WGS typically produces sequence data totaling about 30 times the size of the genome (30× coverage), providing redundancy that allows for a deeper analysis.
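The “30 times the size of the genome” figure corresponds to sequencing depth, or coverage. As a rough, hypothetical illustration (not a description of any particular platform), average coverage can be estimated as the total bases sequenced divided by the genome size:

```python
# Illustrative sketch: estimating average WGS sequencing depth ("coverage").
# All numbers are hypothetical placeholders, not figures from the article.

def estimated_coverage(num_reads: int, read_length_bp: int, genome_size_bp: int) -> float:
    """Average depth = total bases sequenced / genome size."""
    return (num_reads * read_length_bp) / genome_size_bp

# Example: 25 million reads of 150 bp against a ~5 Mb bacterial genome
depth = estimated_coverage(num_reads=25_000_000, read_length_bp=150, genome_size_bp=5_000_000)
print(f"Approximate average coverage: {depth:.0f}x")  # 750x in this hypothetical run
```

The higher the coverage, the more redundant observations of each base are available, which is what supports the strain-level comparisons described above.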

Citations

  1. Nugen, S. R., & Baeumner, A. J. (2008). Trends and opportunities in food pathogen detection. Analytical and Bioanalytical Chemistry, 391(2), 451-454. doi:10.1007/s00216-008-1886-2
  2. Philpott, C. (2009, April 01). A Summary Profile of Pathogen Detection Technologies. Retrieved September 08, 2017, from https://www.foodsafetymagazine.com/magazine-archive1/aprilmay-2009/a-summary-profile-of-pathogen-detection-technologies/?EMID
  3. Ray, L., Barrett, K., Spinelli, A., Huang, J., & Geissler, A. (2009). Foodborne Disease Active Surveillance Network, FoodNet 2015 Surveillance Report (pp. 1-26, Rep.). CDC. Retrieved September 8, 2017, from https://www.cdc.gov/foodnet/pdfs/FoodNet-Annual-Report-2015-508c.pdf.
  4.  Stombler, R. (2014). Salmonella Detection Rates Continue to Fail (Rep.). American Proficiency Institute.

Why Include Food Fraud Records in Your Hazard Analysis?

By Karen Everstine, Ph.D.

Food fraud is a recognized threat to the quality of food ingredients and finished food products. There are also instances where food fraud presents a safety risk to consumers, such as when perpetrators add hazardous substances to foods (e.g., melamine in milk, industrial dyes in spices, known allergens, etc.).

FSMA’s Preventive Controls Rules require food manufacturers to identify and evaluate all “known or reasonably foreseeable hazards” related to foods produced at their facilities to determine if any hazards require a preventive control. The rules apply both to adulterants that are unintentionally occurring and those that may be intentionally added for economically motivated or fraudulent purposes. The FDA HARPC Draft Guidance for Industry includes, in Appendix 1, tables of “Potential Hazards for Foods and Processes.” As noted during the recent GMA Science Forum, FDA investigators conducting Preventive Controls inspections are using Appendix 1 “extensively.”

The tables in Appendix 1 include 17 food categories and are presented in three series:

  • Information that you should consider for potential food-related biological hazards
  • Information that you should consider for potential food-related chemical hazards
  • Information that you should consider for potential process-related hazards

According to the FDA draft guidance, chemical hazards can include undeclared allergens, drug residues, heavy metals, industrial chemicals, mycotoxins/natural toxins, pesticides, unapproved colors and additives, and radiological hazards.

USP develops tools and resources that help ensure the quality and authenticity of food ingredients and, by extension, manufactured food products. More importantly, however, these same resources can help ensure the safety of food products by reducing the risk of fraudulent adulteration with hazardous substances.

Geographic Distribution of Incidents for Dairy Ingredients. Graphic courtesy of USP.

Data from food fraud records from sources such as USP’s Food Fraud Database (USP FFD) contain important information related to potential chemical hazards and should be incorporated into manufacturers’ hazard analyses. USP FFD currently has data directly related to the identification of six of the chemical hazards identified by FDA: Undeclared allergens, drug residues, heavy metals, industrial chemicals, pesticides, and unapproved colors and additives. The following are some examples of information found in food fraud records for these chemical hazards.

Undeclared allergens: In addition to the widely publicized incident of peanuts in cumin, peanut products can be fraudulently added to a variety of food ingredients, including ground hazelnuts, olive oils, ground almonds, and milk powder. There have also been reports of the presence of cow’s milk protein in coconut-based beverages.

Drug residues: Seafood and honey have repeatedly been fraudulently adulterated with antibiotics that are not permitted for use in foods. Recently, beef pet food adulterated with pentobarbital was recalled in the United States.

Heavy metals: Lead, often in the form of lead chromate or lead oxide added to give color to spices, is a persistent problem in the industry, particularly with turmeric.

Industrial chemicals: Industrial dyes have been associated with a variety of food products, including palm oil, chili powder, curry sauce, and soft drinks. Melamine was added to both milk and wheat gluten to fraudulently increase the apparent protein content, and industrial-grade soybean oil sold as food-grade oil caused the deaths of thousands of turkeys.

Pesticides: Fraud in organic labeling has been in the news recently. Also concerning is the detection of illegal pesticides in foods such as oregano due to fraudulent substitution with myrtle or olive leaves.

Unapproved colors/additives: Examples include undeclared sulfites in unrefined cane sugar and ginger, food dyes in wine, and tartrazine (Yellow No. 5) in tea powder.

Time Series Plot of Records for Chili Powder (blue), Skim Milk Powder (green), and Olive Oil (orange)



The Honey Trap: Analytical Technology Makes Food Fraud Easier to Catch

By Christopher Brodie

Because of its high nutritional value and distinctive flavor, natural honey is a premium product with a price tag significantly higher than that of other sweeteners. As a result, honey is often the target of adulteration using low-cost invert sugar syrups. This article looks at two analytical approaches based on isotope fingerprint analysis using isotope ratio mass spectrometry (IRMS) that can be used to detect honey adulteration and safeguard product integrity.

Honey is a complex mixture of sugars, proteins and other compounds, produced in nature by honeybees from flower nectar or honeydew. The relative amounts of its sugars depend heavily on the floral source and differ significantly between honeys produced in different regions. Climate, processing and storage conditions can also affect the amounts of these sugars.1

Fructose and glucose are the major components of honey, and account for 85–95% of the total sugars present. The remaining carbohydrates are a mixture of disaccharides, trisaccharides, and larger oligosaccharides, which give individual honeys their own characteristic taste.

These distinctive flavors, combined with honey’s renowned nutritional benefits and a growing consumer demand for natural, healthy ingredients, have contributed to a substantial increase in honey sales over the past few decades. However, this demand has also helped to raise prices, with some varieties, such as Manuka honey, reportedly selling for as much as $35 for a 250 g jar.

As with many other food products that carry a premium price tag, intentional adulteration is a significant concern for the honey industry. The fraudulent addition of cheaper sweeteners, such as sugars derived from cane, corn and beet, to stretch the product is unfortunately common within the marketplace.

Honey producers and suppliers therefore require reliable and accurate analytical techniques to profile the composition of honey to identify cases of adulteration. Using analytical data, honey adulteration and counterfeiting can be routinely identified and product integrity can be maintained.

Carbon Isotope Fingerprints of Honey

Adulteration of honey is commonly detected using isotope ratio mass spectrometry (IRMS). Honey has a fingerprint, a unique chemical signature that allows it to be identified, and IRMS can be used to visualize this fingerprint and identify the botanical origin of the honey’s constituent sugars.

Two ways that carbon can be incorporated into plants by photosynthetic CO2 fixation are the Calvin cycle (also known as the C3 cycle) and Hatch-Slack cycle (the C4 cycle). The nectar used by bees to produce honey comes from plants that produce sugars via the C3 pathway, while the sugars derived from sugar cane and corn are produced by the C4 pathway.

Carbon naturally exists as two stable isotopes that behave in the same way, but possess different atomic mass numbers. Carbon-12 is the most abundant in nature (98.9%), whereas carbon-13 is far less common (1.1%). By measuring the ratio of carbon-13 to carbon-12 (13C/12C) using IRMS, the carbon isotope fingerprint of the honey can be determined. As more carbon-13 is incorporated in sugars produced by the C4 pathway, the adulteration of honey with sugar cane and fructose corn syrups, rich in C4 sugars, can be detected.
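In practice, the 13C/12C ratio is reported in delta notation (δ13C), in parts per thousand relative to an international reference standard (VPDB). The minimal sketch below shows the conversion; the ratios used are approximate, illustrative values rather than measured data.

```python
# Minimal sketch: expressing a measured 13C/12C ratio in delta notation.
# delta13C (per mil) = (R_sample / R_standard - 1) * 1000
# The reference ratio below is the commonly cited VPDB value; the sample ratio is hypothetical.

VPDB_RATIO = 0.0111802  # approximate 13C/12C ratio of the VPDB standard

def delta13c(r_sample: float, r_standard: float = VPDB_RATIO) -> float:
    return (r_sample / r_standard - 1.0) * 1000.0

print(round(delta13c(0.0109), 1))  # about -25 per mil, in the range typical of C3-plant sugars
```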

In unadulterated honey, the carbon isotope fingerprint will be similar to that of the natural protein precipitated from the honey. However, if cane sugar or high fructose corn syrup has been added, the isotope fingerprint of honey and protein will be significantly different.

Detection of Adulteration by EA-IRMS

One approach that has traditionally been used for the detection of honey adulteration is elemental analysis interfaced with IRMS (EA-IRMS).2 This highly robust, rapid and cost-effective technique is able to reliably detect the addition of C4 sugars in honey at levels down to 7%.3 The analytical approach complies with the official method for the analysis of C4 sugars in honey, AOAC method 998.12.4
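AOAC 998.12 uses the protein isolated from the honey as an internal isotopic standard. A simplified sketch of the apparent C4 sugar calculation commonly associated with that approach is shown below; the δ13C inputs are hypothetical, and the official method should be consulted for the authoritative procedure.

```python
# Simplified sketch of the internal-standard calculation used to estimate apparent
# C4 sugar content in honey (the approach behind AOAC 998.12). Input delta13C values
# are hypothetical; -9.7 per mil is a commonly used mean value for C4 plant sugars.

def apparent_c4_sugar_percent(d13c_protein: float, d13c_honey: float,
                              d13c_c4_sugar: float = -9.7) -> float:
    return 100.0 * (d13c_protein - d13c_honey) / (d13c_protein - d13c_c4_sugar)

print(round(apparent_c4_sugar_percent(-25.5, -25.3), 1))  # ~1.3%: consistent with pure honey
print(round(apparent_c4_sugar_percent(-25.5, -22.0), 1))  # ~22.2%: suggests C4 syrup addition
```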

In EA-IRMS, bulk honey is combusted in the presence of pure oxygen to form CO2 for analysis. The CO2 produced from the combustion of the bulk honey, including all sugars and the protein fraction, is analyzed by IRMS. Figure 1 shows carbon isotope fingerprints of four unique samples, including bulk honey and the proteins extracted from those honeys, determined using an EA-IRMS system. In each case of adulteration, shown in the grey columns, the honey δ13C value becomes more positive relative to the protein value, moving towards the carbon isotope fingerprint of C4 plants. The natural variation of δ13C in honey is shown by the red lines.5

Figure 1. Carbon isotope fingerprints of bulk honey and protein fractions from those honeys. The red lines show the natural variation of δ13C in honey.2

Detection of Adulteration by LC-IRMS

While EA-IRMS can be used to identify cases of honey adulteration from the bulk sample, detecting low levels of added C4 sugars, as well as added C3 sugars (i.e., beet sugars), requires a compound-specific technique with more powerful separation capabilities. Furthermore, as fraudsters develop more sophisticated adulteration techniques and effective ways of concealing their actions, it can be necessary to utilize other IRMS techniques.

Much lower limits of adulteration detection can be obtained from liquid chromatography interfaced with IRMS (LC-IRMS). This technique permits the analysis of very small sample amounts without the need for extensive preparation or derivatization, and can also identify C3 sugar adulteration, which EA-IRMS cannot readily achieve; it therefore serves as a strong, complementary isotope fingerprint technique. Instruments are available that allow sequential automated analysis by both techniques.

In LC-IRMS, the sample is oxidized within the aqueous solvent eluting from the HPLC column. The oxidation reagent consists of two solutions: The oxidizing agent itself and phosphoric acid. Both are pumped separately and added to the mobile phase. Within this mixture, all individual organic compounds eluting from the HPLC column are oxidized quantitatively into CO2 upon passing through a heated reactor. In a downstream separation unit, the generated CO2 is then separated from the liquid phase and carried by a stream of helium gas. The individual CO2 peaks in the helium are subsequently dried in an on-line gas drying unit and admitted to the isotope ratio mass spectrometer via an open split interface.



Adulteration with Sudan Dye Has Triggered Several Spice Recalls

By Thomas Tarantelli

In the following article, the author reports finding Sudan dye in spices in New York State, making the argument for Class I recalls.

In New York State (NYS), Department of Agriculture and Markets food inspectors routinely sample domestic and imported food from retail markets for food dye determination. For decades, the NYS Food Lab has examined both domestic and imported food for undeclared allowed food dyes and unallowed food dyes utilizing a paper chromatography method. This method works well with water-soluble acid dyes, of which food dyes are a subset.

The NYS Food Lab has participated in four sets of the FAPAS proficiency tests: Artificial Colours in Soft Drinks and Artificial Colours in Sugar Confectionary (Boiled Sweets). The qualitative analysis was by paper, thin layer silica and thin layer cellulose chromatography. Satisfactory results were obtained.

The paper/thin layer chromatography method is a qualitative non-targeted method and has a limit of detection of approximately 1 to 5 ppm (parts per million) depending on the dye. If an unallowed dye is detected, the food product is violated as adulterated and results are forwarded to the FDA.

Some countries set a maximum concentration for allowed food dyes in a food product. For example, India has a 100 ppm to 200 ppm maximum for its allowed food dyes in some foods, singly or in combination.1

In early 2011, sesame seeds were found to contain Rhodamine B.

In early 2011, a food sample of pink-colored, sugar-coated sesame seeds from Pakistan was sent to the lab for color determination. The paper chromatography method could not detect any dyes. (As was found out later, the unknown pink dye was not an acid dye.) Research showed that Rhodamine B is a pink, water-soluble basic dye commonly used as a food adulterant. A standard was ordered, and a qualitative high-performance liquid chromatography-tandem mass spectrometry (HPLC/MS/MS) method was developed (Waters Acquity UPLC with a Waters Premier XE triple quadrupole) to determine Rhodamine B. Using this new method, Rhodamine B was found in the sugar-coated sesame seeds.

Rhodamine B is an industrial dye and is not allowed in food anywhere in the world. Industrial dyes are not allowed in food because they are toxic; in fact, some industrial dyes are used for suicide.2,3,4 In addition, industrial dyes are not made to “food grade” specifications with regard to dye purity, heavy metal (i.e., arsenic and lead) concentrations, subsidiary dye concentrations and concentrations of unreacted precursors. From additional research of news articles and research papers, more industrial dyes were identified as common food adulterants; more dye standards were ordered and incorporated into the HPLC/MS/MS method. The NYS Food Lab’s current HPLC/MS/MS surveillance method includes 36 compounds: Water soluble “acid dyes” and “basic dyes”, organic solvent soluble “solvent dyes”, and several pigments.

The HPLC/MS/MS method has a limit of detection in the ppb (parts per billion) range for some dyes and parts per trillion for other dyes. The FDA has an action level of 1 ppb for certain water-soluble basic dyes (such as Malachite Green) when used as a fish antibiotic. However, due to concern that unallowed dyes might be present due to contamination from packaging, the food lab subsequently set an action level of 1 ppm for unallowed dyes determined by the HPLC/MS/MS method. At levels over 1 ppm, detection of dyes in food would indicate intentional dye usage for coloring food.

The food lab has participated in three rounds of the FAPAS proficiency test, “Illegal Dyes found in Hot Pepper Sauce”. The qualitative analysis was by LC/MS/MS. Satisfactory results were obtained.

Sudan Dyes Considered to be Carcinogenic

“Sudan dyes are not allowed to be added to food. There has been worldwide concern about the contamination of chili powder, other spices, and baked foods with Sudan dyes since they may have genotoxic and carcinogenic effects (according to the International Agency for Research on Cancer)”.5

“There have been several documented cases of spices being contaminated with carcinogenic dyes such as Sudan I or lead oxide. We therefore assume that the presence of these chemicals in spice ingredients will be considered a reasonably foreseeable hazard under this rule.”6

“Sudan red dyes have been used to color paprika, chili powders, and curries, but are also known carcinogens and are banned for use in foods.” 7

Sudan dyes are a family of more than 10 synthetic industrial “solvent dyes”. Solvent dyes are typically used to color oils and waxes, including shoe polish. Sudan dyes that the food lab has found in spices include Sudan 1 (Sudan I) and Sudan 4 (Sudan IV). Sudan 1, also known as Solvent Yellow 14, is an orange dye. Sudan 4, also known as Solvent Red 24, is a blue-shade red dye.

Positive identification of Sudan 4 is often hindered by the existence of a positional isomer, Sudan Red B (Solvent Red 25). This problem was addressed by using the HPLC/MS/MS method with a transition unique to Sudan 4 (381.2 > 276.0). This information was obtained from one of the two corroborating labs. The food lab has recently identified a transition unique to Sudan Red B (381.2 > 366.1).
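As a conceptual illustration only (not the NYS Food Lab’s actual software), the logic of combining a dye-specific MRM transition with the 1 ppm action level described earlier might be sketched as follows; confirmation in practice also relies on retention times, additional transitions and corroborating laboratories.

```python
# Conceptual sketch: mapping the MRM transitions cited in the text to a tentative dye
# identification and applying the 1 ppm action level for unallowed dyes. This is an
# illustration of the reasoning, not actual laboratory software.

TRANSITIONS = {
    (381.2, 276.0): "Sudan 4 (Solvent Red 24)",
    (381.2, 366.1): "Sudan Red B (Solvent Red 25)",
}
ACTION_LEVEL_PPM = 1.0

def evaluate(precursor_mz: float, product_mz: float, concentration_ppm: float) -> str:
    dye = TRANSITIONS.get((precursor_mz, product_mz), "unidentified compound")
    if concentration_ppm > ACTION_LEVEL_PPM:
        return f"{dye}, {concentration_ppm} ppm: above action level; consistent with intentional use"
    return f"{dye}, {concentration_ppm} ppm: below action level"

print(evaluate(381.2, 276.0, 140.0))
```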

Sudan Dyes Found in Spices in Europe

In March 2001, Europe began discovering Sudan dyes in spices. A February 2017 search of Europe’s Rapid Alert System for Food and Feed (RASFF) for “unauthorised colour” and “sudan” in the “herbs and spices” food category resulted in 429 notifications.

The 429 RASFF notifications, arranged by year and by the maximum concentration of Sudan 1 and Sudan 4 reported during that year, are listed in Table I.

Table I.

In a search of the FDA’s Import Alert 45-02 (Detention Without Physical Examination and Guidance of Foods Containing Illegal and/or Undeclared Colors) the author could find no record of spices violated for Sudan dye adulteration.

In a search of the FDA’s Enforcement Reports the author could find no record of spices violated for Sudan dye adulteration.

Industrial Dyes in Food: Class II or Class I Recall?

The NYS Food Lab and the FDA routinely find imported food containing unallowed food dyes such as Ponceau 4R, Amaranth and Carmoisine. These unallowed food dyes are allowed for use in food in other parts of the world, while not allowed in the USA. Foods containing unallowed food dyes are violated as adulterated and a Class II recall will occur. Sudan dyes are not allowed as food dyes anywhere in the world. They are industrial dyes, used in coloring oils and waxes, such as shoe polish.

“Class I recall: A situation in which there is a reasonable probability that the use of or exposure to a violative product will cause serious adverse health consequences or death.

Class II recall: A situation in which use of or exposure to a violative product may cause temporary or medically reversible adverse health consequences or where the probability of serious adverse health consequences is remote.”8

With a Class II recall, there is no consumer notification. In contrast, as part of a Class I recall, a press release is issued. Consumers who have purchased the product can thus learn of the recall and discard the product or return it for a refund.


The Validation Conversation

By Joy Dell’Aringa

Our industry is in a perpetual food safety discussion. We are constantly mulling over the finer points of hazards, risk, preventive controls, training, sanitation, and regulations. Validation is also a key component of the food safety dialog. Here we will explore common themes industry professionals discuss in regard to validation in this era of food safety.

Definitions

In any good conversation, terms must be set and semantics agreed upon. It is helpful to start off with a simplistic definition of validation and verification that can be applied across industries and applications. We often return to these reductive definitions throughout conversations to recalibrate and ensure that all parties are on the same page.

  • Validation:  Are we using the correct system / method?
  • Verification: Are we using the system / method correctly?

From there, we narrow our focus. Using the FSMA backdrop, from the FDA’s “Draft Guidance for Industry: Control of Listeria monocytogenes in Ready-To-Eat Foods” we find the following definitions:

Validation: Obtaining and evaluating scientific and technical evidence that a control measure, combination of control measures, or the food safety plan as a whole, when properly implemented, is capable of effectively controlling the identified hazards.

Verification: The application of methods, procedures, tests and other evaluations, in addition to monitoring, to determine whether a control measure or combination of control measures is or has been operating as intended and to establish the validity of the food safety plan.

Validation and Verification: Semantics Matter.

Definitions for validation and verification are available from various standards organizations and regulatory bodies. What is most important, however, is that in this conversation there is a clear distinction between validation and verification—both in activities and objectives. These are not interchangeable terms. Further, validation and verification can be discussed from two general perspectives in the food safety landscape. Process validation addresses manufacturing activities and controls to prevent product hazard and contamination. Method validation addresses the analytical methods used to verify the physical, chemical or microbiological properties of a product.

Process Validation

Our industry is comprised of a variety of categorical segments. Each segment faces unique processing challenges, risks and requirements that must be addressed in the validation and verification conversation.

Some segments, such as the dairy industry, have long-standing processes in place that have a robust scientific backbone and leave little room for guesswork, experimentation or modification. “Milk processes were validated years ago and are part of the Pasteurized Milk Ordinance (PMO). The science is there,” states Janet Raddatz, vice president of quality & food safety systems at Sargento Foods, Inc. “It is well established that when you pasteurize the product for the time and temperature that has been validated, then you simply verify the pasteurizer is working to the validated specifications.”

However, process validation challenges arise when novel applications, ingredients and processes are employed. Even in an established industry, reformulations of products such as sauces and dressings require fresh validation perspective and risk assessment. “You must assess the risk anytime there is a change. Properties such as pH, salt and water are critical variables to the safety and microbial stability of a product. Novel processing techniques aimed at ‘all natural’ or ‘minimal processing’ consumer demands should also be challenged.” Raddatz suggests conducting a full assessment to identify potential areas of risk. A challenge study may also be a critical piece to validate that a certain process or formulation is appropriate.

To help the food industry understand, design and apply good validation and verification practices, the Institute for Food Safety and Health (IFSH) published “Validation and Verification: A Practical, Industry-driven Framework Developed to Support the Requirement of the Food Safety Modernization Act (FSMA) of 2011.” This insightful document provides various definitions, guidance, practical advice, and offers several Dos and Don’ts on validation and verification activities.

Do:

  • Divide validation and verification into separate tasks
  • Think of validation as your scientific evidence and proof the system controls the hazards
  • Use science-based information to support the initial validation
  • Involve management in validation development and in the operation of verification activities
  • Use lessons from “near-misses” and corrections to adjust and improve the food safety system

Don’t:

  • Confuse the activities of verification with those of routine monitoring
  • Rely on literature or studies that are unlike your process/ product to prove controls are valid
  • Conduct audit processes and then not review the results
  • Perform corrective actions without determining if a system change may be needed to fix the problem
  • Forget that reanalysis must be done every three years, or sooner if new information or problems suggest it is needed

Method Validation

Analytical methods used to verify a validated food process must also be validated for the specific product and conditions under which they will be conducted. For example, a manufacturer that has their laboratory test a product for Salmonella to verify that a kill step in the manufacturing process worked, must ensure that the method the laboratory uses is both validated for that product and has been verified as appropriate for use in that laboratory. Three general considerations should be discussed with the laboratory:

  • Is the method validated for the product (matrix)?
    • Often, the method will carry several matrix validations that were previously conducted by the diagnostic provider, an industry organization or as a reference method.
    • If the matrix to be tested is not validated the laboratory should conduct a validation study before proceeding.
  • Has the laboratory verified this method on the product (matrix)?
    • The laboratory should demonstrate that they can indeed perform the validated method appropriately.
    • Verification activities typically involve a matrix-specific spiked recovery (a generic example calculation appears after this list).
  • Are there any modifications made to the validated method?
    • All method modifications should be validated and verified. Additionally, modification should be noted on the laboratory report or Certificate of Analysis issued.
    • Method modifications may include time and temperature alterations, media changes and sample preparation factors.
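For quantitative methods, a common way to document the matrix-specific verification mentioned in the list above is a spike-recovery calculation. The sketch below is generic; the spike level and acceptance window are hypothetical, and, as discussed later in this article, acceptance criteria currently vary widely between laboratories.

```python
# Generic sketch of a matrix-specific spike-recovery check used to verify a method
# on a new product. Spike level and acceptance window are hypothetical examples only.

def percent_recovery(spiked_result: float, unspiked_result: float, spike_added: float) -> float:
    return 100.0 * (spiked_result - unspiked_result) / spike_added

recovery = percent_recovery(spiked_result=9.2, unspiked_result=0.4, spike_added=10.0)
acceptable = 80.0 <= recovery <= 120.0  # illustrative acceptance window
print(f"Recovery: {recovery:.0f}% ({'acceptable' if acceptable else 'investigate'})")
```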

AOAC International is an organization that certifies the validation of methods to a specific prescribed standard. “Diagnostic companies seek AOAC approval, which entails a rigorous validation protocol with the selected matrices,” says Ronald Johnson, Ph.D., president of AOAC International and associate director of validation for bioMérieux, describing the importance of commercial standardization. “The AOAC validation scheme ensures that the method is robust, rugged, inclusive and exclusive, stable and meets the sensitivity presented.” Standards such as these give the user confidence that the method is fit for purpose, a critical first step in method selection.

While many diagnostic companies will perform standardized validation as described above, how a laboratory validates and verifies a method is incredibly nuanced in the food industry. Currently, there is no standardized approach to study design and execution. Even ISO 17025 accredited laboratories are only required to have a validation and verification protocol; nothing dictates what that protocol should look like.

“Currently, there is a lot of variation in the industry around [method] validation,” says Patrick Bird, microbiology R&D laboratory supervisor at Q Laboratories. Bird is a method validation expert who is on the U.S. ISO TAG TC34/SC9 working group 3 for the new ISO validation and verification standards, including ISO/DIS 16140-4 guidelines, “Microbiology of the food chain – Method Validation – Part 4: Protocol for single-laboratory (in-house) method validation.”

“Variables such as number of replicates, spike levels, and even acceptance criteria vary widely from lab to lab—both in manufacturing laboratories and contract testing laboratories. We hope the ISO guidelines will standardize that,” says Bird. He goes on to discuss the importance of good laboratory stewardship in the industry. “While some look at validations as a proprietary or competitive advantage, the testing industry must realize that without standardization, poor validation and verification practices by a few can tarnish the great science done by the many, and ultimately jeopardize the safety of our food supply.” He stresses the importance of quality operations and open communication with laboratories, whether in-house or third party. “Now that validation is highlighted as a required area in FSMA Preventive Controls, more and more companies are paying attention to the methods and associated validation/verification data their labs can provide.”



Build Stronger Food Safety Programs With Next-Generation Sequencing

By Akhila Vasan, Mahni Ghorashi

According to a survey by retail consulting firm Daymon Worldwide, 50% of today’s consumers are more concerned about food safety and quality than they were five years ago. Their concerns are not unfounded. Recalls are on the rise, and consumer health is put at risk by undetected cases of food adulteration and contamination.

While consumers are concerned about the quality of the food they eat, buy and sell, the brands responsible for making and selling these products also face serious consequences if their food safety programs don’t safeguard against devastating recalls.

Food fraud, the deliberate and intentional substitution, addition, tampering or misrepresentation of food, food ingredients or food packaging, is a key cause of recalls and continues to be an issue for the food safety industry. According to PricewaterhouseCoopers, food fraud is estimated to be a $10–15 billion a year problem.

Some of the more notorious examples include wood shavings in Parmesan cheese, the 2013 horsemeat scandal in the United Kingdom, and Oceana’s landmark 2013 study, which revealed that a whopping 33% of seafood sold in the United States is mislabeled. While international organizations like Interpol have stepped up to tackle food fraud, which is exacerbated by the complexity of globalization, academics estimate that 4% of all food is adulterated in some way.

High-profile outbreaks due to undetected pathogens are also a serious risk for consumers and the food industry alike. The U.S. economy alone loses about $55 billion each year due to foodborne illnesses. The World Health Organization estimates that nearly 1 in 10 people become ill every year from eating contaminated food. In 2016 alone, several high-profile outbreaks rocked the industry, harming consumers and brands alike. From the E. coli O26 outbreak at Chipotle to Salmonella in live poultry to Hepatitis A in raw scallops to the Listeria monocytogenes outbreak at Blue Bell ice cream, the food industry has dealt with many challenges on this front.

What’s Being Done?

Both food fraud and undetected contamination can cause massive, expensive and damaging recalls for brands. Each recall can cost a brand about $10 million in direct costs, and that doesn’t include the cost of brand damage and lost sales.

Frustratingly, more recalls due to food fraud and contamination are happening at a time when regulation and policy is stronger than ever. As the global food system evolves, regulatory agencies around the world are fine-tuning or overhauling their food safety systems, taking a more preventive approach.

At the core of these changes is HACCP, the long-implemented and well-understood method of evaluating and controlling food safety hazards. In the United States, while HACCP is still used in some sectors, the move to FSMA is apparent in others. In many ways, 2017 has been dubbed the year of FSMA compliance.

There is also the Global Food Safety Initiative (GFSI), a private industry conformance standard for certification, which was established proactively by industry to improve food safety throughout the supply chain. It is important to note that all regulatory drivers, be they public or private, work together to ensure the common goal of delivering safe food for consumers. However, more is needed to ensure that nothing slips through the food safety programs.

Now, bolstered by regulatory efforts, advancements in technology make it easier than ever to update food safety programs to better safeguard against food safety risks and recalls and to explore what’s next in food.

Powering the Food Safety Programs of Tomorrow

Today, food safety programs are being bolstered by new technologies as well, including genomic sequencing techniques like NGS. NGS, which stands for next-generation sequencing, is an automated DNA sequencing technology that generates and analyzes millions of sequences per run, allowing researchers to sequence, re-sequence and compare data at a rate previously not possible.

The traditional methods of polymerase chain reaction (PCR) are quickly being replaced by faster and more accurate solutions. The benefit of NGS over PCR is that PCR is targeted, meaning you have to know what you’re looking for. It is also conducted one target at a time, meaning that each target you wish to test requires a separate run. This is costly and does not scale.

Next-generation sequencing, by contrast, is universal. A single test exposes all potential threats, both expected and unexpected. From bacteria and fungi to the precise composition of ingredients in a given sample, a single NGS test helps ensure that hazards do not slip through your supply chain. In the not-too-distant future, the cost and speed of NGS will meet and then quickly surpass legacy technologies; you can expect the technology to be adopted with increasing speed the moment it becomes price-competitive with PCR.

Applications of NGS

Even today’s NGS technologies are deployment-ready for applications including food safety and supplier verification. With the bottom line protected, food brands are also able to leverage NGS to build the food chain of tomorrow, and focus funding and resources on research and development.

Safety Testing. Advances in NGS allow retailers and manufacturers to securely identify specific pathogens down to the strain level, test environmental samples, verify authenticity and ultimately reduce the risk of outbreaks or counterfeit incidents.

Compared to legacy PCR methods, brands leveraging NGS are able to test for multiple pathogens with a single test, at a lower cost and higher accuracy. This universality is key to protecting brands against all pathogens, not just the ones for which they know to look.

Supplier Verification. NGS technologies can be used to combat economically motivated food fraud and mislabeling, and verify supplier claims. Undeclared allergens are the number one reason for recalls.

As a result of FSMA, the FDA now requires food facilities to implement preventive controls to avoid food fraud, which today occurs in up to 10% of all food types. Traditional PCR-based tests cannot distinguish between closely related species and have high false-positive rates. NGS offers high-resolution, scalable testing so that you can verify suppliers and authenticate product claims, mitigating risk at every level.

R&D. NGS-based metagenomics analysis can be used in R&D and new product development to build the next-generation of health foods and nutritional products, as well as to perform competitive benchmarking and formulation consistency monitoring.

As the consumer takes more and more control over what goes into their food, brands have the opportunity to differentiate not only on transparency, but on personalization, novel approaches and better consistency.

A Brighter Future for Food Safety

With advances in genomic techniques and analysis, we are now better than ever equipped to safeguard against food safety risks, protect brands from having to issue costly recalls, and even explore the next frontier for food. As the technology gets better, faster and cheaper, we are going to experience a tectonic shift in the way we manage our food safety programs and supply chains at large.


Whole Sample Next-Generation DNA Sequencing Method: An Alternative to DNA Barcoding

By Casey Schlenker, Jenna Brooks, Kent Oostra, Ryan McLaughlin

This article discusses a non-targeted method for whole sample next generation DNA sequencing (NGS) that does not rely on DNA barcoding. DNA barcoding requires amplification of a specific gene region, which introduces bias. Our non-targeted method removes this bias by eliminating the amplification step. The applications of this method are broad and we have begun optimizing workflows for numerous materials, both processed and unprocessed. Some of the materials we have been able to successfully identify at the species level are fish tissue, fish meal, unrefined fish oil, unrefined plant-based oils (nuts, seeds, and fruits), specific components of cooked and processed products such as cookies and powders, and processed meats. Non-targeted NGS is also a very powerful tool to comprehensively identify constituents of microbial communities in probiotics and fermented products like kombucha. Additionally, this non-targeted technique is applicable to detection and identification of microbial contamination at various levels of manufacturing including equipment surfaces, processing water and assaying intermediate processing steps. In this communication we briefly review a current issue in the botanicals industry, discuss the methods that have been used in the past to tackle that problem, and present preliminary results from a pilot study we performed to determine the utility of non-targeted NGS in high-throughput identification of botanical raw materials.

The value of the global herbal dietary supplement (botanical) market was estimated to be greater than $90 billion in 2016, with a projected compound annual growth rate of 5-6%. Currently, regulators and manufacturers in this rapidly expanding market seek to confirm the veracity of label claims, investigate fraud, identify adulterants and ensure product quality.1 These products are often dried and ground, making visual identification difficult, time consuming and sometimes impossible.2 It is critical to this market that botanical identification be high-throughput, accurate and cost effective. Historically, various chromatography techniques have been used to meet this need, but those techniques rely on identification of molecules that can vary significantly due to storage conditions, which has led to the use of DNA barcoding as an analytical technique. However, DNA barcoding is not without significant challenges.1

For quite some time, scientists have had the ability to identify biological samples by sequencing their DNA.3 Currently DNA sequencing-based identification methods rely heavily on a technique called DNA barcoding, which functions analogously to the barcodes found on products in a grocery store. DNA barcoding amplifies a distinct small gene region that serves as a unique identifier and “scans” it by DNA sequencing.4 The advantages of this amplification are high sensitivity and simplification of data analysis. However, this amplification is not completely reliable and in practice can create biases and false positives.5 There is also the possibility that the amplification may fail, causing false negatives.6 When using DNA barcoding to identify botanical raw materials, numerous labs have observed notably high levels of apparent contamination.7 While it is certainly likely that some or even many botanical raw material samples contain contamination, it is also possible that the amplification-based method of DNA barcoding is itself contributing to the levels of contamination that are being observed.

We have partnered with Practical Informatics and Pacific Northwest Genomics to develop comprehensive whole sample DNA screening methods that don’t rely on amplification. To achieve this we are utilizing a non-targeted metagenomics workflow. Non-targeted metagenomic analysis is a powerful tool for examining the entire genetic content of a sample, instead of just one particular gene region (if a gene is a word or phrase, then a genome is the entire book, and the metagenome is the library). Unlike DNA barcoding, which requires PCR amplification, non-targeted metagenomic analysis requires no prior knowledge of a sample’s source and does not introduce the biases that plague PCR initiated methods. All of the DNA extracted from a sample is analyzed without targeting any particular gene region, relying instead on complex data analysis to identify the constituents (Figure 1). This is accomplished with the use of advanced molecular biology techniques and sophisticated computational methods, combined with a highly-curated database of species-identifying DNA sequences. Our research and development team has completed several experiments demonstrating the utility of a non-targeted DNA sequencing method.

Figure 1. The traditional targeted method, or DNA barcoding, uses a PCR amplification step prior to sequencing. Non-targeted whole sample sequencing skips the amplification step, and all DNA present is sequenced and used in the analysis.

Our research endeavors to solve the issues of DNA sequence analysis that originate with the PCR step by simply eliminating amplification from our process entirely. PCR amplification as a prelude to DNA sequencing traces to traditional technologies that were lower throughput and required large amounts of material. Current generation high-throughput DNA sequencing technologies do not require large amounts of starting material, and therefore amplification can be avoided. Many DNA barcoding methods require universal primers, which, during PCR, can amplify some products but not others, leading to false negatives. A solution to that issue is to use specific primers, however this is also inherently problematic as a certain foreknowledge of the sample identity is required. What is the advantage to our non-targeted sequencing method? There is no need to direct the analysis to any particular identification before sequencing, decreasing the introduction of bias and false negatives. As an added bonus, we don’t need to know what the sample is prior to analysis—we can tell you what it is rather than you telling us.
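To make the data-analysis step concrete, the toy sketch below assigns reads to species by counting shared k-mers against a small, hypothetical reference database. It is a deliberately simplified illustration of the idea, not the authors’ actual pipeline, which relies on far larger curated databases and more sophisticated algorithms.

```python
# Toy illustration of non-targeted read classification: assign each sequenced read to a
# species by counting shared k-mers with a curated reference database. The reference
# sequences and reads below are placeholders, not real data.

from collections import Counter

def kmers(seq: str, k: int = 8) -> set:
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

REFERENCE_DB = {  # hypothetical species-identifying sequences
    "Echinacea purpurea": "ATGGCGTTACCGGATTCAGGCTTAACGT",
    "Curcuma longa":      "ATGCCGTTAGGGATTTCAGGATTAACCA",
}

def classify(read: str) -> str:
    read_kmers = kmers(read)
    scores = {species: len(read_kmers & kmers(ref)) for species, ref in REFERENCE_DB.items()}
    best_species = max(scores, key=scores.get)
    return best_species if scores[best_species] > 0 else "unclassified"

reads = ["GCGTTACCGGATTCAGG", "CCGTTAGGGATTTCAGG", "TTTTTTTTTTTTTTTT"]
print(Counter(classify(r) for r in reads))  # estimated composition of the sample
```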


Reduce Foodborne Illness Causing Microorganisms through a Structured Food Safety Plan

By James Cook

In 2011, three U.S. government agencies, the CDC, the FDA and the USDA’s Food Safety and Inspection Service (FSIS), created the Interagency Food Safety Analytics Collaboration (IFSAC). The development of IFSAC allowed these agencies to combine their federal food safety efforts. The initial focus was to identify the foods, and prioritize the pathogens, that are the most important sources of foodborne illnesses.

The priority pathogens are Salmonella, E. coli O157:H7, Listeria monocytogenes and Campylobacter. To research the most important product sources, the three agencies collaborated on the development of better data collection and developed methods for estimating the sources of foodborne illnesses. Some of this research was to evaluate whether the regulatory requirements already in effect were reducing the foodborne pathogens in a specific product matrix. The collection, sharing and use of this data is an important part of the collaboration. For example, when the FDA is in a facility for routine audit or targeted enforcement, they will generally take environmental swabs and samples of air, water and materials, as appropriate, which are then tested for the targeted pathogens. If a pathogen is found, then serotyping and pulsed-field gel electrophoresis (PFGE) fingerprinting is performed, and this is compared to the information in the database concerning outbreaks and illnesses. This data collection enables the agencies to more quickly react to pinpoint the source of foodborne illnesses and thereby reduce the number of foodborne illnesses.

The IFSAC strategic plan for 2017 to 2021 will enhance the collection of data. The industry must be prepared for more environmental and material sampling. Enhancement of data collection by both agencies can be seen through the FSIS notices and directives, and through the guidance information being produced by the FDA for FSMA. Some examples are the raw pork products exploratory sampling project and the FDA draft guidance for the control of Listeria monocytogenes in ready-to-eat foods.

Starting May 1, 2017, the next phase of the raw pork products exploratory sampling project will begin. Samples will be collected and tested for Salmonella, Shiga toxin-producing E. coli (STECs), aerobic plate count and generic E. coli. In the previous phase, the FSIS analyzed 1,200 samples for Salmonella, and the results are published in its quarterly reports. This is part of the USDA FSIS Salmonella action plan, published December 4, 2013, in an effort to establish pathogen reduction standards. In order to achieve any objective, establishing baseline data is essential. Once the baseline data are established and the objective is determined, which in this case is the Healthy People 2020 goal of reducing human illness from Salmonella by 25%, one can determine through assessment of the programs and data what interventions need to take place.

The FDA has revised its draft guidance for the control of Listeria monocytogenes in ready-to-eat food, per the requirements in 21 CFR 117, Current Good Manufacturing Practice, Hazard Analysis and Risk-Based Preventive Controls for Human Food, which is one of the seven core FSMA regulations. Facilities producing ready-to-eat foods that are exposed to the environment prior to packaging, and that have no Listeria monocytogenes control measure that significantly reduces the pathogen's presence, will be required to test the environment and, if necessary, the raw and finished materials. Implementing this guidance document helps suppliers of these products cover many sections of the FSMA regulation.

The purpose of any environmental program is to verify the effectiveness of control programs, such as cleaning and sanitizing and personnel hygiene, and to identify the locations in a facility where there are issues. Corrective actions to eliminate or reduce those problems can then be implemented. Environmental programs that never find any problems are poorly designed. The FDA has stated in its guidance that finding Listeria species is expected. It also recommends that, instead of sampling after cleaning and/or sanitation, the sampling program be designed to look for contamination under worst-case conditions by sampling several hours into production and, preferably, just before cleanup. For this type of sampling, the suggestion is to hold the product being produced and test it with a validated rapid method to determine whether action must be taken. If the presence of a pathogen is confirmed, it is not always necessary to dispose of the product, as some materials can be further processed to eliminate the pathogen.

With this environmental and product/material testing data collected, it is possible to perform a trend analysis. This will help to improve sanitation conditions and the performance of both programs and personnel, and to identify the need for corrective actions. The main points of this program are the collection of data and then the use of that data to reduce the incidence of foodborne illness. Repeated problems require intervention and resolution. Changes in programs or training may be necessary if they are shown to be the root cause of the problem. If a specific issue is discovered to be a supply-source problem, then evaluating the supplier's program is the appropriate avenue to resolve it. Generally, this will mean auditing the supplier's program, or reviewing the audit report rather than just the certificate, and establishing whether the supplier has a structured program to reduce or eliminate these pathogens.
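
As a minimal sketch of the kind of trend analysis described above, the following hypothetical example tallies repeat environmental positives by sampling site. The site names, weeks and results are invented; the point is that repeat findings at the same location should trigger escalation rather than routine re-cleaning alone.

```python
from collections import defaultdict

# Hypothetical environmental swab results: (site, production week, result).
swab_results = [
    ("drain 3, packaging room", 18, "Listeria spp. detected"),
    ("drain 3, packaging room", 19, "not detected"),
    ("drain 3, packaging room", 21, "Listeria spp. detected"),
    ("slicer 1 frame", 19, "not detected"),
    ("slicer 1 frame", 20, "not detected"),
]

# Collect the weeks in which each site tested positive.
positives_by_site = defaultdict(list)
for site, week, result in swab_results:
    if result != "not detected":
        positives_by_site[site].append(week)

# Flag any site with repeated positives as a candidate for root cause
# investigation and corrective action.
for site, weeks in positives_by_site.items():
    if len(weeks) >= 2:
        print(f"Repeat positive at '{site}' (weeks {weeks}); escalate for corrective action")
```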


Vulnerability assessment

Protecting Food Against Intentional Adulteration: The Vulnerability Assessment (Part One)

By Debby L. Newslow
2 Comments

FDA, as part of FSMA, released its rule titled “Protecting Food Against Intentional Adulteration” on May 27, 2016. This rule was proposed in 2013. FDA received and responded to 200+ comments prior to its final release.

FDA states that this rule “is aimed at preventing intentional adulteration from acts intended to cause wide-scale harm to public health, including acts of terrorism targeting the food supply. Such acts, while not likely to occur, could cause illness, death, [and] economic disruption of the food supply absent mitigation strategies.”1

The rule requires a documented “Food Defense Plan” that at a minimum includes the following:

  • Vulnerability assessment
  • Mitigation strategies
  • Procedures for food defense monitoring
  • Food defense corrective action procedures
  • Food defense verification procedures
  • Records confirming implementation, maintenance and conformance to the defined requirements
  • Evidence of effective training

As a food safety professional with more than 30 years in the industry, I found that reviewing this rule brought back many memories. These memories, combined with information gained from a recently completed Food Defense/Crisis Management workshop presented by Rod Wheeler, really set my brain in motion.2

Years ago, industry focused on crisis management and product recall. Requirements included having a crisis management team that was led by associates representing both upper and middle management. In addition, most programs included the following:

  • Posted identification of the crisis management team (i.e., pictures, phone numbers, etc.)
  • Specific training for receptionist and guards
  • Mock crisis exercises (i.e., fire drills)
  • Planned crisis calls to the operation’s direct incoming phone numbers (i.e., receptionist and guards)
  • Mock recalls (from supplier through finished product and distribution)
  • Security inspections, which may now be considered the precursor to today's "Vulnerability Assessment"

With the introduction of the GFSI-approved schemes (FSSC 22000, BRC, SQF, GlobalG.A.P., Primus, etc.), requirements for crisis management, emergency preparedness, security programs, food defense training and continuity planning gained increased focus. Do any or all of these programs meet the requirement for a "vulnerability assessment"?

In the 2013 publication, Food Safety Management Programs, this subject-matter chapter was titled “Security, Food Defense, Biovigilance, and Bioterrorism (chapter 14)”.3 An organization must identify the focus/requirements that are necessary for its operation. This decision may relate to many different parameters, including the organization’s size, design, location, food sectors represented, basic GMPs, contractor and visitor communication/access, traceability, receiving, and any other PRP programs related to ensuring the safety of your product and your facility. Requirements must be defined and associates educated to ensure that everyone has a strong and effective understanding of the requirements and what to do if a situation or event happens.

Confirming the security of a facility has always been a critical operational requirement. Many audits have been performed that included the following management statement: "Yes, of course, all the doors are locked. Security is achieved through key cards or limited distribution of door keys, thus no unwanted intruder can access our building." This statement reminds me of a preliminary assessment that I performed not long after the shootings at a Pennsylvania manufacturer in September 2010. The organization's representative and I were walking the external perimeter of a food manufacturing facility at approximately 7:30 PM (still daylight). We found two doors (one in shipping and one accessing the main office) with the inside latches taped so that the doors were not secure. The tape was not readily evident; the doorknobs themselves were locked, but a simple pull on the knob opened each door. Our investigation found that a shipping office associate was waiting for his significant other to bring his dinner and was afraid that he would not be at his desk when she arrived. An office associate admitted that the office door had been fixed to pull open without a key several months earlier because associates frequently forgot their keys and could not gain access to start work.


We also observed that a large overhead door adjacent to the boiler room, along the street side of the facility, was open, allowing direct access to the processing area by passing through the boiler room and then the maintenance shop. It was stated that the door had been opened earlier in the day in anticipation of a delivery of new equipment. No one at the time knew the status of the shipment or why the door was still open.

Finding open access to facilities is becoming more and more common. A formal vulnerability assessment is not necessary to identify doors in our facilities that are unsecured around the clock. Education and due diligence are excellent tools for this purpose.

Another frequently identified weakness lies in organizations' visitor and contractor sign-in prerequisite programs. What type of "vulnerability" are we creating for ourselves (false confidence) with these programs? Frequently these programs raise more questions than they answer (a simple audit sketch follows the list below):

  • Does everyone really sign in?
  • What does signing the visitor log mean?
  • Are visitors required to show identification?
  • Are the IDs actually reviewed and if so, what does this review include?
  • Who is monitoring visitors and contractors and are they trained?
  • Do all contractors have to sign the log or are they allowed to access the building at different locations?
  • Do those contractors who make frequent or regular trips have their own badges and/or keys (keycards) so they don't have to take the time to sign in (e.g., pest control, uniform supplier, vending services)?
  • How are contractor badges controlled?
  • Are visitors required to be accompanied during the visit or does it depend on the visitor and whom they are visiting?
  • Are visitors and contractors trained in company requirements?
  • Do visitors and contractors have an identifying item to alert your associates of their status (i.e., visitor badge, visitor name badge, specifically colored bump cap, colored smock, etc.)?
  • How are truck drivers monitored? Is there a secured room for them, or do they have complete access to the facility, including the restrooms and breakroom?
  • How are terminated associates or associates that have voluntarily left the company controlled?
    • Can these associates continue to access the facility with keys, access cards, or just through other associates (i.e., friends or associates that did not know that they were no longer an employee)?
  • How many more questions can there be?
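
As a rough illustration of how the questions above can be turned into something auditable, the following sketch checks a single hypothetical visitor-log entry against a set of assumed requirements. The field names and rules are examples only, not a prescribed program; the point is that a sign-in program is only as strong as what is actually verified.

```python
# Hypothetical requirements a sign-in program might verify for each entry.
REQUIRED_CHECKS = {
    "id_verified": "Photo ID was reviewed at sign-in",
    "escort_assigned": "Visitor is accompanied by a trained associate",
    "badge_issued": "Identifying badge or bump cap was issued",
    "training_acknowledged": "Company requirements were acknowledged",
    "badge_returned": "Badge was returned at sign-out",
}

def audit_entry(entry):
    """Return the list of unmet requirements for one visitor-log entry."""
    return [desc for field, desc in REQUIRED_CHECKS.items() if not entry.get(field)]

visitor_entry = {
    "name": "Pest control contractor",
    "id_verified": True,
    "escort_assigned": False,   # frequent contractors often skip this step
    "badge_issued": True,
    "training_acknowledged": True,
    "badge_returned": False,
}

for gap in audit_entry(visitor_entry):
    print(f"Gap: {gap}")
```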


Audit

Best Practices for ISO 17025 Accreditation: Preparing for Your Food Laboratory Audit (Part II)

By Joy Dell’Aringa
2 Comments

In Part I of this article, we explored the considerations a laboratory should initially evaluate when pursuing accreditation, as well as guidance from leading industry experts on how to prepare for an ISO 17025 audit. Here we will review what comes after the on-site assessment and provide practical user-based advice for preparing a response, common areas of non-conformance, and future changes to the ISO 17025 Standard.

The Response

Once the assessor has completed the audit, they will typically hold a closing meeting on-site where they present their findings, also referred to as deficiencies or non-conformances. For each finding, they will document a specific reference to the standard as evidence and provide opportunity for questions and discussion. Most assessors will be open and conversational during this final portion of the assessment, and laboratories are well served to take advantage of this time. Some assessors will even brainstorm possible responses and corrective actions while on-site; this is valuable insight for the laboratory's quality team and can help them get a jump on the response.

Depending on the accrediting body, the laboratory will have a certain amount of time to respond to the findings, usually 30–60 days. The anatomy of a well-assembled response will include a full corrective action report, complete with root cause analysis. Often, the assessor will also request supporting documents and records to show the effectiveness of a corrective action. Most laboratories will have forms to help guide users through the corrective action and root cause process. It is important to have a systematic approach to ensure your corrective action is thorough and balanced.

Determining root cause is a critical part of this exercise. Erin Crowley, CSO of Q Laboratories, shares the lab's approach. “We use a variety of root cause analysis techniques, but have found for our operation the principle of the ‘5 Why’s’ is very effective,” she says. “Don’t simply answer the singular deficiency. Accrediting bodies will want to know that you have addressed all variables that might be associated with a finding. For example, if a specific incubator was out of range on a specific date, don’t just indicate that someone fixed it and move on. Assess how they addressed the issue, any impact on data, what they did to react to it, and how they are putting systems in place to prevent it from happening in the future on any other incubator. You have to show the full process.”
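
As an illustration of the "5 Whys" technique Crowley describes, the following sketch walks a hypothetical incubator out-of-range finding down to a systemic root cause. The questions, answers and preventive action are invented for the example; the point is that each "why" is answered until a cause emerges that a preventive action can address across all equipment, not just the unit that failed.

```python
# Hypothetical "5 Whys" chain for an incubator out-of-range finding.
five_whys = [
    ("Why was the incubator out of range on that date?",
     "The thermostat drifted above the set point."),
    ("Why did the thermostat drift?",
     "It had not been recalibrated since installation."),
    ("Why was it not recalibrated?",
     "The unit was never added to the calibration schedule."),
    ("Why was it missing from the schedule?",
     "New equipment onboarding has no step to register items for calibration."),
    ("Why is there no onboarding step?",
     "The equipment commissioning SOP predates the calibration program."),
]

root_cause = five_whys[-1][1]
preventive_action = ("Revise the commissioning SOP so every temperature-controlled "
                     "unit is added to the calibration schedule, and audit all "
                     "existing incubators for inclusion.")

for why, answer in five_whys:
    print(f"{why}\n  -> {answer}")
print(f"\nRoot cause: {root_cause}\nPreventive action: {preventive_action}")
```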

Implementing procedures as an outcome of a corrective action can also bring challenges to an operation. At Eurofins, a national multi-site reference laboratory, Quality Manager Peter Dragasakis must work with other departments and locations to deploy new or changed systems for compliance. “Sometimes the most challenging part of the entire audit process is coordinating internal stakeholders across other departments, such as IT or complementary analytical departments,” he says. “Coordinating a response in a timely manner takes full organizational cooperation and support.” Communication throughout the quality and operational arms of an organization is critical to a successful response. Often, accrediting bodies and laboratories may shuttle a response back and forth a few times before everyone is satisfied with the outcome.

Common Areas of Non-Conformance: Pro-Tips

While all areas of the standard are important to a conformant operation, a few key areas are frequently the focus of assessments and often bear the most findings.

Measurement Uncertainty. Depending on the laboratory's Field of Testing (FOT), Measurement Uncertainty (MU) can be captured in a multitude of ways. The process aims to systematically and quantifiably capture the variability in a process. For chemical analysis this is typically well defined and straightforward; for microbiological analysis the approach is more challenging. A2LA's General Manager of Accreditation Services, Adam Gouker, says the reason many labs find themselves deficient in this area is that “they don’t consider all of the contributors that impact the measurement, or they don’t know where to begin or what they need to do.” Fortunately, A2LA offers categorical guidance in documents P103a and P103b (for life sciences laboratories), two of the many guidance documents aimed at helping laboratories devise systems and protocols for conformance.
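
For microbiological testing, one widely used approach (in the spirit of ISO 19036) derives a reproducibility standard deviation from log10-transformed duplicate counts. The sketch below uses hypothetical counts and is only a simplified illustration, not a substitute for the A2LA guidance documents mentioned above.

```python
import math

# Hypothetical duplicate plate counts (CFU/g) from split samples run under
# reproducibility conditions (e.g., different analysts or days).
duplicate_counts = [
    (1200, 1500),
    (340, 290),
    (56000, 71000),
    (780, 950),
]

# Reproducibility variance from paired data: s_R^2 = sum(d_i^2) / (2 * n),
# where d_i is the difference of the log10-transformed counts in pair i.
sum_sq_diff = sum((math.log10(a) - math.log10(b)) ** 2 for a, b in duplicate_counts)
s_r = math.sqrt(sum_sq_diff / (2 * len(duplicate_counts)))

expanded_u = 2 * s_r   # coverage factor k=2 for roughly 95% confidence
print(f"Reproducibility SD (log10 CFU/g): {s_r:.3f}")
print(f"Expanded uncertainty (k=2): +/- {expanded_u:.3f} log10 CFU/g")
```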

Traceability. There are several requirements in the ISO 17025 standard around traceability. In terms of calibration conformance, which accrediting bodies have emphasized in the last few years, Dragasakis offers this tip: “When requesting [calibration] services from a vendor, make sure you’re requesting 17025-accredited service. You must specify this, as several levels of service may be available, and ‘NIST Traceable’ certificates are usually no longer sufficient.” He also advises that calibration certificates be scrutinized closely for all elements of compliance. “Some companies will simply state that it is ‘ISO 17025 compliant’, [and] this does not mean it is necessarily certified. Look for a specific reference to the accrediting body and the accreditation certificate number. Buyer beware: there is often a price difference between the different levels of calibration. Always practice due diligence when evaluating your calibration vendor and their services, and contact the calibration service if you have any questions.”

Validation vs. Verification. One of the more nuanced areas of the standard lies in determining when a test requires validation, verification or an extension, specifically when there is a modification to a method or a sample type not previously validated by an internationally recognized organization (AOAC, AFNOR, etc.). Certified Laboratories Director Benjamin Howard reminds us, “Think of validation and verification as existing on a spectrum. The more you stray away from an existing validation, the more validation work is required by the analyzing laboratory.” For example, analyzing Swiss cheese for Salmonella by a method that has already been validated for soft queso cheese may require only minimal verification or matrix extension. However, a laboratory that is altering a validated incubation time or temperature would require a much more robust and rigorous validation process. Howard cautions, “Accredited laboratories must be transparent about modifications, not only on their scope of accreditation but on their reports [or CofAs] as well. Under FSMA, companies are now accountable for the data that their laboratories generate. If you see a ‘modification’ note on your report, perform due diligence and discuss this with your laboratory. Ensure a proper validation of the modification was performed.” Additionally, the ISO 17025 standard and accrediting bodies do not mandate how a validation or verification should be done. Laboratories should have a standalone SOP that outlines these procedures using scientifically supported justification for their approach.

CAPA/Root Cause. A good corrective action/preventive action (CAPA) and root cause (RC) analysis program is at the heart of every sound quality system. “Corrective and preventive action (CAPA) processes can either add value or steal time away from the organization, according to the quality of the root cause analysis,” says Vanessa Cook, quality systems manager at Tyson Foods Safety & Laboratory Services. “CAPA might be the single greatest influence on an organization’s ability to continuously improve and adapt to change if diligence is given to this activity.” Investing in resources such as ongoing training in CAPA/RC programs and techniques is a key component of ensuring a robust and effective CAPA/RC program.
