
Stephanie Pollard, ClearLabs
In the Food Lab

The Power of Advanced NGS Technology in Routine Pathogen Testing

By Stephanie Pollard

The food industry is beginning to transition into an era of big data and analytics unlike anything the industry has ever experienced. However, while the evolution of big data brings excitement and the buzz of new possibilities, it also comes coupled with an element of confusion due to the lack of tools for interpretation and lack of practical applications of the newly available information.

As we step into this new era and begin to embrace these changes, we need to invest time to educate ourselves on the possibilities before us, then make informed and action-oriented decisions on how to best use big data to move food safety and quality into the next generation.

Stephanie Pollard will be presenting “The Power of Advanced NGS Technology in Routine Pathogen Testing” at the 2018 Food Safety Consortium | November 13–15

One of the big questions for big data and analytics in the food safety industry is the exact origin of this new data. Next Generation Sequencing (NGS) is one new and disruptive technology that will contribute significantly to a data explosion in our industry.

NGS-based platforms offer the ability to see what was previously impossible with PCR and other technologies. These technologies generate millions of sequences simultaneously, enabling greater resolution into the microbial ecology of food and environmental surfaces.

This represents a seismic shift in the food safety world. It changes the age-old food microbiology question from “Is this specific microbe in my sample?” to “What is the microbial makeup of my sample?”

Traditionally, microbiologists have relied on culture-based technologies to measure the microbial composition of foods and inform risk management decisions. While these techniques have been well studied and are standard practices in food safety and quality measures, they only address a small piece of a much bigger microbial puzzle. NGS-based systems allow more complete visibility into this puzzle, enabling more informed risk management decisions.

With these advances, one practical application of NGS in existing food safety management systems is in routine pathogen testing. Routine pathogen testing is a form of risk assessment that typically gives a binary presence/absence result for a target pathogen.

NGS-based platforms can enhance this output by generating more than the standard binary result through a tunable resolution approach. NGS-based platforms can be designed to be as broad, or as specific, as desired to best fit the needs of the end user.

Imagine using an NGS-based platform for your routine pathogen testing needs, but instead of limiting the information you gather to yes/no answers for a target pathogen, you also obtain additional pertinent information, including serotype and/or strain identification, resident/transient designation, predictive shelf-life analysis, microbiome analysis, or predictive risk assessment.
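To make the contrast concrete, the richer output described above can be sketched as a data structure next to the traditional binary result. This is a minimal illustration only; the field names are hypothetical and do not come from any specific NGS platform.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BinaryResult:
    """The traditional presence/absence output of routine pathogen testing."""
    target: str
    detected: bool

@dataclass
class NGSResult:
    """A hypothetical tunable-resolution result: the same yes/no answer,
    plus the additional information an NGS-based platform could report."""
    target: str
    detected: bool
    serotype: Optional[str] = None       # serotype/strain identification
    designation: Optional[str] = None    # "resident" or "transient"
    # Microbiome snapshot: taxon -> relative abundance in the sample.
    microbiome: dict = field(default_factory=dict)

legacy = BinaryResult("Salmonella", True)
rich = NGSResult(
    "Salmonella", True,
    serotype="Enteritidis",
    designation="transient",
    microbiome={"Salmonella": 0.02, "Lactobacillus": 0.61},
)
print(rich.serotype)
```

The binary answer is still there (`detected`), so a downstream workflow built around presence/absence keeps working while the extra fields become available.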

By integrating an NGS-based platform into routine pathogen testing, one can begin to build a microbial database of the production facility, which can be used to distinguish resident pathogens and/or spoilage microbes from transient ones. This information can be used to monitor and improve existing or new sanitation practices as well as provide valuable information on ingredient quality and safety.
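One simple way to operationalize the resident/transient distinction described above is to count how many distinct sampling events each organism appears in across the facility's detection history. The sketch below assumes a hypothetical detection log and an illustrative threshold (three sampling events); neither is an industry standard.

```python
from collections import defaultdict

# Hypothetical detection log from routine NGS-based environmental
# monitoring of one production facility: (sampling_date, organism) pairs.
detections = [
    ("2018-01-08", "Listeria monocytogenes"),
    ("2018-02-05", "Listeria monocytogenes"),
    ("2018-03-12", "Listeria monocytogenes"),
    ("2018-03-12", "Salmonella enterica"),
]

def classify_organisms(detections, resident_threshold=3):
    """Label an organism 'resident' if it was detected in at least
    `resident_threshold` distinct sampling events, else 'transient'.
    The threshold is an illustrative choice for this sketch."""
    events = defaultdict(set)
    for date, organism in detections:
        events[organism].add(date)
    return {
        organism: "resident" if len(dates) >= resident_threshold else "transient"
        for organism, dates in events.items()
    }

print(classify_organisms(detections))
```

A real system would also account for sampling location and strain-level identity, but even this crude count shows how a growing facility database turns individual detections into actionable sanitation insight.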

This data can also feed directly into supplier quality assurance programs and enable more informed decisions regarding building partnerships with suppliers who offer superior products.

Similarly, by analyzing the microbiome of a food matrix, food producers can identify the presence of food spoilage microbes to inform more accurate shelf-life predictions, as well as evaluate the efficacy of interventions designed to keep those microbes from proliferating in their products (e.g., modified packaging strategies, storage conditions, or processing parameters).

Envision a technology that enables all of the aforementioned possibilities while requiring minimal disruption to integrate into existing food safety management systems. NGS-based platforms offer answers to traditional pathogen testing needs for presence/absence information, all the while providing a vast amount of additional information. Envision a future in which we step outside of our age-old approach of assessing the safety of the food that we eat via testing for the presence of a specific pathogen. Envision a future in which we raise our standards for safety and focus on finding whatever is there, without having to know in advance what to look for.

Every year we learn of new advancements that challenge our previously limited view of which pathogens can survive and proliferate on certain food products, and which have been overlooked (e.g., Listeria in melons). Advanced NGS technologies allow us to break free of those associations and focus more on truly assessing the safety and quality of our products by providing a deeper understanding of the molecular makeup of our food.


Embracing Big Data as an Asset to Your Company

By Maria Fontanazza

Big data has become a fairly common term used across industries. It refers to large, complex volumes of data that are generated from multiple sources. The challenge may not be so much in gathering the data but in what to do with the information. Although it can be a bear to manage, food companies that harness their data correctly could have a leg up on the competition.

“The food industry is behind. As an example, the aerospace industry has the ability to monitor engines on a transatlantic flight to ensure they are operating at the optimal conditions. This data is being used by engineers within different organizations to make improvements,” says Kathy Wybourn, director of food & beverage, USA & Canada at DNV-GL. “Just having the ability to collect information in real time will shift the industry from reactive to proactive. This will require the industry to fit the pieces together to collect information. As an example, you could reject a product at the supplier site, even before it leaves the supplier—you would have all that information at the tips of your fingers.” In a Q&A with Food Safety Tech, Wybourn discusses how the food industry can benefit from the proper use of big data.

Food Safety Tech: What does the term “big data” mean to the food industry?

Kathy Wybourn, director of food & beverage, USA & Canada at DNV-GL

Kathy Wybourn: Large volumes of data collected from both internal and external sources, used to make smarter business decisions. The supply chain in the food industry is very complex—receiving supplies from all over the globe. [Big data can identify] trends in different regions of the world and help food companies make better risk decisions about their supply chain. Big data will ultimately improve the safety and quality of products for consumers. Improved supply chain management [and] traceability of products will also lower the risk of food fraud.

We’ve moved from an analog to a digital age. The internet has provided the connectivity to link data from raw materials to end users. Social media data, GPS, photos, videos and data sensors can provide real-time data about raw materials through manufacturing, distribution and retail, which will give an organization better insight into information and decision making along the entire supply chain.

DNV GL recently conducted a survey called “ViewPoint” about the application of big data. The survey found that 50% of the respondents are already doing something with big data in different ways. Interestingly, big data has different meanings and importance to the respondents, but they all agree that data will be used differently in the future than it is today for making both internal and external business decisions. Big data will allow better insight and enable companies to make fact-based decisions and better manage both performance and risks.

“A higher number of food and beverage companies indicate that big data will have a high or fairly high impact on their business in the next 2–3 years. The companies in this industry indicate fewer barriers, even today, in taking advantage of big data concepts. Already, 21% say that their management team is preparing for the new reality and seemingly more food and beverage companies plan to invest in big data.” – DNV-GL Viewpoint Report

FST: How can the industry use big data to make food safer and more sustainable?

Wybourn: Big data will allow the food industry to become even more transparent, which will help improve food safety. Big data will improve supply chain management and allow organizations to make more informed decisions regarding processes, both internally and externally. Food manufacturers can improve the efficiency and quality of their own manufacturing processes, increasing output and solving operational problems faster, both of which will have a positive effect on an organization’s bottom line.

Non-conformity data is powerful and can be collected through advanced analytics throughout the supply chain. This data can be further sorted by regions of the world, which will improve knowledge and insight about suppliers. Big data brings further insight beyond what is gained from one audit, which will allow organizations to be confident about making better risk decisions.

Additionally, big data can be used to assess your organization’s performance by benchmarking your rate of nonconformities to food safety standards against other companies, in your own region or in different ones.
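The benchmarking idea above reduces to a simple percentile comparison once peer data is available. The sketch below uses entirely hypothetical nonconformity rates; in practice this is the kind of computation an analytics platform would run over pooled, anonymized audit findings.

```python
def benchmark_percentile(own_rate, peer_rates):
    """Return the share of peer sites whose nonconformity rate is higher
    than ours. A higher share means fewer findings than most peers."""
    worse = sum(1 for rate in peer_rates if rate > own_rate)
    return worse / len(peer_rates)

# Hypothetical average nonconformities per audit for 8 peer sites.
peers = [4.1, 2.3, 5.0, 3.7, 6.2, 2.9, 4.8, 3.3]

share = benchmark_percentile(3.0, peers)
print(f"Better than {share:.0%} of peers")  # → Better than 75% of peers
```

The insight comes less from the arithmetic than from the pooled data set: a single audit tells a site how it did, while the peer distribution tells it where it stands.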

FST: Can you give some examples of where food companies are or should be leveraging big data to help them in the compliance phase of FSMA?

Wybourn: Both large as well as small companies are struggling with FSMA preventive controls. FSMA mandates that a manufacturing facility have a risk-based supply chain program for raw materials and ingredients for hazards that require a supply chain applied control. Manufacturing sites may rely on a supplier or customer to control a hazard. An organization’s ability to manage big data to improve the organization’s tools to capture, store and analyze this data can greatly improve the monitoring of hazards and lower risk to the supply chain.

FST: Do you have examples of how some companies are leveraging technology to make the best use of their data?

Wybourn: DNV GL has new digital platforms, which can be used to benchmark your own organization against the performance of others.

eAdvantage is a customer portal that provides customers with a complete overview of their former and future audit activities. Through the portal they can see upcoming activities, work with findings and close non-conformities, communicate with an auditor, share audit information, access certificates and monitor their overall progress.

Lumina is a set of tools that provides better insight into a company’s management system. It analyzes information hidden in the company’s audit data and benchmarks that company against thousands of others worldwide based on more than 1.6 million audit findings. It allows an organization to obtain an overview of its own sites’ performance, spot warning signs at an early stage and see how it compares to similar companies in the industry, giving confidence to make the right decisions.

Veracity is an open industry data platform, ideal for integrating data in a secure way. The Veracity eco-system handles asset data, manages data quality and applies advanced analytics, connecting industry players for frictionless data aggregation, sharing and benchmarking. In the aquaculture industry, this will lay the foundation for predictive analyses, decision support, indication warning, and simulation capabilities unlocking substantial growth potential in the global aquaculture industry. All the while, we make sure fish farmers and other data providers retain ownership and control of their data.

I believe we are only at the tip of the iceberg of where big data can take the food & beverage industry.

FST: Is it possible to get too much data? Are food companies going to be bombarded with too much info that they don’t know how to use?

Wybourn: The answer is simple: Yes. We live in a world of data abundance and information overload. Data sets are growing rapidly, and the ability to store and analyze them is daunting. The tools we have today will become obsolete tomorrow, and one can only sort through data with the tools available today to understand even the simplest of processes.