While the first application of molecular methods to food could be attributed to the creation of genetically modified corn, wide adoption of these technologies was ushered in through their use in detecting foodborne pathogens.
Generation 1 (Pre-1995) – Straight from research to food labs… not so fast
In the early 1990s, the amplification of specific gene targets using polymerase enzymes and thermal cycling (the polymerase chain reaction, typically abbreviated PCR) was gaining widespread use in the scientific research market. Applications in clinical diagnostics and food testing, however, were only just launching.
The first generation of molecular food safety assays was a direct application of the techniques used in the research environment. As was typical in a research laboratory, a user would run the assay on a PCR machine (thermal cycler) and then transfer an aliquot to an agarose or polyacrylamide gel for visualization. The DNA separated through the gel matrix according to its size and negative charge. Subsequent staining of the gel, much like Gram staining a slide, would reveal a pattern of specific bands, and the researcher would interpret the results by the presence or absence of those bands.
This workflow required manual DNA purification and PCR setup, as well as a subjective reading of the gels. Although the basic technique is still used for small experiments in today's research labs, it was labor intensive, error prone, and too cumbersome for widespread adoption in a food pathogen laboratory. A further drawback of these early systems was their reliance on non-specific, toxic dyes and on the transfer of microliter volumes of liquid.
The major success of these early systems, however, was demonstrating a core value of molecular technology: the speed and specificity of nucleic acid (DNA) based methods. While adoption in food moved forward slowly, advances in both the research market and computing power would bring the next significant iteration of molecular methods in food testing: automated results generation.
Generation 2 (1995-2005): The emergence of a standardized molecular method workflow for food
Beginning in the late 1990s, the promise of molecular methods in food was further realized with the introduction of systems that simplified assay setup and automated results calling. Removing these two barriers led to wider adoption of molecular methods in food testing.
Leveraging the wider availability of the desktop PC and its computing power, developers improved on generation 1 systems by automating results interpretation. With software algorithms trained to determine the presence or absence of a specific signal, the system could yield a non-subjective present/not present answer. This was a vast improvement over methods such as immunoassays or lateral flow tests, which relied on a user to interpret the presence or absence of a band. This advance, coupled with the development of solid-form reagents and the elimination of the need to transfer the PCR reaction, delivered the next core value of molecular methods: ease of use. However, like the first generation, generation two still relied on a sequence non-specific dye that intercalated into the DNA. This dye, SYBR Green, bound DNA more specifically than earlier gel stains, but it depended on a non-specific melt curve analysis and needed up to 3 hours for a detection cycle.
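To make "automated results calling" concrete, the minimal Python sketch below shows how software can turn a SYBR Green-style melt curve into a non-subjective present/not present answer. It is purely illustrative: the expected melting temperature, tolerance, and peak threshold are made-up values, not parameters of any actual generation 2 system.

```python
# Hypothetical illustration of automated present/absent calling from a
# SYBR Green-style melt curve. All numeric parameters are illustrative.

def call_melt_curve(temps, fluorescence, expected_tm=85.0,
                    tm_tolerance=1.5, peak_threshold=50.0):
    """Call 'present' when -dF/dT peaks near the target amplicon's
    expected melting temperature; otherwise call 'not present'."""
    peak_value, peak_tm = 0.0, None
    for i in range(1, len(temps)):
        # Negative derivative of fluorescence with respect to temperature.
        slope = -(fluorescence[i] - fluorescence[i - 1]) / (temps[i] - temps[i - 1])
        if slope > peak_value:
            peak_value, peak_tm = slope, temps[i]
    if peak_value >= peak_threshold and abs(peak_tm - expected_tm) <= tm_tolerance:
        return "present"
    return "not present"


# Toy example: fluorescence drops sharply around 85 °C, so the target is called present.
temps = [80.0, 82.0, 84.0, 85.0, 86.0, 88.0, 90.0]
fluor = [1000.0, 980.0, 950.0, 700.0, 300.0, 250.0, 240.0]
print(call_melt_curve(temps, fluor))  # -> present
```

The point of the sketch is simply that a fixed, algorithmic threshold replaces a technician's judgment call on a gel band, which is what made results non-subjective.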
One significant outcome of the generation two systems was establishing the standard steps of a molecular method workflow as applied to food testing.
This workflow incorporated four major steps: first, an enrichment step increased the number of target organisms to a level matched to the method's sensitivity; next, an aliquot of the enrichment was processed to make the DNA accessible, either through DNA extraction or dilution of a crude cell lysate; third, the PCR amplification was performed; and finally, detection and automated result calling produced a presence or absence answer.
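As a compact summary, the sketch below (illustrative Python only; the step names and purposes simply restate the workflow above and are not drawn from any standard) models the four steps as an ordered pipeline.

```python
# Purely illustrative summary of the generation 2 molecular workflow.

from collections import namedtuple

Step = namedtuple("Step", ["name", "purpose"])

WORKFLOW = (
    Step("enrichment", "grow target organisms to a level matching the method's sensitivity"),
    Step("DNA preparation", "expose DNA via extraction or dilution of a crude cell lysate"),
    Step("amplification", "amplify the target sequence by PCR"),
    Step("detection", "detect the amplified product and auto-call presence or absence"),
)

for number, step in enumerate(WORKFLOW, start=1):
    print(f"{number}. {step.name}: {step.purpose}")
```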
One key outcome of using automated result-calling software was establishing the importance of including an in-sample positive reaction control. This positive control enabled labs to verify that the PCR reaction performed correctly and gave them confidence in the results generated.
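The sketch below, again hypothetical Python rather than the logic of any specific commercial platform, illustrates the decision rule an in-sample positive control makes possible: a negative call is only trusted when the internal control amplifies, and a well with neither signal is flagged as invalid rather than reported as negative.

```python
# Hypothetical sketch of result calling gated by an internal positive control (IPC).

def interpret_well(target_detected: bool, ipc_detected: bool) -> str:
    if target_detected:
        # A strong target amplification can suppress the IPC, so a positive
        # target call stands on its own.
        return "presumptive positive"
    if ipc_detected:
        # The reaction worked and the target was not found.
        return "negative"
    # Neither signal: possible inhibition by the food matrix or reagent failure.
    return "invalid - repeat test"


print(interpret_well(target_detected=False, ipc_detected=True))   # -> negative
print(interpret_well(target_detected=False, ipc_detected=False))  # -> invalid - repeat test
```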
While ease of use enabled wider adoption of molecular methods for pathogens, amplification and detection were still slow and prone to interference from food matrices. Reliance on melt curves and a single dye limited each assay to a single target plus, by pushing the boundaries of the technology, an internal positive control. Attempts to extend melt curve analysis beyond single targets were fraught with false positive and false negative results, as the software algorithms could not effectively distinguish multiple signals. The state of molecular food pathogen testing would change again with the advent of real-time quantitative PCR (qPCR), and the pace of innovation in food testing would begin its rapid ascent.