Research Project #440790

Research Project: Advancement of Sensing Technologies for Food Safety and Security Applications

Location: Environmental Microbial & Food Safety Laboratory

2023 Annual Report

Objective 1: Develop and validate an autonomous unmanned aerial vehicle with multimode imaging technologies for preharvest inspection of produce fields for animal intrusion and fecal contamination, and for irrigation water quality monitoring.

Objective 2: Advance the development of customized compact spectral sensing technologies for food inspection and sanitation assessment in food processing, and for controlled-environment produce production, with embedded automated detection results for non-expert end users.

Sub-objective 2.A: Develop a handheld line-scan hyperspectral imaging device with enhanced capabilities for contamination and sanitation inspection in food processing environments.

Sub-objective 2.B: Develop a compact automated hyperspectral imaging platform for food safety and plant health monitoring for controlled-environment produce production in NASA space missions.

Objective 3: Develop innovative spectroscopic and optical methods to characterize food composition and nondestructively detect adulterants and contaminants, for screening and inspecting agricultural commodities and commercially prepared food materials.

Sub-objective 3.A: Develop a transportable multimodal optical sensing system for rapid, automated, and intelligent biological and chemical food safety inspection.

Sub-objective 3.B: Develop a novel apparatus enabling dual-modality concomitant detection, along with associated methods and procedures, for assuring food integrity.

The overall goal of this project is to develop and validate automated sensing tools and techniques to reduce food safety risks in food production and processing environments. Engineering-driven research will develop the next generation of rapid, intelligent, user-friendly sensing technologies for use in food production, processing, and other supply chain operations. Feedback from industrial and regulatory end users, and from stakeholders throughout the food supply chain, indicates that effective automated sensing and instrumentation systems require real-time data processing that gives non-expert users a clear understanding of system output and the ability to make decisions based on it. Toward this end, we will develop unmanned aerial vehicles with multimodal remote sensing platforms and on-board data-processing capability to provide real-time detection and classification of animal intrusion and fecal contamination in farm fields and of irrigation water microbial quality. We will upgrade our existing handheld imaging device for contamination and sanitation inspection with multispectral imaging and embedded computing and artificial intelligence. We are also partnering with the NASA Kennedy Space Center to develop a novel, compact, automated hyperspectral platform for monitoring food safety and plant health in space crop production systems. Assuring food safety and integrity requires identifying adulterants, foreign materials, and microbial contamination as well as authenticating ingredients. We will develop innovative multimodal optical sensing systems utilizing dual-band laser Raman, and Raman plus infrared, for simultaneous detection at a single sampling site. Spectroscopic and spectral imaging-based methodologies will be developed to enhance detection efficacy for liquid and powder samples. These systems will be supported with intuitive, intelligent sample-evaluation software and procedures for both biological and chemical contaminants.

Progress Report
Significant progress has been made on all objectives of the project, which fall under National Program 108. For Objective 1, ARS scientists in Beltsville, Maryland, continued the design and construction of the field-transportable multimodal imaging system. This system will be used to validate results from the small unmanned aerial vehicle (sUAV) system and to provide reference data during sensor calibration and post-collection data processing. The sensor suite (hyperspectral, thermal, and 3D/color cameras) that will be present on both the field and UAV units was installed onto the cart, and testing and integration of the control software began. A full-scale first prototype was built and underwent initial testing and design revisions; a second-iteration design has been produced and is under evaluation. Testing of settings for the sensors and the linear stage has been completed, and the results will be integrated into the control software. Design and completion of the power system that will power the sensor suite, control laptop, and control mechanisms during remote testing continue. For Objective 2A, ARS scientists in Beltsville, Maryland, continued working with multiple collaborators on testing and commercialization of ARS portable multispectral imaging technology for contamination and sanitization inspection. Based on an exclusive license for the ARS-patented (U.S. Patent No. 8,310,544) handheld fluorescence imaging technology, a commercial prototype Contamination Sanitization Inspection and Disinfection (CSI-D) device was developed in 2021. The handheld CSI-D device provides visualization of contamination on food contact surfaces via ultraviolet-A (UVA) fluorescence imaging, disinfection via ultraviolet-C (UVC) illumination, and documentation of cleanliness.
Experiments were conducted to determine the detection efficacy of the device for various vegetable and meat sample smears on food contact surfaces such as commercial-grade plastic cutting boards and stainless steel. In addition, we completed development of a fully automated benchtop UV illumination system to evaluate the effectiveness of ultraviolet-B (UVB) and UVC light for germicidal applications. The system comprises a 305-nm UVB LED, a cooling fan, a height-adjustable sample holder, a single-board computer with a touchscreen monitor, and a safety trigger mechanism. This setup was used to assess UVB irradiance intensity at different sample distances, with the aim of identifying optimal parameters for effectively eliminating foodborne bacteria in germicidal applications. Experiments were also conducted to evaluate the effectiveness of UVC radiation in eradicating pathogenic bacteria cultivated in Petri dishes. For Objective 2B, in collaboration with the NASA Kennedy Space Center (KSC), ARS scientists in Beltsville, Maryland, continued to develop an advanced version of ARS hyperspectral imaging technology suitable for plant health and food safety monitoring in fresh produce production systems for future spaceflight. The ARS and KSC team designed the new multimodal imaging system, and ARS scientists developed an automated imaging gantry platform to acquire a range of multimodal spectral and phenotype data for monitoring the health of plants grown in a controlled environment. The system was installed in a plant growth chamber at NASA KSC for full-scale plant imaging experiments. The newly developed multimodal sensing platform consists of a 3-axis gantry system and two LED line lights that provide broadband visible to near-infrared illumination for reflectance imaging and 365-nm ultraviolet-A excitation for fluorescence imaging.
The system interface software was developed in Python 3 to control the gantry motions and to acquire multimodal spectral image data. NASA KSC is using the multimodal sensing platform to acquire imaging data for pick-and-eat salad crop seedlings grown under control conditions (well-watered) and stress conditions (under- and over-watered). ARS scientists have been developing hyperspectral image processing and artificial intelligence/machine learning models for analyzing the imaging data collected at NASA KSC. ARS scientists in Beltsville, Maryland, also developed a new portable hyperspectral imaging system for collaborators at the University of Florida (UF) conducting research on citrus diseases and bacteria. Because transport of diseased citrus fruit and leaf samples from central and south Florida is restricted, ARS scientists initiated a Material Transfer Agreement with UF and delivered a portable imaging system and control software so that UF collaborators can collect image data on-site from diseased citrus samples without transporting the samples to UF. The collaborators have acquired hyperspectral reflectance and fluorescence images from diseased leaves and fruit peels with bacteria at two citrus research centers in Florida (Lake Alfred and Fort Pierce). They will develop image fusion algorithms and deep learning classification models using the collected data for disease and bacteria detection. For Objective 3A, in collaboration with the National Agricultural Products Quality Management Service (NAQS), South Korea, ARS scientists in Beltsville, Maryland, completed development of control software and algorithms for spectral and image preprocessing and machine learning for a newly developed multimodal optical sensing system for automated and intelligent biological and chemical assessment in food safety applications. The system control software was developed using LabVIEW.
Visiting scientists from NAQS conducted experiments on identification of common foodborne bacteria and detection of aflatoxin contamination in ground maize. Using machine learning models, classification accuracies of 98% were achieved in differentiating five common bacterial species grown on nonselective agar in Petri dishes, and of 95% in classifying aflatoxin contamination levels in naturally contaminated ground maize samples. ARS scientists filed a USPTO patent application for the methodology and the prototype system. In collaboration with a CRADA partner and the University of North Dakota (UND), we continued work to develop fish authentication methods based on multimode hyperspectral imaging techniques to address species mislabeling and fraud as well as freshness of fish fillets. In this continuing study, all hyperspectral image data (including VNIR and SWIR reflectance, fluorescence, and Raman) were collected for raw samples from over 60 major fish species. UND developed a machine learning method using the fusion of three spectral data types (VNIR reflectance, SWIR reflectance, and fluorescence) and achieved 95% accuracy in classifying fish conditions spanning fresh to spoiled. The CRADA partner will use the results to design and develop portable smart spectroscopy-based sensing devices for on-site fish species and freshness inspection in industrial applications. For Objective 3B, ARS scientists in Beltsville, Maryland, conducted dual-modality IR and Raman measurements of the insecticide fipronil to determine which vibrational modes are more intense in the IR and which are more intense in the Raman. Although IR and Raman spectra are often deemed "complementary" in principle, experimental evidence comparing the two modalities for individual specific compounds is sparse in practice. Such comparative IR and Raman spectral evidence has also been collected for yellow turmeric powder mixed with white turmeric powder.
Dual-modality IR and Raman measurements of seven lipids (cis-polyunsaturated fatty acids) were performed in situ at room temperature. Because a food component is much more abundant than an adulterant, one spectrum confirming an adulterant in one modality can be diluted by a second spectrum at another site that detects only the food. Thus, single-modality measurements can quickly default to measuring only the variance in the food spectral signature, because most of the data collected come from the food. In contrast, dual-modality collection can often confirm almost immediately whether a given site contains an adulterant. A point-scan IR and Raman dual macro-scale imaging system is being developed to overcome the lack of instrumentation designed specifically for macro-scale sample measurement. The system uses a thermal infrared light source and a laser as separate sources for the IR and Raman measurements, respectively.

1. A hyperspectral plant health monitoring system for space crop production. Plant monitoring in growth chambers onboard the International Space Station is currently conducted by estimating growth rates from photographic analysis of daily increments in leaf area. This limited approach cannot detect plant stresses or nutrient deficiencies, which typically begin days before the leaves manifest any visible changes. In a collaborative project between scientists at USDA ARS, Beltsville, Maryland, and NASA Kennedy Space Center (KSC), Florida, a compact and automated hyperspectral imaging system was developed and installed at KSC to monitor plant health for space crop production under controlled environments. The prototype system collects both hyperspectral reflectance and fluorescence images in the visible and near-infrared region within a single imaging cycle, providing rich spectral and spatial information for possible early detection of abiotic stresses and diseases in pick-and-eat salad crops. In a preliminary study on Dragoon lettuce, a machine learning method using the spectral reflectance data showed potential for detecting drought stress before visible symptoms and leaf size differences were evident. The method would benefit NASA's space crop production, and fresh produce production in controlled-environment agriculture generally, by enhancing the quality and safety of fruits and vegetables, reducing crop losses due to stress and disease, and enabling earlier interventions to mitigate problems.
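To illustrate how spectral reflectance can flag stress before it is visible, the sketch below computes a simple band-ratio stress index (NDVI) from a synthetic reflectance cube. The band positions, cube values, and index choice are assumptions for demonstration only; the ARS/KSC study used machine learning on the measured reflectance spectra rather than this index.

```python
# Illustrative sketch only: a band-ratio stress index (NDVI) computed from a
# synthetic hyperspectral reflectance cube. All values are invented.
import numpy as np

def band(cube, wavelengths, target_nm):
    """Return the image slice at the wavelength closest to target_nm."""
    i = int(np.argmin(np.abs(wavelengths - target_nm)))
    return cube[:, :, i].astype(float)

def ndvi(cube, wavelengths):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    nir = band(cube, wavelengths, 800.0)
    red = band(cube, wavelengths, 670.0)
    return (nir - red) / (nir + red + 1e-9)

# Synthetic 4 x 4 pixel cube with 100 bands spanning 500-900 nm: healthy
# canopy reflects strongly in the near-infrared; a stressed patch less so.
wavelengths = np.linspace(500.0, 900.0, 100)
cube = np.full((4, 4, 100), 0.05)
cube[:, :, wavelengths > 700] = 0.50      # healthy NIR plateau
cube[2:, 2:, wavelengths > 700] = 0.25    # simulated drought-stressed patch

idx = ndvi(cube, wavelengths)
print(idx[0, 0] > idx[3, 3])  # stressed pixels score lower -> True
```

In a real pipeline the stressed patch would be flagged by comparing per-pixel index values (or full spectra) against those of well-watered plants.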

2. A multimodal optical sensing system for automated and intelligent food safety inspection. Commercial integrated spectroscopy systems are usually bulky and inflexible for testing various food and agricultural products; compact sensing devices and methods for quick, routine analysis of the chemical and biological content of food samples are lacking. Extending previously developed ARS macro-scale Raman technologies, ARS scientists at Beltsville, Maryland, developed a new transportable multimodal (transmission, color, fluorescence, and Raman) optical sensing system with embedded artificial intelligence capabilities for automated and intelligent food safety inspection. Using machine vision and motion control techniques, the system conducts automated Raman spectral acquisition for a variety of sample types presented in customized well plates or Petri dishes, from food materials to bacterial colonies grown on media. Targets of interest within the sample materials can be identified and labeled using real-time image and spectral processing and machine learning functions integrated into the in-house developed software. The system shows promise for use by food safety regulatory agencies as an initial screening tool for quick species identification of common foodborne bacteria. Compact and easily transportable, the prototype is suitable for field and on-site food safety inspection in potential regulatory and industrial applications.
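The "find targets, then acquire spectra" workflow described above can be sketched in miniature: bright colony-like blobs in a plate image are located by a generic connected-component pass, and their centroids become the points where a Raman spectrum would be acquired. The image, threshold, and size cutoff below are hypothetical; the real system's machine vision and motion control are not reproduced here.

```python
# Hypothetical sketch of automated target finding on a plate image. All
# parameters (threshold, min_pixels, the image itself) are invented.
from collections import deque
import numpy as np

def find_targets(image, threshold=0.5, min_pixels=4):
    """Return (row, col) centroids of 4-connected regions above threshold."""
    binary = image > threshold
    visited = np.zeros_like(binary)
    rows, cols = binary.shape
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if binary[r, c] and not visited[r, c]:
                queue, pixels = deque([(r, c)]), []
                visited[r, c] = True
                while queue:                      # flood-fill one component
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_pixels:     # ignore isolated specks
                    ys, xs = zip(*pixels)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids

# Synthetic plate image: two square "colonies" on a dark background.
img = np.zeros((32, 32))
img[5:9, 5:9] = 1.0
img[20:25, 18:23] = 1.0

targets = find_targets(img)
print(targets)  # two centroids, one per colony
```

In the actual instrument, each centroid would be translated into stage coordinates and the laser positioned there for spectral acquisition.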

3. Dual-modality IR and Raman in situ spectral measurements distinctly identify seven individual, biologically essential lipids at room temperature. Commercially available fish oil products are mixtures of lipids, all of which contain the same redundant cis-polyunsaturated fatty acid structure. Spectral verification of specific lipid identities is possible but requires distinguishing spectral wavenumbers specific to individual lipid compounds (fingerprint-like markers) from redundant, nonselective wavenumbers. ARS scientists in Beltsville, Maryland, determined that, surprisingly, the infrared (IR) wavenumbers unique to seven lipids differ from the Raman wavenumber markers for those same lipids. Dual-modality measurements, using IR and Raman markers together, can facilitate and enhance the identification of lipids for verification purposes, for example, by readily distinguishing cis-polyunsaturated fatty acids with an even number of double bonds from those with an odd number, such as the omega-3 fatty acids DHA (six double bonds) and EPA (five double bonds), each of which exhibits a unique spectral signature. This approach enables verifying the identity of lipid supplements, such as the omega-3 fatty acids often added to poultry feed to enhance egg quality.

4. A rapid spectroscopic technique for detecting veterinary drug residues on meat surfaces. Current methods to detect veterinary drug residues require a strict sample collection and processing protocol, very expensive analytical instrumentation, and a high level of technical expertise to both operate the equipment and interpret the results. ARS scientists at Beltsville, Maryland, have developed macro-scale Raman imaging and spectroscopy technologies and methodologies for research addressing food integrity concerns arising from adulteration or contamination. A 785-nm point-scan Raman system was used in the development of a detection method for drug residues including salbutamol, clenbuterol, and ractopamine. The method detects the spectral fingerprint of the residue compounds after they are first adsorbed onto gold nanoparticles. These spectroscopic methods, once developed, are less complicated to run and more user-friendly than conventional methods, and can produce practical analytical results in real time, which is useful to those raising healthy animals for safe human consumption.

5. Rapid assessment of fish freshness for multiple supply-chain nodes using multi-mode spectroscopy and fusion-based artificial intelligence. Fresh fish is a highly perishable product, with more than 20% wasted at the retail level every year. One reason for such waste is that early fish decay is not easily detectable by human senses; sensing techniques are needed that allow on-site inspection of fish freshness in a rapid, cost-effective, and nondestructive manner. ARS scientists in Beltsville, Maryland, developed a multimode spectroscopy method for rapid assessment of fish freshness. Three types of spectral data (visible and near-infrared reflectance, short-wave infrared reflectance, and fluorescence) were collected from fish fillets of four species (farmed Atlantic salmon, wild coho salmon, Chinook salmon, and sablefish) over time as fillet conditions progressed from fresh to spoiled. A machine learning method using the fusion of the three spectral data types achieved 95% accuracy in classifying fresh and spoiled fish samples. The data analysis and classification methods developed in this research can assist the development of an easy-to-use handheld device to estimate the remaining shelf life of fish fillets, which could enable dynamic sales management and major reductions in waste for the seafood industry.
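Feature-level fusion of multiple spectral modalities, as used in the study above, can be illustrated with a toy example: per-sample spectra from the three modalities are concatenated into one feature vector before classification. The synthetic spectra and the simple nearest-centroid rule below are stand-ins for the study's measured data and machine learning model.

```python
# Toy illustration of feature-level spectral fusion. All spectra and class
# levels are synthetic; the classifier is a deliberately simple stand-in.
import numpy as np

rng = np.random.default_rng(0)
BANDS = 50

def make_samples(levels, n=20, noise=0.05):
    """n fused samples; each modality is a flat spectrum at a given level."""
    return [np.hstack([level + noise * rng.standard_normal(BANDS)
                       for level in levels]) for _ in range(n)]

# Three modality levels per class (e.g., VNIR, SWIR, fluorescence intensity)
fresh = make_samples((0.2, 0.5, 0.8))
spoiled = make_samples((0.3, 0.4, 0.6))

c_fresh = np.mean(fresh, axis=0)      # class centroids in the fused space
c_spoiled = np.mean(spoiled, axis=0)

def classify(x):
    """Nearest-centroid decision on a fused feature vector."""
    if np.linalg.norm(x - c_fresh) < np.linalg.norm(x - c_spoiled):
        return "fresh"
    return "spoiled"

unknown = np.hstack([level + 0.05 * rng.standard_normal(BANDS)
                     for level in (0.2, 0.5, 0.8)])
print(classify(unknown))  # -> fresh
```

The design point is that concatenation lets the classifier exploit differences visible in any one modality; class separations that are weak in a single modality add up across the fused vector.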

6. Citrus disease detection using convolutional neural network generated features and Softmax classifier on hyperspectral image data. Citrus diseases and peel blemishes can limit the marketability of citrus crops and in some cases lead to shipping restrictions into certain regions. Proper and timely identification and control of citrus diseases can assure fruit quality and safety, improve production, and minimize economic losses. ARS scientists in Beltsville, Maryland, developed an AI-based hyperspectral imaging and classification method for identifying various diseased peel conditions on citrus fruit. Hyperspectral reflectance images were collected in the visible and near-infrared wavelength range from Ruby Red grapefruit with normal peels and with common peel diseases and defects, including canker, greasy spot, insect damage, melanose, scab, and wind scar. A classification accuracy of over 98% was achieved using a deep learning algorithm based on a convolutional neural network with the hyperspectral reflectance image data. This method could benefit the citrus industry and regulatory agencies (e.g., FDA and USDA APHIS) in helping to ensure and enforce quality and safety standards for citrus-related food and beverage products.
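The final "Softmax classifier" stage of such a pipeline can be shown in isolation: a linear layer over a feature vector followed by a softmax that converts scores into class probabilities. The feature vector and weights below are untrained random stand-ins; in the study, a convolutional neural network produces the features from hyperspectral image data.

```python
# Minimal numpy sketch of a softmax classification layer. The features and
# weights are random placeholders, not a trained model.
import numpy as np

CLASSES = ["normal", "canker", "greasy spot", "insect damage",
           "melanose", "scab", "wind scar"]

def softmax(z):
    z = z - z.max()               # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

rng = np.random.default_rng(1)
features = rng.standard_normal(64)            # stand-in CNN feature vector
W = rng.standard_normal((64, len(CLASSES)))   # untrained weight matrix
b = np.zeros(len(CLASSES))

probs = softmax(features @ W + b)
predicted = CLASSES[int(np.argmax(probs))]
print(predicted)  # most probable peel condition under the toy weights
```

Softmax guarantees the outputs are nonnegative and sum to one, so the largest entry can be read directly as the predicted peel condition with an associated confidence.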

7. Nondestructive evaluation of packaged butter for adulteration based on spatially offset Raman spectroscopy coupled with FastICA. Butter is a dairy product that is prone to being mixed with cheaper vegetable fat (e.g., margarine) in economically motivated adulteration. Traditional optical sensing techniques can detect adulteration in unpackaged butter products; however, nondestructively authenticating packaged foods is challenging due to complicated interactions between light and packaging materials. ARS scientists in Beltsville, Maryland, employed a spatially offset Raman line-scan imaging technique using a point laser to nondestructively detect adulterated butter within its intact packaging. Animal butter was mixed with margarine in different ratios, and Raman image and spectral data were acquired from the butter-margarine mixtures covered by original packaging sheets and plastic film. Analysis models were developed and successfully used to predict the adulteration content of the butter-margarine samples covered with different packaging materials. The detection method is useful for through-package safety and quality inspection of food materials, and its use would benefit the food industry and regulatory agencies (e.g., FDA and USDA FSIS) in ensuring and enforcing safety and quality standards for packaged food products.

8. Evaluating performance of SORS-based subsurface signal separation methods using statistical replication Monte Carlo simulation. Nondestructive evaluation of safety and quality for packaged foods is challenging because of the difficulty of acquiring optical signals from food samples through packaging materials. Spatially offset Raman spectroscopy (SORS) is a promising depth-profiling technique for this problem, but studies evaluating signal separation methods for the SORS technique are lacking. ARS scientists in Beltsville, Maryland, presented a method based on line-scan SORS and statistical replication Monte Carlo simulation to evaluate the effectiveness of retrieving Raman signals from subsurface food samples. The simulation results were verified with three packaged foods (sugar in a plastic jar, bagged rice, and boxed butter), for which the fast independent component analysis (FastICA) method effectively separated the Raman signals of the surface packaging layer from those of the subsurface food layer. The evaluation method can assist in developing and optimizing SORS-based methods for through-package safety and quality inspection of foods and ingredients. The technique would benefit regulatory agencies (e.g., FDA and USDA FSIS) and the food industry in enforcing safety and quality standards for packaged food products.
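The separation idea behind SORS coupled with FastICA can be demonstrated on toy data: two synthetic Raman-like spectra (a "packaging" surface layer and a "food" subsurface layer) are observed as different mixtures at different spatial offsets, and independent component analysis recovers the two components. All spectra, peak positions, and mixing ratios below are invented for illustration and do not reproduce the study's simulation.

```python
# Toy SORS-style signal separation with FastICA (scikit-learn). Everything
# here is synthetic; it only illustrates the unmixing principle.
import numpy as np
from sklearn.decomposition import FastICA

def peak(x, center, width=5.0):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

x = np.arange(400)                                  # arbitrary Raman shift axis
surface = peak(x, 100) + 0.5 * peak(x, 250)         # "packaging" spectrum
subsurface = peak(x, 180) + 0.8 * peak(x, 320)      # "food" spectrum

# Larger spatial offsets sample proportionally more of the subsurface layer
mixing = np.array([[0.9, 0.1],
                   [0.6, 0.4],
                   [0.3, 0.7]])
observed = mixing @ np.vstack([surface, subsurface])  # 3 offset measurements

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed.T).T           # shape (2, 400)

# Match each recovered component to a true layer by absolute correlation
corr = np.abs(np.corrcoef(np.vstack([recovered, surface, subsurface])))[:2, 2:]
print(corr.round(2))  # correlation of each component with each layer
```

Because ICA returns components only up to sign and scale, matching recovered components to physical layers is done here by absolute correlation with the known sources; in practice the packaging spectrum can be identified from a measurement of the empty package.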

Review Publications
Gorji, H., Van Kessel, J.S., Haley, B.J., Husarik, K., Sonnier, J.L., Shahabi, S., Zadeh, H., Chan, D.E., Qin, J., Baek, I., Kim, M.S., Akhbardeh, A., Sohrabi, M., Kerge, B., Mckinnon, N., Vasefi, F., Tavakolian, K. 2022. Deep learning and multiwavelength fluorescence imaging for cleanliness assessment and disinfection in food services. Frontiers in Remote Sensing.
Guo, Q., Peng, Y., Chao, K. 2022. Raman enhancement effect of different silver nanoparticles on salbutamol. Heliyon. 8(6):e09576.
Guo, Q., Peng, Y., Chao, K., Zhuang, Q., Chen, Y. 2022. Raman enhancement effects of gold nanoparticles with different particle sizes on clenbuterol and ractopamine. Vibrational Spectroscopy. 123:103444.
Zadeh, H., Hardy, M., Sueker, M., Li, Y., Tzouchas, A., Mackinnon, N., Bearman, G., Haughey, S., Akhbardeh, A., Baek, I., Hwang, C., Qin, J., Tabb, A.M., Hellberg, R., Ismail, S., Reza, H., Vasefi, F., Kim, M.S., Tavakolian, K., Elliott, C.T. 2023. Rapid assessment of fish freshness for multiple supply-chain nodes using multi-mode spectroscopy and fusion-based artificial intelligence. Sensors. 23:5149.
Tunny, S., Kurniawan, H., Amanah, H., Baek, I., Kim, M.S., Chan, D.E., Farqeerzada, M., Wakholi, C., Cho, B. 2023. Hyperspectral imaging techniques for detection of foreign materials from fresh-cut vegetables. Postharvest Biology and Technology. 201:112373.
Qin, J., Monje, O., Nugent, M.R., Finn, J.R., O'Rourke, A.E., Wilson, K.D., Fritsche, R.F., Baek, I., Chan, D.E., Kim, M.S. 2023. A hyperspectral plant health monitoring system for space crop production. Frontiers in Plant Science. 14:1133505.
Qin, J., Hong, J., Cho, H., Van Kessel, J.S., Baek, I., Chao, K., Kim, M.S. 2023. A multimodal optical sensing system for automated and intelligent food safety inspection. Journal of the ASABE. 66(4):839-849.
Faquurzada, M.A., Park, E., Kim, T., Kim, M.S., Baek, I., Joshi, R., Kim, J., Cho, B. 2023. Fluorescence hyperspectral imaging for early diagnosis of heat-stressed ginseng plants. Applied Sciences. 13:31.
Baek, I., Mo, C., Eggleton, C., Gadsden, S.A., Cho, B., Lee, H., Kim, M.S., Qin, J. 2022. Determination of spectral resolutions for multispectral detection of apple bruises using visible/near-infrared hyperspectral reflectance imaging. Frontiers in Plant Science. 13:963591.
Nabwire, S., Suh, H., Kim, M.S., Baek, I., Cho, B. 2021. Review: Application of artificial intelligence in phenomics. Sensors. 21:4363.
Joshi, R., Joshi, R., Kim, G., Kim, M.S., Baek, I., Lee, H., Mo, C., Cho, B., Kim, G., Park, E. 2022. Non-destructive identification of fake eggs using fluorescence spectral analysis and hyperspectral imaging. Korean Journal of Agricultural Science. 49:495-510.
Omia, E., Bae, H., Park, E., Kim, M.S., Baek, I., Kabenge, I., Cho, B. 2023. Remote sensing in field crop monitoring: A comprehensive review of sensor systems, data analyses and recent advances. Remote Sensing. 15:354.
Liu, Z., Huang, M., Zhu, Q., Qin, J., Kim, M.S. 2023. Evaluating performance of SORS-based subsurface signal separation methods using statistical replication Monte Carlo simulation. Spectrochimica Acta Part A: Molecular and Biomolecular Spectroscopy. 293:122520.
Sun, D., Wang, X., Huang, M., Zhu, Q., Qin, J. 2023. Optical parameters inversion of tissue using spatially resolved diffuse reflection imaging combined with LSTM-attention network. Optics Express. 31(6):10260-10272.
Yadav, P., Burks, T.F., Frederick, Q., Qin, J., Kim, M.S., Ritenour, M.A. 2022. Citrus disease detection using convolution neural network generated features and softmax classifier on hyperspectral image data. Frontiers in Plant Science. 13:1043712.