Research Project: Automated Portable Sensing Technologies for Screening Food Contaminants

Location: Environmental Microbial & Food Safety Laboratory

Project Number: 8042-42000-021-004-S
Project Type: Non-Assistance Cooperative Agreement

Start Date: Sep 1, 2020
End Date: Aug 31, 2025

Objective:
The objective is to develop and validate user-friendly portable sensing methods and technologies for food safety and security applications. Specifically, the research aims to design and build portable devices that can rapidly scan foods or food-contact surfaces for microbial, chemical, and biological contaminants, with high-throughput and automated detection capabilities suitable for use in commercial food processing facilities.

Approach:
In recent years, an uptick in recalls related to foreign material in foods, deliberate profit-driven chemical contamination, and the possibility of accidental contamination has demonstrated the need for rapid screening methods that food processors can use to detect low-level contamination and adulteration in large quantities of bulk food products. Several imaging technologies and processes will be evaluated to address the need for rapid and robust detection of contaminants and for the authentication of food ingredients. The core sensing technologies to be investigated include macro-scale line-scan Raman chemical imaging, gradient-temperature-dependent Raman spectroscopy, and hyperspectral imaging. In addition, ARS fluorescence-based handheld imaging devices for contamination and sanitation inspection will be enhanced for applications in food processing environments.

The project also involves developing, implementing, and testing machine-learning pipelines for automated processing of the acquired spectral imaging data. Specifically, the research will address the use of convolutional neural networks for segmentation; the computation of shape, texture, and spectral descriptors on the segmented features; and the use of manifold-learning methods to visualize the spectral imaging data. The studies will focus, in particular, on data visualization using t-SNE (t-Distributed Stochastic Neighbor Embedding), UMAP (Uniform Manifold Approximation and Projection), and deep-learning autoencoder approaches.

The project will also address the challenges of hardware/software integration and the feasibility of packaging the detection technologies and data-processing subsystems in a portable format. The targeted platforms for data processing and presentation will include desktop/laptop computers as well as mobile devices (Android and/or iOS).
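
To illustrate the kind of convolutional-network segmentation described above, the sketch below (assuming PyTorch) defines a small fully convolutional model that assigns a class to every pixel of a multi-band image cube. The band count, channel sizes, and class labels (background / food / contaminant) are illustrative assumptions, not the project's actual architecture.

    # Minimal sketch of per-pixel segmentation of a spectral image cube.
    # All sizes and class names below are hypothetical placeholders.
    import torch
    import torch.nn as nn

    class PixelSegmenter(nn.Module):
        def __init__(self, n_bands=200, n_classes=3):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, n_classes, kernel_size=1),  # per-pixel class scores
            )

        def forward(self, x):  # x: (batch, bands, height, width)
            return self.net(x)

    model = PixelSegmenter()
    cube = torch.randn(1, 200, 64, 64)        # synthetic hyperspectral cube
    logits = model(cube)                      # (1, 3, 64, 64) class scores
    segmentation = logits.argmax(dim=1)       # per-pixel class map

Shape, texture, and spectral descriptors would then be computed on the regions produced by such a segmentation map before classification.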
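As a sketch of the manifold-learning visualization step, the example below embeds pixel spectra from a hyperspectral cube into two dimensions with scikit-learn's t-SNE. The cube dimensions, band count, and labels are synthetic assumptions used only for illustration; the UMAP class from the umap-learn package could be swapped in for the same purpose.

    # Minimal sketch: 2-D embedding of pixel spectra for visualization.
    # Synthetic data only; not the project's measurements.
    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(0)
    cube = rng.normal(size=(64, 64, 200))            # 64 x 64 pixels, 200 bands
    labels = rng.integers(0, 3, size=(64, 64))       # hypothetical class labels

    spectra = cube.reshape(-1, cube.shape[-1])       # one row per pixel spectrum
    y = labels.ravel()

    emb = TSNE(n_components=2, perplexity=30, init="pca",
               random_state=0).fit_transform(spectra)

    plt.scatter(emb[:, 0], emb[:, 1], c=y, s=2, cmap="viridis")
    plt.title("t-SNE embedding of pixel spectra (synthetic data)")
    plt.show()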
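Along the same lines, a deep-learning autoencoder can provide a learned two-dimensional embedding for visualization. The sketch below, assuming PyTorch, trains a small fully connected autoencoder with a 2-D bottleneck on synthetic spectra; the layer sizes and training settings are illustrative only.

    # Minimal sketch of an autoencoder whose bottleneck serves as a 2-D embedding.
    import torch
    import torch.nn as nn

    class SpectralAutoencoder(nn.Module):
        def __init__(self, n_bands=200):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Linear(n_bands, 64), nn.ReLU(),
                nn.Linear(64, 2),                 # 2-D bottleneck for visualization
            )
            self.decoder = nn.Sequential(
                nn.Linear(2, 64), nn.ReLU(),
                nn.Linear(64, n_bands),
            )

        def forward(self, x):
            z = self.encoder(x)
            return self.decoder(z), z

    model = SpectralAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    x = torch.randn(1024, 200)                    # synthetic pixel spectra
    for _ in range(100):                          # brief illustrative training loop
        recon, _ = model(x)
        loss = loss_fn(recon, x)
        opt.zero_grad()
        loss.backward()
        opt.step()

    with torch.no_grad():
        _, embedding = model(x)                   # 2-D coordinates for plotting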