Dongyi Wang

Study offers improvements to computer predictions of food quality

By John Lovett
University of Arkansas System Division of Agriculture
Arkansas Agricultural Experiment Station

FAYETTEVILLE, Ark. — Have you ever stood in front of apples on display at the grocery store trying to pick out the best ones and wondered, “Is there an app for this?”

FOOD QUALITY PREDICTION — Dongyi Wang's study showed computer prediction of food quality improved when based on human perceptions under various lighting conditions. (U of A System Division of Agriculture photo by Paden Johnson)

Current machine-learning-based computer models for predicting food quality are not as consistent as humans, who can adapt to changing environmental conditions. Still, information compiled in an Arkansas Agricultural Experiment Station study may someday be used to develop that app, give grocery stores insights on presenting foods more appealingly, and optimize software designs for the machine vision systems used in processing facilities.

The study led by Dongyi Wang, assistant professor of smart agriculture and food manufacturing in the biological and agricultural engineering department and the food science department, was recently published in the Journal of Food Engineering.

Even though human perception of food quality can be manipulated by illumination, the study showed that computers trained with data from human perceptions made more consistent food quality predictions under different lighting conditions.

“When studying the reliability of machine-learning models, the first thing you need to do is evaluate the human’s reliability,” Wang said. “But there are differences in human perception. What we are trying to do is train our machine-learning models to be more reliable and consistent.”

The study, supported by the National Science Foundation, showed that computer prediction errors can be decreased by about 20 percent using data from human perceptions of photos taken under different lighting conditions. The approach outperformed an established model trained on images without accounting for variability in human perception.

Even though machine vision techniques have been widely studied and applied in food engineering, the study noted that most current algorithms are trained on “human-labeled ground truths or simple color information.” No studies have considered the effects of illumination variations on human perception or how those biases can affect the training of machine vision models for food quality evaluations, the authors stated.

The researchers used lettuce to evaluate human perceptions under different lighting conditions, which were in turn used to train the computer model. Sensory evaluations were done at the experiment station’s Sensory Science Center. Han-Seok Seo, professor in the food science department and director of the Sensory Science Center, was a co-author of the study.

Out of 109 participants in a broad age range, 89 completed all nine sensory sessions of the human perceptual reliability phase of the study. None of the participants were color blind or had vision problems. Over five consecutive days, the panelists evaluated 75 images of Romaine lettuce each day, grading the freshness of the lettuce on a scale of zero to 100.

The images of lettuce the sensory panel graded were of samples photographed over the course of eight days to capture different levels of browning. The photos were taken at different brightness levels and color temperatures, ranging from a bluish “cool” tone to an orangey “warm” tone, yielding a dataset of 675 images.

Several well-established machine learning models were applied to evaluate the same images as the sensory panel, the study noted. Different neural network models used the sample images as inputs and were trained to predict the corresponding average human grading to better mimic human perception.
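
To make the approach concrete, here is a minimal sketch, not the authors' code, of the kind of training the study describes: a small convolutional network that takes a lettuce image as input and is fit to the panel's average freshness grade. The architecture, image size and hyperparameters below are illustrative assumptions.

```python
# Minimal sketch (not the study's actual model): fit an image regressor to
# the sensory panel's average freshness grades. All shapes and settings here
# are assumptions for illustration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for the 675-image dataset and its mean panel grades (0-100).
images = torch.rand(675, 3, 64, 64)        # hypothetical RGB lettuce images
mean_grades = torch.rand(675, 1) * 100     # hypothetical average panel scores

model = nn.Sequential(                     # deliberately tiny CNN regressor
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 1),                      # one output: predicted grade
)

loader = DataLoader(TensorDataset(images, mean_grades), batch_size=32, shuffle=True)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()                     # penalize distance from panel mean

for epoch in range(10):                    # short illustrative training loop
    for x, y in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
```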

As seen in other experiments at the Sensory Science Center, human perception of food quality can be manipulated with illumination. For example, warmer environmental colors can disguise lettuce browning, Wang explained.

Wang said the method to train machine vision-based computers using human perceptions under different lighting conditions could be applied to many things, from foods to jewelry.

Other co-authors of the study from the University of Arkansas included Shengfan Zhang, associate professor of industrial engineering in the College of Engineering; Swarna Sethu, former post-doctoral researcher in the biological and agricultural engineering department and now assistant professor of computer information sciences at Missouri Southern State University; and Victoria J. Hogan, program assistant in the food science department.

The study was supported by the National Science Foundation under grant numbers OIA-1946391 and 2300281. The authors also recognized graduate and senior undergraduate students Olivia Torres, Robert Blindauer and Yihong Feng for helping collect, analyze and grade samples.

To learn more about the Division of Agriculture research, visit the Arkansas Agricultural Experiment Station website. Follow us on X at @ArkAgResearch, subscribe to the Food, Farms and Forests podcast and sign up for our monthly newsletter, the Arkansas Agricultural Research Report. To learn more about the Division of Agriculture, visit uada.edu. Follow us on X at @AgInArk. To learn about extension programs in Arkansas, contact your local Cooperative Extension Service agent or visit uaex.uada.edu.

High-tech cameras focused on chicken breast defect detection

By John Lovett
University of Arkansas System Division of Agriculture
Arkansas Agricultural Experiment Station

FAYETTEVILLE, Ark. — Some research for poultry processing automation is more than meets the eye.

HIGH-TECH VIEW — Graduate assistant Chaitanya Kumar Reddy Pallerla investigates the use of hyperspectral imaging to detect a defect in chicken meat. (U of A System Division of Agriculture photo by Fred Miller)

A multidisciplinary team of scientists at the Arkansas Agricultural Experiment Station is testing whether hyperspectral images can be used to detect a chicken breast defect known as “woody breast,” which costs the poultry industry millions of dollars annually and decreases customer satisfaction.

Dongyi Wang, assistant professor of biological and agricultural engineering, explains that hyperspectral imaging is a non-invasive sensing technique that combines a near-infrared sensor with a high-definition color camera to capture physical and chemical information.
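
As a rough illustration of the data this technique produces: a hyperspectral image is a three-dimensional “cube” in which every pixel holds a full reflectance spectrum rather than three color values. The sketch below, with assumed dimensions, shows how such a cube is typically indexed; it is not the team's processing code.

```python
# Illustrative only: indexing a hyperspectral "cube" (rows x cols x bands).
# The dimensions below are assumptions, not the team's sensor specs.
import numpy as np

rows, cols, bands = 512, 512, 224          # hypothetical spatial and spectral size
cube = np.random.rand(rows, cols, bands)   # stand-in for a captured image

pixel_spectrum = cube[100, 200, :]         # full spectrum at a single pixel
band_image = cube[:, :, 50]                # grayscale image at one wavelength
mean_spectrum = cube.reshape(-1, bands).mean(axis=0)  # scene-average spectrum
```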

“The current evaluation procedure is time-consuming and needs a sample tested through cumbersome laboratory tests,” Wang said.

Woody breast detection with a hyperspectral camera system and a computer would take just a few seconds, compared with grading by hand.

“Woody breast detection by hand can be labor intensive,” said Casey Owens, the Novus International Professor of Poultry Science at the experiment station. “If hyperspectral imaging can be used in a poultry processing plant, that labor force could be diverted to another area.”

POULTRY PROFESSOR — Casey Owens is the Novus International Professor of Poultry Science at the Arkansas Agricultural Experiment Station. (U of A System Division of Agriculture photo by Fred Miller)

Owens said woody breast affects up to 20 percent of chicken breast meat. Although the meat can be diverted for further processing, the loss in premium as a whole-muscle product amounts to a yield loss worth about $200 million annually in the United States, Wang said.

“Woody breast is still a safe product. It just can have a crunchy texture in some cases that is not appealing to customers, but it can be diverted for further processing into products like chicken nuggets, sausage, or chicken patties where the defect is not as noticeable,” Owens said.

Woody breast meat is harder to the touch because it has less water-holding capacity and less protein content, so the meat doesn’t retain marination as well as meat without the defect.

The woodiness is more common in larger, 8- to 9-pound birds than in 6- to 7-pound birds. Owens said one theory is that fast-growing birds may produce muscle faster than their blood vessels can support, leading to muscle fiber damage and, in turn, increased collagen deposits.

Chaitanya Kumar Reddy Pallerla, a food science graduate student working on the project, said each image from the hyperspectral camera takes up about 1 gigabyte of data. A computer processes the photo and correlates it with a texture map, created in Owens’ previous research, that indicates hardness levels in the fillet. Once calibrated, the system would rely on the images alone to detect woody breast.

“What we’re trying to do is collect the spectral data, intensities that were reflected, and correlate them with texture properties,” Pallerla said. “These are rated with a texture analyzer initially, and if we find a correlation between this spectral information and the texture properties later, we do not need a texture analyzer. So, we can use this correlation and directly interpret the texture properties from the spectral properties.”
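
One common way to build that kind of spectrum-to-texture calibration is partial least squares regression, a standard chemometrics tool. The article does not specify the team's modeling method, so the sketch below is a hedged illustration with made-up data, not their pipeline.

```python
# Hedged sketch: calibrate reflectance spectra against texture-analyzer
# hardness readings with partial least squares regression. The data and the
# choice of model are assumptions for illustration.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

n_fillets, n_bands = 200, 224                 # hypothetical dataset size
spectra = np.random.rand(n_fillets, n_bands)  # reflected intensities per fillet
hardness = np.random.rand(n_fillets)          # texture-analyzer ground truth

X_train, X_test, y_train, y_test = train_test_split(
    spectra, hardness, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=10)          # latent components: a tuning choice
pls.fit(X_train, y_train)
print("R^2 on held-out fillets:", pls.score(X_test, y_test))
```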

Although protein content, water-holding capacity and texture properties are considered the best markers for woody breast detection, Pallerla said most researchers have not focused on those properties because of the irregularities across sections of a chicken breast.

Wang said the hyperspectral camera, so far, has detected woody breast meat with about 84 percent accuracy. The goal is to accommodate high-speed sorting on a conveyor belt, or handheld portable devices, he added.

TECH TALK — Dongyi Wang, assistant professor of biological and agricultural engineering, researches the use of robotics and machine learning in agriculture. (U of A System Division of Agriculture photo by Fred Miller)

Pallerla said the research will help fine-tune the team’s current texture analysis map and decrease the variance in detection.

Wang and Owens conduct research for the Arkansas Agricultural Experiment Station, the research arm of the University of Arkansas System Division of Agriculture. Owens also teaches classes through the Dale Bumpers College of Agricultural, Food and Life Sciences at the University of Arkansas. Wang teaches classes through the University of Arkansas’ College of Engineering, and has a split research appointment between the biological and agricultural engineering department and the food science department. Pallerla holds a teaching assistant position in the biological and agricultural engineering department.

To learn more about Division of Agriculture research, visit the Arkansas Agricultural Experiment Station website: https://aaes.uada.edu. Follow us on Twitter at @ArkAgResearch. To learn more about the Division of Agriculture, visit https://uada.edu/. Follow us on Twitter at @AgInArk. To learn about extension programs in Arkansas, contact your local Cooperative Extension Service agent or visit www.uaex.uada.edu.

Researchers receive $1 million grant to develop robotic system to assist poultry processing

By Brittaney Mann
U of A System Division of Agriculture

FAYETTEVILLE, Ark. — The COVID-19 pandemic strained many poultry processing plants as employees became ill. With the help of a $1 million grant, Arkansas Agricultural Experiment Station researchers will soon begin designing robotics to help alleviate that potential strain.

ROBOTICS — Dongyi Wang is the principal investigator in a robotics project for the poultry industry. The project is funded by a $1 million grant provided jointly by the National Science Foundation and the USDA's National Institute of Food and Agriculture. (U of A System Division of Agriculture Photo by Fred Miller)

The project will be funded through a joint proposal between the National Science Foundation’s National Robotics Initiative 3.0 and the United States Department of Agriculture’s National Institute of Food and Agriculture.

Dongyi Wang, assistant professor of biological and agricultural engineering, is the principal investigator on the project. Wang conducts research for the experiment station, the research arm of the University of Arkansas System Division of Agriculture. He also has a research appointment with the food science department and a teaching appointment with the University of Arkansas’ College of Engineering.

A major focus in Wang’s lab is to understand what jobs robotic and automated systems can accomplish.

“We are trying to explore the opportunities and to see how automation can help the agriculture industry and the food industry,” Wang said.

This four-year project will lead to the development of a robotic system that can hang raw chicken the way human workers do, addressing a long-term need of the poultry industry.

Poultry processing plants

In 2021, the U.S. produced 59.2 billion pounds of broiler chickens, according to the USDA. Arkansas ranked No. 3 in the nation, producing 1 billion broilers — 7.46 billion pounds of meat worth $3.97 billion — in 2021, according to the 2022 Arkansas Agriculture Profile.

Many of the steps in chicken processing are already automated, Wang said. Slaughtering and evisceration rely little on people. Rehanging the raw chicken is one of the major steps that still depends on human labor: workers on the processing line hang the birds on conveyor lines that carry them to the deboning, wing-cutting and packing steps.

Lending a hand

Besides Wang, the team includes co-principal investigators Wan Shou, assistant professor in the mechanical engineering department at the University of Arkansas, and Yu She, assistant professor in the industrial engineering department at Purdue University. Casey Owens, Novus International Professor of Poultry Science, and Philip Crandall, professor of food science, both with the Arkansas Agricultural Experiment Station, will also be involved with the research.

To create the automation system, the researchers will customize tactile sensory grippers and develop a high-resolution and high-speed 3D imaging system, Wang said. The 3D imaging system will allow the robotic arms to differentiate between the topmost chicken and the rest of the pile and will indicate the predetermined key points for chicken grasping. A key challenge is developing a gripper that reliably grasps the chicken without damaging the meat quality.
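
As a simple illustration of the “topmost chicken” idea, not the project's actual pipeline: in a depth image, the topmost surface in a pile is the one closest to the camera, so a crude candidate grasp point can be found by thresholding near the minimum depth. Every number below is an assumption.

```python
# Toy sketch of picking the topmost item from a depth image; thresholds and
# image size are assumptions, not the project's design.
import numpy as np

depth = 0.5 + 0.5 * np.random.rand(480, 640)   # hypothetical depth map (meters)

closest = depth.min()                          # nearest surface to the camera
top_mask = depth < closest + 0.03              # keep points within 3 cm of it

ys, xs = np.nonzero(top_mask)
grasp_pixel = (int(ys.mean()), int(xs.mean())) # crude key point: mask centroid
print("candidate grasp pixel:", grasp_pixel)
```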

Shou will design the tactile sensors, and She will design the robotic hand. Integrating these components will let the robots adjust their grip based on how slick the surface is, ensuring the bird is held securely.
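
The grip-adjustment idea can be pictured as a simple feedback loop: when the tactile sensor reports slip, the controller raises the grip force. The sketch below is a toy proportional controller; the gain, force limits and sensor interface are assumptions, not the team's design.

```python
# Toy feedback loop: raise grip force in proportion to measured slip.
# Gain and force limits are illustrative assumptions.

def adjust_grip(force, slip_rate, gain=2.0, f_min=1.0, f_max=15.0):
    """Return an updated grip force (N) given a measured slip rate (mm/s)."""
    force += gain * slip_rate                 # push back proportionally to slip
    return min(max(force, f_min), f_max)      # stay within safe force limits

force = 3.0
for slip in [0.0, 0.4, 1.2, 0.2, 0.0]:        # hypothetical sensor readings
    force = adjust_grip(force, slip)
    print(f"slip={slip:.1f} mm/s -> grip {force:.1f} N")
```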

“Rather than buying an expensive robotic hand, we are going to design and fabricate a robotic hand with lower cost with the assistance of 3D printing,” Shou said.

Wang’s focus for this project is programming the two robots to work as human hands and complete the task of hanging the chicken without issues like the arms hitting one another.

They will test the robotics in the experiment station’s pilot chicken processing plant, with Owens overseeing the quality of meat handled by the robotic arms. The team will also use this project for opportunities in education and, with the help of Crandall, extension activities that target poultry and broader food industries.

Shou and She are excited to work on this project because of the advances they aim to make in artificial intelligence and multimodal sensing capabilities for intelligent robotic systems.

“With the new robotic system, we will generate new knowledge on mechanics and control,” She said.

Shou expressed confidence in the team to accomplish these advances.

“We have a great team to tackle the proposed project,” Shou said, highlighting the multiple disciplines the research involves, including manufacturing, sensors, robotics, mechanics, and computer vision and machine learning. “It has very promising applications for society,” he said.

Wang envisions this project benefiting the scientific areas of tactile sensing, 3D imaging, dual robotic control and algorithms. He also sees it benefiting the poultry industry itself.

“It is very, very exciting that this kind of technology, even maybe not right now, but potentially, can help the local economic development and the local industry,” Wang said.

To learn more about Division of Agriculture research, visit the Arkansas Agricultural Experiment Station website: https://aaes.uada.edu/. Follow us on Twitter at @ArkAgResearch. To learn more about the Division of Agriculture, visit https://uada.edu/. Follow us on Twitter at @AgInArk.