New software increases accuracy of crop health measurement technology

RALEIGH, NC — An interdisciplinary team of researchers has developed a new tool that improves the accuracy of electronic devices that measure plant leaf color to assess plant health. The new technology works by improving the sensor’s ability to account for variations in light that can affect how the sensor perceives color.

“There’s a tremendous amount of research going on to develop new plant varieties that are better able to withstand challenges like drought, high temperatures, and so on,” says Michael Kudenov, co-author of a paper on the new software and professor of electrical and computer engineering at North Carolina State University. “Many of these researchers use sensors that record the color of a plant’s leaves to assess plant health, which is critical to their work. These sensors are also used by some growers and crop advisors to assess crop health. However, when researchers, growers or crop consultants are working with crops in fields, sunlight can affect the ability of these sensors to accurately record leaf color. Specifically, glare can throw the sensors off.”

“Our goal was to develop software that would allow users to more easily take into account the ways in which sunlight glare can change the ways in which sensors capture the color of a plant’s leaves,” says Kudenov. “Previous tools that took glare into account were extremely complex and required a lot of computing power. Our approach is significantly less complicated.”

The key idea to understand here is polarization. If we think of light as a wave, that wave can oscillate in many different planes. When light is polarized, it vibrates in a single plane. If you’ve ever looked into a body of water on a clear day, you’ve probably noticed that the sun’s glare can make it difficult to see below the water’s surface. If you put on polarized sunglasses, the glare effectively disappears, allowing you to see below the surface of the water.
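For readers who want the quantitative version, the extent of linear polarization is commonly expressed with the standard Stokes-parameter formula below; this is a general optics definition, not a formula taken from the paper:

    \mathrm{DoLP} = \frac{\sqrt{S_1^2 + S_2^2}}{S_0}

Here S_0 is the total light intensity and S_1 and S_2 describe the linearly polarized components; DoLP is 0 for completely unpolarized light and approaches 1 for fully polarized light, such as strong glare.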

“The software we’ve developed essentially acts like an incredibly dynamic pair of polarized sunglasses, able to take into account all the polarization challenges that are present to accurately capture the color of a leaf, regardless of glare,” says Daniel Krafft, first author of the paper and a Ph.D. student at NC State.

Here’s how the new tool works. When the sensors take an image of a leaf, they capture not only the color but also how polarized the light is. The new software estimates the true color of the leaf based on two variables: the color recorded by the sensor and how polarized the light is at the darkest wavelength in the image.
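As a rough sketch of that two-input idea, the short Python snippet below combines a color measurement with the degree of linear polarization (DoLP) measured in the darkest band. The function name, the linear form of the correction, and the coefficient k are illustrative assumptions; the actual tool uses a trained neural network rather than a fixed formula.

    def glare_corrected_color(measured_value, dolp_dark_band, k=1.0):
        # Hypothetical correction: pull the measured color value back toward
        # its glare-free level in proportion to how polarized the darkest
        # band is. In the published tool this mapping is learned from
        # simulated data by a shallow neural network, not set by hand.
        return measured_value * (1.0 - k * dolp_dark_band)

    # Example with made-up numbers: a reading taken under strong glare
    # (high DoLP) is adjusted more than one taken under diffuse light.
    print(glare_corrected_color(0.62, dolp_dark_band=0.30))  # strong glare
    print(glare_corrected_color(0.62, dolp_dark_band=0.02))  # little glare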

To evaluate the new tool, the researchers conducted proof-of-concept testing, comparing the sensor’s performance with and without the new software when measuring leaves whose true color was already known. They found that the software substantially improved accuracy.

“The new software reduced the size of the errors tenfold when there was a lot of glare,” says Kudenov. “For example, if the color recorded by the sensor with the new software was off by 3%, the color recorded by the sensor without the software was off by 30%. And when there’s not a lot of glare, you don’t need the software as much, so the difference between measurements taken with and without it was less pronounced.”

The researchers tested the new software using a full-size hyperspectral polarization camera. Next steps include incorporating the new software into more compact visual sensors and testing it on platforms such as unmanned aerial vehicles to see how it performs in real-world situations with different crops.

“Ultimately, we would like to provide researchers and growers with a tool that is small enough and cheap enough for practical use,” says Kudenov.

The paper, “Mitigating illumination, foliage, and viewing angle dependence in hyperspectral imaging using polarimetry,” is published in the open-access journal Plant Phenomics. The paper is co-authored by Clifton Scarboro, a former Ph.D. student at NC State; William Hsieh, a former undergraduate student at NC State; Colleen Doherty, an associate professor of molecular and structural biochemistry at NC State; and Peter Balint-Kurti, a USDA-ARS research geneticist and adjunct professor of plant pathology at NC State.

The research was done with support from the National Science Foundation, under grant number 1809753, and from the U.S. Department of Agriculture’s National Institute of Food and Agriculture, under grant number 2020-67021-31961.

Note to editors: A summary of the study follows.

“Mitigating illumination, foliage, and viewing angle dependence in hyperspectral imaging using polarimetry”

Authors: Daniel Krafft, Clifton G. Scarboro, William Hsieh, Colleen Doherty, Peter Balint-Kurti, and Michael Kudenov, North Carolina State University

Published: March 22 in Plant Phenomics

DOI: 10.34133/plantphenomics.0157

Abstract: Automation of plant phenotyping using data from high-dimensional imaging sensors is at the forefront of agricultural research because of its potential to improve seasonal yield by monitoring crop health and accelerating breeding programs. A common challenge when taking images in the field relates to the specular reflection of sunlight (glare) from crop leaves, which, at certain solar incidences and sensor viewing angles, represents an unwanted signal. The research presented here involves the convergence of two parallel projects to develop an algorithm that can use polarization data to separate light reflected from leaf surfaces and light scattered from leaf tissue. The first project is a mast-mounted hyperspectral imaging polarimeter (HIP) that can image a corn field through multiple diurnal cycles over the growing season. The second project is a multistatic fiber-based (MFB) Mueller matrix-based bidirectional reflectance distribution function (mmBRDF) instrument that measures the polarized light-scattering behavior of individual maize leaves. The mmBRDF data were fit to an existing model using SCATMECH, which provided the parameters used to run Monte Carlo simulations. The simulated data were then used to train a shallow neural network that works by comparing the unpolarized two-band vegetation index (VI) with the linearly polarized data from the low-reflectance VI bands. Using GNDVI and the red edge reflectance ratio (RERR), we saw an improvement of an order of magnitude or more in mean error (ε) and a reduction ranging from 1.5 to 2.7 in their standard deviation (σε) after applying the network’s corrections to the HIP sensor data.
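As a minimal illustration of the quantities named above, the Python sketch below computes a GNDVI-style two-band index and a simple red-edge ratio from per-band reflectances, then passes the index and a polarization measurement through a one-hidden-layer network of the kind the abstract describes. The band values, the exact RERR formula, and the (untrained) weights are assumptions made for the sketch, not values from the paper.

    import numpy as np

    def gndvi(nir, green):
        # Green Normalized Difference Vegetation Index from NIR and green reflectance.
        return (nir - green) / (nir + green)

    def rerr(nir, red_edge):
        # A simple red-edge ratio (NIR over red-edge reflectance); the paper's
        # exact RERR band definitions may differ.
        return nir / red_edge

    def shallow_net(vi, dolp, w1, b1, w2, b2):
        # One hidden layer mapping (unpolarized VI, DoLP of the low-reflectance
        # band) to a corrected VI. Real weights would come from training on the
        # Monte Carlo-simulated data; here they are placeholders.
        x = np.array([vi, dolp])
        hidden = np.tanh(w1 @ x + b1)
        return float(w2 @ hidden + b2)

    # Toy usage with made-up reflectances, DoLP, and random (untrained) weights.
    rng = np.random.default_rng(0)
    vi = gndvi(nir=0.45, green=0.08)
    corrected = shallow_net(vi, dolp=0.30,
                            w1=rng.normal(size=(4, 2)), b1=np.zeros(4),
                            w2=rng.normal(size=4), b2=0.0)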

Michael Kudenov, Daniel Krafft, Matt Shipman, NC State University
