Smart application facilitates decision-making in the management of foliar diseases through images

By Wilhan Valasco dos Santos, Caroline Ferreira Freire, Gabryely Eloísa Henares Ostrufka and João Alberto de Oliveira Jr, from Agrivalle

08.08.2024 | 15:25 (UTC -3)

Technology and easy access to information have made daily decisions increasingly fast and assertive, and agriculture is no exception. With this scenario in mind, we developed a tool that enables rapid and accurate detection of disease severity, guiding the farmer's decision-making and making disease management simpler and more precise, all from a single photo taken with the producer's own cell phone. To put the software's operation in context, the following paragraphs describe the proposed technical approach.

Analyzing disease severity is crucial to understanding and effectively treating plant health problems. A common, long-established approach is the use of diagrammatic scales. However, incorporating advanced technologies such as image analysis offers new perspectives: applications that detect and quantify plant health from RGB images represent a significant advance in this field, with the potential to improve the accuracy and efficiency of severity assessment and provide valuable insights for precise interventions. Because visual scoring is inherently subjective, technological solutions are needed that minimize evaluator subjectivity and reduce the possibility of errors in the analysis.

To carry out this assessment using scales, at least three evaluators and a reference scale are required (Figure 1). Each evaluator uses the scale as a reference and performs a detailed visual inspection of the leaf to estimate the extent of the area affected by the disease.

Figure 1: diagrammatic scale for quantifying phytosanitary complexes in soybean crops (Peixoto, MJ, 2023)

With the aim of increasing the precision of the process and making it automated, we developed a computational application to analyze plant health through RGB images (Figure 2) captured by cell phones.

Figure 2: images of leaves with the presence of soybean end-of-cycle diseases (CFD) captured by cell phones

In this scenario, the software uses an algorithm that crops the region to be analyzed in the image, optimizing processing time. The image is then subdivided into parts to mitigate the adverse effects of glare and other artifacts, as shown in Figure 3.
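The crop-and-tile step can be sketched as follows. The crop box and tile size below are illustrative assumptions, not the app's actual parameters:

```python
import numpy as np

def crop_and_tile(image: np.ndarray, crop_box, tile: int):
    """Crop the region of interest, then split it into tile x tile blocks.

    crop_box is (row0, row1, col0, col1); tiles at the edges may be smaller.
    Processing each tile independently limits how far a localized glare spot
    can skew the analysis.
    """
    r0, r1, c0, c1 = crop_box
    roi = image[r0:r1, c0:c1]
    tiles = []
    for r in range(0, roi.shape[0], tile):
        for c in range(0, roi.shape[1], tile):
            tiles.append(roi[r:r + tile, c:c + tile])
    return tiles

# a synthetic 200x200 image: crop a 120x90 region, split into 64-pixel tiles
img = np.zeros((200, 200, 3), dtype=np.uint8)
parts = crop_and_tile(img, (40, 160, 30, 120), 64)  # yields 4 tiles
```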

Figure 3: original image, image fragmented into parts, binary image of fragmented parts

Another way to mitigate erroneous estimates and increase the software's accuracy is to follow some assumptions during image acquisition. These guidelines aim to improve the effectiveness of computer vision. The requirements are simple: a) the image must be of reasonable quality, at least 300 dpi (an optical resolution met by most digital cameras, including smartphones); b) the leaf must occupy most of the image; c) there must be minimal variation in shadow or light across the image.
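These acquisition guidelines could be screened automatically before analysis. A minimal sketch follows; since dpi is not stored in the pixel array, a minimum pixel dimension stands in for requirement (a), and the fraction of green-dominant pixels approximates requirement (b). All thresholds are illustrative assumptions, not the app's calibration:

```python
import numpy as np

def passes_acquisition_checks(image, min_side=1000, min_leaf_fraction=0.5):
    """Rough pre-checks mirroring the acquisition guidelines.

    a) adequate resolution, approximated by the shorter image side;
    b) the leaf fills most of the frame, approximated by the share of
       pixels where the green channel dominates.
    """
    if min(image.shape[:2]) < min_side:
        return False
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    green_dominant = (g > r) & (g > b)
    return green_dominant.mean() >= min_leaf_fraction
```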

Next, the area infected by the disease is detected using a machine learning algorithm. The algorithm converts the RGB color space to HSI and, based on brightness variations in the image, segments out the background (as illustrated in Figure 4). To refine this segmentation, a segmentation mask is applied and passed through an image erosion operator. This procedure preserves the pixels located at the center of the area of interest, prioritizing the effectiveness of the segmentation.
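The segmentation-and-erosion step can be sketched in pure NumPy. The fixed brightness threshold below is an illustrative stand-in for the app's learned brightness split, and the erosion uses a simple 3x3 square element:

```python
import numpy as np

def rgb_to_intensity(image):
    """HSI intensity channel: I = (R + G + B) / 3."""
    return image.astype(float).mean(axis=2)

def erode(mask, iterations=1):
    """Binary erosion with a 3x3 square element: a pixel survives only if
    its entire neighborhood is foreground, shrinking the mask toward its
    center (the behavior the article's erosion operator relies on)."""
    m = mask.astype(bool)
    for _ in range(iterations):
        p = np.pad(m, 1, constant_values=False)
        out = np.ones_like(m)
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                out &= p[1 + dr:p.shape[0] - 1 + dr,
                         1 + dc:p.shape[1] - 1 + dc]
        m = out
    return m

def segment_background(image, thresh=200):
    """Foreground = darker leaf pixels against a brighter background."""
    return erode(rgb_to_intensity(image) < thresh)
```

In practice the threshold would come from the trained model rather than a constant, but the mask-then-erode pipeline is the same.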

Figure 4: Cropped original RGB image; binary image after background segmentation; RGB image with background completely removed

From the HSI color space we also isolate the Hue component. This component represents the pure color of each pixel, indicating its position on the angular color spectrum. This representation can be seen in Figure 5.

Figure 5: color systems, RGB, HSI (Hue, Saturation and Intensity); source: A: https://pt.stackoverflow.com/questions/19363/por-que-verde-azul-amarelo-mas-no-rgb-amarelo-verde-e-vermelho, B: https://tableless.com.br/sobre-cor-e-webdesign, adapted by Santos W. V (2024)

In this way, thresholds are established on the hue values to determine whether each pixel represents healthy tissue or not. This allows the software to distinguish pixels corresponding to healthy plant tissue from those corresponding to diseased tissue, as shown in Figure 6.
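The hue extraction and thresholding can be sketched as below. The hue formula is the standard hexcone conversion; the "healthy green" band of 70-160 degrees is an illustrative assumption, not the app's calibrated thresholds:

```python
import numpy as np

def hue_degrees(image):
    """Hue channel (0-360 degrees) from RGB via the standard hexcone formula."""
    rgb = image.astype(float) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx = rgb.max(axis=2)
    delta = mx - rgb.min(axis=2)
    h = np.zeros_like(mx)
    nz = delta > 0
    rmax = nz & (mx == r)                 # red is the dominant channel
    gmax = nz & (mx == g) & ~rmax         # green dominant
    bmax = nz & ~rmax & ~gmax             # blue dominant
    h[rmax] = ((g - b)[rmax] / delta[rmax]) % 6
    h[gmax] = (b - r)[gmax] / delta[gmax] + 2
    h[bmax] = (r - g)[bmax] / delta[bmax] + 4
    return h * 60

def severity_percent(image, leaf_mask, healthy_band=(70, 160)):
    """Percentage of leaf pixels whose hue falls outside the assumed
    'healthy green' band; yellow/brown lesions land near 30-60 degrees."""
    h = hue_degrees(image)[leaf_mask]
    healthy = (h >= healthy_band[0]) & (h <= healthy_band[1])
    return 100.0 * (1.0 - healthy.mean())
```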

Figure 6: A) images with segmented background; B) reading of infected area

The effectiveness of the software was tested by comparing its leaf health assessments against the diagrammatic scale and against the Leaf Doctor image analysis software. Plant health in 480 images was evaluated under the same conditions. The algorithm presented a higher coefficient of determination (Figure 7) than the comparison method when correlated with visual severity, demonstrating its ability to explain both the severity predicted by Leaf Doctor and that assessed by the four evaluators.

Figure 7: linear relationship between the severity of end-of-cycle diseases being visually evaluated using a diagrammatic scale and computational evaluation

When the correlations are compared, the algorithm once again proves effective in analyzing plant health relative to the other methods, as can be seen in Figure 8: it shows a high positive correlation with both, and the three methods behave similarly.
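This kind of method comparison reduces to a Pearson correlation matrix over paired severity readings. The sketch below uses made-up numbers purely to demonstrate the computation; they are not the article's data:

```python
import numpy as np

# Hypothetical severity readings (%) for the same five leaves from the
# three methods compared in the study (values are illustrative only).
visual      = np.array([5.0, 12.0, 20.0, 35.0, 50.0])  # diagrammatic scale
leaf_doctor = np.array([6.0, 11.0, 22.0, 33.0, 48.0])  # Leaf Doctor
app         = np.array([5.5, 12.5, 21.0, 34.0, 49.0])  # proposed software

# 3x3 Pearson correlation matrix: rows/columns follow the order above
r_matrix = np.corrcoef([visual, leaf_doctor, app])

# coefficient of determination between the app and the visual reference
r2_app_vs_visual = r_matrix[0, 2] ** 2
```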

Figure 8: Pearson correlation for the three disease severity estimation methods

Based on the data presented in this article, the software proves effective for reading plant health from images, offering practicality and efficiency. Its robustness removes evaluator subjectivity from the analyses, ensuring more reliable and consistent results. Furthermore, it opens doors for other approaches to be explored using the same technical principle applied here.

Research plays a crucial role in our future, enabling the development of tools that not only facilitate farmers' daily lives but also promote sustainability and efficiency in agriculture. The software can be used both on a cell phone and via the web, and because its main input is a photo, the absence of a network signal in the field does not prevent the application from functioning, making it simple and functional for the farmer's reality.

App link: https://wilhan-valasco-santos.shinyapps.io/SeverCalc/

*By Wilhan Valasco dos Santos, Research and Development Manager at Agrivalle; Caroline Ferreira Freire, Researcher at Agrivalle; Gabryely Eloísa Henares Ostrufka, Research and Development Manager at Agrivalle; and João Alberto de Oliveira Jr, Director of Research and Development at Agrivalle
