EOSAM 2023
Open Access
J. Eur. Opt. Society-Rapid Publ.
Volume 20, Number 1, 2024
Article Number 25
Number of pages: 9
DOI: https://doi.org/10.1051/jeos/2024024
Published online: 26 June 2024

© The Author(s), published by EDP Sciences, 2024

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

1 Introduction

Ensuring an adequate food supply for the growing human population is a great challenge. Crop losses and value degradation caused by plant diseases are major constraints to food security globally. Fungicides are important tools in the management of plant diseases caused by fungal and oomycete pathogens. Currently, major crop losses are inevitable without the use of fungicides. However, reliance on fungicides is not sustainable, since all fungal pathogens tend to develop fungicide resistance [1, 2]. European legislation also restricts pesticide use due to public health concerns and the costs associated with pesticide removal from water [3, 4]. In sum, this makes the integration and use of new, non-chemical strategies crucial for continued, sustainable food production.

Successful management of fungal diseases critically depends on timely treatment. This requires the identification of the pathogen type and the inoculum concentration in the crop production system. Early detection of spores is key to enabling rapid and adapted treatment of the plants. This implies that a detection system should be adapted to the detection of low concentrations of spores in air. Various spore-trapping devices are available, but most of them require time-consuming laboratory work for the identification of spores of pathogenic fungi. There have been some attempts at developing semi-automated spore detection systems [5–8], but most of these systems are still immature and not suitable for agricultural field applications. Most of the systems currently described in the literature are based on an off-the-shelf microscope for indoor use, without considerations for online use or field conditions [7]. Commercial systems for measuring pollen are available, see for example [9, 10], which make use of digital holography and fluorescence spectroscopy on particles in flight.
The image analysis includes a preprocessing step to remove particles that are not sufficiently round and compact, which essentially means that the particles must be sufficiently spread out to not touch each other. This will remove many fungal spores before the classification starts since spores often stick together in chains or loose clumps. Analyzing particles in flight also requires a long flying path when several detection modalities are to be applied successively, which may add complexity and lead to large instruments. The spore detection system described in [11] also requires the particles to be well separated and it does not include an automatic system for spore trapping. There is thus a need for compact, robust systems adapted for in-field, real-time, and automated detection and classification of spores.

Table 1

Classification results for the test set.

The two major challenges for automated spore detection are 1) developing the right combination of illumination, optics, and sensors to achieve sufficient image quality without losing the required area coverage, depth of focus, and acquisition speed, and 2) training a classifier that can handle different types of spores together with pollen, dust, and water droplets at different concentration levels. In a project funded by the Norwegian Research Council, END-IT, we have developed a real-time optical measurement system for the detection of spores from grey mould (Botrytis cinerea) and powdery mildew (various species). The system is based on a modified microscope combined with an automatic system for spore trapping and air sampling. The system has been used in field trials, and work is ongoing on machine vision algorithms for detecting and classifying fungal spores in the presence of pollen, dust, and other aerosols.

2 Optical imaging of spores

2.1 Microscope measurement systems

The measurement system is a customized microscope equipped with an automatic system for air sampling and trapping of fungal spores on a transparent tape, see Figures 1 and 2. It consists of a 10X Nikon Achromatic Finite Conjugate Objective, imaging 1 mm² of the tape onto a color CMOS camera (BFS-U3-200S6C-C, Sony IMX183, 20 megapixels, pixel size 2.4 μm). The microscope is mounted on a motorized linear translation stage for automatic focusing on the spores. A 3D-printed housing holds a LED illumination system and a guiding support for the tape. The illumination system is designed to provide images of the spores in transmission (bright-field imaging) and in scattering (dark-field imaging), see Figure 2. These two imaging schemes are combined to provide better discrimination of the spores, see Section 3. The spore collection system consists of the tape, an automatic system for moving the tape, and an air sampling system, see Figure 2. The tape is made from Kapton® polyimide. It was chosen based on trial and error with different types of tape, and provided the best compromise in terms of transparency, color, and adhesive properties.

Figure 1

Picture of the automatic sampling and detection system for spores. The total size of the present system is 30 cm by 30 cm by 50 cm. See Figure 2 for a description of the various components.

Figure 2

The measurement and sampling system consists of a microscope with 10× magnification, an air sampling system, a tape for trapping the spores, and an illumination system with bright- and dark-field configurations. The air intake at the top is also used to reflect and guide light from the bright-field illumination towards the field of view of the microscope. When air is drawn in, small spores (<100 μm) follow the air flow and impact the tape at a position corresponding to the FOV of the microscope, where they are trapped on the adhesive side.

Bright- and dark-field illumination are two illumination schemes that enhance different features in a biological sample [12]. Bright-field microscopy involves illuminating the sample in transmission, typically using broad-spectrum light. The contrast in the image results from differential attenuation of light by parts of the sample with distinct density. This is the simplest form of microscopy, with a typically high signal-to-noise ratio (SNR). However, for biological samples, which present very small variations in density, the contrast can be poor. Dark-field microscopy, on the other hand, works by excluding the unscattered beam from the image. As a consequence, the field around the specimen is generally dark, while the specimen itself is bright. This allows small contrast differences between, e.g., a specimen of interest and the background to be enhanced. However, since the principle is based on making use of scattered light, a high SNR is achieved at the cost of long integration times in comparison to bright-field microscopy. In our measurement system, the bright-field illumination is provided by a white LED (Moonstone 3 W High Brightness 10,000 K LED Light Source), roughly collimated by a lens with a 5 cm focal length.

For the dark-field illumination, a “Dark Field ring” from Advanced Illumination [13] is utilized, in which fifteen white LEDs are accurately positioned in a circular arrangement with a radius of 2.5 cm. The aimed LEDs illuminate the tape at an angle of approximately 45°, ensuring circular and symmetrical illumination of the microscope’s field of view. The microscope objective focuses light that has been scattered by the sample at this 45° angle onto the camera chip.

A typical measuring sequence consists of:

  1. drawing a fresh part of tape,

  2. drawing in air,

  3. focusing the microscope on the tape (see Sect. 2.3),

  4. acquiring bright and dark images around focus.

The best focus image is then used to train the model or for prediction.

2.2 Optical resolution

To determine the optical resolution of the microscope through the tape, we used a reference object consisting of gold microstructure lanes deposited on silicon. The lane widths are 2.5, 5, 10, and 20 μm, and all have a height of 250 nm. The reference object was imaged through the tape, see Figure 3 below for an image of the 2.5 μm and 5 μm lanes. The optical resolution was determined qualitatively from the images by assuming that when the two sides of a lane cannot be distinguished, the lane cannot be resolved.

Figure 3

Reference object for measuring the optical resolution of the optical system through the tape. See description in the text. The 2.5 μm (left) and 5 μm (right) vertical lanes can be distinguished in the picture. The squares at the bottom are gold pads with 200 μm × 200 μm dimensions.

In our optical system, this occurs for the lane with a width of 2.5 μm, while the lane with a width of 5 μm can be resolved, see Figure 4. The optical resolution through the tape thus falls within the range of 2.5–5 μm.

Figure 4

Intensity across the 2.5 μm, 5 μm, and 10 μm lanes. The sides of the lane are roughly resolved for the 5 μm lane but not for the 2.5 μm lane, giving an optical resolution falling between 2.5 μm and 5 μm.

2.3 Focusing precision

Each time a fresh tape sample is pulled, a slight change in the distance between the tape and the microscope lens occurs. The same happens after drawing in air to collect new spores, attributed to the air pressure on the tape. Consequently, it becomes necessary to refocus the microscope for each of these stages, as commented in Section 2.1.

This is achieved by following the sequence outlined below. Assuming the stepper is positioned at the last found focus point, it is then moved back, for example, 100 full steps from this position, representing approximately 250 μm. Subsequently, the stepper is moved in the opposite direction using a number N of sequences of 32 times 1/32nd-steps, and images are captured at each new position. Sub-stepping, i.e., using 1/32nd-steps, is utilized because it was proven experimentally to improve the focusing performance. A focus metric (Brenner algorithm [14]) is computed from each image, to quantify the degree of focusing. The algorithm reads as follows:

Let S denote an image with N × N pixels, where each pixel of S is an integer and its position is indexed by i and j, i.e., S_ij. The Brenner algorithm consists of calculating the numbers d, d_x, and d_y, defined in equations (1)–(3):

$$ d = d_x + d_y, \tag{1} $$

with

$$ d_x = \sum_{i=0}^{N-1}\sum_{j=0}^{N-3}\left(S_{i,j+2}-S_{i,j}\right)^2, \tag{2} $$

$$ d_y = \sum_{i=0}^{N-3}\sum_{j=0}^{N-1}\left(S_{i+2,j}-S_{i,j}\right)^2. \tag{3} $$
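As a minimal sketch, the Brenner metric of equations (1)–(3) can be computed with numpy; the function name and the float conversion are our own choices, and this is an illustration rather than the authors' implementation:

```python
import numpy as np

def brenner_focus(image: np.ndarray) -> float:
    """Brenner focus metric: sum of squared differences between pixel
    values two positions apart, along both image axes (eqs. (1)-(3))."""
    s = image.astype(np.float64)               # avoid integer overflow in the squares
    dx = np.sum((s[:, 2:] - s[:, :-2]) ** 2)   # eq. (2): shifts along one axis
    dy = np.sum((s[2:, :] - s[:-2, :]) ** 2)   # eq. (3): shifts along the other axis
    return float(dx + dy)                      # eq. (1)
```

A sharply focused image contains strong intensity gradients and therefore yields a larger value than a defocused one, which is why the motor position maximizing this metric is taken as the focus candidate.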

Typical focusing curves are shown in Figure 5. The focusing curves present in most cases at least two peaks, which we believe correspond to layer interfaces in the multilayer structure of the film. In particular, the last peak is the position of the surface of the tape exposed to air and fungal spores. When the tape is free of dust, fungal spores, or any microscopic objects that can provide strong gradients, the focusing curves may present one to two additional peaks, corresponding to other layer structures, see for example the first two shoulders at ~185 μm and ~208 μm. For final focusing, we choose the last peak and estimate its position with additional precision, using a sub-pixel algorithm. Since we can move the motor in 1/32nd steps, the algorithm is applied with 1/32nd-step precision to the data acquired with 1-step precision. This is achieved through the following steps:

  1. Calculate the convolution of the focusing curves with a normal distribution with standard deviation σ. For σ, we choose values between 1.5 and 3.

  2. In the convolved function y_i, find the maxima and choose the one corresponding to the surface of the tape, i.e., the last one. Denote its position by t_imax and its ordinate by y_imax.

  3. Interpolate the position t_sub using a polynomial of order 2, defined in equation (4):

Figure 5

Typical focus curve obtained on a fresh tape. A fresh tape typically presents four extrema corresponding to what we believe are different layer borders in the tape. The focusing sequence is repeated 100 times and the positions of the two last peaks are calculated with 1/32nd-step precision, see description in the text. The inserts (left and right) show the layer thickness and the position of the adhesive side, as calculated for each run.

$$ t_{\mathrm{sub}} = t_{i_{\max}} + 0.5\,\frac{y_{i_{\max}-1} - y_{i_{\max}+1}}{y_{i_{\max}+1} - 2y_{i_{\max}} + y_{i_{\max}-1}}, \tag{4} $$

where y_{imax+1} and y_{imax−1} are the ordinates of the two points nearest to (t_imax, y_imax).

  4. In the focusing interval with 1/32nd-step precision, i.e., t = 0, 1/32, 2/32, ..., j/32, ..., M − 1 with 0 ≤ j ≤ 32(M − 1), choose the position nearest to t_sub.
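The smoothing, last-peak selection, and parabolic refinement in the steps above can be sketched as follows. This is our own simplified illustration, assuming a uniformly sampled focus curve; the function name and the explicit, truncated Gaussian kernel are our choices:

```python
import numpy as np

def subpixel_focus_position(t, focus_vals, sigma=2.0):
    """Smooth the focus curve with a Gaussian of std sigma, locate the last
    local maximum (the tape surface), and refine it with eq. (4)."""
    t = np.asarray(t, dtype=float)
    y_raw = np.asarray(focus_vals, dtype=float)

    # Step 1: convolve with a normal distribution (kernel truncated at 3 sigma)
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    y = np.convolve(y_raw, kernel, mode="same")

    # Step 2: local maxima (larger than both neighbours); keep the last one
    idx = np.arange(1, len(y) - 1)
    peaks = idx[(y[idx] > y[idx - 1]) & (y[idx] > y[idx + 1])]
    i = peaks[-1]

    # Step 3: parabolic (order-2) sub-sample refinement, eq. (4)
    delta = 0.5 * (y[i - 1] - y[i + 1]) / (y[i + 1] - 2.0 * y[i] + y[i - 1])
    return t[i] + delta * (t[1] - t[0])
```

In step 4 of the procedure, the returned position would then be rounded to the nearest reachable 1/32nd-step motor position.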

To test the efficiency of this algorithm, the focusing procedure was repeated 100 times on a fresh tape, devoid of dust and spores. The position of the surface of the tape was calculated using the sub-pixel algorithm above. The evolution of the sub-pixel position for each focusing sequence is shown in the top-right insert of Figure 5. The last layer thickness is also evaluated with 1/32nd-step precision, as shown in the middle-left insert. The measured position of the surface presents a long-term variation of roughly 5 μm over the full measuring sequence. The short-term variation, i.e., from focusing sequence to focusing sequence, is smaller, typically between 0.1 μm and 1 μm. The layer thickness is seen to be contained within a 20.5 ± 0.5 μm range.

After the position of the surface is evaluated, the motor is moved to that position. A new set of M images, for example M = 5, around the previously found optimal focus position is acquired, to cope with the variation of the focus position between the center and the corners of the image. The new set of images around the focus is then used for further image processing and as input to the neural network model.

3 Image processing

3.1 Preprocessing

After the best focused image has been chosen, a composite RGB image is created by replacing the blue channel of the bright-field (BF) image with the mean value of all the color channels of the dark-field (DF) image. This is calculated as:

  1. RGB_composite[:, :, :] = RGB_BF[:, :, :]

  2. RGB_composite[:, :, Blue] = (RGB_DF[:, :, Red] + RGB_DF[:, :, Green] + RGB_DF[:, :, Blue]) / 3
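In numpy, the two steps above amount to the following; channel order RGB is assumed, and the function name is ours:

```python
import numpy as np

def make_composite(rgb_bf: np.ndarray, rgb_df: np.ndarray) -> np.ndarray:
    """Start from the bright-field image (step 1) and overwrite its blue
    channel with the mean of the three dark-field channels (step 2)."""
    composite = rgb_bf.copy()                          # step 1
    df_mean = rgb_df.astype(np.float64).mean(axis=2)   # (R + G + B) / 3
    composite[:, :, 2] = df_mean.astype(rgb_bf.dtype)  # step 2, Blue = index 2
    return composite
```

The averaging is done in floating point before casting back to the input dtype, so that summing three 8-bit channels does not overflow.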

An example of the process is shown in Figure 6. Combining the bright- and dark-field images provides better discrimination against the background. It also highlights the living spores, since the spherical shape and high water content of live spores make them function like tiny lenses. Dead spores tend to dry out and therefore do not light up in the same way as live spores.

Examples of images of relevant spores are shown in Figures 6–8. The resolution through the film is good enough to enable distinction between the two types of spores based on their shape and size. We observe that mildew spores sometimes exhibit a crown of small, bright spots in the bright-field illumination. This phenomenon is attributed to the lens-like properties of the oblong mildew structures, with their bodies acting as miniature lenses. The combined effect of the “mildew lens” and the microscope lens is the formation of an image of the LEDs mounted on the Dark Field ring. This effect is only present in fresh, water-filled mildew, as the biological structure of dried-out mildew has collapsed, leading to the disappearance of its lens-like properties.

Figure 6

Combination of bright and dark field image into a composite red-green image. We only show a small part of the image to highlight the size difference between spores of powdery mildew (blue box) and grey mould (orange box).

Figure 7

Cucumber powdery mildew. Dark spores are dried out (dead).

Figure 8

Example of training data that is input to the model development. The spore-like structures in the lower-right image are water droplets.

3.2 Model development

The model is based on the YOLOv5 neural network [15], which is commonly used to detect objects in images for a large variety of applications [16–20]. The choice of YOLOv5s as the basis for the model development, instead of other, larger models, is a compromise between the size and the performance of the model. We have a relatively limited number of distinct features in the images and only a few object classes, so using a larger model would not necessarily give much better results.

In the training of the network, the first layers of the model are kept fixed and only the last layer of the network is retrained with images of powdery mildew and grey mould (acquired in field tests and in the lab). The images are split into three datasets: training, validation, and test sets. The training and validation sets are used during the training of the model. The test set is used only for reporting the model results. Since we have a limited number of spores in the images, the images containing spores are augmented with flipped versions of the image (flip left-right and flip up-down). Negative images (without spores) are easily available, so they are included without augmentation.
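The flip augmentation can be sketched as follows; we assume YOLO-style normalized (x_center, y_center, width, height) box annotations, which is the label format used by the YOLOv5 repository, and the function name is ours:

```python
import numpy as np

def augment_with_flips(image: np.ndarray, boxes: np.ndarray):
    """Return the original image plus left-right and up-down flipped copies,
    with normalized (x_center, y_center, w, h) boxes updated accordingly."""
    lr_boxes = boxes.copy()
    lr_boxes[:, 0] = 1.0 - lr_boxes[:, 0]  # mirror x_center for fliplr
    ud_boxes = boxes.copy()
    ud_boxes[:, 1] = 1.0 - ud_boxes[:, 1]  # mirror y_center for flipud
    return [
        (image, boxes),
        (np.fliplr(image), lr_boxes),
        (np.flipud(image), ud_boxes),
    ]
```

Because the box coordinates are normalized to [0, 1], mirroring reduces to subtracting the center coordinate from 1, while the widths and heights are unchanged.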

4 Experimental results

Several field tests were conducted during spring and summer 2023 in various greenhouses in the Oslo region in Norway. The measuring system was positioned near the plants and left to collect data for several days (Figure 9). Both greenhouses with healthy plants and greenhouses with disease outbreaks were investigated.

Figure 9

Measuring system collecting data in a cucumber greenhouse with powdery mildew attack.

The model development is still ongoing, but preliminary results indicate that it is possible to detect the relevant spores and separate them from dust and other objects in the images. Current models give mean average precision (mAP50) of 0.92 for grey mould and 0.98 for cucumber powdery mildew, see Table 1.

Figure 10 shows examples of classification results for images with spores of grey mould and cucumber powdery mildew. We see that the model is able to correctly detect the powdery mildew even when the image is not perfectly focused. The model also manages to differentiate between the elliptic grey mould spores and circular water droplets.

Figure 10

Classification examples for grey mould (top) and cucumber powdery mildew (bottom).

5 Discussion

The system performs well in terms of classification accuracy for the spores that are trapped on the tape. However, more experiments are needed to determine which proportion of the spores in the air of a typical greenhouse will effectively make their way to our system. Our system has an air intake that draws air in and an air outlet that pushes air out. The spores are typically 10–60 μm long. Airborne particles with a diameter < 10 μm and a density similar to water are expected to behave passively in relation to air turbulence. In contrast, airborne particles with a diameter of 60 μm are more influenced by their mass, exhibiting higher inertia against being carried by the air flow [21]. Airborne particles with a diameter < 10 μm have a falling speed of approximately 0.25 cm/s, settling to the ground in about 11 min from a height of 2 m (the average height of a cucumber leaf). Conversely, for a 60 μm particle, the falling speed is around 10.8 cm/s, requiring only 18 s to reach the ground (see footnote 1). In practical terms, grey mould spores may thus remain suspended in the air for a long time, potentially drifting over long distances in a greenhouse, depending on airflow and turbulence. Mildew spores, which spend a shorter time suspended in the air, will probably be less affected by airflow and drift over much shorter distances. In a large, commercial greenhouse, a large spore “emitted” at one side of the greenhouse will thus probably not reach the measuring system before drying out or colliding with infrastructure or another plant. Statistical studies on the concentration of spores in the air are needed to evaluate the full potential of a detection system for, e.g., mildew spores based on air collection.
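The settling numbers quoted above follow from the Stokes formula given in footnote 1; a quick numerical check can be sketched as below, where the standard-condition air viscosity μ ≈ 1.81 × 10⁻⁵ Pa·s is our assumption:

```python
def settling_velocity(diameter_m, density=1000.0, g=9.81, mu=1.81e-5):
    """Stokes gravitational settling velocity Vs = rho * g * d**2 / (18 * mu)
    for a small particle of water-like density (see footnote 1 and [22])."""
    return density * g * diameter_m ** 2 / (18.0 * mu)

# A 60 um particle settles at roughly 10.8 cm/s and falls 2 m in about 18 s,
# while a 10 um particle settles an order of magnitude more slowly.
v60 = settling_velocity(60e-6)
fall_time = 2.0 / v60
```

Since the velocity scales with the diameter squared, the factor-of-six size difference between small and large spores translates into a factor of ~36 in settling speed, which is what separates the drifting grey mould spores from the quickly settling mildew spores.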

Another important assumption in our system is that a spore that is drawn in is a spore trapped on the tape at some position in the FOV. This is a strong assumption, which requires that 1) the spores are not trapped along the side of the funnel on their way to the tape; 2) all the spores that make their way down to the tape impact the tape; and 3) all the spores that impact the tape adhere to it. We have not made any studies of the number of spores being trapped on the tape versus the number of spores entering the funnel, and we do not know the efficiency of our collection system. For example, a more efficient design of the geometry of the collection system might help in directing the spores to the FOV region and avoiding collisions of the spores with the walls. This should be optimized using, e.g., a fluid-mechanics simulator, considering the specificities of the spores of interest (size and mass). The geometry of the impact zone might also be further improved by optimizing, e.g., the width of the tape, its orientation, and the geometry of the chamber around the tape for maximal impact. Finally, the choice of tape might also be optimized with respect to transparency and “trapping” efficiency.

The imaging system and detection algorithm operate under the assumption that only a limited number of spore types need to be identified, and that the captured images are not contaminated by very large amounts of dust, pollen, or water droplets. In the greenhouse tests, this has so far been a valid assumption. However, for coping with a more generic environment, additional detection modalities and extensive testing are probably needed. For example, fluorescence and multispectral detection could be added as a means to detect a wider range of spores.

The system we have described currently makes use of a stepper motor to move the microscope. The stepper motor amounts to a large part of the weight and volume of the system and may be replaced by, e.g., a tunable lens [23] to reduce the size and improve the measurement speed.

The effect of the focusing precision and the optical resolution on the model performance has not been extensively evaluated. Testing indicates that the model is relatively robust for poorly focused images of mildew spores, but the results for the much smaller spores of grey mould deteriorate more quickly with inaccurate focus. This will also be the subject of further studies.

There is a very broad corpus of work showing that optical spectroscopy is an interesting tool for the diagnosis of plant diseases. Wavelength ranges such as the visible, near-infrared, and mid-infrared have been utilized for diagnosis and monitoring of plant diseases in a nondestructive way, see [24, 25] and references therein. Techniques such as Raman, Fourier-transform infrared (FTIR), and fluorescence spectroscopy, which can provide information about the molecular fingerprint of pathogens, as well as reflectance spectroscopy, have proven effective for early detection and identification of plant pathogens, significantly enhancing disease management capabilities, see [26–28] and references therein. For the case of mildew, laser-induced fluorescence has been proven to provide presymptomatic detection of powdery mildew on wheat leaves shortly after fungus inoculation [29]. Also, UV-fluorescence spectroscopy has been used for the detection of powdery mildew in grapevine [30]. VIS-NIR reflectance spectroscopy has also been assessed for the rapid detection of, e.g., Botrytis cinerea and powdery mildew on wine grapes [31]. Finally, an interesting approach for indirect detection of strawberry powdery mildew has been demonstrated in [32] by remotely measuring the CO2 concentration close to the crop, since the ability of infected plants to absorb CO2 has been shown to be altered [33].

From this comprehensive but not exhaustive list of works, it is evident that spectral information can add new dimensions to the microscopy detection system discussed in this paper, particularly for detecting powdery mildew and botrytis. Many existing studies that utilize spectroscopy for local inspections close to the leaves rely on bulky and cumbersome equipment. For instance, standard commercial FTIR systems, which are typically expensive and heavy, are often customized for laboratory use and are not well-suited for field applications where large areas need to be monitored in real time. Remote spectral sensing presents a promising solution to overcome these challenges, as it allows for the monitoring of leaves and crops over large areas from a few isolated positions under ideal conditions. The system described in this article bridges the gap between costly, heavy equipment and remote sensing systems by offering a potential for compactness and affordability. It can be deployed to monitor large areas and does not depend on direct leaf inspection, making it relatively unaffected by the initial location or timing of disease development on the crops. As long as spores are present in the air, the suction system stands ready to capture and present them to the imaging system.

6 Conclusions

We have developed a real-time optical measurement system for non-contact measurement of fungal spores in protected crops such as strawberries, tomatoes, and cucumbers. The system combines a spore collection system with a customized microscope using both bright- and dark-field illumination to detect the spores. The collection of spores is achieved by sampling air and trapping spores on a tape. A YOLOv5 neural network is trained to identify spores of powdery mildew and grey mould. Current models give a mean average precision (mAP50) of 0.92 for grey mould and 0.98 for cucumber powdery mildew. The measurement system has been tested in the field under real conditions, in several greenhouses in the Oslo region in Norway. Additional work is needed to estimate the collection rate of the current measurement system, as ground truth for the concentration of spores in air is challenging to obtain in the greenhouse. Also, monitoring of additional spore types could be achieved by our system by adding detection modalities, e.g., multispectral imaging and/or fluorescence detection.

Funding

The work presented in this article was conducted as part of the “Environmentally friendly fungal disease management in protected crop production using plant genetic resources and sensor technology” (END-IT) project funded by a grant of the Research Council of Norway under the FFL-JA-Research funds for agriculture and food industry.

Conflicts of interest

The authors declare that they have no competing interests to report.

Data availability statement

Data associated with this article cannot be disclosed due to legal/ethical reasons.

Author contribution statement

GB, KK and KHH contributed to the conceptualization of the idea. The development of hardware was performed by GB and KHH. The experiments and collection of data were performed by GB and KK. The development of image processing algorithms was performed by KK. GB and KK wrote the manuscript with feedback from KHH. All authors discussed the results and contributed to the final manuscript.


1

The gravitational settling velocity for a particle of diameter d and density ρ is V_s = ρgd²/(18μ), where g is the gravitational acceleration and μ the air viscosity, see [22].

References

  1. Hahn M. (2014) The rising threat of fungicide resistance in plant pathogenic fungi: Botrytis as a case study, J. Chem. Biol. 7, 133–141. https://doi.org/10.1007/s12154-014-0113-1.
  2. McGrath M.T. (2001) Fungicide resistance in cucurbit powdery mildew: Experiences and challenges, Plant Dis. 85, 236–245. https://doi.org/10.1094/PDIS.2001.85.3.236.
  3. Coelho S. (2009) European pesticide rules promote resistance, researchers warn, Science 323, 450. https://doi.org/10.1126/science.323.5913.450.
  4. Heimbach U., Kral G., Niemann P. (2002) EU regulatory aspects of resistance risk assessment, Pest Manag. Sci. 58, 9, 935–938. https://doi.org/10.1002/ps.538.
  5. McLaughlin R.P., Mason G.S., Miller A.L., Stipe C.B., Kearns J.D., Prier M.W., Rarick J.D. (2016) Note: A portable laser induced breakdown spectroscopy instrument for rapid sampling and analysis of silicon-containing aerosols, Rev. Sci. Instrum. 87, 5. https://doi.org/10.1063/1.4949506.
  6. Blank R., Vinayaka P.P., Tahir M.W., Yong J., Vellekoop M.J., Lang W. (2016) Comparison of several optical methods for an automated fungal spore sensor system concept, IEEE Sensors J. 16, 5596–5602. https://doi.org/10.1109/JSEN.2016.2567538.
  7. Tahir M.W., Zaidi N.A., Blank R., Vinayaka P.P., Vellekoop M.J., Lang W. (2017) Fungus detection through optical sensor system using two different kinds of feature vectors for the classification, IEEE Sensors J. 17, 5341–5349. https://doi.org/10.1109/JSEN.2017.2723052.
  8. Wang Y., Zhang X., Taha M.F., Chen T., Yang N., Zhang J., Mao H. (2023) Detection method of fungal spores based on fingerprint characteristics of diffraction-polarization images, J. Fungi 9, 1131. https://doi.org/10.3390/jof9121131.
  9. Website of Swisens AS, accessed on 28 May 2024, https://www.swisens.ch/en/swisenspoleno-mars.
  10. Sauvageat E., Zeder Y., Auderset K., Calpini B., Clot B., Crouzy B., Konzelmann T., Lieberherr G., Tummon F., Vasilatou K. (2020) Real-time pollen monitoring using digital holography, Atmos. Meas. Tech. 13, 1539–1550. https://doi.org/10.5194/amt-13-1539-2020.
  11. Wang Y., Mao H., Xu G., Zhang X., Zhang Y. (2022) A rapid detection method for fungal spores from greenhouse crops based on CMOS image sensors and diffraction fingerprint feature processing, J. Fungi 8, 4, 374. https://doi.org/10.3390/jof8040374.
  12. Bradbury S. (1998) Introduction to light microscopy, 2nd ed., Bios Scientific Pub Ltd.
  13. Compact Aimed Dark Field RL2115, website of Advanced Illumination, accessed on 28 May 2024, https://www.advancedillumination.com/.
  14. Brenner J.F., Dew B.S., Horton J.B., King J.B., Neirath P.W., Sellers W.D. (1971) An automated microscope for cytologic research, J. Histochem. Cytochem. 24, 100–111.
  15. YOLOv5, GitHub repository, accessed on 28 May 2024, https://github.com/ultralytics/yolov5.
  16. Wang H., Zhang S., Zhao S., Wang Q., Li D., Zhao R. (2022) Real-time detection and tracking of fish abnormal behavior based on improved YOLOV5 and SiamRPN++, Comput. Electron. Agric. 192, 106512. https://doi.org/10.1016/j.compag.2021.106512.
  17. Jing Y., Ren Y., Liu Y., Wang D., Yu L. (2022) Automatic extraction of damaged houses by earthquake based on improved YOLOv5: A case study in Yangbi, Remote Sens. 14, 2, 382. https://doi.org/10.3390/rs14020382.
  18. Fang Y., Guo X., Chen K., Zhou Z., Ye Q. (2021) Accurate and automated detection of surface knots on sawn timbers using YOLO-V5 model, BioResources 16, 3, 5390–5406. https://doi.org/10.15376/biores.16.3.5390-5406.
  19. Mathew M., Mahesh T.Y. (2022) Leaf-based disease detection in bell pepper plant using YOLO v5, SIViP 16, 841–847. https://doi.org/10.1007/s11760-021-02024-y.
  20. Mushtaq F., Ramesh K., Deshmukh S., Ray T., Parimi C., Tandon P., Jha P.K. (2023) Nuts&bolts: YOLO-v5 and image processing based component identification system, Eng. Appl. Artif. Intell. 118, 105665. https://doi.org/10.1016/j.engappai.2022.105665.
  21. Hinds W.C. (1999) Aerosol technology: Properties, behavior, and measurement of airborne particles, John Wiley & Sons.
  22. Colbeck I., Lazaridis M. (eds) (2014) Aerosol science: Technology and applications, 1st ed., John Wiley & Sons, New York, pp. 89–118.
  23. Chen L., Ghilardi M., Busfield J.J.C., Carpi F. (2021) Electrically tunable lenses: A review, Front. Robot. AI 8, 678046. https://doi.org/10.3389/frobt.2021.678046.
  24. Zahir S.A.D.M., Omar A.F., Jamlos M.F., Azmi M.A.M., Muncan J. (2022) A review of visible and near-infrared (Vis-NIR) spectroscopy application in plant stress detection, Sens. Actuators A Phys. 338, 113468.
  25. Nißler R., Müller A.T., Dohrman F., Kurth L., Li H., Cosio E.G., Flavel B.S., Giraldo J.P., Mithöfer A., Kruss S. (2022) Detection and imaging of the plant pathogen response by near-infrared fluorescent polyphenol sensors, Angew. Chem. Int. Ed. 61, e202108373.
  26. Farber C., Mahnke M., Sanchez L., Kurouski D. (2019) Advanced spectroscopic techniques for plant disease diagnostics. A review. TrAC Trends Anal.l Chem. 118, 43–49. ISSN 0165-9936. [CrossRef] [Google Scholar]
  27. Kumar R., Pathak S., Prakash H., Priya U., Ghatak A. (2021) Application of spectroscopic techniques in early detection of fungal plant pathogens, in: Kurouski D. (ed), Diagnostics of Plant Diseases. IntechOpen, London, UK. [Google Scholar]
  28. Khaled A.Y., Abd Aziz S., Bejo S.K., Nawi N.M., Seman I.A., Onwude D.I. (2018) Early detection of diseases in plant tissue using spectroscopy – applications and limitations. Appl. Spectrosc. Rev. 53, 1, 36–64. [CrossRef] [Google Scholar]
  29. Bürling K., Hunsche M., Noga G. (2012) Presymptomatic detection of powdery mildew infection in winter wheat cultivars by laser-induced fluorescence. Appl. Spectrosc. 66, 12, 1411–1419. [CrossRef] [Google Scholar]
  30. Bélanger M.C., Roger J.M., Cartolaro P., Viau A.A., Bellon-Maurel V. (2008) Detection of powdery mildew in grapevine using remotely sensed UV-induced fluorescence. Int. J. Remote Sens. 29, 6, 1707–1724. [CrossRef] [Google Scholar]
  31. Beghi R., Giovenzana V., Brancadoro L., Guidetti R. (2017) Rapid evaluation of grape phytosanitary status directly at the check point station entering the winery by using visible/near infrared spectroscopy. J. Food Eng. 204, 46–54. [CrossRef] [Google Scholar]
  32. H. Pham, Y. Lim, A. Gardi, R.A. Sabatini, Novel Bistatic LIDAR system for early-detection of plant diseases from unmanned aircraft, in: Proceedings of the 31th Congress of the International Council of the Aeronautical Sciences (ICAS 2018), Belo Horizonte, Brazil, 2018. [Google Scholar]
  33. Gordon T.R., Duniway J.M. (1982) Effects of powdery mildew infection on the efficiency of CO2 fixation and light utilization by sugar beet leaves. Plant Physiol. 69, 1, 139–142. [CrossRef] [Google Scholar]

All Tables

Table 1

Classification results for the test set.

All Figures

Figure 1

Photograph of the automatic spore sampling and detection system. The total size of the present system is 30 cm × 30 cm × 50 cm. See Figure 2 for a description of the various components.

Figure 2

The measurement and sampling system consists of a microscope with 10× magnification, an air sampling system, a tape for trapping the spores, and an illumination system with bright- and dark-field configurations. The air intake at the top is also used to reflect and guide light from the bright-field illumination towards the field of view (FOV) of the microscope. When air is drawn in, small spores (<100 μm) follow the air flow and impact the tape at a position corresponding to the FOV of the microscope, where they are trapped on the adhesive side.

Figure 3

Reference object for measuring the optical resolution of the optical system through the tape; see description in the text. The 2.5 μm (left) and 5 μm (right) vertical lanes can be distinguished in the picture. The squares at the bottom are gold pads measuring 200 μm × 200 μm.

Figure 4

Intensity profiles across the 2.5 μm, 5 μm, and 10 μm lanes. The sides of the lanes are roughly resolved for the 5 μm lane but not for the 2.5 μm lane, giving an optical resolution between 2.5 μm and 5 μm.
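Whether a lane counts as resolved in such an intensity profile can be quantified by the contrast of the modulation. The following is a minimal sketch of that idea, not the procedure used in the paper; the synthetic profiles and the Rayleigh-like 26% contrast threshold are illustrative assumptions:

```python
import numpy as np

def lane_contrast(profile):
    """Michelson contrast of an intensity profile across a lane.

    A lane is taken as resolved when the dip between its bright
    sides is deep enough, i.e. the contrast exceeds a threshold.
    """
    i_max, i_min = profile.max(), profile.min()
    return (i_max - i_min) / (i_max + i_min)

# Synthetic profiles (illustrative values): a resolved 5 um lane
# shows strong modulation, an unresolved 2.5 um lane is nearly flat.
x = np.linspace(0, 2 * np.pi, 100)
resolved = 0.6 + 0.4 * np.cos(x)      # deep dip
unresolved = 0.95 + 0.05 * np.cos(x)  # shallow dip

print(lane_contrast(resolved) > 0.26)    # resolved
print(lane_contrast(unresolved) > 0.26)  # not resolved
```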

Figure 5

Typical focus curve obtained on a fresh tape. The curve typically comprises four extrema corresponding to what we believe are borders between different layers in the tape. The focusing sequence is repeated 100 times and the positions of the last two peaks are calculated with a precision of 1/32nd of a step; see description in the text. The inserts (left and right) show the layer thickness and the position of the adhesive side, as calculated for each run.
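A focus curve of this kind can be built from a per-position sharpness score and its extrema located as local maxima. The sketch below uses a Brenner-type gradient measure in the spirit of [14]; it is an illustrative assumption, not the exact focus metric or peak-refinement scheme of the present system, and the example curve values are made up:

```python
import numpy as np

def brenner_focus(image):
    """Brenner gradient sharpness score: sum of squared differences
    between pixels two columns apart; larger means sharper."""
    img = np.asarray(image, dtype=float)
    diff = img[:, 2:] - img[:, :-2]
    return float((diff ** 2).sum())

def focus_peaks(scores):
    """Indices of strict local maxima in a focus-vs-position curve."""
    s = np.asarray(scores, dtype=float)
    return [i for i in range(1, len(s) - 1) if s[i - 1] < s[i] > s[i + 1]]

# Illustrative focus curve with two peaks (e.g. two layer borders).
curve = [1, 3, 9, 4, 2, 5, 8, 3, 1]
print(focus_peaks(curve))  # [2, 6]
```

A perfectly flat image gives a Brenner score of zero, so the score only responds to structure coming into focus.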

Figure 6

Combination of the bright- and dark-field images into a composite red-green image. Only a small part of the image is shown, to highlight the size difference between spores of powdery mildew (blue box) and grey mould (orange box).
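Building such a composite amounts to assigning each modality to one colour channel of an RGB image. A minimal numpy sketch, assuming bright field maps to red and dark field to green (the channel assignment is our assumption; the figure does not state it):

```python
import numpy as np

def red_green_composite(bright, dark):
    """Merge two single-channel images into an RGB composite:
    bright field -> red channel, dark field -> green channel."""
    bright = np.asarray(bright, dtype=np.uint8)
    dark = np.asarray(dark, dtype=np.uint8)
    rgb = np.zeros(bright.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = bright  # red   <- bright field
    rgb[..., 1] = dark    # green <- dark field
    return rgb            # blue channel stays zero

bf = np.full((4, 4), 200, dtype=np.uint8)  # dummy bright-field frame
df = np.full((4, 4), 50, dtype=np.uint8)   # dummy dark-field frame
comp = red_green_composite(bf, df)
print(comp.shape)  # (4, 4, 3)
```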

Figure 7

Cucumber powdery mildew. Dark spores are dried out (dead).

Figure 8

Examples of training data used as input to the model development. The spore-like structures in the lower-right image are water droplets.

Figure 9

The measuring system collecting data in a cucumber greenhouse during a powdery mildew attack.

Figure 10

Classification examples for grey mould (top) and cucumber powdery mildew (bottom).
