Journal ArticleDOI

Detecting maize leaf water status by using digital RGB images

20 Feb 2014-International Journal of Agricultural and Biological Engineering (Chinese Society of Agricultural Engineering)-Vol. 7, Iss: 1, pp 45-53
TL;DR: In this article, a Canon digital camera was used to collect image information from detached leaves of heading-stage maize, and image processing technologies, including gray level co-occurrence matrices and grayscale histograms, were used to extract the maize leaf texture feature parameters and color feature parameters.
Abstract: To explore the correlation between crop leaf digital RGB (Red, Green and Blue) image features and the corresponding moisture content of the leaf, a Canon digital camera was used to collect image information from detached leaves of heading-stage maize. A drying method was adopted to measure the moisture content of the leaf samples, and image processing technologies, including gray level co-occurrence matrices and grayscale histograms, were used to extract the maize leaf texture feature parameters and color feature parameters. The correlations of these feature parameters with moisture content were analyzed. It was found that the texture parameters of maize leaf RGB images, including contrast, correlation, entropy and energy, were not significantly correlated with moisture content. Thus, it was difficult to use these features to predict moisture content. Of the six groups of eigenvalues for the leaf color feature parameters, including mean, variance, energy, entropy, kurtosis and skewness, mean and kurtosis were found to be correlated with moisture content. Thus, these features could be used to predict the leaf moisture content. The coefficient of determination (R2) of the mean-moisture content relationship model was 0.7017, and the error of the moisture content prediction was within
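As an illustration of the feature-extraction step described above, the following is a minimal sketch (not the authors' code) of how GLCM texture parameters (contrast, correlation, entropy, energy) and grayscale-histogram colour parameters (mean, variance, energy, entropy, kurtosis, skewness) can be computed with scikit-image and SciPy; the file name and the GLCM distance/angle settings are assumptions.

```python
# Minimal sketch (not the authors' code) of GLCM texture features and
# grayscale-histogram colour features for a single leaf image.
import numpy as np
from scipy import stats
from skimage import io, color, img_as_ubyte
from skimage.feature import graycomatrix, graycoprops

rgb = io.imread("leaf.png")                      # hypothetical detached-leaf image
gray = img_as_ubyte(color.rgb2gray(rgb))         # 8-bit grayscale

# --- texture features from a gray level co-occurrence matrix ---
glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)
texture = {prop: graycoprops(glcm, prop)[0, 0]
           for prop in ("contrast", "correlation", "energy")}
p = glcm[:, :, 0, 0]
texture["entropy"] = -np.sum(p[p > 0] * np.log2(p[p > 0]))

# --- colour features from the grayscale histogram ---
hist, _ = np.histogram(gray, bins=256, range=(0, 256), density=True)
levels = np.arange(256)
mean = np.sum(levels * hist)
colour = {
    "mean": mean,
    "variance": np.sum((levels - mean) ** 2 * hist),
    "energy": np.sum(hist ** 2),
    "entropy": -np.sum(hist[hist > 0] * np.log2(hist[hist > 0])),
    "kurtosis": stats.kurtosis(gray.ravel()),
    "skewness": stats.skew(gray.ravel()),
}
print(texture, colour)
```

With per-leaf feature vectors of this form, the correlation analysis reported in the abstract amounts to regressing the oven-dried moisture content against each feature across the sampled leaves.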
Citations
Book ChapterDOI
TL;DR: In this article, the use of structured light sensors in the characterization and phenotyping of crops in orchards and groves, of weeds, and of animals is discussed, with the aim of providing the farmer with information to make better decisions that enhance production.
Abstract: The sustained growth of the world's population in the coming years will require an even greater role for agriculture to meet the food needs of humankind. To improve the productivity and competitiveness of the agricultural industry, it is necessary to develop new and affordable sensing technologies for agricultural operations. Such innovations should be implemented in a framework considering the farm, the crops, and their surroundings, with the aim of providing the farmer with information to make better decisions that enhance production. This is the case for precision agriculture and precision livestock farming. This chapter reviews and discusses the use of structured light sensors in the characterization and phenotyping of crops in orchards and groves, weeds, and animals. As a result of a collaboration between researchers from Spain and Chile, opportunities for this type of sensor have been identified in these countries as examples of South American and European agriculture. In this context, several empirical case studies are presented regarding the use of structured light sensors for flower, fruit, branch, and trunk characterization considering depth and RGB (red-green-blue colors) information in avocado, lemon, apple, and pear orchards. Applications to weed detection and classification as well as to livestock phenotyping are also illustrated. Regarding the presented case studies, experimental and statistical results are provided showing the pros and cons of structured light sensors applied to agricultural environments. Additionally, several considerations are included for the use of this type of sensor to improve the agricultural process.

60 citations

Journal ArticleDOI
18 Feb 2019-Symmetry
TL;DR: The proposed deep learning-based approach for field maize drought identification and classification based on digital images achieves a better performance than the traditional machine learning method (Gradient Boosting Decision Tree, GBDT).
Abstract: Drought stress seriously affects crop growth, development, and grain production. Existing machine learning methods have achieved great progress in drought stress detection and diagnosis. However, such methods are based on a hand-crafted feature extraction process, and the accuracy has much room to improve. In this paper, we propose the use of a deep convolutional neural network (DCNN) to identify and classify maize drought stress. Field drought stress experiments were conducted in 2014. The experiment was divided into three treatments: optimum moisture, light drought, and moderate drought stress. Maize images were obtained every two hours throughout the whole day by digital cameras. In order to compare the accuracy of DCNN, a comparative experiment was conducted using traditional machine learning on the same dataset. The experimental results demonstrated an impressive performance of the proposed method. For the total dataset, the accuracy of the identification and classification of drought stress was 98.14% and 95.95%, respectively. High accuracy was also achieved on the sub-datasets of the seedling and jointing stages. The identification and classification accuracy levels of the color images were higher than those of the gray images. Furthermore, the comparison experiments on the same dataset demonstrated that DCNN achieved a better performance than the traditional machine learning method (Gradient Boosting Decision Tree, GBDT). Overall, our proposed deep learning-based approach is a very promising method for field maize drought identification and classification based on digital images.
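As a rough illustration of the kind of model involved (not the paper's DCNN architecture, which is not reproduced here), a minimal convolutional classifier for the three moisture treatments could look like the sketch below; the layer sizes and input resolution are assumptions.

```python
# Minimal sketch of a small convolutional classifier for three drought
# treatments (optimum / light / moderate); not the published architecture.
import torch
import torch.nn as nn

class DroughtCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, 3, H, W) RGB maize images
        return self.classifier(self.features(x).flatten(1))

model = DroughtCNN()
logits = model(torch.randn(4, 3, 224, 224))   # dummy batch of four images
print(logits.shape)                            # torch.Size([4, 3])
```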

58 citations

Journal ArticleDOI
TL;DR: A computer graphic-based 3D point cloud segmentation approach for accurately and efficiently detecting tree leaves and their morphological features (i.e., leaf area and leaf angle distributions (leaf azimuthal angle and leaf inclination angle)) from single leaves is developed.
Abstract: Leaf attribute estimation is crucial for understanding photosynthesis, respiration, transpiration, and carbon and nutrient cycling in vegetation and evaluating the biological parameters of plants or forests. Terrestrial laser scanning (TLS) has the capability to provide detailed characterisations of individual trees at both the branch and leaf scales and to extract accurate structural parameters of stems and crowns. In this paper, we developed a computer graphic-based 3D point cloud segmentation approach for accurately and efficiently detecting tree leaves and their morphological features (i.e., leaf area and leaf angle distributions (leaf azimuthal angle and leaf inclination angle)) from single leaves. To this end, we adopted a sphere neighbourhood model with an adaptive radius to extract the central area points of individual leaves with different morphological structures and complex spatial distributions; meanwhile, four auxiliary criteria were defined to ensure the accuracy of the extracted central area points of individual leaf surfaces. Then, the density-based spatial clustering of applications with noise (DBSCAN) algorithm was used to cluster the central area points of leaves and to obtain the centre point corresponding to each leaf surface. We also achieved segmentation of individual leaf blades using an advanced 3D watershed algorithm based on the extracted centre point of each leaf surface and two morphology-related parameters. Finally, the leaf attributes (leaf area and leaf angle distributions) were calculated and assessed by analysing the segmented single-leaf point cloud. To validate the final results, the actual leaf area, leaf inclination and azimuthal angle data of designated leaves on the experimental trees were manually measured during field activities. In addition, a sensitivity analysis investigated the effect of the parameters in our segmentation algorithm. The results demonstrated that the segmentation accuracy of Ehretia macrophylla (94.0%) was higher than that of crape myrtle (90.6%) and Fatsia japonica (88.8%). The segmentation accuracy of Fatsia japonica was the lowest of the three experimental trees. In addition, the single-leaf area estimation accuracy for Ehretia macrophylla (95.39%) was still the highest among the three experimental trees, and the single-leaf area estimation accuracy for crape myrtle (91.92%) was lower than that for Ehretia macrophylla (95.39%) and Fatsia japonica (92.48%). Furthermore, the method proposed in this paper provided accurate leaf inclination and azimuthal angles for the three experimental trees (Ehretia macrophylla: leaf inclination angle: R2 = 0.908, RMSE = 6.806° and leaf azimuth angle: R2 = 0.981, RMSE = 7.680°; crape myrtle: leaf inclination angle: R2 = 0.901, RMSE = 8.365° and leaf azimuth angle: R2 = 0.938, RMSE = 7.573°; Fatsia japonica: leaf inclination angle: R2 = 0.849, RMSE = 6.158° and leaf azimuth angle: R2 = 0.947, RMSE = 3.946°). The results indicate that the proposed method is effective and operational for providing accurate, detailed information on single leaves and vegetation structure from scanned data. This capability facilitates improvements in applications such as the estimation of leaf area, leaf angle distribution and biomass.
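The DBSCAN step of this pipeline, clustering the extracted leaf-centre-area points so that each cluster yields one seed point per leaf surface, can be illustrated with a minimal sketch using scikit-learn; the synthetic point cloud and the eps/min_samples values below are assumptions, not the authors' settings.

```python
# Minimal sketch of the DBSCAN clustering step on leaf-centre-area points;
# synthetic data and parameter values are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Fake "central area" points for three leaves a few decimetres apart (metres)
leaf_centres = np.array([[0.0, 0.0, 1.0], [0.3, 0.1, 1.2], [0.1, 0.4, 0.9]])
points = np.vstack([c + rng.normal(scale=0.02, size=(50, 3)) for c in leaf_centres])

labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(points)
for k in sorted(set(labels) - {-1}):
    centre = points[labels == k].mean(axis=0)   # one seed point per leaf surface
    print(f"leaf {k}: centre ≈ {np.round(centre, 3)}")
```

Each recovered centre point would then seed the 3D watershed segmentation of the corresponding leaf blade.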

26 citations


Cites background from "Detecting maize leaf water status b..."

  • ...As the main organs of vegetation, leaves play a crucial role in vegetation growth [1]....

    [...]

Journal ArticleDOI
TL;DR: In this article, the authors explored the possibility of predicting soybean end-season traits through the color and texture features of early-season canopy images, and the best results were obtained using all 457 predictor variables.
Abstract: Global crop production is facing the challenge of a high projected demand, while the yields of major crops are not increasing at sufficient speeds. Crop breeding is an important way to boost crop productivity; however, its improvement rate is partially hindered by the long crop generation cycles. If end-season crop traits such as yield can be predicted through early-season phenotypic measurements, crop selection can potentially be made before a full crop generation cycle finishes. This study explored the possibility of predicting soybean end-season traits through the color and texture features of early-season canopy images. A total of 6383 images were captured at the V4/V5 growth stage over 6039 soybean plots growing at four locations. From each image, 140 color features and 315 gray-level co-occurrence matrix-based texture features were derived. Another two variables were also introduced to account for location and timing differences between the images. Five regression and five classification techniques were explored. The best results were obtained using all 457 predictor variables, with Cubist as the regression technique and Random Forests as the classification technique. Yield (RMSE = 9.82, R2 = 0.68), Maturity (RMSE = 3.70, R2 = 0.76) and Seed Size (RMSE = 1.63, R2 = 0.53) were identified as potential soybean traits that might be early predictable.
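A minimal sketch of the classification branch of such a workflow, predicting an end-season trait class from early-season image features with Random Forests, is shown below; the synthetic 457-column feature matrix and the class labels are placeholders, not the study's data.

```python
# Minimal sketch: Random Forest classification of an end-season trait class
# from early-season canopy image features. Data are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 457))      # 457 image-derived predictor variables
y = rng.integers(0, 3, size=600)     # e.g. early / medium / late maturity class

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())                 # near chance here; real features carry signal
```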

21 citations

Journal ArticleDOI
TL;DR: In this paper, a back-propagation neural network (BPNN) was used with visible and thermal imaging features to estimate the canopy water content of rice plants under different water stress levels.
Abstract: A total of 120 rice plant samples were scanned by visible and thermal proximal sensing systems under different water stress levels to evaluate the canopy water content (CWC), which is of great importance for irrigation management decisions. The oven-drying method was employed to assess the canopy water state. The proposed framework integrates visible and thermal imaging data using an artificial neural network as a promising tool for accurately estimating plant water content. The RGB-based features included 20 color vegetation indices (VI) and 6 gray level co-occurrence matrix-based texture features (GLCMF). The thermal imaging features were two thermal indicators (T), namely the normalized relative canopy temperature (NRCT) and the crop water stress index (CWSI), derived from plant temperatures. These features were used to train a back-propagation neural network (BPNN) with minimal loss on a cross-validation set. Model behavior was affected by filtering high-level features and optimizing the hyperparameters of the model. The results indicated that modeling with features from both the visible and thermal images achieved better performance than features from either the visible or the thermal image alone. The best set of prediction variables comprised 21 features: 14 VI, 5 GLCMF, and 2 T. The fusion of color, texture, and thermal features greatly improved the precision of water content evaluation (99.40%), with a coefficient of determination of R2 = 0.983 and an RMSE of 0.599. Overall, the methodology of this work can support decision makers and water managers in taking effective and timely actions and achieving agricultural water sustainability.
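The fusion-and-regression idea can be sketched with a small back-propagation (multi-layer perceptron) network in scikit-learn; the synthetic data below only mimic the 21-feature layout (14 VI + 5 GLCMF + 2 thermal indicators) reported in the abstract, and the network size is an assumption.

```python
# Minimal sketch: BPNN regression of canopy water content from fused
# colour/texture/thermal features. Data and architecture are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 21))                  # 14 VI + 5 GLCMF + 2 T (NRCT, CWSI)
y = X @ rng.normal(size=21) + rng.normal(scale=0.1, size=120)   # stand-in CWC values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
bpnn = make_pipeline(StandardScaler(),
                     MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                                  random_state=0))
bpnn.fit(X_tr, y_tr)
print("R^2 on held-out samples:", bpnn.score(X_te, y_te))
```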

15 citations

References
Journal ArticleDOI
TL;DR: The results show that hand-operated thermographic cameras can be used to detect plant water stress in both fruit tree species, and that detection appears to be more precise in persimmon than in citrus.

99 citations


"Detecting maize leaf water status b..." refers background or methods in this paper

  • ...Ballester [6] used a handheld infrared thermal imager to study leaf moisture content in citrus trees and persimmon trees....

  • ...[6] Ballester C, Jiménez-Bello M A, Castel J R, Intrigliolo D S....

Journal ArticleDOI
TL;DR: In this article, the spectral signature of plant leaves was analyzed by a hyperspectral camera to identify the onset and intensity of plant water stress, and various spectral indices were calculated and correlated to stress levels.
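Spectral water indices of this kind are typically simple band ratios. Below is a minimal sketch (not this paper's indices or data) computing the normalized difference water index (NDWI, Gao 1996) from a hyperspectral reflectance cube and correlating per-plant index values with stress level; the cube shape, wavelength grid, and stress scores are assumptions.

```python
# Minimal sketch: a band-ratio water index from a hyperspectral cube and its
# correlation with imposed stress levels. All data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
wavelengths = np.arange(400, 2501, 10)                           # nm
cube = rng.uniform(0.1, 0.6, size=(50, 50, wavelengths.size))    # fake reflectance

def band(cube, wl, target):
    """Return the reflectance image of the band closest to `target` nm."""
    return cube[:, :, np.argmin(np.abs(wl - target))]

r860, r1240 = band(cube, wavelengths, 860), band(cube, wavelengths, 1240)
ndwi = (r860 - r1240) / (r860 + r1240)            # NDWI (Gao, 1996)

stress_levels = rng.integers(0, 4, size=20)       # 0 = well watered ... 3 = severe
mean_ndwi = rng.normal(size=20)                   # stand-in per-plant index values
r, p = stats.pearsonr(mean_ndwi, stress_levels)
print(f"mean NDWI of cube: {ndwi.mean():.3f}, Pearson r = {r:.2f}, p = {p:.3f}")
```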

88 citations

Journal ArticleDOI
TL;DR: In this article, the authors proposed nature-inspired feature selection techniques to find the most significant set of Textural Features (TFs) suitable for predicting water content of cultured Sunagoke moss.

65 citations


"Detecting maize leaf water status b..." refers methods in this paper

  • ...Based on the meaning of each feature parameter and a comparison of the results, we selected four parameters, including contrast, correlation, entropy and energy, to study the texture features [15-17] ....

Journal ArticleDOI
TL;DR: In this article, the authors used OBIA to distinguish vegetation zones, vegetation patches, and surface water channels in two intertidal salt marshes in southern San Francisco Bay.
Abstract: Mapping landscape features within wetlands using remote-sensing imagery is a persistent challenge due to the fine scale of wetland pattern variation and the low spectral contrast among plant species. Object-based image analysis (OBIA) is a promising approach for distinguishing wetland features, but systematic guidance for this use of OBIA is not presently available. A sensitivity analysis was conducted using OBIA to distinguish vegetation zones, vegetation patches, and surface water channels in two intertidal salt marshes in southern San Francisco Bay. Optimal imagery sources and OBIA segmentation settings were determined from 348 sensitivity tests using the eCognition multiresolution segmentation algorithm. The optimal high-resolution (≤1 m) imagery choices were colour infrared (CIR) imagery to distinguish vegetation zones, CIR or red-green-blue (RGB) imagery to distinguish vegetation patches depending on species and season, and RGB imagery to distinguish surface water channels. High-resolution (1 m) lidar data did not help distinguish small surface water channels or other features. Optimal segmentation varied according to segmentation setting choices. Small vegetation patches and narrow channels were more recognizable using small scale parameter settings, and coarse vegetation zones were more recognizable using larger scale parameter settings. The scale parameter served as a de facto lower bound on the median segmented object size. Object smoothness/compactness weight settings had little effect. Wetland features were more recognizable using high colour/low shape weight settings. However, an experiment on a synthetic non-wetland image demonstrated that, colour information notwithstanding, segmentation results are still strongly affected by the selected image resolution, OBIA settings, and shape of the analysis region. Future wetland OBIA studies may benefit from strategically making imagery and segmentation setting choices based on these results; such systemization of future wetland OBIA approaches may also enhance study comparability.
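eCognition's multiresolution segmentation is proprietary, but the effect of a scale setting on object size can be illustrated with an open-source stand-in, Felzenszwalb graph-based segmentation in scikit-image; this is an analogy rather than the study's algorithm, and the file name and parameter values are assumptions.

```python
# Minimal sketch: how a "scale" setting bounds object size in segmentation,
# using scikit-image's Felzenszwalb algorithm as an open-source analogue.
import numpy as np
from skimage import io, segmentation

img = io.imread("marsh_cir.png")[:, :, :3]        # hypothetical CIR/RGB tile

for scale in (50, 200, 800):                      # analogous to an OBIA scale parameter
    segments = segmentation.felzenszwalb(img, scale=scale, sigma=0.8, min_size=20)
    sizes = np.bincount(segments.ravel())
    print(f"scale={scale}: {segments.max() + 1} objects, "
          f"median size {int(np.median(sizes))} px")
```

Larger scale values merge neighbouring pixels into fewer, larger objects, mirroring the abstract's observation that the scale parameter acts as a lower bound on median object size.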

57 citations


"Detecting maize leaf water status b..." refers methods in this paper

  • ...The texture description was achieved with statistical analysis [12-14] ....

Journal Article
TL;DR: The MATLAB simulation results show that the four texture features generated from the gray level co-occurrence matrix can effectively describe the texture characteristics of wood with good differentiation.
Abstract: Texture is widespread in nature and is among the most essential properties of a surface, and texture analysis has been a research hotspot. As the primary task of texture analysis, texture feature extraction is the focus of study. This paper puts forward a texture feature extraction method for five kinds of wood grain based on the gray level co-occurrence matrix. The MATLAB simulation results show that the four texture features generated from the gray level co-occurrence matrix can effectively describe the texture characteristics of wood with good differentiation.

40 citations


"Detecting maize leaf water status b..." refers methods in this paper

  • ...The texture description was achieved with statistical analysis [12-14] ....
