CN115655157A - Fish-eye image-based leaf area index measuring and calculating method - Google Patents
Publication number: CN115655157A (application CN202211290462.2A)
Authority: CN (China)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Abstract
The invention discloses a leaf area index (LAI) measuring and calculating method based on fisheye images: a machine-vision method for monitoring farmland LAI in which a smartphone fitted with an ultra-wide-angle lens photographs fisheye images of the plant canopy. The key design points of the invention are that the phone-plus-fisheye shooting mode is applied to the crop canopy, the fisheye images are processed efficiently, and a simple and rapid differential LAI calculation method is designed from the definition of LAI and its integral formula. The invention has been tested and applied in a number of field and experimental scenes, and its accuracy has been verified against the measurements of traditional instruments and professional software. The results show that the equipment and algorithm adapt effectively to LAI measurement tasks in different scenes such as sunny and rainy days, achieve good accuracy and stability in both paddy fields and dry land, and can readily be extended to other agricultural monitoring tasks.
Description
Technical Field
The invention belongs to the technical field of crop phenotype monitoring, and particularly relates to a method for measuring and calculating the crop leaf area index by mounting an ultra-wide-angle lens on a smartphone to photograph fisheye images of plant canopies and processing those images with machine vision techniques.
Background
The Leaf Area Index (LAI) of a vegetation canopy is an important reference index for studying the growth state and physiological characteristics of plants: it reflects both the population's morphological structure and the canopy's response to changes in the external environment. Accurate measurement of LAI is therefore important for evaluating crop growth, identifying farmland environmental stress and exploring the laws of plant growth.
LAI is numerically defined as the integral of leaf area density over canopy depth (formula 1), i.e. the distribution of leaves across different height levels; conceptually it equals the total top (light-receiving) surface area of all leaves per unit area of land:

LAI = ∫₀^H l(h) dh    (1)

where H is the specified canopy height and l(h) is the leaf area density function at height h.
Traditional LAI measurement methods include destructive sampling, litterfall collection, inclined-point sampling and the like. These require manual collection of crop leaves, involve a large workload and damage the plants, so continuous observation data for the same vegetated area cannot be obtained. Among indirect measurement techniques, optical instruments are widely applied; common instruments include the LI-COR canopy analyzer, the TRAC instrument and the AccuPAR device, which are nondestructive, convenient and efficient and are widely used in ground-based LAI measurement. But the current mature measuring instruments still have application problems: (1) limited by the low resolution of the instrument lens and its non-adjustable aperture, measurements can only be made when the solar altitude is below 15-20 degrees, so the time available for measurement each day is limited; (2) instruments designed on optical principles must be extended below the crop canopy and aimed horizontally toward the sky, so they cannot measure short vegetation or seedling-stage crops (the canopy is not deep enough for the instrument to enter); (3) mainstream LAI instruments are expensive, rely on imported equipment and require trained operators, making them difficult to popularize in the field.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a fisheye-image-based leaf area index measuring and calculating method, which measures farmland LAI with machine vision from fisheye images of the plant canopy taken by a smartphone carrying an ultra-wide-angle lens.
The key design points of the invention are that the phone-plus-fisheye shooting mode is applied to the crop canopy, the fisheye images undergo efficient image processing, and a simple and rapid differential LAI calculation method is designed from the definition of LAI and its integral formula. Based on the LAI definition and optical measurement principles, the invention designs an extended shooting method with a fisheye lens carried by a smartphone: a convenient fisheye image acquisition device is built from the smartphone's rear camera and a lightweight external lens, and a machine-vision algorithm enables real-time monitoring of the leaf area index of farmland crops on the phone. The method suits a variety of scenes, offers high accuracy, strong stability and low cost, and is significant for realizing large-scale, high-throughput crop phenotype monitoring and growth evaluation.
The technical scheme adopted by the invention is as follows. The fisheye-image-based leaf area index measuring and calculating method comprises the following steps:
Step 1, collecting fisheye images of the plant canopy with a smartphone carrying a fisheye lens;
Step 2, preprocessing the fisheye images and segmenting vegetation pixels by thresholding in the HSV color space;
And step 3, cutting the processed fisheye image into a series of concentric rings, where the pixels of each ring lie approximately on a plane of the same height, calculating the vegetation contact frequency within each ring and integrating to obtain the leaf area index value.
Further, in step 1, fisheye images of different canopy types are collected with the smartphone and fisheye lens. First the smartphone is connected to the shooting rod by Bluetooth, and the fisheye lens is aligned with the phone's main rear camera and fixed, forming a simple measuring rod. After the lens is mounted, the camera angle is determined by the type and development stage of the plant to be shot; during shooting the camera is kept away from canopy edges with uneven growth and held horizontal.
Further, the imaging photosensitive plane of the fisheye lens is a circle inscribed in the camera target surface. Establishing a coordinate system with the circle centre as the coordinate origin, the view angle θ of each point in the image can be calculated by formula 2:

θ = arcsin(√(x² + y²) / R)    (2)

where x and y are the pixel coordinates and R is the radius of the image.
Further, the specific implementation of step 2 is as follows:
(21) The resolution of the image is reduced proportionally by down-sampling to speed up processing, and the invalid regions around the image are cropped with an image-cropping program package;
(22) Vegetation pixels are classified by thresholding in the HSV color space, and the vegetation proportion and gap fraction in each ring are extracted; different segmentation subjects are used depending on the shooting angle. The RGB image is converted to an HSV image by the algorithm, upper and lower threshold limits are set, and pixels beyond the limits are classified as non-vegetation and excluded from the contact-fraction calculation. When the image is shot upward from inside the canopy, the sky pixels are the segmentation subject and are removed to leave the vegetation pixels; when shooting downward, the vegetation is the segmentation subject and the vegetation coverage is obtained directly.
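A minimal sketch of the HSV-threshold classification (the hue window and the saturation/value floors are illustrative defaults, not the patent's calibrated thresholds):

```python
import colorsys

def vegetation_mask(pixels, h_range=(0.17, 0.45), s_min=0.15, v_min=0.10):
    """Classify RGB pixels as vegetation by HSV thresholding: a pixel
    counts as vegetation when its hue falls in a green window and its
    saturation/value exceed the floors."""
    mask = []
    for (r, g, b) in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min)
    return mask

def gap_fraction(mask):
    """Fraction of non-vegetation pixels, i.e. the one-way gap fraction."""
    return 1.0 - sum(mask) / len(mask)

# Two leafy-green pixels, one sky-blue pixel, one white (overcast sky) pixel.
pixels = [(30, 120, 40), (200, 210, 230), (10, 90, 20), (250, 250, 250)]
m = vegetation_mask(pixels)
print(m, gap_fraction(m))  # [True, False, True, False] 0.5
```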
Further, in step 3, the image is segmented into ring regions based on the PIL programming package, obtaining a central circle covering observation zenith angles of 0-15 degrees and concentric rings covering 15-30, 30-45, 45-60 and 60-75 degrees; since the region beyond 75 degrees is close to the sampling edge, with severe distortion and more noise, only the part within 75 degrees is calculated.
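The ring division can be sketched as boolean masks per zenith-angle band (assuming the sin θ = r/R mapping and a square image centred on the fisheye circle; the patent uses PIL, but the geometry is plain array arithmetic):

```python
import numpy as np

def ring_masks(size, bands=((0, 15), (15, 30), (30, 45), (45, 60), (60, 75))):
    """Boolean masks selecting the pixels of each concentric ring of a
    square fisheye image, one ring per observation-zenith-angle band;
    pixels beyond 75 degrees fall outside every band and are discarded
    as noisy edge area."""
    R = size / 2
    yy, xx = np.mgrid[0:size, 0:size]
    r = np.hypot(xx - R + 0.5, yy - R + 0.5)      # distance from centre
    theta = np.degrees(np.arcsin(np.clip(r / R, 0.0, 1.0)))  # sin(theta)=r/R
    return [(theta >= lo) & (theta < hi) for lo, hi in bands]

masks = ring_masks(540)
print(len(masks))                      # 5 rings
print(bool(np.any(masks[0] & masks[1])))  # rings are disjoint: False
```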
Further, the specific implementation of step 3 is as follows.
According to the LAI concept and the definition of formula (1), a Poisson calculation model based on contact frequency and gap fraction is adopted. The contact frequency, proposed by Warren Wilson, is the probability that sunlight incident on the canopy contacts the plant elements within it; the gap fraction is the probability that a natural light beam reaches the reference plane directly. Under the assumption that leaves are opaque, the leaf coverage measured by image analysis is a one-way contact fraction:

LAI = ∫₀^H l(h) dh    (1)

where H is the canopy height and l(h) is the leaf area density function at height h.
For a canopy of height H, the mean contact fraction is the plant-height integral of the one-way contact fraction of each leaf layer:

N̄(θ_v, φ_v) = (1 / cos θ_v) ∫₀^H G(h, θ_v, φ_v) · l(h) dh    (3)

where l(h) is the leaf area density of the layer at each height h of the plant canopy, i.e. the leaf area per unit canopy volume, (θ_v, φ_v) is the direction vector of the observation position, θ_v the zenith angle of the observation direction, φ_v the azimuth of the observation direction, and G the projection function of the leaf area at height h. Substituting formula (1) gives:

N̄(θ_v, φ_v) = G(θ_v, φ_v) · LAI / cos θ_v    (4)

Formula (4) shows the correlation of LAI with the contact frequency, in which G(θ_v, φ_v) is obtained from formulas (5) and (6):

G(θ_v, φ_v) = (1/2π) ∫₀^{2π} ∫₀^{π/2} |cos γ| · g(θ_l, φ_l) sin θ_l dθ_l dφ_l    (5)

cos γ = cos θ_v cos θ_l + sin θ_v sin θ_l cos(φ_v − φ_l)    (6)

Here g(θ_l, φ_l) is the probability density function of the leaf inclination angle distribution model, where θ_l is the zenith angle of the leaf inclination direction and φ_l the azimuth of the leaf inclination; writing g(θ_l, φ_l) = g_l(θ_l) · g_a(φ_l), the density is constrained by the normalization conditions of formulas (7) and (8):

∫₀^{π/2} g_l(θ_l) dθ_l = 1    (7)

(1/2π) ∫₀^{2π} g_a(φ_l) dφ_l = 1    (8)

Combining the above formulas yields the relation between the canopy gap fraction P₀(θ_v, φ_v), the mean contact fraction N̄(θ_v, φ_v) and LAI, optimized by Nilson into the exponential relation of formula (9):

P₀(θ_v, φ_v) = exp(−N̄(θ_v, φ_v)) = exp(−G(θ_v, φ_v) · LAI / cos θ_v)    (9)

Based on the circular field of view of the fisheye image, the azimuth of the incident ray is disregarded and the gap fraction measurement is assumed to depend only on the observation zenith angle θ_v, i.e. the angle between the incident direction of the light and the normal vector of the photosensor at the canopy bottom. The calculated leaf area index LAI_cal can then be organized as:

LAI_cal = 2 ∫₀^{π/2} −ln P₀(θ_v) · cos θ_v sin θ_v dθ_v    (10)

For the integral expression (10), Welles proposed a discrete numerical analysis method based on multi-view observation, which divides rings at several zenith observation angles, measures the average vegetation gap fraction P̄₀(θ_v,i) in each ring, and differences the integral formula (10):

LAI_cal = 2 Σ_i −ln P̄₀(θ_v,i) · W_i / S_i(θ_v,i)    (11)

where S_i(θ_v) is cos θ_v⁻¹, W_i is sin θ_v dθ, and i indexes the ring angle; the coefficients differ according to the angle of the ring taken.
Furthermore, five zenith observation angles are used to divide the rings: 0-15, 15-30, 30-45, 45-60 and 60-75 degrees, corresponding to i = 1 to i = 5 respectively.
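Assuming the five rings above with ring-centre angles at the band midpoints (an illustrative discretization, not the patent's exact coefficients), the differenced calculation of formula (11) can be sketched as:

```python
import math

# Ring centre angles (degrees) at band midpoints, and the 15-degree band width.
ANGLES = [7.5, 22.5, 37.5, 52.5, 67.5]
DTHETA = math.radians(15)

def lai_from_gaps(gap_fractions):
    """Discrete form of the integral in formula (11): sum the log-gap
    term over the five rings with path-length factor S_i = 1/cos(theta_i)
    and weight W_i = sin(theta_i) * dtheta, then double."""
    lai = 0.0
    for p0, deg in zip(gap_fractions, ANGLES):
        theta = math.radians(deg)
        w = math.sin(theta) * DTHETA     # W_i
        s = 1.0 / math.cos(theta)        # S_i
        lai += -math.log(p0) * w / s
    return 2.0 * lai

# Uniform gap fraction exp(-1) in every ring, so -ln(p0) = 1 per ring:
print(round(lai_from_gaps([math.exp(-1)] * 5), 3))  # ≈ 0.944
```

With -ln P̄₀ = 1 the sum reduces to Σ sin(2θ_i)·Δθ, a numerical approximation of ∫₀^{75°} sin 2θ dθ ≈ 0.933, illustrating that the five-ring scheme closely tracks the truncated integral.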
The invention builds on years of research into nondestructive monitoring of the leaf area index in the fields of plant phenotyping and optical remote sensing, and achieves several advances over comparable instruments and software:
(1) A smartphone carries the lens, and shooting with an extension rod simplifies the measurement operation; data storage and transmission are handled through the smartphone, significantly reducing cost and suiting the method to field-management use.
(2) Image processing and integral calculation are performed in Python; the back-end code is simple to write and easy to modify, suits both single-image debugging and batch processing of large numbers of images, reduces the influence of weather and illumination on gap-fraction measurement through preprocessing, and provides full visualization during debugging.
(3) Two viewing angles, downward and upward shooting, are designed for different crops and field scenes, overcoming the LAI instrument's inability to monitor short canopies and removing the restriction that instruments cannot measure in strong light.
(4) The algorithm reads the exif information (position and time) of the phone image to compute the real-time solar zenith angle, so the LAI calculation result can be corrected.
(5) Large-scale sampling experiments were carried out during development, with comparison experiments against traditional methods and common instruments in fields of common crops such as rice, maize and wheat, verifying the practicality and accuracy of the invention.
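The input to the correction of point (4) can be sketched as follows; the GPS tuple, the date, and the simple declination/hour-angle formula are illustrative assumptions, not the patent's exact correction algorithm:

```python
import math
from datetime import datetime, timezone

def gps_to_decimal(dms, ref):
    """Convert an exif-style (degrees, minutes, seconds) tuple to
    signed decimal degrees."""
    d, m, s = dms
    val = d + m / 60 + s / 3600
    return -val if ref in ("S", "W") else val

def solar_zenith(lat_deg, lon_deg, when_utc):
    """Rough solar zenith angle (degrees) from latitude, longitude and
    UTC time, using a simple declination/hour-angle approximation --
    adequate for a sketch, not for ephemeris work."""
    doy = when_utc.timetuple().tm_yday
    decl = math.radians(-23.44) * math.cos(2 * math.pi * (doy + 10) / 365)
    hours = when_utc.hour + when_utc.minute / 60 + lon_deg / 15  # solar time
    hour_angle = math.radians(15 * (hours - 12))
    lat = math.radians(lat_deg)
    cos_z = (math.sin(lat) * math.sin(decl)
             + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_z))))

lat = gps_to_decimal((30, 28, 12.0), "N")   # hypothetical field location
lon = gps_to_decimal((114, 21, 36.0), "E")
z = solar_zenith(lat, lon, datetime(2022, 6, 21, 4, 0, tzinfo=timezone.utc))
print(round(lat, 4), 0 <= z <= 90)
```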
Drawings
FIG. 1: schematic of the fisheye lens imaging principle;
FIG. 2: example combination of mobile phone + fisheye lens + extension shooting rod;
FIG. 3: canopy fisheye image acquisition method and process;
FIG. 4: definition of the observation zenith angle and the ring-division calculation of the fisheye image;
FIG. 5: fisheye image processing and calculation;
FIG. 6: accuracy verification of the rice upward-shot test results (in the right graph, the abscissa is the LAI calculated by the fisheye camera algorithm and the ordinate the LAI value measured by the instrument);
FIG. 7: verification of the rice upward-shot test results under different scenes (weather) (in the right graph, the abscissa is the LAI calculated by the fisheye camera algorithm and the ordinate the LAI value measured by the instrument);
FIG. 8: verification of the rice downward-shot test results (in the right graph, the abscissa is the LAI calculated by the fisheye camera algorithm and the ordinate the LAI value measured by professional software);
FIG. 9: verification of the wheat downward-shot test results at different growth stages (in the right graph, the abscissa is the LAI calculated by the fisheye camera algorithm and the ordinate the LAI value measured by oblique photography);
FIG. 10: verification of the wheat downward-shot test results (in the right graph, the abscissa is the LAI calculated by the fisheye camera algorithm and the ordinate the LAI value measured by oblique photography).
Detailed Description
The technical solution of the present invention is further explained with reference to the drawings and the examples. The invention provides a fisheye image-based Leaf Area Index (LAI) measuring and calculating method, which comprises the following steps:
at present, the mainstream optical instruments collect and measure LAI by measuring the light transmission degree inside the canopy, so that the wider the view field angle, the larger the corresponding canopy depth range that can be obtained, the more accurate the measured LAI, the fish-eye lens (fish-eye lens) in the Photography can be applied to this scene, and a Hemispherical photogrammetry method (DHP) of the leaf area index is correspondingly developed. Different from a common camera lens, the fisheye lens has a smaller focal length and a wider view field, external light rays are greatly refracted at a convex lens, so that incident light rays with a small included angle with a plane where the camera is located can fall on an imaging target surface, the effect of acquiring a 180-degree hemispherical image is achieved, the imaging photosensitive plane is a circle internally tangent to the camera target surface, a coordinate system is established by taking the circle center as a coordinate origin, and the view angle theta (figure 1) at each point (pixel) in the image can be calculated by the following formula 2:
where x and y are pixel coordinates and R is the radius of the image (1/2 of the height of the camera target surface, e.g., 1080 for an image height of 1920 x 1080 resolution and 540 for radius).
According to the embodiment of the invention, fisheye images of different canopy types are collected with the smartphone and fisheye lens. The shooting function and resolution of a smartphone satisfy the requirements for canopy image collection. The fisheye lens and extension rod are assembled as in FIG. 2: the phone is first connected to the shooting rod by Bluetooth, and the fisheye lens is aligned with the phone's main rear camera and fixed, forming a simple measuring rod. This assembly effectively keeps noise such as road surface and human figures out of the wide-angle field of view and reduces the influence of edge effects on the calculation result. After the lens is mounted, fisheye images of different canopy types are collected according to the flow of FIG. 3. First the camera angle is determined by the type and development stage of the plant to be shot; FIG. 3 takes maize at the tasseling stage (shot upward inside the canopy) and rice at the seedling stage (shot downward above the canopy) as examples; the camera is kept away from canopy edges with uneven growth and held horizontal.
After acquisition, the fisheye image must be processed into a computable form, and the sequence of operations from image to LAI value is realized by machine-vision image processing. Image preprocessing uses vision algorithms to adjust and balance attributes of the original image such as color space, perceived brightness and topological structure; its aims are to highlight the reference subject (vegetation), reduce noise, eliminate brightness inconsistencies caused by changing ambient light and camera aperture adjustment, standardize the color space and parameterize the brightness gamut, improving the accuracy and speed of subsequent visual feature extraction and calculation. Subject segmentation is an important link in the processing chain: the canopy fisheye image is divided into foreground and background, and the plant pixels are extracted from the background. Threshold-based image segmentation is a region-parallel technique in which one or more set thresholds divide the gray-level histogram into several classes, gray levels within the same interval being taken to belong to one class of object; using the image's gray characteristics directly, it is efficient and widely applicable. Post-processing performs pixel statistics and geometric transformation on the vegetation part with the background removed to extract the canopy gap fraction, from which the LAI is finally calculated.
(21) After image acquisition, the image is preprocessed: down-sampling reduces the resolution proportionally to speed up processing, and the invalid regions around the image are cropped with an image-cropping program package. The fisheye image is inscribed in the rectangular acquisition surface of the camera, so the blocked, invalid parts at the four corners must be removed.
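The down-sampling and corner-cropping of step (21) can be sketched as follows (stride-based down-sampling and a hard circular mask are simplifying assumptions for illustration):

```python
import numpy as np

def preprocess(img, factor=4):
    """Downsample a grayscale fisheye frame by simple striding, then
    zero out the four corners that fall outside the inscribed fisheye
    circle (the blocked, invalid parts of the rectangular sensor)."""
    small = img[::factor, ::factor].copy()
    h, w = small.shape[:2]
    R = min(h, w) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    outside = np.hypot(xx - w / 2 + 0.5, yy - h / 2 + 0.5) > R
    small[outside] = 0
    return small

img = np.full((1080, 1080), 255, dtype=np.uint8)
out = preprocess(img)
# 1080/4 = 270 per side; corners masked to 0, centre kept at 255.
print(out.shape, out[0, 0], out[135, 135])
```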
(22) Vegetation pixels are classified by HSV color-space thresholding, extracting the vegetation proportion and gap rate in the image, with different segmentation subjects according to the shooting angle. The HSV color space (Hue, Saturation, Value) decouples brightness from color, so that color and brightness differences between objects in the image are better perceived. After the RGB image is converted to an HSV image by the algorithm, upper and lower threshold limits are set, and pixels beyond the limits are classified as non-vegetation and excluded from the contact-fraction calculation. When the image is shot upward from inside the canopy, the sky pixels are the segmentation subject and are removed to leave the vegetation pixels; when shooting downward, the vegetation is the segmentation subject and the vegetation coverage is obtained directly.
(23) Ring-region segmentation is performed based on the PIL programming package, obtaining a central circle covering observation zenith angles of 0-15 degrees and concentric rings covering 15-30, 30-45, 45-60 and 60-75 degrees; since the region beyond 75 degrees is close to the sampling edge, with severe distortion and more noise, only the part within 75 degrees is calculated.
The segmentation operation of step (22) is applied to each ring obtained in step (23), giving the vegetation and non-vegetation pixel results for each ring; finally the LAI value is calculated from the per-ring segmentation results (vegetation pixel ratio, one-way gap fraction and the like) using formula (11).
In step 3, the HSV-segmented fisheye image is cut into a series of concentric rings, so that the vegetation units corresponding to the pixels of each ring lie approximately on a plane of the same height; the vegetation contact frequency (coverage) within each ring is calculated from the vegetation and non-vegetation segmentation results and integrated to obtain the LAI value.
LAI is a variable describing the areal density of canopy leaves at a given height. Following the LAI concept and the definition of formula (1), researchers proposed a Poisson Model based on contact frequency and gap fraction: the contact frequency, proposed by Warren Wilson, is the probability that sunlight incident on the canopy contacts the plant elements within it; the gap fraction is the probability that a natural light beam reaches the reference plane directly; and under the assumption of opaque leaves, the leaf coverage (ratio) measured by image analysis is a one-way contact fraction. For a canopy of height H, the mean contact fraction is the plant-height integral of the one-way contact fraction of each leaf layer:

N̄(θ_v, φ_v) = (1 / cos θ_v) ∫₀^H G(h, θ_v, φ_v) · l(h) dh    (3)

where l(h) is the leaf area density of the layer at each height h of the canopy, i.e. the leaf area per unit canopy volume, (θ_v, φ_v) is the direction vector of the observation position (v for view), θ_v the zenith angle of the observation direction, φ_v its azimuth, and G the projection function of the leaf area at height h. Substituting formula (1) gives:

N̄(θ_v, φ_v) = G(θ_v, φ_v) · LAI / cos θ_v    (4)

Formula (4) shows the dependence of LAI on the contact frequency, where G(θ_v, φ_v) is obtained from formulas (5) and (6):

G(θ_v, φ_v) = (1/2π) ∫₀^{2π} ∫₀^{π/2} |cos γ| · g(θ_l, φ_l) sin θ_l dθ_l dφ_l    (5)

cos γ = cos θ_v cos θ_l + sin θ_v sin θ_l cos(φ_v − φ_l)    (6)

Here g(θ_l, φ_l) is the probability density function of the leaf inclination (zenith, azimuth) distribution model (l for leaf), where θ_l is the zenith angle of the leaf inclination direction and φ_l the azimuth of the leaf inclination, constrained by the normalization conditions of formulas (7) and (8):

∫₀^{π/2} g_l(θ_l) dθ_l = 1    (7)

(1/2π) ∫₀^{2π} g_a(φ_l) dφ_l = 1    (8)

Combining the above formulas yields the relation between the canopy gap fraction P₀(θ_v, φ_v), the mean contact fraction N̄(θ_v, φ_v) and LAI, optimized by Nilson into the exponential relation of formula (9):

P₀(θ_v, φ_v) = exp(−N̄(θ_v, φ_v)) = exp(−G(θ_v, φ_v) · LAI / cos θ_v)    (9)

Based on the circular field of view of the fisheye image, the azimuth of the incident ray may be disregarded, assuming the gap fraction measurement depends only on the observation zenith angle θ_v, i.e. the angle between the incident direction of the light and the normal vector of the photosensor at the canopy bottom. The leaf area index LAI_cal calculated in this example is then:

LAI_cal = 2 ∫₀^{π/2} −ln P₀(θ_v) · cos θ_v sin θ_v dθ_v    (10)

For the integral expression (10), Welles proposed a discrete numerical analysis method based on multi-view observation: rings are divided at 5 zenith observation angles, the average vegetation gap fraction P̄₀(θ_v,i) in each ring is measured, and the integral formula (10) is differenced:

LAI_cal = 2 Σ_{i=1}^{5} −ln P̄₀(θ_v,i) · W_i / S_i(θ_v,i)    (11)

where S_i(θ_v) is cos θ_v⁻¹ and W_i is sin θ_v dθ, the coefficients differing according to the angle of the ring taken. The invention calculates LAI_cal from this derivation and the differenced expression (11). FIG. 4 illustrates the definition of the observation zenith angle and the principle of the ring-division calculation. Taking a point q at canopy height h as an example, the distance of q from the circle centre (zenith) in the fisheye image is r_q; if the radius of the whole circle is R, the observation zenith angle of q satisfies sin θ_v = r_q / R, so W_i and S_i(θ_v) can be evaluated at every point. The zenith observation angles used by the invention to divide the rings are 0-15, 15-30, 30-45, 45-60 and 60-75 degrees, corresponding to i = 1 to 5.
During the development of the invention, collection of field-crop canopy fisheye images and LAI instrument measurements were carried out continuously, in order to verify the accuracy of the image-processing results by comparison and to test the method's applicability to different scenes. The test crops include rice and wheat, the test scenes include an experimental field and an actual farmland, the weather conditions include sunrise, sunset, sunny and cloudy days, and image acquisition tests were also conducted at different crop growth stages.
The results of the method of the invention were tested through designed field experiments. Table 1 gives the details of the verification experiments, the way the invention is applied in different scenarios, and the verification method. After selecting points in an experimental field or an actual farmland of the relevant crop, a fisheye camera and a professional LAI measuring instrument were carried to the site. For upward shooting, the camera angle was kept consistent with that of the instrument and the same position in the canopy was measured: the instrument collected four points from left to right and the average LAI was taken, while the camera shot 6-10 fisheye images for subsequent processing. For downward shooting, the camera's field of view is large, so verification was performed with the aid of the measurement results of other professional software. In the verification experiments, the instrument was the LI-COR LAI-2200 canopy analyzer, and the professional software was CAN-EYE, developed in France by INRA on the basis of the MATLAB Compiler Runtime (MCR).
Table 1 application scenarios of the verification experiments of the invention
The invention can be applied in multiple fields and experimental scenes, and its measurement results were verified for accuracy against traditional instruments and professional software. FIG. 6 shows representative images taken by upward photography in the rice field during the jointing and booting stage, together with the accuracy comparison between the LAI_cal values computed by the algorithm of the invention and the instrument-measured LAI_mea: the correlation coefficient R is 0.6332 and the root mean square error RMSE is 0.1216, a good result. Among the 135 sampling points of the verification experiments, different periods of the day and different weather scenes were selected for shooting and measurement (Table 1). FIG. 7 shows examples of fisheye images in the various scenes and the verification accuracy against the instrument measurements. The cloudy (overcast) scene is the period in which the instrument is normally used and in which its measurements are stable; there the correlation coefficient R of the algorithm's results is 0.6283 and the RMSE is 0.1785, a small error. The algorithm's errors are larger on sunny days and at sunset, with RMSE reaching 0.3081 and 0.3419 respectively, and on rainy days water drops on the lens affect the image segmentation, giving an RMSE of 0.2378. In the upward-shooting tests in the rice field, the algorithm adapts to different weather and environments: the error on cloudy days is stable, and the values measured on sunny and rainy days correlate well with the actual values.
In the early growth stages of rice, when the plants are short (mainly the seedling and tillering stages), the instrument cannot enter the canopy for measurement, and oblique photography or sampling-and-scanning methods are generally used instead. In the tests of the invention, a downward shooting angle was adopted at the rice seedling stage. As shown in FIG. 8, the calculated values correlate highly with the values measured by the other methods (R = 0.9049) with a root mean square error RMSE of only 0.0223, showing that under the downward-shooting condition the method achieves good precision and stability without having to enter the field.
In the application to the dryland crop wheat, the correlation and accuracy of the top-down view against the instrument and other methods were mainly verified, with photographing and sampling carried out in each growth period of the wheat. As shown in FIG. 9, in the relatively short seedling and tillering stages of wheat the test errors are 0.0333 and 0.0316, a high accuracy, while in the jointing and booting stage, when the wheat gradually closes its ridges, the correlation coefficient R reaches 0.9909 with an RMSE of 0.1840, correlating well with the measurements of the traditional method.
From the test results on different scenes and different crops, the following conclusions can be drawn from the specific implementation of the invention:
(1) The equipment and algorithm can effectively adapt to LAI measurement tasks in different scenes, with good accuracy and stability for both paddy fields and dry land.
(2) The method effectively avoids the time- and labour-consuming problems of traditional methods, provides an accurate solution for LAI measurement in the early, short-plant growth stages of various crops, and fills the gap in LAI monitoring during early crop development.
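The HSV-threshold vegetation/sky segmentation used in step 2 of the method can be sketched as below on a synthetic upward-shot frame; the hue and value bounds are illustrative assumptions, not thresholds from the invention:

```python
import numpy as np
from PIL import Image

# Sketch of HSV-threshold segmentation for an upward-shot canopy image.
# The sky is the segmentation subject: pixels whose hue falls in an
# assumed "sky blue" band are counted as gap. PIL encodes hue on a
# 0-255 scale, so blue (~210 degrees) maps to roughly 149.

def gap_fraction(rgb_array, sky_hue_lo=130, sky_hue_hi=200, min_value=40):
    """Fraction of pixels classified as sky (gap) in an RGB uint8 array."""
    hsv = np.asarray(Image.fromarray(rgb_array, "RGB").convert("HSV"))
    h = hsv[..., 0].astype(int)
    v = hsv[..., 2].astype(int)
    sky = (h >= sky_hue_lo) & (h <= sky_hue_hi) & (v >= min_value)
    return sky.mean()

# Synthetic test frame: upper half sky-blue, lower half leaf-green.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:50] = (90, 160, 230)   # sky-ish blue (hue ~210 deg -> ~149 in PIL)
frame[50:] = (40, 140, 50)    # vegetation green (hue ~126 deg -> ~89 in PIL)
```

On a downward shot the same thresholding would be applied with vegetation as the segmentation subject, returning the coverage directly rather than the gap.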
Claims (7)
1. A fisheye-image-based method for measuring and calculating the leaf area index, characterized by comprising the following steps:
step 1, collecting fish-eye images of different types of canopies;
step 2, processing the collected fisheye image based on an image processing technology, reducing background noise and obtaining a pixel matrix which is free of distortion and can be calculated;
and step 3, cutting the processed fisheye image into a number of concentric rings, wherein the pixels on each ring lie approximately in a plane at the same height, calculating the vegetation contact frequency within the rings and integrating to obtain the leaf area index value.
2. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: in step 1, fisheye images of different types of canopies are collected with a smartphone and a fisheye lens; the smartphone is connected to a shooting rod by Bluetooth, and the fisheye lens is aligned with the main camera lens of the smartphone and fixed, forming a simple measuring rod; after the lens is mounted, the camera angle is determined according to the type and development stage of the photographed plants; canopies with unevenly growing edges are kept at a distance during shooting, and the camera is kept horizontal.
3. The fisheye image-based leaf area index estimation method according to claim 2, characterized in that: the imaging photosensitive plane of the fisheye lens is a circle inscribed in the target surface of the camera; a coordinate system is established with the centre of this circle as the origin of coordinates, and the visual angle θ of each point in the image can be calculated by formula (2):

θ = arcsin( √(x² + y²) / R )    (2)

where x and y are the pixel coordinates and R is the radius of the image.
4. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: the specific implementation manner of the step 2 is as follows;
(21) A down-sampling method is used to reduce the image resolution proportionally to speed up processing, and the invalid areas around the image are cropped with the aid of an image-cropping program package;
(22) Vegetation pixels are classified by thresholding in the HSV colour space, and the vegetation proportion and gap fraction within every ring are extracted, with different segmentation subjects used depending on the shooting angle: the RGB image is converted into an HSV image, upper and lower threshold limits are set, and pixels beyond the limits are classified as non-vegetation and excluded from the contact-fraction calculation. When the image is shot upward inside the canopy, sky pixels are the segmentation subject, and the vegetation pixels are obtained by removing them; when the image is shot downward, the vegetation is the segmentation subject, and the vegetation coverage is obtained directly.
5. The fisheye image-based leaf area index estimation method according to claim 1, characterized in that: in step 3, ring-region segmentation of the image is carried out based on the PIL program package (Python Imaging Library), obtaining a central circle covering the 0-15 degree observation zenith angle and concentric rings covering 15-30 degrees, 30-45 degrees, 45-60 degrees and 60-75 degrees; considering that the region beyond 75 degrees is close to the sampling edge, severely distorted and noisy, only the part within 75 degrees is used in the calculation.
6. The fisheye image based leaf area index estimation method of claim 1, wherein the specific implementation of step 3 is as follows:
According to the LAI concept, defined by formula (1), a Poisson calculation model based on the contact frequency and the gap fraction is established. The contact frequency, proposed by Warren Wilson, refers to the probability that sunlight incident on the canopy makes contact with plant elements within it; the gap fraction refers to the probability that a natural light beam reaches the reference plane directly. Under the assumption that the leaves are opaque, the leaf coverage measured in image analysis is a unidirectional contact fraction:

LAI = ∫_0^H l(h) dh    (1)

where H is the canopy height and l(h) is the leaf area density function at height h.
At canopy height h, the average contact fraction is taken as the integral over plant height of the unidirectional contact fraction of each leaf layer:

N̄(θ_v, φ_v) = (1 / cos θ_v) · ∫_0^H G(θ_v, φ_v, h) · l(h) dh    (3)

where H is the height of the canopy, l(h) represents the leaf area density of the corresponding layer at each height h of the plant canopy, i.e. the canopy leaf area per unit volume, (θ_v, φ_v) is the direction vector of the observed position, θ_v being the zenith angle of the observation direction and φ_v its azimuth, and G is the projection function of the leaf area at height h. Substituting formula (1), with G taken as independent of height, gives:

N̄(θ_v, φ_v) = G(θ_v, φ_v) · LAI / cos θ_v    (4)
Formula (4) shows the dependence of the LAI on the contact frequency; the projection function G in it is obtained by combining formulas (5) and (6). To this end the probability density function g(θ_l, φ_l) of the leaf inclination angle distribution model is introduced, where θ_l is the zenith angle of the leaf inclination direction and φ_l is the azimuth angle of the leaf inclination, subject to the normalization condition constraints of formulas (7) and (8).
Combining the above formulas relates the canopy gap fraction P_0(θ_v, φ_v), the average contact fraction N̄(θ_v, φ_v) and the LAI, which Nilson optimized into the exponential relationship of formula (9):

P_0(θ_v, φ_v) = exp[−N̄(θ_v, φ_v)] = exp[−G(θ_v, φ_v) · LAI / cos θ_v]    (9)
Similarly to the above, based on the circular field of view of the fisheye image the azimuth of the incident ray is disregarded, and the gap fraction measurement is assumed to depend only on the observation zenith angle θ_v, i.e. the angle between the incident direction of the light and the normal vector of the photosensor at the bottom of the canopy; the calculated leaf area index LAI_cal can then be organized as:

LAI_cal = 2 ∫_0^(π/2) −ln[P_0(θ_v)] · cos θ_v · sin θ_v dθ_v    (10)

Welles proposed a discrete numerical analysis method for the integral expression (10) based on multi-view observation, in which the field of view is divided into rings by a plurality of zenith observation angles and the average vegetation gap fraction P̄_0(θ_v,i) within each ring is used to discretize the integral (10):

LAI_cal = 2 Σ_i [−ln P̄_0(θ_v,i) / S_i(θ_v,i)] · W_i    (11)

In the formula, S_i(θ_v) = 1/cos θ_v is the path-length factor, W_i = sin θ_v dθ_v is the ring weight, and i denotes the ring index; the coefficients differ according to the angular range of the ring taken.
7. The fisheye-image-based leaf area index estimation method of claim 6, wherein the rings are divided by 5 zenith observation angles of 0-15 degrees, 15-30 degrees, 30-45 degrees, 45-60 degrees and 60-75 degrees, corresponding respectively to i = 1 to i = 5.
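Steps 2-3 of claim 1 — ring segmentation of a binarized fisheye frame followed by the gap-fraction inversion of claims 6-7 — can be sketched end to end as follows, again assuming the sin θ_v = r/R mapping and ring-centre weights (illustrative choices, not prescribed values):

```python
import numpy as np

# End-to-end sketch of steps 2-3 of claim 1 on a synthetic binary gap
# mask (True = gap/sky pixel), using the five 15-degree rings of claim 7.
# Real use would start from an HSV-segmented fisheye photograph.

EDGES = np.radians([0, 15, 30, 45, 60, 75])

def lai_from_gap_mask(gap_mask):
    """LAI via Eq. (11) from a square boolean mask with the image circle
    inscribed; pixels beyond 75 degrees zenith are discarded."""
    n = gap_mask.shape[0]
    R = n / 2.0
    yy, xx = np.mgrid[0:n, 0:n]
    r = np.hypot(xx - R + 0.5, yy - R + 0.5)        # radius of pixel centres
    theta = np.arcsin(np.clip(r / R, 0.0, 1.0))      # sin(theta_v) = r / R
    lai = 0.0
    for i in range(5):
        ring = (theta >= EDGES[i]) & (theta < EDGES[i + 1])
        p0 = np.clip(gap_mask[ring].mean(), 1e-6, 1.0)  # avg gap fraction
        tc = 0.5 * (EDGES[i] + EDGES[i + 1])         # ring-centre zenith angle
        w = np.sin(tc) * (EDGES[i + 1] - EDGES[i])   # W_i
        lai += -np.log(p0) * np.cos(tc) * w          # [-ln(P0)/S_i]*W_i
    return 2.0 * lai

# A half-open synthetic canopy: each pixel has a 50% chance of being gap.
rng = np.random.default_rng(0)
mask = rng.random((400, 400)) < 0.5
```

A fully open mask yields zero LAI; the half-open mask above lands near 0.58, the value expression (11) gives when every ring's gap fraction is 0.5.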
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211290462.2A CN115655157A (en) | 2022-10-21 | 2022-10-21 | Fish-eye image-based leaf area index measuring and calculating method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115655157A true CN115655157A (en) | 2023-01-31 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118397075A (en) * | 2024-06-24 | 2024-07-26 | 合肥工业大学 | Calculation method of mountain forest effective leaf area index based on fisheye camera |
Legal Events

Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||