CN111223138B - Blade form calibration and extraction method - Google Patents

Blade form calibration and extraction method

Info

Publication number
CN111223138B
CN111223138B · CN201911247189.3A · CN201911247189A
Authority
CN
China
Prior art keywords
image
point
blade
pixel
circle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911247189.3A
Other languages
Chinese (zh)
Other versions
CN111223138A (en
Inventor
宫志宏
刘布春
黎贞发
李春
董朝阳
刘园
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Climate Center; Tianjin Ecological Meteorological and Satellite Remote Sensing Center; Tianjin Agricultural Meteorological Center
Institute of Environment and Sustainable Development in Agriculture, CAAS
Original Assignee
Tianjin Climate Center; Tianjin Ecological Meteorological and Satellite Remote Sensing Center; Tianjin Agricultural Meteorological Center
Institute of Environment and Sustainable Development in Agriculture, CAAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Climate Center, Tianjin Ecological Meteorological and Satellite Remote Sensing Center, Tianjin Agricultural Meteorological Center, and the Institute of Environment and Sustainable Development in Agriculture, CAAS
Priority to CN201911247189.3A
Publication of CN111223138A
Application granted
Publication of CN111223138B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformation in the plane of the image
    • G06T 3/60 Rotation of a whole image or part thereof
    • G06T 3/604 Rotation of a whole image or part thereof using a CORDIC [COordinate Rotation DIgital Computer] device
    • G06T 5/70
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E 10/00 Energy generation through renewable energy sources
    • Y02E 10/70 Wind energy
    • Y02E 10/72 Wind turbines with rotation axis in wind direction

Abstract

The invention discloses a method for calibrating and extracting leaf morphology, comprising the following steps: step one, manufacturing a calibration plate; step two, acquiring images of the leaf with an image acquisition device to obtain an original image O_i(x, y); step three, processing the acquired image, namely smoothing and denoising the image to remove stray points, and obtaining the effective area of the calibration plate; and step four, extracting leaf morphology from the processed image. The method solves the problem that prior-art leaf-shape measurement cannot handle cavities in the leaf, calibrates distortion in the leaf image, and yields more accurate measurement results.

Description

Blade form calibration and extraction method
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a blade form calibration and extraction method.
Background
As one of the main interfaces through which the plant ecosystem contacts the atmosphere, the leaf is the material basis of photosynthesis, the organ through which green plants produce nutrients, and the medium of transpiration. Leaf area is an important index for evaluating plant growth and development, yield formation and variety characteristics, and an important reference for rational crop cultivation and management and for studying the occurrence and development of plant diseases and insect pests. Leaf area therefore holds an irreplaceable position in research on plant physiology and biochemistry, genetic breeding, crop cultivation and the like. In crop cultivation, the leaf area index is commonly used to measure the growth of a crop population and serves as a reference index for determining cultivation measures; in addition, measuring the leaf area damaged by pests is important for studying pest-induced yield loss. Accurately measuring plant leaf area is thus significant for formulating reasonable cultivation-management and pruning practices and for determining the application intensity of fertilizers and pesticides.
Currently, leaf area is measured mainly by hand or by instrument. Manual methods include the grid-paper (coordinate-paper) method, the weighing method, regression analysis and the coefficient method. Instrument methods mainly comprise the planimeter method and the leaf area meter method. The leaf area meter is convenient and fast, but because common handheld leaf area meters are mostly semi-automatic instruments, the leaf must pass through the scanning area vertically and at uniform speed, a requirement that is generally not fully met in practice; such leaf area meter measurements are therefore not accurate enough.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method for calibrating and extracting the shape of a blade.
The invention is realized by the following technical scheme: a method of leaf morphology calibration and extraction, comprising the following steps:
step one, manufacturing a calibration plate;
step two, acquiring images of the leaf with an image acquisition device to obtain an original image O_i(x, y);
step three, processing the acquired image;
I. smoothing and denoising the image to remove stray points;
the value of each point in the leaf image is replaced by the median of a neighborhood of that point; since the surrounding pixel values are close to the true value, isolated noise points are eliminated; the formula is:
P_i(x, y) = med{ O_i(x − k, y − l), (k, l) ∈ R }
where O_i(x, y) and P_i(x, y) are the original and processed images respectively, and R is a two-dimensional neighborhood matrix;
II. Obtaining the effective area of the calibration plate;
First, image correction is performed on P_i(x, y).
i. Circle center search
x_1 = x − r·cosθ
y_1 = y − r·sinθ
For each edge point (x, y) of image P_i(x, y), a complete circle of radius r is generated as θ varies from 0 to 360 degrees; each point in P_i(x, y) thus corresponds to a circle in (x, y, r) space. For a particular (x, y), θ is scanned from 0 to 360 degrees with an assumed radius r, and the (x_1, y_1) coordinate that occurs most often among the resulting values is taken as the circle center;
ii. Distortion calibration
Because there is an inclination angle between the image acquisition device and the calibration plate when obtaining the image O_i(x, y), a vertically downward (orthographic) projection cannot be formed. Using the condition that the perspective center, image point and target point are collinear, the bearing (perspective) plane is rotated by a certain angle about the trace line (perspective axis) according to the law of perspective rotation, so that the projected geometric figure on the bearing plane remains unchanged. The transformation formula is:
[x′  y′  w′] = [x  y  1] · A,   A = | a11  a12  a13 |
                                    | a21  a22  a23 |
                                    | a31  a32  a33 |
where (x, y) are the pixel coordinates of P_i(x, y), and (x′, y′) (after normalization by w′) are the pixel coordinates of the transformed image P1_i(x, y). The perspective transformation matrix A is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image; the column [a13 a23]ᵀ generates the perspective effect of the image; [a31 a32] represents the image translation;
step four, extracting leaf morphology from the processed image;
I. leaf part extraction;
i. preliminarily removing the non-leaf portion;
let the RGB primaries of the pixel at point (x, y) of image P1_i(x, y) be (R, G, B); if 2·G(x, y) − (B(x, y) + R(x, y)) > 0, the pixel at (x, y) belongs to a green leaf and (R, G, B) keeps its initial value; otherwise the pixel belongs to a withered-leaf or non-leaf part and is set to (R, G, B) = (255, 255, 255), yielding image P2_i(x, y);
ii. binarization;
according to the gray-level characteristics of the image, the maximum between-class variance (OTSU) algorithm divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), yielding image P3_i(x, y);
II. leaf acquisition and separation;
i. removing cavities inside the leaf, and smoothing the leaf;
first, the boundary of image P3_i(x, y) is eroded with a structuring matrix b (chosen as a 3 × 3 or 5 × 5 region) to eliminate small impurities, yielding image P4_i(x, y); the specific operation is:
P4_i = P3_i ⊖ b = { (x, y) | b translated to (x, y) is entirely contained in the foreground of P3_i }
This formula erodes P3_i(x, y) by the structuring matrix b: the origin of b is translated to pixel (x, y) of image P3_i(x, y); if b at (x, y) is completely contained in the overlapping foreground region of image P3_i(x, y), the RGB value of the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise it is assigned (R, G, B) = (255, 255, 255). Then image P4_i(x, y) is dilated with the structuring matrix b (chosen as a 3 × 3 or 5 × 5 region) to fill some cavities in P4_i(x, y) and eliminate impurities.
In this technical scheme, the specific operation of filling cavities in P4_i(x, y) and eliminating impurities is:
P5_i = P4_i ⊕ b = { (x, y) | b translated to (x, y) intersects the foreground of P4_i }
This formula translates the origin of the structuring element b to pixel (x, y) of image P4_i(x, y). If the intersection of b at pixel (x, y) with the foreground of P4_i(x, y) is not empty, the RGB value of the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0); otherwise it is assigned (R, G, B) = (255, 255, 255).
In this technical scheme, after filling the cavities and eliminating impurities, the image smoothing and denoising of step I in step three is finally repeated to remove the influence of any residual impurities.
The advantages and beneficial effects of the invention are: the method solves the problem that cavities in the leaf cannot be handled when measuring leaf shape in the prior art, calibrates distortion in the leaf image, and yields more accurate measurement results.
Drawings
Fig. 1 is a schematic structural view of the present invention.
Other relevant drawings may be obtained by those of ordinary skill in the art from the above figure without inventive effort.
Detailed Description
In order to enable those skilled in the art to better understand the solution of the present invention, the solution is described below with reference to specific embodiments.
A method of blade morphology calibration and extraction comprising the steps of:
1) Manufacturing calibration plate
The calibration plate is a rectangular flat board with a white background; its four corners each contain a black hollow circle, and the centers of the four circles connect to form a rectangle of known area S_bd. The size of the rectangle is set according to the area of the leaf to be measured, ensuring that the leaf lies inside the rectangle formed by connecting the four circles.
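The patent does not spell out the final area formula, but the known rectangle area S_bd implies a simple pixel-to-area scaling once the four circle centers are located. The following Python/NumPy sketch (illustrative only; the function names are assumptions, not from the patent) computes the leaf area from its pixel count and the rectangle spanned by the circle centers:

```python
def polygon_area(pts):
    """Shoelace formula for the area of a polygon given as [(x, y), ...]."""
    n = len(pts)
    s = 0.0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def leaf_area_cm2(leaf_pixels, circle_centres, s_bd_cm2):
    """Scale the leaf's pixel count by the known rectangle area S_bd."""
    rect_pixels = polygon_area(circle_centres)
    return leaf_pixels * s_bd_cm2 / rect_pixels

# A 200 x 100 pixel rectangle standing for S_bd = 50 cm^2:
centres = [(0, 0), (200, 0), (200, 100), (0, 100)]
print(leaf_area_cm2(5000, centres, 50.0))  # 5000 leaf pixels -> 12.5 cm^2
```

The shoelace formula keeps the computation valid even if the corrected rectangle is slightly non-axis-aligned.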
2) Image acquisition of blades
The calibration plate is placed behind the leaf, and the leaf is photographed with a mobile phone, camera or other imaging device; the included angle between the camera lens and the calibration plate is kept at no more than 45 degrees.
3) Blade image distortion calibration, obtaining effective area of calibration plate
I. Smoothing and denoising the image to remove stray points
The value of each point in the leaf image is replaced by the median of a neighborhood of that point; since the surrounding pixel values are close to the true value, isolated noise points are eliminated. The formula is:
P_i(x, y) = med{ O_i(x − k, y − l), (k, l) ∈ R }
where O_i(x, y) and P_i(x, y) are the original and processed images respectively, and R is a two-dimensional neighborhood, recommended as 3 × 3 or 5 × 5. If R is a 3 × 3 matrix then k ∈ [−1, 1] and l ∈ [−1, 1]; if R is 5 × 5 then k ∈ [−2, 2] and l ∈ [−2, 2], where k and l are integers.
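The median filtering above can be sketched in a few lines of Python/NumPy. This is a minimal illustration, not the patent's implementation; the edge-replication padding is an assumption, since the patent does not specify border handling:

```python
import numpy as np

def median_denoise(img, k=3):
    """Median filter with a k x k window (k = 3 or 5, as recommended).
    Edges are handled by edge replication (an assumption)."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            # Replace each pixel by the median of its k x k neighborhood R
            out[y, x] = np.median(padded[y:y + k, x:x + k])
    return out

noisy = np.full((5, 5), 10, dtype=np.uint8)
noisy[2, 2] = 255                    # isolated noise point
clean = median_denoise(noisy)
print(clean[2, 2])                   # the outlier is replaced -> 10
```

In practice a library routine (e.g. a prebuilt median filter) would replace the explicit double loop, but the loop mirrors the formula directly.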
II. Obtaining the effective area of the calibration plate
When acquiring the original image O_i(x, y), the lens and the calibration plate cannot be guaranteed to be perfectly parallel, so the image has some deformation; at the same time, the area of the calibration plate must be obtained. The method therefore first performs image correction on P_i(x, y):
i. Circle center search
x_1 = x − r·cosθ
y_1 = y − r·sinθ
For each edge point (x, y) of image P_i(x, y), a complete circle of radius r is generated as θ varies from 0 to 360 degrees; each point in P_i(x, y) thus corresponds to a circle in (x, y, r) space. For a particular (x, y), θ is scanned from 0 to 360 degrees with an assumed radius r, and the (x_1, y_1) coordinate that occurs most often among the resulting values is taken as the circle center.
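The voting scheme above is the classical Hough transform for circles of known radius. A minimal sketch (assumed details: integer accumulator bins, 1-degree θ steps; not the patent's code) is:

```python
import numpy as np

def hough_centre(edge_points, r, shape, n_theta=360):
    """Vote for circle centres using x1 = x - r*cos(theta), y1 = y - r*sin(theta)
    with a known radius r; the most-voted (x1, y1) bin is the centre."""
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.deg2rad(np.arange(n_theta))
    for (x, y) in edge_points:
        x1 = np.rint(x - r * np.cos(thetas)).astype(int)
        y1 = np.rint(y - r * np.sin(thetas)).astype(int)
        ok = (x1 >= 0) & (x1 < shape[1]) & (y1 >= 0) & (y1 < shape[0])
        np.add.at(acc, (y1[ok], x1[ok]), 1)   # accumulate votes
    cy, cx = np.unravel_index(acc.argmax(), acc.shape)
    return cx, cy

# Edge points of a circle of radius 10 centred at (30, 30):
t = np.deg2rad(np.arange(0, 360, 5))
pts = list(zip(np.rint(30 + 10 * np.cos(t)).astype(int),
               np.rint(30 + 10 * np.sin(t)).astype(int)))
centre = hough_centre(pts, 10, (60, 60))
print(centre)   # recovered centre, near (30, 30) up to rounding
```

In a production pipeline an optimized routine such as OpenCV's circle detector would normally be used instead of this explicit accumulator.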
ii. Distortion calibration
Because there is an inclination angle between the camera and the calibration plate during image acquisition, a vertically downward orthographic projection cannot be formed. Using the condition that the perspective center, image point and target point are collinear, the bearing (perspective) plane is rotated by a certain angle about the trace line (perspective axis) according to the law of perspective rotation, so that the projected geometric figure on the bearing plane remains unchanged. The transformation formula is:
[x′  y′  w′] = [x  y  1] · A,   A = | a11  a12  a13 |
                                    | a21  a22  a23 |
                                    | a31  a32  a33 |
where (x, y) are the pixel coordinates of P_i(x, y), and (x′, y′) (after normalization by w′) are the pixel coordinates of the transformed image P1_i(x, y). The perspective transformation matrix A is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image; the column [a13 a23]ᵀ generates the perspective effect of the image; [a31 a32] represents the image translation.
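With the four detected circle centers and the known target rectangle, the eight unknown entries of A (a33 fixed to 1) follow from a linear solve. The sketch below (illustrative; it uses the equivalent column-vector convention H·[x y 1]ᵀ, the transpose of the patent's row-vector form) maps the imaged quadrilateral back to a fronto-parallel rectangle:

```python
import numpy as np

def find_homography(src, dst):
    """Solve for the 3x3 perspective matrix mapping four source points to
    four destination points, with the bottom-right entry fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Apply the perspective transform to one pixel coordinate."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Four imaged circle centres (a tilted view of the calibration rectangle)
# mapped back to a fronto-parallel 200 x 100 rectangle:
src = [(10, 5), (190, 20), (200, 110), (5, 95)]
dst = [(0, 0), (200, 0), (200, 100), (0, 100)]
H = find_homography(src, dst)
corner = warp_point(H, 10, 5)
print(corner)   # maps to approximately (0.0, 0.0)
```

Warping the whole image then amounts to applying `warp_point` (or a library routine such as an OpenCV perspective warp) to every pixel.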
4) Leaf morphology extraction
I. Leaf part extraction
i. Preliminary removal of the non-leaf portion
Let the RGB primaries of the pixel at point (x, y) of image P1_i(x, y) be (R, G, B). If 2·G(x, y) − (B(x, y) + R(x, y)) > 0, the pixel at (x, y) belongs to a green leaf and (R, G, B) keeps its initial value; otherwise the pixel belongs to a withered-leaf or non-leaf part and is set to (R, G, B) = (255, 255, 255), yielding image P2_i(x, y).
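The green-excess test 2·G − (B + R) > 0 vectorizes directly. A minimal sketch (not the patent's code; the int32 cast is an assumption to avoid uint8 overflow):

```python
import numpy as np

def remove_non_leaf(img_rgb):
    """Keep pixels where 2*G - (B + R) > 0 (green leaf); set the rest to
    white, as in the preliminary non-leaf removal step."""
    img = img_rgb.astype(np.int32)            # avoid uint8 wraparound
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mask = 2 * g - (b + r) > 0
    out = img_rgb.copy()
    out[~mask] = (255, 255, 255)              # non-leaf pixels become white
    return out, mask

pixels = np.array([[[40, 120, 30],                      # green leaf pixel
                    [120, 100, 110]]], dtype=np.uint8)  # brownish, non-leaf
out, mask = remove_non_leaf(pixels)
print(mask[0].tolist())   # [True, False]
```

The overflow cast matters: on raw uint8 data, 2*G can wrap around and silently flip the sign of the test.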
ii. Binarization
According to the gray-level characteristics of the image, the maximum between-class variance (OTSU) algorithm divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), yielding image P3_i(x, y).
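The OTSU criterion picks the threshold maximizing the between-class variance w0·w1·(μ0 − μ1)². A compact sketch of the algorithm (illustrative; library implementations exist in OpenCV and scikit-image):

```python
import numpy as np

def otsu_threshold(gray):
    """Maximum between-class variance (OTSU) threshold for a uint8 image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    grand_sum = (hist * np.arange(256)).sum()
    best_t, best_var = 0, 0.0
    cum, cum_mean = 0.0, 0.0
    for t in range(256):
        cum += hist[t]                 # pixels with value <= t (class 0)
        cum_mean += t * hist[t]
        if cum == 0 or cum == total:
            continue
        w0 = cum / total
        mu0 = cum_mean / cum
        mu1 = (grand_sum - cum_mean) / (total - cum)
        var = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

gray = np.array([[10, 12, 11], [200, 210, 205]], dtype=np.uint8)
t = otsu_threshold(gray)
print(10 < t < 200)   # the threshold separates the two clusters -> True
```

Pixels at or below the returned threshold form one class; thresholding then yields the binary image P3_i(x, y).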
II. Leaf acquisition and separation
i. Removing cavities and smoothing the leaf
First, the boundary of image P3_i(x, y) is eroded with a structuring matrix b (chosen as a 3 × 3 or 5 × 5 region) to eliminate small impurities, yielding image P4_i(x, y). The specific operation is:
P4_i = P3_i ⊖ b = { (x, y) | b translated to (x, y) is entirely contained in the foreground of P3_i }
This formula erodes P3_i(x, y) by the structuring matrix b: the origin of b is translated to pixel (x, y) of image P3_i(x, y); if b at (x, y) is completely contained in the overlapping foreground region of image P3_i(x, y), the RGB value of the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise it is assigned (R, G, B) = (255, 255, 255).
Then image P4_i(x, y) is dilated with the structuring matrix b (chosen as a 3 × 3 or 5 × 5 region) to fill some cavities in P4_i(x, y) and eliminate impurities. The specific operation is as follows:
P5_i = P4_i ⊕ b = { (x, y) | b translated to (x, y) intersects the foreground of P4_i }
This formula translates the origin of the structuring element b to pixel (x, y) of image P4_i(x, y). If the intersection of b at pixel (x, y) with the foreground of P4_i(x, y) is not empty, the RGB value of the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0); otherwise it is assigned (R, G, B) = (255, 255, 255).
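The erosion-then-dilation sequence above (a morphological opening) can be sketched on a boolean mask, where True marks leaf foreground. This is an illustration, not the patent's code; the zero-padded border is an assumption:

```python
import numpy as np

def erode(mask, k=3):
    """Binary erosion: keep a pixel only if the whole k x k window b
    lies in the foreground (the patent's shrink step)."""
    pad = k // 2
    p = np.pad(mask, pad, mode='constant', constant_values=False)
    out = np.ones_like(mask)
    for dy in range(k):
        for dx in range(k):
            out &= p[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

def dilate(mask, k=3):
    """Binary dilation: a pixel is foreground if any of the k x k window
    intersects the foreground (the patent's expand step)."""
    pad = k // 2
    p = np.pad(mask, pad, mode='constant', constant_values=False)
    out = np.zeros_like(mask)
    for dy in range(k):
        for dx in range(k):
            out |= p[dy:dy + mask.shape[0], dx:dx + mask.shape[1]]
    return out

leaf = np.zeros((9, 9), dtype=bool)
leaf[2:7, 2:7] = True                    # the leaf blob
leaf[0, 8] = True                        # an isolated impurity pixel
opened = dilate(erode(leaf))             # patent order: shrink, then expand
print(opened[0, 8], opened[4, 4])        # impurity removed, leaf interior kept
```

Erosion deletes specks smaller than b, and the following dilation restores the leaf to (approximately) its original extent.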
Exemplary embodiments of the invention have been described above. It should be understood that those skilled in the art may make simple variations, modifications or other equivalent substitutions without inventive effort and without departing from the spirit of the invention, all of which fall within its scope of protection.

Claims (2)

1. A method for calibrating and extracting leaf morphology, characterized by comprising the following steps:
step one, manufacturing a calibration plate;
step two, acquiring images of the leaf with an image acquisition device to obtain an original image O_i(x, y);
step three, processing the acquired image;
I. smoothing and denoising the image to remove stray points;
the value of each point in the leaf image is replaced by the median of a neighborhood of that point; since the surrounding pixel values are close to the true value, isolated noise points are eliminated; the formula is:
P_i(x, y) = med{ O_i(x − k, y − l), (k, l) ∈ R }
where O_i(x, y) and P_i(x, y) are the original and processed images respectively, and R is a two-dimensional neighborhood matrix;
II. obtaining the effective area of the calibration plate;
first, image correction is performed on P_i(x, y);
i. circle center search
x_1 = x − r·cosθ
y_1 = y − r·sinθ
for each edge point (x, y) of image P_i(x, y), a complete circle of radius r is generated as θ varies from 0 to 360 degrees; each point in P_i(x, y) corresponds to a circle in (x, y, r) space; for a particular (x, y), θ is scanned from 0 to 360 degrees with an assumed radius r, and the (x_1, y_1) coordinate that occurs most often among the resulting values is taken as the circle center;
ii. distortion calibration
because there is an inclination angle between the image acquisition device and the calibration plate when obtaining the image O_i(x, y), a vertically downward projection cannot be formed; using the condition that the perspective center, image point and target point are collinear, the bearing plane is rotated by a certain angle about the trace line according to the law of perspective rotation, so that the projected geometric figure on the bearing plane remains unchanged; the transformation formula is:
[x′  y′  w′] = [x  y  1] · A,   A = | a11  a12  a13 |
                                    | a21  a22  a23 |
                                    | a31  a32  a33 |
where (x, y) are the pixel coordinates of P_i(x, y), and (x′, y′) (after normalization by w′) are the pixel coordinates of the transformed image P1_i(x, y); the perspective transformation matrix A is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image; the column [a13 a23]ᵀ generates the perspective effect of the image; [a31 a32] represents the image translation;
step four, extracting leaf morphology from the processed image;
I. leaf part extraction;
i. preliminarily removing the non-leaf portion;
let the RGB primaries of the pixel at point (x, y) of image P1_i(x, y) be (R, G, B); if 2·G(x, y) − (B(x, y) + R(x, y)) > 0, the pixel at (x, y) belongs to a green leaf and (R, G, B) keeps its initial value; otherwise the pixel belongs to a withered-leaf or non-leaf part and is set to (R, G, B) = (255, 255, 255), obtaining image P2_i(x, y);
ii. binarization;
according to the gray-level characteristics of the image, the maximum between-class variance (OTSU) algorithm divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), obtaining image P3_i(x, y);
II. leaf acquisition and separation;
i. removing cavities inside the leaf, and smoothing the leaf;
first, the boundary of image P3_i(x, y) is eroded with a structuring matrix b to eliminate small impurities, obtaining image P4_i(x, y); the specific operation is:
P4_i = P3_i ⊖ b = { (x, y) | b translated to (x, y) is entirely contained in the foreground of P3_i }
this formula erodes P3_i(x, y) by the structuring matrix b: the origin of b is translated to pixel (x, y) of image P3_i(x, y); if b at (x, y) is completely contained in the overlapping foreground region of image P3_i(x, y), the RGB value of the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise it is assigned (R, G, B) = (255, 255, 255);
then image P4_i(x, y) is dilated with the structuring matrix b to fill some cavities in P4_i(x, y) and eliminate impurities.
2. The method for calibrating and extracting leaf morphology according to claim 1, characterized in that the specific operation of filling cavities in P4_i(x, y) and eliminating impurities is:
P5_i = P4_i ⊕ b = { (x, y) | b translated to (x, y) intersects the foreground of P4_i }
this formula translates the origin of the structuring element b to pixel (x, y) of image P4_i(x, y); if the intersection of b at pixel (x, y) with the foreground of P4_i(x, y) is not empty, the RGB value of the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0); otherwise it is assigned (R, G, B) = (255, 255, 255).
CN201911247189.3A 2019-12-09 2019-12-09 Blade form calibration and extraction method Active CN111223138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911247189.3A CN111223138B (en) 2019-12-09 2019-12-09 Blade form calibration and extraction method


Publications (2)

Publication Number Publication Date
CN111223138A CN111223138A (en) 2020-06-02
CN111223138B true CN111223138B (en) 2023-05-09

Family

ID=70830729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911247189.3A Active CN111223138B (en) 2019-12-09 2019-12-09 Blade form calibration and extraction method

Country Status (1)

Country Link
CN (1) CN111223138B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593840A * 2013-09-30 2014-02-19 北京林业大学 Method for detecting the phenotype of Arabidopsis
CN103591887A * 2013-09-30 2014-02-19 北京林业大学 Method for detecting the regional phenotype of Arabidopsis
CN106468536A * 2016-11-22 2017-03-01 天津市气候中心 A leaf area measurement method
CN108765433A * 2018-05-31 2018-11-06 西京学院 A high-precision leaf area measurement method
CN108844499A * 2018-05-03 2018-11-20 岭南师范学院 A leaf area measurement method
CN109191520A * 2018-09-30 2019-01-11 湖北工程学院 A leaf area measurement method and system based on color calibration
CN109241966A * 2018-08-22 2019-01-18 东北农业大学 A non-destructive plant leaf collection method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
宫志宏, 薛庆禹, 于红, 李春. Measurement of cucumber leaf area based on digital photograph analysis and evaluation of its effect. 32nd Annual Meeting of the Chinese Meteorological Society, 2015. *
李彦锋. Research on image-processing-based methods for acquiring geometric morphology data of plant leaves. CNKI, 2006. *
胡迪. Research on plant leaf measurement methods. CNKI, 2011. *

Also Published As

Publication number Publication date
CN111223138A (en) 2020-06-02

Similar Documents

Publication Publication Date Title
CN106468543B Leaf area measurement method based on image processing
CN109708578B (en) Plant phenotype parameter measuring device, method and system
CN102506772B (en) Method and device for quickly detecting area of leaf blade based on mobile phone
CN109816680B (en) High-throughput calculation method for crop plant height
CN108765433A A high-precision leaf area measurement method
CN112465832B (en) Single-side tree point cloud skeleton line extraction method and system based on binocular vision
CN109859101B (en) Crop canopy thermal infrared image identification method and system
CN106468536A A leaf area measurement method
CN110610438B (en) Crop canopy petiole included angle calculation method and system
CN112700488A (en) Living body long blade area analysis method, system and device based on image splicing
CN110969654A (en) Corn high-throughput phenotype measurement method and device based on harvester and harvester
CN112465893A (en) Vegetation leaf area measuring method and device, terminal and storage medium
CN115687850A (en) Method and device for calculating irrigation water demand of farmland
CN111223138B (en) Blade form calibration and extraction method
CN105761259A (en) Wheat leaf stoma density measurement method based on microscopic image
CN113920106A (en) Corn growth three-dimensional reconstruction and stem thickness measurement method based on RGB-D camera
CN112668406B (en) Effective screening method for soybean varieties
CN112330672B (en) Crop leaf area index inversion method based on PROSAIL model and canopy coverage optimization
CN103919556A (en) Cow body shape trait index data collecting method based on three-dimensional measuring
CN112686859A (en) Crop CWSI detection method based on thermal infrared and RGB-D camera
He et al. A calculation method of phenotypic traits of soybean pods based on image processing technology
CN112435290A (en) Leaf area image measuring method based on saturation segmentation
Feng et al. Calculating the leaf-area based on non-loss correction algorithm
AU2014267257B2 (en) Device and method for the parameterisation of a plant
CN114049390A (en) Wheat seedling planting density measuring device and method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant