CN111223138A - Method for calibrating and extracting blade form

Method for calibrating and extracting blade form

Info

Publication number
CN111223138A
Authority
CN
China
Prior art keywords
image
blade
point
pixel
Prior art date
Legal status
Granted
Application number
CN201911247189.3A
Other languages
Chinese (zh)
Other versions
CN111223138B (en)
Inventor
宫志宏
刘布春
黎贞发
李春
董朝阳
刘园
Current and Original Assignee
Tianjin Climate Center (Tianjin Ecological Meteorological and Satellite Remote Sensing Center, Tianjin Agricultural Meteorological Center)
Institute of Environment and Sustainable Development in Agriculture, CAAS
Priority date: 2019-12-09
Filing date: 2019-12-09
Publication date: 2020-06-02
Application filed by Tianjin Climate Center (Tianjin Ecological Meteorological and Satellite Remote Sensing Center, Tianjin Agricultural Meteorological Center) and the Institute of Environment and Sustainable Development in Agriculture, CAAS
Priority to CN201911247189.3A
Publication of CN111223138A: 2020-06-02
Application granted; publication of CN111223138B: 2023-05-09
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/60 Rotation of whole images or parts thereof
    • G06T 3/604 Rotation of whole images or parts thereof using coordinate rotation digital computer [CORDIC] devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E 10/00 Energy generation through renewable energy sources
    • Y02E 10/70 Wind energy
    • Y02E 10/72 Wind turbines with rotation axis in wind direction

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for calibrating and extracting the shape of a blade (leaf), comprising the following steps: step one, manufacturing a calibration plate; step two, acquiring an image of the blade with an image acquisition device to obtain an original image O_i(x, y); step three, processing the acquired image: smoothing and denoising the image, removing the influence of noise points, and obtaining the effective area of the calibration plate; and step four, extracting the blade shape from the processed image. The method solves the problem that blade cavities cannot be handled when blade shape is measured in the prior art, corrects blade image distortion, and yields a more accurate measurement result.

Description

Method for calibrating and extracting blade form
Technical Field
The invention belongs to the technical field of image processing, and particularly relates to a method for calibrating and extracting blade morphology.
Background
The leaf is one of the main interfaces between a plant ecosystem and the atmosphere: it is the material basis of photosynthesis, the 'green factory' in which the plant produces nutrients, and the medium of plant transpiration. Leaf area is an important index for evaluating plant growth and development, yield formation and variety characteristics, and it also informs reasonable cultivation management and the monitoring of pest occurrence and development. Leaf area therefore has an irreplaceable position in research on plant physiology and biochemistry, genetic breeding, crop cultivation and the like. In crop cultivation, the growth condition of a crop population is measured by the leaf area index and used as a reference for determining cultivation measures; in addition, determining the leaf area eaten by pests is an important part of studying pest damage and loss. Accurately measuring plant leaf area is thus significant for formulating a reasonable cultivation management mode, for pruning branches and leaves, and for determining how heavily fertilizers and pesticides should be applied.
At present, leaf area is measured mainly by manual methods and by instruments. Manual methods include the coordinate paper (grid) method, the weighing method, regression analysis and the coefficient method. Instrument methods mainly comprise the integrator method and the leaf area meter method. The leaf area meter is convenient and fast, but common hand-held leaf area meters are mostly semi-automatic instruments that require the leaf to pass vertically and uniformly through a scanning area, a requirement that cannot always be met during measurement; moreover, such leaf integrator measurements are not accurate enough.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a method for calibrating and extracting the shape of a blade.
The invention is realized by the following technical scheme: a method for calibrating and extracting blade morphology comprises the following steps:
step one, manufacturing a calibration plate;
step two, image acquisition is carried out on the blade by an image acquisition device to obtain an original image O_i(x, y);
Step three, processing the collected image;
I. Smoothing and denoising the image to remove the influence of noise points;
the value of each point in the blade image is replaced by the median value over a neighborhood of that point; because the surrounding pixel values are close to the true value, isolated noise points are eliminated, and the formula is as follows:
P_i(x, y) = mid{ O_i(x - k, y - l), (k, l) ∈ R }
wherein O_i(x, y) and P_i(x, y) are respectively the original image and the processed image, and R is a two-dimensional matrix;
II. Obtaining the effective area of the calibration plate;
first, image correction is performed on P_i(x, y);
i. circle center detection
x1 = x - r·cosθ
y1 = y - r·sinθ
taking (x, y) as a point on the image, as θ varies from 0° to 360° the pair (x1, y1) traces a complete circle of radius r; each point in image P_i(x, y) therefore corresponds to a circle in (x1, y1, r) space; for an assumed circle radius r, θ is scanned from 0° to 360° for every point, and the group of (x1, y1) coordinates that occurs most often is the circle center;
ii. distortion calibration
Because an inclination angle is formed between the image acquisition device and the calibration plate when the image O_i(x, y) is acquired, a vertical downward (orthographic) projection cannot be formed. Using the condition that the perspective center, the image point and the target point are collinear, the bearing surface (perspective surface) is rotated about the trace line (perspective axis) by a certain angle according to the law of perspective rotation, which keeps the projected geometric figure on the bearing surface unchanged. The transformation formula is as follows:
[x', y', w'] = [x, y, 1] · A, where A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]]
Here (x, y) are the pixel coordinates of image P_i(x, y), and (x1, y1) = (x'/w', y'/w') are the pixel coordinates of the transformed image P1_i(x, y). The perspective transformation matrix is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image, the column [a13; a23] produces the perspective transformation of the image, and [a31 a32] represents image translation;
step four, extracting the blade shape from the processed image;
I. leaf region extraction;
i. preliminarily removing non-blade parts;
let the RGB three primary colors of pixel (x, y) in image P1_i(x, y) be (R, G, B); if 2*G(x, y) - (B(x, y) + R(x, y)) > 0, the pixel (x, y) is a green leaf part and (R, G, B) keeps its initial value; otherwise the pixel (x, y) is a withered-leaf or non-leaf part and (R, G, B) = (255, 255, 255), and an image P2_i(x, y) is obtained;
ii. binarization;
according to the gray-level characteristics of the image, the OTSU algorithm of the maximum between-class variance method divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), and an image P3_i(x, y) is obtained;
II. Blade acquisition and separation;
i. removing the cavities in the blade and smoothing the blade;
first, the boundary of image P3_i(x, y) is eroded with a structuring matrix b (a 3 x 3 or 5 x 5 matrix region) to eliminate small impurities, and an image P4_i(x, y) is obtained. The specific calculation is as follows:
P4_i = P3_i ⊖ b = { (x, y) | b_(x, y) ⊆ P3_i }
The formula represents the erosion of P3_i(x, y) by the structuring matrix b: the origin of b is translated to position (x, y) of image P3_i(x, y); if b placed at (x, y) is completely contained in the foreground of image P3_i(x, y), the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255). Then image P4_i(x, y) is dilated with the structuring matrix b (a 3 x 3 or 5 x 5 matrix region) to fill some cavities in P4_i(x, y) and eliminate impurities.
In this technical scheme, the specific operation of filling the cavities in P4_i(x, y) and removing impurities is as follows:
P5_i = P4_i ⊕ b = { (x, y) | b_(x, y) ∩ P4_i ≠ ∅ }
This formula represents the dilation of P4_i(x, y) by the structuring element b: the origin of b is translated to pixel (x, y) of image P4_i(x, y); if b placed at (x, y) has a non-empty intersection with the foreground of P4_i(x, y), the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255).
In this technical scheme, after the cavities are filled and the impurities are eliminated, step I of smoothing and denoising the image in step three is finally repeated to eliminate the influence of any remaining noise points.
The advantages and beneficial effects of the invention are as follows: the method solves the problem that blade cavities cannot be processed when the blade shape is measured in the prior art, corrects blade image distortion, and gives a more accurate measurement result.
Drawings
Fig. 1 is a schematic structural view of the present invention.
For a person skilled in the art, other relevant figures can be obtained from the above figures without inventive effort.
Detailed Description
In order to make the technical solution of the present invention better understood, the technical solution of the present invention is further described below with reference to specific examples.
A method for calibrating and extracting blade morphology comprises the following steps:
1) Making the calibration plate
The calibration plate is a flat rectangular board with a white background and a black hollow circle at each of its four corners; connecting the centers of the four circles forms a rectangle of known area S_bd. The size of this rectangle is set according to the area of the blade to be measured, so that the blade fits entirely inside the rectangle formed by connecting the four circle centers.
2) Image acquisition of the blade
The calibration plate is placed behind the blade, and the blade is photographed with a mobile phone, camera or other photographing device; the included angle between the photographing lens and the calibration plate should not exceed 45 degrees.
3) Calibrating the distortion of the blade image to obtain the effective area of the calibration plate
I. Smoothing and denoising the image to remove the influence of noise points
The value of each point in the blade image is replaced by the median value over a neighborhood of that point; because the surrounding pixel values are close to the true value, isolated noise points are eliminated:
P_i(x, y) = mid{ O_i(x - k, y - l), (k, l) ∈ R }
wherein O_i(x, y) and P_i(x, y) are respectively the original image and the processed image, and R is a two-dimensional matrix, preferably 3 x 3 or 5 x 5. If R is a 3 x 3 matrix then k ∈ [-1, 1] and l ∈ [-1, 1]; if R is a 5 x 5 matrix then k ∈ [-2, 2] and l ∈ [-2, 2], where k and l are integers.
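As a concrete illustration, this neighborhood median filter can be reproduced with OpenCV; the following is a minimal sketch, in which the input file name and the 3 x 3 neighborhood size are placeholders rather than values taken from the patent:

```python
import cv2

# Original image O_i(x, y) from step two; the file name is a placeholder.
O_i = cv2.imread("leaf_original.jpg")

# Median filter over a 3 x 3 neighborhood R: each pixel is replaced by
# the median of its neighborhood, which removes isolated noise points
# while preserving leaf edges. Pass 5 instead of 3 for a 5 x 5 R.
P_i = cv2.medianBlur(O_i, 3)

cv2.imwrite("leaf_denoised.jpg", P_i)
```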
II. Obtaining the effective area of the calibration plate
Because the lens and the calibration plate cannot be guaranteed to be perfectly parallel when the original image O_i(x, y) is acquired, the image has a certain deformation; at the same time the area of the calibration plate needs to be obtained, so image correction must first be performed on P_i(x, y):
i. Circle center detection
x1 = x - r·cosθ
y1 = y - r·sinθ
Taking (x, y) as a point on the image, as θ varies from 0° to 360° the pair (x1, y1) traces a complete circle of radius r. Each point in image P_i(x, y) therefore corresponds to a circle in (x1, y1, r) space; for an assumed circle radius r, θ is scanned from 0° to 360° for every point, and the group of (x1, y1) coordinates that occurs most often is the circle center.
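The circle-center search described above is the classical Hough circle transform. A minimal sketch using OpenCV's built-in implementation follows; all tuning parameters (dp, minDist, param1, param2 and the radius bounds) are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

P_i = cv2.imread("leaf_denoised.jpg")
gray = cv2.cvtColor(P_i, cv2.COLOR_BGR2GRAY)

# Each edge point (x, y) votes for candidate centers
# (x1, y1) = (x - r*cos(theta), y - r*sin(theta)) as theta sweeps
# 0-360 degrees; the accumulator cells with the most votes are
# returned as circle centers.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                           param1=100, param2=30, minRadius=10, maxRadius=80)

if circles is not None:
    # The four strongest circles should be the calibration-plate corners.
    for x1, y1, r in np.round(circles[0]).astype(int):
        print("center:", (x1, y1), "radius:", r)
```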
ii. Distortion calibration
Because an inclination angle is formed between the digital camera and the calibration plate when the image is acquired, a vertical downward orthographic projection cannot be obtained. Using the condition that the perspective center, the image point and the target point are collinear, the bearing surface (perspective surface) is rotated about the trace line (perspective axis) by a certain angle according to the law of perspective rotation, keeping the projected geometric figure on the bearing surface unchanged. The transformation formula is as follows:
[x', y', w'] = [x, y, 1] · A, where A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]]
Here (x, y) are the pixel coordinates of image P_i(x, y), and (x1, y1) = (x'/w', y'/w') are the pixel coordinates of the transformed image P1_i(x, y). The perspective transformation matrix is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image, the column [a13; a23] produces the perspective transformation of the image, and [a31 a32] represents image translation.
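A minimal sketch of this correction with OpenCV: the four source points would be the circle centers found in the previous step (the coordinates and output size below are placeholders), and cv2.getPerspectiveTransform solves for the 3 x 3 matrix A of the transformation formula:

```python
import cv2
import numpy as np

P_i = cv2.imread("leaf_denoised.jpg")

# Detected circle centers, ordered top-left, top-right, bottom-right,
# bottom-left; these coordinates are placeholders.
src = np.float32([[120, 95], [1480, 110], [1500, 1060], [100, 1040]])

# Target positions: an axis-aligned rectangle whose pixel dimensions
# stand in for the known physical rectangle of area S_bd.
w, h = 1400, 950  # placeholder output size in pixels
dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

# Solve for the perspective matrix A and resample the image so that
# the calibration rectangle becomes a true rectangle.
A = cv2.getPerspectiveTransform(src, dst)
P1_i = cv2.warpPerspective(P_i, A, (w, h))
cv2.imwrite("leaf_corrected.jpg", P1_i)
```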
4) Leaf morphology extraction
I. Leaf region extraction
i. Preliminary removal of non-blade parts
Let the RGB three primary colors of pixel (x, y) in image P1_i(x, y) be (R, G, B). If 2*G(x, y) - (B(x, y) + R(x, y)) > 0, the pixel (x, y) is a green leaf part and (R, G, B) keeps its initial value; otherwise the pixel (x, y) is a withered-leaf or non-leaf part and (R, G, B) = (255, 255, 255), and an image P2_i(x, y) is obtained.
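A minimal NumPy sketch of this excess-green test (2G > B + R), assuming the corrected image from the previous step; note that OpenCV loads channels in BGR order:

```python
import cv2
import numpy as np

P1_i = cv2.imread("leaf_corrected.jpg")
img = P1_i.astype(np.int32)           # avoid uint8 overflow in 2*G
B, G, R = img[..., 0], img[..., 1], img[..., 2]

# Excess-green test: keep pixels with 2*G - (B + R) > 0 as green leaf,
# paint everything else white (255, 255, 255).
green = (2 * G - (B + R)) > 0
P2_i = P1_i.copy()
P2_i[~green] = (255, 255, 255)
cv2.imwrite("leaf_green.jpg", P2_i)
```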
ii. Binarization
According to the gray-level characteristics of the image, the OTSU algorithm of the maximum between-class variance method divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), and an image P3_i(x, y) is obtained.
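A minimal sketch of the OTSU step with OpenCV; the THRESH_BINARY_INV flag is an assumption that makes the leaf the white (255) foreground, which is what OpenCV's morphology operators in the next step expect, whereas the patent's own convention paints the leaf black:

```python
import cv2

P2_i = cv2.imread("leaf_green.jpg")
gray = cv2.cvtColor(P2_i, cv2.COLOR_BGR2GRAY)

# Otsu's method selects the threshold that maximizes the between-class
# variance of the gray-level histogram, separating foreground (leaf)
# from background (non-leaf). The leaf is darker than the whitened
# background here, so THRESH_BINARY_INV maps it to white (255).
_, P3_i = cv2.threshold(gray, 0, 255,
                        cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
cv2.imwrite("leaf_binary.png", P3_i)
```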
II. Blade acquisition and separation
i. Removing cavities and smoothing the blade
First, the boundary of image P3_i(x, y) is eroded with a structuring matrix b (a 3 x 3 or 5 x 5 matrix region) to eliminate small impurities, and an image P4_i(x, y) is obtained. The specific calculation is as follows:
P4_i = P3_i ⊖ b = { (x, y) | b_(x, y) ⊆ P3_i }
The formula represents the erosion of P3_i(x, y) by the structuring matrix b: the origin of b is translated to position (x, y) of image P3_i(x, y); if b placed at (x, y) is completely contained in the foreground of image P3_i(x, y), the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255).
Then image P4_i(x, y) is dilated with the structuring matrix b (a 3 x 3 or 5 x 5 matrix region) to fill some cavities in P4_i(x, y) and eliminate impurities. The specific calculation is as follows:
P5_i = P4_i ⊕ b = { (x, y) | b_(x, y) ∩ P4_i ≠ ∅ }
This formula represents the dilation of P4_i(x, y) by the structuring element b: the origin of b is translated to pixel (x, y) of image P4_i(x, y); if b placed at (x, y) has a non-empty intersection with the foreground of P4_i(x, y), the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255).
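Both operations map directly onto OpenCV's erode and dilate; the following is a minimal sketch under the same assumptions as above (leaf = white foreground), with a 3 x 3 structuring matrix b:

```python
import cv2
import numpy as np

P3_i = cv2.imread("leaf_binary.png", cv2.IMREAD_GRAYSCALE)

# Structuring matrix b: a 3 x 3 square of ones (a 5 x 5 works the same).
b = np.ones((3, 3), np.uint8)

# Erosion: a pixel survives only if b, translated there, fits entirely
# inside the leaf foreground -- small impurity specks are stripped away.
P4_i = cv2.erode(P3_i, b)

# Dilation: a pixel becomes foreground if the translated b intersects
# the leaf foreground at all -- small cavities inside the leaf are
# filled and the boundary shrunk by the erosion is restored.
P5_i = cv2.dilate(P4_i, b)
cv2.imwrite("leaf_clean.png", P5_i)
```

Counting the foreground pixels of P5_i and scaling by the known rectangle area S_bd of the calibration plate would then give the physical leaf area, which is what the calibration plate of step one is for.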
The invention has been described in an illustrative manner, and it is to be understood that any simple variations, modifications or other equivalent changes which can be made by one skilled in the art without departing from the spirit of the invention fall within the scope of the invention.

Claims (2)

1. A method for calibrating and extracting blade morphology is characterized by comprising the following steps:
step one, manufacturing a calibration plate;
step two, image acquisition is carried out on the blade by an image acquisition device to obtain an original image O_i(x, y);
Step three, processing the collected image;
I. smoothing and denoising the image to remove the influence of noise points;
the value of each point in the blade image is replaced by the median value over a neighborhood of that point; because the surrounding pixel values are close to the true value, isolated noise points are eliminated, and the formula is as follows:
P_i(x, y) = mid{ O_i(x - k, y - l), (k, l) ∈ R }
wherein O_i(x, y) and P_i(x, y) are respectively the original image and the processed image, and R is a two-dimensional matrix;
II. obtaining the effective area of the calibration plate;
first, image correction is performed on P_i(x, y);
i. circle center detection
x1 = x - r·cosθ
y1 = y - r·sinθ
taking (x, y) as a point on the image, as θ varies from 0° to 360° the pair (x1, y1) traces a complete circle of radius r; each point in image P_i(x, y) therefore corresponds to a circle in (x1, y1, r) space; for an assumed circle radius r, θ is scanned from 0° to 360° for every point, and the group of (x1, y1) coordinates that occurs most often is the circle center;
ii. distortion calibration
since an inclination angle is formed between the image acquisition device and the calibration plate when the image O_i(x, y) is acquired, a vertical downward projection cannot be formed; using the condition that the perspective center, the image point and the target point are collinear, the bearing surface is rotated about the trace line by a certain angle according to the law of perspective rotation, keeping the projected geometric figure on the bearing surface unchanged, and the transformation formula is as follows:
[x', y', w'] = [x, y, 1] · A, where A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]]
(x, y) are the pixel coordinates of image P_i(x, y), and (x1, y1) = (x'/w', y'/w') are the pixel coordinates of the transformed image P1_i(x, y); the perspective transformation matrix is interpreted as follows: the submatrix [a11 a12; a21 a22] represents the linear change of the image, the column [a13; a23] generates the image perspective transformation, and [a31 a32] represents image translation;
step four, extracting the blade shape from the processed image;
I. leaf region extraction;
i. preliminarily removing non-blade parts;
let the RGB three primary colors of pixel (x, y) in image P1_i(x, y) be (R, G, B); if 2*G(x, y) - (B(x, y) + R(x, y)) > 0, the pixel (x, y) is a green leaf part and (R, G, B) keeps its initial value; otherwise the pixel (x, y) is a withered-leaf or non-leaf part and (R, G, B) = (255, 255, 255), and an image P2_i(x, y) is obtained;
ii. binarization;
according to the gray-level characteristics of the image, the OTSU algorithm of the maximum between-class variance method divides image P2_i(x, y) into background (non-leaf) and foreground (leaf), and an image P3_i(x, y) is obtained;
II. blade acquisition and separation;
i. removing the cavities in the blade and smoothing the blade;
first, the boundary of image P3_i(x, y) is eroded with a structuring matrix b to eliminate small impurities, and an image P4_i(x, y) is obtained; the specific calculation is as follows:
P4_i = P3_i ⊖ b = { (x, y) | b_(x, y) ⊆ P3_i }
the formula represents the erosion of P3_i(x, y) by the structuring matrix b: the origin of b is translated to position (x, y) of image P3_i(x, y); if b placed at (x, y) is completely contained in the foreground of image P3_i(x, y), the corresponding pixel (x, y) of the output image P4_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255);
then image P4_i(x, y) is dilated with the structuring matrix b to fill some cavities in P4_i(x, y) and eliminate impurities.
2. The method for blade morphology calibration and extraction as claimed in claim 1, wherein the specific operation of filling the cavities in P4_i(x, y) and removing impurities is as follows:
P5_i = P4_i ⊕ b = { (x, y) | b_(x, y) ∩ P4_i ≠ ∅ }
the formula represents the dilation of P4_i(x, y) by the structuring element b: the origin of b is translated to pixel (x, y) of image P4_i(x, y); if b placed at (x, y) has a non-empty intersection with the foreground of P4_i(x, y), the corresponding pixel (x, y) of the output image P5_i(x, y) is assigned (R, G, B) = (0, 0, 0), otherwise (R, G, B) = (255, 255, 255).
CN201911247189.3A 2019-12-09 2019-12-09 Blade form calibration and extraction method Active CN111223138B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911247189.3A CN111223138B (en) 2019-12-09 2019-12-09 Blade form calibration and extraction method


Publications (2)

Publication Number Publication Date
CN111223138A (en) 2020-06-02
CN111223138B CN111223138B (en) 2023-05-09

Family

ID=70830729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911247189.3A Active CN111223138B (en) 2019-12-09 2019-12-09 Blade form calibration and extraction method

Country Status (1)

Country Link
CN (1) CN111223138B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593840A (en) * 2013-09-30 2014-02-19 北京林业大学 Method for detecting phenotype of Arabidopsis
CN103591887A (en) * 2013-09-30 2014-02-19 北京林业大学 Method for detecting regional phenotype of Arabidopsis
CN106468536A (en) * 2016-11-22 2017-03-01 天津市气候中心 A kind of leaf area measurement method
CN108765433A (en) * 2018-05-31 2018-11-06 西京学院 One kind is for carrying high-precision leafy area measurement method
CN108844499A (en) * 2018-05-03 2018-11-20 岭南师范学院 A kind of Measurement Approach of Leaf Area
CN109191520A (en) * 2018-09-30 2019-01-11 湖北工程学院 A kind of Measurement Approach of Leaf Area and system based on color calibration
CN109241966A (en) * 2018-08-22 2019-01-18 东北农业大学 A kind of plant leaf blade nondestructive collection method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
宫志宏; 薛庆禹; 于红; 李春: "Measurement of cucumber leaf area based on digital photo analysis and evaluation of its effect" *
李彦锋: "Research on image-processing-based methods for acquiring geometric shape data of plant leaves" *
胡迪: "Research on plant leaf measurement methods" *

Also Published As

Publication number Publication date
CN111223138B (en) 2023-05-09

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant