CN116468657A - Plant growth detection method, equipment, device and computer storage medium - Google Patents


Info

Publication number
CN116468657A
CN116468657A
Authority
CN
China
Prior art keywords
plant
bounding box
height
stalk
image
Prior art date
Legal status
Pending
Application number
CN202310162809.3A
Other languages
Chinese (zh)
Inventor
苏海峰
宋佳音
蔡扬
Current Assignee
Seven Seas Shenzhen Technology Co ltd
Original Assignee
Seven Seas Shenzhen Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Seven Seas Shenzhen Technology Co ltd filed Critical Seven Seas Shenzhen Technology Co ltd

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0012 Biomedical image inspection
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T2207/20081 Training; Learning
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G06T2207/30188 Vegetation; Agriculture
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides a method, apparatus, device, and computer storage medium for detecting plant growth vigor, with the aim of improving detection efficiency. The method comprises: acquiring a top image of a plant and a side image of the plant at intervals of a specified duration; inputting the top image into a pre-trained plant detection model for plant detection to obtain the position of the bounding box of each plant in the top image; for any one plant in the top image, determining the width of the plant based on the position of its bounding box; determining the stalk height of the plant using the side image; obtaining the height of the plant according to its measured distance, where the measured distance is measured by a depth sensor and is the distance between the top of the plant and the depth sensor; and obtaining the growth vigor of the plant from the width, the stalk height, and the height of the plant.

Description

Plant growth detection method, equipment, device and computer storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a computer storage medium for detecting plant growth vigor.
Background
In the agricultural production process, the plant growth vigor is an important index for detecting the plant growth state and is also an important basis for predicting the plant yield. Therefore, the detection of the growth vigor of plants in a scientific way is of great importance to farmers.
In the prior art, plant growth is mainly assessed manually, with growers evaluating each plant according to their own experience. This takes a lot of time, so the efficiency of plant growth detection is low.
Disclosure of Invention
Exemplary embodiments of the present disclosure provide a method, an apparatus, a device, and a computer storage medium for detecting plant growth vigor, which are used to improve the efficiency of detecting plant growth vigor.
A first aspect of the present disclosure provides a method for detecting plant vigor, the method comprising:
acquiring a top image of a plant and a side image of the plant at intervals of a specified duration;
inputting the top image into a pre-trained plant detection model for plant detection to obtain the position of the bounding box of each plant in the top image;
for any one plant in the top image, determining the width of the plant based on the position of the bounding box of the plant;
determining the stalk height of the plant using the side image;
obtaining the height of the plant according to the measured distance of the plant, wherein the measured distance is measured by a depth sensor and is the distance between the top of the plant and the depth sensor;
and obtaining the growth vigor of the plant from the width of the plant, the height of the stalk, and the height of the plant.
In this embodiment, a top image and a side image of a plant are acquired at intervals of a specified duration. The top image is input into a pre-trained plant detection model to obtain the position of the bounding box of each plant in the image; the width of each plant is then determined from the position of its bounding box, the stalk height is determined using the side image, and the plant height is obtained from the measured distance, yielding the growth vigor of the plant. The growth vigor is thus determined automatically, without manual estimation, which improves detection efficiency. Moreover, unlike manual estimation, every plant is assessed against the same standard, avoiding the inconsistency of manual inspection and improving the accuracy of plant growth detection.
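As a minimal sketch of how the three measurements described above might be combined, the snippet below assembles a growth record and applies a toy vigor rule. The sensor mounting height, the score formula, and the "vigorous"/"weak" threshold are invented for illustration; the disclosure does not specify how the three quantities are combined.

```python
from dataclasses import dataclass

@dataclass
class GrowthRecord:
    width_m: float         # plant width from the top-image bounding box
    stalk_height_m: float  # stalk height from the side image
    height_m: float        # plant height from the depth measurement

def plant_height(sensor_height_m: float, measured_distance_m: float) -> float:
    """Plant height as sensor mounting height minus the measured distance
    from the depth sensor to the plant top (assumed geometry)."""
    return sensor_height_m - measured_distance_m

def assess_growth(record: GrowthRecord) -> str:
    """Toy growth-vigor rule: sum the three measurements and threshold.
    The score and the 1.0 m threshold are invented for the example."""
    score = record.width_m + record.stalk_height_m + record.height_m
    return "vigorous" if score > 1.0 else "weak"

rec = GrowthRecord(width_m=0.30, stalk_height_m=0.25,
                   height_m=plant_height(2.0, 1.5))  # 2.0 m mount, 1.5 m reading
print(rec.height_m)        # 0.5
print(assess_growth(rec))  # vigorous
```

The record mirrors the three quantities the method extracts per plant; any real scoring rule would be calibrated per crop.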
In one embodiment, the plant vigor further comprises plant density;
the method further comprises the steps of after inputting the top image into a pre-trained plant detection model to detect plants and obtaining the positions of bounding boxes of all plants in the top image:
obtaining an intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset visual field area;
and determining the average value of the intermediate plant densities of the top images corresponding to the designated time periods as the plant density.
In the embodiment, the plant density is determined by the average value of the intermediate plant densities of the top images corresponding to the designated time periods respectively, so that the accuracy of the plant density is improved.
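The density computation reduces to a bounding-box count divided by the preset field-of-view area, averaged over the captured images; a minimal sketch (function names are illustrative, not from the patent):

```python
def intermediate_density(num_boxes: int, field_area_m2: float) -> float:
    """Plants per square metre for one top image: bounding-box count
    divided by the preset field-of-view area."""
    return num_boxes / field_area_m2

def plant_density(box_counts: list[int], field_area_m2: float) -> float:
    """Average of the intermediate densities over the top images
    captured at each specified interval."""
    densities = [intermediate_density(n, field_area_m2) for n in box_counts]
    return sum(densities) / len(densities)

# Three top images with 8, 10 and 9 detected plants over a 2 m^2 view:
print(plant_density([8, 10, 9], 2.0))  # 4.5
```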
In one embodiment, the position of the bounding box includes image position coordinates of four vertices in the bounding box;
the determining the width of the plant based on the position of the bounding box of the plant comprises:
converting the image position coordinates of any vertex in the bounding box by using a preset depth camera calibration matrix to obtain the actual position coordinates of the vertex in a world coordinate system;
Obtaining the length and the width of the bounding box based on the actual position coordinates of each vertex in the bounding box, wherein the value of the width is larger than that of the length;
the width of the plant is determined based on the width of the bounding box.
In this embodiment, the image position coordinates of each vertex of the bounding box are converted into actual position coordinates in the world coordinate system, the width of the bounding box is obtained from those coordinates, and that width is taken as the width of the plant. Because the plant width is derived from the actual positions of the bounding-box vertices, the determined width is more accurate.
In one embodiment, the determining the stalk height of the plant using the side image comprises:
inputting the side image into a pre-trained stalk detection model for stalk detection, and obtaining the position of a bounding box of the stalk of each plant in the plant side image, wherein the position of the bounding box comprises the image position coordinates of each vertex in the bounding box;
determining the target stalk position number corresponding to the plant position number of the plant, using a preset correspondence between plant position numbers and stalk position numbers;
converting the image position coordinates of each vertex of the bounding box of the stalk corresponding to the target stalk position number, using a preset vision camera calibration matrix, to obtain the actual position coordinates of each vertex in the world coordinate system;
and obtaining the height of the stalk of the plant based on the actual position coordinates of each vertex.
In this embodiment, stalk detection is performed on the side image by the stalk detection model to obtain a stalk bounding box, and the stalk height of the plant is determined from the actual position coordinates of each vertex of that bounding box in the world coordinate system, which improves the accuracy of the determined stalk height.
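Given the stalk bounding-box vertices already converted to world coordinates, the stalk height reduces to the vertical extent of the box; a minimal sketch, assuming the world z-axis points upward (the patent does not fix the axis convention):

```python
def stalk_height(vertices_world: list[tuple[float, float, float]]) -> float:
    """Stalk height as the vertical extent of the bounding box whose
    vertices are given in world coordinates (z assumed to point up)."""
    zs = [v[2] for v in vertices_world]
    return max(zs) - min(zs)

# Four vertices of a stalk box spanning z = 0.0 m to z = 0.4 m:
box = [(0.00, 0.0, 0.0), (0.02, 0.0, 0.0),
       (0.00, 0.0, 0.4), (0.02, 0.0, 0.4)]
print(stalk_height(box))  # 0.4
```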
In one embodiment, the plant vigor further comprises a height ratio parameter;
after the plant height is obtained according to the measured distance of the plant, the method further comprises the following steps:
and dividing the height of the stalk of the plant by the height of the plant to obtain the height proportion parameter of the plant.
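The height-proportion parameter is a single division of stalk height by plant height; a guarded sketch (the zero check is an added safeguard, not from the patent):

```python
def height_ratio(stalk_height_m: float, plant_height_m: float) -> float:
    """Height-proportion parameter: stalk height divided by total
    plant height."""
    if plant_height_m <= 0:
        raise ValueError("plant height must be positive")
    return stalk_height_m / plant_height_m

print(height_ratio(0.25, 0.5))  # 0.5
```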
In one embodiment, the plant detection model is trained by:
acquiring a training sample, wherein the training sample comprises top images of plants, and the top images comprise labeling positions of bounding boxes of the plants;
Inputting the training sample into the plant detection model to perform plant detection to obtain the predicted position of the bounding box of each plant;
obtaining an error value according to the labeling position of the bounding box of each plant and the prediction position of the bounding box of each plant;
and if the error value is greater than a specified threshold, adjusting the specified parameters of the plant detection model and returning to the step of inputting the training sample into the plant detection model for plant detection, until the error value is not greater than the specified threshold, at which point training of the plant detection model ends.
In this embodiment, the plant detection model is trained on the training sample: an error value is computed from the predicted and labeled positions of the bounding box of each plant, and while the error value exceeds the specified threshold, the parameters of the model are adjusted and training continues. Training ends once the error value is no greater than the threshold. This guarantees the accuracy of the plant detection model and thereby improves the accuracy of plant growth detection.
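The threshold-stopped loop above can be sketched generically; `model_error` and `adjust` are hypothetical callables standing in for running detection on the training sample and nudging the specified parameters, and the iteration cap is an added safeguard:

```python
def train(model_error, adjust, threshold: float, max_iters: int = 1000) -> int:
    """While the training error exceeds the threshold, adjust the
    specified parameters and re-evaluate; return the number of
    adjustment steps taken."""
    iters = 0
    while model_error() > threshold and iters < max_iters:
        adjust()
        iters += 1
    return iters

# Toy stand-in: each adjustment step halves the error.
state = {"err": 8.0}
steps = train(lambda: state["err"],
              lambda: state.update(err=state["err"] / 2),
              threshold=1.0)
print(steps, state["err"])  # 3 1.0
```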
A second aspect of the present disclosure provides a detection apparatus for plant vigor, the apparatus comprising a depth sensor, a vision sensor and a processor, wherein:
the depth sensor is used for acquiring a top image of the plant and a measurement distance of the plant, wherein the measurement distance is the distance between the top of the plant and the depth sensor;
the visual sensor is used for acquiring a side image of the plant;
the processor is respectively connected with the depth sensor and the visual sensor, and is used for inputting the top image into a pre-trained plant detection model for plant detection every other appointed time length to obtain the position of a bounding box of each plant in the top image;
determining, for any one plant in the top image, the width of the plant based on the position of the bounding box of the plant;
determining the stalk height of the plant using the side image;
obtaining the height of the plant according to the measured distance of the plant;
and obtaining the growth vigor of the plant through the width of the plant, the height of the stalk and the height of the plant.
In one embodiment, the plant vigor further comprises plant density;
The processor is further configured to:
inputting the top image into a pre-trained plant detection model for plant detection, and obtaining intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field area after obtaining the position of the bounding box of each plant in the top image;
and determining the average value of the intermediate plant densities of the top images corresponding to the designated time periods as the plant density.
In one embodiment, the position of the bounding box includes image position coordinates of four vertices in the bounding box;
the processor executes the determining the width of the plant based on the position of the bounding box of the plant, and is specifically used for:
converting the image position coordinates of any vertex in the bounding box by using a preset depth camera calibration matrix to obtain the actual position coordinates of the vertex in a world coordinate system;
obtaining the length and the width of the bounding box based on the actual position coordinates of each vertex in the bounding box, wherein the value of the width is larger than that of the length;
The width of the plant is determined based on the width of the bounding box.
In one embodiment, the processor executes the determining the stalk height of the plant using the side image, and is specifically configured to:
inputting the side image into a pre-trained stalk detection model for stalk detection, and obtaining the position of a bounding box of the stalk of each plant in the plant side image, wherein the position of the bounding box comprises the image position coordinates of each vertex in the bounding box;
determining a target stalk position number corresponding to the plant position number of the plant by utilizing the corresponding relation between the preset plant position number and the stalk position number;
converting the image position coordinates of each vertex of a bounding box of the stalk corresponding to the stalk position serial number of the target through a preset vision camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system;
and obtaining the height of the stalk of the plant based on the actual position coordinates of each vertex.
In one embodiment, the plant vigor further comprises a height ratio parameter;
after the plant height is obtained according to the measured distance of the plant, the processor is further configured to:
and dividing the height of the stalk of the plant by the height of the plant to obtain the height proportion parameter of the plant.
In one embodiment, the processor is further configured to:
the plant detection model is trained by:
acquiring a training sample, wherein the training sample comprises top images of plants, and the top images comprise labeling positions of bounding boxes of the plants;
inputting the training sample into the plant detection model to perform plant detection to obtain the predicted position of the bounding box of each plant;
obtaining an error value according to the labeling position of the bounding box of each plant and the prediction position of the bounding box of each plant;
and if the error value is greater than the specified threshold, after the specified parameters of the plant detection model are adjusted, returning to the step of inputting the training sample into the plant detection model for plant detection until the error value is not greater than the specified threshold, and ending training of the plant detection model.
According to a third aspect of embodiments of the present disclosure, there is provided a detection apparatus for plant growth vigor, the apparatus comprising:
The plant image acquisition module is used for acquiring top images of plants and side images of the plants at intervals of designated time;
the plant detection module is used for inputting the top image into a pre-trained plant detection model to perform plant detection, so as to obtain the position of a bounding box of each plant in the top image;
the plant width determining module is used for determining the width of any plant in the top end image based on the position of the bounding box of the plant; the method comprises the steps of,
a stalk height determining module for determining a stalk height of the plant using the side image; the method comprises the steps of,
the plant height determining module is used for obtaining the height of the plant according to the measured distance of the plant, wherein the measured distance is obtained by measuring through a depth sensor, and the measured distance is the distance between the top end of the plant and the depth sensor;
and the plant growth condition determining module is used for obtaining the growth condition of the plant through the width of the plant, the height of the stem and the height of the plant.
According to a fourth aspect provided by embodiments of the present disclosure, there is provided a computer storage medium storing a computer program for performing the method according to the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings that are needed in the description of the embodiments will be briefly described below, it will be apparent that the drawings in the following description are only some embodiments of the present disclosure, and that other drawings may be obtained according to these drawings without inventive effort to a person of ordinary skill in the art.
FIG. 1 is a schematic structural view of a plant growth detection apparatus according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a top image according to one embodiment of the present disclosure;
FIG. 3 is a schematic illustration of a side image according to one embodiment of the present disclosure;
FIG. 4 is one of the flow charts of the method for detecting plant vigor according to one embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a top image of a plant after detection according to one embodiment of the present disclosure;
FIG. 6 is a flow chart of a method for training the plant detection model according to one embodiment of the present disclosure;
FIG. 7 is a flow chart of determining plant width according to one embodiment of the present disclosure;
FIG. 8 is a schematic flow chart of determining stalk height according to one embodiment of the present disclosure;
FIG. 9 is a schematic illustration of a binarized side image according to one embodiment of the present disclosure;
FIG. 10 is a second flow chart of a method for detecting plant vigor according to an embodiment of the disclosure;
FIG. 11 is a plant growth detection device according to one embodiment of the present disclosure;
fig. 12 is a schematic structural view of an electronic device according to an embodiment of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are some embodiments of the present disclosure, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without inventive effort, based on the embodiments in this disclosure are intended to be within the scope of this disclosure.
The term "and/or" in the embodiments of the present disclosure describes an association relationship of association objects, which indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
The application scenario described in the embodiments of the present disclosure is for more clearly describing the technical solution of the embodiments of the present disclosure, and does not constitute a limitation on the technical solution provided by the embodiments of the present disclosure, and as a person of ordinary skill in the art can know that, with the appearance of a new application scenario, the technical solution provided by the embodiments of the present disclosure is equally applicable to similar technical problems. In the description of the present disclosure, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the prior art, plant growth is mainly assessed manually, with growers evaluating each plant according to their own experience. This takes a lot of time, so the efficiency of plant growth detection is low.
The present disclosure therefore provides a method for detecting plant growth vigor. A top image and a side image of a plant are acquired at intervals of a specified duration; the top image is input into a pre-trained plant detection model to obtain the position of the bounding box of each plant; the width of each plant is determined from the position of its bounding box, the stalk height is determined from the side image, and the plant height is obtained from the measured distance, yielding the growth vigor of the plant. The growth vigor is thus determined automatically, without manual estimation, which improves detection efficiency; and because every plant is assessed against the same standard, the inconsistency of manual inspection is avoided and detection accuracy improves.
Before describing the plant growth detection method in detail, the plant growth detection device of the present invention is described. Fig. 1 is a schematic structural diagram of the device: as can be seen from Fig. 1, it includes a depth sensor 110, a vision sensor 120, a processor 130, and a tire 140. Wherein:
the depth sensor 110 is configured to obtain an image of a top end of a plant and a measured distance of the plant, where the measured distance is a distance between the top end of the plant and the depth sensor. H in FIG. 1 is the measured distance of the plants. FIG. 2 is a top image of a plant acquired by a depth sensor.
It should be noted that the top image includes at least one plant; the actual number depends on the field of view of the depth sensor and the spacing between plants. Fig. 2 is for illustration only, and this embodiment does not limit the number of plants in the top image.
A vision sensor 120 for acquiring a side image of the plant. Fig. 3 is a schematic diagram of a side image in this embodiment.
The processor 130 is respectively connected with the depth sensor 110 and the visual sensor 120, and is configured to input the top image into a pre-trained plant detection model for plant detection every a specified period of time, so as to obtain the position of a bounding box of each plant in the top image; determining, for any one plant in the top image, a width of the plant based on a position of a bounding box of the plant; and determining a stalk height of the plant using the side image; obtaining the height of the plant according to the measured distance of the plant; and obtaining the growth vigor of the plant through the width of the plant, the height of the stalk and the height of the plant.
And the tyre 140 is used for rotating along the travelling direction after receiving the instruction sent by the processor.
The method for detecting plant growth in the present disclosure will be described in detail with reference to the accompanying drawings. As shown in fig. 4, a flow chart of a method for detecting plant growth conditions of the present disclosure may include the following steps:
step 401: acquiring a top image of a plant and a side image of the plant at intervals of a designated time length;
it should be noted that: the specified duration in this embodiment may be 1 second, 1 minute, 2 minutes, etc., and may be set according to practical situations, and the specific value of the specified duration is not limited in this embodiment.
Step 402: inputting the top image into a pre-trained plant detection model for plant detection to obtain the positions of bounding boxes of all plants in the top image;
as shown in fig. 5, for a schematic diagram of the top image after plant detection, it can be seen from the figure that each plant in the top image has a corresponding bounding box.
In order to ensure the accuracy of plant detection, a training method of the plant detection model in this embodiment is described, and in one embodiment, as shown in fig. 6, a flowchart of the training method of the plant detection model in this embodiment includes the following steps:
step 601: acquiring a training sample, wherein the training sample comprises top images of plants, and the top images comprise labeling positions of bounding boxes of the plants;
step 602: inputting the training sample into the plant detection model to perform plant detection to obtain the predicted position of the bounding box of each plant;
step 603: obtaining an error value according to the labeling position of the bounding box of each plant and the prediction position of the bounding box of each plant;
the labeling positions of the bounding boxes of the plants comprise labeling positions of all vertexes in the bounding boxes, and the predicting positions of the bounding boxes comprise predicting positions of all vertexes in the bounding boxes.
In one embodiment, step 603 may be implemented as: for any vertex in the bounding box of any plant, determining the absolute value of the difference between the predicted position of the vertex and the labeling position of the vertex as the error value of the vertex, determining the average value of the error values of the vertices as the error value of the plant, and determining the average value of the error values of the plants in a training sample as the error value.
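The error computation of step 603 can be sketched as nested averages; averaging the per-axis absolute differences within a vertex is an assumption, since the patent only speaks of "the absolute value of the difference" between a predicted and a labeled position:

```python
def vertex_error(pred: tuple[float, float], label: tuple[float, float]) -> float:
    """Error of one vertex: mean absolute difference of its image
    coordinates (per-axis averaging is an assumption)."""
    return (abs(pred[0] - label[0]) + abs(pred[1] - label[1])) / 2

def plant_error(pred_box, label_box) -> float:
    """Average of the error values of the vertices of one plant's box."""
    errs = [vertex_error(p, l) for p, l in zip(pred_box, label_box)]
    return sum(errs) / len(errs)

def sample_error(pred_boxes, label_boxes) -> float:
    """Average of the plant errors over all plants in the training sample."""
    errs = [plant_error(p, l) for p, l in zip(pred_boxes, label_boxes)]
    return sum(errs) / len(errs)

label = [(0, 0), (10, 0), (0, 10), (10, 10)]
pred  = [(1, 1), (11, 1), (1, 11), (11, 11)]  # every vertex off by (1, 1)
print(sample_error([pred], [label]))  # 1.0
```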
Step 604: judging whether the error value is greater than the specified threshold, if so, executing step 605, and if not, executing step 606;
it should be noted that: the specified threshold value in the present embodiment may be set according to actual situations, and the specific value of the specified threshold value is not limited in this embodiment.
Step 605: after the specified parameters of the plant detection model are adjusted, returning to the execution step 602;
in this embodiment, the specific parameters of the plant detection model are adjusted by increasing or decreasing the specific parameters by specific values each time. The specific adjustment mode can be set according to the actual situation, and the embodiment is not limited to the adjustment mode. In addition, the specific parameters in the present embodiment may be set according to the actual situation, but the present embodiment does not limit the specific parameters, and if the number of the specific parameters is plural, the adjustment manner of each specific parameter may be the same or different, and specific needs may be set according to the actual situation.
Step 606: and finishing training the plant detection model to obtain the pre-trained plant detection model.
The plant detection model in this embodiment may be R-CNN, Fast R-CNN, SPP-Net, R-FCN, YOLO, SSD, RetinaNet, or the like. The specific plant detection model may be chosen according to the actual situation and is not limited by this embodiment.
Step 403: determining, for any one plant in the top image, a width of the plant based on a position of a bounding box of the plant;
the position of the bounding box comprises image position coordinates of four vertexes in the bounding box;
in one embodiment, step 403 may be implemented as: as shown in fig. 7, a schematic flow chart for determining plant width includes the following steps:
step 701: converting the image position coordinates of any vertex in the bounding box by using a preset depth camera calibration matrix to obtain the actual position coordinates of the vertex in a world coordinate system; wherein the actual position coordinates of the vertex in the world coordinate system can be obtained by the formula (1):
h_p · [u_1, v_1, 1]^T = K_1 · (R_1 · [x_1, y_1, z_1]^T + T_1) ……(1);

where h_p is the measured distance of the plant, i.e. the distance between the depth sensor and the top of the plant; u_1 and v_1 are the abscissa and ordinate of the image position coordinates of the vertex; K_1 is the camera intrinsic matrix in the preset depth camera calibration matrix; R_1 and T_1 are the rotation matrix and the translation matrix in the preset depth camera calibration matrix; and x_1, y_1 and z_1 are the abscissa, ordinate and vertical coordinate of the actual position coordinates of the vertex in the world coordinate system.
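As a minimal sketch of the conversion implied by formula (1) (assuming the standard pinhole camera model with intrinsic matrix K, rotation R and translation T; the function name and the use of NumPy are illustrative assumptions), the world coordinates can be recovered by inverting the projection:

```python
import numpy as np

def image_to_world(u, v, h_p, K, R, T):
    """Invert the pinhole relation h_p * [u, v, 1]^T = K (R X + T)
    to recover the world coordinates X = (x, y, z) of a vertex.

    h_p is the measured depth of the point; K, R, T come from the
    depth camera calibration matrix."""
    uv1 = np.array([u, v, 1.0]) * h_p      # scaled homogeneous image point
    cam = np.linalg.inv(K) @ uv1           # camera-frame coordinates
    return np.linalg.inv(R) @ (cam - T)    # world-frame coordinates
```

With an identity calibration (K = I, R = I, T = 0) the world point is simply the back-projected image point scaled by the measured depth.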
Step 702: obtaining the length and the width of the bounding box based on the actual position coordinates of each vertex in the bounding box, wherein the value of the width is larger than that of the length;
in one embodiment, step 702 may be implemented as: for any one edge in the bounding box, obtaining the length of the edge based on actual position coordinates of two vertexes positioned in the edge; and obtaining the length of the vertical side based on the actual position coordinates of two vertexes in the vertical side perpendicular to the side, and determining the length with the largest value in the obtained lengths as the width. Wherein the length of the edge can be obtained by the formula (2):
L = √((x_1 - x_2)² + (y_1 - y_2)²) ……(2);

where L is the length of the edge; x_1 and y_1 are the abscissa and ordinate of one vertex of the edge; and x_2 and y_2 are the abscissa and ordinate of the other vertex of the edge.
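The edge-length computation of formula (2) and the width selection of step 702 can be sketched as follows; the helper names are hypothetical:

```python
import math

def edge_length(p1, p2):
    """Euclidean length of a bounding-box edge from two vertex coordinates (formula (2))."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def box_width(vertices):
    """Width of a bounding box: the larger of two perpendicular edge lengths.

    vertices: four (x, y) points in order around the box, so consecutive
    vertices share an edge."""
    a = edge_length(vertices[0], vertices[1])  # one edge
    b = edge_length(vertices[1], vertices[2])  # the perpendicular edge
    return max(a, b)
```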
Step 703: the width of the plant is determined based on the width of the bounding box.
In one embodiment, step 703 may be implemented in any one of three ways:
mode one: the width of the bounding box is determined as the width of the plant.
Mode two: and adding the width of the bounding box with a first specified error value to obtain the width of the plant.
Mode three: subtracting the width of the bounding box from the second specified error value to obtain the width of the plant.
It should be noted that: the specific manner of using any of the above modes may be set according to the actual situation, and the embodiment is not limited to the specific manner. In addition, the first specified error value and the second specified error value in the present embodiment may be set according to actual situations, and the specific values of the first specified error value and the second specified error value are not limited in this embodiment.
Step 404: determining the stalk height of the plant by using the side image;
The following describes the manner of determining the stalk height of a plant, as shown in fig. 8, which is a schematic flow chart of a method for determining the stalk height, comprising the steps of:
step 801: inputting the side image into a pre-trained stalk detection model for stalk detection, and obtaining the position of a bounding box of the stalk of each plant in the plant side image, wherein the position of the bounding box comprises the image position coordinates of each vertex in the bounding box;
it should be noted that, the training mode of the stalk detection model in this embodiment is the same as the training mode of the plant detection model described above, except that the training samples are different, and the training samples in this embodiment include stalk images and bounding boxes of the stalks in the images. However, the training process is the same as that of the plant detection model, so that the description of this embodiment is omitted here.
Step 802: determining a target stalk position number corresponding to the plant position number of the plant by utilizing the corresponding relation between the preset plant position number and the stalk position number;
since the installation angles of the depth sensor and the vision sensor are known, the correspondence between the plant in the top image and the stalk in the side image photographed at the same point in time can be determined based on the installation angle of the depth sensor and the installation angle of the vision sensor. The corresponding relation between the plant position serial number and the stalk position serial number is shown in table 1:
Plant position number | Stalk position number
1 | C
2 | B
3 | A

TABLE 1
In this embodiment, plant position numbers are assigned to the plants in the top image sequentially from top to bottom, and stalk position numbers are assigned to the stalks in the side image sequentially from left to right.
For example, as shown in fig. 2 and 3, the plant position numbers of the plants from top to bottom in the top image are 1, 2 and 3 in sequence, and the stalk position numbers of the stalks from left to right in the side image are A, B and C in sequence. Based on the above correspondence, the plants with plant position numbers 1, 2 and 3 correspond to the stalks with stalk position numbers C, B and A in the side image, respectively.
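The correspondence lookup of step 802 amounts to a table lookup; the dictionary below merely reproduces Table 1, and its name is illustrative:

```python
# Hypothetical mapping reproducing Table 1: plants numbered top-to-bottom in
# the top image correspond to stalks numbered left-to-right in the side image,
# in reverse order (a consequence of the two sensors' installation angles).
PLANT_TO_STALK = {1: "C", 2: "B", 3: "A"}

def target_stalk(plant_number):
    """Return the target stalk position number for a given plant position number."""
    return PLANT_TO_STALK[plant_number]
```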
Step 803: converting the image position coordinates of each vertex of a bounding box of the stalk corresponding to the stalk position serial number of the target through a preset vision camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system; wherein, the actual position coordinates of each vertex in the world coordinate system can be obtained by the formula (3):
z_c · [u_2, v_2, 1]^T = K_2 · (R_2 · [x_2, y_2, z_2]^T + T_2) ……(3);

where z_c is the distance between the preset vision sensor and the side of the plant; u_2 and v_2 are the abscissa and ordinate of the image position coordinates of the vertex; K_2 is the camera intrinsic matrix in the preset vision camera calibration matrix; R_2 and T_2 are the rotation matrix and the translation matrix in the preset vision camera calibration matrix; and x_2, y_2 and z_2 are the abscissa, ordinate and vertical coordinate of the actual position coordinates of the vertex in the world coordinate system.
Step 804: and obtaining the height of the stalk of the plant based on the actual position coordinates of each vertex.
In one embodiment, step 804 may be implemented as: obtaining the length and the width of the bounding box of the stalk according to the position coordinates of each vertex of the bounding box of the stalk, wherein the length is larger than the width; and determining the length of the bounding box of the stalk as the height of the stalk.
Specifically, for any one edge in the bounding box of the stalk, obtaining the length of the edge based on actual position coordinates of two vertexes positioned in the edge; and obtaining the length of the vertical side based on the actual position coordinates of two vertexes in the vertical side which is perpendicular to the side, determining the length with the largest value in the obtained two lengths as the length of the stalk bounding box, and determining the length with the smallest value in the obtained two lengths as the width of the stalk bounding box.
In this embodiment, the stalk height may also be determined by image binarization: if the plant body is green and the stalk is red, the side image is binarized with a threshold on the red-channel brightness, i.e. pixels whose red-channel value is smaller than the threshold are set to 0, and pixels whose value is not smaller than the threshold are set to 255. Only the red stalk region then remains in the processed side image. As shown in fig. 9, the height h_2 of the connected region of pixels with value 255 is determined as the image height of the stalk, and the image height of the stalk is then converted by the preset vision camera calibration matrix to obtain the stalk height. The image height of the stalk is the height of the stalk in the side image, while the stalk height is the height of the stalk in the world coordinate system, i.e. the real height.
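The binarization route can be sketched with NumPy alone; the function name and threshold handling are assumptions, and a real pipeline would still apply the vision camera calibration conversion to turn the pixel height into a real height:

```python
import numpy as np

def stalk_pixel_height(red_channel, threshold):
    """Binarize the red channel and return the stalk's height in pixels.

    Pixels below the threshold become 0, the rest 255; the stalk's image
    height is the vertical extent of the 255-valued connected region."""
    binary = np.where(np.asarray(red_channel) < threshold, 0, 255)
    rows = np.any(binary == 255, axis=1)   # rows that contain stalk pixels
    if not rows.any():
        return 0                           # no stalk detected
    idx = np.flatnonzero(rows)
    return int(idx[-1] - idx[0] + 1)
```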
Step 405: obtaining the height of the plant according to the measured distance of the plant, wherein the measured distance is measured by a depth sensor, and the measured distance is the distance between the top end of the plant and the depth sensor;
In one embodiment, step 405 may be implemented as: subtracting the preset height of the depth sensor from the measured distance of the plant to obtain the height of the plant, wherein the height of the depth sensor is the distance between the depth sensor and the ground. Wherein the plant height can be obtained by formula (4):
K_h = h_g - h_p ……(4);

where K_h is the height of the plant, h_g is the height of the depth sensor, and h_p is the measured distance of the plant.
In one embodiment, after step 405 is performed, the height of the plant's stalk is divided by the height of the plant to obtain a height scaling parameter for the plant. Wherein the plant height ratio parameter can be obtained by the formula (5):
K_p = h_w / K_h ……(5);

where K_p is the height proportion parameter, h_w is the height of the stalk of the plant, and K_h is the height of the plant.
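Formulas (4) and (5) reduce to two one-line computations; the function names are illustrative:

```python
def plant_height(h_g, h_p):
    """Formula (4): plant height = depth sensor mounting height minus measured distance."""
    return h_g - h_p

def height_ratio(h_w, k_h):
    """Formula (5): stalk height divided by plant height."""
    return h_w / k_h
```

For instance, a sensor mounted 2.0 m above the ground that measures a 1.5 m distance to the plant top implies a 0.5 m plant.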
Step 406: and obtaining the growth vigor of the plant through the width of the plant, the height of the stalk and the height of the plant.
The growth vigor of the plant in this embodiment includes good growth vigor or poor growth vigor.
In one embodiment, step 406 may be implemented as: for any one plant, comparing the data of the plant with preset standard data; if the data of the plant is smaller than the standard data, determining that the growth of the plant is poor, and if the data of the plant is not smaller than the standard data, determining that the growth of the plant is good. The data of the plant includes the width of the plant, the height of the stalk, the height proportion parameter of the plant, and the height of the plant; the standard data includes the standard width of the plant, the standard height of the stalk, the standard height proportion parameter of the plant, and the standard height of the plant.
It should be noted that: in this embodiment, the width of the plant is compared with the standard width of the plant, the stalk height is compared with the standard stalk height, the height proportion parameter of the plant is compared with the standard height proportion parameter, and the height of the plant is compared with the standard height of the plant, so as to obtain the growth vigor of the plant.
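One reading of the comparison rule above — a plant grows well only when no metric falls below its standard — can be sketched as follows; the key names and the per-metric interpretation are assumptions:

```python
def growth_vigor(data, standard):
    """Compare each plant metric against its standard value.

    data and standard: dicts with the four metrics named in the embodiment.
    Returns "poor" if any metric falls below its standard, else "good"."""
    keys = ("width", "stalk_height", "height_ratio", "height")
    poor = any(data[k] < standard[k] for k in keys)
    return "poor" if poor else "good"
```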
The growth vigor obtained above is evaluated for each individual plant; in this embodiment, the growth vigor over a whole area may also be evaluated through the plant density. The manner of determining the plant density is described in detail below:
In one embodiment, the plant density is determined by: obtaining an intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field-of-view area, i.e. dividing the total number of bounding boxes by the field-of-view area; and determining the average value of the intermediate plant densities of the top images acquired within the designated time period as the plant density. The intermediate plant density can be obtained by formula (6):
P = m / A ……(6);

where P is the intermediate plant density, m is the total number of bounding boxes, and A is the field-of-view area.
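The density computation of formula (6), averaged over the top images of the period, can be sketched as follows; the function names are illustrative:

```python
def intermediate_density(m, area):
    """Formula (6): bounding-box count divided by the field-of-view area."""
    return m / area

def plant_density(counts, area):
    """Average the per-image intermediate densities over the designated period.

    counts: bounding-box totals of the top images acquired in the period."""
    return sum(intermediate_density(m, area) for m in counts) / len(counts)
```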
For further understanding of the technical solution of the present disclosure, a detailed description is given below with reference to fig. 10, which may include the following steps:
step 1001: acquiring a top image of a plant and a side image of the plant at intervals of a designated time length;
step 1002: inputting the top image into a pre-trained plant detection model for plant detection to obtain the positions of bounding boxes of all plants in the top image, wherein the positions of the bounding boxes comprise image position coordinates of four vertexes in the bounding box;
step 1003: converting the image position coordinates of any vertex in the bounding box by using a preset depth camera calibration matrix to obtain the actual position coordinates of the vertex in a world coordinate system;
step 1004: obtaining the length and the width of the bounding box based on the actual position coordinates of each vertex in the bounding box, wherein the value of the width is larger than that of the length;
step 1005: determining a width of the plant based on the width of the bounding box;
step 1006: inputting the side image into a pre-trained stalk detection model for stalk detection, and obtaining the position of a bounding box of the stalk of each plant in the plant side image, wherein the position of the bounding box comprises the image position coordinates of each vertex in the bounding box;
Step 1007: determining a target stalk position number corresponding to the plant position number of the plant by utilizing the corresponding relation between the preset plant position number and the stalk position number;
step 1008: converting the image position coordinates of each vertex of a bounding box of the stalk corresponding to the stalk position serial number of the target through a preset vision camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system;
step 1009: obtaining the stalk height of the plant based on the actual position coordinates of the vertexes;
step 1010: obtaining the height of the plant according to the measured distance of the plant, wherein the measured distance is measured by a depth sensor, and the measured distance is the distance between the top end of the plant and the depth sensor;
step 1011: obtaining intermediate plant densities corresponding to the top images according to the total number of bounding boxes in the top images and a preset visual field area, and determining the average value of the intermediate plant densities of the top images corresponding to each appointed time length as the plant density;
step 1012: and dividing the height of the stalk of the plant by the height of the plant to obtain the height proportion parameter of the plant.
Based on the same inventive concept, the method for detecting plant growth vigor described above may also be implemented by a plant growth detection device. The effect of the plant growth detection device is similar to that of the method described above and is not repeated here.
Fig. 11 is a schematic structural view of a plant growth detection device according to an embodiment of the present disclosure.
As shown in fig. 11, the plant growth detection apparatus 1100 of the present disclosure may include an acquisition module 1110, a plant detection module 1120, a plant width determination module 1130, a stalk height determination module 1140, a plant height determination module 1150, and a plant growth determination module 1160.
An acquisition module 1110, configured to acquire a top image of a plant and a side image of the plant every other specified duration;
the plant detection module 1120 is configured to input the top image into a pre-trained plant detection model to perform plant detection, so as to obtain positions of bounding boxes of each plant in the top image;
a plant width determining module 1130, configured to determine, for any one plant in the top image, a width of the plant based on a position of a bounding box of the plant; and
a stalk height determination module 1140 for determining a stalk height of the plant using the side image; and
The plant height determining module 1150 is configured to obtain a height of the plant according to a measured distance of the plant, where the measured distance is measured by a depth sensor, and the measured distance is a distance between the top end of the plant and the depth sensor;
and a plant growth status determining module 1160, configured to obtain a growth status of the plant through the width of the plant, the height of the stalk, and the height of the plant.
In one embodiment, the plant vigor further comprises plant density;
the plant density determining module 1170 is configured to input the top image into a pre-trained plant detection model to perform plant detection, obtain positions of bounding boxes of each plant in the top image, and obtain an intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field area;
and determining the average value of the intermediate plant densities of the top images corresponding to the designated time periods as the plant density.
In one embodiment, the plant density determining module 1170 performs the step of obtaining the intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field area, and is specifically configured to:
dividing the total number of bounding boxes by the field-of-view area to obtain the intermediate plant density.
In one embodiment, the position of the bounding box includes image position coordinates of four vertices in the bounding box;
the plant width determination module 1130 is specifically configured to:
converting the image position coordinates of any vertex in the bounding box by using a preset depth camera calibration matrix to obtain the actual position coordinates of the vertex in a world coordinate system;
obtaining the length and the width of the bounding box based on the actual position coordinates of each vertex in the bounding box, wherein the value of the width is larger than that of the length;
the width of the plant is determined based on the width of the bounding box.
In one embodiment, the stalk height determination module 1140 is specifically configured to:
inputting the side image into a pre-trained stalk detection model for stalk detection, and obtaining the position of a bounding box of the stalk of each plant in the plant side image, wherein the position of the bounding box comprises the image position coordinates of each vertex in the bounding box;
determining a target stalk position number corresponding to the plant position number of the plant by utilizing the corresponding relation between the preset plant position number and the stalk position number;
Converting the image position coordinates of each vertex of a bounding box of the stalk corresponding to the stalk position serial number of the target through a preset vision camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system;
and obtaining the height of the stalk of the plant based on the actual position coordinates of each vertex.
In one embodiment, when performing the step of obtaining the stalk height of the plant based on the actual position coordinates of the vertices, the stalk height determination module 1140 is specifically configured to:
obtaining the length and the width of the bounding box of the stalk according to the position coordinates of each vertex of the bounding box of the stalk, wherein the length is larger than the width;
and determining the length of the bounding box of the stalk as the height of the stalk.
In one embodiment, the plant height determining module 1150 is specifically configured to:
subtracting the preset height of the depth sensor from the measured distance of the plant to obtain the height of the plant, wherein the height of the depth sensor is the distance between the depth sensor and the ground.
In one embodiment, the plant vigor further comprises a height ratio parameter; the apparatus further comprises:
And the height proportion parameter determining module 1180 is configured to divide the height of the plant stem by the height of the plant to obtain the height proportion parameter of the plant after the height of the plant is obtained according to the measured distance of the plant.
In one embodiment, the apparatus further comprises:
a plant detection model determination module 1190 for training the plant detection model by:
acquiring a training sample, wherein the training sample comprises top images of plants, and the top images comprise labeling positions of bounding boxes of the plants;
inputting the training sample into the plant detection model to perform plant detection to obtain the predicted position of the bounding box of each plant;
obtaining an error value according to the labeling position of the bounding box of each plant and the prediction position of the bounding box of each plant;
and if the error value is greater than the specified threshold, after the specified parameters of the plant detection model are adjusted, returning to the step of inputting the training sample into the plant detection model for plant detection until the error value is not greater than the specified threshold, and ending training of the plant detection model.
Having described a method and apparatus for detecting plant growth vigor of an exemplary embodiment of the present disclosure, next, an electronic device according to another exemplary embodiment of the present disclosure is described.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
In some possible implementations, an electronic device according to the present disclosure may include at least one processor, and at least one computer storage medium. Wherein the computer storage medium stores program code which, when executed by a processor, causes the processor to perform the steps in the method of detecting plant vigor according to various exemplary embodiments of the present disclosure described above in the present specification. For example, the processor may perform steps 401-406 as shown in FIG. 4.
An electronic device 1200 according to such an embodiment of the present disclosure is described below with reference to fig. 12. The electronic device 1200 shown in fig. 12 is merely an example, and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 12, the electronic device 1200 is embodied in the form of a general-purpose electronic device. Components of electronic device 1200 may include, but are not limited to: the at least one processor 1201, the at least one computer storage medium 1202, and a bus 1203 that connects the various system components, including the computer storage medium 1202 and the processor 1201.
Bus 1203 represents one or more of several types of bus structures, including a computer storage medium bus or computer storage medium controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
Computer storage media 1202 may include readable media in the form of volatile computer storage media, such as random access computer storage media (RAM) 1221 and/or cache storage media 1222, and may further include read only computer storage media (ROM) 1223.
Computer storage media 1202 may also include a program/utility 1225 having a set (at least one) of program modules 1224, such program modules 1224 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The electronic device 1200 may also communicate with one or more external devices 1204 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 1200, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 1200 to communicate with one or more other electronic devices. Such communication may occur through an input/output (I/O) interface 1205. Also, electronic device 1200 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 1206. As shown, network adapter 1206 communicates with other modules for electronic device 1200 over bus 1203. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 1200, including, but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
In some possible embodiments, aspects of a method for detecting plant vigor provided by the present disclosure may also be implemented in the form of a program product comprising program code for causing a computer device to carry out the steps of the method for detecting plant vigor according to various exemplary embodiments of the present disclosure as described above when the program product is run on a computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access computer storage medium (RAM), a read-only computer storage medium (ROM), an erasable programmable read-only computer storage medium (EPROM or flash memory), an optical fiber, a portable compact disc read-only computer storage medium (CD-ROM), an optical computer storage medium, a magnetic computer storage medium, or any suitable combination of the foregoing.
The program product of the detection of plant vigour of embodiments of the present disclosure may employ a portable compact disc read-only computer storage medium (CD-ROM) and include program code and may run on an electronic device. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device, partly on the remote electronic device, or entirely on the remote electronic device or server. In the case of remote electronic devices, the remote electronic device may be connected to the consumer electronic device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external electronic device (e.g., connected through the internet using an internet service provider).
It should be noted that although several modules of the apparatus are mentioned in the detailed description above, this division is merely exemplary and not mandatory. Indeed, the features and functions of two or more modules described above may be embodied in one module in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module described above may be further divided into a plurality of modules to be embodied.
Furthermore, although the operations of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
It will be apparent to those skilled in the art that embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, magnetic disk computer storage media, CD-ROM, optical computer storage media, and the like) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the disclosure. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable computer storage medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable computer storage medium produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present disclosure without departing from the spirit or scope of the disclosure. Thus, the present disclosure is intended to include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (14)

1. A method for detecting plant growth vigor, the method comprising:
acquiring a top image of a plant and a side image of the plant at a designated time interval;
inputting the top image into a pre-trained plant detection model for plant detection to obtain the position of a bounding box of each plant in the top image;
determining, for any one plant in the top image, a width of the plant based on the position of the bounding box of the plant;
determining a stalk height of the plant using the side image;
obtaining a height of the plant according to a measured distance of the plant, wherein the measured distance is measured by a depth sensor and is the distance between the top end of the plant and the depth sensor; and
obtaining the growth vigor of the plant from the width of the plant, the stalk height, and the plant height.
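The per-plant measurements of claim 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the sensor mount height, the simple subtraction for plant height, and all names are assumptions.

```python
from dataclasses import dataclass


@dataclass
class PlantGrowth:
    width_m: float          # plant width from the top-view bounding box
    stalk_height_m: float   # stalk height from the side-view bounding box
    height_m: float         # overall plant height from the depth reading


def plant_height_from_depth(sensor_mount_height_m: float,
                            measured_distance_m: float) -> float:
    """Plant height as the sensor's mounting height above the ground
    minus the measured distance from the sensor to the plant's top."""
    return sensor_mount_height_m - measured_distance_m


def assess_growth(width_m: float, stalk_height_m: float,
                  sensor_mount_height_m: float,
                  measured_distance_m: float) -> PlantGrowth:
    """Bundle the three per-plant measurements named in claim 1."""
    height_m = plant_height_from_depth(sensor_mount_height_m,
                                       measured_distance_m)
    return PlantGrowth(width_m, stalk_height_m, height_m)
```

For example, a sensor mounted 2.0 m above the bed reading a 1.2 m distance to the canopy would imply a 0.8 m plant.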
2. The method of claim 1, wherein the plant growth vigor further comprises a plant density;
after inputting the top image into the pre-trained plant detection model for plant detection and obtaining the position of the bounding box of each plant in the top image, the method further comprises:
obtaining an intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field-of-view area; and
determining the average of the intermediate plant densities of the top images acquired over the designated time interval as the plant density.
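The density computation of claim 2 reduces to a box count per unit area, averaged over the sampling window. A minimal sketch (function names are illustrative assumptions):

```python
def intermediate_density(num_boxes: int, field_area_m2: float) -> float:
    """Plants per square metre for a single top image."""
    return num_boxes / field_area_m2


def plant_density(box_counts: list, field_area_m2: float) -> float:
    """Average the per-image intermediate densities collected over the
    designated time interval to get the reported plant density."""
    densities = [intermediate_density(n, field_area_m2) for n in box_counts]
    return sum(densities) / len(densities)
```

Averaging over several images smooths out per-frame detection misses at the field-of-view edges.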
3. The method of claim 1, wherein the position of the bounding box comprises image position coordinates of the four vertices of the bounding box;
and wherein determining the width of the plant based on the position of the bounding box of the plant comprises:
converting the image position coordinates of each vertex of the bounding box with a preset depth-camera calibration matrix to obtain the actual position coordinates of that vertex in a world coordinate system;
obtaining the length and the width of the bounding box from the actual position coordinates of the vertices, wherein the width is the larger of the two values; and
determining the width of the plant based on the width of the bounding box.
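A sketch of the vertex conversion and width selection in claim 3. The patent does not give the form of the "depth camera calibration matrix"; here it is assumed to act as a 3x3 planar homography, and the vertices are assumed to be listed in order around the box.

```python
import numpy as np


def pixel_to_world(calib: np.ndarray, u: float, v: float) -> np.ndarray:
    """Map an image coordinate (u, v) to planar world coordinates using a
    3x3 homography (assumed form of the calibration matrix)."""
    x, y, w = calib @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])


def box_width(calib: np.ndarray, vertices_px) -> float:
    """World-frame side lengths of the bounding box; per the claim, the
    larger of the two is reported as the plant width."""
    pts = [pixel_to_world(calib, u, v) for u, v in vertices_px]
    side_a = np.linalg.norm(pts[1] - pts[0])
    side_b = np.linalg.norm(pts[2] - pts[1])
    length, width = sorted((side_a, side_b))
    return width
```

With an identity calibration, a 3x4 box in pixel space yields a width of 4 world units.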
4. The method of claim 1, wherein determining the stalk height of the plant using the side image comprises:
inputting the side image into a pre-trained stalk detection model for stalk detection to obtain the position of a bounding box of the stalk of each plant in the side image, wherein the position of the bounding box comprises image position coordinates of each vertex of the bounding box;
determining a target stalk position number corresponding to the plant position number of the plant using a preset correspondence between plant position numbers and stalk position numbers;
converting the image position coordinates of each vertex of the bounding box of the stalk corresponding to the target stalk position number with a preset vision-camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system; and
obtaining the stalk height of the plant based on the actual position coordinates of the vertices.
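The final step of claim 4 can be sketched as taking the vertical extent of the stalk's bounding box in world coordinates. As above, a 3x3 homography is an assumed stand-in for the vision-camera calibration matrix, and the position-number mapping is illustrative.

```python
import numpy as np

# Preset correspondence between plant position numbers and stalk position
# numbers (claim 4); these values are purely illustrative.
PLANT_TO_STALK = {1: 1, 2: 2, 3: 3}


def stalk_height(calib: np.ndarray, vertices_px) -> float:
    """Stalk height as the vertical extent of the stalk's bounding box
    after mapping each vertex into world coordinates."""
    world_y = []
    for u, v in vertices_px:
        x, y, w = calib @ np.array([u, v, 1.0])
        world_y.append(y / w)
    return max(world_y) - min(world_y)
```

Using only the vertical extent makes the measurement insensitive to the box's horizontal placement in the side view.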
5. The method of claim 1, wherein the plant growth vigor further comprises a height ratio parameter;
after obtaining the plant height according to the measured distance of the plant, the method further comprises:
dividing the stalk height of the plant by the plant height to obtain the height ratio parameter of the plant.
6. The method of claim 1, wherein the plant detection model is trained by:
acquiring a training sample, wherein the training sample comprises top images of plants and each top image comprises labeled positions of the bounding boxes of the plants;
inputting the training sample into the plant detection model for plant detection to obtain a predicted position of the bounding box of each plant;
obtaining an error value from the labeled position and the predicted position of the bounding box of each plant; and
if the error value is greater than a specified threshold, adjusting specified parameters of the plant detection model and returning to the step of inputting the training sample into the plant detection model for plant detection, until the error value is not greater than the specified threshold, at which point training of the plant detection model ends.
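The threshold-controlled loop of claim 6 can be sketched framework-free. The callables and the toy error model below are stand-ins, not the patent's actual detector or loss:

```python
def train_until_threshold(predict, error_of, adjust, threshold,
                          max_iters=1000):
    """Iterate predict -> error -> adjust while the error exceeds the
    specified threshold, mirroring the training loop of claim 6."""
    for _ in range(max_iters):
        err = error_of(predict())
        if err <= threshold:
            return err          # training ends
        adjust()                # tweak the specified parameters
    raise RuntimeError("error did not fall below threshold in time")


# Toy stand-in: each parameter adjustment halves the prediction error.
state = {"err": 1.0}
final_err = train_until_threshold(
    predict=lambda: None,
    error_of=lambda _: state["err"],
    adjust=lambda: state.__setitem__("err", state["err"] / 2),
    threshold=0.1,
)
```

In practice the `predict`/`error_of`/`adjust` roles would be a detector forward pass, a bounding-box regression loss, and an optimizer step.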
7. A detection apparatus for plant growth vigor, wherein the apparatus comprises a depth sensor, a vision sensor, and a processor, wherein:
the depth sensor is configured to acquire a top image of a plant and a measured distance of the plant, the measured distance being the distance between the top end of the plant and the depth sensor;
the vision sensor is configured to acquire a side image of the plant; and
the processor is connected to the depth sensor and the vision sensor, and is configured to, at a designated time interval, input the top image into a pre-trained plant detection model for plant detection to obtain the position of a bounding box of each plant in the top image;
determine, for any one plant in the top image, a width of the plant based on the position of the bounding box of the plant;
determine a stalk height of the plant using the side image;
obtain a height of the plant according to the measured distance of the plant; and
obtain the growth vigor of the plant from the width of the plant, the stalk height, and the plant height.
8. The apparatus of claim 7, wherein the plant growth vigor further comprises a plant density;
and wherein the processor is further configured to:
input the top image into the pre-trained plant detection model for plant detection and, after obtaining the position of the bounding box of each plant in the top image, obtain an intermediate plant density corresponding to the top image according to the total number of bounding boxes in the top image and a preset field-of-view area; and
determine the average of the intermediate plant densities of the top images acquired over the designated time interval as the plant density.
9. The apparatus of claim 7, wherein the position of the bounding box comprises image position coordinates of the four vertices of the bounding box;
and wherein, when determining the width of the plant based on the position of the bounding box of the plant, the processor is specifically configured to:
convert the image position coordinates of each vertex of the bounding box with a preset depth-camera calibration matrix to obtain the actual position coordinates of that vertex in a world coordinate system;
obtain the length and the width of the bounding box from the actual position coordinates of the vertices, wherein the width is the larger of the two values; and
determine the width of the plant based on the width of the bounding box.
10. The apparatus of claim 7, wherein, when determining the stalk height of the plant using the side image, the processor is specifically configured to:
input the side image into a pre-trained stalk detection model for stalk detection to obtain the position of a bounding box of the stalk of each plant in the side image, wherein the position of the bounding box comprises image position coordinates of each vertex of the bounding box;
determine a target stalk position number corresponding to the plant position number of the plant using a preset correspondence between plant position numbers and stalk position numbers;
convert the image position coordinates of each vertex of the bounding box of the stalk corresponding to the target stalk position number with a preset vision-camera calibration matrix to obtain the actual position coordinates of each vertex in a world coordinate system; and
obtain the stalk height of the plant based on the actual position coordinates of the vertices.
11. The apparatus of claim 8, wherein the plant growth vigor further comprises a height ratio parameter;
and wherein, after obtaining the plant height according to the measured distance of the plant, the processor is further configured to:
divide the stalk height of the plant by the plant height to obtain the height ratio parameter of the plant.
12. The apparatus of claim 7, wherein the processor is further configured to train the plant detection model by:
acquiring a training sample, wherein the training sample comprises top images of plants and each top image comprises labeled positions of the bounding boxes of the plants;
inputting the training sample into the plant detection model for plant detection to obtain a predicted position of the bounding box of each plant;
obtaining an error value from the labeled position and the predicted position of the bounding box of each plant; and
if the error value is greater than a specified threshold, adjusting specified parameters of the plant detection model and returning to the step of inputting the training sample into the plant detection model for plant detection, until the error value is not greater than the specified threshold, at which point training of the plant detection model ends.
13. A plant growth detection device, the device comprising:
a plant image acquisition module, configured to acquire a top image of a plant and a side image of the plant at a designated time interval;
a plant detection module, configured to input the top image into a pre-trained plant detection model for plant detection to obtain the position of a bounding box of each plant in the top image;
a plant width determination module, configured to determine, for any one plant in the top image, a width of the plant based on the position of the bounding box of the plant;
a stalk height determination module, configured to determine a stalk height of the plant using the side image;
a plant height determination module, configured to obtain a height of the plant according to a measured distance of the plant, wherein the measured distance is measured by a depth sensor and is the distance between the top end of the plant and the depth sensor; and
a plant growth determination module, configured to obtain the growth vigor of the plant from the width of the plant, the stalk height, and the plant height.
14. A computer storage medium, characterized in that it stores a computer program for executing the method according to any one of claims 1-6.
CN202310162809.3A 2023-02-17 2023-02-17 Plant growth detection method, equipment, device and computer storage medium Pending CN116468657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310162809.3A CN116468657A (en) 2023-02-17 2023-02-17 Plant growth detection method, equipment, device and computer storage medium

Publications (1)

Publication Number Publication Date
CN116468657A true CN116468657A (en) 2023-07-21

Family

ID=87174145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310162809.3A Pending CN116468657A (en) 2023-02-17 2023-02-17 Plant growth detection method, equipment, device and computer storage medium

Country Status (1)

Country Link
CN (1) CN116468657A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109115776A (en) * 2018-08-27 2019-01-01 江苏大学 A kind of Plug seedling growing way non-destructive monitoring method and device based on color and depth information
CN109886094A (en) * 2019-01-08 2019-06-14 中国农业大学 A kind of crop growth of cereal crop seedlings seedling gesture capturing analysis method and device
CN111046730A (en) * 2019-11-08 2020-04-21 北京海益同展信息科技有限公司 Plant data processing method and device, computer equipment and storage medium
CN114066842A (en) * 2021-11-12 2022-02-18 浙江托普云农科技股份有限公司 Method, system and device for counting number of ears and storage medium
CN114170148A (en) * 2021-11-12 2022-03-11 浙江托普云农科技股份有限公司 Corn plant type parameter measuring method, system, device and storage medium
CN114812418A (en) * 2022-04-25 2022-07-29 安徽农业大学 Portable plant density and plant spacing measurement system
CN114820758A (en) * 2021-01-12 2022-07-29 富泰华工业(深圳)有限公司 Plant growth height measuring method, device, electronic device and medium
KR102469815B1 (en) * 2022-02-14 2022-11-23 주식회사 리트빅 3D AVM system by use of Deep Learning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A.D. Nakarmi et al., "Automatic inter-plant spacing sensing at early growth stages using a 3D vision sensor", Computers and Electronics in Agriculture, 31 March 2012 *
Ding Qishuo; Li Haikang; Sun Kerun; He Ruiyin; Wang Xiaochan; Liu Fuxi; Li Xiang: "Machine-vision-based high-throughput phenotyping of single stems and ears of rice-stubble wheat", Scientia Agricultura Sinica, no. 01, 1 January 2020 *
Song Lei et al., "Seedling plant height measurement method based on ResNeXt monocular depth estimation", Transactions of the Chinese Society of Agricultural Engineering, 28 February 2022 *

Similar Documents

Publication Publication Date Title
Liu et al. A vision-based robust grape berry counting algorithm for fast calibration-free bunch weight estimation in the field
US20220012907A1 (en) Volume measurement method and system, apparatus and computer-readable storage medium
CN115187527B (en) Separation and identification method for multi-source mixed ultrahigh frequency partial discharge spectrum
CN110991452B (en) Parking space frame detection method, device, equipment and readable storage medium
CN111126662A (en) Irrigation decision making method, device, server and medium based on big data
CN114926511A (en) High-resolution remote sensing image change detection method based on self-supervision learning
CN111598942A (en) Method and system for automatically positioning electric power facility instrument
CN113255590A (en) Defect detection model training method, defect detection method, device and system
CN113238578A (en) Routing planning method and system for power tower unmanned aerial vehicle inspection route
WO2020093631A1 (en) Antenna downtilt angle measurement method based on depth instance segmentation network
CN114387253A (en) Infrared image processing method and device for defects of external thermal insulation layer of external wall and storage medium
CN109934151B (en) Face detection method based on movidius computing chip and Yolo face
CN117333776A (en) VOCs gas leakage detection method, device and storage medium
CN115757369A (en) Automatic inspection method and system for laser point cloud data
CN110738272A (en) method for labeling visualized mechanical continuous alarm samples of power transmission line channel
CN111060922B (en) Tree point cloud extraction method based on airborne laser radar point cloud spatial distribution characteristics
CN116468657A (en) Plant growth detection method, equipment, device and computer storage medium
CN112132135B (en) Power grid transmission line detection method based on image processing and storage medium
CN115272815A (en) Cable tunnel environment abnormity identification method based on image
CN114169404A (en) Method for intelligently acquiring quantitative information of slope diseases based on images
CN112950504A (en) Power transmission line inspection haze weather monocular hidden danger object distance measurement method and system
CN113096027B (en) Point cloud-based farmland soil layer horizontal correction and removal method
CN113158743B (en) Small target real-time detection and positioning method, system and equipment based on priori knowledge
CN117496368A (en) Drought color-changing tree trunk metering detection method based on visible light visual remote sensing
CN117522950B (en) Geometric parameter measurement method for plant stem growth based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination