CN116246225A - Crop breeding monitoring method and system based on image processing - Google Patents

Crop breeding monitoring method and system based on image processing

Info

Publication number
CN116246225A
CN116246225A
Authority
CN
China
Prior art keywords
locus
seedling
point
breeding
breeding area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310531713.XA
Other languages
Chinese (zh)
Inventor
赵磊
王克响
宋文宇
王兰丰
龚美娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Agricultural University
Original Assignee
Qingdao Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Agricultural University filed Critical Qingdao Agricultural University
Priority to CN202310531713.XA priority Critical patent/CN116246225A/en
Publication of CN116246225A publication Critical patent/CN116246225A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
    • Y02A40/25Greenhouse technology, e.g. cooling systems therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a crop breeding monitoring method and system based on image processing, wherein the method comprises the following steps: data acquisition, edge detection processing, calculating the number of rows of seedlings in the breeding area, calculating the number of columns of seedlings in the breeding area, calculating the seedling density in the breeding area, and calculating the seedling height in the breeding area. The invention belongs to the technical field of crop breeding, and particularly relates to a crop breeding monitoring method and system based on image processing.

Description

Crop breeding monitoring method and system based on image processing
Technical Field
The invention belongs to the technical field of crop breeding, and particularly relates to a crop breeding monitoring method and system based on image processing.
Background
Selecting high-quality varieties through crop breeding is a principal way to improve crop yield and quality, and crop breeding therefore plays an important role in agricultural production. Because the crop growth period is long, analyzing the breeding situation manually consumes a great deal of a breeding expert's time, data are difficult to collect, and the results are not accurate enough. Reliable phenotype analysis has become the core of plant breeding, yet a method and system that can effectively monitor the crop growth process and crop characteristics to help breeding experts analyze are still lacking.
When breeding images are collected, interference from various random signals easily introduces a large amount of noise, which greatly reduces image quality and affects subsequent processing. The accuracy of edge positioning also strongly affects later operations, and existing edge detection algorithms for breeding images suffer from inaccurate positioning and low efficiency caused by noise interference. In addition, few methods exist for extracting useful data from breeding images to help breeding experts monitor crop breeding, so how to analyze useful data information from the collected breeding images to monitor crop breeding remains a technical problem.
Disclosure of Invention
Aiming at the problems that in the crop breeding process the growth period is long, breeding data are difficult to collect and insufficiently accurate, and a method and system capable of effectively recording the crop growth process and crop characteristics to help breeding experts analyze are lacking, the invention provides a crop breeding monitoring system based on image processing to monitor the crop breeding process, collect crop breeding images in real time, and effectively record the growth process and crop characteristics. Aiming at the technical problems of inaccurate positioning and low efficiency caused by noise interference in existing edge detection algorithms for breeding images, the invention adopts an edge detection algorithm that merges several image-processing stages into a single algorithm, achieving better noise reduction, more accurate edge positioning, and greatly improved image-processing efficiency. Aiming at the technical problem that few methods exist for acquiring effective data from a breeding image to help a breeding expert monitor crop breeding, the invention obtains the number of rows and the number of columns of seedlings in the breeding area with dedicated row-counting and column-counting algorithms, further obtains the seedling density to judge the emergence rate, and calls an OpenCV library function to obtain the seedling height in the breeding area, so that crop characteristics and the growing process are monitored in real time during breeding.
The technical scheme adopted by the invention is as follows: the invention provides a crop breeding monitoring method based on image processing, which comprises the following steps:
step S1: collecting data;
step S2: edge detection processing;
step S3: calculating the number of rows of seedlings in the breeding area;
step S4: calculating the number of columns of seedlings in the breeding area;
step S5: calculating the seedling density in the breeding area;
step S6: and calculating the height of the seedling in the breeding area.
Further, in step S1, the data acquisition is acquisition of crop breeding images.
Further, in step S2, a convolution kernel (given as image SMS_1 in the original) is set in advance for the edge detection process, which specifically comprises the following steps:
step S21: traversing the convolution kernel on an original image to obtain a gray matrix after traversing, wherein the original image is a gray matrix of a crop breeding image, and the step S21 comprises a step S211, a step S212 and a step S213;
in step S211, the original image is differentiated in the horizontal direction according to the formula given as image SMS_2 in the original, where I is the gray matrix of the original image;
in step S212, the original image is differentiated in the vertical direction according to the formula given as image SMS_4 in the original;
in step S213, the gray matrix after traversal is calculated from the horizontal and vertical differentiation results according to the formula given as image SMS_6 in the original, where F is the gray matrix obtained after traversal;
step S22: expanding the gray matrix after traversing for one circle, filling the gray matrix with a value of 0, and traversing the convolution kernel on the gray matrix after filling again to obtain a breeding area image.
Further, in step S3, calculating the number of rows of seedlings in the breeding area specifically includes the following steps:
step S31: acquiring coordinates of a left locus and a right locus, scanning the middle-most row and the left row of seedling plants of the middle-most row of a breeding area to obtain center coordinates of two rows of seedling plants, taking the center coordinates of the two rows of seedling plants as the coordinates of the left locus, scanning the middle-most row and the right row of seedling plants of the middle-most row of the breeding area to obtain center coordinates of the two rows of seedling plants, and taking the center coordinates of the two rows of seedling plants as the coordinates of the right locus;
step S32: the step of obtaining coordinates of the left intercept point and the right intercept point includes:
step S321: acquiring a rotation image of a breeding area image;
step S322: taking the center point of the seedling at the center position of the breeding area as a center seedling point, and taking the center seedling point as a starting point, scanning the rotating image leftwards until a black pixel point is scanned;
step S323: taking the previous point of the scanned black pixel point as the coordinate of the left intercept point;
step S324: taking the central seedling point as a starting point, scanning the rotating image rightwards until a black pixel point is scanned;
step S325: taking the previous point of the scanned black pixel point as the coordinate of the right intercept point;
step S33: calculating the number of rows of seedlings in the breeding area, wherein the three rows of seedlings between the left locus and the right locus are set in advance as the main three rows, and the step of calculating the number of rows of seedlings in the breeding area comprises the following steps:
step S331: carrying out inverse rotation transformation on the breeding area image, projecting the inversely rotated breeding area image, and obtaining the coordinates of the projection point of the left locus and the coordinates of the projection point of the right locus;
step S332: calculating the projection height of the left locus and the projection height of the right locus according to the formulas given as images SMS_7 and SMS_8 in the original, wherein a1 is the distance between the left intercept point and the projection point of the left locus, b1 is the distance between the projection point of the left locus and the projection point of the right locus, and c1 is the distance between the projection point of the right locus and the right intercept point;
step S333: calculating the distance between the left locus and the left intercept point, the distance between the left locus and the right locus, and the distance between the right locus and the right intercept point according to the formulas given as images SMS_11, SMS_12 and SMS_13 in the original;
step S334: calculating the central angles corresponding to the three distances obtained in step S333 according to the formulas given as images SMS_21, SMS_23 and SMS_25 in the original, wherein r1 is the distance between the left intercept point and the center seedling point;
step S335: calculating the proportional relation among the arc between the left locus and the left intercept point, the arc between the left locus and the right locus, and the arc between the right locus and the right intercept point (image SMS_32 in the original);
step S336: calculating A1, the number of seedling rows contained in the arc denoted by image SMS_36 in the original, according to the formula given as image SMS_37;
step S337: calculating C1, the number of seedling rows contained in the arc denoted by image SMS_39 in the original, according to the formula given as image SMS_40;
step S338: calculating P, the number of rows of seedlings in the breeding area, according to the formula given as image SMS_42 in the original.
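The pixel scan described in steps S321 to S325 is concrete enough to sketch. The following Python/NumPy sketch is illustrative only; the function and variable names are not from the patent, and it assumes the rotated breeding-area image is a gray image in which background pixels are black (value 0).

```python
import numpy as np

def find_left_right_intercepts(rotated, center):
    """Sketch of steps S321-S325: starting from the center seedling point, scan the
    rotated breeding-area image leftwards and rightwards until a black pixel is met;
    the point just before that black pixel is taken as the intercept point."""
    row, col = center                                   # center seedling point (row, column)
    c = col
    while c - 1 >= 0 and rotated[row, c - 1] != 0:      # scan leftwards
        c -= 1
    left_intercept = (row, c)                           # previous point of the first black pixel
    c = col
    while c + 1 < rotated.shape[1] and rotated[row, c + 1] != 0:  # scan rightwards
        c += 1
    right_intercept = (row, c)
    return left_intercept, right_intercept
```

The same routine, scanning upwards and downwards instead, would give the upper and lower intercept points used in step S4.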
Further, in step S4, the method specifically includes the following steps:
step S41: acquiring coordinates of an upper locus and a lower locus, scanning the middle-most row and the upper row of seedling plants in the middle-most row of a breeding area to obtain center coordinates of two rows of seedling plants, taking the center coordinates of the two rows of seedling plants as the coordinates of the upper locus, scanning the middle-most row and the lower row of seedling plants in the middle-most row of the breeding area to obtain center coordinates of the two rows of seedling plants, and taking the center coordinates of the two rows of seedling plants as the coordinates of the lower locus;
step S42: the step of acquiring coordinates of the upper intercept point and the lower intercept point includes:
step S421: acquiring a rotation image of a breeding area image;
step S422: taking the central point of the seedling at the most central position of the breeding area as a central seedling point, and taking the central seedling point as a starting point, scanning the rotating image upwards until a black pixel point is scanned;
step S423: taking the previous point of the scanned black pixel point as the coordinate of the upper intercept point;
step S424: taking the central seedling as a starting point, scanning the rotating image downwards until a black pixel point is scanned;
step S425: taking the previous point of the scanned black pixel point as the coordinate of the lower intercept point;
step S43: calculating the number of columns of seedlings in the breeding area, wherein the three columns of seedlings between the upper locus and the lower locus are set in advance as the main three columns, and the step of calculating the number of columns of seedlings in the breeding area comprises the following steps:
step S431: carrying out inverse rotation transformation on the breeding area image, projecting the inversely rotated breeding area image, and obtaining the coordinates of the projection point of the upper locus and the coordinates of the projection point of the lower locus;
step S432: calculating the projection height of the upper locus and the projection height of the lower locus according to the formulas given as images SMS_43 and SMS_44 in the original, wherein a2 is the distance between the upper intercept point and the projection point of the upper locus, b2 is the distance between the projection point of the upper locus and the projection point of the lower locus, and c2 is the distance between the projection point of the lower locus and the lower intercept point;
step S433: calculating the distance between the upper locus and the upper intercept point, the distance between the upper locus and the lower locus, and the distance between the lower locus and the lower intercept point according to the formulas given as images SMS_47, SMS_48 and SMS_49 in the original;
step S434: calculating the central angles corresponding to the three distances obtained in step S433 according to the formulas given as images SMS_57, SMS_59 and SMS_61 in the original, wherein r2 is the distance between the upper intercept point and the center seedling point;
step S435: calculating the proportional relation among the arc between the upper locus and the upper intercept point, the arc between the upper locus and the lower locus, and the arc between the lower locus and the lower intercept point (image SMS_68 in the original);
step S436: calculating A2, the number of seedling columns contained in the arc denoted by image SMS_72 in the original, according to the formula given as image SMS_73;
step S437: calculating C2, the number of seedling columns contained in the arc denoted by image SMS_75 in the original, according to the formula given as image SMS_76;
step S438: calculating Q, the number of columns of seedlings in the breeding area, according to the formula given as image SMS_78 in the original.
Further, in step S5, the area of the breeding area is preset, and the seedling density in the breeding area is calculated according to the formula given as image SMS_79 in the original, wherein Q is the number of seedling columns in the breeding area, ρ is the seedling density in the breeding area, and s is the area of the breeding area.
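The density formula itself is reproduced in the original only as an image. One plausible reading, assuming the P rows and Q columns of seedlings together give approximately P·Q plants over the preset area s (this is an assumption, not stated in the text), is:

```latex
\rho = \frac{P \cdot Q}{s}
```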
Further, in step S6, the seedling height in the breeding area is calculated by calling an OpenCV library function to find the minimum circumscribed rectangle of the seedling; the length of this rectangle, in image pixels, is taken as the seedling height.
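A minimal OpenCV sketch of step S6 follows, assuming the seedling has already been segmented into a binary mask; the function name, the use of the largest contour, and taking the longer side of the rotated rectangle as the height are illustrative assumptions, not details from the patent.

```python
import cv2

def seedling_height_px(mask):
    """Find the minimum circumscribed rectangle of the seedling and return its
    longer side, in image pixels, as the seedling height."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0.0
    seedling = max(contours, key=cv2.contourArea)   # assume the largest blob is the seedling
    (_, _), (w, h), _ = cv2.minAreaRect(seedling)   # minimum-area (circumscribed) rectangle
    return max(w, h)                                # longer side taken as seedling height in pixels
```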
The invention provides a crop breeding monitoring system based on image processing, which comprises a data acquisition module, an edge detection processing module, a module for acquiring the number of rows of seedlings in the breeding area, a module for acquiring the number of columns of seedlings in the breeding area, a module for acquiring the seedling density in the breeding area, and a module for acquiring the seedling height in the breeding area. The data acquisition module collects crop breeding images during the breeding process and sends them to the edge detection processing module. The edge detection processing module receives the crop breeding images, performs edge detection on them with the edge detection algorithm, and sends the resulting breeding area images to the row-number module, the column-number module and the seedling-height module. The row-number module and the column-number module receive the breeding area image, obtain the number of rows and the number of columns of seedlings in the breeding area with the corresponding algorithms, and send these values to the seedling-density module. The seedling-density module receives the number of rows and the number of columns, calculates the seedling density in the breeding area, and outputs it. The seedling-height module receives the breeding area image sent by the edge detection processing module and calls an OpenCV library function to obtain the seedling height.
By adopting the scheme, the beneficial effects obtained by the invention are as follows:
(1) Aiming at the technical problems that the growth period in the crop breeding process is long, breeding data are difficult to collect and insufficiently accurate, and a method and system for effectively recording the crop growth process and crop characteristics to help breeding experts analyze are lacking, the invention provides a crop breeding monitoring system based on image processing to monitor the crop breeding process, collect crop breeding images in real time, and effectively record the crop growth process and crop characteristics.
(2) Aiming at the technical problems of inaccurate positioning and low efficiency caused by noise interference in the conventional edge detection algorithm of the breeding image, the invention adopts the edge detection algorithm, simplifies a plurality of processes of image processing into one algorithm, realizes better noise reduction effect and more accurate edge positioning effect, and greatly improves the efficiency of image processing.
(3) Aiming at the technical problem that few methods exist for acquiring effective data from a breeding image to help a breeding expert monitor crop breeding, the invention obtains the number of rows and the number of columns of seedlings in the breeding area with dedicated row-counting and column-counting algorithms, further obtains the seedling density to judge the emergence rate, and calls an OpenCV library function to obtain the seedling height in the breeding area, so that crop characteristics and the growing process are monitored in real time during breeding.
Drawings
FIG. 1 is a schematic flow chart of a crop breeding monitoring method based on image processing;
FIG. 2 is a schematic flow chart of a crop breeding monitoring system based on image processing;
FIG. 3 is a flow chart of step S3;
fig. 4 is a flow chart of step S4;
FIG. 5 is a crop breeding image acquired in step S1;
FIG. 6 is a breeding area image processed by the edge detection algorithm adopted in the scheme in the step S2;
fig. 7 is an image of a breeding area obtained by processing by another edge detection algorithm in step S2.
The accompanying drawings are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate the invention and together with the embodiments of the invention, serve to explain the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all embodiments of the invention; all other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the description of the present invention, it should be understood that the terms "upper," "lower," "front," "rear," "left," "right," "top," "bottom," "inner," "outer," and the like indicate orientation or positional relationships based on those shown in the drawings, merely to facilitate description of the invention and simplify the description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the invention.
In an embodiment, referring to fig. 1, the present invention provides a crop breeding monitoring method based on image processing, which includes the following steps:
step S1: collecting data;
step S2: edge detection processing;
step S3: calculating the number of rows of seedlings in the breeding area;
step S4: calculating the number of columns of seedlings in the breeding area;
step S5: calculating the seedling density in the breeding area;
step S6: and calculating the height of the seedling in the breeding area.
In a second embodiment, referring to fig. 1, based on the above embodiment, in step S2 a convolution kernel (given as image SMS_80 in the original) is set in advance for the edge detection process, which specifically comprises the following steps:
step S21: traversing the convolution kernel over an original image to obtain a gray matrix after traversal, wherein the original image is the gray matrix of a crop breeding image, and step S21 comprises steps S211, S212 and S213;
in step S211, the original image is differentiated in the horizontal direction according to the formula given as image SMS_81 in the original, where I is the gray matrix of the original image;
in step S212, the original image is differentiated in the vertical direction according to the formula given as image SMS_83 in the original;
in step S213, the gray matrix after traversal is calculated from the horizontal and vertical differentiation results according to the formula given as image SMS_85 in the original, where F is the gray matrix obtained after traversal;
step S22: padding the traversed gray matrix with a one-pixel border of zeros, and traversing the convolution kernel over the padded gray matrix once more to obtain the breeding area image.
By executing the operation, aiming at the technical problems of inaccurate positioning and low efficiency caused by noise interference in the conventional edge detection algorithm of the breeding image, the invention adopts the edge detection algorithm, simplifies a plurality of processes of image processing into one algorithm, realizes better noise reduction effect and more accurate edge positioning effect, and greatly improves the efficiency of image processing.
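As a concrete illustration of steps S21 and S22, the following NumPy sketch traverses a kernel over the gray matrix, combines the horizontal and vertical differentiation results, pads the result with a one-pixel border of zeros, and traverses again. The 3x3 Sobel kernels and the gradient-magnitude combination are assumptions made for illustration, since the patent's kernel and formulas appear only as images.

```python
import numpy as np

# The patent's kernel is shown only as an image; Sobel kernels are assumed here.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def traverse(gray, kx=KX, ky=KY):
    """Step S21 (sketch): slide the kernel over the gray matrix, differentiate
    horizontally and vertically, and combine the two results."""
    h, w = gray.shape
    kh, kw = kx.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            patch = gray[r:r + kh, c:c + kw]
            gx = float(np.sum(patch * kx))     # horizontal differentiation result
            gy = float(np.sum(patch * ky))     # vertical differentiation result
            out[r, c] = np.hypot(gx, gy)       # assumed combination: gradient magnitude
    return out

def edge_detect(gray):
    """Step S22 (sketch): pad the traversed matrix with a one-pixel border of zeros
    and traverse the kernel over it once more to obtain the breeding area image."""
    first = traverse(gray)
    padded = np.pad(first, 1, mode="constant", constant_values=0)
    return traverse(padded)
```

Calling edge_detect on the gray matrix of a crop breeding image would then yield the breeding area image used by the later steps.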
In a third embodiment, referring to fig. 1 and fig. 3, in step S3 the calculation of the number of rows of seedlings in the breeding area specifically includes the following steps:
step S31: acquiring coordinates of a left locus and a right locus, scanning the middle-most row and the left row of seedling plants of the middle-most row of a breeding area to obtain center coordinates of two rows of seedling plants, taking the center coordinates of the two rows of seedling plants as the coordinates of the left locus, scanning the middle-most row and the right row of seedling plants of the middle-most row of the breeding area to obtain center coordinates of the two rows of seedling plants, and taking the center coordinates of the two rows of seedling plants as the coordinates of the right locus;
step S32: the step of obtaining coordinates of the left intercept point and the right intercept point includes:
step S321: acquiring a rotation image of a breeding area image;
step S322: taking the center point of the seedling at the center position of the breeding area as a center seedling point, and taking the center seedling point as a starting point, scanning the rotating image leftwards until a black pixel point is scanned;
step S323: taking the previous point of the scanned black pixel point as the coordinate of the left intercept point;
step S324: taking the central seedling point as a starting point, scanning the rotating image rightwards until a black pixel point is scanned;
step S325: taking the previous point of the scanned black pixel point as the coordinate of the right intercept point;
step S33: calculating the number of rows of seedlings in the breeding area, wherein the three rows of seedlings between the left locus and the right locus are set in advance as the main three rows, and the step of calculating the number of rows of seedlings in the breeding area comprises the following steps:
step S331: carrying out inverse rotation transformation on the breeding area image, projecting the inversely rotated breeding area image, and obtaining the coordinates of the projection point of the left locus and the coordinates of the projection point of the right locus;
step S332: calculating the projection height of the left locus and the projection height of the right locus according to the formulas given as images SMS_86 and SMS_87 in the original, wherein a1 is the distance between the left intercept point and the projection point of the left locus, b1 is the distance between the projection point of the left locus and the projection point of the right locus, and c1 is the distance between the projection point of the right locus and the right intercept point;
step S333: calculating the distance between the left locus and the left intercept point, the distance between the left locus and the right locus, and the distance between the right locus and the right intercept point according to the formulas given as images SMS_90, SMS_91 and SMS_92 in the original;
step S334: calculating the central angles corresponding to the three distances obtained in step S333 according to the formulas given as images SMS_100, SMS_102 and SMS_104 in the original, wherein r1 is the distance between the left intercept point and the center seedling point;
step S335: calculating the proportional relation among the arc between the left locus and the left intercept point, the arc between the left locus and the right locus, and the arc between the right locus and the right intercept point (image SMS_111 in the original);
step S336: calculating A1, the number of seedling rows contained in the arc denoted by image SMS_115 in the original, according to the formula given as image SMS_116;
step S337: calculating C1, the number of seedling rows contained in the arc denoted by image SMS_118 in the original, according to the formula given as image SMS_119;
step S338: calculating P, the number of rows of seedlings in the breeding area, according to the formula given as image SMS_121 in the original.
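Steps S321/S421 and S331/S431 rely on rotating the breeding-area image, mapping points found in the rotated image back through the inverse rotation, and projecting the result. A hedged OpenCV sketch of those operations follows; the rotation angle, interpolation defaults and the simple column-sum projection are assumptions, since the patent does not spell them out.

```python
import cv2
import numpy as np

def rotate_image(img, angle_deg):
    """Rotate the breeding-area image about its center (steps S321/S421, sketch)."""
    h, w = img.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
    return cv2.warpAffine(img, M, (w, h)), M

def inverse_rotate_point(point, M):
    """Map a point found in the rotated image back to the original image
    (the inverse rotation transformation of steps S331/S431, sketch)."""
    M_inv = cv2.invertAffineTransform(M)
    x, y = point
    xr, yr = M_inv @ np.array([x, y, 1.0])
    return xr, yr

def column_projection(binary_img):
    """A simple vertical projection (column sums) standing in for the projection
    step; the patent describes the projection only at a high level."""
    return binary_img.sum(axis=0)
```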
In a fourth embodiment, referring to fig. 1 and fig. 4, based on the above embodiment, step S4 specifically includes the following steps:
step S41: acquiring coordinates of an upper locus and a lower locus, scanning the middle-most row and the upper row of seedling plants in the middle-most row of a breeding area to obtain center coordinates of two rows of seedling plants, taking the center coordinates of the two rows of seedling plants as the coordinates of the upper locus, scanning the middle-most row and the lower row of seedling plants in the middle-most row of the breeding area to obtain center coordinates of the two rows of seedling plants, and taking the center coordinates of the two rows of seedling plants as the coordinates of the lower locus;
step S42: the step of acquiring coordinates of the upper intercept point and the lower intercept point includes:
step S421: acquiring a rotation image of a breeding area image;
step S422: taking the central point of the seedling at the most central position of the breeding area as a central seedling point, and taking the central seedling point as a starting point, scanning the rotating image upwards until a black pixel point is scanned;
step S423: taking the previous point of the scanned black pixel point as the coordinate of the upper intercept point;
step S424: taking the central seedling as a starting point, scanning the rotating image downwards until a black pixel point is scanned;
step S425: taking the previous point of the scanned black pixel point as the coordinate of the lower intercept point;
step S43: calculating the number of columns of seedlings in the breeding area, wherein the three columns of seedlings between the upper locus and the lower locus are set in advance as the main three columns, and the step of calculating the number of columns of seedlings in the breeding area comprises the following steps:
step S431: carrying out inverse rotation transformation on the breeding area image, projecting the inversely rotated breeding area image, and obtaining the coordinates of the projection point of the upper locus and the coordinates of the projection point of the lower locus;
step S432: calculating the projection height of the upper locus and the projection height of the lower locus according to the formulas given as images SMS_122 and SMS_123 in the original, wherein a2 is the distance between the upper intercept point and the projection point of the upper locus, b2 is the distance between the projection point of the upper locus and the projection point of the lower locus, and c2 is the distance between the projection point of the lower locus and the lower intercept point;
step S433: calculating the distance between the upper locus and the upper intercept point, the distance between the upper locus and the lower locus, and the distance between the lower locus and the lower intercept point according to the formulas given as images SMS_126, SMS_127 and SMS_128 in the original;
step S434: calculating the central angles corresponding to the three distances obtained in step S433 according to the formulas given as images SMS_136, SMS_138 and SMS_140 in the original, wherein r2 is the distance between the upper intercept point and the center seedling point;
step S435: calculating the proportional relation among the arc between the upper locus and the upper intercept point, the arc between the upper locus and the lower locus, and the arc between the lower locus and the lower intercept point (image SMS_147 in the original);
step S436: calculating A2, the number of seedling columns contained in the arc denoted by image SMS_151 in the original, according to the formula given as image SMS_152;
step S437: calculating C2, the number of seedling columns contained in the arc denoted by image SMS_154 in the original, according to the formula given as image SMS_155;
step S438: calculating Q, the number of columns of seedlings in the breeding area, according to the formula given as image SMS_157 in the original.
In step S5, referring to fig. 1, the area of the breeding area is preset, and the seedling density in the breeding area is calculated according to the formula given as image SMS_158 in the original, wherein Q is the number of seedling columns in the breeding area, ρ is the seedling density in the breeding area, and s is the area of the breeding area.
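The patent uses the seedling density to judge the emergence rate. A minimal sketch of that use, under the assumed ρ = P·Q/s reading of the density formula and with an assumed external count of seeds sown (neither detail is given in the text):

```python
def seedling_density(rows_p, cols_q, area_s):
    """Assumed reading of the density formula shown only as an image in the original."""
    return rows_p * cols_q / area_s

def emergence_rate(rows_p, cols_q, seeds_sown):
    """Illustrative emergence-rate check: detected seedlings over seeds sown."""
    return (rows_p * cols_q) / seeds_sown
```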
In a sixth embodiment, referring to fig. 1 and fig. 5 to fig. 7, based on the above embodiment, the edge detection algorithm adopted in this scheme and another edge detection algorithm are both used to perform edge detection on the crop breeding image acquired in step S1: fig. 5 is the gray scale image of the breeding image, fig. 6 is the breeding area image obtained with the edge detection algorithm adopted in this scheme, and fig. 7 is the breeding area image obtained with the other edge detection algorithm; comparing fig. 6 and fig. 7 shows that the edge detection algorithm adopted in this scheme has better noise immunity.
In step S6, referring to fig. 1, the seedling height in the breeding area is calculated by calling an OpenCV library function to find the minimum circumscribed rectangle of the seedling; the length of this rectangle, in image pixels, is taken as the seedling height.
By executing the above operations, and aiming at the technical problem that few methods exist for acquiring effective data from a breeding image to help a breeding expert monitor crop breeding, the invention obtains the number of rows and the number of columns of seedlings in the breeding area with the row-counting and column-counting algorithms respectively, further obtains the seedling density to judge the emergence rate, and calls an OpenCV library function to obtain the seedling height in the breeding area, so that crop characteristics and the growing process are monitored in real time during breeding.
In an eighth embodiment, referring to fig. 2, based on the above embodiment, the crop breeding monitoring system based on image processing provided by the invention comprises a data acquisition module, an edge detection processing module, a module for acquiring the number of rows of seedlings in the breeding area, a module for acquiring the number of columns of seedlings in the breeding area, a module for acquiring the seedling density in the breeding area, and a module for acquiring the seedling height in the breeding area. The data acquisition module collects crop breeding images during the breeding process and sends them to the edge detection processing module. The edge detection processing module receives the crop breeding images, performs edge detection on them with the edge detection algorithm, and sends the resulting breeding area images to the row-number module, the column-number module and the seedling-height module. The row-number module and the column-number module receive the breeding area image, obtain the number of rows and the number of columns of seedlings in the breeding area with the corresponding algorithms, and send these values to the seedling-density module. The seedling-density module receives the number of rows and the number of columns, calculates the seedling density in the breeding area, and outputs it. The seedling-height module receives the breeding area image sent by the edge detection processing module and calls an OpenCV library function to obtain the seedling height.
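The module wiring described above can be summarized in a short sketch. Class, method and argument names are illustrative only; each callable stands in for one module of the system, and the P·Q/s density formula is the assumed reading noted earlier.

```python
class CropBreedingMonitor:
    """Minimal sketch of the data flow between the modules described above."""

    def __init__(self, edge_detector, row_counter, column_counter, height_estimator, area_s):
        self.edge_detector = edge_detector        # edge detection processing module
        self.row_counter = row_counter            # rows-of-seedlings module
        self.column_counter = column_counter      # columns-of-seedlings module
        self.height_estimator = height_estimator  # seedling-height module (OpenCV-based)
        self.area_s = area_s                      # preset breeding-area size

    def run(self, breeding_image):
        edges = self.edge_detector(breeding_image)       # breeding area image
        rows_p = self.row_counter(edges)
        cols_q = self.column_counter(edges)
        density = rows_p * cols_q / self.area_s          # seedling-density module (assumed formula)
        height = self.height_estimator(edges)
        return {"rows": rows_p, "columns": cols_q, "density": density, "height_px": height}
```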
By executing the operations, the invention provides a crop breeding monitoring system based on an image processing technology for monitoring a crop breeding process, acquiring crop breeding images in real time and effectively recording the crop growth process and the crop characteristics, aiming at the technical problems of long growth period, great breeding data collection difficulty and insufficient accuracy in the crop breeding process and lacking a method and a system for effectively recording the crop growth process and the crop characteristics to help breeding experts to analyze.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
The invention and its embodiments have been described above without limitation, and the actual construction is not limited to the embodiments shown in the drawings. In summary, if a person of ordinary skill in the art, enlightened by this disclosure, devises without inventive effort a structural arrangement or embodiment similar to this technical solution, it shall fall within the protection scope of the invention.

Claims (7)

1. The crop breeding monitoring method based on image processing is characterized by comprising the following steps of: the method comprises the following steps:
step S1: collecting data;
step S2: edge detection processing;
step S3: calculating the number of rows of seedlings in the breeding area;
step S4: calculating the number of columns of seedlings in the breeding area;
step S5: calculating the seedling density in the breeding area;
step S6: and calculating the height of the seedling in the breeding area.
2. The image processing-based crop breeding monitoring method according to claim 1, wherein: in step S3, calculating the number of rows of seedlings in the breeding area specifically includes the following steps:
step S31: acquiring coordinates of a left locus and a right locus, scanning the middle-most row and the left row of seedling plants of the middle-most row of a breeding area to obtain center coordinates of two rows of seedling plants, taking the center coordinates of the two rows of seedling plants as the coordinates of the left locus, scanning the middle-most row and the right row of seedling plants of the middle-most row of the breeding area to obtain center coordinates of the two rows of seedling plants, and taking the center coordinates of the two rows of seedling plants as the coordinates of the right locus;
step S32: the step of obtaining coordinates of the left intercept point and the right intercept point includes:
step S321: acquiring a rotation image of a breeding area image;
step S322: taking the center point of the seedling at the center position of the breeding area as a center seedling point, and taking the center seedling point as a starting point, scanning the rotating image leftwards until a black pixel point is scanned;
step S323: taking the previous point of the scanned black pixel point as the coordinate of the left intercept point;
step S324: taking the central seedling point as a starting point, scanning the rotating image rightwards until a black pixel point is scanned;
step S325: taking the previous point of the scanned black pixel point as the coordinate of the right intercept point;
step S33: calculating the number of rows of seedlings in the breeding area, wherein the three rows of seedlings between the left locus and the right locus are set in advance as the main three rows, and the step of calculating the number of rows of seedlings in the breeding area comprises the following steps:
step S331: carrying out inverse rotation transformation on the breeding area image, projecting the breeding area image subjected to the inverse rotation transformation, and obtaining coordinates of projection points of a left locus and coordinates of projection points of a right locus;
step S332: calculating the projection height of the left locus and the projection height of the right locus according to the formulas given as images QLYQS_1 and QLYQS_2 in the original, wherein a1 is the distance between the left intercept point and the projection point of the left locus, b1 is the distance between the projection point of the left locus and the projection point of the right locus, and c1 is the distance between the projection point of the right locus and the right intercept point;
step S333: calculating the distance between the left locus and the left intercept point, the distance between the left locus and the right locus, and the distance between the right locus and the right intercept point according to the formulas given as images QLYQS_5, QLYQS_6 and QLYQS_7 in the original;
step S334: calculating the central angles corresponding to the three distances obtained in step S333 according to the formulas given as images QLYQS_15, QLYQS_17 and QLYQS_19 in the original, wherein r1 is the distance between the left intercept point and the center seedling point;
step S335: calculating the proportional relation among the arc between the left locus and the left intercept point, the arc between the left locus and the right locus, and the arc between the right locus and the right intercept point (image QLYQS_26 in the original);
step S336: calculating A1, the number of seedling rows contained in the arc denoted by image QLYQS_30 in the original, according to the formula given as image QLYQS_31;
step S337: calculating C1, the number of seedling rows contained in the arc denoted by image QLYQS_33 in the original, according to the formula given as image QLYQS_34;
step S338: calculating P, the number of rows of seedlings in the breeding area, according to the formula given as image QLYQS_36 in the original.
3. The image processing-based crop breeding monitoring method according to claim 1, wherein:
in step S1, the data acquisition is acquisition of crop breeding images;
in step S2, a convolution kernel (given as image QLYQS_37 in the original) is set in advance for the edge detection process, which specifically comprises the following steps:
step S21: traversing the convolution kernel over an original image to obtain a gray matrix after traversal, wherein the original image is the gray matrix of a crop breeding image, and step S21 comprises steps S211, S212 and S213;
in step S211, the original image is differentiated in the horizontal direction according to the formula given as image QLYQS_38 in the original, where I is the gray matrix of the original image;
in step S212, the original image is differentiated in the vertical direction according to the formula given as image QLYQS_40 in the original;
in step S213, the gray matrix after traversal is calculated from the horizontal and vertical differentiation results according to the formula given as image QLYQS_42 in the original, where F is the gray matrix obtained after traversal;
step S22: padding the traversed gray matrix with a one-pixel border of zeros, and traversing the convolution kernel over the padded gray matrix once more to obtain the breeding area image.
4. The image processing-based crop breeding monitoring method according to claim 1, wherein: in step S4, the method specifically includes the following steps:
step S41: acquiring the coordinates of the upper locus and the lower locus: scanning the middle-most row of seedling plants in the breeding area together with the row above it to obtain the center coordinates of the two rows, and taking those center coordinates as the coordinates of the upper locus; scanning the middle-most row together with the row below it to obtain the center coordinates of the two rows, and taking those center coordinates as the coordinates of the lower locus;
step S42: the step of acquiring coordinates of the upper intercept point and the lower intercept point includes:
step S421: acquiring a rotation image of a breeding area image;
step S422: taking the central point of the seedling at the most central position of the breeding area as a central seedling point, and taking the central seedling point as a starting point, scanning the rotating image upwards until a black pixel point is scanned;
step S423: taking the previous point of the scanned black pixel point as the coordinate of the upper intercept point;
step S424: taking the central seedling point as a starting point, scanning the rotating image downwards until a black pixel point is scanned;
step S425: taking the previous point of the scanned black pixel point as the coordinate of the lower intercept point;
step S43: calculating the number of seedling columns in the breeding area, with the three seedling columns between the upper locus and the lower locus preset as the main three columns, wherein the step of calculating the number of seedling columns in the breeding area comprises the following steps:
step S431: carrying out inverse rotation transformation on the breeding area image, and projecting the breeding area image subjected to inverse rotation transformation to obtain coordinates of projection points of an upper locus and coordinates of projection points of a lower locus;
step S432: calculating the projection height of the upper locus and the projection height of the lower locus (the two calculation formulas are given as figures QLYQS_43 and QLYQS_44 in the original publication), wherein a2 is the distance between the upper intercept point and the projection point of the upper locus, b2 is the distance between the projection point of the upper locus and the projection point of the lower locus, c2 is the distance between the projection point of the lower locus and the lower intercept point, and the quantities denoted by figures QLYQS_45 and QLYQS_46 are the projection height of the upper locus and the projection height of the lower locus, respectively;
step S433: calculating the distance between the upper locus and the upper intercept point, the distance between the upper locus and the lower locus, and the distance between the lower locus and the lower intercept point (the three calculation formulas are given as figures QLYQS_47, QLYQS_48 and QLYQS_49 in the original publication), wherein the quantities denoted by figures QLYQS_50, QLYQS_51 and QLYQS_52 are, respectively, the distance between the upper locus and the upper intercept point, the distance between the upper locus and the lower locus, and the distance between the lower locus and the lower intercept point;
step S434: calculating the central angles corresponding to the three distances obtained in step S433 (the three calculation formulas are given as figures QLYQS_53 through QLYQS_61 in the original publication), wherein the quantities denoted by figures QLYQS_62, QLYQS_64 and QLYQS_66 are the central angles corresponding to the distances denoted by figures QLYQS_63, QLYQS_65 and QLYQS_67, respectively, and r2 is the distance between the upper intercept point and the center seedling point;
step S435: calculating the proportional relation among the arc between the upper locus and the upper intercept point, the arc between the upper locus and the lower locus, and the arc between the lower locus and the lower intercept point (the proportional relation is given as figure QLYQS_68 in the original publication), wherein the quantities denoted by figures QLYQS_69, QLYQS_70 and QLYQS_71 are those three arcs, respectively;
step S436: calculating the number of seedling columns contained in the arc denoted by figure QLYQS_72 (the calculation formula is given as figure QLYQS_73 in the original publication), wherein A2 is the number of seedling columns contained in that arc;
step S437: calculating the number of seedling columns contained in the arc denoted by figure QLYQS_75 (the calculation formula is given as figure QLYQS_76 in the original publication), wherein C2 is the number of seedling columns contained in that arc;
step S438: calculating the number of seedling columns in the breeding area (the calculation formula is given as figure QLYQS_78 in the original publication), wherein Q is the number of seedling columns in the breeding area.
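Steps S421 through S425 amount to a straight-line scan from the center seedling point: move upwards, then downwards, through the rotated image until a black pixel is met, and keep the point just before it. The following minimal Python sketch assumes the rotated image is a binary numpy array indexed as (row, column) with non-zero foreground; the function name and the coordinate convention are illustrative assumptions.

import numpy as np

def find_intercepts(rotated: np.ndarray, center: tuple[int, int]):
    row, col = center                                    # S422: start at the center seedling point
    up = row
    while up - 1 >= 0 and rotated[up - 1, col] != 0:     # scan upwards until a black pixel
        up -= 1
    upper_intercept = (up, col)                          # S423: point just before the black pixel
    down = row
    while down + 1 < rotated.shape[0] and rotated[down + 1, col] != 0:  # S424: scan downwards
        down += 1
    lower_intercept = (down, col)                        # S425: point just before the black pixel
    return upper_intercept, lower_intercept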
5. The image processing-based crop breeding monitoring method according to claim 1, wherein: in step S5, a breeding area is preset, and a calculation formula of the seedling density in the breeding area is:
(the calculation formula is given as figure QLYQS_79 in the original publication)
wherein Q is the number of seedling plant columns in the breeding area, ρ is the seedling plant density in the breeding area, and s is the breeding area.
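The step S5 formula is published only as a figure image; the following Python sketch assumes the natural reading that the density is the total seedling count, taken as the line count P from claim 2 multiplied by the column count Q from claim 4, divided by the preset breeding area s. The product form is an assumption, not the patent's stated formula.

def seedling_density(p_rows: int, q_cols: int, area_s: float) -> float:
    # Assumed: density = (lines x columns) / breeding area.
    return (p_rows * q_cols) / area_s

For example, with P = 12, Q = 9 and s = 25 square metres, the sketch returns 4.32 seedlings per square metre.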
6. An image processing-based crop breeding monitoring system for implementing the image processing-based crop breeding monitoring method according to any one of claims 1 to 5, characterized in that: the device comprises a data acquisition module, an edge detection processing module, a seedling line number acquisition module in a breeding area, a seedling line column number acquisition module in the breeding area, a seedling density acquisition module in the breeding area and a seedling height acquisition module in the breeding area.
7. The image processing-based crop breeding monitoring system of claim 6, wherein: the data acquisition module acquires crop breeding images in the breeding process and sends the crop breeding images to the edge detection processing module; the edge detection processing module receives the crop breeding images sent by the data acquisition module, performs edge detection processing on the crop breeding images by using an edge detection algorithm, and sends the obtained breeding area image to the breeding area seedling density acquisition module, which outputs the obtained seedling density in the breeding area.
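The module wiring in claims 6 and 7 can be read as a simple pipeline: image acquisition, edge detection, line and column counting, then density (and height) output. The following Python sketch is an illustrative composition only; the class and method names are invented here, and the exact hand-offs in the published claim 7 text are partly garbled.

class CropBreedingMonitor:
    # Illustrative wiring of the claim 6 modules; names are not from the patent.
    def __init__(self, acquisition, edge_detection, row_counter, column_counter,
                 density_module, height_module):
        self.acquisition = acquisition        # data acquisition module
        self.edge_detection = edge_detection  # edge detection processing module
        self.row_counter = row_counter        # seedling line number acquisition module
        self.column_counter = column_counter  # seedling column number acquisition module
        self.density_module = density_module  # seedling density acquisition module
        self.height_module = height_module    # seedling height acquisition module

    def monitor(self, area_s: float) -> dict:
        image = self.acquisition.capture()            # crop breeding image
        region = self.edge_detection.process(image)   # breeding area image
        p = self.row_counter.count(region)
        q = self.column_counter.count(region)
        return {
            "rows": p,
            "columns": q,
            "density": self.density_module.compute(p, q, area_s),
            "height": self.height_module.measure(region),
        }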
CN202310531713.XA 2023-05-12 2023-05-12 Crop breeding monitoring method and system based on image processing Pending CN116246225A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310531713.XA CN116246225A (en) 2023-05-12 2023-05-12 Crop breeding monitoring method and system based on image processing

Publications (1)

Publication Number Publication Date
CN116246225A true CN116246225A (en) 2023-06-09

Family

ID=86633547

Country Status (1)

Country Link
CN (1) CN116246225A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103190224A (en) * 2013-03-26 2013-07-10 中国农业大学 Computer vision technique-based corn ear species test method, system and device
CN103413172A (en) * 2013-08-22 2013-11-27 北京农业信息技术研究中心 Method and device for measuring number of seedlings in shortage in corn seedling stage
CN106909881A (en) * 2017-01-16 2017-06-30 中国农业大学 The method and system of corn breeding base ridge number are extracted based on unmanned aerial vehicle remote sensing images
US20210056685A1 (en) * 2017-12-11 2021-02-25 Jiangsu University Method and device for monitoring comprehensive growth of potted lettuce
CN109886094A (en) * 2019-01-08 2019-06-14 中国农业大学 A kind of crop growth of cereal crop seedlings seedling gesture capturing analysis method and device
CN109978904A (en) * 2019-03-19 2019-07-05 南开大学 Emergent aquatic plant growth information extracting method based on image technique
CN111833384A (en) * 2020-05-29 2020-10-27 武汉卓目科技有限公司 Method and device for quickly registering visible light and infrared images
WO2022016563A1 (en) * 2020-07-23 2022-01-27 南京科沃信息技术有限公司 Ground monitoring system for plant-protection unmanned aerial vehicle, and monitoring method for same
CN112614147A (en) * 2020-12-24 2021-04-06 中国农业科学院作物科学研究所 Method and system for estimating plant density of crop at seedling stage based on RGB image
CN114022771A (en) * 2021-11-15 2022-02-08 安徽农业大学 Corn seedling stage field distribution information statistical method based on deep learning
CN115619286A (en) * 2022-11-11 2023-01-17 中国农业科学院农业资源与农业区划研究所 Method and system for evaluating sample plot quality of breeding field plot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
翟志强; 朱忠祥; 李臻; 杜岳峰; 毛恩荣: "Virtual test method for crop row identification algorithms" (作物行识别算法的虚拟试验方法), Transactions of the Chinese Society for Agricultural Machinery (农业机械学报), no. 1 *

Similar Documents

Publication Publication Date Title
CN109115776B (en) Color and depth information-based plug seedling growth nondestructive monitoring method and device
US20170223947A1 (en) Apparatus and methods for in-field data collection and sampling
CN115131346B (en) Fermentation tank processing procedure detection method and system based on artificial intelligence
CN115272187A (en) Vehicle-mounted dynamic field frame-to-frame relevance based field rice and wheat lodging global evaluation method
CN112304902A (en) Real-time monitoring method and device for crop phenology
CN116246225A (en) Crop breeding monitoring method and system based on image processing
CN114119437B (en) GMS-based image stitching method for improving distortion of moving object
CN117496359B (en) Plant planting layout monitoring method and system based on three-dimensional point cloud
CN106530342B (en) Full-view image generation method is measured using what laser point cloud was aided in
CN103413172A (en) Method and device for measuring number of seedlings in shortage in corn seedling stage
CN113807128B (en) Seedling shortage marking method and device, computer equipment and storage medium
CN111896045B (en) Greenhouse crop three-dimensional directional sensing and fine-grained automatic acquisition device and method
CN113470007A (en) Data analysis method, system and storage medium based on plant growth state
CN117115769A (en) Plant detection and positioning method based on semantic segmentation network
AU2014267257B2 (en) Device and method for the parameterisation of a plant
CN111932551B (en) Missing transplanting rate detection method of rice transplanter
CN114581450A (en) Point cloud image conversion-based corn plant height and stem thickness measuring method and device
CN111886982A (en) Real-time detection system and detection method for dry land planting operation quality
Bates et al. Automating measurements of canopy and fruit to map crop load in commercial vineyards
CN110866972A (en) In-situ observation device for sugarcane root system configuration and analysis method thereof
CN113344968A (en) Orchard fruit identification and yield statistical system and method
CN117333400B (en) Root box cultivated crop root system image broken root restoration and phenotype extraction method
CN113807129A (en) Crop area identification method and device, computer equipment and storage medium
CN214206629U (en) Automatic screening and optimizing device for optical imaging assisted edible mushroom breeding
CN105574853A (en) Method and system for calculating number of wheat grains based on image identification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20230609