CN107220647B - Crop center point positioning method and system under blade crossing condition


Info

Publication number: CN107220647B
Authority: CN (China)
Prior art keywords: image, target crop, gray level image, crop
Legal status: Active (granted)
Application number: CN201710415434.1A
Other languages: Chinese (zh)
Other versions: CN107220647A
Inventors: 张漫, 仇瑞承, 李世超, 李民赞, 刘刚, 孙红, 李寒
Current Assignee: China Agricultural University
Original Assignee: China Agricultural University
Application filed by: China Agricultural University
Priority / filing date: 2017-06-05 (priority to CN201710415434.1A)
Publication of CN107220647A: 2017-09-29
Application granted and published as CN107220647B

Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06V: Image or video recognition or understanding
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/20: Image preprocessing
    • G06V 10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V 10/28: Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G06V 10/30: Noise filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a crop center point positioning method and system for the case in which crop leaves cross. The method comprises the following steps: acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop; and obtaining the coordinates of the center point of the target crop based on the gray level image and the binary image of the target crop. The scheme of the invention effectively eliminates the interference of crossed crop leaves and improves both the accuracy and the speed of crop center point positioning.

Description

Crop center point positioning method and system under blade crossing condition
Technical Field
The invention relates to the technical field of automatic identification and positioning of crops, in particular to a method and a system for positioning a crop center point under a blade crossing condition.
Background
Machine-vision-based automatic crop identification and positioning can quickly acquire the distribution of crops, provide a reference for mechanized field management, reduce the labor intensity of workers, and improve working efficiency and accuracy. Automatic identification and localization of crops is typically achieved by acquiring two-dimensional color images or three-dimensional position information of the crop. Two-dimensional color images are obtained with a color camera; they contain the color information of the crop, are flexible to use and have a low cost. Three-dimensional position data must be obtained with a depth camera, a stereo camera or a laser sensor; such sensors can generate point cloud data of the crop, but the cost is high and the amount of computation is large. Crop identification and localization based on color images has therefore been a focus of research.
Color-image-based crop identification and positioning acquires image information of the crop with a color camera, analyzes and processes the image, extracts the crop and locates its center point. Generally, the green information of the crop is used to segment the crop from the complex background, the contour and the connected region of the crop are extracted, and the centroid of the connected region is then calculated to obtain the crop center point. However, when this method is applied to crops with irregular shapes, the located center point is prone to deviation.
In the prior art, for an extracted target crop, the crop row to be detected is determined from a histogram of the pixel rows, a histogram of the pixel columns is then computed within that crop row, and the center point of the crop is determined. When leaves cross, the crop image is usually skeletonized: in the skeleton, end points appear at the tips of the leaves, while cross points occur where leaves intersect and at the position of the crop center point; the end points and cross points of the skeleton are searched, and the cross points are classified and logically judged to locate the crop center point.
Disclosure of Invention
The present invention overcomes or at least partially solves the above problems by providing a method and system for locating a crop center point under a blade crossing condition.
According to one aspect of the invention, a crop center point positioning method under a blade crossing condition is provided, which comprises the following steps:
step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop;
and 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
Further, the step 1 further comprises:
s11, obtaining a binary image of the original gray level image of the target crop based on the original gray level image of the target crop;
s12, performing morphological denoising processing on the binary image of the original gray level image of the target crop to determine an image interest area of the target crop;
s13, acquiring a binary image of the target crop image interest area based on the target crop image interest area.
Further, the step 2 further comprises:
s21, fusing the original gray level image of the target crop with the binary image of the image interest area of the target crop to obtain a gray level image of the image interest area of the target crop;
s22, acquiring the minimum value points of the gray level image of the image interest area of the target crop, and retaining the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value;
S23, obtaining the coordinates of the center point of the target crop by using a watershed algorithm, based on the gray level image of the image interest area of the target crop and the minimum value points obtained in S22.
Further, before the step 1, the method further comprises:
graying the initial image of the target crop, wherein the graying process comprises the following steps,
Igray(i,j)=G(i,j)*1.262-R(i,j)*0.884-B(i,j)*0.311,
where i and j are the row and column coordinates of a pixel; G(i, j), R(i, j) and B(i, j) are the values of the G, R and B color components of the pixel at position (i, j) of the image, respectively; and Igray(i, j) is the gray value of the pixel at position (i, j) of the converted image.
Further, the S12 further includes:
and obtaining a conversion threshold value of a binary image of the original gray level image of the target crop based on the original gray level image of the target crop by using a maximum inter-class variance method.
Further, the S12 further includes:
removing the interference of weed noise in the binary image of the original gray level image of the target crop by using morphological opening operation;
horizontally projecting the pixel values of the binary image of the denoised original gray level image of the target crop to obtain a projection curve taking the pixel row coordinate as the abscissa; dividing the projection curve into two parts by taking the middle pixel row as the boundary, searching the pixel row coordinate corresponding to the minimum value position of each curve, and taking the area between the two pixel rows as the image interest area of the target crop.
Further, the S22 further includes:
calculating the minimum value points and the maximum value points over the eight-neighborhood of each pixel in the gray level image of the image interest area of the target crop, calculating the average value of the minimum value points and the average value of the maximum value points respectively, taking the difference between the two average values as the threshold, and retaining the minimum value points whose difference from the surrounding pixel points is larger than the threshold to obtain the final local minima.
Further, the S23 further includes:
using the minimum value points obtained in S22 to mark the foreground of the gray level image of the image interest area of the target crop, and taking the marked gray level image as the input image of a watershed algorithm to obtain the central area of the target crop;
based on the central area of the target crop, summing the x coordinates and the y coordinates of the pixel points in the central area respectively and counting the number of the pixel points in the central area; dividing the x coordinate sum and the y coordinate sum by the number of pixel points gives the final coordinates of the crop center point.
According to another aspect of the present invention, there is provided a crop center point locating system under a blade crossing condition, comprising:
the acquisition module is used for acquiring a gray level image and a binary image of the target crop based on the original gray level image of the target crop;
and the positioning module is used for obtaining the center point coordinates of the target crops based on the gray level images and the binary images of the target crops.
According to yet another aspect of the invention, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any of the above.
The invention provides a crop center point positioning method and system under the condition of blade intersection, and the scheme provided by the invention has the following beneficial effects: the interference of crossed leaves of crops can be effectively eliminated, and the precision and the speed of positioning the crop center point are improved.
Drawings
FIG. 1 is a schematic overall flow chart of a crop center point positioning method under a blade crossing condition according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart illustrating a method for locating a center point of a crop under a blade crossing condition according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an original gray-scale image of a target crop according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an original binary image of the target crop in a crop center point positioning method under a leaf crossing condition according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a horizontal projection of a binary image of an original gray-scale image of the target crop in a crop center point positioning method under a blade crossing condition according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the image interest area of the target corn image in a corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an original grayscale image of the target corn in a corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a binary image of the target corn in a corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a minimum region of an original grayscale image of the target corn in a method for locating a center point of corn under a leaf crossing condition according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a central region of a crop obtained by a watershed algorithm segmentation process in a method for locating a center point of a corn under a leaf crossing condition according to an embodiment of the present invention;
FIG. 11 is a schematic diagram illustrating coordinates of a center point of a corn obtained by segmentation processing using a watershed algorithm in a method for locating a center point of a corn under a leaf crossing condition according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of the overall framework of a corn center point positioning system under a leaf crossing condition according to an embodiment of the present invention;
FIG. 13 is a schematic structural framework diagram of an apparatus for a method of corn center point location under leaf crossing conditions in accordance with an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the present invention is provided in connection with the accompanying drawings and examples. The following examples are intended to illustrate the invention but are not intended to limit the scope of the invention.
Referring to fig. 1, an overall flow chart of a crop center point positioning method under a blade crossing condition is shown in an embodiment of the present invention. In general, the method comprises the following steps: step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop; and 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition, said step 1 further comprises: s11, obtaining a binary image of the original gray level image of the target crop based on the original gray level image of the target crop; s12, performing morphological denoising processing on the binary image of the original gray level image of the target crop to determine an image interest area of the target crop; s13, acquiring a binary image of the target crop image interest area based on the target crop image interest area.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition, said step 2 further comprises:
s21, fusing the original gray level image of the target crop with the binary image of the image interest area of the target crop to obtain a gray level image of the image interest area of the target crop;
s22, acquiring the minimum value points of the gray level image of the image interest area of the target crop, and retaining the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value;
S23, obtaining the coordinates of the center point of the target crop by using a watershed algorithm, based on the gray level image of the image interest area of the target crop and the minimum value points obtained in S22.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition further includes, before step 1:
graying the initial image of the target crop, wherein the graying process comprises the following steps,
Igray(i,j)=G(i,j)*1.262-R(i,j)*0.884-B(i,j)*0.311,
where i and j are the row and column coordinates of a pixel; G(i, j), R(i, j) and B(i, j) are the values of the G, R and B color components of the pixel at position (i, j) of the image, respectively; and Igray(i, j) is the gray value of the pixel at position (i, j) of the converted image.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition, the S12 further includes:
removing the interference of weed noise in the binary image of the original gray level image of the target crop by using morphological opening operation;
horizontally projecting the pixel values of the binary image of the denoised original gray level image of the target crop to obtain a projection curve taking the pixel row coordinate as the abscissa; dividing the projection curve into two parts by taking the middle pixel row as the boundary, searching the pixel row coordinate corresponding to the minimum value position of each curve, and taking the area between the two pixel rows as the image interest area of the target crop.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition, the S22 further includes:
calculating the minimum value points and the maximum value points over the eight-neighborhood of each pixel in the gray level image of the image interest area of the target crop, calculating the average value of the minimum value points and the average value of the maximum value points respectively, taking the difference between the two average values as the threshold, and retaining the minimum value points whose difference from the surrounding pixel points is larger than the threshold to obtain the final local minima.
In another embodiment of the present invention, a method for locating a center point of a crop under a blade crossing condition, the S23 further includes:
using the minimum value points obtained in S22 to mark the foreground of the gray level image of the image interest area of the target crop, and taking the marked gray level image as the input image of a watershed algorithm to obtain the central area of the target crop;
based on the central area of the target crop, summing the x coordinates and the y coordinates of the pixel points in the central area respectively and counting the number of the pixel points in the central area; dividing the x coordinate sum and the y coordinate sum by the number of pixel points gives the final coordinates of the crop center point.
Referring to fig. 2, a schematic flow chart of a crop center point positioning method under a blade crossing condition according to an embodiment of the present invention is shown. The specific embodiment provides a crop center point positioning method under the condition of blade intersection. The method fully considers the influence of factors such as weeds, blade intersection and the like on the positioning of the crop center point, and improves the speed and the accuracy of the positioning of the crop center point. The method specifically comprises the following steps.
The height and angle of the camera are adjusted so that the camera shoots the crop row vertically from above, the crop row in the collected image is approximately parallel to the horizontal direction, and the crop to be detected is located in the middle of the image; the crop row in which the crop to be detected is located is called the specific crop row.
The image is converted according to the following method:
Igray(i,j)=G(i,j)*1.262-R(i,j)*0.884-B(i,j)*0.311,
where i and j are the row and column coordinates of a pixel; G(i, j), R(i, j) and B(i, j) are the values of the G, R and B color components of the pixel at position (i, j) of the image, respectively; and Igray(i, j) is the gray value of the pixel at position (i, j) of the converted image.
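As an illustrative sketch only (not part of the patent text), the conversion above can be written with NumPy; the R, G, B channel order of the `rgb` array and the clipping of the result to the 8-bit range are assumptions:

```python
import numpy as np

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Gray transform Igray(i,j) = 1.262*G(i,j) - 0.884*R(i,j) - 0.311*B(i,j)."""
    r = rgb[..., 0].astype(np.float64)  # assumes R, G, B channel order
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    gray = 1.262 * g - 0.884 * r - 0.311 * b
    return np.clip(gray, 0, 255).astype(np.uint8)  # clip to 8-bit range (assumption)
```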
In the binary image, white pixel points (gray value 1) are crop and black pixel points (gray value 0) are background. A morphological opening with a square structuring element is then performed on the binary image to remove the white noise regions of tiny area.
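A minimal sketch of this binarization and denoising step, assuming OpenCV is available; the function name is illustrative and the 4-pixel square structuring element follows the corn embodiment described later:

```python
import cv2
import numpy as np

def binarize_and_denoise(gray: np.ndarray, kernel_size: int = 4) -> np.ndarray:
    """Otsu (maximum inter-class variance) threshold, then a morphological opening
    with a square structuring element to remove tiny white noise regions."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    return (opened > 0).astype(np.uint8)  # 1 = crop (white), 0 = background (black)
```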
The interest region is determined from the denoised binary image as follows: horizontally project the pixel gray values of the binary image to obtain a projection curve with the pixel row coordinate as the abscissa; divide the projection curve into two parts at the middle pixel row Midrow; each part has a minimum value, denoted min1 and min2, and each minimum may correspond to one or more pixel row coordinates; record the pixel row coordinates closest to the middle pixel row Midrow as Rowmin1 and Rowmin2; the area between Rowmin1 and Rowmin2 is the interest region.
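A sketch of this interest-region selection, assuming `binary` is a 0/1 array with rows indexed from the top (the function and variable names are illustrative):

```python
import numpy as np

def find_roi_rows(binary: np.ndarray):
    """Horizontal projection; split at the middle pixel row (Midrow) and, in each
    half, take the projection minimum whose row coordinate is closest to Midrow."""
    proj = binary.sum(axis=1)       # projection value for every pixel row
    mid = binary.shape[0] // 2      # Midrow
    top_rows = np.where(proj[:mid] == proj[:mid].min())[0]
    row_min1 = int(top_rows[np.argmin(np.abs(top_rows - mid))])
    bot_rows = np.where(proj[mid:] == proj[mid:].min())[0] + mid
    row_min2 = int(bot_rows[np.argmin(np.abs(bot_rows - mid))])
    return row_min1, row_min2       # the interest region lies between these two rows
```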
The area of each connected region within the interest region of the binary image is calculated, and the region with the largest area is retained to obtain the binary image containing the crop to be detected.
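A sketch of keeping only the largest connected region, again assuming OpenCV:

```python
import cv2
import numpy as np

def keep_largest_region(roi_binary: np.ndarray) -> np.ndarray:
    """Retain the connected region with the largest area (the crop to be detected)."""
    num, labels, stats, _ = cv2.connectedComponentsWithStats(
        (roi_binary > 0).astype(np.uint8), connectivity=8)
    if num <= 1:                    # only background present
        return np.zeros_like(roi_binary, dtype=np.uint8)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # skip background label 0
    return (labels == largest).astype(np.uint8)
```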
The binary image containing the crop to be detected is fused with the gray image to extract a new gray image Inew: for each white pixel with coordinates (i, j) in the binary image containing the crop to be detected, the gray value of Inew is the corresponding gray value Igray(i, j) of the gray image, and at the black pixel points the gray value is set to 0.
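The fusion is a simple masking operation; a sketch (the names are illustrative):

```python
import numpy as np

def fuse(gray: np.ndarray, crop_mask: np.ndarray) -> np.ndarray:
    """Inew keeps Igray(i, j) wherever the binary image is white and is 0 elsewhere."""
    return np.where(crop_mask > 0, gray, 0).astype(gray.dtype)
```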
The central area of the crop consists of new leaves; in the gray level image Inew its gray values are higher than those of the surrounding mature leaves, and the difference between the pixel gray values is large, so the central area of the crop can be determined by detecting gray level minima over the eight-neighborhood of the pixels in the gray level image. However, a large number of local minimum points are usually present in the gray level image, which easily causes false detection. An extended minimum operation is therefore applied to the image with the following formula, suppressing the minimum points whose gray value difference from the adjacent pixels is less than a threshold h, to obtain the local minima:
BW1=EM(Inew,h),
where BW1 denotes the image obtained by the extended minimum operation, which marks the minima of the image; EM denotes the extended minimum operation; and h denotes the fall threshold, obtained as the difference between the average of the maximum points and the average of the minimum points of the image.
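A sketch of this step with scikit-image; `local_minima`, `local_maxima` and `h_minima` are used here as rough equivalents of the operations named above, and restricting the statistics and the result to the non-zero crop area is an assumption not stated in the patent:

```python
import numpy as np
from skimage.morphology import local_minima, local_maxima, h_minima

def extended_minima(inew: np.ndarray) -> np.ndarray:
    """Fall threshold h = mean(gray at local maxima) - mean(gray at local minima),
    then an extended-minima transform keeps only minima deeper than h (BW1)."""
    img = inew.astype(np.float64)
    crop = inew > 0                                    # ignore the zeroed background (assumption)
    minima = local_minima(img, connectivity=2) & crop  # 8-neighborhood extrema inside the crop
    maxima = local_maxima(img, connectivity=2) & crop
    h = img[maxima].mean() - img[minima].mean()        # e.g. 150.47 - 147.09 = 3.38 in the corn example
    bw1 = h_minima(img, h).astype(bool)
    return bw1 & crop                                  # marker image BW1, restricted to the crop
```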
Performing morphology forced minimum operation on the image by applying the following formula to mark image minimum values and eliminate all other minimum values outside the designated area:
BW2=Imposemin(Inew,BW1),
where BW2 denotes the marked gray level image and Imposemin denotes the morphology forced minimum operation.
The marked image BW2 is used as an input image, and the image is segmented by a watershed algorithm to obtain the central area of the crop.
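scikit-image has no direct counterpart of the Imposemin operation; a marker-controlled watershed, in which the retained minima serve as foreground markers and the zeroed background as a second marker, is a common stand-in and is sketched below as an approximation rather than the patent's exact procedure:

```python
import numpy as np
from skimage.segmentation import watershed

def centre_region(inew: np.ndarray, bw1: np.ndarray) -> np.ndarray:
    """Approximate the impose-minima + watershed step with a marker-controlled watershed."""
    markers = np.zeros(inew.shape, dtype=np.int32)
    markers[inew == 0] = 1          # background marker: pixels outside the crop
    markers[bw1] = 2                # foreground markers: the retained minima (BW1)
    labels = watershed(inew.astype(np.float64), markers)
    return (labels == 2).astype(np.uint8)  # the basin grown from the minima, taken as the central area
```

How closely this reproduces the central area of fig. 10 depends on the gray level topology of Inew, so it should be treated as a starting point only.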
The centroid coordinates of the crop central area are calculated as follows to locate the crop center point. First, the obtained crop central area is extracted to give a binary image of the central area, in which white pixel points (gray value 1) belong to the central area and black pixel points (gray value 0) are the background. The white pixel points of the binary image of the central area are scanned and counted to obtain the number N of pixel points in the central area; the pixel row coordinates x of all white pixel points are summed to obtain X, and the pixel column coordinates y are summed to obtain Y; the centroid coordinates (Xcenter, Ycenter) of the crop central area are then calculated by the following formula,
Xcenter = ceil(X / N), Ycenter = ceil(Y / N),
where ceil denotes rounding up to the nearest integer. The coordinates of the crop center point in the original image are then (Xcenter + Rowmin1, Ycenter).
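A sketch of the centroid computation and the mapping back to original-image coordinates; as in the description above, x is taken as the row coordinate and y as the column coordinate, and the names are illustrative:

```python
import math
import numpy as np

def centre_point(centre_mask: np.ndarray, row_min1: int):
    """Xcenter = ceil(X / N), Ycenter = ceil(Y / N); add Rowmin1 to return to the original image."""
    xs, ys = np.nonzero(centre_mask)  # row (x) and column (y) coordinates of the white pixels
    n = xs.size                       # N: number of pixel points in the central area
    x_center = math.ceil(xs.sum() / n)
    y_center = math.ceil(ys.sum() / n)
    return x_center + row_min1, y_center
```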
In another embodiment of the present invention, a method for positioning a center point of a crop under a blade crossing condition is provided. Taking a corn image as an example, under a natural condition, the quick positioning of the center point of the corn is realized.
In the embodiment, a middle row of crops in the originally acquired image is approximately in the horizontal direction, the row of crops is a specific crop row, and the corn to be detected is located in the middle of the specific crop row.
FIG. 3 is a schematic diagram of an original gray scale image of a target corn according to an embodiment of the present invention. The collected image is converted to gray level, the maximum inter-class variance method gives a gray level threshold of 138, and the gray level image is converted into a binary image, as shown in fig. 4, in which white pixel points (gray value 1) are plants (corn or weeds) and black pixel points (gray value 0) are the background.
Fig. 5 is a schematic diagram of the horizontal projection of the binary image of the original gray-scale image of the target corn in the corn center point positioning method under a leaf crossing condition according to the embodiment of the present invention. Fig. 6 is a schematic diagram of the image interest area of the target corn image in the corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention. First, a square structuring element of 4 pixels is selected to perform an opening operation on the binary image, eliminating the white areas of tiny area; the denoised binary image is then horizontally projected to obtain a projection curve; the projection curve is divided into two parts at the middle pixel row Midrow, where Midrow is 360 in this embodiment; the minimum values min1 and min2 of each curve and the corresponding pixel row coordinates are determined, and the pixel row coordinates closest to the middle pixel row Midrow are selected as Rowmin1 and Rowmin2; in this embodiment min1 = 0 and min2 = 0, with corresponding pixel row coordinates Rowmin1 = 299 and Rowmin2 = 631; the area between Rowmin1 and Rowmin2 is then determined as the interest area of the image.
Fig. 7 is a schematic diagram of an original grayscale image of the target corn in the corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention. FIG. 8 is a schematic diagram of a binary image of the target corn in the corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention. The area of each region within the interest region is counted; the corn to be detected lies in the interest region and, because it crosses other leaves, its region has the largest area. The region with the largest area is retained and the interference of scattered leaves is removed; in this embodiment the area of the region where the corn to be detected is located is 23048. The white pixel points of the binary image containing the corn to be detected are then detected; for a white pixel point with coordinates (i, j), its gray value in the new gray image Inew is the value of the gray image at (i, j) in fig. 3, and the gray value at the black pixel points is 0 in the new gray image Inew.
FIG. 9 is a schematic diagram of the minimum regions of the original grayscale image of the target corn in the corn center point positioning method under a leaf crossing condition according to an embodiment of the present invention. The extended minimum operation is performed on the gray image Inew of this embodiment to obtain BW1; near the corn center point the gray values are higher than those of the surrounding pixel points. In this embodiment, the average value of the minimum value points is 147.09 and the average value of the maximum value points is 150.47, so the fall threshold h is set to 3.38 and the minima of tiny areas are eliminated.
Fig. 10 is a schematic diagram of a crop center region obtained by using watershed algorithm segmentation processing in a method for locating a corn center point under a leaf crossing condition according to an embodiment of the present invention. The gray level image Inew of the embodiment is marked by applying morphological forced minimum operation, the marked image is taken as an input image, and the image is segmented by adopting a watershed algorithm to obtain a central region of the embodiment, as shown in fig. 10.
Fig. 11 is a schematic diagram of coordinates of a center point of corn obtained by segmentation processing using a watershed algorithm in a method for locating a center point of corn under a leaf crossing condition according to an embodiment of the present invention.
The central region of corn obtained in fig. 10 is irregular in shape, and the centroid of the region is calculated as the corn center point. The pixel points in the central area are scanned and counted, and the centroid coordinates are calculated with the formula given above. In this embodiment, the number of pixel points N in the center region is 557, the sum X of the pixel x coordinates is 89017 and the sum Y of the y coordinates is 152204, giving the centroid coordinates (160, 270); the coordinates of the corn center point are therefore (459, 270), shown as a cross in fig. 11.
Referring to FIG. 12, in another embodiment of the present invention, a general outline of a crop center positioning system in a blade-crossing condition is shown. In its entirety, comprising: the acquisition module A1 is used for acquiring a gray level image and a binary image of the target crop based on the original gray level image of the target crop; and the positioning module A2 is used for obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
In another embodiment of the present invention, the system for locating the center point of a crop under a blade-crossing condition, the acquiring module is further configured to: obtaining a binary image of the original gray level image of the target crop based on the original gray level image of the target crop; performing morphological denoising processing on the binary image of the original gray level image of the target crop to determine an image interest area of the target crop; and acquiring a binary image of the image interest area of the target crop based on the image interest area of the target crop.
In another embodiment of the present invention, a system for locating a center point of a crop under a blade-crossing condition, the locating module further comprises:
s21, fusing the original gray level image of the target crop with the binary image of the image interest area of the target crop to obtain a gray level image of the image interest area of the target crop;
s22, acquiring the minimum value points of the gray level image of the image interest area of the target crop, and retaining the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value;
S23, obtaining the coordinates of the center point of the target crop by using a watershed algorithm, based on the gray level image of the image interest area of the target crop and the minimum value points obtained in S22.
In another embodiment of the present invention, the system for locating the center point of a crop under a blade crossing condition further comprises an acquiring module for graying an initial image of a target crop, wherein the graying comprises the following steps,
Igray(i,j)=G(i,j)*1.262-R(i,j)*0.884-B(i,j)*0.311,
where i and j are the row and column coordinates of a pixel; G(i, j), R(i, j) and B(i, j) are the values of the G, R and B color components of the pixel at position (i, j) of the image, respectively; and Igray(i, j) is the gray value of the pixel at position (i, j) of the converted image.
In another embodiment of the present invention, the obtaining module is further configured to obtain a conversion threshold of a binary image of an original gray-scale image of a target crop by using a maximum inter-class variance method based on the original gray-scale image of the target crop.
In another embodiment of the present invention, the system for locating a center point of a crop under a blade-crossing condition, the acquiring module is further configured to:
removing the interference of weed noise in the binary image of the original gray level image of the target crop by using morphological opening operation;
horizontally projecting the pixel values of the binary image of the denoised original gray level image of the target crop to obtain a projection curve taking the pixel row coordinate as the abscissa; dividing the projection curve into two parts by taking the middle pixel row as the boundary, searching the pixel row coordinate corresponding to the minimum value position of each curve, and taking the area between the two pixel rows as the image interest area of the target crop.
In another embodiment of the present invention, a system for locating a center point of a crop under a blade-crossing condition, the locating module is further configured to:
calculating the minimum value points and the maximum value points over the eight-neighborhood of each pixel in the gray level image of the image interest area of the target crop, calculating the average value of the minimum value points and the average value of the maximum value points respectively, taking the difference between the two average values as the threshold, and retaining the minimum value points whose difference from the surrounding pixel points is larger than the threshold to obtain the final local minima.
In another embodiment of the present invention, a system for locating a center point of a crop under a blade-crossing condition, the locating module is further configured to:
using the minimum value points obtained in S22 to mark the foreground of the gray level image of the image interest area of the target crop, and taking the marked gray level image as the input image of a watershed algorithm to obtain the central area of the target crop;
based on the central area of the target crop, summing the x coordinates and the y coordinates of the pixel points in the central area respectively and counting the number of the pixel points in the central area; dividing the x coordinate sum and the y coordinate sum by the number of pixel points gives the final coordinates of the crop center point.
Fig. 13 is a block diagram illustrating an apparatus for crop center point location under blade crossing conditions according to an embodiment of the present invention.
Referring to fig. 13, the apparatus for crop center point location under blade crossing condition comprises: a processor (processor)1301, a memory (memory)1302, and a bus 1303;
wherein the content of the first and second substances,
the processor 1301 and the memory 1302 complete communication with each other through the bus 1303;
the processor 1301 is configured to call the program instructions in the memory 1302 to perform the methods provided by the above-mentioned method embodiments, for example, including: step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop; and 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
The present embodiment discloses a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the method provided by the above-mentioned method embodiments, for example, comprising: step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop; and 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
The present embodiments provide a non-transitory computer-readable storage medium storing computer instructions that cause the computer to perform the methods provided by the above method embodiments, for example, including: step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop; and 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The embodiments of the apparatus and the like for the crop center point positioning method under the blade crossing condition described above are merely illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, the method of the present application is only a preferred embodiment and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. A crop center point positioning method under a blade crossing condition is characterized by comprising the following steps:
step 1, acquiring a gray level image and a binary image of a target crop based on an original gray level image of the target crop;
step 2, obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop;
the step 1 further comprises:
s11, obtaining a binary image of the original gray level image of the target crop based on the original gray level image of the target crop;
s12, performing morphological denoising processing on the binary image of the original gray level image of the target crop to determine an image interest area of the target crop;
s13, acquiring a binary image of the target crop image interest area based on the target crop image interest area;
the step 2 further comprises:
s21, fusing the original gray level image of the target crop with the binary image of the image interest area of the target crop to obtain a gray level image of the image interest area of the target crop;
s22, acquiring the minimum value points of the gray level image of the image interest area of the target crop, and retaining the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value;
S23, obtaining the coordinates of the center point of the target crop by using a watershed algorithm, based on the gray level image of the image interest area of the target crop and the minimum value points obtained in S22;
the S12 further includes:
removing the interference of weed noise in the binary image of the original gray level image of the target crop by using morphological opening operation;
horizontally projecting the pixel values of the binary image of the denoised original gray level image of the target crop to obtain a projection curve taking the pixel row coordinate as the abscissa; dividing the projection curve into two parts by taking the middle pixel row as the boundary, searching the pixel row coordinate corresponding to the minimum value position of each curve, and taking the area between the two pixel rows as the image interest area of the target crop;
the S22 further includes:
calculating the minimum value points and the maximum value points over the eight-neighborhood of each pixel in the gray level image of the image interest area of the target crop, calculating the average value of the minimum value points and the average value of the maximum value points respectively, taking the difference between the two average values as the threshold, and retaining the minimum value points whose difference from the surrounding pixel points is larger than the threshold to obtain the final local minima.
2. The method of claim 1, wherein step 1 is preceded by:
graying the initial image of the target crop, wherein the graying process comprises the following steps,
Igray(i,j)=G(i,j)*1.262-R(i,j)*0.884-B(i,j)*0.311,
where i and j are the row and column coordinates of a pixel; G(i, j), R(i, j) and B(i, j) are the values of the G, R and B color components of the pixel at position (i, j) of the image, respectively; and Igray(i, j) is the gray value of the pixel at position (i, j) of the converted image.
3. The method of claim 1, wherein the S12 further comprises:
and obtaining a conversion threshold value of a binary image of the original gray level image of the target crop based on the original gray level image of the target crop by using a maximum inter-class variance method.
4. The method of claim 2, wherein the S23 further comprises:
using the minimum value points obtained in S22 to mark the foreground of the gray level image of the image interest area of the target crop, and taking the marked gray level image as the input image of a watershed algorithm to obtain the central area of the target crop;
based on the central area of the target crop, summing the x coordinates and the y coordinates of the pixel points in the central area respectively and counting the number of the pixel points in the central area; dividing the x coordinate sum and the y coordinate sum by the number of pixel points gives the final coordinates of the crop center point.
5. A system for locating a center point of a crop under a blade crossing condition, comprising:
the acquisition module is used for acquiring a gray level image and a binary image of the target crop based on the original gray level image of the target crop;
the positioning module is used for obtaining the coordinates of the central point of the target crop based on the gray level image and the binary image of the target crop;
the acquisition module is further configured to: obtaining a binary image of the original gray level image of the target crop based on the original gray level image of the target crop; performing morphological denoising processing on the binary image of the original gray level image of the target crop to determine an image interest area of the target crop; acquiring a binary image of the image interest area of the target crop based on the image interest area of the target crop;
the positioning module is further configured to:
fusing the original gray level image of the target crop with the binary image of the image interest area of the target crop to obtain a gray level image of the image interest area of the target crop;
obtaining the minimum value points of the gray level image of the image interest area of the target crop, and retaining the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value;
obtaining the coordinates of the center point of the target crop by using a watershed algorithm, based on the gray level image of the image interest area of the target crop and the retained minimum value points whose fall is greater than the threshold;
the morphological denoising processing is performed on the binary image of the original gray level image of the target crop to determine the image interest area of the target crop, and the method further comprises the following steps:
removing the interference of weed noise in the binary image of the original gray level image of the target crop by using morphological opening operation;
horizontally projecting the pixel values of the binary image of the denoised original gray level image of the target crop to obtain a projection curve taking the pixel row coordinate as the abscissa; dividing the projection curve into two parts by taking the middle pixel row as the boundary, searching the pixel row coordinate corresponding to the minimum value position of each curve, and taking the area between the two pixel rows as the image interest area of the target crop;
the obtaining of the minimum value points of the gray level image of the image interest area of the target crop and the retaining of the minimum value points whose gray level difference from the surrounding pixel points is larger than a threshold value further comprise the following steps:
calculating the minimum value points and the maximum value points over the eight-neighborhood of each pixel in the gray level image of the image interest area of the target crop, calculating the average value of the minimum value points and the average value of the maximum value points respectively, taking the difference between the two average values as the threshold, and retaining the minimum value points whose difference from the surrounding pixel points is larger than the threshold to obtain the final local minima.
6. A non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the method of any one of claims 1 to 4.
CN201710415434.1A (priority date 2017-06-05, filing date 2017-06-05) Crop center point positioning method and system under blade crossing condition, Active, granted as CN107220647B

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201710415434.1A | 2017-06-05 | 2017-06-05 | Crop center point positioning method and system under blade crossing condition

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201710415434.1A | 2017-06-05 | 2017-06-05 | Crop center point positioning method and system under blade crossing condition

Publications (2)

Publication Number Publication Date
CN107220647A CN107220647A (en) 2017-09-29
CN107220647B true CN107220647B (en) 2020-03-31

Family

ID=59947183

Family Applications (1)

Application Number | Publication | Title
CN201710415434.1A | CN107220647B (Active) | Crop center point positioning method and system under blade crossing condition

Country Status (1)

Country | Publication
CN | CN107220647B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111627059B (en) * 2020-05-28 2023-05-30 桂林市思奇通信设备有限公司 Cotton leaf center point positioning method
CN112541383B (en) * 2020-06-12 2021-12-28 广州极飞科技股份有限公司 Method and device for identifying weed area
CN111753688B (en) * 2020-06-12 2022-03-18 广州极飞科技股份有限公司 Planting line center line determining method and device, electronic equipment and storage medium
CN111738159A (en) * 2020-06-23 2020-10-02 桂林市思奇通信设备有限公司 Cotton terminal bud positioning method based on vector calibration
CN113298768B (en) * 2021-05-20 2022-11-08 山东大学 Cotton detection, segmentation and counting method and system
CN113421301B (en) * 2021-07-08 2022-08-05 浙江大学 Method and system for positioning central area of field crop
CN113469112B (en) * 2021-07-19 2022-06-21 三门峡市乡村振兴局 Crop growth condition image identification method and system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103488991A (en) * 2013-09-30 2014-01-01 中国农业大学 Method for extracting leading line of farmland weeding machine
CN104392460A (en) * 2014-12-12 2015-03-04 山东大学 Adherent white blood cell segmentation method based on nucleus-marked watershed transformation
CN104484877A (en) * 2014-12-12 2015-04-01 山东大学 AML cell segmentation method based on Meanshift cluster and morphological operations

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on operation target information acquisition and visual servoing strategies for agricultural robots in natural environments (自然环境下农业机器人作业目标信息获取与视觉伺服策略研究); 张春龙; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2014-08-15 (No. 8); pp. 19, 26, 30-34, 61 *

Also Published As

Publication number Publication date
CN107220647A (en) 2017-09-29

Similar Documents

Publication Publication Date Title
CN107220647B (en) Crop center point positioning method and system under blade crossing condition
CN109272509B (en) Target detection method, device and equipment for continuous images and storage medium
CN109165538B (en) Bar code detection method and device based on deep neural network
CN113109368B (en) Glass crack detection method, device, equipment and medium
CN108960011B (en) Partially-shielded citrus fruit image identification method
CN109871829B (en) Detection model training method and device based on deep learning
CN103093198B (en) A kind of crowd density monitoring method and device
CN108229232B (en) Method and device for scanning two-dimensional codes in batch
CN111931643A (en) Target detection method and device, electronic equipment and storage medium
CN111768450A (en) Automatic detection method and device for line deviation of structured light camera based on speckle pattern
CN110276759B (en) Mobile phone screen bad line defect diagnosis method based on machine vision
CN108961316B (en) Image processing method and device and server
CN111489337A (en) Method and system for removing false defects through automatic optical detection
CN113780110A (en) Method and device for detecting weak and small targets in image sequence in real time
CN108596032B (en) Detection method, device, equipment and medium for fighting behavior in video
CN116342525A (en) SOP chip pin defect detection method and system based on Lenet-5 model
CN113313692B (en) Automatic banana young plant identification and counting method based on aerial visible light image
CN113793385A (en) Method and device for positioning fish head and fish tail
CN111738310B (en) Material classification method, device, electronic equipment and storage medium
Peng et al. Weed recognition using image blur information
WO2024016632A1 (en) Bright spot location method, bright spot location apparatus, electronic device and storage medium
CN115223031B (en) Monocular frame ranging method and device, medium and curtain wall robot
CN111898408A (en) Rapid face recognition method and device
CN114724119B (en) Lane line extraction method, lane line detection device, and storage medium
CN113487538B (en) Multi-target segmentation defect detection method and device and computer storage medium thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant