CN112270708B - Vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types - Google Patents


Publication number
CN112270708B
CN112270708B (application CN202011154805.3A)
Authority
CN
China
Prior art keywords
edge
points
image
point
intersection
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN202011154805.3A
Other languages
Chinese (zh)
Other versions
CN112270708A (en)
Inventor
梁喜凤
花瑞
余文胜
赵力勤
谢文兵
王永维
Current Assignee
China Jiliang University
Original Assignee
China Jiliang University
Priority date
Filing date
Publication date
Application filed by China Jiliang University filed Critical China Jiliang University
Priority to CN202011154805.3A
Publication of CN112270708A
Application granted
Publication of CN112270708B

Classifications

    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T5/30 Erosion or dilatation, e.g. thinning
    • G06T5/70
    • G06T7/13 Edge detection
    • G06T7/136 Segmentation; edge detection involving thresholding
    • G06T7/194 Segmentation involving foreground-background segmentation
    • G06T2207/10024 Color image
    • G06T2207/30188 Vegetation; Agriculture

Abstract

The invention discloses a method for identifying the lateral branch points of vegetable and fruit plants based on intersections of different edge types. First, threshold segmentation is applied to an acquired color image of a vegetable and fruit plant, the transverse and longitudinal edges are extracted separately from the binarized image, the extracted edges are denoised, and threshold segmentation is then applied to the transverse and longitudinal edges to remove the short edges produced by leaves. Intersection points are searched for on the processed transverse and longitudinal edges to obtain the fork points of the plant. Finally, the lateral branch points are screened out through a distance limit between point pairs and the edge type. The invention can identify the lateral branch points of vegetable and fruit plants whose branches and leaves are similar in color, and provides lateral branch position information for the operation of a branch and leaf pruning robot.

Description

Vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types
Technical Field
The invention relates to a method for identifying the lateral branch points of vegetable and fruit plants, in particular to a method based on the intersection points of different edge types.
Background
To ensure normal growth, vegetable and fruit plants must be pruned regularly; proper pruning reduces pests and promotes fruit ripening. However, because domestic research on pruning machinery has progressed slowly, agricultural pruning is still done mainly by hand, and existing pruning machines offer little automation or intelligence, so an intelligent branch and leaf pruning robot is badly needed. In addition, agricultural labor is scarce and labor costs are high; a vegetable and fruit pruning robot can ease the labor shortage and increase the sustainability and predictability of daily work.
Identifying the lateral branch points of vegetable and fruit plants is an important part of the vision system of a pruning robot. Current methods for identifying the lateral branches of fruit and vegetable plants can be divided into those based on color images, multispectral images, and stereoscopic vision. Multispectral methods can identify the rootstocks of plants whose branches and leaves are similar in color, but they are expensive and perform poorly; stereoscopic methods cannot separate the plant from a background of similar color; color-image methods cannot separate the branches and leaves of plants (such as tomato) whose branches and leaves are similar in color, so the rootstock positions cannot be found.
In summary, a vegetable and fruit pruning robot urgently needs a method for identifying the lateral branch points of vegetable and fruit plants.
Disclosure of Invention
The invention aims to overcome the defect that the lateral branches of vegetable and fruit plants with similar branch and leaf colors cannot be accurately identified, and provides a method for identifying their lateral branch points based on the intersection points of different edge types.
The technical scheme adopted by the invention is as follows:
a vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types comprises the following steps:
s1: acquiring a color image I of the rootstock part of a vegetable and fruit plant to be identified;
s2: background segmentation is carried out on the color image I, branches and leaves of the whole plant of vegetables and fruits are extracted, and binarization processing is carried out on the extracted image to obtain a binarized image C;
s3: the binarized image C is subjected to the extraction of the transverse and longitudinal edges to form longitudinal edge images C with the same size as the binarized image C 2 Horizontal edge image C 3 And upper and lower edge mark images C u
The longitudinal edge image C2 takes its pixel values as follows:
if C(x, y) = C(x+1, y), then C2(x, y) = 0, meaning (x, y) is not a longitudinal edge point;
if C(x, y) - C(x+1, y) ≠ 0, then C2(x, y) = 1, meaning (x, y) is a longitudinal edge point.
The transverse edge image C3 and the upper/lower edge mark image Cu take their pixel values as follows:
if C(x, y) = C(x, y+1), then C3(x, y) = 0 and Cu(x, y) = 0, meaning (x, y) is not a transverse edge point;
if C(x, y) - C(x, y+1) > 0, then C3(x, y) = 1 and Cu(x, y) = 1, meaning (x, y) is a transverse lower edge point;
if C(x, y) - C(x, y+1) < 0, then C3(x, y+1) = 1 and Cu(x, y+1) = 2, meaning (x, y+1) is a transverse upper edge point.
Here C(x, y), C2(x, y), C3(x, y), and Cu(x, y) denote the pixel values of images C, C2, C3, and Cu at coordinate (x, y);
s4: for the obtained longitudinal edge image C 2 And a transverse edge image C 3 Respectively performing open operation to remove discrete points and edges not belonging to branches in the image to obtain a longitudinal branch edge image C only retaining branch edges 4 And lateral shoot edge image C 5
S5: based on the upper/lower edge mark image Cu, judge the category of each lateral branch edge using the vegetable and fruit plant edge sorting algorithm, and mark each transverse branch edge in image C5 as an upper edge or a lower edge according to the result;
s6: based on longitudinal shoot edge image C 4 And lateral shoot edge image C 5 Integrating the transverse and longitudinal edges and judging the intersection point to obtain an image C recording the intersection point of the transverse and longitudinal edges 6
S7: measure the distance between every pair of intersection points in image C6 and remove non-lateral-branch points by applying a maximum distance threshold, obtaining a group of fork points for each transverse branch; judge whether each intersection point is an upper edge point or a lower edge point from the category of the transverse branch edge close to or attached to the two points in the group, finally identifying the upper and lower edge points of the lateral branch point of each transverse branch.
Preferably, in the step S1, the color image I of the rootstock part is obtained by photographing the plant rootstock obliquely upward with a binocular camera positioned obliquely below the plant, under adequate illumination.
Preferably, in S2, the specific method for obtaining the binarized image C is as follows:
based on blue components in the color image I, background segmentation is carried out on the color image I, the branch and leaf images of the whole plant of vegetables and fruits are extracted, and the extracted images are subjected to graying and then binarization processing by a threshold method.
Preferably, in the step S5, the specific method for judging the category of each lateral branch edge using the vegetable and fruit plant edge sorting algorithm is: traverse each transverse branch edge in image C5 in turn; based on image Cu, count how many of its edge points are transverse upper edge points and how many are transverse lower edge points; if the upper edge points outnumber the lower edge points, mark the transverse branch edge as an upper edge, otherwise mark it as a lower edge.
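The majority vote over Cu marks can be sketched as follows. This is an illustrative sketch, assuming an edge is represented as a list of (x, y) points; `classify_edge` is a hypothetical name, not from the patent.

```python
# For one transverse branch edge, count its upper marks (Cu == 2) and lower
# marks (Cu == 1) in the upper/lower edge mark image Cu, and label the whole
# edge with the majority type, as step S5 prescribes.

def classify_edge(points, Cu):
    """points: list of (x, y) pixels of one edge; Cu: mark image."""
    upper = sum(1 for (x, y) in points if Cu[y][x] == 2)
    lower = sum(1 for (x, y) in points if Cu[y][x] == 1)
    return "upper" if upper > lower else "lower"
```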
Preferably, in the step S6, the specific method for integrating the transverse and longitudinal edges and determining the intersections to obtain the image C6 is: traverse the longitudinal branch edge image C4 and the transverse branch edge image C5 synchronously with a circle whose radius is several pixels; when the circle, centered at coordinate (x, y) of both images, contains both a transverse edge point and a longitudinal edge point, record coordinate (x, y) as an intersection of the transverse and longitudinal edges in intersection map E1; for each region of E1 containing several aggregated intersection points, retain only one point by clustering, finally obtaining an intersection map E2 without aggregated points, which serves as the image C6 recording the intersections of the transverse and longitudinal edges.
Further, when only one intersection point is retained by clustering, any one of the intersection points inside the circle is kept at random.
Further, the radius of the circle is 3 pixels.
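The circular-window search of step S6 can be sketched as a brute-force scan. This is an illustrative Python rendering under assumed nested-list images; the function name is hypothetical and not from the patent.

```python
# Slide a circle of three-pixel radius over the longitudinal (C4) and
# transverse (C5) branch edge images in step; wherever the window contains
# both a longitudinal and a transverse edge point, record that center as an
# intersection candidate in E1.

def find_intersections(C4, C5, radius=3):
    h, w = len(C4), len(C4[0])
    # all pixel offsets inside the circular window
    offsets = [(dx, dy) for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if dx * dx + dy * dy <= radius * radius]
    E1 = []
    for y in range(h):
        for x in range(w):
            has_long = has_trans = False
            for dx, dy in offsets:
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    has_long = has_long or C4[ny][nx] == 1
                    has_trans = has_trans or C5[ny][nx] == 1
            if has_long and has_trans:
                E1.append((x, y))  # dense candidates; clustered later
    return E1
```

Because every center within three pixels of both edge types qualifies, E1 contains clusters of candidates, which is why the aggregation judgment and clustering described here are needed afterward.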
Further, the method for judging aggregation in the intersection map E1 is:
scan E1 with a circle of three-pixel radius; if several intersection points fall inside the circle, they are regarded as aggregated and only one of them is retained by clustering; after the traversal, the intersection map E2 is obtained.
Preferably, in the step S7, the specific method for removing non-lateral-branch points with a maximum distance threshold to obtain a group of fork points for each transverse branch is: compute the distances between all intersection coordinates in the obtained map E2; take any two intersection points whose distance is smaller than the maximum distance threshold as a group of fork points corresponding to the lateral branch point of one transverse branch, and discard isolated intersection points that cannot form a group with any other point; keep the coordinates of all grouped fork points so that their upper and lower edge points can be judged based on image Cu.
Further, the maximum distance threshold is 30-40 pixels.
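The S7 pairing step can be sketched as below: a minimal sketch assuming points are (x, y) tuples, with the 30-40 pixel threshold from the patent; the function name is hypothetical.

```python
from itertools import combinations
from math import hypot

# Measure all pairwise distances between intersection points; pairs closer
# than the maximum distance threshold become fork-point groups, and points
# that pair with nothing are discarded as non-lateral-branch points.

def pair_branch_points(points, max_dist=35):
    pairs = [(p, q) for p, q in combinations(points, 2)
             if hypot(p[0] - q[0], p[1] - q[1]) < max_dist]
    paired = {p for pair in pairs for p in pair}
    isolated = [p for p in points if p not in paired]
    return pairs, isolated
```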
The method first applies threshold segmentation to an acquired color image of a vegetable and fruit plant, extracts the transverse and longitudinal edges separately from the binarized image, denoises the extracted edges, and then applies threshold segmentation to the transverse and longitudinal edges to remove the short edges produced by leaves. Intersection points are searched for on the processed transverse and longitudinal edges to obtain the fork points of the plant. Finally, the lateral branch points are screened out through a distance limit between point pairs and the edge type. The invention can identify the lateral branch points of vegetable and fruit plants whose branches and leaves are similar in color, and provides lateral branch position information for the operation of a branch and leaf pruning robot.
Drawings
FIG. 1 is a flow chart of the method for identifying the lateral branch points of vegetable and fruit (tomato as an example) plants based on intersections of different edge types
FIG. 2 positional relationship between the camera and the vegetable and fruit (tomato as an example) plant
FIG. 3 edge extraction map of the whole vegetable and fruit (tomato as an example) plant
FIG. 4 longitudinal edge extraction map of the thresholded vegetable and fruit (tomato as an example) plant
FIG. 5 transverse edge extraction map of the thresholded vegetable and fruit (tomato as an example) plant
FIG. 6 lateral branch points of the vegetable and fruit (tomato as an example) plant identified by markers on the original image
Detailed Description
The patent is further described with reference to the drawings and examples
Fig. 1 shows the basic operation flow of the lateral branch point identification method based on intersections of different edge types; the specific operation steps are as follows:
and (3) image acquisition: as shown in fig. 2, the implementation method and the shooting angle of image acquisition are illustrated, under the condition of good illumination (or sunlight), the rootstock of the plant is obliquely shot upwards by using a binocular camera under the obliquely lower part of the plant, so that the inconvenience of ground color on image segmentation is avoided, and a color image I of the rootstock part of the plant to be identified is output.
Image segmentation: each RGB component of the color image I is extracted and its histogram computed; comparison shows that the blue component separates the whole-plant edge from the background most easily. Therefore, background segmentation is performed on the color image I based on its blue component (the segmentation threshold is set to 0.18), extracting the whole-plant branch and leaf image. The extracted image is then converted to grayscale and binarized by thresholding, with the binarization threshold also set to 0.18. In this way the branch edges can be obtained clearly even against a leaf background; as shown in fig. 3, the binarized image C of the whole-plant branch and leaf edges is obtained.
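The segmentation step can be sketched as below. The 0-1 normalized pixel values and the foreground polarity (which side of the 0.18 threshold counts as plant) are assumptions here, since the patent only states the threshold value; the function name is hypothetical.

```python
# Threshold the blue component of a color image at 0.18 and binarize, as in
# the segmentation step above. Pixel values are assumed normalized to [0, 1];
# whether plant pixels lie above or below the threshold depends on the scene,
# so the comparison may need inverting in practice.

def segment_blue(I, thresh=0.18):
    """I: H x W list of (r, g, b) tuples. Returns a 0/1 binary image C."""
    return [[1 if b > thresh else 0 for (_r, _g, b) in row] for row in I]
```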
Transverse and longitudinal edge extraction: the transverse and longitudinal edges of the binarized image C are extracted to form a longitudinal edge image C2, a transverse edge image C3, and an upper/lower edge mark image Cu, each the same size as C. The pixel values of C2, C3, and Cu record the edge information of image C, so that whether a pixel is an edge point, and of which type, can be judged later.
The pixel values of the longitudinal edge image C2 are taken as follows:
if C(x, y) = C(x+1, y), then C2(x, y) = 0, meaning (x, y) is not a longitudinal edge point;
if C(x, y) - C(x+1, y) ≠ 0, then C2(x, y) = 1, meaning (x, y) is a longitudinal edge point.
The pixel values of the transverse edge image C3 and the upper/lower edge mark image Cu are taken as follows:
if C(x, y) = C(x, y+1), then C3(x, y) = 0 and Cu(x, y) = 0, meaning (x, y) is not a transverse edge point;
if C(x, y) - C(x, y+1) > 0, then C3(x, y) = 1 and Cu(x, y) = 1, meaning (x, y) is a transverse lower edge point;
if C(x, y) - C(x, y+1) < 0, then C3(x, y+1) = 1 and Cu(x, y+1) = 2, meaning (x, y+1) is a transverse upper edge point.
Here C(x, y), C2(x, y), C3(x, y), and Cu(x, y) denote the pixel values of images C, C2, C3, and Cu at coordinate (x, y). C2 is the longitudinal edge image and C3 the transverse edge image; in both, positions with value 1 are marked as edge points. C3, however, must be judged jointly with Cu: for a point of C3, a corresponding Cu(x, y) = 1 denotes a lower edge point and Cu(x, y+1) = 2 denotes an upper edge point.
The longitudinal edge image C2 and the transverse edge image C3 contain many noise points and short edges caused by leaves, so short-edge elimination must be applied to C2 and C3; removing some of the short edges removes the leaf edges from the images. Performing an opening operation on the extracted transverse and longitudinal edges (with the length threshold set at about 60) removes the discrete points and the shorter edges not belonging to branches, leaving the relatively complete branch edges, and finally yields the longitudinal branch edge image C4 and the transverse branch edge image C5, as shown in figs. 4 and 5.
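The short-edge removal (threshold of about 60 edge points) can be sketched as a connected-component size filter. This pure-Python flood-fill variant is one possible realization, an assumption rather than the patent's code; a morphological opening as named in the text would serve the same purpose.

```python
# Drop edge components shorter than min_len: label 8-connected components of
# an edge image with an iterative flood fill, keep only components with at
# least min_len points, as in the short-edge elimination step.

def remove_short_edges(E, min_len=60):
    h, w = len(E), len(E[0])
    seen = [[False] * w for _ in range(h)]
    out = [[0] * w for _ in range(h)]
    for sy in range(h):
        for sx in range(w):
            if E[sy][sx] == 1 and not seen[sy][sx]:
                stack, comp = [(sx, sy)], []
                seen[sy][sx] = True
                while stack:                      # collect one component
                    x, y = stack.pop()
                    comp.append((x, y))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < w and 0 <= ny < h and \
                               E[ny][nx] == 1 and not seen[ny][nx]:
                                seen[ny][nx] = True
                                stack.append((nx, ny))
                if len(comp) >= min_len:          # keep long branch edges only
                    for x, y in comp:
                        out[y][x] = 1
    return out
```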
Upper and lower edge classification: because a branch has a certain width, its transverse edges divide into an upper edge and a lower edge, which must be classified. In this embodiment, a tomato plant edge sorting algorithm sorts and classifies the transverse edges in C5, and each transverse branch edge in the image is marked as an upper edge or a lower edge according to the judgment result. The basic steps of the tomato plant edge sorting algorithm are:
(1) Definition and initialization of the variables and arrays that store the sorted edges: define a sorted-edge-count variable CN, initialized to 0; define a one-dimensional array CPN storing the number of points of each sorted edge, with all elements initialized to 0; define two-dimensional arrays CPY and CPX storing the image ordinate y and abscissa x of the sorted edge points, where the first dimension is the sequence number of the edge an edge point belongs to and the second dimension is the sequence number of the edge point within that edge; define a two-dimensional array CdgPoF marking whether an edge point has been sorted, where the first dimension is the ordinate and the second dimension the abscissa of the edge point in the image coordinate system, with all elements initialized to 0; define a variable CdgeNo counting the edges in the cluster sharing the same starting point, initialized to 0; define a one-dimensional array CdgPoNo storing the number of points of each such edge, with all elements initialized to 0; define two-dimensional arrays CdgPoY and CdgPoX storing the image ordinate y and abscissa x of those edge points, where the first dimension is the sequence number of the edge an edge point belongs to and the second dimension is the sequence number of the edge point within that edge.
(2) Scan the edge image C5 point by point from top to bottom and left to right and judge whether the current pixel (i, j) is an unsorted edge point, i.e., examine the values of C5(i, j) and CdgPoF(i, j). If so, i.e., C5(i, j) is 1 and CdgPoF(i, j) is 0, create an edge starting at edge point (i, j): set the edge count CdgeNo to 1, set the point count CdgPoNo(CdgeNo) of edge CdgeNo to 1, and store the image coordinates of the 1st point of edge CdgeNo, i.e., CdgPoY(CdgeNo, 1) = i and CdgPoX(CdgeNo, 1) = j; mark edge point (i, j) as sorted, i.e., set CdgPoF(i, j) = 1; save the image coordinates of the starting point in the variables StartY and StartX, i.e., set StartY = i and StartX = j; take this edge as the current edge CdgeNo, and edge point (i, j) as the current edge point (r, c), i.e., r = i, c = j.
(3) Count the unsorted edge points UnFPoNo in the 8-neighborhood of the current edge point (r, c): traverse the pixels (m, n) in the 8-neighborhood of (r, c), count those with C5(m, n) equal to 1 and CdgPoF(m, n) equal to 0, and store the count in UnFPoNo. Judge whether the number CdgeNo of edges sharing the starting point (StartY, StartX) is greater than 0; if so, take the edge with the most points among all edges sharing (StartY, StartX) as the longest edge MLE and save its point coordinates and point count: increase the sorted-edge-count variable CN by 1, store the image coordinates CdgPoY(MLE, t) and CdgPoX(MLE, t) of all points of MLE, in order from t = 1 to CdgPoNo(MLE), into the arrays CPY(CN, t) and CPX(CN, t), and store the point count CdgPoNo(MLE) into CPN(CN).
(4) Traverse each transverse branch edge in image C5 in turn; based on image Cu, count how many of its edge points are transverse upper edge points and how many are transverse lower edge points, and set the edge's category to the majority of the 2 types: if the upper edge points outnumber the lower edge points, mark the transverse branch edge as an upper edge, otherwise mark it as a lower edge.
Edge intersection calculation: based on the longitudinal branch edge image C4 and the transverse branch edge image C5, the transverse and longitudinal edges are integrated and their intersections determined, yielding the image C6 that records the intersection points. The specific method of integration and intersection judgment can be as follows:
A circle with a three-pixel radius traverses the longitudinal branch edge image C4 and the transverse branch edge image C5 synchronously; when the circle, centered at coordinate (x, y) of both images, contains both a transverse edge point and a longitudinal edge point, coordinate (x, y) is recorded as an intersection of the transverse and longitudinal edges in intersection map E1. In this process, as long as the three-pixel circle contains both a transverse edge and a longitudinal edge, the position is regarded as an intersection of the two edge types and recorded in E1. Because the intersection range is thereby expanded to 3 pixels, relatively dense intersection points are obtained and some points inside a three-pixel circle may be redundant, so the unnecessary points must be deleted as follows: for each region of E1 containing several aggregated intersection points, only one point is retained by clustering, finally yielding an intersection map E2 without aggregated points, which serves as the image C6 recording the intersections of the transverse and longitudinal edges. The intersection coordinates in E2 are then collected and stored in a two-dimensional array EPN.
The clustering can be implemented as follows: scan the intersection map E1 with a circle of three-pixel radius; if several intersection points fall inside the circle, they are regarded as aggregated and only one of them is retained; after the traversal, the intersection map E2 is obtained.
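The clustering pass can be sketched as a greedy deduplication. This is one possible realization (the patent only requires that a single point per aggregate survive, chosen arbitrarily); the function name is hypothetical.

```python
# Keep only one intersection point from each cluster: a point is retained
# only if no already-kept point lies within the three-pixel radius, which
# collapses each aggregate of E1 to a single representative in E2.

def deduplicate(points, radius=3):
    kept = []
    for (x, y) in points:
        if all((x - kx) ** 2 + (y - ky) ** 2 > radius ** 2
               for (kx, ky) in kept):
            kept.append((x, y))
    return kept
```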
Removal of non-lateral-branch points: all intersection coordinates of image C6 are stored in the two-dimensional array EPN, and the distances between all pairs of coordinates EPN(i, j) are then measured. Because the two lateral branch points of the same branch cannot lie far apart, non-lateral-branch points can be removed by applying a maximum distance threshold, yielding a group of fork points for each transverse branch; the corresponding fork points on the same branch are stored in a four-dimensional array ECN. Whether each intersection point is an upper or lower edge point is judged from the category of the transverse branch edge close to or attached to the two points in the group, finally identifying the upper and lower edge points of the lateral branch point of each transverse branch. Each lateral branch point finally obtained is a mutually corresponding pair of upper and lower edge points, i.e., the two edge intersections at the fork between branch and trunk; the two star marks in fig. 6 are such a corresponding pair of plant lateral branch points found on the original image.
The above embodiment is only a preferred embodiment of the present invention, but it is not intended to limit the present invention. Various changes and modifications may be made by one of ordinary skill in the pertinent art without departing from the spirit and scope of the present invention. Therefore, all the technical schemes obtained by adopting the equivalent substitution or equivalent transformation are within the protection scope of the invention.

Claims (9)

1. A vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types is characterized by comprising the following steps:
s1: acquiring a color image I of the rootstock part of a vegetable and fruit plant to be identified;
s2: background segmentation is carried out on the color image I, branches and leaves of the whole plant of vegetables and fruits are extracted, and binarization processing is carried out on the extracted image to obtain a binarized image C;
s3: the binarized image C is subjected to the extraction of the transverse and longitudinal edges to form longitudinal edge images C with the same size as the binarized image C 2 Horizontal edge image C 3 And upper and lower edge mark images C u
The longitudinal edge image C2 takes its pixel values as follows:
if C(x, y) = C(x+1, y), then C2(x, y) = 0, meaning (x, y) is not a longitudinal edge point;
if C(x, y) - C(x+1, y) ≠ 0, then C2(x, y) = 1, meaning (x, y) is a longitudinal edge point.
The transverse edge image C3 and the upper/lower edge mark image Cu take their pixel values as follows:
if C(x, y) = C(x, y+1), then C3(x, y) = 0 and Cu(x, y) = 0, meaning (x, y) is not a transverse edge point;
if C(x, y) - C(x, y+1) > 0, then C3(x, y) = 1 and Cu(x, y) = 1, meaning (x, y) is a transverse lower edge point;
if C(x, y) - C(x, y+1) < 0, then C3(x, y+1) = 1 and Cu(x, y+1) = 2, meaning (x, y+1) is a transverse upper edge point.
Here C(x, y), C2(x, y), C3(x, y), and Cu(x, y) denote the pixel values of images C, C2, C3, and Cu at coordinate (x, y);
s4: for the obtained longitudinal edge image C 2 And a transverse edge image C 3 Respectively performing open operation to remove discrete points and edges not belonging to branches in the image to obtain a longitudinal branch edge image C only retaining branch edges 4 And lateral shoot edge image C 5
S5: based on the upper/lower edge mark image Cu, judge the category of each lateral branch edge using the vegetable and fruit plant edge sorting algorithm, and mark each transverse branch edge in image C5 as an upper edge or a lower edge according to the result;
s6: based on longitudinal shoot edge image C 4 And lateral shoot edge image C 5 Integrating the transverse and longitudinal edges and judging the intersection point to obtain an image C recording the intersection point of the transverse and longitudinal edges 6
S7: pairwise distance measurement is performed on all intersection points in image C6, and non-lateral-branch points are removed by setting a maximum distance threshold, yielding a group of bifurcation points corresponding to each transverse branch; each intersection point is judged to be an upper edge point or a lower edge point according to the type of the transverse branch edge that the two intersection points of the group adjoin, finally identifying the upper and lower edge points of the lateral branch point of each transverse branch;
in step S5, the specific method of judging the type of a lateral branch edge with the vegetable and fruit plant edge-ordering algorithm is as follows: traverse each transverse branch edge in image C5 in turn; based on image Cu, count how many of its edge points are transverse upper edge points and how many are transverse lower edge points; if the upper edge points outnumber the lower edge points, mark the transverse branch edge as an upper edge, otherwise mark it as a lower edge.
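The majority vote of S5 might be sketched as follows, assuming the transverse edges in C5 are separated into connected components (the labelling step and function name are illustrative assumptions):

```python
import numpy as np
from scipy import ndimage

def classify_edges(C5, Cu):
    """Majority-vote classification of S5: label each connected transverse
    edge in C5, count its upper (Cu == 2) and lower (Cu == 1) marks, and
    tag the whole edge accordingly.  Returns {label: 'upper' | 'lower'}."""
    labels, n = ndimage.label(C5, structure=np.ones((3, 3), dtype=int))
    kinds = {}
    for k in range(1, n + 1):
        marks = Cu[labels == k]
        n_upper = int((marks == 2).sum())
        n_lower = int((marks == 1).sum())
        kinds[k] = 'upper' if n_upper > n_lower else 'lower'
    return kinds
```

Edges whose pixels carry mostly upper-edge marks come back as 'upper', the rest as 'lower', which is exactly the counting rule stated above.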
2. The method for identifying lateral branch points of vegetable and fruit plants based on intersection points of different edge types as claimed in claim 1, wherein in step S1 the color image I of the rhizome part of the vegetable or fruit plant is obtained by using a binocular camera, under illumination, to photograph the rhizome obliquely upward from below the plant.
3. The method for identifying vegetable and fruit plant lateral branch points based on intersection points of different edge types as claimed in claim 1, wherein the specific method for obtaining the binary image C in S2 is as follows:
based on the blue component of the color image I, background segmentation is performed on the color image I, the branch-and-leaf image of the whole vegetable or fruit plant is extracted, and the extracted image is converted to greyscale and then binarized by a threshold method.
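A hedged sketch of the segmentation and binarization of claim 3; the RGB channel order, the threshold value, and the assumption that a low blue response marks plant foreground are all ours, since the patent fixes none of them:

```python
import numpy as np

def segment_and_binarize(I, blue_thresh=100):
    """Blue-component background removal followed by binarization, as in
    claim 3.  Pixels with a weak blue response are assumed to be plant
    foreground; background pixels are blanked before thresholding."""
    I = np.asarray(I, dtype=np.float32)
    mask = I[:, :, 2] < blue_thresh      # low blue -> plant pixel (assumed)
    grey = I.mean(axis=2)                # simple greyscale conversion
    grey[~mask] = 0.0                    # blank the background
    return (grey > 0).astype(np.uint8)   # binarized image C
```

On a toy image with a blue backdrop and a small green patch, only the patch survives as foreground.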
4. The method for identifying lateral branch points of vegetable and fruit plants based on intersection points of different edge types as claimed in claim 1, wherein in S6 the specific method of integrating the transverse and longitudinal edges and judging intersection points to obtain the image C6 recording the transverse-longitudinal edge intersections is as follows: a circle with a radius of several pixels is used to traverse the longitudinal branch edge image C4 and the transverse branch edge image C5; when the circle, moved to coordinate (x, y) of the two images, contains both a transverse edge point and a longitudinal edge point within its range, coordinate (x, y) is recorded as a transverse-longitudinal edge intersection in the intersection map E1; for regions of E1 where several intersection points aggregate, only one intersection point is retained by clustering, finally obtaining an intersection map E2 without aggregated intersections, which serves as the image C6 recording the transverse-longitudinal edge intersections.
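The circle scan of claim 4 is equivalent to dilating both edge maps by a disk of the given radius and taking their logical AND: a centre sees a longitudinal point within the circle exactly when the disk-dilated longitudinal map is set there. The sketch below uses that equivalence; the implementation choice is ours, not the patent's:

```python
import numpy as np
from scipy import ndimage

def find_intersections(C4, C5, radius=3):
    """Claim 4's circle scan, vectorised: dilate the longitudinal map C4
    and the transverse map C5 by a disk, then AND them.  A pixel of the
    result is set iff a circle centred there contains at least one point
    of each edge type, i.e. it belongs to the intersection map E1."""
    r = radius
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    disk = xx * xx + yy * yy <= r * r
    near_long = ndimage.binary_dilation(C4.astype(bool), structure=disk)
    near_trans = ndimage.binary_dilation(C5.astype(bool), structure=disk)
    return (near_long & near_trans).astype(np.uint8)
```

Where a vertical and a horizontal edge cross, the crossing pixel is marked; pixels far from one of the two edge types stay clear.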
5. The method for identifying lateral branch points of vegetable and fruit plants based on intersection points of different edge types according to claim 4, wherein, when only one intersection point is retained by clustering, any one of the intersection points within the circle is retained at random.
6. The method for identifying vegetable and fruit plant lateral branch points based on intersection points of different edge types as claimed in claim 4, wherein the radius of the circle is 3 pixels.
7. The method for identifying lateral branch points of vegetable and fruit plants based on intersection points of different edge types according to claim 4, wherein the aggregation judgment for the intersection map E1 is performed as follows:
a circle with a radius of three pixels is used to traverse and scan E1; if several intersection points appear within the circle, this is regarded as an intersection aggregation and only one intersection point is retained by clustering; the intersection map E2 is obtained when the traversal finishes.
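One possible clustering that keeps a single point per aggregation, consistent with claims 5 and 7; this greedy variant keeps the first point scanned in each cluster, which realises claim 5's "retain any one" deterministically rather than at random:

```python
import numpy as np

def dedupe_intersections(E1, radius=3):
    """Collapse each aggregation of intersection points in E1 to one
    representative (claims 5 and 7).  A point is kept only if it lies
    farther than `radius` from every already-kept point."""
    kept = []
    for p in zip(*np.nonzero(E1)):
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 > radius * radius
               for q in kept):
            kept.append(p)
    E2 = np.zeros_like(E1)
    for x, y in kept:
        E2[x, y] = 1
    return E2
```

A cluster of three adjacent intersections collapses to one point, while an isolated intersection far away is kept unchanged.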
8. The method for identifying lateral branch points of vegetable and fruit plants based on intersection points of different edge types according to claim 1, wherein in step S7 the specific method of removing non-lateral-branch points by setting a maximum distance threshold to obtain a group of bifurcation points corresponding to each transverse branch is as follows: compute the distances between all intersection coordinates in the obtained intersection map E2; take any two intersection points whose distance is smaller than the maximum distance threshold as a group of bifurcation points corresponding to the lateral branch point of one transverse branch, and eliminate isolated intersection points that cannot form a group with any other intersection point in the map; the coordinates of all grouped bifurcation points are retained so that their upper and lower edge points can be judged based on image Cu.
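The distance-threshold grouping of claim 8 might look like the following sketch; the default threshold of 35 pixels is an assumption sitting inside claim 9's 30-40 px range:

```python
import numpy as np

def pair_branch_points(E2, max_dist=35):
    """Claim 8's grouping: any two intersections closer than the maximum
    distance threshold form a group of bifurcation points for one
    transverse branch; intersections that pair with nothing appear in no
    group and are thereby discarded as non-lateral-branch points."""
    pts = list(zip(*np.nonzero(E2)))
    pairs = []
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            d2 = ((pts[i][0] - pts[j][0]) ** 2 +
                  (pts[i][1] - pts[j][1]) ** 2)
            if d2 < max_dist * max_dist:
                pairs.append((pts[i], pts[j]))
    return pairs
```

Two close intersections form one group, while a distant isolated intersection is never paired and so never reaches the upper/lower-edge judgment of S7.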
9. The method for identifying vegetable and fruit plant lateral branch points based on intersection points of different edge types according to claim 8, wherein the maximum distance threshold is 30-40 pixels.
CN202011154805.3A 2020-10-26 2020-10-26 Vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types Active CN112270708B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011154805.3A CN112270708B (en) 2020-10-26 2020-10-26 Vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types

Publications (2)

Publication Number Publication Date
CN112270708A CN112270708A (en) 2021-01-26
CN112270708B true CN112270708B (en) 2024-02-02

Family

ID=74342090

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011154805.3A Active CN112270708B (en) 2020-10-26 2020-10-26 Vegetable and fruit plant lateral branch point identification method based on intersection points of different edge types

Country Status (1)

Country Link
CN (1) CN112270708B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5253302A (en) * 1989-02-28 1993-10-12 Robert Massen Method and arrangement for automatic optical classification of plants
JPH07184437A (en) * 1993-12-28 1995-07-25 Kubota Corp Planting state detecting device for crop planting machine
JPH0916782A (en) * 1995-06-28 1997-01-17 Toyota Motor Corp Object recognizing device
JP2005332016A (en) * 2004-05-18 2005-12-02 National Institute Of Advanced Industrial & Technology Image processing method and system and program
CN107038446A (en) * 2017-03-23 2017-08-11 中国计量大学 A kind of night double fruits overlapping tomato recognition methods detected under active illumination based on overlapping edge
CN109035209A (en) * 2018-07-03 2018-12-18 广西壮族自治区气象减灾研究所 Sugarcane tillering stage automatic observation process
CN109255795A (en) * 2018-09-11 2019-01-22 中国计量大学 A kind of tomato plant edge sort algorithm
CN109522901A (en) * 2018-11-27 2019-03-26 中国计量大学 A kind of tomato plant stalk method for identification of edge based on edge duality relation
CN109684938A (en) * 2018-12-06 2019-04-26 广西大学 It is a kind of to be taken photo by plane the sugarcane strain number automatic identifying method of top view based on crop canopies
CN110603976A (en) * 2019-10-24 2019-12-24 中国计量大学 Tomato branch and leaf trimming device and trimming method thereof

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Identification method for picking points of tomato fruit clusters; Liang Xifeng; Zhang Yan; Journal of Chinese Agricultural Mechanization (No. 11); full text *
Image segmentation of ratoon rice plants and soil background based on HSV space; Guo Hanlin; Lin Jian; Zhang Xiang; Journal of Agricultural Mechanization Research (No. 7); full text *
Repair of Kinect plant depth images based on K-means and nearest-neighbour regression algorithms; Shen Yue; Xu Hui; Liu Hui; Li Ning; Transactions of the Chinese Society of Agricultural Engineering (No. 19); full text *
Analysis of vegetable growth status based on machine vision; Dou Dongdong; Chen Guangfeng; Journal of Chinese Agricultural Mechanization (No. 10); full text *
Acquisition and experiment of position information of picking points of tomato fruit clusters; Liang Xifeng; Jin Chaoqi; Ni Meidi; Wang Yongwei; Transactions of the Chinese Society of Agricultural Engineering (No. 16); full text *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant