CN109859212B - Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle - Google Patents


Publication number: CN109859212B (grant); earlier publication CN109859212A
Application number: CN201910039172.2A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 项荣
Original and current assignee: China Jiliang University
Application filed by China Jiliang University
Legal status: Expired - Fee Related
Classification: Image Analysis

Abstract

The invention discloses a soybean crop row segmentation method for unmanned aerial vehicle (UAV) aerial images. First, the aerial soybean crop image is segmented; the upper and lower crop-row boundaries are then determined by row scanning and the left and right crop-row boundaries by column scanning, during which the left and right reference vertex pairs of each crop row's circumscribed rectangle are identified. Circumscribed rectangles are extracted by matching left and right reference vertex pairs, and rectangles for unmatched (isolated) left and right reference vertex pairs are recovered from the circumscribed rectangles already extracted in the same row and column. The resulting circumscribed-rectangle matrix then undergoes column under-segmentation element splitting and column alignment, row under-segmentation element splitting and row alignment, and "zero element" segmentation, after which the crop-row circumscribed rectangles of the whole image are obtained. The method automatically segments the crop rows in UAV aerial images of soybean crops and lays a technical foundation for soybean breeding and seed selection based on crop-row indexes such as color, area and height.

Description

Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle
Technical Field
The invention relates to an image processing method, and in particular to a soybean crop row segmentation method for unmanned aerial vehicle (UAV) aerial images.
Background
Soybean is an important commercial crop. Breeding and seed selection are key to improving soybean yield, so methods that support them have very important application value.
At present, soybean breeding and seed selection rely mainly on experts' personal experience and manual measurement. Selection is usually carried out over tens of thousands of varieties, making the process extremely laborious, and the results vary with the subjective experience of individual experts. Methods that automate the soybean breeding and seed-selection process are therefore highly desirable. Photographing field soybean crops with a UAV and processing the aerial images is a promising route to such automation and is attracting growing attention from practitioners. However, soybean breeding and seed selection based on aerial-image processing first requires image segmentation of the crop rows of the different soybean varieties in the aerial images. At present this work is done mainly by hand, which is time-consuming and labor-intensive, so an automatic image segmentation method for crop rows in soybean aerial images is urgently needed.
The method of the invention automatically segments the crop rows in UAV aerial images of soybean crops, laying a technical foundation for soybean breeding and seed selection based on crop-row indexes such as color, area and height.
Disclosure of Invention
The invention aims to provide a soybean crop row segmentation method for UAV aerial images that automatically segments the soybean crop rows in the image and extracts the circumscribed rectangle of each crop row.
The technical scheme adopted by the invention is as follows:
the invention comprises the following steps:
1.1) image segmentation: segment the m-row × n-column soybean crop image S aerial-photographed by the UAV with an automatic-threshold (OTSU) image segmentation algorithm based on the normalized green-red difference, obtaining a binary image I;
1.2) crop-row upper and lower boundary extraction based on row scanning: transversely divide image I into H image blocks; scan each image block line by line; determine the y coordinates of the upper and lower crop-row boundaries in image I from the changes of the foreground-pixel count between adjacent scan lines, from zero to nonzero (upper boundary) and from nonzero to zero (lower boundary);
1.3) crop-row left and right boundary and circumscribed-rectangle matrix extraction based on column scanning: longitudinally divide image I into V image blocks; scan each image block column by column; extract the left boundary of a crop row where the foreground-pixel count changes from zero to nonzero between adjacent scan columns, and extract the upper-left and lower-left reference vertex pairs of the crop-row circumscribed rectangles there; extract the right boundary where the count changes from nonzero to zero, and extract the upper-right and lower-right reference vertex pairs; obtain the 4 vertices of each crop-row circumscribed rectangle by matching right reference vertex pairs with left reference vertex pairs; after the current column scan finishes, a column of crop-row circumscribed rectangles is obtained, each rectangle being one element that stores the image coordinates of its upper-left and lower-right vertices; from the upper-left and lower-right vertex coordinates of the rectangles in the current column, build circumscribed rectangles for the unmatched (isolated) left and right reference vertex pairs and insert them into the current rectangle column in order; after all columns of the current image block are scanned, the crop-row circumscribed-rectangle matrix of the current image block is obtained and merged into that of the whole image; adjust the starting row of the next image block according to the number of rows already processed in the current block, and scan the next image block, until the column scanning of the whole image is completed;
1.4) minimum and mean of element row height and column width: over the circumscribed-rectangle matrix, find the minimum row height (distance between the upper and lower sides of a rectangle) and the minimum column width (distance between the left and right sides), then compute the means RowHAve and ColWAve over the row heights and column widths larger than these minima;
1.5) column under-segmentation element splitting: scan each element of the circumscribed-rectangle matrix; if an element's width is Nc times the column-width mean and Nc is greater than the threshold Tc, divide the element transversely into Nc equal elements;
1.6) column alignment: scan each column of the circumscribed-rectangle matrix; every element whose upper-left vertex x coordinate exceeds the minimum upper-left vertex x coordinate of its column by more than the threshold Tw is shifted right in sequence, together with the elements to its right in the same row, to column-align the circumscribed-rectangle matrix;
1.7) row under-segmentation element splitting: scan each element of the circumscribed-rectangle matrix; if an element's row height is Nr times the row-height mean and Nr is greater than the threshold Tr, divide the element longitudinally into Nr equal elements;
1.8) row alignment: scan each row of the circumscribed-rectangle matrix; every element whose upper-left vertex y coordinate exceeds the minimum upper-left vertex y coordinate of its row by more than the threshold Th is shifted down in sequence, together with the elements below it in the same column, to row-align the circumscribed-rectangle matrix;
1.9) "zero element" segmentation: a "zero element" is a vacant element of the circumscribed-rectangle matrix holding no rectangle, i.e. the circumscribed rectangle of the corresponding crop row was not extracted; its upper-left and lower-right vertex image coordinates are both 0. The coordinates of each "zero element" are determined from the upper-left and lower-right vertex coordinates of the non-"zero elements" in its row and column and from the mean row height. Scan each column of the matrix and take the upper-left and lower-right vertex x coordinates of the first non-"zero element" R1(n, j) in the current column as the corresponding x coordinates of every "zero element" R0(z, j) in that column. Scan the z-th row containing the current "zero element" R0(z, j): if a non-"zero element" R1(z, c) exists in that row, take its upper-left and lower-right vertex y coordinates as those of R0(z, j); if the z-th row has no non-"zero element" and z = 1, take 1 and the row-height mean RowHAve + 1 as the upper-left and lower-right vertex y coordinates of the "zero element"; if the z-th row has no non-"zero element" and z > 1, take the lower-right vertex y coordinate + 1 and the lower-right vertex y coordinate + 1 + RowHAve of the upper adjacent element R1(z−1, j) in the same column as the upper-left and lower-right vertex y coordinates of the "zero element".
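Step 1.1) can be sketched in a few lines. The following is a minimal illustration, not the patent's implementation: the function names (`otsu_threshold`, `segment_crop`) and the exact normalization (G − R)/(G + R) are assumptions consistent with the description "automatic-threshold OTSU segmentation on the normalized green-red difference".

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Classic Otsu: choose the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    hist = hist.astype(float)
    total = hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    sum_all = (hist * centers).sum()
    best_t, best_var, w0, sum0 = centers[0], -1.0, 0.0, 0.0
    for k in range(bins - 1):
        w0 += hist[k]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += hist[k] * centers[k]
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[k]
    return best_t

def segment_crop(rgb):
    """Binarize an m x n x 3 RGB image via Otsu thresholding of the
    normalized green-red difference (G - R) / (G + R)."""
    rgb = rgb.astype(float)
    g, r = rgb[..., 1], rgb[..., 0]
    ngrd = (g - r) / np.maximum(g + r, 1e-6)
    t = otsu_threshold(ngrd.ravel())
    return (ngrd > t).astype(np.uint8)   # 1 = crop foreground, 0 = soil
```

Green vegetation has G well above R while soil has R above G, so the normalized difference is strongly bimodal and Otsu's criterion places the threshold between the two modes.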
The crop row upper and lower boundary extraction based on row scanning in the step 1.2) is realized by the following method:
2.1) set the line-scanning block Width and transversely cut the binary image I into H = n ÷ Width image blocks; each image block is m rows × Width columns, except the H-th, which is m rows × nf columns with nf = Width + (n mod Width), where mod denotes the remainder;
2.2) scan the i-th row (i = 1 to m) of each image block and count its number of foreground pixels RCount:
if i = 1, store RCount in FRCount as the previous-line foreground-pixel count for the next line scan; if i > 1, FRCount = 0 and RCount > 0, set the crop-row upper-boundary flag UpFlag to 1 and store i − 1 in the upper-boundary ordinate UpY; if i > 1, FRCount > 0, RCount = 0 and UpFlag = 1, compute the distance between the upper and lower crop-row boundaries Dud = i − UpY; if Dud is greater than the threshold Tud, mark all pixels of row UpY as crop-row upper-boundary points, mark all pixels of row i as crop-row lower-boundary points, and clear UpFlag; store RCount in FRCount; scan the next line;
2.3) repeating the step 2.2) until the line scanning of all the lines of the current image block is finished;
2.4) performing line scanning of the next image block until the line scanning of all the image blocks is completed.
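The transition logic of steps 2.2)–2.3) can be sketched on one image block as follows. This is an illustrative sketch under stated assumptions: 0-based indexing instead of the patent's 1-based rows, `block` a NumPy binary array, and `t_ud` standing for the threshold Tud; the per-pixel boundary marking is reduced to returning (upper, lower) row-index pairs.

```python
import numpy as np

def row_boundaries(block, t_ud=5):
    """Scan an image block row by row; a zero -> nonzero transition of the
    foreground-pixel count opens a crop-row upper boundary (UpY), and a
    nonzero -> zero transition closes it.  A pair is kept only if the band
    is thicker than t_ud rows (the patent's D_ud > T_ud test)."""
    counts = block.sum(axis=1)          # RCount per row
    bounds = []
    up_y, up_flag = 0, False
    prev = counts[0]                    # FRCount after scanning row 1
    for i in range(1, len(counts)):
        if prev == 0 and counts[i] > 0:            # upper boundary found
            up_y, up_flag = i - 1, True
        elif prev > 0 and counts[i] == 0 and up_flag:
            if i - up_y > t_ud:                    # D_ud = i - UpY
                bounds.append((up_y, i))           # (upper, lower) rows
            up_flag = False
        prev = counts[i]
    return bounds
```

Bands thinner than t_ud rows (noise, weeds between rows) are discarded by the distance test, which is exactly the role of Tud in step 2.2).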
The extraction of the left and right boundaries of the crop rows and the external rectangular matrix based on column scanning in the step 1.3) comprises the following steps:
3.1) column scan: set the column-scanning block Height and longitudinally cut the binary image I into V = m ÷ Height image blocks; each image block is Height rows × n columns, except the V-th, which is mf rows × n columns with mf = Height + (m mod Height). Scan the j-th column (j = 1 to n) of the current image block and count its number of foreground pixels CCount. If j = 1, store CCount in FCCount as the previous-column foreground-pixel count for the next column scan and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, the current image block has finished column scanning: jump to step 3.9);
3.2) crop-row left boundary extraction: if the current column's foreground-pixel count is greater than 0 and the previous column's is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop-row left boundary; execute step 3.3); otherwise jump to step 3.4);
3.3) crop-row upper-left and lower-left reference vertex identification: scan the current column from top to bottom. If the current pixel is an upper-boundary pixel, identify it as an upper-left reference vertex, record its image coordinates, and set the upper-left-vertex-found flag LeftUFlag to 1. If the current pixel is a lower-boundary pixel, identify it as a lower-left reference vertex and record its image coordinates; if LeftUFlag = 1, set the upper-left/lower-left pair flag LeftUDFlag to 1 and clear LeftUFlag. Judge whether an upper-left and a lower-left reference vertex occurred consecutively, i.e. whether LeftUDFlag = 1: if so, the pair is a valid left reference vertex pair of a crop-row circumscribed rectangle; save its image coordinates, increment the valid-left-reference-vertex-pair count by 1, clear LeftUDFlag, and set the adjacent-valid-left-boundary flag LeftSFlag to 1. After the current column scan finishes, jump to step 3.1);
3.4) validly paired crop-row right boundary extraction: if the current column's foreground-pixel count is 0 and the previous column's is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop-row right boundary; further judge whether an adjacent valid left boundary precedes it, i.e. whether LeftSFlag = 1, and whether the distance Dlr between the current right boundary and that left boundary is greater than the threshold Tlr. If so, the current right boundary is a valid, pairable right boundary; increment the crop-row circumscribed-rectangle column count by 1 and jump to step 3.5); otherwise jump to step 3.1);
3.5) crop-row upper-right and lower-right reference vertex identification: scan the current column from top to bottom. If the current pixel is an upper-boundary pixel, identify it as an upper-right reference vertex, record its image coordinates, and set the upper-right-vertex-found flag RightUFlag to 1. If the current pixel is a lower-boundary pixel, identify it as a lower-right reference vertex and record its image coordinates; if RightUFlag = 1, set the upper-right/lower-right pair flag RightUDFlag to 1 and clear RightUFlag. Judge whether an upper-right and a lower-right reference vertex occurred consecutively and whether an adjacent valid left boundary exists, i.e. whether RightUDFlag = 1 and LeftSFlag = 1: if so, the pair is a valid right reference vertex pair of a crop-row circumscribed rectangle; save its image coordinates, increment the valid-right-reference-vertex-pair count by 1, clear RightUDFlag, and execute step 3.6); otherwise re-execute step 3.5) until the current column scan is complete, then clear LeftSFlag and jump to step 3.7);
3.6) matched crop-row circumscribed-rectangle extraction based on left and right reference vertex pairs: for the current valid right reference vertex pair, scan the valid left reference vertex pairs on the adjacent left boundary one by one. If the absolute difference Dlru between the y coordinates of a valid upper-right and upper-left reference vertex is less than the threshold Tlru, the corresponding valid right and left reference vertex pairs form the 4 reference vertices of a valid crop-row circumscribed rectangle: take the valid upper-left and lower-right reference vertex x coordinates as the rectangle's upper-left and lower-right vertex x coordinates, and take the larger of the upper-left/upper-right reference vertex y coordinates and the smaller of the lower-left/lower-right reference vertex y coordinates as the rectangle's upper-left and lower-right vertex y coordinates; store the upper-left and lower-right reference vertex x coordinates in LeftX and RightX for the later extraction of rectangles from isolated reference vertices; increment the circumscribed-rectangle count by 1; set the matched flags of the paired left and right reference vertex pairs, LeftMFlag = 1 and RightMFlag = 1; and set the current right boundary's matched-vertex-pair flag MatchFlag to 1. Jump to step 3.5);
3.7) circumscribed-rectangle extraction from isolated left reference vertices: judge whether the current right boundary's MatchFlag is 1. If so, scan in turn the left reference vertex pairs associated with the current right boundary for any with LeftMFlag = 0: such a pair is an isolated left reference vertex pair; take the upper-left reference vertex x and y coordinates as the rectangle's upper-left vertex image coordinates, take the saved RightX and the lower-left reference vertex y coordinate as its lower-right vertex image coordinates, and insert the rectangle into the current rectangle column in order of upper-left vertex y coordinate. Execute step 3.8);
3.8) circumscribed-rectangle extraction from isolated right reference vertices: scan in turn the right reference vertex pairs on the current right boundary for any with RightMFlag = 0: such a pair is an isolated right reference vertex pair; take the saved LeftX and the upper-right reference vertex y coordinate as the rectangle's upper-left vertex image coordinates, take the lower-right reference vertex x and y coordinates as its lower-right vertex image coordinates, and insert the rectangle into the current rectangle column in order of upper-left vertex y coordinate. Store the y coordinate of the last rectangle's lower-right vertex in ColMaxY, take the minimum of ColMaxY over all columns as ImBloEndY, store the current scan column's foreground-pixel count CCount in FCCount, and jump to step 3.1);
3.9) circumscribed-rectangle matrix storage: if the current image block is the first block of the current image, store its circumscribed-rectangle matrix Mp as the whole-image circumscribed-rectangle matrix Mt. Otherwise, scan each column of Mp and compute the absolute distance Dpt between the upper-left vertex x coordinate xp of the first non-"zero element" of the current Mp column and the upper-left vertex x coordinate xt of the first non-"zero element" of each Mt column: if xp < xt and Dpt > Tpt, insert a new column of "zero elements" into Mt before the current Mt column and append the current Mp column after the last element of the new column; if Dpt ≤ Tpt, append the current Mp column directly after the last element of the current Mt column; if neither condition holds after traversing all columns of Mt, insert a new column of "zero elements" after the last column of Mt and append the current Mp column after the last "zero element" of the new column. Compute the maximum element count MaxConNo over the columns of Mt and pad every column with fewer elements with "zero elements" after its last element until it has MaxConNo elements; execute step 3.10);
3.10) next-image-block starting-row adjustment: take ImBloEndY as the ending row of the image block just processed and remove any circumscribed rectangle whose upper-left vertex y coordinate is greater than ImBloEndY; set the starting row of the next image block to ImBloEndY + 1 and its ending row to ImBloEndY + 1 + Height; jump to step 3.1) to scan the next image block, until the column scanning of the whole image is completed.
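The boundary-transition core of steps 3.2) and 3.4) can be illustrated with a deliberately simplified sketch. Assumptions: 0-based indexing, a NumPy binary block, `t_lr` standing for the threshold Tlr, and one rectangle per accepted column band; the patent's full method additionally matches left/right reference-vertex pairs (steps 3.3, 3.5, 3.6) so that one band can yield several vertically stacked rectangles.

```python
import numpy as np

def column_rectangles(block, t_lr=3):
    """Simplified sketch: a zero -> nonzero transition of the per-column
    foreground count gives a crop-row left boundary, the next nonzero ->
    zero transition gives the right boundary; bands narrower than t_lr
    columns are rejected (the D_lr > T_lr test).  The top/bottom
    foreground rows inside an accepted band supply the y extent."""
    col_counts = block.sum(axis=0)       # CCount per column
    rects = []                           # (x_left, y_up, x_right, y_down)
    left, prev = None, 0
    for j, c in enumerate(col_counts):
        if prev == 0 and c > 0:          # left boundary
            left = j
        elif prev > 0 and c == 0 and left is not None:
            if j - left > t_lr:          # valid pairable right boundary
                rows = np.flatnonzero(block[:, left:j].sum(axis=1))
                rects.append((left, rows[0], j - 1, rows[-1]))
            left = None
        prev = c
    return rects
```

With two separated crop-row bands in a block, the sketch returns one bounding rectangle per band; narrow gaps or specks between rows are filtered out by the Tlr test.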
The column under-segmentation element splitting of step 1.5) is realized as follows: scan each column of the circumscribed-rectangle matrix. For every element R(i, j) (i and j being its row and column indices in the matrix) whose column width ColWid = xr(i, j) − xl(i, j) is greater than Tc times ColWAve, shift right by one position, in sequence, all elements from the right neighbor of R(i, j) up to the left neighbor of the first "zero element", covering that "zero element"; if the current row has no "zero element", shift right by one position all elements from the right neighbor of R(i, j) onward. Insert at (i, j + 1) a new element with upper-left vertex image coordinates [xl(i, j) + round(ColWid ÷ Nc) + 1, yu(i, j)] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where [xl(i, j), yu(i, j)] and [xr(i, j), yd(i, j)] are the upper-left and lower-right vertex image coordinates of R(i, j) and Nc = round(ColWid ÷ ColWAve); modify the lower-right vertex x coordinate xr(i, j) of R(i, j) to xl(i, j) + round(ColWid ÷ Nc).
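The shift-and-insert mechanics of step 1.5) can be sketched on one matrix row. Assumptions, not the patent's implementation: the row is a Python list of (xl, yu, xr, yd) tuples with `None` playing the role of a "zero element", and a single two-way cut is shown (the text's insert/modify formulas describe one cut; for larger Nc the operation would be repeated). The name `split_wide_element` is hypothetical.

```python
def split_wide_element(row, j, col_w_ave, t_c=1.2):
    """If element j is wider than t_c * col_w_ave (about n_c column
    widths), shift its right-hand neighbors into the first None ('zero
    element') -- or grow the row -- then cut it transversely."""
    xl, yu, xr, yd = row[j]
    col_wid = xr - xl
    n_c = round(col_wid / col_w_ave)
    if col_wid <= t_c * col_w_ave or n_c < 2:
        return False                      # nothing to split
    if None in row[j + 1:]:
        z = row.index(None, j + 1)
        row[j + 2:z + 1] = row[j + 1:z]   # right-shift, covering the None
    else:
        row.append(None)
        row[j + 2:] = row[j + 1:-1]       # right-shift into the new slot
    mid = xl + round(col_wid / n_c)
    row[j + 1] = (mid + 1, yu, xr, yd)    # inserted right part
    row[j] = (xl, yu, mid, yd)            # truncated original element
    return True
```

The covering of the first "zero element" keeps the matrix rectangular: a split only lengthens the row when no vacancy is available to absorb the shift.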
The column alignment of step 1.6) is realized as follows: scan each column of the circumscribed-rectangle matrix and find the minimum MinXCol of the upper-left vertex x coordinates of the circumscribed rectangles in the column. For every element R(i, j) whose upper-left vertex x coordinate exceeds MinXCol by a distance Dw greater than the threshold Tw, shift right by one position, in sequence, all elements from R(i, j) up to the left neighbor of the first "zero element" on its right, covering that "zero element"; if there is no "zero element", shift right by one position R(i, j) and all elements of row i after it; R(i, j) becomes a "zero element".
The row under-segmentation element splitting of step 1.7) is realized as follows: scan each row of the circumscribed-rectangle matrix. For every element R(i, j) whose row height RowHei = yd(i, j) − yu(i, j) is greater than Tr times RowHAve, shift down by one position, in sequence, all elements from the lower neighbor of R(i, j) up to the upper neighbor of the first "zero element" below, covering that "zero element"; if the current column has no "zero element", shift down by one position all elements from the lower neighbor of R(i, j) onward. Insert at (i + 1, j) a new element with upper-left vertex image coordinates [xl(i, j), yu(i, j) + round(RowHei ÷ Nr) + 1] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where Nr = round(RowHei ÷ RowHAve); modify the lower-right vertex y coordinate yd(i, j) of R(i, j) to yu(i, j) + round(RowHei ÷ Nr).
The row alignment of step 1.8) is realized as follows: scan each row of the circumscribed-rectangle matrix and find the minimum MinYRow of the upper-left vertex y coordinates of the circumscribed rectangles in the row. For every element R(i, j) whose upper-left vertex y coordinate exceeds MinYRow by a distance Dh greater than the threshold Th, shift down by one position, in sequence, all elements from R(i, j) up to the upper neighbor of the first "zero element" below it, covering that "zero element"; if there is no "zero element", shift down by one position R(i, j) and all elements of column j below it; R(i, j) becomes a "zero element".
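The row-alignment shift of step 1.8) can be sketched as follows (the column alignment of step 1.6) is the symmetric operation along the other axis). Assumptions: the matrix is a list of rows of (xl, yu, xr, yd) tuples with `None` as the "zero element", `t_h` stands for the threshold Th, and the function name `align_rows` is hypothetical.

```python
def align_rows(matrix, t_h=40):
    """For each matrix row, find the minimum upper-left-vertex y
    (MinYRow); any element exceeding it by more than t_h is pushed down
    its column, the displaced elements sliding into the first None
    ('zero element') below, or extending the matrix by a new row."""
    i = 0
    while i < len(matrix):
        ys = [e[1] for e in matrix[i] if e is not None]
        if ys:
            min_y = min(ys)
            for j, e in enumerate(matrix[i]):
                if e is not None and e[1] - min_y > t_h:
                    col = [row[j] for row in matrix[i:]]
                    if None in col:
                        z = col.index(None)
                    else:                        # no vacancy: add a row
                        matrix.append([None] * len(matrix[i]))
                        col.append(None)
                        z = len(col) - 1
                    for k in range(z, 0, -1):    # shift down one position
                        matrix[i + k][j] = matrix[i + k - 1][j]
                    matrix[i][j] = None          # R(i, j) -> 'zero element'
        i += 1
```

After alignment, each matrix row gathers rectangles of roughly equal upper-left y, so row indices correspond to physical crop-row positions in the field.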
The invention has the beneficial effects that: the designed soybean crop row segmentation method realizes automatic segmentation of the soybean crop rows in UAV aerial images, laying a technical foundation for soybean breeding and seed selection based on crop-row indexes such as color, area and height.
Drawings
FIG. 1 is a schematic diagram of a soybean crop row segmentation system for an unmanned aerial vehicle aerial image.
FIG. 2 is a flow chart of a soybean crop row segmentation method of an unmanned aerial vehicle aerial image.
FIG. 3 is a flow chart of a crop row left and right boundary and circumscribed rectangle extraction method based on column scanning.
FIG. 4 is a schematic diagram of the extraction and processing principle of the circumscribed rectangle matrix.
FIG. 5 is an example of line segmentation of a soybean crop from an aerial image taken by an unmanned aerial vehicle.
In fig. 1: 1. unmanned aerial vehicle, 2, color camera, 3, soybean field, 4, computer, 5, unmanned aerial vehicle aerial image soybean crop row segmentation software.
Detailed Description
The invention is further illustrated by the following figures and examples.
Fig. 1 illustrates a specific embodiment of the soybean crop row segmentation system based on UAV aerial images. The unmanned aerial vehicle 1 is a DJI MATRICE 600 PRO. The color camera 2 is a SONY α6300. The computer 4 is an S230U Twist notebook computer with 8 GB of memory, an Intel Core i7-3537U @ 2.00 GHz CPU and the Windows 10 operating system. The memory card of the color camera 2 is read by the computer 4 through a memory-card interface.
The soybean crop row segmentation of the aerial image of the unmanned aerial vehicle is realized as follows:
The unmanned aerial vehicle 1, carrying the color camera, flies above the soybean field 3; the color camera 2 captures the optical image of the soybean crop and converts it into an electronic image; the electronic image in the color camera 2 is input into the computer 4; and the soybean crop row image segmentation is performed by the UAV aerial-image soybean crop row segmentation software 5 in the computer 4.
As shown in fig. 2, the method for segmenting the soybean crop row of the unmanned aerial vehicle aerial image in the unmanned aerial vehicle aerial image soybean crop row segmentation software 5 is specifically implemented as follows:
1.1) image segmentation: segment the m-row × n-column (m = 1920, n = 2560) soybean crop image S aerial-photographed by the UAV with an automatic-threshold (OTSU) image segmentation algorithm based on the normalized green-red difference, obtaining a binary image I;
1.2) crop-row upper and lower boundary extraction based on row scanning: transversely divide image I into H image blocks; scan each image block line by line; determine the y coordinates of the upper and lower crop-row boundaries in image I from the changes of the foreground-pixel count between adjacent scan lines, from zero to nonzero (upper boundary) and from nonzero to zero (lower boundary);
1.3) crop-row left and right boundary and circumscribed-rectangle matrix extraction based on column scanning: longitudinally divide image I into V image blocks; scan each image block column by column; extract the left boundary of a crop row where the foreground-pixel count changes from zero to nonzero between adjacent scan columns, and extract the upper-left and lower-left reference vertex pairs of the crop-row circumscribed rectangles there; extract the right boundary where the count changes from nonzero to zero, and extract the upper-right and lower-right reference vertex pairs; obtain the 4 vertices of each crop-row circumscribed rectangle by matching right reference vertex pairs with left reference vertex pairs; after the current column scan finishes, a column of crop-row circumscribed rectangles is obtained, each rectangle being one element that stores the image coordinates of its upper-left and lower-right vertices; from the upper-left and lower-right vertex coordinates of the rectangles in the current column, build circumscribed rectangles for the unmatched (isolated) left and right reference vertex pairs and insert them into the current rectangle column in order; after all columns of the current image block are scanned, the crop-row circumscribed-rectangle matrix of the current image block is obtained and merged into that of the whole image; adjust the starting row of the next image block according to the number of rows already processed in the current block, and scan the next image block, until the column scanning of the whole image is completed;
1.4) minimum and mean calculation of element row height and column width: in the circumscribed rectangle matrix, find the minimum row height (the distance between the upper and lower sides of a circumscribed rectangle) and the minimum column width (the distance between the left and right sides of a circumscribed rectangle), and compute the means RowHAve and ColWAve over the row heights and column widths greater than these minimums;
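The statistic above (means taken only over values greater than the minimum) can be sketched in Python; this is an illustrative reconstruction, not code from the patent, and the element layout, a list of columns of (xl, yu, xr, yd) tuples with (0, 0, 0, 0) standing for a "zero element", is an assumption:

```python
# Hypothetical sketch of step 1.4: minimum row height / column width over
# all non-"zero" elements, then the means RowHAve / ColWAve over the
# values strictly greater than those minimums.
ZERO = (0, 0, 0, 0)

def height_width_stats(matrix):
    elems = [e for col in matrix for e in col if e != ZERO]
    heights = [yd - yu for (xl, yu, xr, yd) in elems]
    widths = [xr - xl for (xl, yu, xr, yd) in elems]
    h_min, w_min = min(heights), min(widths)
    h_above = [h for h in heights if h > h_min]
    w_above = [w for w in widths if w > w_min]
    row_h_ave = sum(h_above) / max(1, len(h_above))   # RowHAve
    col_w_ave = sum(w_above) / max(1, len(w_above))   # ColWAve
    return row_h_ave, col_w_ave
```

Excluding the minimum guards the means against the smallest (often fragmentary) rectangles.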
1.5) column under-segmentation element segmentation: scan each element in the circumscribed rectangle matrix; if the element's column width is Nc times the column-width mean ColWAve, with Nc greater than the threshold Tc (set to 1.2), the element is equally divided in the lateral direction into Nc elements;
1.6) column alignment: scan each column in the circumscribed rectangle matrix; every element whose upper-left vertex x coordinate exceeds the minimum upper-left vertex x coordinate of all elements in the column by more than the threshold Tw (set to 200), together with the elements to its right in the same row, is shifted right in sequence, realizing column alignment of the circumscribed rectangle matrix;
1.7) row under-segmentation element segmentation: scan each element in the circumscribed rectangle matrix; if the element's row height is Nr times the row-height mean RowHAve, with Nr greater than the threshold Tr (set to 1.5), the element is equally divided in the longitudinal direction into Nr elements;
1.8) row alignment: scan each row in the circumscribed rectangle matrix; every element whose upper-left vertex y coordinate exceeds the minimum upper-left vertex y coordinate of all elements in the row by more than the threshold Th (set to 40), together with the elements below it in the same column, is shifted down in sequence, realizing row alignment of the circumscribed rectangle matrix;
1.9) "zero element" segmentation: a "zero element" is a vacant entry of the circumscribed rectangle matrix holding no circumscribed rectangle, i.e. the circumscribed rectangle of the corresponding crop row was not extracted; the upper-left and lower-right vertex image coordinates of a "zero element" are both 0. The upper-left and lower-right vertex coordinates of each "zero element" are determined from the non-"zero elements" in its row and column and from the mean row height. Scan each column of the circumscribed rectangle matrix: take the upper-left and lower-right vertex x coordinates of the first non-"zero element" R1(n, j) in the current column as the upper-left and lower-right vertex x coordinates of every "zero element" R0(z, j) in that column. Then scan the z-th row containing the current column's "zero element" R0(z, j): if a non-"zero element" R1(z, c) exists in the z-th row, take its upper-left and lower-right vertex y coordinates as those of R0(z, j); if the z-th row contains no non-"zero element" and z = 1, take 1 and the row-height mean RowHAve + 1 as the upper-left and lower-right vertex y coordinates of the "zero element"; if the z-th row contains no non-"zero element" and z > 1, take the lower-right vertex y coordinate + 1 and the lower-right vertex y coordinate + 1 + RowHAve of the adjacent element R1(z-1, j) above in the same column as the upper-left and lower-right vertex y coordinates of the "zero element". As shown in fig. 4, R6 and R13 are "zero elements".
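The "zero element" filling rule above can be sketched as follows; this is an illustrative reconstruction under an assumed layout (matrix[j][z] is the element at column j, row z, stored as a mutable list [xl, yu, xr, yd], with [0, 0, 0, 0] as a "zero element"), not the patented code:

```python
# Hypothetical sketch of step 1.9: give each "zero element" x coordinates
# from the first non-zero element of its column and y coordinates from a
# non-zero element in its row, falling back to the row-height mean.
ZERO = [0, 0, 0, 0]

def fill_zero_elements(matrix, row_h_ave):
    for j, col in enumerate(matrix):
        first_nz = next((e for e in col if e != ZERO), None)
        if first_nz is None:
            continue                      # column holds no rectangle at all
        for z, e in enumerate(col):
            if e != ZERO:
                continue
            # x coordinates from the first non-"zero element" of the column
            e[0], e[2] = first_nz[0], first_nz[2]
            # y coordinates from a non-"zero element" in the same row, if any
            row_nz = next((matrix[c][z] for c in range(len(matrix))
                           if c != j and matrix[c][z] != ZERO), None)
            if row_nz is not None:
                e[1], e[3] = row_nz[1], row_nz[3]
            elif z == 0:                  # topmost row: start at y = 1
                e[1], e[3] = 1, row_h_ave + 1
            else:                         # below the already-filled element above
                e[1], e[3] = col[z - 1][3] + 1, col[z - 1][3] + 1 + row_h_ave
    return matrix
```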
The crop row upper and lower boundary extraction based on row scanning in the step 1.2) is realized by the following method:
2.1) set the block Width (set to 400); the binary image I is transversely cut into H = n ÷ Width image blocks, where the H-th image block has size m rows × nf columns with nf = Width + (n mod Width), mod denoting the remainder, and every other image block has size m rows × Width columns;
2.2) scanning the ith row of each image block, wherein i is 1 to m, counting the number RCount of foreground pixels on the ith row:
if i = 1, store RCount in FRCount as the previous-line foreground pixel count for the next line scan, and scan the next line;
if i > 1, FRCount = 0, and RCount > 0, set the crop-row upper-boundary flag UpFlag to 1, store the crop-row upper-boundary ordinate i-1 into UpY, and scan the next line;
if i > 1, FRCount > 0, RCount = 0, and UpFlag = 1, calculate the distance between the upper and lower crop-row boundaries Dud = i - UpY; if Dud is greater than the threshold Tud (set to 20), identify all pixels on line UpY of the current image block as crop-row upper-boundary points and all pixels on line i as crop-row lower-boundary points, reset UpFlag, store RCount in FRCount, and scan the next line. As shown by horizontal lines 3-7, 15 and 16 in fig. 4, the crop-row lower boundaries are correctly extracted, avoiding the failure that would occur if horizontal line 1 were used to extract the upper and lower boundaries. As shown by horizontal lines 11-12 in FIG. 4, the false upper and lower boundaries created by the inter-row weed Y1 are not mistaken for crop-row boundaries, because their distance Dud is less than the threshold Tud;
2.3) repeating the step 2.2) until the line scanning of all the lines of the current image block is finished;
2.4) performing line scanning of the next image block until the line scanning of all the image blocks is completed.
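Steps 2.2)-2.4) can be condensed into a short Python sketch for one image block; this is an illustrative reconstruction, not the patented implementation, assuming a binary block where 1 denotes a foreground pixel:

```python
# Hypothetical sketch of row scanning: an upper boundary at a zero->nonzero
# transition of the per-line foreground count, a lower boundary at a
# nonzero->zero transition, keeping only pairs farther apart than t_ud
# (rejecting narrow inter-row weed bands).
def row_scan_boundaries(block, t_ud=20):
    bounds, fr_count, up_y = [], 0, None
    for i, line in enumerate(block):
        r_count = sum(line)                      # RCount for line i
        if i > 0:
            if fr_count == 0 and r_count > 0:
                up_y = i - 1                     # UpY: last all-background line
            elif fr_count > 0 and r_count == 0 and up_y is not None:
                if i - up_y > t_ud:              # D_ud = i - UpY
                    bounds.append((up_y, i))
                up_y = None                      # reset UpFlag
        fr_count = r_count                       # FRCount for the next line
    return bounds
```

For example, a 30-line-tall foreground band is kept, while a 10-line band (such as a weed strip) is rejected by the Tud test.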
As shown in fig. 3, the extraction of the crop row left and right boundaries and the circumscribed rectangle matrix based on column scanning in step 1.3) includes the following steps:
3.1) column scan: set the block Height for column scanning (set to 960); vertically cut the binary image I into V = m ÷ Height image blocks, where the V-th image block has size mf rows × n columns with mf = Height + (m mod Height), and every other image block has size Height rows × n columns. Perform column scanning on the j-th column of each image block, j = 1 to n, counting the number CCount of foreground pixels in the j-th column. If j = 1, store CCount into FCCount as the previous-column foreground pixel count for the next column scan, and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, i.e. the current image block has finished column scanning, jump to step 3.9);
3.2) extracting the left boundary of the crop row: if the foreground pixel count of the current column is greater than 0 and that of the previous column is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop-row left boundary; execute step 3.3); otherwise, jump to step 3.4);
3.3) identifying the upper-left and lower-left reference vertices of the crop row: scan the current column from top to bottom. If the current pixel of the current column is an upper-boundary pixel, identify it as an upper-left reference vertex, record its image coordinates, and set the upper-left-reference-vertex-found flag leftUFlag to 1. If the current pixel is a lower-boundary pixel, identify it as a lower-left reference vertex, record its image coordinates, and, if leftUFlag = 1, set the upper-left/lower-left reference vertex pair flag leftUDFlag to 1 and clear leftUFlag. Judge whether upper-left and lower-left reference vertices have appeared consecutively, i.e. whether leftUDFlag = 1: if so, this pair of upper-left and lower-left reference vertices is an effective left reference vertex pair of a crop-row circumscribed rectangle; save its image coordinates, increment the count of effective left reference vertex pairs by 1, clear leftUDFlag, and set the adjacent-effective-left-boundary flag leftSFlag to 1. After the current column scan is finished, jump to step 3.1);
3.4) extracting the right boundary of an effectively paired crop row: if the foreground pixel count of the current column equals 0 and that of the previous column is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop-row right boundary; further judge whether an adjacent effective left boundary exists before the current right boundary, i.e. leftSFlag = 1, and whether the distance Dlr between the current right boundary and that adjacent effective left boundary is greater than the threshold Tlr (set to 100). If so, the current right boundary is an effective right boundary; increment the count of crop-row circumscribed-rectangle columns by 1 and jump to step 3.5); otherwise, jump to step 3.1). As shown by vertical lines 8-10 and 17 in fig. 4, the left and right crop-row boundaries are correctly extracted, avoiding the failure that would occur if vertical line 2 were used. As shown by vertical lines 13-14 in FIG. 4, the false left and right boundaries created by the inter-row weed Y2 are not mistaken for crop-row boundaries, because their distance Dlr is less than the threshold Tlr;
3.5) identifying the upper-right and lower-right reference vertices of the crop row: scan the current column from top to bottom. If the current pixel of the current column is an upper-boundary pixel, identify it as an upper-right reference vertex, record its image coordinates, and set the upper-right-reference-vertex-found flag rightUFlag to 1. If the current pixel is a lower-boundary pixel, identify it as a lower-right reference vertex, record its image coordinates, and, if rightUFlag = 1, set the upper-right/lower-right reference vertex pair flag rightUDFlag to 1 and clear rightUFlag. Judge whether upper-right and lower-right reference vertices have appeared consecutively and whether an adjacent effective left boundary exists, i.e. whether rightUDFlag = 1 and leftSFlag = 1: if so, this pair of upper-right and lower-right reference vertices is an effective right reference vertex pair of a crop-row circumscribed rectangle; save its image coordinates, increment the count of effective right reference vertex pairs by 1, clear rightUDFlag, and execute step 3.6). Otherwise, re-execute step 3.5) until the current column scan is complete, then clear leftSFlag and jump to step 3.7);
3.6) extracting matched crop-row circumscribed rectangles based on left and right reference vertex pairs: for the current effective right reference vertex pair, scan the effective left reference vertex pairs on the adjacent left boundary point by point in sequence. If the absolute difference Dlru between the y coordinates of an effective upper-right reference vertex and an effective upper-left reference vertex is less than the threshold Tlru (set to 20), the corresponding effective right and left reference vertex pairs form the 4 reference vertices of an effective crop-row circumscribed rectangle: take the x coordinates of the effective upper-left and lower-right reference vertices as the x coordinates of the rectangle's upper-left and lower-right vertices, and take the larger of the effective upper-left and upper-right reference vertex y coordinates and the smaller of the effective lower-left and lower-right reference vertex y coordinates as the y coordinates of the rectangle's upper-left and lower-right vertices. Store the effective upper-left and lower-right reference vertex x coordinates in LeftX and RightX for subsequent circumscribed-rectangle extraction from isolated vertices, increment the circumscribed-rectangle count by 1, set the matched-left-reference-vertex-pair flag leftMFlag to 1, set rightMFlag to 1, and set the matched-right-vertex-pair-exists flag MatchFlag of the current right boundary to 1; jump to step 3.5). As shown by rectangle R1 in fig. 4, effective upper-left reference vertex A results from the intersection of upper boundary 15 and left boundary 17, effective lower-left reference vertex B from lower boundary 5 and left boundary 17, effective upper-right reference vertex C from upper boundary 16 and right boundary 10, and effective lower-right reference vertex D from lower boundary 6 and right boundary 10. The absolute difference Dlru between the y coordinates of upper-right reference vertex C and upper-left reference vertex A is less than the threshold Tlru, so vertex pair A, B is the upper-left/lower-left reference vertex pair matching upper-right/lower-right reference vertex pair C, D; the x coordinates of reference vertices A and D become the upper-left and lower-right vertex x coordinates of R1, and the y coordinates of reference vertices C and B become its upper-left and lower-right vertex y coordinates;
3.7) extracting circumscribed rectangles from isolated left reference vertices: judge whether the current right boundary's MatchFlag is 1. If so, sequentially scan the left reference vertex pairs on the left boundary matched with the current right boundary for leftMFlag = 0; any such pair is an isolated left reference vertex pair, and the x and y coordinates of its upper-left reference vertex are taken as the upper-left vertex image coordinates, while the saved RightX and the y coordinate of its lower-left reference vertex are taken as the lower-right vertex image coordinates, of a circumscribed rectangle, which is inserted into the current circumscribed-rectangle column in order of upper-left vertex y coordinate. Execute step 3.8). As shown in fig. 4, reference vertex pair E, F is an isolated left reference vertex pair, and R7R13 (the large rectangle corresponding to R7 and R13 before row under-segmentation element segmentation) is the circumscribed rectangle extracted from it.
3.8) extracting circumscribed rectangles from isolated right reference vertices: sequentially scan the right reference vertex pairs on the current right boundary for rightMFlag = 0; any such pair is an isolated right reference vertex pair, and the saved LeftX together with the y coordinate of its upper-right reference vertex are taken as the upper-left vertex image coordinates, while the x and y coordinates of its lower-right reference vertex are taken as the lower-right vertex image coordinates, of a circumscribed rectangle, which is inserted into the current column's circumscribed-rectangle column in order of upper-left vertex y coordinate. Store the lower-right vertex y coordinate of the last circumscribed rectangle of the column in ColMaxY, take the minimum ImBloEndY over the ColMaxY values of all columns, store the foreground pixel count CCount of the current scan column in FCCount, and jump to step 3.1). As shown in fig. 4, reference vertex pair G, H is an isolated right reference vertex pair, and R6R12 (the large rectangle corresponding to R6 and R12 before row under-segmentation element segmentation) is the circumscribed rectangle extracted from it.
3.9) storing the circumscribed-rectangle matrix: if the current image block is the first block of the current image, store the current block's circumscribed-rectangle matrix Mp into the whole-image circumscribed-rectangle matrix Mt. Otherwise, scan each column of Mp and calculate the absolute distance Dpt between xp, the upper-left vertex x coordinate of the first non-"zero element" of the current column of Mp, and xt, the upper-left vertex x coordinate of the first non-"zero element" of each column of Mt: if xp < xt and Dpt > Tpt (set to 200), insert into Mt a new column formed of "zero elements" before the current Mt column and append the current Mp column after the last element of that new column; if Dpt ≤ Tpt, directly append the current Mp column after the last element of the current Mt column; if neither condition is satisfied after traversing all columns of Mt, insert a new column formed of "zero elements" after the last column of Mt and append the current Mp column after the last "zero element" of that new column. Calculate the maximum element count MaxConNo over the columns of Mt, and pad every column having fewer than MaxConNo elements with "zero elements" after its last element until it has MaxConNo elements; execute step 3.10);
3.10) adjusting the start row of the next image block: take ImBloEndY as the end row of the currently processed image block, and remove circumscribed rectangles whose upper-left vertex y coordinate is greater than ImBloEndY; the start row of the next image block is ImBloEndY + 1 and its end row is ImBloEndY + 1 + Height. Jump to step 3.1) to scan the next image block, until the column scanning of the whole image is completed.
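The core boundary logic of steps 3.2) and 3.4) can be condensed into a short Python sketch; this is an illustrative reconstruction only (the reference-vertex and rectangle bookkeeping of steps 3.3) and 3.5)-3.8) is omitted), assuming a binary block where 1 denotes a foreground pixel:

```python
# Hypothetical sketch of column scanning: a left boundary at a
# zero->nonzero transition of the per-column foreground count, a right
# boundary at nonzero->zero, accepted only when the left-right distance
# D_lr exceeds t_lr (rejecting narrow inter-row weed bands).
def column_scan_lr(block, t_lr=100):
    pairs, fc_count, left_x = [], 0, None
    n_cols = len(block[0])
    for j in range(n_cols):
        c_count = sum(line[j] for line in block)   # CCount for column j
        if j > 0:
            if c_count > 0 and fc_count == 0:
                left_x = j                         # candidate left boundary
            elif c_count == 0 and fc_count > 0 and left_x is not None:
                if j - left_x > t_lr:              # effective right boundary
                    pairs.append((left_x, j))
                    left_x = None                  # clear leftSFlag
        fc_count = c_count                         # FCCount for next column
    return pairs
```

Note that a rejected right boundary (a weed band narrower than Tlr) does not clear the stored left boundary outright; the next true crop-row left boundary simply replaces it.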
The column under-segmentation element segmentation in step 1.5) is realized by the following method: scan each column of the circumscribed-rectangle matrix. For an element R(i, j) (i and j being the row and column indices of the element in the circumscribed-rectangle matrix) whose column width ColWid = xr(i, j) - xl(i, j) is greater than Tc times ColWAve, shift right by one position, in sequence, all elements from the right-adjacent element of R(i, j) up to the left-adjacent element of the first "zero element", covering that "zero element"; if the current row has no "zero element", shift right by one position all elements starting from the right-adjacent element of R(i, j). Insert at (i, j+1) a new element with upper-left vertex image coordinates [xl(i, j) + round(ColWid ÷ Nc) + 1, yu(i, j)] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where [xl(i, j), yu(i, j)] and [xr(i, j), yd(i, j)] are the upper-left and lower-right vertex image coordinates of element R(i, j) and Nc = round(ColWid ÷ ColWAve); modify the lower-right vertex x coordinate xr(i, j) of element R(i, j) to xl(i, j) + round(ColWid ÷ Nc). As shown in fig. 4, owing to the inter-row weed Y3, the right boundaries of R2-R7 and the left boundaries of R8-R13 were missing; R2 and R8, R3R4 and R9R10 (the large rectangles corresponding to R3 and R4, and to R9 and R10, before row under-segmentation element segmentation), R5 and R11, R6 and R12, and R7 and R13 show the results after column under-segmentation element segmentation.
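The lateral split itself can be sketched as follows; this is an illustrative reconstruction, not the patented code, splitting a single over-wide element into Nc near-equal parts (the right-shifting of same-row neighbours into "zero elements" described above is omitted, and the element format (xl, yu, xr, yd) is an assumption):

```python
# Hypothetical sketch of column under-segmentation for one element:
# n_c = round(width / col_w_ave); split laterally when n_c exceeds t_c.
def split_wide_element(elem, col_w_ave, t_c=1.2):
    xl, yu, xr, yd = elem
    col_wid = xr - xl
    n_c = round(col_wid / col_w_ave)
    if n_c <= t_c:
        return [elem]                       # not under-segmented
    part_w = round(col_wid / n_c)
    parts, x = [], xl
    for k in range(n_c):
        right = xr if k == n_c - 1 else x + part_w
        parts.append((x, yu, right, yd))    # one sub-element per crop row
        x = right + 1
    return parts
```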
The column alignment in step 1.6) is realized by the following method: scan each column of the circumscribed-rectangle matrix and find the minimum MinXCol of the upper-left vertex x coordinates of the circumscribed rectangles in the column. For each element R(i, j) whose upper-left vertex x coordinate differs from MinXCol by a distance Dw greater than the threshold Tw, shift right by one position, in sequence, all elements from R(i, j) up to the left-adjacent element of the first "zero element" on its right, covering that "zero element"; if no "zero element" exists, shift right by one position R(i, j) and all elements of row i after it; R(i, j) is then changed to a "zero element". As shown in FIG. 4, R14-R19 occupied the positions of columns R8-R13 before column alignment, and column alignment correctly shifts them right into the adjacent column.
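The column-alignment shift can be sketched as follows; this is an illustrative reconstruction under an assumed layout (matrix[i][j] is the element at row i, column j, as an (xl, yu, xr, yd) tuple, with (0, 0, 0, 0) as a "zero element"), not the patented code:

```python
# Hypothetical sketch of column alignment: an element whose upper-left x
# exceeds its column's minimum (MinXCol) by more than t_w is shifted one
# column right, together with its right-hand neighbours, consuming the
# first "zero element"; its old slot becomes a "zero element".
ZERO = (0, 0, 0, 0)

def align_columns(matrix, t_w=200):
    for j in range(len(matrix[0])):
        xs = [row[j][0] for row in matrix if row[j] != ZERO]
        if not xs:
            continue
        min_x = min(xs)                       # MinXCol for column j
        for row in matrix:
            if row[j] != ZERO and row[j][0] - min_x > t_w:
                k = next((c for c in range(j + 1, len(row))
                          if row[c] == ZERO), None)
                if k is None:                 # no "zero element": grow the row
                    row.append(ZERO)
                    k = len(row) - 1
                row[j + 1:k + 1] = [row[j]] + row[j + 1:k]   # shift right by one
                row[j] = ZERO
    return matrix
```

The row alignment of step 1.8) follows the same pattern with rows and columns, x and y, exchanged.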
The row under-segmentation element segmentation in step 1.7) is realized by the following method: scan each row of the circumscribed-rectangle matrix. For an element R(i, j) whose row height RowHei = yd(i, j) - yu(i, j) is greater than Tr times RowHAve, shift down by one position, in sequence, all elements from the element below R(i, j) up to the element above the first "zero element", covering that "zero element"; if the current column has no "zero element", shift down by one position all elements starting from the element below R(i, j). Insert at (i+1, j) a new element with upper-left vertex image coordinates [xl(i, j), yu(i, j) + round(RowHei ÷ Nr) + 1] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where Nr = round(RowHei ÷ RowHAve); modify the lower-right vertex y coordinate yd(i, j) of element R(i, j) to yu(i, j) + round(RowHei ÷ Nr). The results after row under-segmentation element segmentation are shown as R3 and R4, and R9 and R10, in fig. 4;
the row alignment in step 1.8) is realized by the following method: scan each row of the circumscribed-rectangle matrix and find the minimum MinYRow of the upper-left vertex y coordinates of the circumscribed rectangles in the row. For each element R(i, j) whose upper-left vertex y coordinate differs from MinYRow by a distance Dh greater than the threshold Th, shift down by one position, in sequence, all elements from R(i, j) up to the element above the first "zero element" below it, covering that "zero element"; if no "zero element" exists, shift down by one position R(i, j) and all elements of column j below it; R(i, j) is then changed to a "zero element". As shown by R4-R5 and R10-R12 in FIG. 4, row alignment achieves a correct shift down by one row, while R7 achieves a correct shift down by two rows.
In fig. 5, the left image is an unmanned aerial vehicle aerial image of a soybean crop and the right image is the soybean crop-row segmentation result obtained with the method, showing that the method achieves automatic soybean crop-row segmentation of unmanned aerial vehicle aerial images.

Claims (7)

1. An unmanned aerial vehicle aerial image soybean crop row segmentation method is characterized by comprising the following steps:
1.1) image segmentation: carrying out image segmentation on the soybean crop image S of m lines multiplied by n columns aerial photographed by the unmanned aerial vehicle by adopting an automatic threshold image segmentation algorithm based on the normalized green-red difference OTSU to obtain a binary image I;
1.2) extracting upper and lower boundaries of crop rows based on row scanning: transversely dividing an image I into H image blocks; performing line scanning on each image block; determining the y coordinates of the upper and lower boundaries of the crop row in the image I according to the change of the foreground pixel number on the adjacent scanning lines from zero to nonzero and from nonzero to zero;
1.3) extracting the left and right boundaries of crop rows and the external rectangular matrix based on column scanning: longitudinally dividing an image I into V image blocks; performing column scanning on each image block; extracting the left boundary of the crop row according to the change of the foreground pixel number on the adjacent scanning columns from zero to non-zero; extracting a left upper and a left lower reference vertex pair of a crop row external rectangle; extracting the right boundary of the crop row according to the change of the foreground pixel number on the adjacent scanning columns from nonzero to zero; extracting upper right and lower right reference vertex pairs of a crop row external rectangle; simultaneously extracting 4 vertexes of a corresponding crop row external rectangle through matching between the right and left reference vertex pairs; after the current column scanning is finished, obtaining crop row external rectangle columns on the current column, wherein each external rectangle is an element, and each element stores image coordinates of upper left vertex and lower right vertex of the external rectangle; extracting unmatched isolated left and right reference vertex pairs corresponding to crop row external rectangles in the current column according to the left upper vertex image coordinates and the right lower vertex image coordinates of the external rectangles on the current column, and inserting the isolated left and right reference vertex pairs into the current external rectangle column in sequence; after the scanning of the current image block columns is finished, obtaining a crop row external rectangular matrix of the current image block and storing the crop row external rectangular matrix in the crop row external rectangular matrix of the current image; adjusting the initial row of the next image block according to the processed row number of the current image block, and performing column scanning on the next image block 
until the column scanning of the whole image is completed;
1.4) minimum and mean calculation of element row height and column width: in the circumscribed rectangle matrix, find the minimum row height (the distance between the upper and lower sides of a circumscribed rectangle) and the minimum column width (the distance between the left and right sides of a circumscribed rectangle), and compute the means RowHAve and ColWAve over the row heights and column widths greater than these minimums;
1.5) column under-segmentation element segmentation: scanning each element in the circumscribed rectangle matrix; if the element's column width is Nc times the column-width mean, with Nc greater than the threshold Tc, the element is equally divided in the transverse direction into Nc elements;
1.6) column alignment: scanning each column in the circumscribed rectangle matrix; every element whose upper-left vertex x coordinate exceeds the minimum upper-left vertex x coordinate of all elements in the column by more than the threshold Tw, together with the elements to its right in the same row, is shifted right in sequence, realizing column alignment of the circumscribed rectangle matrix;
1.7) row under-segmentation element segmentation: scanning each element in the circumscribed rectangle matrix; if the element's row height is Nr times the row-height mean, with Nr greater than the threshold Tr, the element is equally divided in the longitudinal direction into Nr elements;
1.8) row alignment: scanning each row in the circumscribed rectangle matrix; every element whose upper-left vertex y coordinate exceeds the minimum upper-left vertex y coordinate of all elements in the row by more than the threshold Th, together with the elements below it in the same column, is shifted down in sequence, realizing row alignment of the circumscribed rectangle matrix;
1.9) "zero element" segmentation: a "zero element" is a vacant entry of the circumscribed rectangle matrix holding no circumscribed rectangle, i.e. the circumscribed rectangle of the corresponding crop row was not extracted; the upper-left and lower-right vertex image coordinates of a "zero element" are both 0. The upper-left and lower-right vertex coordinates of each "zero element" are determined from the non-"zero elements" in its row and column and from the mean row height. Scan each column of the circumscribed rectangle matrix: take the upper-left and lower-right vertex x coordinates of the first non-"zero element" R1(n, j) in the current column as the upper-left and lower-right vertex x coordinates of every "zero element" R0(z, j) in that column. Then scan the z-th row containing the current column's "zero element" R0(z, j): if a non-"zero element" R1(z, c) exists in the z-th row, take its upper-left and lower-right vertex y coordinates as those of R0(z, j); if the z-th row contains no non-"zero element" and z = 1, take 1 and the row-height mean RowHAve + 1 as the upper-left and lower-right vertex y coordinates of the "zero element"; if the z-th row contains no non-"zero element" and z > 1, take the lower-right vertex y coordinate + 1 and the lower-right vertex y coordinate + 1 + RowHAve of the adjacent element R1(z-1, j) above in the same column as the upper-left and lower-right vertex y coordinates of the "zero element".
2. The method for segmenting the soybean crop rows based on the aerial image of the unmanned aerial vehicle as claimed in claim 1, wherein the row-scanning-based crop-row upper and lower boundary extraction is realized by the following method:
2.1) setting the block Width; the binary image I is transversely cut into H = n ÷ Width image blocks, where the H-th image block has size m rows × nf columns with nf = Width + (n mod Width), mod denoting the remainder, and every other image block has size m rows × Width columns;
2.2) scanning the ith row of each image block, wherein i is 1 to m, counting the number RCount of foreground pixels of the ith row:
if i = 1, store RCount in FRCount as the previous-line foreground pixel count for the next line scan; if i > 1, FRCount = 0, and RCount > 0, set the crop-row upper-boundary flag UpFlag to 1 and store the crop-row upper-boundary ordinate i-1 into UpY; if i > 1, FRCount > 0, RCount = 0, and UpFlag = 1, calculate the distance between the upper and lower crop-row boundaries Dud = i - UpY; if Dud is greater than the threshold Tud, identify all pixels on line UpY as crop-row upper-boundary points and all pixels on line i as crop-row lower-boundary points, reset UpFlag, and store RCount in FRCount; scan the next line;
2.3) repeating the step 2.2) until the line scanning of all the lines of the current image block is finished;
2.4) performing line scanning of the next image block until the line scanning of all the image blocks is completed.
3. The method for segmenting soybean crop rows based on aerial images of unmanned aerial vehicles according to claim 1, wherein the extraction of the crop row left and right boundaries and the circumscribed rectangular matrix based on column scanning comprises the following steps:
3.1) column scan: set the image block height Height, and longitudinally cut the binary image I into V = ⌊m ÷ Height⌋ image blocks, where the V-th image block has a size of mf rows × n columns with mf = Height + (m mod Height), and every other image block has a size of Height rows × n columns; perform column scanning on the j-th column of each image block, j = 1 to n, and count the number CCount of foreground pixels in the j-th column; if j = 1, store CCount into FCCount as the previous-column foreground pixel count for the next column scan, and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, i.e. the current image block has finished column scanning, jump to step 3.9);
3.2) extract the crop row left boundary: if the foreground pixel count of the current column is greater than 0 and that of the previous column is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop row left boundary, and step 3.3) is executed; otherwise, jump to step 3.4);
3.3) identify the upper left and lower left reference vertices of the crop row: scan the current column from top to bottom; if the current pixel of the current column is an upper boundary pixel, identify it as an upper left reference vertex, record its image coordinates, and set the upper-left-reference-vertex-present flag leftUFlag to 1; if the current pixel of the current column is a lower boundary pixel, identify it as a lower left reference vertex and record its image coordinates, and if leftUFlag = 1, set the upper-left-and-lower-left-reference-vertices-present flag leftUDFlag to 1 and clear leftUFlag; judge whether upper left and lower left reference vertices have appeared in succession, i.e. whether leftUDFlag = 1: if so, this pair of upper left and lower left reference vertices is a valid left reference vertex pair of a crop row circumscribed rectangle; save its image coordinates, increment the valid left reference vertex pair count by 1, clear leftUDFlag, and set the adjacent-valid-left-boundary flag leftSFlag to 1; after the current column scan is finished, jump to step 3.1);
3.4) extract the validly paired crop row right boundary: if the foreground pixel count of the current column equals 0 and that of the previous column is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop row right boundary; further judge whether an adjacent valid left boundary exists before the current right boundary, i.e. whether leftSFlag = 1, and whether the distance Dlr between the current right boundary and that adjacent valid left boundary is greater than a threshold Tlr: if so, the current right boundary is a validly paired right boundary; increment the crop row circumscribed rectangle column count by 1 and jump to step 3.5); otherwise, jump to step 3.1);
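Steps 3.2) and 3.4) amount to detecting left/right column transitions of the foreground and keeping only pairs wider than Tlr. A minimal sketch, assuming `block` is a 2-D 0/1 numpy array and `t_lr` stands for the threshold Tlr (names are illustrative, not from the patent):

```python
import numpy as np

def column_boundaries(block, t_lr):
    """Return (left_col, right_col) pairs of crop-row boundaries.

    A 0 -> >0 transition of the per-column foreground count marks a left
    boundary; a >0 -> 0 transition marks a right boundary, which is kept
    only if the crop row is wider than t_lr (valid pairing).
    """
    pairs = []
    left, prev = None, 0
    counts = block.sum(axis=0)             # CCount for every column at once
    for j, c in enumerate(counts):
        if j > 0:
            if prev == 0 and c > 0:        # left boundary of a crop row
                left = j
            elif prev > 0 and c == 0 and left is not None:
                if j - left > t_lr:        # distance Dlr must exceed Tlr
                    pairs.append((left, j))
                left = None
        prev = c                           # FCCount for the next column
    return pairs
```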
3.5) identify the upper right and lower right reference vertices of the crop row: scan the current column from top to bottom; if the current pixel of the current column is an upper boundary pixel, identify it as an upper right reference vertex, record its image coordinates, and set the upper-right-reference-vertex-present flag rightUFlag to 1; if the current pixel of the current column is a lower boundary pixel, identify it as a lower right reference vertex and record its image coordinates, and if rightUFlag = 1, set the upper-right-and-lower-right-reference-vertices-present flag rightUDFlag to 1 and clear rightUFlag; judge whether upper right and lower right reference vertices have appeared in succession, i.e. whether rightUDFlag = 1, and whether leftSFlag = 1: if so, this pair of upper right and lower right reference vertices is a valid right reference vertex pair of a crop row circumscribed rectangle; save its image coordinates, increment the valid right reference vertex pair count by 1, clear rightUDFlag, and execute step 3.6); otherwise, re-execute step 3.5) until the current column scan is finished, then clear leftSFlag and jump to step 3.7);
3.6) extract the matched crop row circumscribed rectangle based on left and right reference vertex pairs: for the current valid right reference vertex pair, scan the valid left reference vertex pairs on the adjacent left boundary point by point; if the absolute difference Dlru between the y coordinates of the valid upper right reference vertex and a valid upper left reference vertex is less than a threshold Tlru, the corresponding valid right and left reference vertex pairs form the 4 reference vertices of a valid crop row circumscribed rectangle: take the x coordinates of the valid upper left and lower right reference vertices as the x coordinates of the upper left and lower right vertices of the circumscribed rectangle, and take the larger of the valid upper left and upper right reference vertex y coordinates and the smaller of the valid lower left and lower right reference vertex y coordinates as the y coordinates of the upper left and lower right vertices of the circumscribed rectangle; at the same time store the x coordinates of the valid upper left and lower right reference vertices into LeftX and RightX for subsequent circumscribed rectangle extraction from isolated reference vertices; increment the circumscribed rectangle count by 1; set the matched flags of the matched left and right reference vertex pairs, leftMFlag = 1 and rightMFlag = 1, and set the matched-right-reference-vertex-pair-present flag MatchFlag of the current right boundary to 1; jump to step 3.5);
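The matching rule of step 3.6) can be sketched as below. This is an illustrative simplification, not the patented code: each reference vertex pair is a tuple `(x, y_top, y_bottom)`, and `t_lru` stands for the threshold Tlru.

```python
def match_vertex_pairs(left_pairs, right_pairs, t_lru):
    """Match left/right reference vertex pairs into bounding rectangles.

    Two pairs match when the absolute difference of their top y coordinates
    is below t_lru. The rectangle takes the left x and right x, the larger
    of the two top y's, and the smaller of the two bottom y's, so it stays
    inside both boundaries, as in step 3.6).
    Returns rectangles as (x1, y1, x2, y2).
    """
    rects, used = [], set()
    for rx, rty, rby in right_pairs:
        for k, (lx, lty, lby) in enumerate(left_pairs):
            if k in used:
                continue                    # each left pair matches at most once
            if abs(lty - rty) < t_lru:      # Dlru < Tlru
                rects.append((lx, max(lty, rty), rx, min(lby, rby)))
                used.add(k)
                break
    return rects
```

Left or right pairs that remain unmatched here correspond to the "isolated" reference vertex pairs handled by steps 3.7) and 3.8).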
3.7) extract circumscribed rectangles from isolated left reference vertex pairs: judge whether MatchFlag of the current right boundary is 1; if so, scan in sequence whether any left reference vertex pair paired with the current right boundary has leftMFlag = 0; if so, that left reference vertex pair is an isolated left reference vertex pair: take the x and y coordinates of its upper left reference vertex as the upper left vertex image coordinates of the circumscribed rectangle, and the saved RightX and the y coordinate of its lower left reference vertex as the lower right vertex image coordinates, and insert the circumscribed rectangle into the circumscribed rectangle column of the current column according to its upper left vertex y coordinate; execute step 3.8);
3.8) extract circumscribed rectangles from isolated right reference vertex pairs: scan in sequence whether any right reference vertex pair on the current right boundary has rightMFlag = 0; if so, that right reference vertex pair is an isolated right reference vertex pair: take the saved LeftX and the y coordinate of its upper right reference vertex as the upper left vertex image coordinates of the circumscribed rectangle, and the x and y coordinates of its lower right reference vertex as the lower right vertex image coordinates, and insert the circumscribed rectangle into the circumscribed rectangle column of the current column according to its upper left vertex y coordinate; store the y coordinate of the lower right vertex of the last circumscribed rectangle into ColMaxY, take the minimum ImBloEndY of ColMaxY over all columns, store the foreground pixel count CCount of the current scan column into FCCount, and jump to step 3.1);
3.9) store the circumscribed rectangle matrix: if the current image block is the first block of the current image, store the circumscribed rectangle matrix Mp of the current image block as the whole-image circumscribed rectangle matrix Mt; otherwise, scan each column of Mp and compute the absolute distance Dpt between the upper left vertex x coordinate xp of the first non-"zero element" of the current column of Mp and the upper left vertex x coordinate xt of the first non-"zero element" of each column of Mt: if xp < xt and Dpt > Tpt, insert before the current column of Mt a new column formed of "zero elements", and place the current column of Mp after the last "zero element" of the new column; if Dpt ≤ Tpt, directly place the current column of Mp after the last element of the current column of Mt; if, after traversing all columns of Mt, neither condition is satisfied, insert after the last column of Mt a new column formed of "zero elements", and place the current column of Mp after the last "zero element" of the new column; compute the maximum element count MaxConNo over the columns of Mt, and for every column with fewer than MaxConNo elements, pad "zero elements" after its last element until the column has MaxConNo elements; execute step 3.10);
3.10) adjust the starting row of the next image block: take ImBloEndY as the ending row of the currently processed image block, and remove circumscribed rectangles whose upper left vertex y coordinate is greater than ImBloEndY; the starting row of the next image block is ImBloEndY + 1 and its ending row is ImBloEndY + 1 + Height; jump to step 3.1) to scan the next image block, until column scanning of the whole image is completed.
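Step 3.9) appends each block's rectangle columns to the whole-image matrix, matching columns by x position and padding with "zero elements" to keep the matrix rectangular. The sketch below is a simplified, hypothetical version of that bookkeeping (matrices as lists of columns, each column a list of `(x1, y1, x2, y2)` tuples; `t_pt` stands for Tpt); it is not the patented implementation.

```python
def merge_block_matrix(Mt, Mp, t_pt, zero):
    """Merge a block's rectangle columns Mp into the whole-image matrix Mt.

    A column of Mp whose first x coordinate is within t_pt of an Mt column
    is appended below that column; otherwise it starts a new column,
    inserted to keep columns ordered by x. Shorter columns are padded with
    the 'zero' placeholder so the matrix stays rectangular.
    """
    def first_x(col):
        return next((e[0] for e in col if e != zero), None)

    for pcol in Mp:
        xp = first_x(pcol)
        if xp is None:
            continue                         # all-zero column: nothing to place
        placed = False
        for ti, tcol in enumerate(Mt):
            xt = first_x(tcol)
            if xt is None:
                continue
            if abs(xp - xt) <= t_pt:         # same crop row: append below
                tcol.extend(pcol)
                placed = True
                break
            if xp < xt:                      # new crop row left of this one
                Mt.insert(ti, list(pcol))
                placed = True
                break
        if not placed:
            Mt.append(list(pcol))            # new rightmost crop row
    # pad shorter columns with zero elements (MaxConNo padding in the claim)
    max_len = max(len(c) for c in Mt)
    for c in Mt:
        c.extend([zero] * (max_len - len(c)))
    return Mt
```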
4. The method for segmenting soybean crop rows based on an aerial image of an unmanned aerial vehicle as claimed in claim 1, wherein the column under-segmentation element segmentation is realized as follows: scan each column of the circumscribed rectangle matrix; for every element R(i, j) (i and j being the row and column indices of the element in the circumscribed rectangle matrix) whose column width ColWid = xr(i, j) − xl(i, j) is greater than Tc times the average column width ColWAve, shift all elements from the right adjacent element of R(i, j) up to the left adjacent element of the first "zero element" one position to the right in sequence, covering that "zero element"; if the current row has no "zero element", shift all elements starting from the right adjacent element of R(i, j) one position to the right; insert at (i, j + 1) a new element whose upper left vertex image coordinates are [xl(i, j) + round(ColWid ÷ Nc) + 1, yu(i, j)] and whose lower right vertex image coordinates are [xr(i, j), yd(i, j)], where [xl(i, j), yu(i, j)] and [xr(i, j), yd(i, j)] are the upper left and lower right vertex image coordinates of element R(i, j), and Nc = round(ColWid ÷ ColWAve); modify the lower right vertex x coordinate xr(i, j) of element R(i, j) to xl(i, j) + round(ColWid ÷ Nc).
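The under-segmentation split of claim 4 can be sketched for a single over-wide element of one matrix row. This is a minimal illustration under stated assumptions (rectangles as `(x1, y1, x2, y2)` tuples; `t_c`, `col_w_ave` standing for Tc and ColWAve), not the patented code:

```python
def split_wide_element(row, j, col_w_ave, t_c, zero):
    """Split element j of a matrix row if it is wider than t_c * col_w_ave.

    The element is cut at x1 + round(width / n_c) with
    n_c = round(width / col_w_ave); the right part is inserted at j + 1.
    Elements to the right shift one slot, consuming the first 'zero'
    placeholder if one exists (otherwise the row grows by one element).
    """
    x1, y1, x2, y2 = row[j]
    width = x2 - x1
    if width <= t_c * col_w_ave:
        return row                          # not under-segmented: nothing to do
    n_c = round(width / col_w_ave)
    cut = x1 + round(width / n_c)
    try:
        z = row.index(zero, j + 1)          # first zero element to the right
        del row[z]                          # it is covered by the shift
    except ValueError:
        pass                                # no zero element: row grows
    row[j] = (x1, y1, cut, y2)              # left part keeps the slot
    row.insert(j + 1, (cut + 1, y1, x2, y2))  # right part becomes a new element
    return row
```

The row under-segmentation of claim 6 is the same operation transposed (splitting on y with RowHAve instead of x with ColWAve).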
5. The method for segmenting soybean crop rows based on an aerial image of an unmanned aerial vehicle as claimed in claim 1, wherein the column alignment is realized as follows: scan each column of the circumscribed rectangle matrix and find the minimum MinXCol of the upper left vertex x coordinates of the circumscribed rectangles in the column; for every element R(i, j) whose upper left vertex x coordinate differs from MinXCol by a distance Dw greater than a threshold Tw, shift the elements from R(i, j) up to the left adjacent element of the first "zero element" on its right one position to the right in sequence, covering that "zero element"; if no "zero element" exists, shift R(i, j) and all following elements of the i-th row one position to the right in sequence; R(i, j) is then changed to a "zero element".
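The column alignment of claim 5 can be sketched as follows; an element whose x coordinate is too far from its column's minimum is pushed one column right and replaced by a "zero element". Again a hedged illustration (rectangles as `(x1, y1, x2, y2)` tuples; `t_w` standing for Tw), not the patented code:

```python
def align_columns(M, t_w, zero):
    """Shift misaligned elements of a rectangle matrix one column right.

    M: list of rows, each row a list of (x1, y1, x2, y2) tuples.
    For each column, elements whose upper-left x exceeds the column minimum
    by more than t_w are moved to the next column (consuming the first zero
    element to the right, or growing the row if none exists); the vacated
    slot becomes a zero element.
    """
    rows, cols = len(M), len(M[0])
    for j in range(cols):
        xs = [M[i][j][0] for i in range(rows) if M[i][j] != zero]
        if not xs:
            continue
        min_x = min(xs)                      # MinXCol for this column
        for i in range(rows):
            e = M[i][j]
            if e != zero and e[0] - min_x > t_w:   # Dw > Tw: misaligned
                row = M[i]
                try:
                    z = row.index(zero, j + 1)
                    del row[z]               # covered by the rightward shift
                except ValueError:
                    pass                     # no zero element: row grows
                row.insert(j + 1, e)         # element moves one column right
                row[j] = zero                # old slot becomes a zero element
    return M
```

The row alignment of claim 7 is the same operation transposed (comparing y coordinates against MinYRow and shifting down within a column).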
6. The method for segmenting soybean crop rows based on an aerial image of an unmanned aerial vehicle as claimed in claim 1, wherein the row under-segmentation element segmentation is realized as follows: scan each row of the circumscribed rectangle matrix; for every element R(i, j) whose row height RowHei = yd(i, j) − yu(i, j) is greater than Tr times the average row height RowHAve, shift all elements from the element below R(i, j) up to the element above the first "zero element" one position down in sequence, covering that "zero element"; if the current column has no "zero element", shift all elements starting from the element below R(i, j) one position down; insert at (i + 1, j) a new element whose upper left vertex image coordinates are [xl(i, j), yu(i, j) + round(RowHei ÷ Nr) + 1] and whose lower right vertex image coordinates are [xr(i, j), yd(i, j)], where Nr = round(RowHei ÷ RowHAve); modify the lower right vertex y coordinate yd(i, j) of element R(i, j) to yu(i, j) + round(RowHei ÷ Nr).
7. The method for segmenting soybean crop rows based on an aerial image of an unmanned aerial vehicle as claimed in claim 1, wherein the row alignment is realized as follows: scan each row of the circumscribed rectangle matrix and find the minimum MinYRow of the upper left vertex y coordinates of the circumscribed rectangles in the row; for every element R(i, j) whose upper left vertex y coordinate differs from MinYRow by a distance Dh greater than a threshold Th, shift the elements from R(i, j) up to the element above the first "zero element" below it one position down in sequence, covering that "zero element"; if no "zero element" exists, shift R(i, j) and all elements below it in the j-th column one position down in sequence; R(i, j) is then changed to a "zero element".
CN201910039172.2A 2019-01-16 2019-01-16 Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle Expired - Fee Related CN109859212B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910039172.2A CN109859212B (en) 2019-01-16 2019-01-16 Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910039172.2A CN109859212B (en) 2019-01-16 2019-01-16 Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN109859212A CN109859212A (en) 2019-06-07
CN109859212B true CN109859212B (en) 2020-12-04

Family

ID=66894850

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910039172.2A Expired - Fee Related CN109859212B (en) 2019-01-16 2019-01-16 Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN109859212B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110660075B (en) * 2019-09-20 2023-03-24 中国计量大学 Method for row segmentation of soybean crops adhered to aerial images of unmanned aerial vehicle
CN113807129A (en) * 2020-06-12 2021-12-17 广州极飞科技股份有限公司 Crop area identification method and device, computer equipment and storage medium
CN112183448B (en) * 2020-10-15 2023-05-12 中国农业大学 Method for dividing pod-removed soybean image based on three-level classification and multi-scale FCN
CN113505766B (en) * 2021-09-09 2022-01-04 北京智源人工智能研究院 Image target detection method and device, electronic equipment and storage medium
CN114022534A (en) * 2021-10-22 2022-02-08 上海伯耶信息科技有限公司 Tobacco leaf texture included angle extraction method

Citations (2)

Publication number Priority date Publication date Assignee Title
CN107516068A (en) * 2017-07-26 2017-12-26 福州大学 A kind of method that single ebon hat is extracted in the high resolution image from unmanned plane
CN109087241A (en) * 2018-08-22 2018-12-25 东北农业大学 A kind of agricultural crops image data nondestructive collection method

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN102521596A (en) * 2011-12-09 2012-06-27 中国科学院长春光学精密机械与物理研究所 Method for identifying and dividing crop objects under limited cognitive condition
CN103514459A (en) * 2013-10-11 2014-01-15 中国科学院合肥物质科学研究院 Method and system for identifying crop diseases and pests based on Android mobile phone platform
US9305214B1 (en) * 2013-10-29 2016-04-05 The United States Of America, As Represented By The Secretary Of The Navy Systems and methods for real-time horizon detection in images
CN104952070B (en) * 2015-06-05 2018-04-13 中北大学 A kind of corn field remote sensing image segmentation method of class rectangle guiding
US10325351B2 (en) * 2016-03-11 2019-06-18 Qualcomm Technologies, Inc. Systems and methods for normalizing an image
CN107274418A (en) * 2017-07-07 2017-10-20 江苏省无线电科学研究所有限公司 A kind of crop image partition method based on AP clustering algorithms
CN108416353B (en) * 2018-02-03 2022-12-02 华中农业大学 Method for quickly segmenting rice ears in field based on deep full convolution neural network


Also Published As

Publication number Publication date
CN109859212A (en) 2019-06-07

Similar Documents

Publication Publication Date Title
CN109859212B (en) Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle
US11151723B2 (en) Image segmentation method, apparatus, and fully convolutional network system
CN108470021B (en) Method and device for positioning table in PDF document
CN107590447B (en) Method and device for recognizing word title
CN107506701B (en) Automatic go chess manual recording method based on video recognition technology
CN107609546B (en) Method and device for recognizing word title
US9454836B2 (en) Object display device and object display method
EP2534638B1 (en) Watermark detection using a propagation map
CN108352070B (en) Moving object tracking method, moving object tracking device, and program
CN109934886B (en) Graph filling method and device and interactive intelligent equipment
CN112183301B (en) Intelligent building floor identification method and device
CN109145906B (en) Target object image determination method, device, equipment and storage medium
CN108830278A (en) A kind of character string picture recognition methods
CN110909694A (en) Method, device, terminal and storage medium for acquiring port information of optical splitter
CN111179291A (en) Edge pixel point extraction method and device based on neighborhood relationship
CN108121989A (en) Information processing unit, storage medium and information processing method
CN116229265A (en) Method for automatically and nondestructively extracting phenotype of soybean plants
CN114998445A (en) Image sparse point stereo matching method
CN106356020B (en) LED display display control method and image data dividing method
CN109447970A (en) The image reorientation method based on energy transfer and uniformly scaled
CN111738310B (en) Material classification method, device, electronic equipment and storage medium
CN109255795B (en) Tomato plant edge sorting method
CN109919164B (en) User interface object identification method and device
CN111985508B (en) Target connected domain shape analysis method suitable for linear array CCD
CN110660075B (en) Method for row segmentation of soybean crops adhered to aerial images of unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201204

Termination date: 20220116
