CN109859212A - UAV image soybean crop row segmentation method - Google Patents
- Publication number
- CN109859212A
- Application number
- CN201910039172.2A
- Authority
- CN
- China
- Prior art keywords
- row
- vertex
- column
- coordinate
- boundary rectangle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a UAV image soybean crop row segmentation method. First, the aerial soybean crop image is segmented into a binary image. The upper and lower boundaries of each crop row are then determined by row scanning and the left and right boundaries by column scanning, while the left and right reference vertex pairs of the crop row bounding rectangles are identified. Crop row bounding rectangles are extracted by matching left and right reference vertex pairs, and the bounding rectangles of isolated (unmatched) left and right reference vertex pairs are recovered from bounding rectangles in the same row and column. The resulting bounding rectangle matrix then undergoes column under-segmentation element splitting and column alignment, row under-segmentation element splitting and row alignment, and a "zero element" division operation, after which the crop row bounding rectangles of the whole image are extracted. The invention enables segmentation of crop rows in UAV soybean crop images and lays a technical foundation for soybean breeding and selection work based on indices such as crop row color, area, and height derived from UAV aerial images.
Description
Technical field
The present invention relates to an image processing method, and in particular to a UAV image soybean crop row segmentation method.
Background art
Soybean is an important industrial crop. Breeding and variety selection are key to improving soybean yield and therefore have very important application value.
At present, soybean breeding and selection rely mainly on the personal experience and manual measurement of experts, are often carried out across tens of thousands of varieties, and are very cumbersome, with results that vary with the subjective experience of individual experts. Methods that help automate the soybean breeding and selection process are therefore highly desirable. Shooting field soybean crop images with a UAV and performing selection through aerial image processing is a good solution for automating this process and is attracting the attention of more and more practitioners. However, aerial-image-based soybean selection first requires segmenting the crop rows of the different varieties in the aerial image. At present this work relies mainly on manual operation, which is time-consuming and laborious, so an automatic crop row segmentation method for soybean aerial images is urgently needed.
The present invention achieves automatic segmentation of crop rows in UAV soybean crop images and lays a technical foundation for soybean breeding and selection work based on indices such as crop row color, area, and height derived from UAV aerial images.
Summary of the invention
The purpose of the present invention is to provide a UAV image soybean crop row segmentation method that automatically segments the soybean crop rows in UAV aerial images and extracts the bounding rectangle of each soybean crop row.
The technical solution adopted by the present invention is as follows:
The present invention includes the following steps:
1.1) Image segmentation: apply an OTSU automatic-threshold image segmentation algorithm based on the normalized green-red difference to the m-row × n-column aerial soybean crop image S, obtaining the binary image I;
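As an illustration of step 1.1), the following is a minimal NumPy sketch of Otsu automatic thresholding applied to a normalized green-red difference index. The function name `segment_crop`, the (G - R) / (G + R) form of the index, and the 256-bin quantization are assumptions for illustration, not the patent's exact implementation.

```python
import numpy as np

def segment_crop(rgb):
    """Binarize an RGB aerial image by Otsu thresholding on the
    normalized green-red difference (G - R) / (G + R).
    `rgb` is an (m, n, 3) uint8 array; returns a uint8 binary image."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    ngrd = (g - r) / np.maximum(g + r, 1.0)  # normalized green-red difference
    # Quantize the index to 256 bins, then run Otsu's between-class
    # variance maximization over the histogram.
    q = np.round((ngrd + 1.0) * 127.5).astype(np.int64)
    hist = np.bincount(q.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    levels = np.arange(256)
    w0 = np.cumsum(p)            # class-0 (background) probability
    mu = np.cumsum(p * levels)   # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    t = int(np.nanargmax(sigma_b))
    return (q > t).astype(np.uint8)
```

Applied to an aerial soybean image, this yields a binary image in which green-dominant (crop) pixels are foreground, corresponding to image I of step 1.1).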
1.2) Crop row upper/lower boundary extraction based on row scanning: image I is divided horizontally into H image blocks; each image block is scanned row by row; the y-coordinates of the upper and lower boundaries of each crop row in image I are determined from transitions of the foreground pixel count of adjacent scan rows from zero to non-zero and from non-zero to zero;
1.3) Crop row left/right boundary and bounding rectangle matrix extraction based on column scanning: image I is divided vertically into V image blocks; each image block is scanned column by column; a crop row left boundary is detected from a zero-to-non-zero transition of the foreground pixel count of adjacent scan columns, and the upper-left and lower-left reference vertex pair of the crop row bounding rectangle is extracted; a crop row right boundary is detected from a non-zero-to-zero transition, and the upper-right and lower-right reference vertex pair is extracted; at the same time, the 4 vertices of the corresponding crop row bounding rectangle are obtained by matching between right and left reference vertex pairs. After the current column scan is completed, the bounding rectangle column of the current image column is obtained, in which each bounding rectangle is one element holding the image coordinates of its upper-left and lower-right vertices; the bounding rectangles of the unmatched isolated left and right reference vertex pairs of the current column are extracted from the upper-left and lower-right vertex coordinates of the bounding rectangles already in the column and are inserted in order into the current bounding rectangle column. After the column scan of the current image block is completed, the bounding rectangle matrix of the block is obtained and stored into the crop row bounding rectangle matrix of the whole image; the starting row of the next image block is adjusted according to the number of rows already processed, and the column scan of the next image block is carried out, until the column scan of the whole image is completed;
1.4) Element row height and column width minimum and mean computation: find the minimum row height (distance between the upper and lower edges of a bounding rectangle) and the minimum column width (distance between the left and right edges of a bounding rectangle) over the bounding rectangle matrix, and compute the means RowHAve and ColWAve of the row heights and column widths greater than these minima;
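Step 1.4) can be sketched as follows; the rectangle representation (x_l, y_u, x_r, y_d) and the function name are illustrative assumptions, not from the patent.

```python
import numpy as np

def height_width_stats(rects):
    """Minimum and above-minimum mean of row height and column width
    over bounding rectangles given as (x_l, y_u, x_r, y_d) tuples.
    Returns (RowHAve, ColWAve) as defined in step 1.4)."""
    heights = np.array([y_d - y_u for (_, y_u, _, y_d) in rects], dtype=float)
    widths = np.array([x_r - x_l for (x_l, _, x_r, _) in rects], dtype=float)
    h_min, w_min = heights.min(), widths.min()
    row_h_ave = heights[heights > h_min].mean()  # mean of heights above the minimum
    col_w_ave = widths[widths > w_min].mean()    # mean of widths above the minimum
    return row_h_ave, col_w_ave
```

Excluding the minimum from the mean, as the patent specifies, keeps a single degenerate rectangle from dragging RowHAve and ColWAve down.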
1.5) Column under-segmentation element splitting: scan each element of the bounding rectangle matrix; if an element's width is Nc times the mean column width and the ratio is greater than the threshold Tc, the element is split horizontally into Nc elements;
1.6) Column alignment: scan each column of the bounding rectangle matrix; every element whose upper-left vertex x-coordinate exceeds the minimum upper-left vertex x-coordinate of all elements of the column by more than the threshold Tw is shifted one position to the right, together with the elements to its right in the same row, aligning the columns of the bounding rectangle matrix;
1.7) Row under-segmentation element splitting: scan each element of the bounding rectangle matrix; if an element's row height is Nr times the mean row height and the ratio is greater than the threshold Tr, the element is split vertically into Nr elements;
1.8) Row alignment: scan each row of the bounding rectangle matrix; every element whose upper-left vertex y-coordinate exceeds the minimum upper-left vertex y-coordinate of all elements of the row by more than the threshold Th is shifted one position down, together with the elements below it in the same column, aligning the rows of the bounding rectangle matrix;
1.9) "Zero element" division: a "zero element" is a vacant element of the bounding rectangle matrix with no bounding rectangle, i.e. an element whose crop row bounding rectangle was not extracted; the upper-left and lower-right vertex image coordinates of a "zero element" are 0. The upper-left and lower-right vertex image coordinates of each "zero element" are determined from the upper-left and lower-right vertex image coordinates of the non-"zero elements" in its row and column and from the mean column width.
The crop row upper/lower boundary extraction based on row scanning in step 1.2) is implemented as follows:
2.1) Set the row scan width Width and cut the binary image I transversely into H = n ÷ Width image blocks, where the H-th block is m rows × nf columns with nf = Width + (n mod Width), mod denoting the remainder, and each remaining image block is m rows × Width columns;
2.2) Scan the i-th row of each image block, i from 1 to m, counting the foreground pixels RCount of the i-th row. If i = 1, store RCount in FRCount as the previous-row foreground pixel count for the next row scan. If i > 1, FRCount equals 0 and RCount is greater than 0, set the crop row upper boundary flag UpFlag = 1 and store the upper boundary ordinate i-1 in UpY. If i > 1, FRCount is greater than 0, RCount equals 0 and UpFlag = 1, compute the crop row upper-lower boundary distance Dud = i - UpY; if Dud is greater than the threshold Tud, mark all pixels of row UpY as crop row upper boundary points and all pixels of row i as crop row lower boundary points, and clear UpFlag. Store RCount in FRCount and scan the next row;
2.3) Repeat step 2.2) until the row scan of all rows of the current image block is completed;
2.4) Carry out the row scan of the next image block, until the row scans of all image blocks are completed.
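For a single image block, steps 2.2)-2.4) reduce to scanning the per-row foreground counts for zero-to-nonzero and nonzero-to-zero transitions. Below is a minimal sketch under that reading, assuming a NumPy binary block, with `t_ud` standing in for the threshold Tud; the patent's flag bookkeeping is folded into one loop.

```python
import numpy as np

def row_boundaries(block, t_ud=20):
    """Detect crop-row (upper, lower) boundary y-coordinates in a binary
    image block by scanning row foreground-pixel counts. A 0 -> nonzero
    transition marks an upper boundary at row i-1; a nonzero -> 0 transition
    marks a lower boundary at row i, kept only if the band is taller than
    t_ud (thin bands, e.g. inter-row weeds, are discarded)."""
    counts = block.sum(axis=1)
    bands, up_y = [], None
    prev = counts[0]
    for i in range(1, len(counts)):
        if prev == 0 and counts[i] > 0:
            up_y = i - 1                 # upper boundary just above the band
        elif prev > 0 and counts[i] == 0 and up_y is not None:
            if i - up_y > t_ud:          # discard bands not taller than t_ud
                bands.append((up_y, i))
            up_y = None
        prev = counts[i]
    return bands
```

The height test `i - up_y > t_ud` is what suppresses the false boundaries produced by inter-row weeds, as described for Fig. 4 in the embodiment.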
The crop row left/right boundary and bounding rectangle matrix extraction based on column scanning in step 1.3) includes the following steps:
3.1) Column scan: set the column scan height Height and cut the binary image I longitudinally into V = m ÷ Height image blocks, where the V-th block is mf rows × n columns with mf = Height + (m mod Height), and each remaining image block is Height rows × n columns. Scan the j-th column of each image block, j from 1 to n, counting the foreground pixels CCount of column j. If j = 1, store CCount in FCCount as the previous-column foreground pixel count for the next column scan and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, the column scan of the current image block is complete, go to step 3.9);
3.2) Crop row left boundary extraction: if the foreground pixel count of the current column is greater than 0 and that of its previous column is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop row left boundary; execute step 3.3). Otherwise, go to step 3.4);
3.3) Crop row upper-left and lower-left reference vertex identification: scan the current column from top to bottom. If the current pixel is an upper boundary pixel, mark it as an upper-left reference vertex, record its image coordinates, and set the upper-left reference vertex flag LeftUFlag = 1. If the current pixel is a lower boundary pixel, mark it as a lower-left reference vertex and record its image coordinates; if LeftUFlag is 1, set the flag LeftUDFlag = 1 indicating that an upper-left and lower-left reference vertex pair exists, and clear LeftUFlag. Judge whether upper-left and lower-left reference vertices occurred consecutively, i.e. whether LeftUDFlag is 1; if so, this upper-left/lower-left pair is a valid left reference vertex pair of a crop row bounding rectangle: save the image coordinates of the pair, increment the valid left reference vertex pair count by 1, clear LeftUDFlag, and set the adjacent-valid-left-boundary flag LeftSFlag = 1. After the current column scan is completed, go to step 3.1);
3.4) Validly paired crop row right boundary extraction: if the foreground pixel count of the current column equals 0 and that of its previous column is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop row right boundary. Further judge whether a valid adjacent left boundary precedes the current right boundary, i.e. LeftSFlag = 1, and whether the distance Dlr between the current right boundary and that adjacent valid left boundary is greater than the threshold Tlr: if so, the current right boundary is a validly paired right boundary, the crop row bounding rectangle column count is incremented by 1, and step 3.5) is executed; otherwise, go to step 3.1);
3.5) Crop row upper-right and lower-right reference vertex identification: scan the current column from top to bottom. If the current pixel is an upper boundary pixel, mark it as an upper-right reference vertex, record its image coordinates, and set the upper-right reference vertex flag RightUFlag = 1. If the current pixel is a lower boundary pixel, mark it as a lower-right reference vertex and record its image coordinates; if RightUFlag is 1, set the flag RightUDFlag = 1 indicating that an upper-right and lower-right reference vertex pair exists, and clear RightUFlag. Judge whether upper-right and lower-right reference vertices occurred consecutively, i.e. whether RightUDFlag is 1, and whether LeftSFlag is 1: if so, this upper-right/lower-right pair is a valid right reference vertex pair of a crop row bounding rectangle: save the image coordinates of the pair, increment the valid right reference vertex pair count by 1, clear RightUDFlag, and execute step 3.6). Otherwise, re-execute step 3.5); after the current column scan is completed, clear LeftSFlag and go to step 3.7);
3.6) Matched crop row bounding rectangle extraction based on left and right reference vertices: for the current valid right reference vertex pair, scan in order the valid left reference vertex pairs on the adjacent left boundary. If the absolute difference Dlru between the y-coordinates of the valid upper-right reference vertex and an upper-left reference vertex is less than the threshold Tlru, the corresponding valid right and left reference vertex pairs form the 4 reference vertices of a valid crop row bounding rectangle: the valid upper-left and lower-right reference vertex x-coordinates become the bounding rectangle's upper-left and lower-right vertex x-coordinates; the larger of the valid upper-left and upper-right reference vertex y-coordinates becomes the upper-left vertex y-coordinate, and the smaller of the valid lower-left and lower-right reference vertex y-coordinates becomes the lower-right vertex y-coordinate. The valid upper-left and lower-right reference vertex x-coordinates are also stored in LeftX and RightX for the subsequent bounding rectangle extraction of isolated reference vertex pairs; the bounding rectangle count is incremented by 1, the matching flags LeftMFlag = 1 and RightMFlag = 1 of the matched left and right reference vertex pairs are set, and the current right boundary's matched-right-reference-vertex-pair flag MatchFlag = 1 is set. Go to step 3.5);
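The vertex-pair matching of step 3.6) can be sketched as follows. Representing each boundary's reference vertex pairs as lists and matching by the nearest upper-vertex y-coordinate are simplifying assumptions that replace the patent's in-scan flag logic; the rectangle-coordinate rules (larger upper y, smaller lower y) follow the step above.

```python
def match_vertex_pairs(left_pairs, right_pairs, t_lru=15):
    """Pair each (upper, lower) reference-vertex pair on a right boundary
    with the left-boundary pair whose upper-vertex y is closest, accepting
    the match only when |y_right - y_left| < t_lru (an assumed threshold
    standing in for Tlru). Each pair is ((x, y_upper), (x, y_lower));
    returns matched rectangles as (x_l, y_u, x_r, y_d)."""
    rects = []
    for (ru, rd) in right_pairs:
        best = min(left_pairs, key=lambda lp: abs(lp[0][1] - ru[1]), default=None)
        if best is None or abs(best[0][1] - ru[1]) >= t_lru:
            continue                      # no left pair close enough: leave isolated
        (lu, ld) = best
        y_u = max(lu[1], ru[1])           # larger of the two upper-vertex y's
        y_d = min(ld[1], rd[1])           # smaller of the two lower-vertex y's
        rects.append((lu[0], y_u, rd[0], y_d))
    return rects
```

Since image y grows downward, taking the larger upper y and the smaller lower y keeps the rectangle inside both boundary extents.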
3.7) Bounding rectangle extraction of isolated left reference vertex pairs: judge whether MatchFlag of the current right boundary is 1; if so, successively scan whether LeftMFlag of each left reference vertex pair on the left boundary matched to the current right boundary is 0. If so, that left reference vertex pair is an isolated left reference vertex pair: its upper-left reference vertex x- and y-coordinates serve as the bounding rectangle's upper-left vertex image coordinates, and the saved RightX and the lower-left reference vertex y-coordinate serve as its lower-right vertex image coordinates; the bounding rectangle is inserted into the current bounding rectangle column in order of its upper-left vertex y-coordinate. Execute step 3.8);
3.8) Bounding rectangle extraction of isolated right reference vertex pairs: successively scan whether RightMFlag of each right reference vertex pair of the current right boundary is 0; if so, that right reference vertex pair is an isolated right reference vertex pair: the saved LeftX and the upper-right reference vertex y-coordinate serve as the bounding rectangle's upper-left vertex image coordinates, and the lower-right reference vertex x- and y-coordinates serve as its lower-right vertex image coordinates; the bounding rectangle is inserted into the current bounding rectangle column in order of its upper-left vertex y-coordinate. Store the lower-right vertex y-coordinate of the last bounding rectangle of the column in ColMaxY, take the minimum ImBloEndY of ColMaxY over all columns, store the current scan column foreground pixel count CCount in FCCount, and go to step 3.1);
3.9) Bounding rectangle matrix storage: if the current image block is the first block of the image, store its bounding rectangle matrix Mp into the whole-image bounding rectangle matrix Mt. Otherwise, scan each column of Mp and compute the absolute distance Dpt between the upper-left vertex abscissa xp of the first non-"zero element" of the current column of Mp and the upper-left vertex abscissa xt of the first non-"zero element" of each column of Mt: if xp < xt and Dpt > Tpt, insert a new column of "zero elements" into Mt before the current column and append the current column of Mp after the last element of the new column; if Dpt ≤ Tpt, append the current column of Mp directly after the last element of the current column of Mt; if neither condition is satisfied after traversing all columns of Mt, insert a new column of "zero elements" after the last column of Mt and append the current column of Mp after the last "zero element" of the new column. Find the maximum element count MaxColNo over the columns of Mt and pad every column holding fewer elements with "zero elements" after its last element until it holds MaxColNo elements; execute step 3.10);
3.10) Next image block starting row adjustment: take ImBloEndY as the terminating row of the processed image block and remove the bounding rectangles whose upper-left vertex y-coordinate is greater than ImBloEndY; the next image block starts at row ImBloEndY + 1 and terminates at row ImBloEndY + 1 + Height. Go to step 3.1) and scan the next image block, until the column scan of the whole image is completed.
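Stripped of the flag and vertex bookkeeping of steps 3.1)-3.10), the left/right boundary detection of steps 3.2) and 3.4) amounts to scanning per-column foreground counts for transitions, the mirror image of the row scan. A minimal sketch under that simplification, with `t_lr` standing in for the threshold Tlr:

```python
import numpy as np

def column_boundaries(block, t_lr=10):
    """Detect crop-row (left, right) boundary column indices in a binary
    image block by scanning column foreground-pixel counts: a 0 -> nonzero
    transition marks a left boundary, nonzero -> 0 a right boundary, and
    the pair is kept only when the span is wider than t_lr."""
    counts = block.sum(axis=0)
    spans, left = [], None
    prev = 0
    for j in range(len(counts)):
        if prev == 0 and counts[j] > 0:
            left = j                          # left boundary column
        elif prev > 0 and counts[j] == 0 and left is not None:
            if j - left > t_lr:
                spans.append((left, j - 1))   # last foreground column
            left = None
        prev = counts[j]
    if left is not None and len(counts) - left > t_lr:
        spans.append((left, len(counts) - 1))  # crop row reaching the block edge
    return spans
```

Combined with the row-scan boundaries, these column spans delimit the crop row bounding rectangles that the full procedure assembles vertex by vertex.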
The column under-segmentation element splitting of step 1.5) is implemented as follows: scan each column of the bounding rectangle matrix. For each element R(i, j) (i and j being the row and column indices of the element in the bounding rectangle matrix) whose column width ColWid = xr(i, j) - xl(i, j) is greater than Tc times ColWAve, shift right by one position all elements from the right-hand neighbour of R(i, j) up to the element just left of the first "zero element", overwriting that "zero element"; if the current row has no "zero element", shift right by one position all elements starting from the right-hand neighbour of R(i, j). Insert a new element at (i, j+1) with upper-left vertex image coordinates [xl(i, j) + round(ColWid ÷ Nc) + 1, yu(i, j)] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where [xl(i, j), yu(i, j)] and [xr(i, j), yd(i, j)] are the upper-left and lower-right vertex image coordinates of R(i, j) and Nc = round(ColWid ÷ ColWAve). Modify the lower-right vertex x-coordinate xr(i, j) of R(i, j) to xl(i, j) + round(ColWid ÷ Nc).
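A simplified sketch of the splitting rule of step 1.5), operating on one rectangle rather than on the matrix with its right-shift bookkeeping; the equal-width sub-rectangles and the function name are illustrative assumptions.

```python
def split_wide(rect, col_w_ave, t_c=1.2):
    """Split a bounding rectangle (x_l, y_u, x_r, y_d) whose width exceeds
    t_c times the mean column width col_w_ave (ColWAve) into
    Nc = round(width / col_w_ave) equal-width rectangles; otherwise
    return it unchanged. t_c = 1.2 follows the embodiment's value for Tc."""
    x_l, y_u, x_r, y_d = rect
    wid = x_r - x_l
    if wid <= t_c * col_w_ave:
        return [rect]                     # not under-segmented: keep as-is
    n_c = max(2, round(wid / col_w_ave))  # number of crop rows merged into one element
    step = wid / n_c
    return [(round(x_l + k * step), y_u, round(x_l + (k + 1) * step), y_d)
            for k in range(n_c)]
```

A rectangle roughly twice the mean column width, i.e. two crop rows merged by touching canopies, is thus split back into two elements.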
The column alignment of step 1.6) is implemented as follows: scan each column of the bounding rectangle matrix and find the minimum MinXCol of the upper-left vertex x-coordinates of the bounding rectangles of the column. For each element R(i, j) whose distance Dw between its upper-left vertex x-coordinate and MinXCol is greater than the threshold Tw, shift right by one position all elements from R(i, j) up to the element just left of the first "zero element" on its right, overwriting that "zero element"; if no "zero element" exists, shift right by one position all elements of row i from R(i, j) onwards. Replace R(i, j) with a "zero element".
The row under-segmentation element splitting of step 1.7) is implemented as follows: scan each row of the bounding rectangle matrix. For each element R(i, j) whose row height RowHei = yd(i, j) - yu(i, j) is greater than Tr times RowHAve, shift down by one position all elements from the element below R(i, j) up to the element just above the first "zero element", overwriting that "zero element"; if the current column has no "zero element", shift down by one position all elements starting from the element below R(i, j). Insert a new element at (i+1, j) with upper-left vertex image coordinates [xl(i, j), yu(i, j) + round(RowHei ÷ Nr) + 1] and lower-right vertex image coordinates [xr(i, j), yd(i, j)], where Nr = round(RowHei ÷ RowHAve). Modify the lower-right vertex y-coordinate yd(i, j) of R(i, j) to yu(i, j) + round(RowHei ÷ Nr).
The row alignment of step 1.8) is implemented as follows: scan each row of the bounding rectangle matrix and find the minimum MinYRow of the upper-left vertex y-coordinates of the bounding rectangles of the row. For each element R(i, j) whose distance Dh between its upper-left vertex y-coordinate and MinYRow is greater than the threshold Th, shift down by one position all elements from R(i, j) up to the element just above the first "zero element" below it, overwriting that "zero element"; if no "zero element" exists, shift down by one position all elements of column j from R(i, j) onwards. Replace R(i, j) with a "zero element".
" neutral element " is divided in the step 1.9), and implementation method is as follows: scanning boundary rectangle matrix respectively arranges, and will work as forefront
The upper left of upper first non-" neutral element " R1 (n, j), bottom right vertex x coordinate are as owning " neutral element " R0's (z, j) on the column
Upper left, bottom right vertex x coordinate;The z row where forefront " neutral element " R0 (z, j) is scanned: if there are non-on z row
" neutral element " R1 (z, c), using the upper left non-" neutral element " R1 (z, c), bottom right vertex y-coordinate as " neutral element " R0
The upper left of (z, j), bottom right vertex y-coordinate;If on z row nothing but " neutral element ", and z=1, then by the 1 and high mean value RowHAve of row
+ 1 as upper left, the bottom right vertex y-coordinate for being somebody's turn to do " neutral element ";If " neutral element " and z > 1 nothing but on z row, will be with " the null element
Bottom right vertex y-coordinate+1, the bottom right vertex y-coordinate+1+RowHAve of the top adjacent element R1 (z-1, j) of element " same column is used as should
The upper left of " neutral element ", bottom right vertex y-coordinate.
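The "zero element" division of step 1.9) can be sketched as follows, with `None` standing for a "zero element". The dictionary-free nested-list matrix and the 0-based indexing (z = 0 corresponding to the patent's z = 1) are illustrative assumptions; the three y-coordinate cases follow the rules above.

```python
def fill_zero_elements(matrix, row_h_ave):
    """Fill 'zero elements' (None) in a rows x cols matrix of rectangles
    (x_l, y_u, x_r, y_d): x-extent comes from the first non-zero element of
    the same column; y-extent comes from a non-zero element of the same row,
    falling back to row 1 rules or to the element above plus the mean row
    height row_h_ave (RowHAve). Modifies and returns the matrix."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    for j in range(n_cols):
        col_ref = next((matrix[i][j] for i in range(n_rows)
                        if matrix[i][j] is not None), None)
        if col_ref is None:
            continue                      # a column of only zero elements: skip
        for z in range(n_rows):
            if matrix[z][j] is not None:
                continue
            row_ref = next((r for r in matrix[z] if r is not None), None)
            if row_ref is not None:       # copy the y-extent of a same-row element
                y_u, y_d = row_ref[1], row_ref[3]
            elif z == 0:                  # first row: start at 1, one mean height tall
                y_u, y_d = 1, round(row_h_ave) + 1
            else:                         # stack below the element above
                above = matrix[z - 1][j]
                y_u, y_d = above[3] + 1, above[3] + 1 + round(row_h_ave)
            matrix[z][j] = (col_ref[0], y_u, col_ref[2], y_d)
    return matrix
```

Because z increases within each column, an element filled at row z-1 is already available when row z falls back to the stack-below case.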
The invention has the following advantages: by designing a UAV image soybean crop row segmentation method, the present invention achieves automatic segmentation of soybean crop rows in UAV images and lays a technical foundation for soybean breeding and selection work based on indices such as crop row color, area, and height derived from UAV aerial images.
Description of the drawings
Fig. 1 is a schematic diagram of the composition of the UAV image soybean crop row segmentation system.
Fig. 2 is a flow chart of the UAV image soybean crop row segmentation method.
Fig. 3 is a flow chart of the column-scan-based crop row left/right boundary and bounding rectangle extraction method.
Fig. 4 is a schematic diagram of the bounding rectangle matrix extraction and processing principle.
Fig. 5 is an example of UAV image soybean crop row segmentation.
In Fig. 1: 1, UAV; 2, color camera; 3, soybean field; 4, computer; 5, UAV image soybean crop row segmentation software.
Specific embodiment
The present invention will be further explained below with reference to the attached drawings and examples.
Fig. 1 illustrates a specific embodiment of the UAV image soybean crop row segmentation system. UAV 1 is a DJI MATRICE 600 PRO. Color camera 2 is a SONY α6300. Computer 4 is an S230u Twist notebook with 8 GB of memory, an Intel Core i7-3537U @ 2.00 GHz CPU, and the Windows 10 operating system. The memory card of color camera 2 is accessed by computer 4 through a memory card interface.
The UAV image soybean crop row segmentation is implemented as follows: UAV 1, carrying the color camera, flies over soybean field 3; color camera 2 receives the soybean crop optical image and converts it into an electronic image; the soybean crop electronic image in color camera 2 is input into computer 4; the UAV image soybean crop row segmentation software 5 in computer 4 performs the soybean crop row image segmentation.
As shown in Fig. 2, the UAV image soybean crop row segmentation method in the segmentation software 5 is implemented as follows:
1.1) Image segmentation: apply an OTSU automatic-threshold image segmentation algorithm based on the normalized green-red difference to the m-row × n-column (m = 1920, n = 2560) aerial soybean crop image S, obtaining the binary image I;
1.2) Crop row upper/lower boundary extraction based on row scanning: image I is divided horizontally into H image blocks; each image block is scanned row by row; the y-coordinates of the upper and lower boundaries of each crop row in image I are determined from transitions of the foreground pixel count of adjacent scan rows from zero to non-zero and from non-zero to zero;
1.3) Crop row left/right boundary and bounding rectangle matrix extraction based on column scanning: image I is divided vertically into V image blocks; each image block is scanned column by column; a crop row left boundary is detected from a zero-to-non-zero transition of the foreground pixel count of adjacent scan columns, and the upper-left and lower-left reference vertex pair of the crop row bounding rectangle is extracted; a crop row right boundary is detected from a non-zero-to-zero transition, and the upper-right and lower-right reference vertex pair is extracted; at the same time, the 4 vertices of the corresponding crop row bounding rectangle are obtained by matching between right and left reference vertex pairs. After the current column scan is completed, the bounding rectangle column of the current image column is obtained, in which each bounding rectangle is one element holding the image coordinates of its upper-left and lower-right vertices; the bounding rectangles of the unmatched isolated left and right reference vertex pairs of the current column are extracted from the upper-left and lower-right vertex coordinates of the bounding rectangles already in the column and are inserted in order into the current bounding rectangle column. After the column scan of the current image block is completed, the bounding rectangle matrix of the block is obtained and stored into the crop row bounding rectangle matrix of the whole image; the starting row of the next image block is adjusted according to the number of rows already processed, and the column scan of the next image block is carried out, until the column scan of the whole image is completed;
1.4) Element row height and column width minimum and mean computation: find the minimum row height (distance between the upper and lower edges of a bounding rectangle) and the minimum column width (distance between the left and right edges of a bounding rectangle) over the bounding rectangle matrix, and compute the means RowHAve and ColWAve of the row heights and column widths greater than these minima;
1.5) Column under-segmentation element splitting: scan each element of the bounding rectangle matrix; if an element's width is Nc times the mean column width and the ratio is greater than the threshold Tc (set to 1.2), the element is split horizontally into Nc elements;
1.6) Column alignment: scan each column of the bounding rectangle matrix; every element whose upper-left vertex x-coordinate exceeds the minimum upper-left vertex x-coordinate of all elements of the column by more than the threshold Tw (set to 200) is shifted one position to the right, together with the elements to its right in the same row, aligning the columns of the bounding rectangle matrix;
1.7) Row under-segmentation element splitting: scan each element of the bounding rectangle matrix; if an element's row height is Nr times the mean row height and the ratio is greater than the threshold Tr (set to 1.5), the element is split vertically into Nr elements;
1.8) Row alignment: scan each row of the bounding rectangle matrix; every element whose upper-left vertex y-coordinate exceeds the minimum upper-left vertex y-coordinate of all elements of the row by more than the threshold Th (set to 40) is shifted one position down, together with the elements below it in the same column, aligning the rows of the bounding rectangle matrix;
1.9) "Zero element" division: a "zero element" is a vacant element of the bounding rectangle matrix with no bounding rectangle, i.e. an element whose crop row bounding rectangle was not extracted; the upper-left and lower-right vertex image coordinates of a "zero element" are 0. The upper-left and lower-right vertex image coordinates of each "zero element" are determined from the upper-left and lower-right vertex image coordinates of the non-"zero elements" in its row and column and from the mean column width.
The crop row upper/lower boundary extraction based on row scanning in step 1.2) is implemented as follows:
2.1) Set the row scan width Width (set to 400) and cut the binary image I transversely into H = n ÷ Width image blocks, where the H-th block is m rows × nf columns with nf = Width + (n mod Width), mod denoting the remainder, and each remaining image block is m rows × Width columns;
2.2) Scan the i-th row of each image block, i from 1 to m, counting the foreground pixels RCount of the i-th row:
If i = 1, store RCount in FRCount as the previous-row foreground pixel count for the next row scan, and scan the next row;
If i > 1, FRCount equals 0 and RCount is greater than 0, set the crop row upper boundary flag UpFlag = 1, store the upper boundary ordinate i-1 in UpY, and scan the next row;
If i > 1, FRCount is greater than 0, RCount equals 0 and UpFlag = 1, compute the crop row upper-lower boundary distance Dud = i - UpY; if Dud is greater than the threshold Tud (set to 20), mark all pixels of row UpY of the current image block as crop row upper boundary points and all pixels of row i as crop row lower boundary points, clear UpFlag, store RCount in FRCount, and scan the next row. As shown by horizontal lines 3-7, 15 and 16 in Fig. 4, the crop row lower boundaries are correctly extracted, avoiding the problem that the crop row upper and lower boundaries cannot be extracted using horizontal line 1; as shown by horizontal lines 11-12 in Fig. 4, the false crop row upper and lower boundaries generated by the inter-row weed Y1 are not misjudged as crop row boundaries, since their spacing Dud is less than the threshold Tud;
2.3) step 2.2) is repeated, until completing the row scanning of all rows of current image block;
2.4) the row scanning of next image block is carried out, until completing the row scanning of all image blocks.
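The scan of steps 2.2)-2.4) can be sketched as follows; the helper name `row_boundaries` and the NumPy 0/1-array input are assumptions for illustration, with local variables standing in for the patent's FRCount/UpY/UpFlag flags:

```python
import numpy as np

def row_boundaries(block, t_ud=20):
    """Scan the rows of a binary block; return (upper, lower) row-index
    pairs whose vertical span exceeds t_ud, mirroring steps 2.2)-2.4)."""
    pairs = []
    fr = 0          # foreground count of the previous row (FRCount)
    up_y = None     # candidate upper-boundary row (UpY)
    for i, row in enumerate(block):
        rc = int(row.sum())             # RCount for the current row
        if i > 0:
            if fr == 0 and rc > 0:      # 0 -> nonzero: upper boundary at i-1
                up_y = i - 1
            elif fr > 0 and rc == 0 and up_y is not None:
                if i - up_y > t_ud:     # reject thin weed strips (D_ud <= T_ud)
                    pairs.append((up_y, i))
                up_y = None
        fr = rc
    return pairs
```

A band of foreground rows taller than T_ud yields one (upper, lower) pair; a narrow weed strip such as Y1 in Fig. 4 is discarded.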
As shown in Fig. 3, the crop-row left/right boundary and bounding-rectangle matrix extraction based on column scanning in step 1.3) comprises the following steps:
3.1) column scan: set the column-scan height Height (set to 960) and cut the binary image I longitudinally into V = m ÷ Height image blocks, where the V-th block is m_f rows × n columns, m_f = Height + (m mod Height), and each remaining block is Height rows × n columns. Scan the j-th column of each image block, j from 1 to n, and count the number of foreground pixels CCount in column j. If j = 1, store CCount in FCCount as the previous-column foreground count for the next scan, and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, the column scanning of the current image block is completed, go to step 3.9);
3.2) crop-row left-boundary extraction: if the foreground count of the current column is greater than 0 and that of the previous column is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop-row left boundary; execute step 3.3). Otherwise, go to step 3.4);
3.3) identification of the crop-row upper-left and lower-left reference vertices: scan the current column from top to bottom. If the current pixel is an upper-boundary pixel, identify it as an upper-left reference vertex, record its image coordinates, and set the upper-left reference-vertex flag LeftUFlag = 1. If the current pixel is a lower-boundary pixel, identify it as a lower-left reference vertex and record its image coordinates; if LeftUFlag is 1, set the flag LeftUDFlag = 1 indicating that both an upper-left and a lower-left reference vertex exist, and clear LeftUFlag. Judge whether the upper-left and lower-left reference vertices occur consecutively, i.e. whether LeftUDFlag is 1; if so, this upper-left/lower-left pair is a valid left reference-vertex pair of a crop-row bounding rectangle: save its image coordinates, increment the valid left reference-vertex pair count by 1, clear LeftUDFlag, and set the adjacent-valid-left-boundary flag LeftSFlag = 1. After the current column scan is completed, go to step 3.1);
3.4) extraction of validly paired crop-row right boundaries: if the foreground count of the current column is 0 and that of the previous column is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop-row right boundary. Further judge whether an adjacent valid left boundary precedes the current right boundary, i.e. LeftSFlag = 1, and whether the distance D_lr between the current right boundary and that adjacent valid left boundary exceeds the threshold T_lr (set to 100): if so, the current right boundary is a valid right boundary; increment the crop-row bounding-rectangle column count by 1 and go to step 3.5); otherwise go to step 3.1). As shown by vertical lines 8-10 and 17 in Fig. 4, the crop-row left and right boundaries are extracted correctly, avoiding the failure of left/right boundary extraction illustrated by vertical line 2. Vertical lines 13-14 in Fig. 4 are false crop-row left/right boundaries produced by the inter-row weed Y2; since their spacing D_lr is less than the threshold T_lr, they are not misjudged as crop-row left and right boundaries;
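The transitions of steps 3.2) and 3.4), with the T_lr weed filter, can be sketched as follows; the function name and the NumPy input are illustrative assumptions:

```python
import numpy as np

def column_boundaries(block, t_lr=100):
    """Column scan (steps 3.2/3.4): a 0->nonzero transition in per-column
    foreground counts opens a crop-row left boundary; the next nonzero->0
    transition is kept as a right boundary only when the left/right spacing
    exceeds t_lr, which filters narrow inter-row weed patches."""
    counts = block.sum(axis=0)
    spans, left, prev = [], None, 0
    for j, c in enumerate(counts):
        if prev == 0 and c > 0:
            left = j                    # crop-row left boundary
        elif prev > 0 and c == 0 and left is not None:
            if j - left > t_lr:         # keep only wide enough spans (D_lr)
                spans.append((left, j))
            left = None
        prev = int(c)
    return spans
```

A wide crop-row span survives; a narrow weed patch such as Y2 in Fig. 4 is rejected.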
3.5) identification of the crop-row upper-right and lower-right reference vertices: scan the current column from top to bottom. If the current pixel is an upper-boundary pixel, identify it as an upper-right reference vertex, record its image coordinates, and set the upper-right reference-vertex flag RightUFlag = 1. If the current pixel is a lower-boundary pixel, identify it as a lower-right reference vertex and record its image coordinates; if RightUFlag is 1, set the flag RightUDFlag = 1 indicating that both an upper-right and a lower-right reference vertex exist, and clear RightUFlag. Judge whether the upper-right and lower-right reference vertices occur consecutively, i.e. whether RightUDFlag is 1, and judge whether LeftSFlag is 1: if so, this upper-right/lower-right pair is a valid right reference-vertex pair of a crop-row bounding rectangle; save its image coordinates, increment the valid right reference-vertex pair count by 1, clear RightUDFlag, and execute step 3.6). Otherwise re-execute step 3.5); after the current column scan is completed, clear LeftSFlag and go to step 3.7);
3.6) extraction of matched crop-row bounding rectangles from left and right reference vertices: for the current valid right reference-vertex pair, scan in order, point by point, the valid left reference-vertex pairs on the adjacent left boundary. If the absolute y-coordinate difference D_lru between the valid upper-right and upper-left reference vertices is less than the threshold T_lru (set to 20), the corresponding valid right and left reference-vertex pairs form the 4 reference vertices of a valid crop-row bounding rectangle: take the x-coordinates of the valid upper-left and lower-right reference vertices as the bounding rectangle's upper-left and lower-right vertex x-coordinates, and take the larger of the valid upper-left and upper-right reference-vertex y-coordinates and the smaller of the valid lower-left and lower-right reference-vertex y-coordinates as its upper-left and lower-right vertex y-coordinates. Meanwhile store the x-coordinates of the valid upper-left and lower-right reference vertices in LeftX and RightX for the subsequent bounding-rectangle extraction of isolated vertex pairs, increment the bounding-rectangle count by 1, set the matched flags LeftMFlag = 1 and RightMFlag = 1 on the matched left and right reference-vertex pairs, and set MatchFlag = 1 to record that the current right boundary already has a matched right vertex pair; go to step 3.5). As shown by rectangle R1 in Fig. 4, the valid upper-left reference vertex A is the intersection of upper boundary 15 and left boundary 17, the valid lower-left reference vertex B the intersection of lower boundary 5 and left boundary 17, the valid upper-right reference vertex C the intersection of upper boundary 16 and right boundary 10, and the valid lower-right reference vertex D the intersection of lower boundary 6 and right boundary 10. The absolute y-coordinate difference D_lru between the upper-right reference vertex C and the upper-left reference vertex A is less than the threshold T_lru, so the left pair A, B matches the right pair C, D; the x-coordinates of reference vertices A and D are the upper-left and lower-right vertex x-coordinates of R1, and the y-coordinates of reference vertices C and B are its upper-left and lower-right vertex y-coordinates;
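The vertex matching of step 3.6) can be sketched as follows; representing a reference-vertex pair as ((x, y_upper), (x, y_lower)) and the function name `match_pairs` are assumptions for illustration:

```python
def match_pairs(left_pairs, right_pairs, t_lru=20):
    """Pair each right reference-vertex pair with a left pair on the
    adjacent left boundary when the upper vertices' y difference is
    below t_lru (step 3.6); returns rectangles as (x_l, y_u, x_r, y_d)."""
    rects = []
    for (rux, ruy), (rdx, rdy) in right_pairs:
        for (lux, luy), (ldx, ldy) in left_pairs:
            if abs(ruy - luy) < t_lru:          # D_lru < T_lru
                rects.append((lux,              # x from the upper-left vertex
                              max(luy, ruy),    # larger upper y
                              rdx,              # x from the lower-right vertex
                              min(ldy, rdy)))   # smaller lower y
                break
    return rects
```

The max/min choices reproduce the patent's rule of taking the tighter of the two candidate upper and lower edges.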
3.7) bounding-rectangle extraction for isolated left reference-vertex pairs: judge whether MatchFlag of the current right boundary is 1. If so, scan in turn the left reference-vertex pairs on the left boundary matched to the current right boundary and check whether their LeftMFlag is 0; any such pair is an isolated left reference-vertex pair. Take the upper-left reference vertex's x- and y-coordinates and the saved RightX together with the lower-left reference vertex's y-coordinate as the upper-left and lower-right vertex image coordinates of its bounding rectangle, and insert this rectangle into the current bounding-rectangle column in order of its upper-left vertex y-coordinate; execute step 3.8). In Fig. 4, reference vertices E, F form an isolated left reference-vertex pair, and R7R13 (denoting the large rectangle corresponding to R7 and R13 before the under-segmentation column splitting) is the bounding rectangle of this isolated left reference-vertex pair;
3.8) bounding-rectangle extraction for isolated right reference-vertex pairs: scan in turn the upper-right reference-vertex pairs of the current right boundary and check whether their RightMFlag is 0; any such pair is an isolated right reference-vertex pair. Take the saved LeftX together with the upper-right reference vertex's y-coordinate, and the lower-right reference vertex's x- and y-coordinates, as the upper-left and lower-right vertex image coordinates of its bounding rectangle, and insert this rectangle into the current column's bounding-rectangle column in order of its upper-left vertex y-coordinate. Store the lower-right vertex y-coordinate of the last bounding rectangle of the column in ColMaxY, take the minimum of ColMaxY over all columns as ImBloEndY, store the current scan column's foreground count CCount in FCCount, and go to step 3.1). In Fig. 4, reference vertices G, H form an isolated right reference-vertex pair, and R6R12 (denoting the large rectangle corresponding to R6 and R12 before the under-segmentation column splitting) is the bounding rectangle of this isolated right reference-vertex pair.
3.9) bounding-rectangle matrix storage: if the current image block is the first block of the image, store the bounding-rectangle matrix M_p of the current image block in the whole-image bounding-rectangle matrix M_t. Otherwise, scan each column of M_p and compute the absolute distance D_pt between x_p, the upper-left vertex abscissa of the first non-"zero element" of the current M_p column, and x_t, the upper-left vertex abscissa of the first non-"zero element" of each M_t column. If x_p < x_t and D_pt > T_pt (set to 200), insert before the current M_t column a new column made of "zero elements" and append the current M_p column after the last element of that new column; if D_pt ≤ T_pt, append the current M_p column directly after the last element of the current M_t column; if the condition is satisfied by no column after traversing all of M_t, insert after the last M_t column a new column made of "zero elements" and append the current M_p column after the last "zero element" of that new column. Find the maximum element count MaxColNo over the columns of M_t, and pad every column with fewer than MaxColNo elements with "zero elements" after its last element until that column has MaxColNo elements; execute step 3.10);
3.10) adjustment of the next image block's starting row: take ImBloEndY as the terminating row of the processed image block and remove the bounding rectangles whose upper-left vertex y-coordinate is greater than ImBloEndY. The next image block starts at row ImBloEndY + 1 and ends at row ImBloEndY + 1 + Height; go to step 3.1) and scan the next image block, until the column scanning of the whole image is completed.
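The column-concatenation rule of step 3.9) can be sketched in simplified form; columns are plain Python lists, the "zero element" padding and the insert-before case are omitted, and the names are assumptions:

```python
def append_block_columns(Mt, Mp, t_pt=200):
    """Append each column of a block matrix Mp under the whole-image matrix
    column whose first rectangle has a close upper-left x (step 3.9,
    simplified). Columns are lists of (x_l, y_u, x_r, y_d) tuples."""
    for col_p in Mp:
        xp = col_p[0][0]                      # x of the column's first rect
        for col_t in Mt:
            if abs(col_t[0][0] - xp) <= t_pt:
                col_t.extend(col_p)           # same crop row: concatenate
                break
        else:
            Mt.append(list(col_p))            # unmatched: start a new column
    return Mt
```

The full method additionally keeps columns x-sorted and pads shorter columns with "zero elements" up to MaxColNo.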
The column under-segmentation element splitting in step 1.5) is implemented as follows: scan each column of the bounding-rectangle matrix. For every element R(i, j) (i, j being the row and column indices of the element in the bounding-rectangle matrix) whose column width ColWid = x_r(i, j) − x_l(i, j) is greater than T_c times ColWAve, move right by one position all elements from the right neighbour of R(i, j) up to the left neighbour of the first "zero element", overwriting that "zero element"; if the current row has no "zero element", move right by one position all elements starting from the right neighbour of R(i, j). Insert a new element at (i, j+1) with upper-left vertex image coordinates [x_l(i, j) + round(ColWid ÷ N_c) + 1, y_u(i, j)] and lower-right vertex image coordinates [x_r(i, j), y_d(i, j)], where [x_l(i, j), y_u(i, j)] and [x_r(i, j), y_d(i, j)] are the upper-left and lower-right vertex image coordinates of R(i, j) and N_c = (ColWid ÷ ColWAve); modify the lower-right vertex x-coordinate x_r(i, j) of R(i, j) to x_l(i, j) + round(ColWid ÷ N_c). In Fig. 4, the presence of the inter-row weed Y3 causes the loss of the right boundaries of R2-R7 and of the left boundaries of R8-R13; R2 and R8, R3R4 and R9R10 (denoting R3 and R4, and R9 and R10, i.e. the large rectangles before the under-segmentation row splitting), R5 and R11, R6 and R12, and R7 and R13 illustrate the result after the column under-segmentation element splitting.
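The column split can be sketched as follows; unlike the patent's in-matrix shift-and-insert, this standalone helper returns the split rectangles directly and generalises the split to N_c equal parts (an assumption for illustration):

```python
def split_wide(rect, col_w_ave, t_c=1.5):
    """Split an over-wide rectangle (x_l, y_u, x_r, y_d) into
    Nc = round(width / col_w_ave) side-by-side parts (step 1.5 sketch)."""
    x_l, y_u, x_r, y_d = rect
    width = x_r - x_l
    n_c = max(1, round(width / col_w_ave))
    if n_c < 2 or width <= t_c * col_w_ave:
        return [rect]                         # not under-segmented
    step = round(width / n_c)
    parts = []
    for k in range(n_c):
        left = x_l + k * step + (1 if k else 0)   # +1 avoids overlap
        right = x_r if k == n_c - 1 else x_l + (k + 1) * step
        parts.append((left, y_u, right, y_d))
    return parts
```

A rectangle about twice the mean column width, such as R2R8 merged by weed Y3, comes back as two side-by-side rectangles.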
The column alignment in step 1.6) is implemented as follows: scan each column of the bounding-rectangle matrix and find the minimum MinXCol of the upper-left vertex x-coordinates of the bounding rectangles in the column. For every element R(i, j) whose upper-left vertex x-coordinate exceeds MinXCol by a distance D_w greater than the threshold T_w, move right by one position all elements from R(i, j) up to the left neighbour of the first "zero element" on its right, overwriting that "zero element"; if no "zero element" exists, move right by one position R(i, j) and all elements of row i to its right; then set R(i, j) to a "zero element". For example, R14-R19 in Fig. 4 occupy the column positions of R8-R13 before the column alignment and are correctly moved to the adjacent right column after it.
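The column alignment can be sketched as follows; representing a "zero element" as None and shifting by list insertion (rather than the patent's bounded shift up to the first "zero element") are simplifying assumptions:

```python
def align_columns(M, t_w=100):
    """Column alignment (step 1.6): in each column, an element whose
    upper-left x exceeds the column minimum by more than t_w belongs to
    the next crop row; shift it (and the rest of its row) one column
    right, leaving a None ("zero element") behind. M is a list of rows,
    each a list of (x_l, y_u, x_r, y_d) tuples or None."""
    n_cols = max(len(r) for r in M)
    for j in range(n_cols):
        col = [r[j] for r in M if j < len(r) and r[j] is not None]
        if not col:
            continue
        min_x = min(e[0] for e in col)        # MinXCol of this column
        for row in M:
            if j < len(row) and row[j] is not None and row[j][0] - min_x > t_w:
                row.insert(j, None)           # shift right; vacate (i, j)
    return M
```

After alignment, the misplaced element sits one column to the right, under the crop row whose x-range it actually matches.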
The row under-segmentation element splitting in step 1.7) is implemented as follows: scan each row of the bounding-rectangle matrix. For every element R(i, j) whose row height RowHei = y_d(i, j) − y_u(i, j) is greater than T_r times RowHAve, move down by one position all elements from the lower neighbour of R(i, j) up to the upper neighbour of the first "zero element" below, overwriting that "zero element"; if the current column has no "zero element", move down by one position all elements starting from the lower neighbour of R(i, j). Insert a new element at (i+1, j) with upper-left vertex image coordinates [x_l(i, j), y_u(i, j) + round(RowHei ÷ N_r) + 1] and lower-right vertex image coordinates [x_r(i, j), y_d(i, j)], where N_r = (RowHei ÷ RowHAve); modify the lower-right vertex y-coordinate y_d(i, j) of R(i, j) to y_u(i, j) + round(RowHei ÷ N_r). R3 and R4, and R9 and R10, in Fig. 4 illustrate the result after the row under-segmentation element splitting;
The row alignment in step 1.8) is implemented as follows: scan each row of the bounding-rectangle matrix and find the minimum MinYRow of the upper-left vertex y-coordinates of the bounding rectangles in the row. For every element R(i, j) whose upper-left vertex y-coordinate exceeds MinYRow by a distance D_h greater than the threshold T_h, move down by one position all elements from R(i, j) up to the upper neighbour of the first "zero element" below, overwriting that "zero element"; if no "zero element" exists, move down by one position R(i, j) and all elements of column j below it; then set R(i, j) to a "zero element". For example, after the row alignment, R4-R5 and R10-R12 in Fig. 4 are correctly moved down one row and R7 is correctly moved down two rows.
The "zero element" division in step 1.9) is implemented as follows: scan each column of the bounding-rectangle matrix and take the upper-left and lower-right vertex x-coordinates of the first non-"zero element" R1(n, j) of the current column as the upper-left and lower-right vertex x-coordinates of every "zero element" R0(z, j) in that column. Scan row z containing the current "zero element" R0(z, j): if a non-"zero element" R1(z, c) exists in row z, take the upper-left and lower-right vertex y-coordinates of R1(z, c) as those of R0(z, j); if row z contains nothing but "zero elements" and z = 1, take 1 and the mean row height RowHAve + 1 as the upper-left and lower-right vertex y-coordinates of the "zero element"; if row z contains nothing but "zero elements" and z > 1, take the lower-right vertex y-coordinate + 1 and the lower-right vertex y-coordinate + 1 + RowHAve of the upper neighbour R1(z−1, j) in the same column as the upper-left and lower-right vertex y-coordinates of the "zero element". The division results for the "zero elements" R6 and R13 are illustrated in Fig. 4.
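The "zero element" coordinate filling of step 1.9) can be sketched as follows; None stands for a "zero element", and each column is assumed to contain at least one extracted rectangle:

```python
def fill_zero(M, row_h_ave):
    """Fill "zero element" coordinates (step 1.9): x from the first real
    element in the same column; y from a real element in the same row,
    else from the element above (or the image top) plus the mean row
    height. M: list of rows of (x_l, y_u, x_r, y_d) tuples or None."""
    n_rows, n_cols = len(M), len(M[0])
    for j in range(n_cols):
        col = [M[i][j] for i in range(n_rows) if M[i][j] is not None]
        x_l, x_r = col[0][0], col[0][2]       # x from first real element
        for i in range(n_rows):
            if M[i][j] is not None:
                continue
            same_row = next((e for e in M[i] if e is not None), None)
            if same_row is not None:          # y from a same-row element
                y_u, y_d = same_row[1], same_row[3]
            elif i == 0:                      # empty first row: image top
                y_u, y_d = 1, row_h_ave + 1
            else:                             # below the element above
                y_u = M[i - 1][j][3] + 1
                y_d = y_u + row_h_ave
            M[i][j] = (x_l, y_u, x_r, y_d)
    return M
```

After filling, every entry of the matrix carries usable vertex coordinates, so per-row statistics can be computed uniformly.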
In Fig. 5, the left image is a UAV image of soybean crops and the right image is the soybean crop-row segmentation result obtained with the present invention; it can be seen that applying the present invention achieves automatic soybean crop-row segmentation of UAV images.
Claims (8)
1. A UAV-image soybean crop-row segmentation method, characterized by comprising the following steps:
1.1) image segmentation: perform image segmentation on an m-row × n-column UAV soybean crop image S using an OTSU automatic-threshold image segmentation algorithm based on the normalized green-red difference, obtaining a binary image I;
1.2) crop-row upper/lower boundary extraction based on row scanning: divide image I transversely into H image blocks; perform row scanning on each image block; determine the y-coordinates in image I of the crop-row upper and lower boundaries from the zero-to-nonzero and nonzero-to-zero changes of the foreground pixel counts of adjacent scan rows;
1.3) crop-row left/right boundary and bounding-rectangle matrix extraction based on column scanning: divide image I longitudinally into V image blocks; perform column scanning on each image block; detect crop-row left boundaries from the zero-to-nonzero changes of the foreground pixel counts of adjacent scan columns and extract the upper-left/lower-left reference-vertex pairs of the crop-row bounding rectangles; detect crop-row right boundaries from the nonzero-to-zero changes of the foreground pixel counts of adjacent scan columns and extract the upper-right/lower-right reference-vertex pairs of the crop-row bounding rectangles; meanwhile extract the 4 vertices of each crop-row bounding rectangle by matching the right and left reference-vertex pairs; after the current column scan is completed, obtain the crop-row bounding-rectangle column of the current column, in which each bounding rectangle is an element holding the upper-left and lower-right vertex image coordinates of that rectangle; from the upper-left and lower-right vertex image coordinates of the bounding rectangles already present in the current column, extract the crop-row bounding rectangles corresponding to the unmatched isolated left and right reference-vertex pairs and insert them into the current bounding-rectangle column in order; after the column scan of the current image block is completed, obtain the crop-row bounding-rectangle matrix of the block and store it in the crop-row bounding-rectangle matrix of the image; adjust the starting row of the next image block according to the number of rows processed in the current image block, and perform the column scan of the next image block, until the column scanning of the whole image is completed;
1.4) computation of the minima and means of element row height and column width: find the minima of the bounding-rectangle row height (the distance between a rectangle's upper and lower edges) and column width (the distance between its left and right edges) in the bounding-rectangle matrix, and compute the means RowHAve, ColWAve of the row heights and column widths greater than these minima;
1.5) column under-segmentation element splitting: scan each element of the bounding-rectangle matrix; if an element's width is N_c times the mean column width and greater than the threshold T_c, split the element transversely into N_c elements;
1.6) column alignment: scan each column of the bounding-rectangle matrix; realize the column alignment of the bounding-rectangle matrix by moving right each element whose upper-left vertex x-coordinate differs from the minimum upper-left vertex x-coordinate of all elements of its column by more than the threshold T_w, together with the elements of its row to its right;
1.7) row under-segmentation element splitting: scan each element of the bounding-rectangle matrix; if an element's row height is N_r times the mean row height and greater than the threshold T_r, split the element longitudinally into N_r elements;
1.8) row alignment: scan each row of the bounding-rectangle matrix; realize the row alignment of the bounding-rectangle matrix by moving down each element whose upper-left vertex y-coordinate differs from the minimum upper-left vertex y-coordinate of all elements of its row by more than the threshold T_h, together with the elements of its column below it;
1.9) "zero element" division: a "zero element" is a vacant entry of the bounding-rectangle matrix, i.e. an entry whose crop-row bounding rectangle was not extracted and whose upper-left and lower-right vertex image coordinates are 0; determine the upper-left and lower-right vertex image coordinates of each "zero element" from the upper-left and lower-right vertex image coordinates of the other, non-"zero elements" in its row and column and from the mean column width.
2. The UAV-image soybean crop-row segmentation method of claim 1, characterized in that the crop-row upper/lower boundary extraction based on row scanning is implemented as follows:
2.1) set the row-scan width Width and cut the binary image I transversely into H = n ÷ Width image blocks, where the H-th block is m rows × n_f columns, n_f = Width + (n mod Width) (mod denotes the remainder), and each remaining block is m rows × Width columns;
2.2) scan the i-th row of each image block, i from 1 to m, and count the foreground pixel number RCount of row i: if i = 1, store RCount in FRCount as the previous-row foreground count for the next scan; if i > 1, FRCount = 0 and RCount > 0, set the crop-row upper-boundary flag UpFlag = 1 and store the crop-row upper-boundary ordinate i − 1 in UpY; if i > 1, FRCount > 0, RCount = 0 and UpFlag = 1, compute the crop-row upper/lower boundary distance D_ud = i − UpY; if D_ud is greater than the threshold T_ud, mark all pixels of row UpY as crop-row upper-boundary points and all pixels of row i as crop-row lower-boundary points, clear UpFlag and store RCount in FRCount; scan the next row;
2.3) repeat step 2.2) until the row scanning of all rows of the current image block is completed;
2.4) scan the next image block, until the row scanning of all image blocks is completed.
3. The UAV-image soybean crop-row segmentation method of claim 1, characterized in that the crop-row left/right boundary and bounding-rectangle matrix extraction based on column scanning comprises the following steps:
3.1) column scan: set the column-scan height Height and cut the binary image I longitudinally into V = m ÷ Height image blocks, where the V-th block is m_f rows × n columns, m_f = Height + (m mod Height), and each remaining block is Height rows × n columns; scan the j-th column of each image block, j from 1 to n, and count the foreground pixel number CCount of column j; if j = 1, store CCount in FCCount as the previous-column foreground count for the next scan and re-execute step 3.1); if j ≤ n, execute step 3.2); if j > n, the column scanning of the current image block is completed, go to step 3.9);
3.2) crop-row left-boundary extraction: if the foreground count of the current column is greater than 0 and that of the previous column is 0, i.e. CCount > 0 and FCCount = 0, the current column is a crop-row left boundary, execute step 3.3); otherwise go to step 3.4);
3.3) identification of the crop-row upper-left and lower-left reference vertices: scan the current column from top to bottom; if the current pixel is an upper-boundary pixel, identify it as an upper-left reference vertex, record its image coordinates and set the upper-left reference-vertex flag LeftUFlag = 1; if the current pixel is a lower-boundary pixel, identify it as a lower-left reference vertex and record its image coordinates, and if LeftUFlag is 1, set the flag LeftUDFlag = 1 indicating that both an upper-left and a lower-left reference vertex exist and clear LeftUFlag; judge whether the upper-left and lower-left reference vertices occur consecutively, i.e. whether LeftUDFlag is 1; if so, this upper-left/lower-left pair is a valid left reference-vertex pair of a crop-row bounding rectangle: save its image coordinates, increment the valid left reference-vertex pair count by 1, clear LeftUDFlag and set the adjacent-valid-left-boundary flag LeftSFlag = 1; after the current column scan is completed, go to step 3.1);
3.4) extraction of validly paired crop-row right boundaries: if the foreground count of the current column is 0 and that of the previous column is greater than 0, i.e. CCount = 0 and FCCount > 0, the current column is a crop-row right boundary; further judge whether an adjacent valid left boundary precedes the current right boundary, i.e. LeftSFlag = 1, and whether the distance D_lr between the current right boundary and that adjacent valid left boundary exceeds the threshold T_lr: if so, the current right boundary is a validly paired right boundary, increment the crop-row bounding-rectangle column count by 1 and go to step 3.5); otherwise go to step 3.1);
3.5) identification of the crop-row upper-right and lower-right reference vertices: scan the current column from top to bottom; if the current pixel is an upper-boundary pixel, identify it as an upper-right reference vertex, record its image coordinates and set the upper-right reference-vertex flag RightUFlag = 1; if the current pixel is a lower-boundary pixel, identify it as a lower-right reference vertex and record its image coordinates, and if RightUFlag is 1, set the flag RightUDFlag = 1 indicating that both an upper-right and a lower-right reference vertex exist and clear RightUFlag; judge whether the upper-right and lower-right reference vertices occur consecutively, i.e. whether RightUDFlag is 1, and judge whether LeftSFlag is 1: if so, this upper-right/lower-right pair is a valid right reference-vertex pair of a crop-row bounding rectangle; save its image coordinates, increment the valid right reference-vertex pair count by 1, clear RightUDFlag and execute step 3.6); otherwise re-execute step 3.5); after the current column scan is completed, clear LeftSFlag and go to step 3.7);
3.6) extraction of matched crop-row bounding rectangles from left and right reference vertices: for the current valid right reference-vertex pair, scan in order, point by point, the valid left reference-vertex pairs on the adjacent left boundary; if the absolute y-coordinate difference D_lru between the valid upper-right and upper-left reference vertices is less than the threshold T_lru, the corresponding valid right and left reference-vertex pairs form the 4 reference vertices of a valid crop-row bounding rectangle: take the x-coordinates of the valid upper-left and lower-right reference vertices as the bounding rectangle's upper-left and lower-right vertex x-coordinates, and the larger of the valid upper-left and upper-right reference-vertex y-coordinates and the smaller of the valid lower-left and lower-right reference-vertex y-coordinates as its upper-left and lower-right vertex y-coordinates; meanwhile store the x-coordinates of the valid upper-left and lower-right reference vertices in LeftX and RightX for the subsequent bounding-rectangle extraction of isolated reference-vertex pairs, increment the bounding-rectangle count by 1, set the matched flags LeftMFlag = 1 and RightMFlag = 1 on the matched left and right reference-vertex pairs, and set MatchFlag = 1 to record that the current right boundary already has a matched right reference-vertex pair; go to step 3.5);
3.7) left datum vertex is isolated to external rectangular extraction: judging whether current right margin MatchFlag is 1;If so, successively
Whether scanning is 0 to LeftMFlag with left datum vertex on the matched left margin of current right margin;If so, this is left with reference to top
Point is to isolate left datum vertex pair, and by upper left datum vertex x, y-coordinate has saved RightX and lower-left datum vertex y-coordinate
Respectively as boundary rectangle upper left, bottom right vertex image coordinate, which is worked as by the insertion of its left upper apex y-coordinate size
In preceding boundary rectangle column;Execute step 3.8);
3.8) right datum vertex is isolated to external rectangular extraction: successively scanning the upper right datum vertex pair of current right margin
Whether RightMFlag is 0, if so, the right datum vertex is to isolate right datum vertex pair, by the LeftX saved and the right side
Upper datum vertex y-coordinate, bottom right datum vertex x, y-coordinate, will respectively as the boundary rectangle upper left, bottom right vertex image coordinate
The boundary rectangle is inserted into current column boundary rectangle column by its left upper apex y-coordinate size;By the column, last boundary rectangle is right
Lower vertex y-coordinate is stored in ColMaxY, all column ColMaxY minimum value ImBloEndY is sought, by current scan list foreground pixel number
CCount is stored in FCCount, gos to step 3.1);
3.9) Bounding rectangle matrix storage: if the current image block is the first block of the current image, store the current block's bounding rectangle matrix Mp in the whole-image bounding rectangle matrix Mt. Otherwise, scan each column of Mp and compute the absolute distance Dpt between xp, the upper-left vertex abscissa of the first non-"neutral element" of the current column of Mp, and xt, the upper-left vertex abscissa of the first non-"neutral element" of each column of Mt: if xp < xt and Dpt > Tpt, insert before the current column of Mt a new column consisting of one "neutral element", and append the current column of Mp after the last element of that new column; if Dpt ≤ Tpt, append the current column of Mp directly after the last element of the current column of Mt; if neither condition is satisfied after traversing all columns of Mt, insert after the last column of Mt a new column consisting of one "neutral element", and append the current column of Mp after the last "neutral element" of that new column. Find MaxColNo, the maximum element count over the columns of Mt, and for every column with fewer elements append "neutral elements" after its last element until the column contains MaxColNo elements; execute step 3.10);
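Step 3.9) merges each block matrix Mp into the whole-image matrix Mt column by column, matching columns through the x coordinate of their first real rectangle. A minimal sketch, assuming columns are Python lists whose "neutral elements" are `None` and whose rectangles are dicts with an `x_ul` key (all names illustrative, not from the patent):

```python
def merge_block_columns(Mt, Mp, Tpt):
    """Append each column of the block matrix Mp to the whole-image matrix
    Mt. A column matches when its first real rectangle starts within Tpt
    pixels of an Mt column; otherwise a new Mt column is created."""
    def first_nonzero_x(col):
        # upper-left x of the first non-"neutral element" in the column
        return next(r["x_ul"] for r in col if r is not None)

    for p_col in Mp:
        xp = first_nonzero_x(p_col)
        placed = False
        for t_idx, t_col in enumerate(Mt):
            xt = first_nonzero_x(t_col)
            if xp < xt and abs(xp - xt) > Tpt:
                # a new crop row appears left of this Mt column: insert a
                # fresh column seeded with one "neutral element"
                Mt.insert(t_idx, [None] + p_col)
                placed = True
                break
            if abs(xp - xt) <= Tpt:
                t_col.extend(p_col)      # same crop row: append below
                placed = True
                break
        if not placed:
            Mt.append([None] + p_col)    # new rightmost crop row

    # pad every column with "neutral elements" up to the maximum column length
    max_len = max(len(c) for c in Mt)
    for c in Mt:
        c.extend([None] * (max_len - len(c)))
    return Mt
```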
3.10) Next image block start row adjustment: take ImBloEndY as the termination row of the currently processed image block, and remove the bounding rectangles whose upper-left vertex y coordinate is greater than ImBloEndY; the starting row of the next image block is ImBloEndY+1 and its termination row is ImBloEndY+1+Height; go to step 3.1) and scan the next image block, until the column scan of the entire image is complete.
4. The unmanned aerial vehicle image soybean crop row segmentation method as described in claim 1, characterized in that the column under-segmentation element splitting is implemented as follows: scan each column of the bounding rectangle matrix; for every element R(i, j) (i and j being the row and column indices of the element in the bounding rectangle matrix) whose column width ColWid = xr(i, j) − xl(i, j) is greater than Tc times ColWAve, shift the sequence of elements from the element adjacent to the right of R(i, j) up to the element adjacent to the left of the first "neutral element" one position to the right, overwriting that "neutral element"; if the current row has no "neutral element", shift all elements starting from the element adjacent to the right of R(i, j) one position to the right. Insert a new element at (i, j+1) whose upper-left vertex image coordinate is [xl(i, j) + round(ColWid ÷ Nc) + 1, yu(i, j)] and whose lower-right vertex image coordinate is [xr(i, j), yd(i, j)], where [xl(i, j), yu(i, j)] and [xr(i, j), yd(i, j)] are the upper-left and lower-right vertex image coordinates of element R(i, j) respectively and Nc = (ColWid ÷ ColWAve); modify the lower-right vertex x coordinate xr(i, j) of element R(i, j) to xl(i, j) + round(ColWid ÷ Nc).
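The column splitting of claim 4 can be sketched as below, assuming matrix rows are Python lists with `None` standing for a "neutral element" and rectangles as `xl`/`xr`/`yu`/`yd` dicts (names illustrative). Nc is read here as the integer quotient ColWid ÷ ColWAve, which the claim's notation suggests but does not state outright:

```python
def split_wide_element(row, j, col_w_ave, Tc):
    """Split an under-segmented (too wide) rectangle at index j of one
    matrix row in place: shift the right-hand neighbours into the first
    "neutral element" slot and insert the cut-off part at (i, j+1)."""
    R = row[j]
    col_wid = R["xr"] - R["xl"]
    if col_wid <= Tc * col_w_ave:
        return                          # not under-segmented: nothing to do
    Nc = int(col_wid // col_w_ave)      # assumption: integer quotient ColWid ÷ ColWAve
    try:
        z = row.index(None, j + 1)      # first "neutral element" right of R
        del row[z]                      # it gets overwritten by the shift
    except ValueError:
        pass                            # no "neutral element": the row grows
    split_x = R["xl"] + round(col_wid / Nc)
    # new element carries the right-hand part of the original rectangle
    row.insert(j + 1, {"xl": split_x + 1, "xr": R["xr"],
                       "yu": R["yu"], "yd": R["yd"]})
    R["xr"] = split_x                   # truncate the original element
```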
5. The unmanned aerial vehicle image soybean crop row segmentation method as described in claim 1, characterized in that the column alignment is implemented as follows: scan each column of the bounding rectangle matrix and find MinXCol, the minimum upper-left vertex x coordinate of the bounding rectangles in the column; for every element R(i, j) whose upper-left vertex x coordinate lies at a distance Dw greater than the threshold Tw from the minimum MinXCol, shift the sequence of elements from R(i, j) up to the element adjacent to the left of the first "neutral element" on its right one position to the right, overwriting that "neutral element"; if there is no "neutral element", shift R(i, j) and all elements of row i to its right one position to the right in sequence; replace R(i, j) with a "neutral element".
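A sketch of the column alignment of claim 5 under the same assumed representation (`None` as the "neutral element", `xl` for the upper-left x; names illustrative):

```python
def align_column_elements(matrix, j, Tw):
    """An element whose upper-left x is more than Tw beyond the column
    minimum belongs to the next crop row over: push it (and its row's
    right-hand elements, up to the first "neutral element") one matrix
    column to the right and leave a "neutral element" behind."""
    min_x = min(matrix[i][j]["xl"] for i in range(len(matrix))
                if matrix[i][j] is not None)
    for row in matrix:
        R = row[j]
        if R is None or R["xl"] - min_x <= Tw:
            continue
        try:
            z = row.index(None, j + 1)  # first "neutral element" on the right
            del row[z]                  # overwritten by the shift
        except ValueError:
            pass                        # no "neutral element": the row grows
        row.insert(j + 1, R)            # shift R and its right neighbours
        row[j] = None                   # leave a "neutral element" behind
```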
6. The unmanned aerial vehicle image soybean crop row segmentation method as described in claim 1, characterized in that the row under-segmentation element splitting is implemented as follows: scan each row of the bounding rectangle matrix; for every element R(i, j) whose row height RowHei = yd(i, j) − yu(i, j) is greater than Tr times RowHAve, shift the sequence of elements from the element adjacent below R(i, j) down to the element adjacent above the first "neutral element" one position downwards, overwriting that "neutral element"; if the current column has no "neutral element", shift all elements starting from the element adjacent below R(i, j) one position downwards. Insert a new element at (i+1, j) whose upper-left vertex image coordinate is [xl(i, j), yu(i, j) + round(RowHei ÷ Nr) + 1] and whose lower-right vertex image coordinate is [xr(i, j), yd(i, j)], where Nr = (RowHei ÷ RowHAve); modify the lower-right vertex y coordinate yd(i, j) of element R(i, j) to yu(i, j) + round(RowHei ÷ Nr).
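The row splitting of claim 6 is the vertical dual of the column split of claim 4. A sketch operating on one matrix column (top to bottom, `None` as the "neutral element"; names illustrative, Nr again read as the integer quotient):

```python
def split_tall_element(col, i, row_h_ave, Tr):
    """Split an under-segmented (too tall) rectangle at index i of one
    matrix column in place: shift the elements below into the first
    "neutral element" slot and insert the cut-off part at (i+1, j)."""
    R = col[i]
    row_hei = R["yd"] - R["yu"]
    if row_hei <= Tr * row_h_ave:
        return                          # not under-segmented: nothing to do
    Nr = int(row_hei // row_h_ave)      # assumption: integer quotient RowHei ÷ RowHAve
    try:
        z = col.index(None, i + 1)      # first "neutral element" below R
        del col[z]                      # overwritten by the shift
    except ValueError:
        pass                            # no "neutral element": the column grows
    split_y = R["yu"] + round(row_hei / Nr)
    # new element carries the lower part of the original rectangle
    col.insert(i + 1, {"xl": R["xl"], "xr": R["xr"],
                       "yu": split_y + 1, "yd": R["yd"]})
    R["yd"] = split_y                   # truncate the original element
```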
7. The unmanned aerial vehicle image soybean crop row segmentation method as described in claim 1, characterized in that the row alignment is implemented as follows: scan each row of the bounding rectangle matrix and find MinYRow, the minimum upper-left vertex y coordinate of the bounding rectangles in the row; for every element R(i, j) whose upper-left vertex y coordinate lies at a distance Dh greater than the threshold Th from the minimum MinYRow, shift the sequence of elements from R(i, j) down to the element adjacent above the first "neutral element" below it one position downwards, overwriting that "neutral element"; if there is no "neutral element", shift R(i, j) and all elements of column j below it one position downwards in sequence; replace R(i, j) with a "neutral element".
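Row alignment (claim 7) is the vertical dual of column alignment; since the matrix is stored row-major here, the downward column shift is written out explicitly. A sketch under the same assumed representation (names illustrative):

```python
def align_row_elements(matrix, i, Th):
    """An element whose upper-left y is more than Th below the row minimum
    is moved down one matrix row (into the first "neutral element" below,
    if any), leaving a "neutral element" in its place."""
    min_y = min(e["yu"] for e in matrix[i] if e is not None)
    for j in range(len(matrix[i])):
        R = matrix[i][j]
        if R is None or R["yu"] - min_y <= Th:
            continue
        col = [matrix[r][j] for r in range(len(matrix))]
        try:
            z = col.index(None, i + 1)  # first "neutral element" below R
        except ValueError:
            z = len(matrix)             # none: grow the matrix by one row
            matrix.append([None] * len(matrix[0]))
        for r in range(z, i, -1):       # shift the column segment downwards
            matrix[r][j] = matrix[r - 1][j]
        matrix[i][j] = None             # leave a "neutral element" behind
```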
8. The unmanned aerial vehicle image soybean crop row segmentation method as described in claim 1, characterized in that the "neutral element" segmentation is implemented as follows: scan each column of the bounding rectangle matrix and take the upper-left and lower-right vertex x coordinates of the first non-"neutral element" R1(n, j) of the current column as the upper-left and lower-right vertex x coordinates of all "neutral elements" R0(z, j) in that column. Then scan row z where each "neutral element" R0(z, j) of the current column lies: if a non-"neutral element" R1(z, c) exists in row z, take the upper-left and lower-right vertex y coordinates of that non-"neutral element" R1(z, c) as the upper-left and lower-right vertex y coordinates of "neutral element" R0(z, j); if row z contains nothing but "neutral elements" and z = 1, take 1 and the mean row height RowHAve + 1 as the "neutral element"'s upper-left and lower-right vertex y coordinates; if row z contains nothing but "neutral elements" and z > 1, take the lower-right vertex y coordinate + 1 and the lower-right vertex y coordinate + 1 + RowHAve of R1(z−1, j), the adjacent element above in the same column, as the "neutral element"'s upper-left and lower-right vertex y coordinates.
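The "neutral element" segmentation of claim 8 gives every placeholder concrete image coordinates. A sketch under the same assumed representation; note the claim numbers rows from 1 while the sketch uses Python's 0-based indices (names illustrative):

```python
def fill_neutral_elements(matrix, row_h_ave):
    """Assign coordinates to every "neutral element" (None): the x span
    comes from the first real rectangle in the same column; the y span
    comes from a real rectangle in the same row if one exists, otherwise
    from the element above (or the image top for the first row)."""
    n_rows, n_cols = len(matrix), len(matrix[0])
    # snapshot which cells were originally "neutral elements"
    zero = [[matrix[i][j] is None for j in range(n_cols)] for i in range(n_rows)]
    for j in range(n_cols):
        ref = next(matrix[i][j] for i in range(n_rows) if not zero[i][j])
        for z in range(n_rows):
            if not zero[z][j]:
                continue
            row_ref = next((matrix[z][c] for c in range(n_cols)
                            if not zero[z][c]), None)
            if row_ref is not None:      # copy the y span of a row neighbour
                yu, yd = row_ref["yu"], row_ref["yd"]
            elif z == 0:                 # all-"neutral" top row: start at image row 1
                yu, yd = 1, row_h_ave + 1
            else:                        # stack below the element above (already filled)
                yu = matrix[z - 1][j]["yd"] + 1
                yd = matrix[z - 1][j]["yd"] + 1 + row_h_ave
            matrix[z][j] = {"xl": ref["xl"], "xr": ref["xr"], "yu": yu, "yd": yd}
```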
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910039172.2A CN109859212B (en) | 2019-01-16 | 2019-01-16 | Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109859212A true CN109859212A (en) | 2019-06-07 |
CN109859212B CN109859212B (en) | 2020-12-04 |
Family
ID=66894850
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910039172.2A Expired - Fee Related CN109859212B (en) | 2019-01-16 | 2019-01-16 | Soybean crop row segmentation method based on aerial image of unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109859212B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102521596A (en) * | 2011-12-09 | 2012-06-27 | 中国科学院长春光学精密机械与物理研究所 | Method for identifying and dividing crop objects under limited cognitive condition |
CN103514459A (en) * | 2013-10-11 | 2014-01-15 | 中国科学院合肥物质科学研究院 | Method and system for identifying crop diseases and pests based on Android mobile phone platform |
CN104952070A (en) * | 2015-06-05 | 2015-09-30 | 中北大学 | Near-rectangle guide based remote-sensing cornfield image segmentation method |
US9305214B1 (en) * | 2013-10-29 | 2016-04-05 | The United States Of America, As Represented By The Secretary Of The Navy | Systems and methods for real-time horizon detection in images |
US20170262962A1 (en) * | 2016-03-11 | 2017-09-14 | Qualcomm Incorporated | Systems and methods for normalizing an image |
CN107274418A (en) * | 2017-07-07 | 2017-10-20 | 江苏省无线电科学研究所有限公司 | A kind of crop image partition method based on AP clustering algorithms |
CN107516068A (en) * | 2017-07-26 | 2017-12-26 | 福州大学 | Method for extracting individual tree crowns from high-resolution unmanned aerial vehicle images |
CN108416353A (en) * | 2018-02-03 | 2018-08-17 | 华中农业大学 | Crop field spike of rice fast partition method based on the full convolutional neural networks of depth |
CN109087241A (en) * | 2018-08-22 | 2018-12-25 | 东北农业大学 | A kind of agricultural crops image data nondestructive collection method |
Non-Patent Citations (2)
Title |
---|
RONG XIANG: "Image segmentation for whole tomato plant recognition at night", 《COMPUTERS AND ELECTRONICS IN AGRICULTURE》 * |
韩永华 et al.: "Crop row extraction from farmland based on wavelet transform and Otsu segmentation", 《电子与信息学报》 (Journal of Electronics & Information Technology) *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110660075A (en) * | 2019-09-20 | 2020-01-07 | 中国计量大学 | Method for row segmentation of soybean crops adhered to aerial images of unmanned aerial vehicle |
CN110660075B (en) * | 2019-09-20 | 2023-03-24 | 中国计量大学 | Method for row segmentation of soybean crops adhered to aerial images of unmanned aerial vehicle |
CN113807129A (en) * | 2020-06-12 | 2021-12-17 | 广州极飞科技股份有限公司 | Crop area identification method and device, computer equipment and storage medium |
CN112183448A (en) * | 2020-10-15 | 2021-01-05 | 中国农业大学 | Hulled soybean image segmentation method based on three-level classification and multi-scale FCN |
CN113505766A (en) * | 2021-09-09 | 2021-10-15 | 北京智源人工智能研究院 | Image target detection method and device, electronic equipment and storage medium |
CN114022534A (en) * | 2021-10-22 | 2022-02-08 | 上海伯耶信息科技有限公司 | Tobacco leaf texture included angle extraction method |
Also Published As
Publication number | Publication date |
---|---|
CN109859212B (en) | 2020-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109859212A (en) | A kind of unmanned plane image soybean crops row dividing method | |
CN103258201B (en) | A kind of form lines extracting method of amalgamation of global and local message | |
CN108960011B (en) | Partially-shielded citrus fruit image identification method | |
CN109800619B (en) | Image recognition method for citrus fruits in mature period | |
WO2005071611A1 (en) | Method for extracting person candidate area in image, person candidate area extraction system, person candidate area extraction program, method for judging top and bottom of person image, system for judging top and bottom, and program for judging top and bottom | |
US8363132B2 (en) | Apparatus for demosaicing colors and method thereof | |
CN108133471B (en) | Robot navigation path extraction method and device based on artificial bee colony algorithm | |
CN111259925A (en) | Method for counting field wheat ears based on K-means clustering and width mutation algorithm | |
CN110570422A (en) | Capsule defect visual detection method based on matrix analysis | |
CN108460333A (en) | Ground detection method and device based on depth map | |
WO2021060077A1 (en) | Fish counting system, fish counting method, and program | |
WO2012172706A1 (en) | Motion image region identification device and method thereof | |
CN111985508B (en) | Target connected domain shape analysis method suitable for linear array CCD | |
CN115049689A (en) | Table tennis identification method based on contour detection technology | |
CN109447970A (en) | The image reorientation method based on energy transfer and uniformly scaled | |
CN111738310B (en) | Material classification method, device, electronic equipment and storage medium | |
CN108388898A (en) | Character identifying method based on connector and template | |
JP4967045B2 (en) | Background discriminating apparatus, method and program | |
CN107368847A (en) | A kind of crop leaf diseases recognition methods and system | |
JP6546385B2 (en) | IMAGE PROCESSING APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM | |
CN114691915A (en) | Method and device for improving tile image recognition through algorithm | |
CN113506242A (en) | Corn aflatoxin detection method based on YOLO | |
CN110660075B (en) | Method for row segmentation of soybean crops adhered to aerial images of unmanned aerial vehicle | |
JP2022091547A5 (en) | ||
CN115661491A (en) | Monitoring method for pest control in tea tree planting |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20201204 Termination date: 20220116 |