CN107957264A - Visual navigation method for tractor rotary tillage based on the new/old soil boundary line - Google Patents

Visual navigation method for tractor rotary tillage based on the new/old soil boundary line

Info

Publication number
CN107957264A
Authority
CN
China
Prior art keywords
image
new
rotary tillage
shearlet
method based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710389147.8A
Other languages
Chinese (zh)
Inventor
卢伟
陈益杉
王家鹏
王新宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Agricultural University
Original Assignee
Nanjing Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Agricultural University filed Critical Nanjing Agricultural University
Publication of CN107957264A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations

Abstract

The present invention proposes a visual navigation method for tractor rotary tillage based on the new/old soil boundary line. Aiming at the crop-row variation and uneven illumination that characterize the working environment during tractor rotary tillage, a method based on guided filtering (Guided Image Filter) and the shearlet transform (Shearlet Transform) is proposed to extract the boundary line between newly tilled and untilled soil and thereby accomplish tractor visual navigation. First, the image is rapidly transferred to the YCrCb color space and guided filtering is applied to the grayscale image; the new/old soil edge information is then extracted with a Shearlet-Canny operator; finally, the visual navigation line is obtained through the Hough transform. The proposed visual navigation method for tractor rotary tillage based on the new/old soil boundary line can be used for intelligent navigation in farmland environments.

Description

Visual navigation method for tractor rotary tillage based on the new/old soil boundary line
Technical field
The present invention relates to an intelligent tractor visual navigation method, in particular a visual navigation method based on the new/old soil boundary line left by a tractor-implement combination working in the field, and belongs to the technical field of agricultural engineering.
Background technology
Against the background of rapidly developing agricultural mechanization and intelligent technology, precision agriculture has made significant progress, particularly in automatic navigation for intelligent tractors. Under farmland working conditions such as crop-row variation and uneven illumination, automatic visual navigation offers a good solution to the limitations of the current state of the art. Existing automatic navigation methods fall into two main categories: centimeter-level satellite precision navigation (GPS, BeiDou), and visual navigation, which is algorithmically complex but low in cost. GPS and BeiDou both require ground-based differential augmentation and are therefore costly, and because of geographic location and meteorological conditions the satellite navigation signal in the field is subject to interruption and delay. Visual navigation is widely applicable, but current machine-vision-based automatic steering methods for agricultural machinery mostly extract the guidance line by studying the distribution pattern of crop rows; during rotary tillage the field crops have already been harvested, so navigation based on crop rows is difficult. There is therefore an urgent need for a visual navigation algorithm that works during field rotary tillage when no crops are present.
Summary of the invention
To overcome the defects of the prior art, the present invention proposes a visual navigation method for tractor rotary tillage based on the new/old soil boundary line, suitable for intelligent tractor navigation during rotary tillage in working environments with crop-row variation and uneven illumination.
To achieve the above object, the present invention adopts the following technical scheme:
The visual navigation method for tractor rotary tillage based on the new/old soil boundary line of the present invention is implemented according to the following steps:
Step 1: acquire the visual image p(x, y) in front of the tractor with a camera;
Step 2: convert the image to grayscale by the formula f(x, y) = (R(x, y) + G(x, y) + B(x, y))/3 and transform it to the YCrCb color space, where Y = 0.299*R + 0.587*G + 0.114*B, Cr = (R - Y)*0.713 + 128, Cb = (B - Y)*0.564 + 128;
Step 3: apply guided filtering to the image in the YCrCb color space;
The guided filtering, i.e. the solution procedure of the local linear model, is as follows:
Let q be the output pixel value; i and k are pixel indices; I is the input (guidance) image, which may be the image to be filtered itself or another image; a and b are the coefficients of the linear function when the window is centered at k; p is the image to be filtered; ε is a parameter introduced to prevent a from becoming too large and to adjust the filtering effect; the larger ε is, the stronger the filtering; μ_k and σ_k² are the mean and variance of I in the window; p̄_k is the mean of the image p to be filtered in the window; |w| is the number of pixels in the window; i and j are pixels; w_ij is the filter kernel, a function of the guidance image I and independent of p;
Step ①: the image in the YCrCb color space is represented as a two-dimensional function; within a two-dimensional window w_k the output obtained from the input of this function satisfies a linear relationship with the guidance image, i.e. q_i = a_k*I_i + b_k, for every i in w_k;
Step ②: taking the gradient of both sides of the formula in step ① gives ∇q_i = a_k*∇I_i, so the output preserves the edges of the guidance image;
Step ③: the gap between the actual value p and the real output value of the fitted function is E(a_k, b_k) = Σ_{i∈w_k} ((a_k*I_i + b_k - p_i)² + ε*a_k²);
Step ④: a_k and b_k are calculated by the least squares method: a_k = ((1/|w|)*Σ_{i∈w_k} I_i*p_i - μ_k*p̄_k)/(σ_k² + ε), b_k = p̄_k - a_k*μ_k;
Step ⑤: all linear function values of the windows containing point i are averaged to obtain q_i = (1/|w|)*Σ_{k:i∈w_k}(a_k*I_i + b_k) = ā_i*I_i + b̄_i.
Step 4: after guided filtering, extract the edge information of the image under study with the Shearlet-Canny operator;
The algorithm is as follows:
Let the image under study be f[n1, n2]; A_a is the anisotropic dilation matrix; B_s is the shear matrix; a > 0 is the scale parameter; s ∈ R is the shear parameter; t is the translation parameter;
Step a: read in the image f[n1, n2];
Step b: decompose the image f[n1, n2] into a low-pass subband f_a^j and a high-pass subband f_d^j by means of the Laplacian pyramid;
Step c: transform the high-pass subband f_d^j from the Cartesian coordinate system to the pseudo-polar coordinate system, pass the resulting matrix through a frequency-domain subband filter, and transform back from the pseudo-polar to the Cartesian coordinate system;
Step d: using the multi-directional property of the Shearlet transform, obtain several directional sub-image outputs from the transform, where the Shearlet system is expressed as ψ_{a,s,t}(x) = a^(-3/4)*ψ(A_a⁻¹*B_s⁻¹*(x - t)) and the Shearlet transform is SH_ψ f(a, s, t) = <f, ψ_{a,s,t}>; perform Canny edge detection on the sub-images of the different directions to obtain the corresponding edge images;
Step e: apply the inverse Shearlet transform to the directional edge sub-images obtained in step d;
Step f: according to the principle that the edges of different images are complementary, fuse the inverse-transformed images using logical operators;
Step 5: extract the target navigation line by the Hough transform;
The steps of the Hough transform are as follows:
Let m be the slope of the line and c its intercept;
(i) in the image space X-Y, all collinear points (x, y) are described by the line equation y = m*x + c;
(ii) this line is regarded as a set of lines in the parameter space M-C, where each image point (x, y) corresponds to the line c = -x*m + y;
(iii) the original line equation is replaced by the polar (normal) line equation proposed by Duda and Hart, ρ = x*cos θ + y*sin θ, where ρ is the distance from the origin to the line and θ is the angle between the perpendicular from the origin to the line and the positive direction of the x-axis;
(iv) during calculation the parameter space must be discretized, and the accumulator votes are collected at the center point coordinates of each cell.
Step 6: return to step 1.
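For reference only, the six steps above can be chained into a single processing routine. The following is a minimal Python sketch, assuming OpenCV and NumPy are available; the helper names guided_filter, shearlet_canny and hough_navigation_line refer to the illustrative sketches given with the embodiment below and are not part of the patent itself:

    import cv2
    import numpy as np

    def extract_navigation_line(frame_bgr):
        # Step 2: grayscale / YCrCb conversion (Y channel used as the working image)
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        gray = ycrcb[:, :, 0].astype(np.float64) / 255.0
        # Step 3: guided filtering, using the image itself as the guidance image
        filtered = guided_filter(I=gray, p=gray, radius=8, eps=1e-3)
        # Step 4: Shearlet-Canny style edge extraction (directional Canny fusion)
        edges = shearlet_canny(np.uint8(np.clip(filtered, 0.0, 1.0) * 255))
        # Step 5: Hough transform; returns the dominant (rho, theta) line or None
        return hough_navigation_line(edges)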
Compared with the prior art, the beneficial effects of the present invention are as follows:
In the navigation-line extraction results, the effect of guided filtering is more distinct than that of other filtering algorithms and its running time is the shortest; the guidance line extracted by Shearlet-Canny edge detection is the most accurate. Subjective evaluation also indicates that, under uneven illumination and crop-row variation, the guidance-line extraction method described herein performs well.
By identifying the new/old soil boundary line in the rotary tillage environment with guided filtering and the Shearlet-Canny algorithm, the present invention has the advantages of short processing time and high accuracy, can satisfy the visual navigation needs of an intelligent tractor during field rotary tillage, and has important application value.
Brief description of the drawings
Fig. 1 is the flow chart of the visual navigation method for tractor rotary tillage based on the new/old soil boundary line.
Fig. 2 is the guided filtering schematic in the visual navigation method for tractor rotary tillage based on the new/old soil boundary line.
Fig. 3 is the flow chart of the Shearlet-Canny algorithm in the visual navigation method for tractor rotary tillage based on the new/old soil boundary line.
Embodiment
The invention is further described below in conjunction with the accompanying drawings and a specific embodiment.
Embodiment 1
As shown in Fig. 1, the visual navigation method for tractor rotary tillage based on the new/old soil boundary line is implemented according to the following steps:
Step 1: acquire the visual image p(x, y) in front of the tractor with a camera;
Step 2: convert the image to grayscale by the formula f(x, y) = (R(x, y) + G(x, y) + B(x, y))/3 and transform it to the YCrCb color space, where Y = 0.299*R + 0.587*G + 0.114*B, Cr = (R - Y)*0.713 + 128, Cb = (B - Y)*0.564 + 128;
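As an illustration of Step 2, the following is a minimal NumPy sketch of the grayscale and YCrCb conversion using the formulas above; the array layout and function name are assumptions made for illustration, and an equivalent conversion could be obtained with OpenCV's cv2.cvtColor and the COLOR_RGB2YCrCb flag:

    import numpy as np

    def to_gray_and_ycrcb(rgb):
        # rgb: H x W x 3 array with channels ordered (R, G, B), values in [0, 255]
        rgb = rgb.astype(np.float64)
        R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        gray = (R + G + B) / 3.0                  # f(x, y) = (R + G + B) / 3
        Y = 0.299 * R + 0.587 * G + 0.114 * B     # luminance
        Cr = (R - Y) * 0.713 + 128.0              # red-difference chroma
        Cb = (B - Y) * 0.564 + 128.0              # blue-difference chroma
        return gray, np.stack([Y, Cr, Cb], axis=-1)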
Step 3: apply guided filtering to the image in the YCrCb color space;
As shown in Fig. 2, the guided filtering, i.e. the solution procedure of the local linear model, is as follows:
Let q be the output pixel value; i and k are pixel indices; I is the input (guidance) image, which may be the image to be filtered itself or another image; a and b are the coefficients of the linear function when the window is centered at k; p is the image to be filtered; ε is a parameter introduced to prevent a from becoming too large and to adjust the filtering effect; the larger ε is, the stronger the filtering; μ_k and σ_k² are the mean and variance of I in the window; p̄_k is the mean of the image p to be filtered in the window; |w| is the number of pixels in the window; i and j are pixels; w_ij is the filter kernel, a function of the guidance image I and independent of p;
Step ①: the image in the YCrCb color space is represented as a two-dimensional function; within a two-dimensional window w_k the output obtained from the input of this function satisfies a linear relationship with the guidance image, i.e. q_i = a_k*I_i + b_k, for every i in w_k;
Step ②: taking the gradient of both sides of the formula in step ① gives ∇q_i = a_k*∇I_i, so the output preserves the edges of the guidance image;
Step ③: the gap between the actual value p and the real output value of the fitted function is E(a_k, b_k) = Σ_{i∈w_k} ((a_k*I_i + b_k - p_i)² + ε*a_k²);
Step ④: a_k and b_k are calculated by the least squares method: a_k = ((1/|w|)*Σ_{i∈w_k} I_i*p_i - μ_k*p̄_k)/(σ_k² + ε), b_k = p̄_k - a_k*μ_k;
Step ⑤: all linear function values of the windows containing point i are averaged to obtain q_i = (1/|w|)*Σ_{k:i∈w_k}(a_k*I_i + b_k) = ā_i*I_i + b̄_i.
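A minimal Python sketch of steps ① to ⑤ above, i.e. the standard single-channel guided image filter; the window means are computed with OpenCV box filtering, and the radius and ε defaults are illustrative choices rather than values prescribed by the patent:

    import cv2
    import numpy as np

    def guided_filter(I, p, radius=8, eps=1e-3):
        # I: guidance image, p: image to be filtered, both float arrays in [0, 1]
        ksize = (2 * radius + 1, 2 * radius + 1)
        mean = lambda x: cv2.blur(x, ksize)        # window average (1/|w|) * sum

        mu = mean(I)                               # mu_k: mean of I in the window
        p_bar = mean(p)                            # mean of p in the window
        corr_Ip = mean(I * p)
        var = mean(I * I) - mu * mu                # sigma_k^2: variance of I

        a = (corr_Ip - mu * p_bar) / (var + eps)   # step (4): a_k
        b = p_bar - a * mu                         # step (4): b_k = p_bar_k - a_k * mu_k

        a_bar = mean(a)                            # step (5): average over all windows
        b_bar = mean(b)
        return a_bar * I + b_bar                   # q_i = a_bar_i * I_i + b_bar_i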
Step 4: after guided filtering, extract the edge information of the image under study with the Shearlet-Canny operator;
As shown in Fig. 3, the algorithm is as follows:
Let the image under study be f[n1, n2]; A_a is the anisotropic dilation matrix; B_s is the shear matrix; a > 0 is the scale parameter; s ∈ R is the shear parameter; t is the translation parameter;
Step a: read in the image f[n1, n2];
Step b: decompose the image f[n1, n2] into a low-pass subband f_a^j and a high-pass subband f_d^j by means of the Laplacian pyramid;
Step c: transform the high-pass subband f_d^j from the Cartesian coordinate system to the pseudo-polar coordinate system, pass the resulting matrix through a frequency-domain subband filter, and transform back from the pseudo-polar to the Cartesian coordinate system;
Step d: using the multi-directional property of the Shearlet transform, obtain several directional sub-image outputs from the transform, where the Shearlet system is expressed as ψ_{a,s,t}(x) = a^(-3/4)*ψ(A_a⁻¹*B_s⁻¹*(x - t)) and the Shearlet transform is SH_ψ f(a, s, t) = <f, ψ_{a,s,t}>; perform Canny edge detection on the sub-images of the different directions to obtain the corresponding edge images;
Step e: apply the inverse Shearlet transform to the directional edge sub-images obtained in step d;
Step f: according to the principle that the edges of different images are complementary, fuse the inverse-transformed images using logical operators;
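The sketch below illustrates the overall structure of steps a to f in Python. It is a simplified stand-in, not a faithful Shearlet implementation: the frequency-domain pseudo-polar Shearlet decomposition would normally be performed with a dedicated shearlet library, and is approximated here by a Laplacian-pyramid high-pass followed by a small bank of sheared Canny detections fused with a logical OR; the shear values and Canny thresholds are illustrative assumptions:

    import cv2
    import numpy as np

    def shearlet_canny(gray_u8, shears=(-0.5, -0.25, 0.0, 0.25, 0.5),
                       canny_lo=50, canny_hi=150):
        h, w = gray_u8.shape[:2]

        # step b: Laplacian-pyramid split into low-pass and high-pass parts
        low = cv2.pyrUp(cv2.pyrDown(gray_u8), dstsize=(w, h))
        high = cv2.absdiff(gray_u8, low)                      # high-pass subband

        fused = np.zeros((h, w), np.uint8)
        for s in shears:
            # steps c-d: a sheared (directional) view of the high-pass subband
            M = np.float32([[1.0, s, 0.0], [0.0, 1.0, 0.0]])
            sheared = cv2.warpAffine(high, M, (w, h))
            edges = cv2.Canny(sheared, canny_lo, canny_hi)    # per-direction Canny
            # step e: map the directional edge image back (inverse shear)
            M_inv = np.float32([[1.0, -s, 0.0], [0.0, 1.0, 0.0]])
            back = cv2.warpAffine(edges, M_inv, (w, h))
            # step f: fuse complementary edges with a logical OR
            fused = cv2.bitwise_or(fused, back)
        return fused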
Step 5: extract the target navigation line by the Hough transform;
The steps of the Hough transform are as follows:
Let m be the slope of the line and c its intercept;
(i) in the image space X-Y, all collinear points (x, y) are described by the line equation y = m*x + c;
(ii) this line is regarded as a set of lines in the parameter space M-C, where each image point (x, y) corresponds to the line c = -x*m + y;
(iii) the original line equation is replaced by the polar (normal) line equation proposed by Duda and Hart, ρ = x*cos θ + y*sin θ, where ρ is the distance from the origin to the line and θ is the angle between the perpendicular from the origin to the line and the positive direction of the x-axis;
(iv) during calculation the parameter space must be discretized, and the accumulator votes are collected at the center point coordinates of each cell.
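A minimal sketch of Step 5 using OpenCV's standard Hough transform in the (ρ, θ) parameterization described above; the discretization steps of 1 pixel and 1 degree and the vote threshold are illustrative choices, not values fixed by the patent:

    import cv2
    import numpy as np

    def hough_navigation_line(edge_img, rho_step=1.0,
                              theta_step=np.pi / 180.0, votes=120):
        # edge_img: binary (uint8) edge image from the previous step
        lines = cv2.HoughLines(edge_img, rho_step, theta_step, votes)
        if lines is None:
            return None
        # first entry = strongest peak; rho = x*cos(theta) + y*sin(theta)
        rho, theta = lines[0][0]
        return rho, theta

In practice the returned (ρ, θ) pair would then be converted into a lateral offset and heading of the new/old soil boundary line relative to the tractor for the steering controller.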
Step 6: return to step 1.

Claims (4)

1. A visual navigation method for tractor rotary tillage based on the new/old soil boundary line, characterized in that:
Step 1: acquire the visual image p(x, y) in front of the tractor with a camera;
Step 2: convert the image to grayscale by the formula f(x, y) = (R(x, y) + G(x, y) + B(x, y))/3 and transform it to the YCrCb color space, where Y = 0.299*R + 0.587*G + 0.114*B, Cr = (R - Y)*0.713 + 128, Cb = (B - Y)*0.564 + 128;
Step 3: apply guided filtering to the image in the YCrCb color space;
Step 4: extract the edge information of the image under study from the guided-filtered image with the Shearlet-Canny operator;
Step 5: fit the edge information in the image by the Hough transform and extract the target navigation line; the original line equation is replaced by the polar line equation proposed by Duda and Hart, ρ = x*cos θ + y*sin θ, where ρ is the distance from the origin to the line and θ is the angle between the perpendicular from the origin to the line and the positive direction of the x-axis; during calculation the parameter space is discretized, and the accumulator votes are collected at the center point coordinates of each cell;
Step 6: return to step 1.
2. The guided filtering calculation in the visual navigation method for tractor rotary tillage based on the new/old soil boundary line according to claim 1, characterized in that it is calculated according to the following steps:
Let q be the output pixel value; i and k are pixel indices; I is the input (guidance) image, which may be the image to be filtered itself or another image; a and b are the coefficients of the linear function when the window is centered at k; p is the image to be filtered; ε is a parameter introduced to prevent a from becoming too large and to adjust the filtering effect; the larger ε is, the stronger the filtering; μ_k and σ_k² are the mean and variance of I in the window; p̄_k is the mean of the image p to be filtered in the window; |w| is the number of pixels in the window; i and j are pixels; w_ij is the filter kernel, a function of the guidance image I and independent of p;
Step ①: the image in the YCrCb color space is represented as a two-dimensional function; within a two-dimensional window w_k the output obtained from the input of this function satisfies a linear relationship with the guidance image, i.e. q_i = a_k*I_i + b_k, for every i in w_k;
Step ②: taking the gradient of both sides of the formula in step ① gives ∇q_i = a_k*∇I_i;
Step ③: the gap between the actual value p and the real output value of the fitted function is E(a_k, b_k) = Σ_{i∈w_k} ((a_k*I_i + b_k - p_i)² + ε*a_k²);
Step ④: a_k and b_k are calculated by the least squares method: a_k = ((1/|w|)*Σ_{i∈w_k} I_i*p_i - μ_k*p̄_k)/(σ_k² + ε), b_k = p̄_k - a_k*μ_k;
Step ⑤: all linear function values of the windows containing point i are averaged to obtain q_i = (1/|w|)*Σ_{k:i∈w_k}(a_k*I_i + b_k) = ā_i*I_i + b̄_i.
3. The Shearlet-Canny operator calculation in the visual navigation method for tractor rotary tillage based on the new/old soil boundary line according to claim 1, characterized in that it is calculated according to the following steps:
Let the image under study be f[n1, n2]; A_a is the anisotropic dilation matrix; B_s is the shear matrix; a > 0 is the scale parameter; s ∈ R is the shear parameter; t is the translation parameter;
Step a: read in the image f[n1, n2];
Step b: decompose the image f[n1, n2] into a low-pass subband f_a^j and a high-pass subband f_d^j by means of the Laplacian pyramid;
Step c: transform the high-pass subband f_d^j from the Cartesian coordinate system to the pseudo-polar coordinate system, pass the resulting matrix through a frequency-domain subband filter, and transform back from the pseudo-polar to the Cartesian coordinate system;
Step d: using the multi-directional property of the Shearlet transform, obtain several directional sub-image outputs from the transform, where the Shearlet system is expressed as ψ_{a,s,t}(x) = a^(-3/4)*ψ(A_a⁻¹*B_s⁻¹*(x - t)) and the Shearlet transform is SH_ψ f(a, s, t) = <f, ψ_{a,s,t}>; perform Canny edge detection on the sub-images of the different directions to obtain the corresponding edge images;
Step e: apply the inverse Shearlet transform to the directional edge sub-images obtained in step d;
Step f: according to the principle that the edges of different images are complementary, fuse the inverse-transformed images using logical operators.
4. The Hough transform algorithm in the visual navigation method for tractor rotary tillage based on the new/old soil boundary line according to claim 1, characterized in that:
Let m be the slope of the line and c its intercept;
(i) in the image space X-Y, all collinear points (x, y) are described by the line equation y = m*x + c;
(ii) this line is regarded as a set of lines in the parameter space M-C, where each image point (x, y) corresponds to the line c = -x*m + y;
(iii) the original line equation is replaced by the polar line equation proposed by Duda and Hart, ρ = x*cos θ + y*sin θ;
(iv) during calculation the parameter space must be discretized, and the accumulator votes are collected at the center point coordinates of each cell.
CN201710389147.8A 2016-10-17 2017-05-25 Visual navigation method for tractor rotary tillage based on the new/old soil boundary line Pending CN107957264A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2016109072729 2016-10-17
CN201610907272 2016-10-17

Publications (1)

Publication Number Publication Date
CN107957264A true CN107957264A (en) 2018-04-24

Family

ID=61954615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710389147.8A Pending CN107957264A (en) 2016-10-17 2017-05-25 Visual navigation method for tractor rotary tillage based on the new/old soil boundary line

Country Status (1)

Country Link
CN (1) CN107957264A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110310239A (en) * 2019-06-20 2019-10-08 四川阿泰因机器人智能装备有限公司 It is a kind of to be fitted the image processing method for eliminating illumination effect based on characteristic value
CN110310239B (en) * 2019-06-20 2023-05-05 四川阿泰因机器人智能装备有限公司 Image processing method for eliminating illumination influence based on characteristic value fitting

Similar Documents

Publication Publication Date Title
Zhang et al. New research methods for vegetation information extraction based on visible light remote sensing images from an unmanned aerial vehicle (UAV)
Reza et al. Rice yield estimation based on K-means clustering with graph-cut segmentation using low-altitude UAV images
Samiappan et al. Using unmanned aerial vehicles for high-resolution remote sensing to map invasive Phragmites australis in coastal wetlands
Lati et al. Estimating plant growth parameters using an energy minimization-based stereovision model
KR102053582B1 (en) Method of ground coverage classification by using image pattern learning based on deep learning
CN110020635A (en) Growing area crops sophisticated category method and system based on unmanned plane image and satellite image
EP2923333B1 (en) Method for the automatic creation of two- or three-dimensional building models
CN109376728A (en) A kind of weeds in paddy field recognition methods and its application based on multiple features fusion and BP neural network
Khan et al. UAV’s agricultural image segmentation predicated by clifford geometric algebra
JP7344987B2 (en) Convolutional neural network construction method and system based on farmland images
Ospina et al. Simultaneous mapping and crop row detection by fusing data from wide angle and telephoto images
Peng et al. Binocular-vision-based structure from motion for 3-D reconstruction of plants
CN115687850A (en) Method and device for calculating irrigation water demand of farmland
CN107957264A (en) Visual navigation method for tractor rotary tillage based on the new/old soil boundary line
CN107578447B (en) A kind of crop ridge location determining method and system based on unmanned plane image
CN105844264A (en) Oil peony fruit image identification method based on stress
CN113569772A (en) Remote sensing image farmland instance mask extraction method, system, equipment and storage medium
Hu et al. Optimal scale extraction of farmland in coal mining areas with high groundwater levels based on visible light images from an unmanned aerial vehicle (UAV)
Mohammed Amean et al. Automatic plant branch segmentation and classification using vesselness measure
CN100480628C (en) Stereo image row tree 3-D information fetching method based on image division technology
Bupathy et al. Optimizing low-cost UAV aerial image mosaicing for crop growth monitoring
CN104567872B (en) A kind of extracting method and system of agricultural machinery and implement leading line
Karydas et al. Fine scale mapping of agricultural landscape features to be used in environmental risk assessment in an olive cultivation area
CN113870278A (en) Improved Mask R-CNN model-based satellite remote sensing image farmland block segmentation method
Avetisyan et al. Modification in landscape horizontal structure, induced by changing environmental conditions: a case study of Haskovo region (Southeastern Bulgaria)

Legal Events

PB01 - Publication
WD01 - Invention patent application deemed withdrawn after publication (application publication date: 20180424)