CN103983270B - An image-based processing method for sonar data - Google Patents

An image-based processing method for sonar data

Info

Publication number
CN103983270B
CN103983270B · CN201410210477.2A
Authority
CN
China
Prior art keywords
point
sonar
current
circle
max
Prior art date
Application number
CN201410210477.2A
Other languages
Chinese (zh)
Other versions
CN103983270A (en)
Inventor
陈宗海
王鹏
张启彬
包鹏
徐子伟
孙建
Original Assignee
University of Science and Technology of China (中国科学技术大学)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China (中国科学技术大学)

Priority to CN201410210477.2A

Publication of CN103983270A

Application granted

Publication of CN103983270B


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 — Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/02 — Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems, using reflection of acoustic waves
    • G01S 15/06 — Systems determining the position data of a target
    • G01S 15/08 — Systems for measuring distance only
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 — Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 — Details of systems according to group G01S15/00
    • G01S 7/539 — Details of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section

Abstract

The invention discloses an image-based processing method for sonar data that can be effectively applied to mobile robot localization, map building, and path planning. First, the sonar data are denoised and the corner information contained in the data is extracted; a local expansion map is then built from the error ellipses and maximum error circles associated with the corner points. Next, the local expansion map is mapped into a binary image space, where matching between the current and historical local expansion maps is realized by a rotational-invariance matching method, a two-dimensional scan matching method, and SIFT feature matching. The invention provides a new way of processing sonar range data: compared with conventional methods that process the sonar data directly, image-based processing can more fully exploit the information the data contain, enriches the available processing modes, and improves the precision and robustness of sonar data processing.

Description

An image-based processing method for sonar data

Technical field

The present invention relates to a method for processing sonar range data: by mapping the range data into an image space, image-based processing of sonar data is achieved. The invention belongs to the field of mobile robot navigation.

Background art

With the progress of navigation technology, mobile robots are widely used to help humans accomplish tasks such as the exploration of unknown environments; China's "Jade Rabbit" (Yutu) lunar rover is a typical example. To carry out such exploration tasks, a robot must be capable of autonomous navigation. Autonomous navigation is usually divided into three subproblems: 1. Where am I? 2. Where am I going? 3. How do I get there? These correspond to robot localization, map building, and path planning, respectively. Perception of the robot's own pose and of the positions of environmental features is the prerequisite for solving these problems.

At present, mobile robots mainly use vision sensors, laser sensors, ultrasonic sensors, and the like to obtain their own position and the positions of environmental features. Vision sensors provide rich information but demand fast data processing from the robot; moreover, they are sensitive to lighting, occlusion, and similar disturbances, which limits their range of application. Laser sensors and ultrasonic (sonar) sensors are range sensors: by measuring the distance between the robot and environmental features, they provide the information needed for robot navigation. Laser sensors respond quickly and deliver highly accurate information, but they require precise mounting and are expensive. In contrast, ultrasonic sensors are simple to install, relatively cheap, and still able to provide reasonably accurate information, so they remain widely used. However, because of their large beam angle, the information obtained by ultrasonic sensors carries uncertainty. Probability theory, fuzzy theory, grey system theory, and similar frameworks have all been used to express and process ultrasonic information and, ultimately, to realize robot map building, localization, and path planning.

Surveying current sonar information processing methods, the usual approach is to process the range data obtained by the sonar directly and to build a formal description of environmental features from the statistics of those data; typical examples are grid-based environment description methods and feature map creation methods. Because of the beam angle, the range information obtained by sonar inevitably carries errors. Current sonar range data processing methods make the most of the statistical information the data contain and have been applied successfully in mobile robot autonomous navigation. However, when the errors are large, the accuracy of the statistics suffers, and existing methods find it difficult to mine further useful information from the raw data. For this reason, this invention maps the sonar range data from the metric space into an image space and uses image processing techniques to process the sonar range information. The method can be effectively applied to mobile robot localization, map building, and path planning.

Summary of the invention

The object of the invention is to map the sonar range data into an image space and process them with image processing techniques; through this image-based processing, the environmental information contained in the sonar data is exploited to the greatest extent. On the one hand, the invention provides a new approach to sonar data processing; on the other hand, by combining several image-based matching techniques for sonar data, it improves the precision and robustness of that processing.

The invention provides an image-based processing method for sonar data: the sonar range information is mapped into an image space and processed with image processing methods. The main steps are as follows:

Step 1: filter out measurements produced by the measuring blind zone or lying beyond the sonar's measuring range. A sonar datum is written (x, y, θ, ra), where (x, y) is the coordinate of the target, θ is the bearing of the target relative to the robot, and ra is the distance from the target to the robot. Invalid data are marked ra = R, where R is the sonar sensor's maximum measuring distance (R = 5000 mm in this invention). The data set remaining after filtering is denoted S.

Step 2: filter singular values. In this invention, a singular value is a relatively sparse measurement that does not represent any physically existing feature. Given the data set S, compute the Euclidean distance between every pair of points in S and classify the points according to their coordinates and these distances. Count the number of points in each class; when a class contains fewer than a preset threshold Num points, remove all points of that class. The data set remaining after this filtering is denoted S0; the invention calls S0 the range data space, or metric space for short.
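As an illustration of step 2, a minimal sketch of the singular-value filter. The single-linkage clustering rule, the distance threshold `d_th`, and the default values are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def filter_singular_values(points, d_th=200.0, num_th=3):
    """Drop sparse 'singular' measurements: cluster points by Euclidean
    distance (single-linkage via union-find over an adjacency graph) and
    keep only clusters with at least num_th members. Distances in mm;
    d_th and num_th are illustrative, not from the patent."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # adjacency: points closer than d_th belong to the same cluster
    diff = pts[:, None, :] - pts[None, :, :]
    adj = np.linalg.norm(diff, axis=2) < d_th
    # union-find over the adjacency graph
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if adj[i, j]:
                parent[find(i)] = find(j)
    labels = np.array([find(i) for i in range(n)])
    big = [l for l in set(labels) if (labels == l).sum() >= num_th]
    return pts[np.isin(labels, big)]
```

With a tight cluster of four points and one isolated outlier, the outlier is removed and the cluster survives.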

Step 3: extract corner points. Define a sliding window of length N and step s that slides along the data sequence. Moving in that direction, take N points at a time from S0 and write their horizontal and vertical coordinates as Xt = [x1, x2, ..., xN] and Yt = [y1, y2, ..., yN]. The covariance matrix of Xt and Yt is

C_t = | cov(X_t, X_t)  cov(X_t, Y_t) |
      | cov(Y_t, X_t)  cov(Y_t, Y_t) |,        (1)

where cov(X_t, Y_t) = (1/N) Σ_{i=1..N} (x_i − x̄)(y_i − ȳ), with x̄ and ȳ the means of the elements of Xt and Yt. Denote the eigenvalues of Ct by λmax and λmin and their ratio by EVR = λmin/λmax. As the window slides, all EVR values of S0 can be computed, yielding the current EVR curve. At corner points, EVR reaches an extremum. The invention finds the peaks of the current EVR curve by comparing adjacent EVR values; each peak corresponds to one corner point.
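A sketch of the EVR corner detector of step 3, using the embodiment's N = 10 and s = 1 as defaults. The peak rule here is a simple local-maximum test, which is one plausible reading of "comparing adjacent EVR values":

```python
import numpy as np

def evr_curve(points, window=10, step=1):
    """Eigenvalue-ratio (EVR) curve over a sliding window: for each
    window of N points, EVR = lambda_min / lambda_max of the 2x2
    covariance of the window's x and y coordinates. On a straight wall
    segment the points are nearly collinear and EVR ~ 0; a window
    straddling a corner spreads in both directions and EVR rises."""
    pts = np.asarray(points, dtype=float)
    evr = []
    for start in range(0, len(pts) - window + 1, step):
        w = pts[start:start + window]
        c = np.cov(w[:, 0], w[:, 1])
        lam = np.linalg.eigvalsh(c)      # ascending: [lambda_min, lambda_max]
        evr.append(lam[0] / lam[1] if lam[1] > 0 else 0.0)
    return np.array(evr)

def evr_peaks(evr):
    """Indices of strict local maxima of the EVR curve; each peak marks
    a corner point."""
    return [i for i in range(1, len(evr) - 1)
            if evr[i] > evr[i - 1] and evr[i] > evr[i + 1]]
```

On an L-shaped point set (two perpendicular wall segments), the curve stays near zero on each leg and rises where the window straddles the corner.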

Step 4: compute the error ellipses. The computation is the same for every peak of the current EVR curve; taking a peak O as an example, it proceeds as follows. Let (x, y) be the sonar data point corresponding to O. Take the n data points lying within the circle of center (x, y) and radius r', and write their horizontal and vertical coordinates as X = [x1, ..., xn] and Y = [y1, ..., yn]. Let C be the covariance matrix of X and Y, with eigenvalues λ1 and λ2 (λ1 ≥ λ2) and corresponding eigenvectors v1 and v2. The error ellipse is then, in the coordinate system spanned by v1 and v2, the ellipse with center (x̄, ȳ), major semi-axis λ1, and minor semi-axis λ2, where x̄ = (1/n) Σ x_i and ȳ = (1/n) Σ y_i.
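Step 4 reduces to the eigen-decomposition of a 2x2 covariance matrix; a minimal numpy sketch (the function name and return layout are illustrative):

```python
import numpy as np

def error_ellipse(neighborhood):
    """Error ellipse of a corner neighborhood: the eigen-decomposition
    of the 2x2 covariance of the n points around a corner gives the
    ellipse axes. Returns (center, (lam1, lam2), (v1, v2)) with
    lam1 >= lam2; per the patent, lam1 also serves as the radius of the
    corresponding maximum error circle."""
    pts = np.asarray(neighborhood, dtype=float)
    center = pts.mean(axis=0)            # (x_bar, y_bar)
    c = np.cov(pts[:, 0], pts[:, 1])
    lam, vec = np.linalg.eigh(c)         # eigenvalues in ascending order
    lam1, lam2 = lam[1], lam[0]
    v1, v2 = vec[:, 1], vec[:, 0]
    return center, (lam1, lam2), (v1, v2)
```

For points spread mainly along the x direction, the major axis v1 aligns with x and λ1 dominates λ2.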

Step 5: compute the maximum error circles and build the local expansion map. With (x̄, ȳ) as center and λ1 as radius, build the maximum error circle corresponding to each error ellipse. According to the actual connectivity of the corner points, compute the common external tangent lines of the maximum error circles of actually connected corners, and then the tangent points of those lines with the circles. Connecting the tangent points of the maximum error circles of connected corners yields the local expansion map M; the centers of all maximum error circles together with the tangent points constitute the key point set im_point of M.

Step 6: map to the image space. Write the horizontal and vertical coordinates of the points of im_point as P = (p1, ..., pm) and Q = (q1, ..., qm); pmax and qmax denote the maxima of the horizontal and vertical coordinates, pmin and qmin their minima. A point (pi, qi) of im_point is then mapped to a point (hi, ki) of the image space by formulas (2) and (3):

h_i = int[ η(p_max − p_min) · (p_i − p_min) / (p_max − p_min) ],        (2)

k_i = int[ η(q_max − q_min) · (q_i − q_min) / (q_max − q_min) ],        (3)

where η is a scale factor relating the scale of the sonar data to that of the image space.
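Formulas (2) and (3) can be sketched as follows. Note that the (p_max − p_min) factors cancel algebraically, so each pixel coordinate reduces to int(η·(p_i − p_min)); η = 0.02 is the embodiment's value:

```python
def to_image_space(key_points, eta=0.02):
    """Map metric-space key points (mm) to integer pixel coordinates
    following formulas (2) and (3). Because the (max - min) factors
    cancel, each coordinate is int(eta * (coordinate - minimum))."""
    ps = [p for p, _ in key_points]
    qs = [q for _, q in key_points]
    p_min, q_min = min(ps), min(qs)
    return [(int(eta * (p - p_min)), int(eta * (q - q_min)))
            for p, q in key_points]
```

For example, with η = 0.02 a 5000 mm extent maps to 100 pixels.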

Step 7: let (hi, ki) be the image of a circle center of im_point and ri the radius of the corresponding circle, i = 1, ..., j, where j is the number of circle centers. With (hi, ki) as center and ri as radius, determine a circular region; set the pixel value to 0 inside these circular regions and inside the region determined by the images of the tangent points of im_point, and to 1 elsewhere. The region of this binary image space whose pixel value is 0 is the local expansion map in image space, denoted p_M.

Step 8: location matching. Location matching comprises two parts: rotational-invariance matching and two-dimensional scan matching.

Step A: rotational-invariance matching.

Step A1: let hmax and vmax be the maximum horizontal and vertical pixel extents of p_M. The invention defines Pc(int(hmax/2), int(vmax/2)) as the center point of p_M. With Pc as origin, set up a coordinate system Σc along the horizontal and vertical directions of p_M.

Step A2: let D = even(((hmax/2)² + (vmax/2)²)^(1/2)), where "even" means rounding ((hmax/2)² + (vmax/2)²)^(1/2) up to an even number. In the coordinate system Σc, draw concentric circles of radii r and r + Δr around Pc; together they bound an annulus Rr. Let Nr be the number of pixels contained in Rr and Ur the number of those pixels occupied by p_M; their ratio v(r) = Ur/Nr is called the effective duty cycle for radius r. Compute the effective duty cycle for every radius from the origin out to D/2, giving the effective duty cycle vector V = [v(0), v(Δr), ..., v(r), ..., v(D/2)]^T.
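A sketch of the effective-duty-cycle vector of step A2, assuming a square numpy array in which 1 marks p_M pixels (the patent marks p_M with 0; the polarity is flipped here for convenience, the image center stands in for Pc, and Δr = 1):

```python
import numpy as np

def duty_cycle_vector(img, dr=1):
    """Rotation-invariant signature of a binary map: for each annulus
    of width dr around the image center, the fraction of its pixels
    occupied by the map (1-valued pixels). Because annuli are
    concentric, the vector is unchanged when the map is rotated about
    the center, which is what makes the matching rotation-invariant."""
    h, w = img.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(yy - cy, xx - cx)
    d = int(np.ceil(np.hypot(h / 2, w / 2)))
    d += d % 2                                   # round up to even
    v = []
    for r in range(0, d // 2 + 1, dr):
        ring = (dist >= r) & (dist < r + dr)
        n = ring.sum()
        v.append(img[ring].sum() / n if n else 0.0)
    return np.array(v)
```

A fully occupied map has duty cycle 1.0 in every nonempty annulus.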

Step A3: let the effective duty cycle vectors of the current and historical p_M be Vl = [vl(0), vl(Δr), ..., vl(r), ..., vl(D/2)]^T and Vd = [vd(0), vd(Δr), ..., vd(r), ..., vd(D/2)]^T. The matching rate of Vl and Vd is then

p_rpt = Σ_{r=0..D/2} [v_l(r) − v̄_l][v_d(r) − v̄_d] / { Σ_{r=0..D/2} [v_l(r) − v̄_l]² · Σ_{r=0..D/2} [v_d(r) − v̄_d]² }^(1/2),        (4)

where v̄_l = Σ_{r=0..D/2} v_l(r) / (1 + D/(2Δr)) and v̄_d = Σ_{r=0..D/2} v_d(r) / (1 + D/(2Δr)).
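Formula (4) is the Pearson correlation coefficient of the two duty-cycle vectors; a minimal numpy sketch:

```python
import numpy as np

def matching_rate(v_l, v_d):
    """Formula (4): Pearson correlation between the current (v_l) and
    historical (v_d) effective-duty-cycle vectors. Returns a value in
    [-1, 1]; 1 means a perfect linear match."""
    v_l = np.asarray(v_l, dtype=float)
    v_d = np.asarray(v_d, dtype=float)
    a, b = v_l - v_l.mean(), v_d - v_d.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))
```

Proportional vectors correlate to 1, anti-phase vectors to −1.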

Step B: two-dimensional scan matching.

Step B1: following the coordinate system construction of step A1, set up the coordinate systems of the current p_M and the historical p_M;

Step B2: let N be the pixel count of the image in the horizontal and vertical directions of the current binary image space, and let Nri and Nci be the numbers of pixels of row i and column i occupied by the current p_M. The effective duty cycles of row i and column i are then wri = Nri/N and wci = Nci/N. In general, because the current p_M and the historical p_M have different orientations, the duty cycles of rows and columns with the same index differ. The invention therefore matches the historical p_M against the current p_M according to the following rules:

2.1 Fix the current p_M and write the effective duty cycles of its row i and column i as w^l_ri and w^l_ci. Compute the effective duty cycles of all its rows and columns, giving the current p_M's effective duty cycle vectors W^l_r = [w^l_r1, w^l_r2, ..., w^l_rN]^T and W^l_c = [w^l_c1, w^l_c2, ..., w^l_cN]^T.

2.2 Likewise, write the effective duty cycle vectors of the historical p_M as W^d_r = [w^d_r1, w^d_r2, ..., w^d_rN]^T and W^d_c = [w^d_c1, w^d_c2, ..., w^d_cN]^T.

2.3 Use formula (4) to compute the correlation coefficients of the two dimensions, λ^0_r and λ^0_c. Rotate the historical p_M counterclockwise by 1 degree and recompute the correlation coefficients of the current and historical p_M, denoted λ^1_r and λ^1_c. Proceeding in this way, compute the 360 groups of correlation coefficients of the two dimensions over one full cycle, giving the correlation coefficient vector of each dimension: λ_r = [λ^0_r, λ^1_r, ..., λ^359_r]^T and λ_c = [λ^0_c, λ^1_c, ..., λ^359_c]^T.

Step B3: if the current p_M and the historical p_M represent the same local environment, then some consecutive elements of the correlation coefficient vectors λr and λc will exceed a preset threshold λth; otherwise, the elements of λr and λc all stay below λth, or only a few isolated elements exceed it. Take the averages of the elements of λr and λc that exceed λth, denoted λ̄r and λ̄c; the matching rate p_2d of the current and historical p_M is then obtained from λ̄r and λ̄c.
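The row/column duty cycles of step B2 can be sketched as below (again with 1 marking p_M pixels, polarity flipped relative to the patent, and a square N×N image assumed). The 1-degree rotation loop of rule 2.3, which rotates the historical p_M 360 times and correlates each pair of vectors with formula (4), is omitted here:

```python
import numpy as np

def rowcol_duty_cycles(img):
    """Per-row and per-column effective duty cycles of a square binary
    map: w_r[i] = (pixels of row i occupied) / N and
    w_c[i] = (pixels of column i occupied) / N."""
    n = img.shape[0]                 # assumes an N x N image
    w_r = img.sum(axis=1) / n        # fraction of each row occupied
    w_c = img.sum(axis=0) / n        # fraction of each column occupied
    return w_r, w_c
```

A map occupying only the first row gives w_r = [1, 0, 0, ...] and a uniform w_c.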

Step 9: image matching. Use the SIFT operator to extract the interest points of the current p_M and the historical p_M, and perform image matching between them. The matching rate is denoted Pp; in this invention, Pp is the ratio of the number of correctly matched interest point pairs to the number of all matched interest point pairs.

Step 10: matching rate fusion. The invention defines the final matching rate as Pf = α·p_rpt + β·p_2d + γ·p_p, where α, β, and γ are the weights of the respective matching rates and satisfy α + β + γ = 1. Their values are set according to the matching precision of each method in the actual application. If Pf does not meet the threshold requirement, the current p_M is stored; otherwise, the match information can be used for tasks such as mobile robot localization, map building, and path planning.
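A sketch of the fusion rule of step 10, using the embodiment's weights α = 0.33, β = 0.37, γ = 0.30 as defaults:

```python
def fuse(p_rpt, p_2d, p_p, weights=(0.33, 0.37, 0.30)):
    """Final matching rate Pf = alpha*p_rpt + beta*p_2d + gamma*p_p.
    The defaults are the embodiment's values; the weights must sum
    to 1 as the patent requires."""
    a, b, g = weights
    assert abs(a + b + g - 1.0) < 1e-9, "weights must sum to 1"
    return a * p_rpt + b * p_2d + g * p_p
```

Because the weights sum to 1, three equal matching rates fuse to that same value.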

The invention has the following advantages:

1. Sonar range data are mapped into an image space and processed with image processing techniques, providing a new technical route for sonar data processing.

2. Building local expansion maps improves the robustness of sonar data processing.

3. Three image-based matching methods for sonar data are proposed; the three methods complement one another and improve matching precision.

4. Image-based processing of sonar data can be effectively applied to mobile robot localization, map building, and path planning, improving the precision and robustness of robot navigation tasks.

Brief description of the drawings

Fig. 1 is the flow chart of the image-based sonar data processing algorithm;

Fig. 2(a) and Fig. 2(b) show the data sets S and S0, respectively; Fig. 2(b) also shows the error ellipses and maximum error circles;

Fig. 3(a) and Fig. 3(b) are schematic diagrams of the rotational-invariance matching method and of its matching results, respectively;

Fig. 4(a) and Fig. 4(b) are schematic diagrams of the two-dimensional scan matching method and of its matching results, respectively;

Fig. 5(a) and Fig. 5(b) show the interest points extracted from the current p_M and the historical p_M, respectively;

Fig. 6(a) shows the SIFT interest point matching between the current and historical p_M, and Fig. 6(b) shows the self-matching of the current p_M's interest points.

Detailed description of the embodiments

This embodiment is carried out on the premise of the technical scheme of the invention and gives a detailed implementation and procedure, but the scope of the invention is not limited to the following embodiment.

In this embodiment, the 16 sonar sensors of a Pioneer 3-DX robot collect data in a corridor-office environment, and the image-based processing of the sonar data is implemented with mixed programming in Visual Studio 2008, OpenCV 1.0.0, and Matlab R2009a; the algorithm flow is shown in Fig. 1. The concrete steps are as follows:

(1) Use the robot's sonar to collect data on the current local environment and separate out the sonar data points with ra = 5000 mm; the separation result is shown in Fig. 2(a). Filter the singular values in the sonar data and build the metric space S0, as shown in Fig. 2(b);

(2) The embodiment takes N = 10 and s = 1, so Xt = [x1, x2, ..., x10] and Yt = [y1, y2, ..., y10]. First compute the covariance matrix Ct of Xt and Yt with formula (1); then use the eig() function provided in Matlab to compute the eigenvectors and corresponding eigenvalues of Ct, and from them EVR = λmin/λmax;

(3) Slide the window along the data with step s until fewer than N points remain in the window. Repeat step (2) to compute the EVR of every window, forming the EVR curve of the current metric space S0;

(4) Compare adjacent EVR values to find the peaks of the current EVR curve. Suppose O is one of the peaks and (x, y) its corresponding sonar data point. Take the n data points within the circle of center (x, y) and radius r', with coordinates X = [x1, ..., xn] and Y = [y1, ..., yn]. Compute the covariance matrix C of X and Y, its eigenvalues λ1 and λ2, and the corresponding eigenvectors v1 and v2. In the coordinate system spanned by v1 and v2, the ellipse with center (x̄, ȳ), major semi-axis λ1, and minor semi-axis λ2 is the error ellipse, where x̄ and ȳ are the means of X and Y; with (x̄, ȳ) as center and λ1 as radius, determine the maximum error circle;

(5) Following step (4), compute the maximum error circles corresponding to all peaks of the current EVR curve. According to the actual connectivity of the corner points, compute the common external tangent lines of the maximum error circles of actually connected corners and then their tangent points with the circles, build the local expansion map M, and determine its key point set im_point (as shown in Fig. 3(a) and Fig. 4(a));

(6) Write the horizontal and vertical coordinates of the points of im_point as P = (p1, ..., pm) and Q = (q1, ..., qm), and determine pmax, qmax, pmin, and qmin. Map each point (pi, qi) of im_point to the image space point (hi, ki) with formulas (2) and (3); η = 0.02 in this embodiment;

(7) Determine (hi, ki) and ri, i = 1, ..., j. With (hi, ki) as center and ri as radius, determine circular regions; set the pixel value to 0 inside these regions and inside the region determined by the images of the tangent points of im_point, and to 1 elsewhere, yielding the local expansion map p_M in image space (as shown in Fig. 5(a) and Fig. 5(b));

(8) Location matching.

Step A: rotational-invariance matching.

Step A1: compute the maximum horizontal and vertical pixel extents hmax and vmax of p_M, and from them its center point Pc(int(hmax/2), int(vmax/2)); set up the coordinate system Σc, as shown in Fig. 3(a);

Step A2: first compute D = even(((hmax/2)² + (vmax/2)²)^(1/2)), then the effective duty cycle vectors of the current and historical p_M, Vl = [vl(0), vl(Δr), ..., vl(r), ..., vl(D/2)]^T and Vd = [vd(0), vd(Δr), ..., vd(r), ..., vd(D/2)]^T, and compute their matching rate p_rpt with formula (4) (as shown in Fig. 3(b)).

Step B: two-dimensional scan matching.

Step B1: following the coordinate system construction of step A1, set up the coordinate systems of the current p_M and the historical p_M, as shown in Fig. 4(a);

Step B2: compute the effective duty cycle vectors W^l_r and W^l_c of the current p_M;

Step B3: compute the effective duty cycle vectors W^d_r and W^d_c of the historical p_M;

Step B4: use formula (4) to compute the correlation coefficient vectors of the two dimensions, λ_r = [λ^0_r, λ^1_r, ..., λ^359_r]^T and λ_c = [λ^0_c, λ^1_c, ..., λ^359_c]^T;

Step B5: compute P_2d (as shown in Fig. 4(b)).

(9) Use the SIFT operator to extract the interest points of the current p_M and the historical p_M (as shown in Fig. 5(a) and Fig. 5(b)) and perform image matching (as shown in Fig. 6(a) and Fig. 6(b)); the matching rate is denoted Pp.

(10) Compute the final matching rate Pf = α·p_rpt + β·p_2d + γ·p_p. In this invention, repeated experiments show that the precision of the three matching algorithms is, from highest to lowest: the two-dimensional scan matching algorithm, the rotational-invariance matching algorithm, and the image matching algorithm; this experiment therefore takes α = 0.33, β = 0.37, γ = 0.30. If Pf does not meet the threshold requirement, the current p_M is stored; otherwise, the match information can be used for tasks such as robot localization, map building, and path planning.

Claims (11)

1. An image-based processing method for sonar data, characterized by the following steps:
(1) filtering out measurements produced by the sonar blind zone or lying beyond the sonar's measuring range;
(2) filtering out singular values and discrete points in the sonar data by clustering;
(3) computing the EVR curve of the sonar measurements and obtaining the environment corner information contained in the sonar data by detecting the peaks of the current EVR curve;
(4) computing the error ellipse of each corner feature contained in the sonar data;
(5) computing the maximum error circle of each error ellipse and building the current local expansion map M;
(6) establishing the mapping from the current local expansion map M to a binary image space;
(7) mapping M into the binary image space and building the local expansion map p_M in image space;
(8) matching local expansion maps with the rotational-invariance matching method and the two-dimensional scan matching method;
(9) extracting interest points with the SIFT operator and performing image matching;
(10) fusing the matching results of the rotational-invariance matching method, the two-dimensional scan matching method, and the image matching method.
2. the method for claim 1, it is characterised in that step (1) including: note sonar data is (x, y, θ, ra), wherein (x, y) is coordinates of targets, and θ is the orientation of target opposed robots, and ra is the target distance to robot, due to measure blind area or Beyond the impact of sonar to measure scope, can produce undesirable data, be labeled as ra=R, wherein R is sonar sensor Maximum measurement distance, removes the point of all ra=R, and the data set after filtration is designated as S.
3. The method of claim 2, characterized in that step (2) comprises: computing the Euclidean distance between every pair of points in the sonar data set S and classifying the points in S according to their coordinates and these distances; counting the number of points in each class and, when a class contains fewer than a threshold Num points, removing all points of that class; the remaining points constitute the metric space S0.
4. The method of claim 3, characterized in that step (3) comprises: defining a sliding window of length N and step s that slides along the data; taking N points at a time from S0, with coordinates Xt = [x1, x2, ..., xN] and Yt = [y1, y2, ..., yN]; computing the covariance matrix Ct of Xt and Yt by

C_t = | cov(X_t, X_t)  cov(X_t, Y_t) |
      | cov(Y_t, X_t)  cov(Y_t, Y_t) |,

where cov(X_t, Y_t) = (1/N) Σ_{i=1..N} (x_i − x̄)(y_i − ȳ), with x̄ and ȳ the means of the elements of Xt and Yt; writing the eigenvalue ratio EVR = λmin/λmax, where λmax and λmin are the eigenvalues of Ct; sliding the window along the data with step s until fewer than N points remain in the window, yielding the current EVR curve; finding the peaks of the current EVR curve by comparing adjacent EVR values, each peak corresponding to one corner point.
5. The method of claim 1, characterized in that computing the error ellipses in step (4) comprises: supposing O is one of the peaks and (x, y) its corresponding sonar data point, taking the n data points within the circle of center (x, y) and radius r', with coordinates X = [x1, ..., xn] and Y = [y1, ..., yn]; computing the covariance matrix C of X and Y, whose eigenvalues are λ1 and λ2 (λ1 ≥ λ2) with corresponding eigenvectors v1 and v2; the error ellipse is then, in the coordinate system spanned by v1 and v2, the ellipse with center (x̄, ȳ), major semi-axis λ1, and minor semi-axis λ2, where x̄ and ȳ are the means of X and Y.
6. The method of claim 5, characterized in that building the current local expansion map M in step (5) comprises: with (x̄, ȳ) as center and λ1 as radius, building the maximum error circle corresponding to each error ellipse; according to the actual connectivity of the corner points, computing the common external tangent lines of the maximum error circles of actually connected corners and then the tangent points of those lines with the circles; connecting the tangent points of the maximum error circles of connected corners to form the local expansion map M; the centers of all maximum error circles and the tangent points constitute the key point set im_point of M.
7. The method of claim 6, characterized in that establishing the mapping from M to the binary image space in step (6) comprises: writing the horizontal and vertical coordinates of the points of im_point as P = (p1, ..., pm) and Q = (q1, ..., qm), with pmax and qmax the maxima and pmin and qmin the minima of the horizontal and vertical coordinates; a point (pi, qi) of im_point is mapped to the image space point (hi, ki) by hi = int[η(pmax − pmin)·(pi − pmin)/(pmax − pmin)] and ki = int[η(qmax − qmin)·(qi − qmin)/(qmax − qmin)], where η is a scale factor.
8. The method of claim 7, characterized in that building the local expansion map in image space in step (7) comprises: letting the images of the circle centers of im_point and the corresponding radii be (hi, ki) and ri, i = 1, ..., j, with j the number of circle centers; with (hi, ki) as center and ri as radius, determining circular regions; setting the pixel value to 0 inside these circular regions and inside the region determined by the images of the tangent points of im_point, and to 1 elsewhere; the region of this binary image space whose pixel value is 0 is the local expansion map p_M in image space.
9. the method for claim 1, it is characterised in that step (8) including:
(8.1) utilize the current p_M of rotational invariance matching method matches and history p_M, specifically include following content:
I () calculates the maximum h of the spacing of p_M both horizontally and vertically pixelmaxAnd vmax, calculate further in p_M Heart point Pc(int(hmax/2),int(vmax/ 2)), along both horizontally and vertically setting up coordinate system ∑c, at coordinate system ∑cIn, with Pc For the center of circle, respectively as the concentric circular that radius is r and r+ Δ r, constitute donut Rr, RrThe number of pixels comprised is designated as Nr, wherein The pixel count occupied by p_M is designated as Ur, its ratio is designated as v (r)=Ur/Nr, referred to as corresponding for radius r effective duty cycle;
(ii) D=even ((h is calculatedmax/2)2+(vmax/2)2)1/2, " even " represents ((hmax/2)2+(vmax/2)2)1/2Upwards take even number, Then the current and effective duty cycle vector V of history p_M is calculatedlAnd Vd, and pass through Calculate VlAnd VdMatching rate prpt, wherein, Vl=[vl(0), vl(Δr)…vl(r)…vl(D/2)]T, Vd=[vd(0), vd(Δ r)…vd(r)…vd(D/2)]T,
(8.2) matching the current p_M with the history p_M using a two-dimensional scan matching method, specifically comprising the following:
(i) establish the coordinate systems of the current p_M and the history p_M respectively, according to the coordinate-system construction method in (8.1);
(ii) denote the number of pixels of the image in the horizontal and vertical directions of the current binary image space as N, and the numbers of pixels in the i-th row and i-th column occupied by p_M as N_ri and N_ci respectively; the effective duty cycles of the i-th row and column are then w_ri = N_ri/N and w_ci = N_ci/N respectively. Calculate the effective duty cycles of all rows and columns of the current p_M to form its row and column effective duty-cycle vectors; likewise, calculate the row and column effective duty-cycle vectors of the history p_M;
(iii) calculate the canonical correlation coefficients of the row and column effective duty-cycle vectors of the current and history p_M respectively, and the final matching degree P_2d.
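Step (8.2) can be sketched as follows; for brevity the plain Pearson correlation coefficient stands in for the canonical correlation analysis named in the claim (an illustrative simplification, not the patent's exact computation):

```python
import numpy as np

def row_col_duty_cycles(p_m):
    """Per-row and per-column effective duty cycles w_ri = N_ri / N and
    w_ci = N_ci / N of the binary map (assumes a square N x N image)."""
    occ = (p_m == 0)
    n = p_m.shape[0]
    w_r = occ.sum(axis=1) / n   # row effective duty-cycle vector
    w_c = occ.sum(axis=0) / n   # column effective duty-cycle vector
    return w_r, w_c

def match_degree(p_cur, p_hist):
    """Illustrative matching degree P_2d: mean of the row and column
    correlation coefficients between current and history maps."""
    wr1, wc1 = row_col_duty_cycles(p_cur)
    wr2, wc2 = row_col_duty_cycles(p_hist)
    cr = np.corrcoef(wr1, wr2)[0, 1]
    cc = np.corrcoef(wc1, wc2)[0, 1]
    return 0.5 * (cr + cc)
```

Matching a map against itself yields a degree of 1.0, the upper bound of this score.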
10. The method of claim 9, characterized in that step (9) comprises the following: extract the interest points of the current p_M and the history p_M respectively using the SIFT operator, perform image matching, and calculate the matching rate P_p.
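A sketch of the matching-rate computation of step (9), assuming interest-point descriptors have already been extracted (e.g. with a SIFT implementation such as OpenCV's). The nearest-neighbour ratio test used here is a common pairing rule for SIFT descriptors, not necessarily the patent's exact procedure:

```python
import numpy as np

def matching_rate(desc_cur, desc_hist, ratio=0.75):
    """Count descriptors of the current p_M whose nearest neighbour in
    the history p_M passes the distance-ratio test, and return the
    fraction of such matches as the matching rate P_p."""
    good = 0
    for d in desc_cur:
        dists = np.linalg.norm(desc_hist - d, axis=1)
        order = np.argsort(dists)
        # accept if the best match is clearly closer than the second best
        if len(order) > 1 and dists[order[0]] < ratio * dists[order[1]]:
            good += 1
    return good / len(desc_cur)
```

Matching a descriptor set against an identical copy gives a rate of 1.0, since each descriptor's best match is at distance zero.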
11. The method of claim 10, characterized in that step (10) comprises:
calculating the final matching rate P_f = α·p_rpt + β·p_2d + γ·p_p, where α + β + γ = 1 and the values of α, β and γ are determined according to the accuracy of each matching method. If P_f does not meet the threshold requirement, the current p_M is stored; otherwise, the matching information can be used for the mobile robot's localization, map building and path planning tasks.
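Claim 11's weighted fusion can be written out as a short sketch (the weights and threshold below are illustrative; the claim leaves them to be set according to the accuracy of each matcher):

```python
def final_matching_rate(p_rpt, p_2d, p_p, alpha=0.4, beta=0.3, gamma=0.3):
    """Fuse the three matching rates; the weights must sum to one."""
    assert abs(alpha + beta + gamma - 1.0) < 1e-9
    return alpha * p_rpt + beta * p_2d + gamma * p_p

def matched(p_f, threshold=0.8):
    """True when P_f meets the threshold, i.e. the match information can
    be used for localization, map building and path planning; otherwise
    the current p_M would be stored as a new local topology map."""
    return p_f >= threshold
```

For instance, with the example weights, rates (0.5, 0.0, 0.0) fuse to 0.4 × 0.5 = 0.2, which falls below the example threshold.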
CN201410210477.2A 2014-05-16 2014-05-16 A kind of image conversion processing method of sonar data CN103983270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410210477.2A CN103983270B (en) 2014-05-16 2014-05-16 A kind of image conversion processing method of sonar data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410210477.2A CN103983270B (en) 2014-05-16 2014-05-16 A kind of image conversion processing method of sonar data

Publications (2)

Publication Number Publication Date
CN103983270A CN103983270A (en) 2014-08-13
CN103983270B true CN103983270B (en) 2016-09-28

Family

ID=51275334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410210477.2A CN103983270B (en) 2014-05-16 2014-05-16 A kind of image conversion processing method of sonar data

Country Status (1)

Country Link
CN (1) CN103983270B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106197421B (en) * 2016-06-24 2019-03-22 北京工业大学 A kind of forward position target point generation method independently explored for mobile robot

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103247040A (en) * 2013-05-13 2013-08-14 Beijing University of Technology Map splicing method for multi-robot systems based on a layered topological structure
CN103278170A (en) * 2013-05-16 2013-09-04 Southeast University Mobile robot cascade map building method based on salient scene point detection

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHOI J et al. Autonomous topological modeling of a home environment and topological localization using a sonar grid map. Autonomous Robots, 2011, 30(4): 351-368. *
CHOI Y-H et al. A line feature based SLAM with low grade range sensors using geometric constraints and active exploration for mobile robot. Autonomous Robots, 2008, 24(1): 13-27. *
GAO Lihua et al. A sonar-information-based map building method for mobile robots. Manufacturing Automation, 2006-11-30, 28(11): 33-35, 65. *
LI Runwei. Research on sonar-based grid map creation methods for indoor environments. China Masters' Theses Full-text Database, Information Science and Technology, 2009-01-15, full text. *

Also Published As

Publication number Publication date
CN103983270A (en) 2014-08-13

Similar Documents

Publication Publication Date Title
Pizzoli et al. REMODE: Probabilistic, monocular dense reconstruction in real time
Pfeiffer et al. Exploiting the power of stereo confidences
AU2012314067B2 (en) Localising transportable apparatus
Fritsch et al. A new performance measure and evaluation benchmark for road detection algorithms
CN103278170B (en) Based on mobile robot's cascade map creating method that remarkable scene point detects
Dewan et al. Motion-based detection and tracking in 3d lidar scans
Kümmerle et al. Large scale graph-based SLAM using aerial images as prior information
Stachniss et al. Exploring unknown environments with mobile robots using coverage maps
US8521418B2 (en) Generic surface feature extraction from a set of range data
Jin et al. Environmental boundary tracking and estimation using multiple autonomous vehicles
Milella et al. Stereo-based ego-motion estimation using pixel tracking and iterative closest point
Nieto et al. Recursive scan-matching SLAM
CN103236064B A point cloud automatic registration method based on normal vectors
CN103268729B Mobile robot cascade map building method based on composite features
Pfeiffer et al. Modeling dynamic 3D environments by means of the stixel world
CN106595659A Map merging method for unmanned aerial vehicle visual SLAM in complex urban environments
Kim et al. SLAM in indoor environments using omni-directional vertical and horizontal line features
US20040039498A1 (en) System and method for the creation of a terrain density model
JP2010061655A (en) Object tracking using linear feature
KR20090088516A (en) Method for self-localization of a robot based on object recognition and environment information around the recognized object
CN105182328B A GPR buried-target detection method based on two-dimensional empirical mode decomposition
US8831778B2 (en) Method of accurate mapping with mobile robots
CN105856230A An ORB keyframe loop-closure detection SLAM method that improves robot pose consistency
CN102435188A (en) Monocular vision/inertia autonomous navigation method for indoor environment
CN106156748A Traffic scene participant recognition method based on a vehicle-mounted binocular camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant