Summary of the invention
The technical problem to be solved by this invention, in view of the deficiencies of the prior art, is to provide a line laser ranging method for a self-moving robot. The method processes only a local region of the captured image of the line laser stripe projected onto the target object, so that the amount of data to be processed is greatly reduced. Without affecting accuracy, this improves system efficiency, doubles the overall running speed of the system, saves working time, and is convenient and reliable.
The technical problem to be solved by the present invention is achieved by the following technical solution:
A line laser ranging method for a self-moving robot, the method comprising the following steps:
Step 100: using a line laser mounted on the self-moving robot as a light source to project a laser line onto a target object, and capturing, with an image sensor, the image containing the line laser stripe projected on the target object;
Step 200: determining a local image-processing region on the image containing the line laser stripe;
Step 300: performing distortion correction on the local image-processing region to determine the actual position of the line laser stripe in the image;
Step 400: finding the energy center line in the distortion-corrected line laser stripe, and determining the position of this energy center line in the image;
Step 500: obtaining the actual distance from the self-moving robot to the target object according to the pixel coordinates of the energy center line in the image.
Said Step 200 specifically comprises:
Step 201: traversing each row of the captured image containing the line laser stripe, and finding the point with the maximum laser gray value in that row;
Step 202: taking N1 pixels on each side of the maximum gray-value point found in each row to form the local image-processing region, wherein 10 ≤ N1 ≤ 30.
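Steps 201 and 202 can be sketched as follows. This is a minimal illustration only, assuming a grayscale image held in a NumPy array and a small window value for N1; the toy frame and all numeric values are made up for demonstration.

```python
import numpy as np

def extract_roi(image, n1=20):
    """Per Steps 201-202: in each row keep only the brightest pixel and
    n1 pixels on each side; everything else is treated as background."""
    h, w = image.shape
    roi = np.zeros_like(image)
    for row in range(h):
        peak = int(np.argmax(image[row]))            # max gray value in this row
        lo, hi = max(0, peak - n1), min(w, peak + n1 + 1)
        roi[row, lo:hi] = image[row, lo:hi]
    return roi

# Toy frame: a bright stripe near column 5 plus a dim artifact at column 12
img = np.zeros((4, 16), dtype=np.uint8)
img[:, 5] = 255
img[:, 4] = img[:, 6] = 128
img[:, 12] = 50
roi = extract_roi(img, n1=2)                         # artifact falls outside window
```

Because only 2·N1+1 pixels per row survive, every later stage (distortion correction, centroid extraction) touches far fewer pixels than a whole-image pass would.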
In said Step 400, a centroid algorithm is adopted to find the energy center line, which specifically comprises:
Step 401: traversing each row of the distortion-corrected image of the line laser stripe, and finding the point with the maximum laser gray value in that row;
Step 402: taking N2 pixels on each side of the maximum laser gray-value point found, wherein 5 ≤ N2 ≤ 20;
Step 403: applying the centroid method to these pixels to obtain the energy center line of the line laser stripe image.
The centroid algorithm formula is:
yii = Σ(f(x, yi) · yi) / Σ f(x, yi)
wherein yii is the pixel row-coordinate position after the centroid computation, (x, yi) is the original pixel coordinate (the coordinate obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi). The yii obtained from the above formula is the position of the energy center point of the line laser stripe image in each row; the energy center points of all rows compose the energy center line.
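The centroid formula above can be sketched as follows. This is a minimal illustration assuming a NumPy grayscale array scanned line by line; the window size and the one-row toy image are illustrative, not taken from the invention.

```python
import numpy as np

def centroid_centers(image, n2=10):
    """Steps 401-403: for each scan line, take the 2*n2+1 pixels around the
    gray-value maximum and compute the gray-weighted mean coordinate
    yii = sum(f(x, yi) * yi) / sum(f(x, yi))."""
    centers = []
    for row in image:
        peak = int(np.argmax(row))
        lo, hi = max(0, peak - n2), min(len(row), peak + n2 + 1)
        yi = np.arange(lo, hi, dtype=float)          # pixel coordinates in window
        f = row[lo:hi].astype(float)                 # gray values in window
        centers.append(float((f * yi).sum() / f.sum()))
    return centers

# One scan line with a symmetric intensity bump centered on index 3
img = np.array([[0, 0, 1, 2, 1, 0]], dtype=float)
centers = centroid_centers(img, n2=2)
```

For the symmetric profile above the weighted mean lands exactly on the peak; for an asymmetric real stripe it yields a sub-pixel position between integer coordinates.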
A curve fitting method may also be adopted in said Step 400 to find the energy center line of the line laser stripe.
Said Step 500 specifically comprises: establishing a triangular relationship according to the triangulation principle, and solving for the actual distance q from the self-moving robot to the target object.
Alternatively, said Step 500 specifically comprises:
Step 501: measuring the energy center row coordinates yii of the line laser stripe image corresponding to n actual distances q, and establishing a comparison table of actual distance q versus energy center row coordinate;
Step 502: obtaining the actual distance q by an interpolation algorithm according to the actually measured energy center row coordinate yii of the line laser stripe image.
The n in said Step 501 is greater than or equal to 20.
In summary, the present invention processes only a local region of the captured image of the line laser stripe projected onto the target object, so that the amount of data to be processed is greatly reduced; without affecting accuracy, this improves system efficiency, doubles the overall running speed of the system, and saves working time.
The technical solution of the present invention is described in detail below in conjunction with the drawings and specific embodiments.
Embodiment
Fig. 1 is a schematic diagram of the similar-triangle principle of the present invention. As shown in Fig. 1, light emitted by light source A is irradiated onto target object C, and image sensor B captures the image containing the line laser stripe projected on target object C. In the present invention, light source A is a line laser, so the light spot it projects onto target object C is stripe-shaped. A triangle △ABC is formed by light source A, image sensor B and target object C. When the shooting focal length of image sensor B is f, since the relative position between light source A and image sensor B is fixed, the reference position a on the sensor film is determined. The light emitted by light source A is reflected from target object C and lands on the film at point b, so another triangle △abB is formed between the stripe image and the image sensor. Clearly, △ABC and △abB are similar triangles with a fixed proportional relationship between them. Since light source A and image sensor B are mounted on the body of the self-moving robot, the perpendicular distance from target object C to the line connecting the two can be regarded as the actual distance q between target object C and the self-moving robot. In △ABC the distance s between light source A and image sensor B is known; in △abB the value of f is known, and the length x of the stripe image can be obtained from the captured image. Using the fixed proportional relationship of the similar triangles △ABC and △abB, the actual distance q between target object C and the self-moving robot can be obtained.
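Under the idealized geometry of Fig. 1, the similar-triangle proportion reduces to q = f·s / x. The sketch below assumes that simplified relation; all numeric values (focal length, baseline, stripe offset) are illustrative placeholders, not calibration data from the invention.

```python
def triangulate_distance(f_mm, s_mm, x_mm):
    """Idealized similar-triangle relation from Fig. 1: q / s = f / x,
    hence q = f * s / x. f is the focal length, s the baseline between
    light source A and image sensor B, x the stripe offset on the film."""
    return f_mm * s_mm / x_mm

# Illustrative values: 4 mm focal length, 50 mm baseline, 0.1 mm offset
q = triangulate_distance(f_mm=4.0, s_mm=50.0, x_mm=0.1)
```

Note the inverse relation: as the target moves farther away, the stripe offset x on the film shrinks, which is why far-range precision degrades faster than near-range precision.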
As to the mounting positions of light source A and image sensor B on the self-moving robot: image sensor B may be mounted directly in front of the self-moving robot and connected to the mainboard of the robot, providing the mainboard with information on the current position for processing and action decisions. Light source A is mounted above the front end of the self-moving robot, approximately in the same vertical plane as image sensor B.
Based on the above basic principle, the present invention provides a line laser ranging method for a self-moving robot that comprises Steps 100 to 500 together with their sub-steps 201-202, 401-403 and 501-502 (including the centroid formula and the requirement n ≥ 20), as set forth in the Summary of the invention above.
Specifically, with reference to Fig. 1, light source A in the present invention is a line laser that projects a laser line onto target object C (Fig. 1 shows the single-point case; the line laser is this laser extended in the direction perpendicular to the plane of the paper). A 640 × 480 CMOS image sensor (imager) then captures the image containing the line laser stripe. Each data row of the image containing the line laser stripe is scanned and processed, and the unwanted background image is discarded, leaving the line laser stripe along the column direction of the image sensor (e.g. the maximum gray-value point and 30 surrounding points). The line laser stripe is then processed to obtain the energy center line of the laser stripe in the column direction. The values of N1, N2 and n may be chosen or adjusted as appropriate according to the required processing accuracy and speed.
In the above Step 400, besides the centroid method, other methods such as a curve fitting algorithm can also be adopted to find the energy center line of the line laser stripe. The curve fitting method makes use of the fact that the light intensity on the cross section of the laser stripe approximately obeys a Gaussian distribution. A curve is fitted to the pixel coordinates and gray values near the peak point of the intensity distribution on the cross section to obtain the stripe center coordinate with sub-pixel accuracy. The specific steps are as follows: (1) search each row of the processed image in turn to find the intensity peak point, i.e. the point with the maximum gray value in that row; denote this point M0(x0, y0), with gray value g0; (2) choose the left-side neighbors M-2(x0, y-2) and M-1(x0, y-1) and the right-side neighbors M1(x0, y1) and M2(x0, y2) of M0, with gray values g-2, g-1, g1 and g2 respectively, and fit a quadratic curve to these five points according to the least-squares principle; (3) obtain the y-coordinate y0* of the maximum point of the fitted curve; then (x0, y0*) is the required light stripe center point. The fitted curve equation may be a parabola or a Gaussian curve. This method is suitable for straight light stripes whose normal direction varies little across the image.
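The five-point parabolic fit can be sketched as follows, assuming NumPy's least-squares polynomial fit; the sample coordinates and the symmetric toy intensity profile are illustrative.

```python
import numpy as np

def subpixel_peak(ys, gs):
    """Least-squares quadratic fit g = a*y^2 + b*y + c through the five
    samples around the intensity peak; the parabola vertex -b/(2a) is the
    sub-pixel stripe center y0*."""
    a, b, _ = np.polyfit(ys, gs, 2)
    return -b / (2.0 * a)

# Peak pixel at y = 10 with two neighbors on each side
ys = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
gs = np.array([40.0, 80.0, 100.0, 80.0, 40.0])   # symmetric cross-section profile
y0_star = subpixel_peak(ys, gs)
```

A Gaussian fit would instead fit log(g) with a parabola; for a stripe whose cross section is close to Gaussian the two give similar centers near the peak.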
Under normal circumstances, because of structural defects inherent in the camera inside the image sensor, the captured image is deformed to some extent, and the deformation grows more severe toward the image edges. Therefore, the deformed image must be corrected by a distortion correction process. Since the laser line extends across much of the image, correcting its distortion to recover the true image is essential for obtaining accurate distances. Through distortion correction, a deformed laser stripe in the image, for example one curved at both sides, can be corrected into a true straight line. However, if, as in the prior art, distortion correction is applied to the entire image captured by the image sensor, every pixel of the whole image has to be traversed, which requires a very large amount of data and information processing. In contrast, the present invention first determines a local image-processing region on the image containing the line laser stripe in Step 200, and then performs distortion correction on that local region in Step 300 to determine the actual position of the line laser stripe in the image. Since the distortion correction algorithm is applied only to the extracted local image, the computation speed, and hence the image processing speed, is greatly increased.
The local image distortion correction specifically comprises:
first, capturing the laser stripe gray-scale image and traversing each row of the image to find the point with the maximum laser gray value;
secondly, taking 30 pixels on each side of the maximum gray-value point in each row and performing the local image distortion correction. Of course, the number of selected pixels is not limited to 30; it may be adjusted according to the required processing accuracy or speed.
The distortion correction formulas are:
x_corrected = x(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p1·x·y + p2·(r² + 2x²)
y_corrected = y(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p2·x·y + p1·(r² + 2y²)
wherein (x, y) is the original position of the distorted point on the captured image, and (x_corrected, y_corrected) is the corrected position; k1, k2, k3, p1 and p2 are the distortion coefficients of the camera, obtained by camera calibration; and r² = x² + y².
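The correction formulas can be applied per point as sketched below. The function mirrors the equations above; actual coefficient values would come from camera calibration, and the zero-coefficient call is only a sanity check, not real data.

```python
def undistort_point(x, y, k1, k2, k3, p1, p2):
    """Applies the radial (k1, k2, k3) and tangential (p1, p2) correction
    formulas above to one point; (x, y) are coordinates relative to the
    image center, and the coefficients come from camera calibration."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_c = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_c = y * radial + 2.0 * p2 * x * y + p1 * (r2 + 2.0 * y * y)
    return x_c, y_c

# With all coefficients zero the point is unchanged (no distortion)
x_c, y_c = undistort_point(0.3, -0.2, 0.0, 0.0, 0.0, 0.0, 0.0)
```

Applying this only to the roughly 61 pixels per row of the local region, rather than to all 640, is what yields the speed-up claimed for Steps 200-300.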
The centroid algorithm is adopted in the above Step 400 of the present invention to find the energy center line of the laser stripe because the preceding step yields only the true laser stripe image, in which the laser stripe data in each row still occupy several pixels, so an accurate calculation is not yet possible. To calculate the corresponding spatial distance, the energy center line of the laser stripe must be extracted; the laser points on this energy center line are located on the image sensor with high precision, typically a few tenths of a pixel, and can therefore correspond to the spatial distance more accurately.
Through the processing of each of the above steps, once the corresponding data and information are obtained, the actual distance q between the self-moving robot and target object C can be calculated by either of the following two methods.
Specifically, in the first method, after the coordinates of the stripe energy center are obtained, the triangulation principle can be used to obtain the spatial distance corresponding to each center point of the laser stripe in the column direction, thus yielding the profile distances corresponding to the laser line. The distance is obtained by establishing a triangular relationship formula according to the triangulation principle of the figure above, in which the variable is yii, and then solving with the other known parameters.
The other method uses a look-up table to calculate the distance: each stripe center row coordinate corresponds to a spatial distance. On this basis a fixed table of distance q versus stripe center row coordinate yii is established by testing; that is, the data of n distances q are measured in advance together with the corresponding row coordinates yii, giving the relationship between row coordinate yii and distance q, from which a reference table is built for later calculation. In actual use, n is determined by the measurement range; for example, a measurement range within 5 meters requires at least 20 sets of correspondence data, i.e. a value of n greater than or equal to 20 is preferred for reliability. Once this reference table has been obtained by prior testing, only the actual stripe center row coordinate needs to be obtained during actual measurement, and a spatial distance can then be derived through an interpolation algorithm.
After all spatial distances are obtained, the self-moving robot can perform path planning and control its walking mode according to the relevant information.
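The look-up-table method can be sketched as follows, assuming NumPy's linear interpolation. The calibration table below contains made-up placeholder values, not measured data; a real table would hold at least 20 measured pairs as stated above.

```python
import numpy as np

def distance_from_lut(yii, lut_yii, lut_q):
    """Steps 501-502: look up the measured center row coordinate yii in the
    pre-calibrated table and linearly interpolate between its neighbors."""
    return float(np.interp(yii, lut_yii, lut_q))

# Hypothetical calibration table (placeholder values, not measured data);
# np.interp requires the row coordinates to be in increasing order
lut_yii = [100.0, 150.0, 200.0, 250.0]   # stripe center row coordinates
lut_q = [5000.0, 2500.0, 1200.0, 600.0]  # corresponding distances in mm
q = distance_from_lut(175.0, lut_yii, lut_q)
```

Because the yii-to-q relation is nonlinear, denser table entries at the far end of the range (where q changes fastest per pixel) improve interpolation accuracy.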
In summary, the line laser ranging method for a self-moving robot provided by the present invention, by means of a line laser distance measuring sensor, remedies the lack of intelligence of existing self-moving robots and simultaneously overcomes the instability, high cost and short lifetime brought by the single-point laser and rotating device in the existing ranging technology. The distance measuring sensor of the present invention offers high efficiency and cost performance in indoor short-range ranging applications of 200 mm to 5000 mm.
The laser ranging sensor of the present invention uses line-structured light as the light source, which is easier to calibrate and adjust than a point laser. The line-structured light cooperates with an area-array CMOS sensor, and the depth distance information of the measurement target is extracted from the two-dimensional gray information in the CMOS image. In line-structured-light ranging the camera needs to be calibrated to obtain its intrinsic and extrinsic parameters; these parameters are used to correct the image distortion and thereby obtain the true laser stripe image. Because the laser line segment displayed after modulation by the target object is a laser stripe of a certain width, which obeys a Gaussian distribution on its cross section, the energy center line of the line laser stripe in the image must be extracted when calculating the spatial distance. The line-structured-light sensor, acting as the "eyes" of the robot, obtains two-dimensional distance information within the sensor's set range and provides the robot's host with a view of the indoor scene; the host uses this information for positioning, route planning and building an indoor map. The robot can then clean the floor along its own planned route, which makes the work of the cleaning robot truly intelligent and greatly improves its efficiency.