CN104236521A - Line-laser ranging method applied to auto-moving robots - Google Patents

Line-laser ranging method applied to auto-moving robots

Info

Publication number
CN104236521A
CN104236521A (application CN201310234951.0A)
Authority
CN
China
Prior art keywords
line
image
line laser
laser
center
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310234951.0A
Other languages
Chinese (zh)
Inventor
汤进举
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co Ltd
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN201310234951.0A priority Critical patent/CN104236521A/en
Priority to PCT/CN2014/079742 priority patent/WO2014198227A1/en
Publication of CN104236521A publication Critical patent/CN104236521A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 - Measuring distances in line of sight; Optical rangefinders
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/46 - Indirect determination of position data
    • G01S17/48 - Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G01S17/88 - Lidar systems specially adapted for specific applications
    • G01S17/93 - Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 - Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Abstract

A line-laser ranging method applied to auto-moving robots comprises: step 100, taking a line laser device arranged on an auto-moving robot as the emission source, causing the line laser device to emit line laser that is projected onto a target object, and using an image sensor to capture an image of the target object containing the line-laser stripe; step 200, determining an image-processing local area on the image containing the line-laser stripe; step 300, performing distortion correction on the image-processing local area so as to determine the actual position of the line-laser stripe in the image; step 400, finding the energy center line of the distortion-corrected line-laser stripe and determining the position of the energy center line in the image; and step 500, obtaining the actual distance from the auto-moving robot to the target object from the pixel coordinates of the energy center line in the image. Because the method processes only a local area of the image, the amount of data to be processed is substantially reduced, which greatly improves processing efficiency and saves working time without affecting accuracy.

Description

Line-laser ranging method for a self-moving robot
Technical field
The present invention relates to a line-laser ranging method, and in particular to a line-laser ranging method for a self-moving robot, belonging to the technical field of small household appliance manufacturing.
Background art
The core technology of a self-moving robot is to perceive the surrounding environment, plan a walking route, traverse each region effectively, and complete its task in each region. Within this technology, the most fundamental and critical part is the "perception" module, i.e. the distance-measuring sensor.
One type of existing cleaning robot uses no distance sensor at all: it detects obstacles with a bump plate and then cleans the floor at random. This causes ineffective, repeated work; for example, some corners are never swept while other areas are cleaned over and over again. The robot operates blindly, its level of intelligence is low, and it wastes time and energy. In some similar applications a laser ranging sensor is now employed as the intelligent front end, but in such applications the principle is to use a single-point laser as the light source and to form a transmit-receive system with a receiving device, computing the distance to objects in space by methods based on time or spatial geometry. Usually, to measure distances over a cross-section, the single-point laser rangefinder is combined with a rotating mechanism to form a two-dimensional cross-section ranging device. The drawbacks of such a device and method are that the rotating mechanism introduces noise, the system is not stable enough, the service life is short, and the cost is high.
Another type of existing cleaning robot uses a rangefinder to detect the distance between the robot and obstacles; according to the distance detected, the robot avoids obstacles in real time and thus prevents collisions. The robot can also locate itself from this distance information and use it to build an indoor map, which facilitates the planning and setting of subsequent walking paths.
At present, most cleaning robots on the market use ultrasonic or infrared sensors to detect the distance to obstacles, but ultrasonic and infrared signals are easily disturbed by external interference, their detection range is small, and their accuracy is low. To further improve the range and accuracy of distance measurement, some cleaning robots have begun to use laser ranging sensors to detect obstacle distances. For example, Chinese patent CN101809461 discloses a cleaning robot that uses a laser ranging sensor comprising a laser source and an image sensor; this sensor computes the distance to an obstacle according to the triangulation principle. To obtain distance information for obstacles throughout a room, the laser detector of that robot is mounted on a rotatable base so that it can continuously scan indoor obstacles through 360 degrees. However, the rotating mechanism of such a rotary laser ranging sensor introduces noise, the system is not stable enough, the service life is short, and the cost is high. In addition, Chinese patent CN101210800 discloses a line-laser ranging sensor comprising a line laser source and a camera. This sensor computes the distance to an obstacle from the pixel position of the laser stripe in the image: its ranging method binarizes the entire image to obtain the laser-stripe data, and after further processing of these data it computes the obstacle distance by triangulation. Because the complete image must be processed, the amount of information is large and the processing speed is slow.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the deficiencies of the prior art, to provide a line-laser ranging method for a self-moving robot which processes only a local area of the captured image of the line-laser stripe projected onto the target object. The amount of data to be processed is thereby greatly reduced, and, without affecting accuracy, the efficiency of the system is improved, the overall running speed is multiplied, working time is saved, and the method is convenient and reliable.
The technical problem to be solved by the present invention is achieved by the following technical solution:
A line-laser ranging method for a self-moving robot, comprising the following steps:
Step 100: using a line laser mounted on the self-moving robot as the light source, emitting line laser onto a target object, and capturing, with an image sensor, an image of the target object containing the line-laser stripe;
Step 200: determining an image-processing local area on the image containing the line-laser stripe;
Step 300: performing distortion correction on the image-processing local area to determine the actual position of the line-laser stripe in the image;
Step 400: finding the energy center line of the distortion-corrected line-laser stripe and determining the position of this energy center line in the image;
Step 500: obtaining the actual distance from the self-moving robot to the target object according to the pixel coordinates of the energy center line in the image.
Said step 200 specifically comprises:
Step 201: traversing every row of the captured image containing the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 202: taking N1 pixels on each side of the maximum-gray-value point found in every row to form the image-processing local area, where 10 ≤ N1 ≤ 30.
A centroid algorithm is used in said step 400 to find the energy center line, specifically comprising:
Step 401: traversing every row of the distortion-corrected image of the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 402: taking another N2 pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
Step 403: applying the centroid method to these pixels to obtain the energy center line of the line-laser stripe image.
The centroid formula is:
yii = Σ( f(x, yi) * yi ) / Σ f(x, yi)
where yii is the pixel column-coordinate position obtained by the centroid computation, (x, yi) is the original pixel coordinate (the coordinate obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi); the yii obtained from the above formula is the position of the energy center point of the line-laser stripe in each row, and the energy center points of all rows form the energy center line.
A curve-fitting method may also be used in said step 400 to find the energy center line of the line-laser stripe.
Said step 500 specifically comprises: establishing a similar-triangle relationship according to the triangulation principle and solving it for the actual distance q from the self-moving robot to the target object.
Alternatively, said step 500 specifically comprises:
Step 501: measuring the energy-center coordinates yii of the line-laser stripe image corresponding to n actual distances q, and establishing a look-up table of actual distance q versus energy-center coordinate;
Step 502: obtaining the actual distance q from the measured energy-center coordinate yii of the line-laser stripe image by interpolation.
The n in said step 501 is greater than or equal to 20.
In summary, the present invention processes only a local area of the captured image of the line-laser stripe projected onto the target object, so that the amount of data to be processed is greatly reduced; without affecting accuracy, the efficiency of the system is improved, the overall running speed is multiplied, and working time is saved.
The technical solution of the present invention is described in detail below with reference to the drawings and specific embodiments.
Brief description of the drawings
Fig. 1 is a schematic diagram of the similar-triangle principle of the present invention.
Embodiment
Fig. 1 is a schematic diagram of the similar-triangle principle of the present invention. As shown in Fig. 1, the light emitted by light source A is projected onto target object C, and image sensor B captures the image of the line-laser stripe projected on target object C. In the present invention, light source A is a line laser, so the light spot it projects onto target object C is stripe-shaped. Light source A, image sensor B and target object C form a triangle ABC. The imaging focal length of image sensor B is f. Because the relative position between light source A and image sensor B is fixed, the reference position a on the sensor plane of image sensor B is determined; the light emitted by light source A is reflected from target object C and lands on the sensor plane at point b, so the stripe image and image sensor B form another triangle abB. Clearly, triangle ABC and triangle abB are similar triangles with a fixed proportional relationship. Since light source A and image sensor B are mounted on the body of the self-moving robot, the perpendicular distance from target object C to the line joining them can be regarded as the actual distance q between target object C and the self-moving robot. In triangle ABC the distance s between light source A and image sensor B is known, in triangle abB the focal length f is known, and the length x of the stripe image (the segment ab) can be obtained from the captured image; using the fixed proportional relationship between the similar triangles ABC and abB, the actual distance q between target object C and the self-moving robot can be obtained.
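For illustration only (this is not part of the patent text), the similar-triangle relationship can be reduced, for an idealized geometry in which the laser beam is parallel to the camera's optical axis and offset from it by the baseline s, to q = f * s / x. The Python sketch below assumes that geometry and uses made-up parameter values:

```python
# Minimal triangulation sketch (not the patent's exact formula): assumes the line
# laser beam is parallel to the camera's optical axis, separated by baseline s.
# The similar triangles of Fig. 1 then give q = f * s / x.

def distance_from_offset(x_offset_px: float,
                         focal_length_mm: float = 2.8,   # assumed focal length f
                         baseline_mm: float = 40.0,      # assumed source-sensor distance s
                         pixel_size_mm: float = 0.006):  # assumed CMOS pixel pitch
    """Return the distance q (mm) to the target for a stripe offset of
    x_offset_px pixels from the reference position a on the sensor."""
    x_mm = x_offset_px * pixel_size_mm           # convert pixel offset to sensor length x
    if x_mm <= 0:
        raise ValueError("stripe offset must be positive")
    return focal_length_mm * baseline_mm / x_mm  # q = f * s / x


if __name__ == "__main__":
    # e.g. a stripe centre found 25 px from the reference position a
    print(f"q = {distance_from_offset(25.0):.1f} mm")
```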
As to the mounting positions of light source A and image sensor B on the self-moving robot, image sensor B can be mounted at the front of the robot and connected to the robot's main board, providing the main board with information about the current position for processing and action decisions. Light source A is mounted above the front end of the robot, roughly in the same vertical plane as image sensor B.
Based on the above basic principle, the present invention provides a line-laser ranging method for a self-moving robot, comprising the following steps:
Step 100: using a line laser mounted on the self-moving robot as the light source, emitting line laser onto a target object, and capturing, with an image sensor, an image of the target object containing the line-laser stripe;
Step 200: determining an image-processing local area on the image containing the line-laser stripe;
Step 300: performing distortion correction on the image-processing local area to determine the actual position of the line-laser stripe in the image;
Step 400: finding the energy center line of the distortion-corrected line-laser stripe and determining the position of this energy center line in the image;
Step 500: obtaining the actual distance from the self-moving robot to the target object according to the pixel coordinates of the energy center line in the image.
Said step 200 specifically comprises (an illustrative sketch of this local-area extraction is given after step 202 below):
Step 201: traversing every row of the captured image of the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 202: taking N1 pixels on each side of the maximum-gray-value point found in every row to form the image-processing local area, where 10 ≤ N1 ≤ 30.
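The following Python sketch illustrates steps 201 and 202; it is not code from the patent, and the NumPy-based interface and function name are assumptions:

```python
import numpy as np

def extract_local_area(gray: np.ndarray, n1: int = 30):
    """Steps 201-202 (sketch): for every row of the grayscale stripe image,
    find the pixel with the maximum gray value and keep n1 pixels on each
    side of it.  Returns, per row, the column index of the maximum and the
    (start, end) column range of the local area."""
    assert 10 <= n1 <= 30, "the patent suggests 10 <= N1 <= 30"
    height, width = gray.shape
    windows = []
    for row in range(height):
        peak_col = int(np.argmax(gray[row]))   # brightest point in this row
        start = max(peak_col - n1, 0)          # clamp the window to image bounds
        end = min(peak_col + n1 + 1, width)
        windows.append((peak_col, start, end))
    return windows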
A centroid algorithm is used in said step 400 to find the energy center line, specifically comprising (an illustrative sketch follows the formula below):
Step 401: traversing every row of the distortion-corrected image of the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 402: taking another N2 pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
Step 403: applying the centroid method to these pixels to obtain the energy center line of the line-laser stripe image.
The centroid formula is:
yii = Σ( f(x, yi) * yi ) / Σ f(x, yi)
where yii is the pixel column-coordinate position obtained by the centroid computation, (x, yi) is the original pixel coordinate (the coordinate obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi); the yii obtained from the above formula is the position of the energy center point of the line-laser stripe in each row, and the energy center points of all rows form the energy center line.
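As an illustration only (not the patent's implementation), the per-row centroid of steps 401 to 403 could be sketched as follows; the default window size and the NumPy interface are assumptions:

```python
import numpy as np

def stripe_centroids(corrected: np.ndarray, n2: int = 10):
    """Steps 401-403 (sketch): compute, for every row of the distortion-corrected
    stripe image, the sub-pixel centroid yii = sum(f*yi) / sum(f) over a window
    of n2 pixels on each side of the row's brightest pixel."""
    assert 5 <= n2 <= 20, "the patent suggests 5 <= N2 <= 20"
    height, width = corrected.shape
    centers = np.full(height, np.nan)
    for row in range(height):
        peak = int(np.argmax(corrected[row]))
        lo, hi = max(peak - n2, 0), min(peak + n2 + 1, width)
        cols = np.arange(lo, hi, dtype=float)          # candidate positions yi
        weights = corrected[row, lo:hi].astype(float)  # gray values f(x, yi)
        if weights.sum() > 0:
            centers[row] = float((weights * cols).sum() / weights.sum())
    return centers  # energy centre point per row; together they form the centre line
```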
Said step 500 specifically comprises: establishing a similar-triangle relationship according to the triangulation principle and solving it for the actual distance q from the self-moving robot to the target object.
Alternatively, said step 500 specifically comprises:
Step 501: measuring the energy-center coordinates yii of the line-laser stripe image corresponding to n actual distances q, and establishing a look-up table of actual distance q versus energy-center coordinate;
Step 502: obtaining the actual distance q from the measured energy-center coordinate yii of the line-laser stripe image by interpolation.
The n in said step 501 is greater than or equal to 20.
Specifically, with reference to Fig. 1, light source A in the present invention is a line laser that projects line laser onto target object C (Fig. 1 shows the single-point case; the line laser is the laser expanded in the direction perpendicular to the plane of the paper), and a 640 × 480 CMOS image sensor (imager) captures the image containing the line-laser stripe. Every data row of the image containing the line-laser stripe is scanned and processed, and the unwanted background image is discarded, so that a line-laser stripe region is formed along the column direction of the image sensor (for example, the maximum-gray-value point of each row together with 30 surrounding pixels); the stripe is then processed to obtain the energy center line of the laser stripe in the column direction. The values of N1, N2 and n can be chosen or adjusted as appropriate according to the required processing accuracy and speed.
In the above step 400, besides the centroid method, other methods such as curve fitting can also be used to find the energy center line of the line-laser stripe. The curve-fitting method exploits the fact that the light intensity over the cross-section of the laser stripe approximately follows a Gaussian distribution: a curve is fitted to the pixel coordinates and gray values near the peak of the intensity distribution on the cross-section to obtain the stripe-center coordinate with sub-pixel accuracy. The specific steps are as follows: (1) search every row of the processed image in turn for the light-intensity peak point, i.e. the point with the maximum gray value in that row, and denote this point M0(x0, y0) with gray value g0; (2) take the neighbouring points M-2(x0, y-2) and M-1(x0, y-1) on the left of M0 and M1(x0, y1) and M2(x0, y2) on its right, with gray values g-2, g-1, g1 and g2 respectively, and fit a conic curve to these five points by least squares; (3) the coordinate y0* at which the fitted curve reaches its maximum gives the required stripe center point (x0, y0*). The fitted curve may be a parabola or a Gaussian curve. This method is suitable for straight stripes whose normal direction changes little across the image.
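Purely as an illustrative sketch of the curve-fitting alternative (not the patent's implementation), a least-squares parabola can be fitted to the five points described above and its vertex taken as the sub-pixel stripe center; the closed-form vertex extraction is an assumption about how step (3) would be carried out:

```python
import numpy as np

def parabola_subpixel_center(row_gray: np.ndarray) -> float:
    """Curve-fitting sketch: fit g(y) = a*y^2 + b*y + c by least squares to the
    peak pixel of one image row and its two neighbours on each side, then return
    the sub-pixel position y0* = -b / (2a) where the parabola is maximal."""
    peak = int(np.argmax(row_gray))
    # need two neighbours on each side; fall back to the integer peak otherwise
    if peak < 2 or peak > len(row_gray) - 3:
        return float(peak)
    ys = np.arange(peak - 2, peak + 3, dtype=float)   # y-2 .. y2
    gs = row_gray[peak - 2: peak + 3].astype(float)   # g-2 .. g2
    a, b, c = np.polyfit(ys, gs, 2)                   # least-squares conic fit
    if a >= 0:                                        # degenerate fit: no maximum
        return float(peak)
    return float(-b / (2.0 * a))                      # vertex of the parabola
```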
Under normal circumstances, structural imperfections inside the camera of the image sensor cause the captured image to be deformed to a certain extent; the closer to the edge of the image, the more severe the distortion. The distorted image must therefore be corrected. Because the line laser may fall on any part of the image, correcting its distortion to recover the true image is essential for obtaining accurate distances. Through distortion correction, deformed laser stripes in the image, for example curved stripes near the two sides, can be corrected back to their true straight shape. However, if, as in the prior art, the whole image captured by the image sensor is distortion-corrected, every pixel of the entire image must be traversed, which requires a very large amount of data and information processing. In the present invention, by contrast, the image-processing local area is first determined on the image containing the line-laser stripe in step 200, and distortion correction is then applied to that local area in step 300 to determine the actual position of the line-laser stripe in the image. Applying the distortion-correction algorithm only to the extracted local image greatly accelerates the computation and therefore the image processing.
Local-image distortion correction specifically comprises:
First, capturing the gray image of the laser stripe and traversing every row of the image to find the point with the maximum laser gray value;
Second, taking 30 pixels on each side of the maximum-gray-value point in every row and applying local distortion correction to them. The number of selected pixels is of course not limited to 30; it can be adjusted according to the required processing accuracy or speed.
The distortion-correction formulas are:
x_corrected = x(1 + k1*r^2 + k2*r^4 + k3*r^6) + 2*p1*x*y + p2*(r^2 + 2*x^2)
y_corrected = y(1 + k1*r^2 + k2*r^4 + k3*r^6) + 2*p2*x*y + p1*(r^2 + 2*y^2)
where (x, y) is the original position of a distorted point on the captured image, (x_corrected, y_corrected) is the corrected position, k1, k2, k3, p1 and p2 are the distortion coefficients of the camera, obtained by camera calibration, and r^2 = x^2 + y^2.
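The sketch below simply applies the correction formulas above to an array of points; it assumes the coordinates have already been normalized with the camera intrinsics, a detail the description does not spell out, and is not the patent's own code:

```python
import numpy as np

def correct_points(xy: np.ndarray, k1: float, k2: float, k3: float,
                   p1: float, p2: float) -> np.ndarray:
    """Apply the distortion formulas of the description to an (N, 2) array of
    normalized point coordinates (x, y).  The coefficients k1, k2, k3, p1, p2
    are assumed to come from a prior camera calibration."""
    x, y = xy[:, 0], xy[:, 1]
    r2 = x * x + y * y                                  # r^2 = x^2 + y^2
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_corr = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_corr = y * radial + 2.0 * p2 * x * y + p1 * (r2 + 2.0 * y * y)
    return np.stack([x_corr, y_corr], axis=1)
```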
The reason a centroid algorithm is used in the above step 400 of the present invention to find the energy center line of the laser stripe is that the previous step only yields the true laser-stripe image, in which the stripe data in each row still occupy several pixels, so an accurate calculation is not yet possible. To compute the corresponding spatial distance, the energy center line of the laser stripe must be extracted; the laser pixels on this center line have high gray levels on the image sensor, the center is located to within a few tenths of a pixel, and it therefore corresponds to the spatial distance more accurately.
After the data and information from each of the above steps have been obtained, the actual distance q between the self-moving robot and target object C can be calculated, by either of the following two methods.
Specifically, in the first method, after the coordinates of the stripe energy center have been obtained, the triangulation principle is used to obtain the spatial distance corresponding to each center point along the column direction of the laser stripe, which yields the cross-section distances corresponding to the laser line. This method also obtains the distance according to the triangulation principle of the figure above: a similar-triangle relationship is established whose variable is yii, and the distance is then calculated from yii together with the other known parameters.
The other method uses a look-up table to calculate the distance, i.e. each stripe-center coordinate corresponds to a spatial distance. On this basis a fixed table of distance Z versus stripe-center coordinate yii is established by testing: the data for n distances Z, together with the corresponding coordinates yii, are measured in advance, giving the relationship between the coordinate yii and the distance Z, from which a reference table is built for later calculation. In actual use, n is determined by the measurement range; for example, a measurement range within 5 meters requires at least 20 groups of correspondence data, that is, a value of n greater than or equal to 20 is reliable. Once this reference table has been obtained by testing in advance, only the actual stripe-center coordinate needs to be obtained during measurement, and the corresponding spatial distance is then found by interpolation. A sketch of this table-plus-interpolation approach is given below.
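Purely as an illustration of the table-plus-interpolation idea, and with made-up calibration numbers rather than values from the patent, a minimal sketch could look like this:

```python
import numpy as np

# Hypothetical calibration table: stripe-centre coordinate yii (pixels) measured
# at n known distances Z (mm).  Real values would come from the advance test,
# and a real table would contain at least 20 such pairs for a 5 m range.
YII_TABLE = np.array([310.0, 286.0, 268.0, 254.0, 243.0, 235.0, 228.0, 222.0])
Z_TABLE   = np.array([200.0, 400.0, 700.0, 1000.0, 1500.0, 2000.0, 3000.0, 5000.0])

def distance_from_table(yii: float) -> float:
    """Look up the spatial distance for a measured stripe-centre coordinate yii
    by linear interpolation in the pre-measured (yii, Z) table."""
    order = np.argsort(YII_TABLE)   # np.interp needs increasing x values
    return float(np.interp(yii, YII_TABLE[order], Z_TABLE[order]))

if __name__ == "__main__":
    print(f"Z = {distance_from_table(260.0):.0f} mm")
```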
After all the spatial distances have been obtained, the self-moving robot can plan its path and control its walking behaviour according to the relevant information.
In summary, the line-laser ranging method for a self-moving robot provided by the present invention uses a line-laser ranging sensor to overcome the lack of intelligence of existing self-moving robots, and at the same time overcomes the instability, high cost and short service life caused by the single-point laser and rotating mechanism in existing ranging technology. The ranging sensor of the present invention is highly efficient and cost-effective in short-range indoor ranging applications of 200 mm to 5000 mm.
The laser ranging sensor of the present invention uses line-structured light as the light source, which is easier to calibrate and adjust than a point laser. The line-structured light works together with an area-array CMOS sensor, and the depth information of the measurement target is extracted from the two-dimensional gray information in the CMOS image. In line-structured-light ranging the camera must be calibrated to obtain its intrinsic and extrinsic parameters; these parameters are used to correct the distortion of the image and thus obtain the true laser-stripe image. Because the laser line modulated by the target object appears as a laser stripe of a certain width, whose cross-section follows a Gaussian distribution, the energy center line of the line-laser stripe in the image must be extracted when computing the spatial distance. The line-structured-light sensor acts as the "eyes" of the robot, obtaining two-dimensional distance information within the range set by the sensor and providing the robot's main controller with a view of the indoor scene; the main controller uses this information to locate itself, plan routes and build an indoor map. The robot can then clean the floor along the route it has planned from the map, which makes the work of the cleaning robot truly intelligent and greatly improves its efficiency.

Claims (7)

1. A line-laser ranging method for a self-moving robot, the method comprising the following steps:
Step 100: using a line laser mounted on the self-moving robot as the light source, emitting line laser onto a target object, and capturing, with the image sensor on the self-moving robot, an image of the target object containing the line-laser stripe;
Step 200: determining an image-processing local area on the image containing the line-laser stripe;
Step 300: performing distortion correction on the image-processing local area to determine the actual position of the line-laser stripe in the image;
Step 400: finding the energy center line of the distortion-corrected line-laser stripe and determining the position of this energy center line in the image;
Step 500: obtaining the actual distance (q) from the self-moving robot to the target object according to the pixel coordinates of the energy center line in the image.
2. The line-laser ranging method for a self-moving robot according to claim 1, characterized in that said step 200 specifically comprises:
Step 201: traversing every row of the captured image containing the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 202: taking N1 pixels on each side of the maximum-gray-value point found in every row to form the image-processing local area, where 10 ≤ N1 ≤ 30.
3. The line-laser ranging method for a self-moving robot according to claim 1, characterized in that a centroid algorithm is used in said step 400 to find the energy center line of the line-laser stripe, specifically comprising:
Step 401: traversing every row of the distortion-corrected image of the line-laser stripe and finding the point with the maximum laser gray value in that row;
Step 402: taking another N2 pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
Step 403: applying the centroid method to these pixels to obtain the energy center line of the line-laser stripe image;
the centroid formula being:
yii = Σ( f(x, yi) * yi ) / Σ f(x, yi)
where yii is the pixel column-coordinate position obtained by the centroid computation, (x, yi) is the original pixel coordinate (the coordinate obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi); the yii obtained from the above formula is the position of the energy center point of the line-laser stripe in each row, and the energy center points of all rows form the energy center line.
4. The line-laser ranging method for a self-moving robot according to claim 1, characterized in that a curve-fitting method is used in said step 400 to find the energy center line of the line-laser stripe.
5. The line-laser ranging method for a self-moving robot according to claim 1, characterized in that said step 500 specifically comprises: establishing a similar-triangle relationship according to the triangulation principle and solving it for the actual distance (q) from the self-moving robot to the target object.
6. The line-laser ranging method for a self-moving robot according to claim 1, characterized in that said step 500 specifically comprises:
Step 501: measuring the energy-center coordinates yii of the line-laser stripe image corresponding to n actual distances (q), and establishing a look-up table of actual distance (q) versus energy-center coordinate;
Step 502: obtaining the actual distance (q) from the measured energy-center coordinate yii of the line-laser stripe image by interpolation.
7. The line-laser ranging method for a self-moving robot according to claim 6, characterized in that the n in said step 501 is greater than or equal to 20.
CN201310234951.0A 2013-06-14 2013-06-14 Line-laser ranging method applied to auto-moving robots Pending CN104236521A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310234951.0A CN104236521A (en) 2013-06-14 2013-06-14 Line-laser ranging method applied to auto-moving robots
PCT/CN2014/079742 WO2014198227A1 (en) 2013-06-14 2014-06-12 Line laser ranging method used for self-moving robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310234951.0A CN104236521A (en) 2013-06-14 2013-06-14 Line-laser ranging method applied to auto-moving robots

Publications (1)

Publication Number Publication Date
CN104236521A true CN104236521A (en) 2014-12-24

Family

ID=52021667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310234951.0A Pending CN104236521A (en) 2013-06-14 2013-06-14 Line-laser ranging method applied to auto-moving robots

Country Status (2)

Country Link
CN (1) CN104236521A (en)
WO (1) WO2014198227A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109774197B (en) * 2018-07-13 2021-05-07 中国航空工业集团公司济南特种结构研究所 Method for determining position of laser projector of composite material curved surface layering

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496754B2 (en) * 2000-11-17 2002-12-17 Samsung Kwangju Electronics Co., Ltd. Mobile robot and course adjusting method thereof
JP2007530978A (en) * 2004-03-29 2007-11-01 エヴォリューション ロボティクス インコーポレイテッド Position estimation method and apparatus using reflected light source
KR100638220B1 (en) * 2005-04-23 2006-10-27 엘지전자 주식회사 Position sensing device of mobile robot and robot cleaner equipped with it
JP4609734B2 (en) * 2007-09-05 2011-01-12 カシオ計算機株式会社 Distance measuring device and projector provided with the distance measuring device
TWI461656B (en) * 2011-12-01 2014-11-21 Ind Tech Res Inst Apparatus and method for sencing distance
CN102980528B (en) * 2012-11-21 2015-07-08 上海交通大学 Calibration method of pose position-free constraint line laser monocular vision three-dimensional measurement sensor parameters

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200817651A (en) * 2006-10-02 2008-04-16 Ming-Chih Lu Distance measurement system and method
CN101809461A (en) * 2007-07-19 2010-08-18 Neato机器人技术公司 Distance sensor system and method
CN101587591A (en) * 2009-05-27 2009-11-25 北京航空航天大学 Visual accurate tracking technique based on double parameter thresholds dividing
KR20120007735A (en) * 2010-07-15 2012-01-25 삼성전기주식회사 Distance measuring module and electronic device including the same
CN102889864A (en) * 2011-07-19 2013-01-23 中铝上海铜业有限公司 Detection system for tower shape of object with strip coil edge and detection method thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
冯翠芝 (Feng Cuizhi): "Target depth detection based on the line-laser triangulation method", China Master's Theses Full-text Database, Information Science and Technology series *
刘振 et al. (Liu Zhen et al.): "Laser stripe center extraction based on a cross-correlation algorithm", Chinese Journal of Lasers *

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657981B (en) * 2015-01-07 2017-05-24 大连理工大学 Dynamic compensation method for three-dimensional laser distance metering data of mobile robot in moving process
CN104657981A (en) * 2015-01-07 2015-05-27 大连理工大学 Dynamic compensation method for three-dimensional laser distance metering data of mobile robot in moving process
CN105890047A (en) * 2015-01-21 2016-08-24 深圳市沃森空调技术有限公司 Air-conditioning robot
CN104859562A (en) * 2015-05-27 2015-08-26 北京信息科技大学 Method and device for detecting barriers behind vehicle
CN106371101B (en) * 2015-07-20 2019-08-16 北醒(北京)光子科技有限公司 A kind of device of intelligent distance-measuring and avoidance
CN106371101A (en) * 2015-07-20 2017-02-01 北醒(北京)光子科技有限公司 Intelligent range finding and obstacle avoidance device
WO2017198038A1 (en) * 2016-05-17 2017-11-23 安徽酷哇机器人有限公司 Laser triangulation system safe for human eyes
CN107436439A (en) * 2016-05-27 2017-12-05 科沃斯机器人股份有限公司 The installation method of laser ranging system and its sensitive chip
WO2017202384A1 (en) * 2016-05-27 2017-11-30 科沃斯机器人股份有限公司 Laser ranging device and installation method for photosensitive chip thereof
CN106125738A (en) * 2016-08-26 2016-11-16 北京航空航天大学 A kind of identification of pallets device and method based on AGV
CN106226755A (en) * 2016-08-30 2016-12-14 北京小米移动软件有限公司 Robot
CN106092146A (en) * 2016-08-30 2016-11-09 宁波菜鸟智能科技有限公司 Laser ranging bearing calibration and system
CN106226755B (en) * 2016-08-30 2019-05-21 北京小米移动软件有限公司 Robot
WO2018127209A1 (en) * 2017-01-09 2018-07-12 苏州宝时得电动工具有限公司 Autonomous moving device, and positioning system, positioning method and control method therefor
CN108398694B (en) * 2017-02-06 2024-03-15 苏州宝时得电动工具有限公司 Laser range finder and laser range finding method
CN108398694A (en) * 2017-02-06 2018-08-14 苏州宝时得电动工具有限公司 Laser range finder and laser distance measurement method
CN109143167A (en) * 2017-06-28 2019-01-04 杭州海康机器人技术有限公司 A kind of complaint message acquisition device and method
WO2019001001A1 (en) * 2017-06-28 2019-01-03 杭州海康机器人技术有限公司 Obstacle information acquisition apparatus and method
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device
CN107861113A (en) * 2017-11-06 2018-03-30 深圳市杉川机器人有限公司 Scaling method and device
CN108345002A (en) * 2018-02-27 2018-07-31 上海图漾信息科技有限公司 Structure light measurement device and method
CN109615586A (en) * 2018-05-07 2019-04-12 杭州新瀚光电科技有限公司 Infrared image distortion correction algorithm
CN108594205A (en) * 2018-06-21 2018-09-28 深圳市镭神智能系统有限公司 A kind of laser radar based on line laser
CN111238453B (en) * 2018-11-28 2023-09-05 赫克斯冈技术中心 Intelligent positioning module
CN110781779A (en) * 2019-10-11 2020-02-11 北京地平线机器人技术研发有限公司 Object position detection method and device, readable storage medium and electronic equipment
WO2021135392A1 (en) * 2019-12-30 2021-07-08 科沃斯机器人股份有限公司 Structured light module and autonomous moving apparatus
CN112116619A (en) * 2020-09-16 2020-12-22 昆明理工大学 Multi-line structured light system stripe center line extraction method based on structural constraint
CN112595385A (en) * 2020-11-25 2021-04-02 创新奇智(南京)科技有限公司 Target height obtaining method and device
CN113050113A (en) * 2021-03-10 2021-06-29 广州南方卫星导航仪器有限公司 Laser point positioning method and device
CN113050113B (en) * 2021-03-10 2023-08-01 广州南方卫星导航仪器有限公司 Laser spot positioning method and device
WO2022227876A1 (en) * 2021-04-29 2022-11-03 北京石头世纪科技股份有限公司 Distance measurement method and apparatus, and robot and storage medium

Also Published As

Publication number Publication date
WO2014198227A1 (en) 2014-12-18

Similar Documents

Publication Publication Date Title
CN104236521A (en) Line-laser ranging method applied to auto-moving robots
CN102927908B (en) Robot eye-on-hand system structured light plane parameter calibration device and method
CN106780601B (en) Spatial position tracking method and device and intelligent equipment
CN103512579B (en) A kind of map constructing method based on thermal infrared video camera and laser range finder
CN105160702B (en) The stereopsis dense Stereo Matching method and system aided in based on LiDAR point cloud
KR102452393B1 (en) Detector for optically determining a position of at least one object
CN102650886B (en) Vision system based on active panoramic vision sensor for robot
CN111046776A (en) Mobile robot traveling path obstacle detection method based on depth camera
CN103424083B (en) The detection method of Object Depth, device and system
CN103837085B (en) The displacement of targets device for measuring vector quantity demarcated based on laser tracker pointwise and method
CN104361603B (en) Gun camera image target designating method and system
CN105069743A (en) Detector splicing real-time image registration method
Hui et al. A novel line scan camera calibration technique with an auxiliary frame camera
CN206611521U (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
Lyu et al. An interactive lidar to camera calibration
CN111288891B (en) Non-contact three-dimensional measurement positioning system, method and storage medium
CN102519434A (en) Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
Cvišić et al. Recalibrating the KITTI dataset camera setup for improved odometry accuracy
CN106403900A (en) Flyer tracking and locating system and method
CN113013781B (en) Laser emission and dynamic calibration device, method, equipment and medium based on image processing
CN112950696A (en) Navigation map generation method and generation device and electronic equipment
CN105865462A (en) Three dimensional SLAM method based on events with depth enhanced vision sensor
CN106524995A (en) Positioning method for detecting spatial distances of target objects on basis of visible-light images in real time
CN113920201A (en) Polar line geometric constraint fisheye camera calibration method
Shacklock et al. Visual guidance for autonomous vehicles: capability and challenges

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Applicant after: ECOVACS ROBOTICS Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Applicant before: ECOVACS ROBOTICS (SUZHOU) Co.,Ltd.

COR Change of bibliographic data
RJ01 Rejection of invention patent application after publication

Application publication date: 20141224

RJ01 Rejection of invention patent application after publication