WO2014198227A1 - Line laser ranging method used for self-moving robot - Google Patents

Line laser ranging method used for self-moving robot

Info

Publication number
WO2014198227A1
WO2014198227A1 PCT/CN2014/079742 CN2014079742W
Authority
WO
WIPO (PCT)
Prior art keywords
line
image
line laser
self
energy center
Prior art date
Application number
PCT/CN2014/079742
Other languages
French (fr)
Chinese (zh)
Inventor
汤进举
Original Assignee
科沃斯机器人有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 科沃斯机器人有限公司 filed Critical 科沃斯机器人有限公司
Publication of WO2014198227A1 publication Critical patent/WO2014198227A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • The invention relates to a line laser ranging method, in particular to a line laser ranging method for a self-moving robot, belonging to the technical field of small household appliance manufacturing. Background art
  • The core technology of a self-moving robot is to perceive the surrounding environment, plan a walking route, traverse each area effectively, and complete the work in each area.
  • The most basic and critical part of this technology is the "sensing" module, i.e. the laser ranging sensor.
  • One type of existing sweeping robot does not use a ranging sensor: obstacles are detected by a bump plate and the floor is cleaned at random. This results in much wasted cleaning, for example some corners are never cleaned while other areas are cleaned several times; the work is blind, the level of intelligence is low, and it is time-consuming and laborious.
  • Some similar applications now use laser ranging sensors as their intelligent front end. In these applications a single-point laser serves as the light source and, together with a receiving device, forms a transmit-receive system that computes the distance of an object in space by time, geometry, and similar methods.
  • Usually, a section distance is measured by combining single-point laser ranging with a rotating device, constructing a two-dimensional section distance measuring device. The drawbacks of this kind of device and method are that the rotating motor introduces noise, the system is not stable enough, the service life is short, and the cost is high.
  • Another type of existing sweeping robot uses a distance detector to measure the distance between the robot and an obstacle.
  • The sweeping robot avoids ground obstacles in real time according to the distance detected by the detector, preventing collisions.
  • The sweeping robot can also locate itself from the distance information, and can use that information to build an indoor map, which facilitates planning and setting of the subsequent walking path.
  • In order to obtain distance information for the whole room, the laser detector on such a sweeping robot is mounted on a rotatable base, so that the detector can continuously scan the obstacles in the room through 360 degrees.
  • However, the rotating motor of this rotating laser ranging sensor introduces noise, the system is not stable enough, the service life is short, and the cost is high.
  • Chinese patent CN101210800 also discloses a line laser ranging sensor comprising a line laser source and a camera. This line laser ranging sensor calculates the obstacle distance from the pixel position of the laser stripe in the image: the whole image is binarized to obtain the laser stripe data, which is then post-processed and fed into a triangulation calculation of the obstacle distance.
  • However, because the complete image must be processed, the amount of information is large and the processing speed is slow. Summary of the invention
  • In view of the deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a line laser ranging method for a self-moving robot that processes only a local region of the captured image of the line laser stripe projected on the target object.
  • This greatly reduces the data to be processed and improves system efficiency without affecting accuracy, so that the whole system runs at roughly double speed, saving working time, conveniently and reliably.
  • A line laser ranging method for a self-moving robot comprises the following steps:
  • Step 100: using a line laser disposed on the self-moving robot as the light source, the emitted line laser is projected on the target object, and an image sensor captures an image containing the line laser stripe projected on the target object;
  • Step 200: determine an image-processing local region on the image containing the line laser stripe;
  • Step 300: perform distortion correction on the image-processing local region to determine the true position of the line laser stripe on the image;
  • Step 400: find the energy center line of the distortion-corrected line laser stripe and determine its position on the image;
  • Step 500: from the pixel coordinates of the energy center line on the image, obtain the actual distance from the self-moving robot to the target object.
  • The step 200 specifically includes:
  • Step 201: traverse each line of the captured image containing the line laser stripe to find the point with the largest laser gray value in that line;
  • Step 202: take N1 pixels on each side of the maximum-gray-value point found in each line to form the image-processing local region, where 10 ≤ N1 ≤ 30.
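The row-wise windowing of steps 201–202 can be sketched in Python with NumPy. This is an illustrative sketch, not the patent's own code; the function name and the default N1 value are assumptions.

```python
import numpy as np

def extract_local_region(img, n1=20):
    """Steps 201-202 sketch: for each image row, find the pixel with the
    largest gray value (the laser stripe) and keep N1 pixels on each side
    of it, zeroing out the rest of the row as background."""
    assert 10 <= n1 <= 30, "the method specifies 10 <= N1 <= 30"
    h, w = img.shape
    region = np.zeros_like(img)
    for row in range(h):
        peak = int(np.argmax(img[row]))          # column of the max gray value
        lo, hi = max(0, peak - n1), min(w, peak + n1 + 1)
        region[row, lo:hi] = img[row, lo:hi]     # keep only the local window
    return region
```

Only the pixels inside the window survive, so every later stage (distortion correction, centroid) touches at most 2·N1+1 pixels per row instead of the full image width.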
  • The step 400 uses a centroid algorithm to find the energy center line, which specifically includes:
  • Step 401: in the distortion-corrected image of the line laser stripe, traverse each line to find the point with the largest laser gray value in that line;
  • Step 402: take N2 further pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
  • Step 403: apply the centroid method to these pixels to obtain the energy center line of the line laser stripe image.
  • The centroid algorithm formula is:
  • yi′ = Σ( f(x, yi) · yi ) / Σ f(x, yi)
  • where yi′ is the pixel column coordinate after the centroid algorithm, (x, yi) is the original pixel coordinate (obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi).
  • The yi′ obtained from the above formula is the position of the energy center point of the line laser stripe image in each line; the energy center points of all lines constitute the energy center line.
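The centroid formula above can be sketched directly in Python. This is an illustrative implementation of the gray-weighted centroid, with an assumed function name; it expects the masked local-region rows produced earlier.

```python
import numpy as np

def energy_center_line(region):
    """Step 403 / centroid formula sketch: for each row x, the sub-pixel
    column coordinate is yi' = sum(f(x, yi) * yi) / sum(f(x, yi)),
    taken over the pixels kept in the local window."""
    centers = []
    for row in np.asarray(region, dtype=float):
        total = row.sum()
        if total == 0:                      # no stripe found in this row
            centers.append(float("nan"))
            continue
        cols = np.arange(len(row))
        centers.append(float((row * cols).sum() / total))
    return centers
```

Because the gray values act as weights, the result is a sub-pixel coordinate even though the input pixels lie on an integer grid.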
  • A curve-fitting method can also be used in the step 400 to find the energy center line of the line laser stripe.
  • The step 500 specifically includes: establishing a triangular relationship formula according to the triangulation principle, and obtaining the actual distance q from the self-moving robot to the target object.
  • Alternatively, the step 500 specifically includes:
  • Step 501: measure the energy-center column coordinates yi′ of the line laser stripe image corresponding to n actual distances q, and establish a lookup table between the actual distance q and the energy-center column coordinate;
  • Step 502: from the energy-center column coordinate yi′ of the actual line laser stripe image, obtain the actual distance q by an interpolation algorithm.
  • The n in the step 501 is greater than or equal to 20.
  • The present invention performs local-region processing only on the captured image of the line laser stripe projected on the target object, so the data to be processed is greatly reduced and system efficiency is improved without affecting accuracy. This roughly doubles the operating speed of the entire system and saves working time.
  • Figure 1 is a schematic diagram of the similar-triangle principle of the present invention. Detailed description
  • As shown in Figure 1, the light emitted from the light source A is projected onto the target object C, and an image containing the line laser stripe projected on the target object C is captured by the image sensor B.
  • The light source A is a line laser; therefore, the spot projected on the target object C is stripe-shaped.
  • A triangle △ABC is formed by the light source A, the image sensor B, and the target object C.
  • The light emitted by the light source A is reflected back from the target object C; its landing point on the film is b, and another triangle △abB is formed between the stripe image and the image sensor.
  • △ABC and △abB are similar triangles with a fixed proportional relationship. Since the light source A and the image sensor B are both mounted on the body of the self-moving robot, the perpendicular distance from the target object C to the line connecting the two can be regarded as the actual distance q between the target object C and the robot. In △ABC, the distance s between the light source A and the image sensor B is fixed.
  • In △abB, the value of f (the image distance) is fixed, and the length x of the line stripe image can be measured. Using the fact that △ABC and △abB are similar triangles with a fixed proportional relationship, the actual distance q between the target object C and the self-moving robot can be obtained.
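The similar-triangle relation described above can be sketched as follows. The patent states the similarity but does not print a closed-form expression; the relation q = f·s/x below is the standard laser-triangulation formula implied by that geometry, and the function name is an assumption.

```python
def triangulate_distance(f, s, x):
    """Similar-triangle ranging sketch: with baseline s between light source A
    and image sensor B, image distance f, and measured stripe image offset x,
    the similarity of triangles ABC and abB gives q / s = f / x, so
    q = f * s / x.  f and s are fixed by the sensor layout; x varies with
    the target distance.  Units must be consistent (e.g. all millimeters)."""
    if x <= 0:
        raise ValueError("the stripe image offset x must be positive")
    return f * s / x
```

Note the inverse relationship: as the target object C moves closer, the image offset x grows and the computed distance q shrinks, which is why a sub-pixel estimate of x matters most at long range.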
  • The image sensor B can be disposed directly in front of the self-moving robot and connected to the robot's motherboard, providing the current position information for the board to process and make decisions on.
  • The light source A is disposed at the top of the front end of the robot, substantially in the same vertical plane as the image sensor B.
  • The present invention provides a line laser ranging method for a self-moving robot, the method comprising the following steps:
  • Step 100: using a line laser disposed on the self-moving robot as the light source, the emitted line laser is projected on the target object, and an image sensor captures an image containing the line laser stripe projected on the target object;
  • Step 200: determine an image-processing local region on the image containing the line laser stripe;
  • Step 300: perform distortion correction on the image-processing local region to determine the true position of the line laser stripe on the image;
  • Step 400: find the energy center line of the distortion-corrected line laser stripe and determine its position on the image;
  • Step 500: from the pixel coordinates of the energy center line on the image, obtain the actual distance from the self-moving robot to the target object.
  • The step 200 specifically includes:
  • Step 201: traverse each line of the captured image containing the line laser stripe to find the point with the largest laser gray value in that line;
  • Step 202: take N1 pixels on each side of the maximum-gray-value point found in each line to form the image-processing local region, where 10 ≤ N1 ≤ 30.
  • The step 400 uses a centroid algorithm to find the energy center line, which specifically includes:
  • Step 401: in the distortion-corrected image of the line laser stripe, traverse each line to find the point with the largest laser gray value in that line;
  • Step 402: take N2 further pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
  • Step 403: apply the centroid method to these pixels to obtain the energy center line of the line laser stripe image.
  • The centroid algorithm formula is:
  • yi′ = Σ( f(x, yi) · yi ) / Σ f(x, yi)
  • where yi′ is the pixel column coordinate after the centroid algorithm, (x, yi) is the original pixel coordinate (obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi).
  • The yi′ obtained from the above formula is the position of the energy center point of the line laser stripe image in each line; the energy center points of all lines constitute the energy center line.
  • The step 500 specifically includes: establishing a triangular relationship formula according to the triangulation principle, and obtaining the actual distance q from the self-moving robot to the target object.
  • Alternatively, the step 500 specifically includes:
  • Step 501: measure the energy-center column coordinates yi′ of the line laser stripe image corresponding to n actual distances q, and establish a lookup table between the actual distance q and the energy-center column coordinate;
  • Step 502: from the energy-center column coordinate yi′ of the actual line laser stripe image, obtain the actual distance q by an interpolation algorithm.
  • The n in the step 501 is greater than or equal to 20.
  • The light source A in the present invention is a line laser whose emitted beam is projected on the target object C (Figure 1 shows a single-point laser; the line laser is its extension in the direction perpendicular to the paper).
  • A 640 × 480 CMOS image sensor is used to capture the images containing the line laser stripe. Each line of the image data is scanned by line scan and the unwanted background is discarded, leaving the line laser stripe (for example, the maximum gray-value point and about 30 points around it) in the column direction of the image sensor.
  • The line laser stripe is then processed to obtain the laser stripe energy center line in the column direction.
  • The values of N1, N2, and n can be appropriately selected or adjusted according to the required processing accuracy and speed.
  • In step 400, besides the centroid method, other methods such as a curve-fitting algorithm may be used to find the energy center line of the line laser stripe.
  • The curve-fitting method uses the fact that the cross-sectional intensity of the laser stripe approximates a Gaussian distribution.
  • A sub-pixel light-stripe center coordinate is obtained by curve-fitting the pixel coordinates and gray values near the peak of the light-intensity distribution in the cross section.
  • The specific steps are as follows: (1) search each line of the processed image sequentially for the intensity peak point, i.e. the point with the maximum gray value in that line; denote it M0(x0, y0), with gray value g0; (2) select the left neighbors M-2(x0, y-2) and M-1(x0, y-1) and the right neighbors M1(x0, y1) and M2(x0, y2) of M0, with gray values g-2, g-1, g1, g2, and perform a quadratic curve fit according to the least-squares principle; (3) obtain the coordinate of the maximum point in the column direction from the fitted curve; the resulting (x0, y′) is the sought center point of the light stripe.
  • The fitted curve can be a parabola or a Gaussian curve. This method is suitable for straight light stripes whose normal direction does not change much in the image.
  • The deformed image must be corrected by a distortion correction process. Since the line laser spans a certain direction of the image, correcting the deformation to recover the true image is indispensable for accurate ranging. Through distortion correction, the deformed laser stripes in the image, such as the curved shapes at the two sides, can be restored to their true straight shape. However, if the entire image captured by the image sensor were distortion-corrected as in the prior art, every pixel of the whole image would have to be traversed, requiring a large amount of data and information processing.
  • Therefore, in step 200 the image-processing local region is first determined on the image containing the line laser stripe; then, in step 300, that local region is distortion-corrected to determine the true position of the line laser stripe on the image.
  • The distortion correction algorithm is applied only to the extracted local image, which greatly accelerates the calculation and speeds up image processing.
  • Local image distortion correction includes:
  • capturing a laser stripe gray-scale image and traversing each line of the image to find the point with the largest laser gray value;
  • taking 30 pixels on each side of the maximum-gray-value point in each line and performing local image distortion correction on them.
  • The number of selected pixels is not limited to 30; it can be adjusted according to the required processing precision or speed.
  • In the distortion correction algorithm, (x, y) is the original position of a distorted image point captured on the image sensor, and (x′, y′) is the corrected new position.
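The patent's own correction formula is not legible in this text, so as an illustration only, the sketch below uses the common one-coefficient radial distortion model. The principal point (cx, cy) and coefficient k1 would come from camera calibration; all names and the model choice are assumptions, not the patent's method.

```python
def undistort_point(x, y, cx, cy, k1):
    """Illustrative radial distortion correction (assumed model, not the
    patent's formula).  With principal point (cx, cy) and calibrated radial
    coefficient k1:
        r^2 = (x - cx)^2 + (y - cy)^2
        x'  = cx + (x - cx) * (1 + k1 * r^2)
        y'  = cy + (y - cy) * (1 + k1 * r^2)"""
    dx, dy = x - cx, y - cy
    scale = 1.0 + k1 * (dx * dx + dy * dy)
    return cx + dx * scale, cy + dy * scale
```

Applying such a map only to the 30-pixel window around each row's stripe peak, rather than to all 640 × 480 pixels, is exactly the saving the local-region approach claims.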
  • The centroid algorithm is then used to find the energy center line of the laser stripe. Specifically, the previous step only yields the true laser stripe image, in which the stripe data in each line still occupies several pixel positions, so no accurate calculation is yet possible.
  • From this, the actual distance q between the self-moving robot and the target object C can be calculated.
  • The calculation can be performed in the following two ways.
  • The first method: after the stripe energy-center coordinates are obtained, the triangulation principle is used to obtain the spatial distance corresponding to each center point of the laser stripe in the column direction, thereby obtaining the section distance corresponding to the laser line.
  • This method is likewise based on the triangulation principle of the figure above: a triangular relationship formula is established in which the variable is yi′, and the distance is then calculated from the other known parameters.
  • The other method calculates the distance with a lookup table: each stripe-center column coordinate corresponds to a spatial distance. On this basis, a fixed table of pairs of the distance Z and the stripe-center column coordinate yi′ is built; that is, the data of n distances Z and the corresponding column coordinates yi′ must be measured in advance, giving the relationship between the column coordinate yi′ and the distance Z, from which a reference table is established. In actual use, n is determined by the measurement range.
  • For example, a measurement range within 5 meters requires at least 20 sets of correspondence data, i.e. n greater than or equal to 20, for reliability. Once this reference table has been obtained by pre-measurement, only the actual stripe center-line coordinate needs to be obtained during a measurement; an interpolation algorithm then maps it to a spatial distance.
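The lookup-table method (steps 501–502) can be sketched as below. The table values are made up for illustration; a real table comes from the pre-measurement step, and the function name is an assumption.

```python
import numpy as np

# Illustrative calibration table (step 501): n = 20 pre-measured pairs of the
# true distance Z (mm) and the stripe energy-center column coordinate yi'.
yi_cal = np.linspace(400.0, 40.0, 20)    # column coordinate shrinks with range
z_cal = np.linspace(200.0, 5000.0, 20)   # 200 mm .. 5000 mm working range

def distance_from_center(yi):
    """Step 502 sketch: map a measured energy-center column coordinate to a
    distance by linear interpolation in the pre-measured table."""
    order = np.argsort(yi_cal)           # np.interp needs ascending x values
    return float(np.interp(yi, yi_cal[order], z_cal[order]))
```

With the table in hand, no triangulation parameters (f, s) are needed at run time; the interpolation also absorbs small systematic errors in the optics, which is why the patent requires at least 20 calibration pairs over a 5 m range.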
  • After obtaining all the spatial distances, the self-moving robot can control its path planning and walking mode according to this information.
  • The line laser ranging method for a self-moving robot solves, by means of a line laser ranging sensor, the lack of intelligence of existing self-moving robots, and overcomes the instability, high cost, and short service life brought by the single-point laser and rotating device of existing ranging approaches.
  • The distance measuring sensor of the present invention offers high efficiency and cost-effectiveness in short-range indoor ranging applications from 200 mm to 5000 mm.
  • The laser ranging sensor of the present invention uses line-structured light as the light source, which is easier to calibrate and adjust than a point laser.
  • The line-structured light cooperates with an area-array CMOS sensor to extract the depth distance information of the measurement target from the two-dimensional gray-scale information in the CMOS image.
  • The camera must be calibrated to obtain its intrinsic and extrinsic parameters; these parameters are used to correct the deformation of the image and obtain a true laser stripe image. Since the laser line segment of the line-structured light modulated by the target object is a laser stripe of a certain width, and the stripe obeys a Gaussian distribution in cross section, the energy center of the line laser stripe must be extracted from the image when calculating the spatial distance.
  • The line-structured-light sensor serves as the "eye" of the robot, obtaining two-dimensional distance information within the sensor's range and providing the robot host with the indoor scene seen by that "eye". The host uses this information to locate itself, plan routes, and build an indoor map. The robot then follows its own route according to the map to clean the floor, making the cleaning robot truly intelligent and greatly improving its efficiency.

Abstract

A line laser ranging method used for a self-moving robot, comprising: step 100: arranging a line laser (A) on a self-moving robot as an emission source, the emitted line laser being projected onto a target object (C), and an image sensor (B) shooting an image containing a laser stripe, which is projected onto the target object (C); step 200: determining a local image processing region on the image containing the line laser stripe; step 300: conducting distortion correction on the local image processing region, and determining the true position of the line laser stripe on the image; step 400: searching for an energy centre line in the line laser stripe after the distortion correction, and determining the position of the energy centre line on the image; and step 500: according to a pixel coordinate of the energy centre line on the image, obtaining the real distance (q) from the self-moving robot to the target object (C). The method only processes a local region of the image, which reduces the amount of data which needs to be processed, thereby improving the processing efficiency and saving working time on the premise that the accuracy rate is not affected.

Description

Line laser ranging method for a self-moving robot. Technical field
The invention relates to a line laser ranging method, in particular to a line laser ranging method for a self-moving robot, belonging to the technical field of small household appliance manufacturing. Background art
The core technology of a self-moving robot is to perceive the surrounding environment, plan a walking route, traverse each area effectively, and complete the work in each area. The most basic and critical part of this technology is the "sensing" module, i.e. the laser ranging sensor.
One type of existing sweeping robot does not use a ranging sensor: obstacles are detected by a bump plate and the floor is cleaned at random. This results in much wasted cleaning, for example some corners are never cleaned while other areas are cleaned several times; the work is blind, the level of intelligence is low, and it is time-consuming and laborious. Some similar applications now use laser ranging sensors as their intelligent front end. In these applications a single-point laser serves as the light source and, together with a receiving device, forms a transmit-receive system that computes the distance of an object in space by time, geometry, and similar methods. Usually, a section distance is measured by combining single-point laser ranging with a rotating device, constructing a two-dimensional section distance measuring device. The drawbacks of this kind of device and method are that the rotating motor introduces noise, the system is not stable enough, the service life is short, and the cost is high.
Another type of existing sweeping robot uses a distance detector to measure the distance between the robot and an obstacle, and avoids ground obstacles in real time according to the detected distance to prevent collisions. Alternatively, the robot can locate itself from the distance information and use it to build an indoor map, facilitating planning and setting of the subsequent walking path.
At present, most sweeping robots on the market use ultrasonic or infrared sensors to detect obstacle distance, but ultrasonic and infrared signals are susceptible to external interference, and their detection range and precision are limited. To further improve the range and precision of distance measurement, existing sweeping robots have begun to adopt laser ranging sensors to detect the distance of obstacles. For example, Chinese patent CN101809461 discloses a sweeping robot employing a laser ranging sensor comprising a point laser source and an image sensor; this sensor calculates obstacle distance information by triangulation. To obtain distance information for the whole room, the laser detector on this sweeping robot is mounted on a rotatable base, so that it can continuously scan the obstacles in the room through 360 degrees. However, the rotating motor of this rotating laser ranging sensor introduces noise, the system is not stable enough, the service life is short, and the cost is high. In addition, Chinese patent CN101210800 discloses a line laser ranging sensor comprising a line laser source and a camera. It calculates the obstacle distance from the pixel position of the laser stripe in the image: the whole image is binarized to obtain the laser stripe data, which is then post-processed and fed into a triangulation calculation of the obstacle distance. However, because the complete image must be processed, the amount of information is large and the processing speed is slow. Summary of the invention
In view of the deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a line laser ranging method for a self-moving robot that processes only a local region of the captured image of the line laser stripe projected on the target object. This greatly reduces the data to be processed and improves system efficiency without affecting accuracy, so that the whole system runs at roughly double speed, saving working time, conveniently and reliably.
The technical problem to be solved by the present invention is achieved by the following technical solution:
A line laser ranging method for a self-moving robot, the method comprising the following steps:
Step 100: using a line laser disposed on the self-moving robot as the light source, the emitted line laser is projected on the target object, and an image sensor captures an image containing the line laser stripe projected on the target object;
Step 200: determine an image-processing local region on the image containing the line laser stripe;
Step 300: perform distortion correction on the image-processing local region to determine the true position of the line laser stripe on the image;
Step 400: find the energy center line of the distortion-corrected line laser stripe and determine its position on the image;
Step 500: from the pixel coordinates of the energy center line on the image, obtain the actual distance from the self-moving robot to the target object.
The step 200 specifically includes:
Step 201: traverse each line of the captured image containing the line laser stripe to find the point with the largest laser gray value in that line;
Step 202: take N1 pixels on each side of the maximum-gray-value point found in each line to form the image-processing local region, where 10 ≤ N1 ≤ 30.
The step 400 uses a centroid algorithm to find the energy center line, which specifically includes:
Step 401: in the distortion-corrected image of the line laser stripe, traverse each line to find the point with the largest laser gray value in that line;
Step 402: take N2 further pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;
Step 403: apply the centroid method to these pixels to obtain the energy center line of the line laser stripe image.
质心算法公式为:  The centroid algorithm formula is:
yi i=∑ (f (χ, yi) *yi) / ∑ f ( , yi)  Yi i=∑ (f (χ, yi) *yi) / ∑ f ( , yi)
其中: yi i是经过质心算法之后的像素列坐标位置, (x,yi)是原像素坐标 (经过 畸变矫正后得到的坐标), f (x,yi)是(x,yi)位置处的灰度值; 上式中所得到的 yi i 即是每一行的线激光条纹图像的能量中心点所在的位置; 所有行的能量中心点组成能 量中心线。  Where: yi i is the pixel column coordinate position after the centroid algorithm, (x, yi) is the original pixel coordinate (coordinate obtained after distortion correction), f (x, yi) is the gray at the (x, yi) position Degree value; yi i obtained in the above formula is the position of the energy center point of the line laser stripe image of each line; the energy center points of all the lines constitute the energy center line.
Step 400 may alternatively use a curve-fitting method to find the energy center line of the line laser stripe.

Step 500 specifically comprises: establishing a triangular-relationship formula according to the triangulation principle and obtaining the actual distance q from the self-moving robot to the target object.

Step 500 may also specifically comprise:

Step 501: Measure the energy-center column coordinates yi' of line laser stripe images corresponding to n actual distances q, and build a lookup table of actual distance q versus energy-center column coordinate;

Step 502: From the energy-center column coordinate yi' of the actual line laser stripe image, obtain the actual distance q by interpolation.

In step 501, n is greater than or equal to 20.
In summary, the present invention processes only a local region of the captured image of the line laser stripe projected on the target object, greatly reducing the amount of data to be processed and improving system efficiency without affecting accuracy, so that the entire system runs roughly twice as fast and working time is saved.
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and specific embodiments.

Brief Description of the Drawings

Figure 1 is a schematic diagram of the similar-triangles principle of the present invention.

Detailed Description
Figure 1 is a schematic diagram of the similar-triangles principle of the present invention. As shown in Figure 1, light emitted by light source A strikes target object C, and image sensor B captures an image of the line laser stripe projected on target object C. In the present invention, light source A is a line laser, so the spot it projects on target object C is stripe-shaped. Light source A, image sensor B and target object C form a triangle ΔABC. When the focal length of image sensor B is f, since the relative position of light source A and image sensor B is fixed, the starting position a on the image sensor film is determined. The light emitted by light source A is reflected back from target object C and falls on the film at point b, so the stripe image and the image sensor form another triangle ΔabB. Clearly, ΔABC and ΔabB are similar triangles with a fixed proportional relationship. Since light source A and image sensor B are both mounted on the body of the self-moving robot, the perpendicular distance from target object C to the line joining them can be taken as the actual distance q between target object C and the self-moving robot.

In ΔABC, the distance s between light source A and image sensor B is known; in ΔabB, f is known, and the length x of the stripe image can be obtained by reading off its value. Using the fixed proportional relationship between the similar triangles ΔABC and ΔabB, the actual distance q between target object C and the self-moving robot can be obtained.

As for the mounting positions of light source A and image sensor B on the self-moving robot, image sensor B can be placed directly at the front of the robot and connected to the robot's main board, supplying the board with current position information for processing and action decisions. Light source A is mounted at the top of the front end of the robot, approximately in the same vertical plane as image sensor B.
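As a rough numerical sketch of this similar-triangles relationship (the function name and all values below are hypothetical illustrations, not figures from the patent):

```python
def triangulate_distance(f_mm, s_mm, x_mm):
    """Distance q from the robot to the target by laser triangulation.

    f_mm: focal length f of image sensor B,
    s_mm: baseline s between light source A and image sensor B,
    x_mm: offset x of the stripe image on the sensor film.
    Since triangle ABC is similar to triangle abB, q / s = f / x,
    hence q = f * s / x.
    """
    if x_mm <= 0:
        raise ValueError("stripe offset must be positive")
    return f_mm * s_mm / x_mm

# Hypothetical values: 4 mm focal length, 50 mm baseline, 0.4 mm offset
q = triangulate_distance(4.0, 50.0, 0.4)
print(q)  # 500.0 (mm), i.e. 0.5 m
```

Note how the distance grows as the stripe offset x shrinks, which is why far targets need sub-pixel center extraction.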
Based on the basic principle above, the present invention provides a line laser ranging method for a self-moving robot, the method comprising the following steps:

Step 100: Using a line laser mounted on the self-moving robot as the light source, project the emitted line laser onto a target object, and capture an image of the line laser stripe projected on the target object with the image sensor;

Step 200: Determine a local image-processing region on the image containing the line laser stripe;

Step 300: Perform distortion correction on the local image-processing region to determine the true position of the line laser stripe in the image;

Step 400: Find the energy center line of the distortion-corrected line laser stripe and determine the position of that energy center line in the image;

Step 500: From the pixel coordinates of the energy center line in the image, obtain the actual distance from the self-moving robot to the target object.
Step 200 specifically comprises:

Step 201: Traverse every row of the captured image of the line laser stripe and find the point with the maximum laser gray value in that row;

Step 202: Take N1 pixels on each side of the maximum-gray-value point found in each row to form the local image-processing region, where 10 ≤ N1 ≤ 30.
Step 400 uses a centroid algorithm to find the energy center line, specifically comprising:

Step 401: In the distortion-corrected image of the line laser stripe, traverse every row and find the point with the maximum laser gray value in that row;

Step 402: Take another N2 pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;

Step 403: Apply the centroid method to these pixels to obtain the energy center line of the line laser stripe image;

The centroid algorithm formula is:

yi' = Σ( f(x, yi) · yi ) / Σ f(x, yi)

where yi' is the pixel column coordinate after the centroid algorithm, (x, yi) is the original pixel coordinate (obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi). The yi' obtained from the formula above is the position of the energy center point of the line laser stripe in each row; the energy center points of all rows form the energy center line.
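Steps 401–403 can be sketched as follows. This is an illustrative implementation, not the patent's code; the test image and parameter values are hypothetical:

```python
import numpy as np

def energy_center_line(stripe, n2=10):
    """Per-row centroid of a laser stripe gray image (rows x cols).

    For each row, take N2 pixels on each side of the brightest pixel
    and compute yi' = sum(f(x, yi) * yi) / sum(f(x, yi)), giving one
    sub-pixel column coordinate per row. n2 is the N2 of the text
    (5 <= N2 <= 20).
    """
    centers = []
    for row in stripe:
        peak = int(np.argmax(row))                      # step 401
        lo, hi = max(0, peak - n2), min(len(row), peak + n2 + 1)  # step 402
        cols = np.arange(lo, hi)
        weights = row[lo:hi].astype(float)
        centers.append(float((weights * cols).sum() / weights.sum()))  # step 403
    return centers

# A symmetric stripe peaked at column 5 yields a center of exactly 5.0
img = np.array([[0, 0, 0, 1, 3, 9, 3, 1, 0, 0, 0]], dtype=float)
print(energy_center_line(img, n2=3))  # [5.0]
```

An asymmetric intensity profile would shift the centroid away from the integer peak, which is exactly the sub-pixel refinement the method relies on.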
Step 500 specifically comprises: establishing a triangular-relationship formula according to the triangulation principle and obtaining the actual distance q from the self-moving robot to the target object.

Step 500 may also specifically comprise:

Step 501: Measure the energy-center column coordinates yi' of line laser stripe images corresponding to n actual distances q, and build a lookup table of actual distance q versus energy-center column coordinate;

Step 502: From the energy-center column coordinate yi' of the actual line laser stripe image, obtain the actual distance q by interpolation.

In step 501, n is greater than or equal to 20.
Specifically, referring to Figure 1, light source A in the present invention is a line laser, which projects a line of laser light onto target object C (Figure 1 shows a single-point laser; the line laser is the extension of the point laser in the direction perpendicular to the page). A 640 × 480 CMOS image sensor then captures the image containing the line laser stripe. Each row of the stripe image is scanned and processed, the unwanted background is discarded, and a line laser stripe is thus formed in the column direction of the image sensor (e.g. the maximum-gray-value point plus 30 surrounding points). The stripe is then processed to obtain the energy center line of the laser stripe in the column direction. The values of N1, N2 and n can be chosen or adjusted according to the required processing accuracy and speed.
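The background-discarding row scan (steps 201–202) can be sketched like this; the function name and the 640-column example are hypothetical:

```python
import numpy as np

def local_region(image, n1=30):
    """Steps 201-202: keep only N1 pixels on each side of each row's
    brightest point, discarding the background (10 <= N1 <= 30).

    Returns, per row, the (start, end) column bounds of the local
    region to which distortion correction is later restricted.
    """
    bounds = []
    for row in image:
        peak = int(np.argmax(row))
        bounds.append((max(0, peak - n1), min(len(row), peak + n1 + 1)))
    return bounds

# On a 640-column row with its peak at column 320, only 61 of the 640
# pixels survive -- the source of the claimed processing speed-up.
row = np.zeros((1, 640))
row[0, 320] = 255.0
(lo, hi), = local_region(row, n1=30)
print(lo, hi, hi - lo)  # 290 351 61
```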
In step 400 above, besides the centroid method, other methods such as a curve-fitting algorithm can be used to find the energy center line of the line laser stripe. The curve-fitting method exploits the fact that the cross-sectional intensity of the laser stripe approximately follows a Gaussian distribution. Near the peak of the intensity distribution in each cross section, a curve is fitted to the pixel coordinates and gray values, yielding the stripe center coordinate with sub-pixel accuracy. The specific steps are as follows: (1) search the processed image row by row for the intensity peak, i.e. the point with the maximum gray value in the row; denote this point M0(x0, y0), with gray value g0; (2) take the left neighbors M-2(x0, y-2), M-1(x0, y-1) and right neighbors M1(x0, y1), M2(x0, y2) of M0, with gray values g-2, g-1, g1, g2 respectively, and fit a quadratic curve to these five points by the least-squares principle; (3) obtain from the fitted curve the coordinate of the maximum point in the column direction, so that (x0, y') is the sought center point of the light stripe. The fitted curve can be a parabola or a Gaussian. This method suits straight light stripes whose normal direction varies little across the image.
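A minimal sketch of this five-point least-squares parabola fit (names are hypothetical; it assumes the five samples straddle the peak):

```python
import numpy as np

def subpixel_peak(gray, cols):
    """Five-point least-squares parabola fit around the intensity peak.

    gray: gray values g-2, g-1, g0, g1, g2; cols: their column
    coordinates. Fits g ~ a*y^2 + b*y + c and returns the vertex
    -b / (2a), the sub-pixel stripe center of the curve-fitting method.
    """
    a, b, _c = np.polyfit(cols, gray, 2)
    return -b / (2.0 * a)

# Samples of the exact parabola g(y) = 100 - (y - 10.25)^2: the fit
# recovers the true peak 10.25 even though it lies between pixels.
cols = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
gray = 100.0 - (cols - 10.25) ** 2
print(round(subpixel_peak(gray, cols), 2))  # 10.25
```

A Gaussian fit would replace the quadratic with a parabola fit to log-intensities; the parabola version shown here is the simpler and more common choice.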
Normally, structural imperfections inside the image sensor's camera deform the captured image to some extent, and the deformation worsens toward the image edges. The deformed image must therefore be corrected through a distortion-correction process. Because the line laser spans one whole direction of the image, correcting the deformation to recover the true image, and hence more accurate ranging precision, is essential. Distortion correction straightens the deformed laser stripe in the image, for example the curved shapes at its two ends, into its true straight form. However, if, as in the prior art, the entire image captured by the image sensor is distortion-corrected, every pixel of the whole image must be traversed, which requires a large amount of data and information processing. In the present invention, step 200 first determines a local image-processing region on the image containing the line laser stripe, and step 300 then performs distortion correction on that local region only, determining the true position of the line laser stripe in the image. Applying the distortion-correction algorithm only to the extracted local image greatly speeds up computation and thus image processing.
Local image distortion correction specifically comprises:

First, capture the laser stripe gray image and traverse every row of it to find the point with the maximum laser gray value;

Second, take 30 pixels on each side of that maximum-gray-value point in each row and apply local distortion correction to them; of course, the number of pixels taken is not limited to 30 and can be adjusted according to the required processing accuracy or speed;

The distortion correction formulas are:

x' = x·(1 + k1·r² + k2·r⁴ + k3·r⁶) + 2·p1·x·y + p2·(r² + 2x²)
y' = y·(1 + k1·r² + k2·r⁴ + k3·r⁶) + p1·(r² + 2y²) + 2·p2·x·y

where (x, y) is the original position of the distorted image point on the image sensor and (x', y') is the corrected new position; k1, k2, k3, p1 and p2 are the camera's distortion coefficients, obtained by camera calibration; and r² = x² + y².
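The correction formulas above can be written out directly; this sketch assumes normalized image coordinates, and the coefficient values used below are illustrative, not calibration data from the patent:

```python
def undistort_point(x, y, k1, k2, k3, p1, p2):
    """Radial-plus-tangential distortion model as given in the text:
    maps point (x, y) using radial coefficients (k1, k2, k3) and
    tangential coefficients (p1, p2) from camera calibration.
    """
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_c = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_c = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_c, y_c

# With all coefficients zero the point is unchanged; with a small k1
# the correction grows toward the image edge, as the text notes.
print(undistort_point(0.5, 0.5, 0, 0, 0, 0, 0))    # (0.5, 0.5)
print(undistort_point(0.5, 0.5, 0.1, 0, 0, 0, 0))  # (0.525, 0.525)
```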
Step 400 of the present invention uses the centroid algorithm to find the energy center line of the laser stripe. Specifically, the previous step yields only the true laser stripe image, in which the stripe occupies several pixel positions per row, which precludes precise computation. To compute the corresponding spatial distance, the energy center line of the stripe must be extracted; the laser points on this line are located on the image sensor at sub-pixel level, typically a few tenths of a pixel, and therefore correspond to spatial distance far more precisely.
After the data and information from the steps above have been obtained, the actual distance q between the self-moving robot and target object C can be computed, by either of the following two methods.

Specifically, the first method: once the coordinates of the stripe energy centers are known, the triangulation principle yields the spatial distance corresponding to each center point of the laser stripe in the column direction, giving the profile of distances along the laser line. Equivalently, a triangular-relationship formula can be established from the triangulation principle in the figure above, with yi' as the variable, and evaluated together with the other known parameters to obtain the distance.

The other method computes distance with a lookup table: each stripe-center coordinate corresponds to one spatial distance, so a fixed table of distance Z versus stripe-center column coordinate yi' is built by prior testing. That is, n distances Z and their corresponding column coordinates yi' are measured in advance, yielding the linear relationship between column coordinate yi' and distance Z and producing a reference table. In practice n depends on the measurement range; for example, a measurement range within 5 meters requires at least 20 pairs of correspondence data, i.e. n should be greater than or equal to 20 for reliable results. Once this reference table has been obtained by prior testing, actual measurement only requires the actual stripe-center column coordinate, from which an interpolation algorithm yields the corresponding spatial distance.
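The lookup-table method can be sketched as follows; the calibration table values here are hypothetical:

```python
import bisect

def distance_from_lut(yi, table):
    """Lookup-table ranging: table is a list of (yi_center, q_mm)
    pairs measured in advance (n >= 20 of them, per the text),
    sorted by yi. The actual distance for a measured center
    coordinate yi is found by linear interpolation between the
    two bracketing entries; out-of-range inputs are clamped.
    """
    ys = [y for y, _ in table]
    i = bisect.bisect_left(ys, yi)
    if i == 0:
        return table[0][1]
    if i == len(table):
        return table[-1][1]
    (y0, q0), (y1, q1) = table[i - 1], table[i]
    t = (yi - y0) / (y1 - y0)
    return q0 + t * (q1 - q0)

# Hypothetical calibration table: center column vs. distance in mm
lut = [(100.0, 200.0), (150.0, 500.0), (200.0, 1000.0)]
print(distance_from_lut(125.0, lut))  # 350.0
```

Because only a list lookup and one interpolation remain at run time, this trades a one-off calibration effort for very cheap per-frame computation.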
After all spatial distances have been obtained, the self-moving robot can use the relevant information for path planning and control of its mode of travel.
In summary, the line laser ranging method for a self-moving robot provided by the present invention uses a line laser ranging sensor to remedy the lack of intelligence of existing self-moving robots, while overcoming the instability, high cost and short service life of the single-point laser and rotating device used in existing ranging technology. The ranging sensor of the present invention is highly efficient and cost-effective in indoor short-range ranging applications of 200 mm to 5000 mm.

The laser ranging sensor of the present invention uses line structured light as its light source, which is easier to calibrate and adjust than a point laser. The line structured light works together with an area-array CMOS sensor, extracting the depth distance of the measured target from the two-dimensional gray information of the CMOS image. In line-structured-light ranging the camera must be calibrated to obtain its intrinsic and extrinsic parameters, which are used to correct image deformation and obtain the true laser stripe image. Because the laser line segment produced when the line structured light is modulated by the target object is a laser stripe of finite width whose cross section follows a Gaussian distribution, the energy center line of the line laser stripe must be extracted from the image when computing spatial distance. Acting as the robot's "eyes", the line-structured-light sensor obtains two-dimensional distance information within its set range and supplies the robot's host with the indoor scene the "eyes" see; the host uses this information to localize, plan routes and build an indoor map. The robot then follows its own route on the map to clean the floor, making the cleaning robot's work truly intelligent and much more efficient.

Claims

1. A line laser ranging method for a self-moving robot, the method comprising the following steps:

Step 100: using a line laser mounted on the self-moving robot as the light source, projecting the emitted line laser onto a target object, and capturing an image of the line laser stripe projected on the target object with an image sensor on the self-moving robot;

Step 200: determining a local image-processing region on the image containing the line laser stripe;

Step 300: performing distortion correction on the local image-processing region to determine the true position of the line laser stripe in the image;

Step 400: finding the energy center line of the distortion-corrected line laser stripe and determining the position of that energy center line in the image;

Step 500: obtaining, from the pixel coordinates of the energy center line in the image, the actual distance (q) from the self-moving robot to the target object.

2. The line laser ranging method for a self-moving robot according to claim 1, wherein step 200 specifically comprises:

Step 201: traversing every row of the captured image containing the line laser stripe and finding the point with the maximum laser gray value in that row;

Step 202: taking N1 pixels on each side of the maximum-gray-value point found in each row to form the local image-processing region, where 10 ≤ N1 ≤ 30.

3. The line laser ranging method for a self-moving robot according to claim 1, wherein step 400 uses a centroid algorithm to find the energy center line of the line laser stripe, specifically comprising:

Step 401: in the distortion-corrected image of the line laser stripe, traversing every row and finding the point with the maximum laser gray value in that row;

Step 402: taking another N2 pixels on each side of the maximum-gray-value point found, where 5 ≤ N2 ≤ 20;

Step 403: applying the centroid method to these pixels to obtain the energy center line of the line laser stripe image;

the centroid algorithm formula being:

yi' = Σ( f(x, yi) · yi ) / Σ f(x, yi)

where yi' is the pixel column coordinate after the centroid algorithm, (x, yi) is the original pixel coordinate (obtained after distortion correction), and f(x, yi) is the gray value at position (x, yi); the yi' obtained from the above formula is the position of the energy center point of the line laser stripe in each row, and the energy center points of all rows form the energy center line.

4. The line laser ranging method for a self-moving robot according to claim 1, wherein step 400 uses a curve-fitting method to find the energy center line of the line laser stripe.

5. The line laser ranging method for a self-moving robot according to claim 1, wherein step 500 specifically comprises: establishing a triangular-relationship formula according to the triangulation principle and obtaining the actual distance (q) from the self-moving robot to the target object.

6. The line laser ranging method for a self-moving robot according to claim 1, wherein step 500 specifically comprises:

Step 501: measuring the energy-center column coordinates yi' of line laser stripe images corresponding to n actual distances (q), and building a lookup table of actual distance (q) versus energy-center column coordinate;

Step 502: obtaining, from the energy-center column coordinate yi' of the actual line laser stripe image, the actual distance (q) by interpolation.

7. The line laser ranging method for a self-moving robot according to claim 6, wherein n in step 501 is greater than or equal to 20.
PCT/CN2014/079742 2013-06-14 2014-06-12 Line laser ranging method used for self-moving robot WO2014198227A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310234951.0 2013-06-14
CN201310234951.0A CN104236521A (en) 2013-06-14 2013-06-14 Line-laser ranging method applied to auto-moving robots

Publications (1)

Publication Number Publication Date
WO2014198227A1 true WO2014198227A1 (en) 2014-12-18

Family

ID=52021667

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/079742 WO2014198227A1 (en) 2013-06-14 2014-06-12 Line laser ranging method used for self-moving robot

Country Status (2)

Country Link
CN (1) CN104236521A (en)
WO (1) WO2014198227A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109774197A (en) * 2018-07-13 2019-05-21 中国航空工业集团公司济南特种结构研究所 A kind of composite material curved surface laying laser-projector method for determining position

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657981B (en) * 2015-01-07 2017-05-24 大连理工大学 Dynamic compensation method for three-dimensional laser distance metering data of mobile robot in moving process
CN105890047A (en) * 2015-01-21 2016-08-24 深圳市沃森空调技术有限公司 Air-conditioning robot
CN104859562A (en) * 2015-05-27 2015-08-26 北京信息科技大学 Method and device for detecting barriers behind vehicle
CN106371101B (en) * 2015-07-20 2019-08-16 北醒(北京)光子科技有限公司 A kind of device of intelligent distance-measuring and avoidance
CN105759280A (en) * 2016-05-17 2016-07-13 上海酷哇机器人有限公司 Laser triangulation system safe for human eyes
CN107436439A (en) * 2016-05-27 2017-12-05 科沃斯机器人股份有限公司 The installation method of laser ranging system and its sensitive chip
CN106125738A (en) * 2016-08-26 2016-11-16 北京航空航天大学 A kind of identification of pallets device and method based on AGV
CN106226755B (en) * 2016-08-30 2019-05-21 北京小米移动软件有限公司 Robot
CN106092146A (en) * 2016-08-30 2016-11-09 宁波菜鸟智能科技有限公司 Laser ranging bearing calibration and system
WO2018127209A1 (en) * 2017-01-09 2018-07-12 苏州宝时得电动工具有限公司 Autonomous moving device, and positioning system, positioning method and control method therefor
CN108398694B (en) * 2017-02-06 2024-03-15 苏州宝时得电动工具有限公司 Laser range finder and laser range finding method
CN109143167B (en) * 2017-06-28 2021-07-23 杭州海康机器人技术有限公司 Obstacle information acquisition device and method
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device
CN107861113B (en) * 2017-11-06 2020-01-14 深圳市杉川机器人有限公司 Calibration method and device
CN108345002A (en) * 2018-02-27 2018-07-31 上海图漾信息科技有限公司 Structure light measurement device and method
CN109615586B (en) * 2018-05-07 2022-03-11 杭州新瀚光电科技有限公司 Infrared image distortion correction algorithm
CN108594205A (en) * 2018-06-21 2018-09-28 深圳市镭神智能系统有限公司 A kind of laser radar based on line laser
EP3660451B1 (en) * 2018-11-28 2022-04-27 Hexagon Technology Center GmbH Intelligent stationing module
CN110781779A (en) * 2019-10-11 2020-02-11 北京地平线机器人技术研发有限公司 Object position detection method and device, readable storage medium and electronic equipment
CN110960138A (en) * 2019-12-30 2020-04-07 科沃斯机器人股份有限公司 Structured light module and autonomous mobile device
CN112116619B (en) * 2020-09-16 2022-06-10 昆明理工大学 Multi-line structured light system stripe center line extraction method based on structural constraint
CN112595385A (en) * 2020-11-25 2021-04-02 创新奇智(南京)科技有限公司 Target height obtaining method and device
CN113050113B (en) * 2021-03-10 2023-08-01 广州南方卫星导航仪器有限公司 Laser spot positioning method and device
CN114608520B (en) * 2021-04-29 2023-06-02 北京石头创新科技有限公司 Ranging method, ranging device, robot and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1354073A (en) * 2000-11-17 2002-06-19 三星光州电子株式会社 Movable robot and its path regulating method
WO2005098476A1 (en) * 2004-03-29 2005-10-20 Evolution Robotics, Inc. Method and apparatus for position estimation using reflected light sources
US20060237634A1 (en) * 2005-04-23 2006-10-26 Lg Electronics Inc. Position sensing device for mobile robots and robot cleaner equipped with the same
JP2009063379A (en) * 2007-09-05 2009-03-26 Casio Comput Co Ltd Distance measuring device and projector including this distance measuring device
CN102980528A (en) * 2012-11-21 2013-03-20 上海交通大学 Calibration method of pose position-free constraint line laser monocular vision three-dimensional measurement sensor parameters
CN103134469A (en) * 2011-12-01 2013-06-05 财团法人工业技术研究院 Distance sensing device and distance sensing method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8996172B2 (en) * 2006-09-01 2015-03-31 Neato Robotics, Inc. Distance sensor system and method
TWI289196B (en) * 2006-10-02 2007-11-01 Ming-Chih Lu Distance measurement system and method
CN101587591B (en) * 2009-05-27 2010-12-08 北京航空航天大学 Accurate visual tracking technique based on double-parameter threshold segmentation
KR20120007735A (en) * 2010-07-15 2012-01-25 삼성전기주식회사 Distance measuring module and electronic device including the same
CN102889864A (en) * 2011-07-19 2013-01-23 中铝上海铜业有限公司 Detection system for tower shape of object with strip coil edge and detection method thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109774197A (en) * 2018-07-13 2019-05-21 中国航空工业集团公司济南特种结构研究所 Method for determining position of laser projector for composite material curved surface layering
CN109774197B (en) * 2018-07-13 2021-05-07 中国航空工业集团公司济南特种结构研究所 Method for determining position of laser projector of composite material curved surface layering

Also Published As

Publication number Publication date
CN104236521A (en) 2014-12-24

Similar Documents

Publication Publication Date Title
WO2014198227A1 (en) Line laser ranging method used for self-moving robot
US10052766B2 (en) Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
Ishikawa et al. Lidar and camera calibration using motions estimated by sensor fusion odometry
CN112797915B (en) Calibration method, calibration device and system of line structured light measurement system
CN110118528B (en) Line structure light calibration method based on chessboard target
US8175337B2 (en) Apparatus and method of measuring distance using structured light
TWI518350B (en) Time-of-flight imager
CN101698303B (en) Automatic calibration method between three-dimensional laser and monocular vision
TWI517101B (en) Calibration system and method for 3d scanner
CN104567728A (en) Laser vision profile measurement system, measurement method and three-dimensional target
CN109827521B (en) Calibration method for rapid multi-line structured optical vision measurement system
CN105303560A (en) Robot laser scanning welding seam tracking system calibration method
WO2019012770A1 (en) Imaging device and monitoring device
CN103257342A (en) Three-dimension laser sensor and two-dimension laser sensor combined calibration method
CN110703230B (en) Position calibration method between laser radar and camera
CN111352123B (en) Robot and method for vehicle inspection, method for determining centerline of vehicle chassis
CN113096183B (en) Barrier detection and measurement method based on laser radar and monocular camera
CN116009559B (en) Inspection robot and inspection method for inner wall of water conveying pipeline
CN111402411A (en) Scattered object identification and grabbing method based on line structured light
CN109764809A (en) A method for calculating fruit tree canopy volume in real time based on a two-dimensional laser sensor
CN111156921A (en) Contour data processing method based on sliding window mean filtering
Kühnle et al. Grasping in depth maps of time-of-flight cameras
JP2004309318A (en) Position detection method, its device and its program, and calibration information creation method
JP2024501731A (en) Speed measurement method and speed measurement device using multiple cameras
Huang et al. Extrinsic calibration of a multi-beam LiDAR system with improved intrinsic laser parameters using v-shaped planes and infrared images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14810187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14810187

Country of ref document: EP

Kind code of ref document: A1