WO2019136613A1 - Method and device for indoor positioning of a robot - Google Patents

Method and device for indoor positioning of a robot

Info

Publication number
WO2019136613A1
WO2019136613A1 (PCT/CN2018/071995, CN2018071995W)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
speckle
landmark
positioning
indoor
Prior art date
Application number
PCT/CN2018/071995
Other languages
English (en)
French (fr)
Inventor
王声平
张立新
Original Assignee
深圳市沃特沃德股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市沃特沃德股份有限公司 filed Critical 深圳市沃特沃德股份有限公司
Priority to PCT/CN2018/071995 priority Critical patent/WO2019136613A1/zh
Publication of WO2019136613A1 publication Critical patent/WO2019136613A1/zh

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device

Definitions

  • The invention relates to the field of robots, and in particular to a method and a device for indoor positioning of a robot.
  • SLAM: Simultaneous Localization and Mapping.
  • A mobile robot equipped with sensors estimates a map of the environment and the position of the robot from the sensors' measurement data.
  • Sensors can be divided into robot-external sensors and internal sensors.
  • External sensors measure the movement of the robot relative to the external environment and include cameras, laser scanners, accelerometers and GPS; internal sensors measure changes in the robot's state (position) relative to the previous moment and include odometers and gyroscopes.
  • EKF-SLAM: Extended Kalman Filter based Simultaneous Localization and Mapping.
  • In traditional SLAM methods, internal sensors are used to measure the state changes of the robot, while external sensors correct the measurement error.
  • Existing SLAM applications have two typical scenarios: tracking, where the initial position of the robot is usually known; and global positioning, where little or no a priori information about the robot's starting position or environmental characteristics is given. The core problem of SLAM is therefore the posterior estimation of the robot's motion path and its environmental characteristics; solving it requires building an appropriate model from which the posterior probability can be computed.
  • The EKF-SLAM method based on the extended Kalman filter has gained wide recognition and is commonly applied.
  • The EKF-SLAM method first estimates the state at a certain moment of the motion, then obtains feedback in the form of (noisy) measurement variables, and finally corrects the estimate according to that feedback. In this way, EKF-SLAM can efficiently estimate the past, present and even future states of the motion without detailed knowledge of the robot.
  • However, the above methods have the following disadvantages: landmarks are often placed on the ground, where they are easily disturbed by other robots passing by; simple landmarks are easy to process in images but offer no error correction, while complex landmarks offer full error correction but their patterns are too complicated and real-time performance is poor; and because measurement errors accumulate and the computational complexity grows rapidly with the number of nodes, the computational load is too large to extend easily to indoor positioning in large environments.
  • The main object of the present invention is to provide a method and a device for indoor positioning, aiming to solve the technical problem that landmark data processing in existing indoor-positioning SLAM methods is prone to measurement error and inaccurate positioning.
  • The invention provides a method for indoor positioning of a robot, comprising: identifying speckle landmarks preset on an indoor ceiling; forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot; converting image coordinates in the topology map into corresponding position information in robot coordinates; and locating the robot according to the corresponding position information in robot coordinates.
  • The invention also provides a device for indoor positioning of a robot, comprising:
  • an identification module for identifying speckle landmarks preset on an indoor ceiling;
  • a forming module for forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
  • a conversion module configured to convert image coordinates in the topology map into corresponding position information in robot coordinates;
  • a positioning module configured to locate the robot according to the corresponding position information in robot coordinates.
  • Using speckle landmarks preset on the ceiling as position identifiers facilitates image recognition, reduces the complexity and computational load of the front-end image processing algorithm, improves the real-time performance of the positioning system, and thereby improves the precision of indoor robot positioning.
  • The invention uses an infrared laser projector to actively project infrared speckle onto the ceiling; this is unaffected by external illumination, works in dark environments, and is suitable for positioning and map construction in large-scale indoor environments.
  • FIG. 1 is a schematic flow chart of a method for indoor positioning of a robot according to an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of an optimization method for indoor positioning of a robot according to an embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of step S5 according to an embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of step S6 according to an embodiment of the present invention.
  • FIG. 5 is a schematic flow chart of step S62 according to an embodiment of the present invention.
  • FIG. 6 is a schematic flow chart of step S1 according to an embodiment of the present invention.
  • FIG. 7 is a schematic flow chart of a method for indoor positioning of a robot according to another embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a topology map according to an embodiment of the present invention.
  • FIG. 9 is a schematic diagram of closed-loop constraint relationships in a topology map according to an embodiment of the present invention.
  • FIG. 10 is a schematic diagram of a robot indoor positioning system according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural diagram of a robot indoor positioning device according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram of an optimized structure of a robot indoor positioning device according to an embodiment of the present invention.
  • FIG. 13 is a schematic structural diagram of a determining module according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural diagram of an optimization module according to an embodiment of the present invention.
  • FIG. 15 is a schematic structural diagram of a solving unit according to an embodiment of the present invention.
  • FIG. 16 is a schematic structural diagram of an identification module according to an embodiment of the present invention.
  • FIG. 17 is a schematic structural diagram of a robot indoor positioning device according to another embodiment of the present invention. The implementation, functional features and advantages of the present invention will be further described in conjunction with the embodiments.
  • A method for indoor positioning of a robot includes: identifying speckle landmarks preset on an indoor ceiling.
  • The speckle landmarks of this step include picture speckle pre-applied to the ceiling and pre-projected light-projection speckle.
  • The infrared speckle projected by an infrared laser speckle projector is preferred: it is invisible to the naked eye, harmless to the human body, unaffected by external illumination, usable in dark environments, and suitable for positioning and map construction in large-scale indoor environments.
  • The present embodiment provides a schematic robot indoor positioning system, which includes an infrared laser speckle projector 1, a wide-angle infrared camera 3 fixed to the robot, and a robot chassis 2 equipped with an odometer.
  • A plurality of infrared laser speckle projectors are placed uniformly indoors and project the speckle landmarks onto the ceiling; the speckle landmarks are randomly distributed, and each speckle landmark pattern is different, for easy identification.
  • The number and placement of the infrared laser speckle projectors are determined by the projection range of each projector, minimizing overlap between infrared speckle regions so as to simplify identification and further simplify computation; the field of view of the corresponding infrared camera and the camera's height below the ceiling are considered together, so that the infrared camera has no observation dead angle, i.e. the camera can capture at least one speckle landmark anywhere in the scene.
  • This embodiment preferably uses infrared laser projectors with a wavelength of 830 nm, so that the positioning range covered by a single infrared laser projector is as wide as possible.
  • An infrared wide-angle camera mounted on the robot perpendicular to the ceiling is preferably used for speckle recognition, so that the infrared camera has a sufficiently large field of view.
  • A wide-angle lens with a focal length of 2.8 mm or 2.5 mm is preferably mounted, with a narrow-band filter of center wavelength 830 nm mounted in front of the lens, so that only the light of the infrared speckle is imaged.
  • The graph-based SLAM method of the present embodiment converts abstract raw sensor measurements into a simply constructed graph optimization problem.
  • The raw measurement data are replaced by the edges in the graph.
  • An edge between two nodes of the topology map is labeled by the probability distribution of the relative positions of the robot poses and is constrained by the sensor measurements, as shown in FIG. 8, where node $x_t$ denotes the robot pose, $l_i$ the landmark position, $z_{t,i}$ the constraint between a landmark and the robot, and $z_{t-1,t}$ the constraint between the robot poses at time t-1 and time t; $z_{t,i}$ can be measured by the landmark detection sensor, and $z_{t-1,t}$ by the odometer.
  • A model is built from the constraint relationships between the landmarks and the robot poses, and the problems of robot positioning and map construction are solved by graph optimization.
  • Let $z_{ij}$ denote the observation constraint between node i and node j, $\Omega_{ij}$ the information matrix of that observation constraint (the inverse of the error covariance matrix), and $\hat z_{ij}(x_i, x_j)$ the prediction: the pose transformation between node i and node j, i.e. the odometer measurement.
  • The pose of the robot comprises its coordinates (x, y) on the plane and its orientation θ, and belongs to the SE(2) space.
  • A landmark can be represented by its coordinates in two-dimensional space. With robot pose $x_t = (x_t, y_t, \theta_t)^T$ and landmark position $l_i = (l_{i,x}, l_{i,y})^T$, each time the robot moves a certain distance or rotates by a certain angle, its current pose can be obtained from the odometer measurement, and the landmarks currently detected by the robot are found from the image captured by the camera.
  • The pose of the robot is obtained from the odometer measurement data.
  • Given the robot pose $x_t = (x_t, y_t, \theta_t)^T$ at time t, let the odometer measure a travelled distance Δd between t and t+1, and let the heading change of the robot on the horizontal plane over that interval be ω; the pose $x_{t+1}$ at time t+1 is then computed by the dead-reckoning formula $x_{t+1} = x_t + \Delta d\cos\theta_t$, $y_{t+1} = y_t + \Delta d\sin\theta_t$, $\theta_{t+1} = \theta_t + \omega$.
  • The robot images the environment in real time and, by recognizing the speckle landmarks, determines the position of each speckle landmark in the image; according to the camera model this is converted into the camera coordinate system, giving the position of the speckle landmark relative to the robot as the observation.
  • The robot realizes the observation by recognizing the infrared speckle.
  • The image captured by the robot through the infrared camera is normalized; the infrared image captured by the infrared camera is a grayscale image, and normalizing it avoids the influence of speckle intensity on the detection of the speckle landmarks.
  • The embodiment of the present invention selects an adaptive-threshold method to binarize the grayscale image, which can be implemented with the adaptiveThreshold function in the OpenCV library.
  • Contour lines are detected in the binarized image and candidate speckle landmark objects are found; since the dots within a speckle are separated, in order to extract the contour of the speckle landmark pattern the invention first applies Gaussian blurring, dilation, erosion and similar operations to the image, so that the dots of the speckle are connected together.
  • The target objects can then be found by detecting contour lines in the image.
  • The image contours are extracted with the FindContours function in the OpenCV library, each contour being represented by a stored sequence of the consecutive points that constitute it.
  • The speckle landmarks of this embodiment are rectangular, so the contours should also be rectangular; however, non-quadrilateral contours often remain after culling.
  • The contours are further processed by polygon approximation, and the vertices of the approximating polygon are recorded: if the number of polygon vertices is greater than 4, the polygon is not a speckle landmark to be detected and is discarded. Since an infrared camera is used, environmental interference indoors is small, and this processing further determines whether a contour is a speckle landmark.
  • Because of differences in distance and orientation, the captured image is deformed accordingly, so the image is mapped onto a plane perpendicular to the camera in order to recover the true shape of the objects in it.
  • The captured image is transformed onto a plane perpendicular to the camera using the GetPerspectiveTransform and WarpPerspective functions in the OpenCV library.
  • The quadrilaterals retained in the image then correspond to the real shapes in the environment.
  • The minAreaRect function is used to compute the rectangle corresponding to the speckle landmark, and the center point (u, v) of the rectangle is taken as the position of the speckle landmark.
  • The coordinates of a speckle landmark on the image plane can be used to determine its three-dimensional position in the camera coordinate system, i.e. pose estimation from two dimensions to three.
  • This embodiment is based on the camera projection model $s\,(u, v, 1)^T = M\,(x, y, 1)^T$.
  • The position of the landmark is converted from the image coordinate system to the camera coordinate system, i.e. the robot's corresponding coordinate system.
  • M is the mapping matrix composed of the camera's intrinsic and distortion parameters.
  • (u, v) is a point in the image coordinate system.
  • (x, y) is a point in the camera coordinate system; since the distance from the camera to the ceiling is fixed, the coordinate position of the speckle image relative to the camera, projected onto the horizontal plane, is obtained.
  • A method for indoor positioning of a robot according to an embodiment of the present invention further includes:
  • The pose relationship of the robot between time t and time t+1 is measured by an odometer.
  • The odometer measurement is a Euclidean transformation in the SE(2) group.
  • The measurement noise is assumed to be Gaussian white noise.
  • The measurement error is represented by a 3×3 symmetric matrix, the information matrix, which depends on the motion of the robot: for example, the more the robot moves, the greater its uncertainty. The edge between nodes $x_t$ and $x_{t+1}$ in the topology map is constrained by the following two quantities: $z_{t,t+1}$, the motion between the nodes, and $\Omega_{t,t+1}$, the measured covariance matrix, a symmetric positive-definite matrix.
  • When the robot detects landmark $l_i$ at position $x_t$, the corresponding edge in the graph can be modeled from the current pose of the robot and the position of the landmark.
  • The measurement of a landmark is represented by two-dimensional coordinates (x, y) on the plane; assuming Gaussian white noise, the noise can be modeled by the inverse of its covariance matrix.
  • The model of an edge between a robot pose and a landmark is represented by the following parameters: $z_{t,i}$, the position of the landmark detected by the robot at $x_t$, and $\Omega_{t,i}$, the covariance matrix of the landmark measurement.
  • The optimization goal of the graph in this embodiment is to find the node configuration $x^*$ that minimizes the log-likelihood function F(x) of all measurement errors.
  • The information matrix H maps the measurement errors into the robot's trajectory through the Jacobians, so its structure is sparse, the non-zero entries being the observation constraints between poses.
  • The information matrix H and the residual vector b of the system are accumulations of a series of matrices and vectors, each corresponding to one constraint; every constraint contributes to the system.
  • Each constraint contributes an additive term, whose structure depends on the Jacobian of the system's error function. Since the error function of a constraint depends only on the values of the two nodes it connects, the Jacobian of equation (6) has the form $J_{ij} = (0 \cdots A_{ij} \cdots B_{ij} \cdots 0)$.
  • $A_{ij}$ and $B_{ij}$ are the derivatives of the error function with respect to node i and node j, respectively.
  • From equation (7), the structure of the matrix blocks and coefficient vector is: $H_{ii} = A_{ij}^T\Omega_{ij}A_{ij}$, $H_{ij} = A_{ij}^T\Omega_{ij}B_{ij}$, $H_{ji} = B_{ij}^T\Omega_{ij}A_{ij}$, $H_{jj} = B_{ij}^T\Omega_{ij}B_{ij}$; $b_i = A_{ij}^T\Omega_{ij}e_{ij}$, $b_j = B_{ij}^T\Omega_{ij}e_{ij}$.
  • Step S5 includes:
  • S50: determining whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot.
  • Whenever the robot moves more than 0.5 m or rotates more than 0.5°, a new node is inserted into the topology map, and as the robot roams the indoor environment, the entire topology map is constructed.
  • A solid arrow between adjacent nodes represents the odometer constraint between the two nodes, with the odometer estimate refined by speckle-landmark matching between consecutive frames; a dashed arrow between non-adjacent nodes represents a constraint from the camera's observations. The robot begins to explore the unknown area, and its camera observations are matched against past observations.
  • Step S6 includes:
  • S60: defining, according to the closed-loop constraint relationship, a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors.
  • Landmarks in Euclidean coordinates can express the error directly as a difference.
  • The error function of a landmark is $e(x_t, l_i, z_{t,i}) = \hat z_{t,i}(x_t, l_i) - z_{t,i}$.
  • The odometer measurement lies in the SE(2) space; using the $\ominus$ operation, the measurement function can be written as $\hat z_{t,t+1} = x_{t+1} \ominus x_t$.
  • The first measurement error function is $e(x_t, x_{t+1}, z_{t,t+1}) = (x_{t+1} \ominus x_t) - z_{t,t+1}$.
  • S61: selecting incremental parameters of the pose parameters at adjacent times during the robot's movement.
  • The landmark positions of this embodiment are points in Euclidean space, and the corresponding increments can be added directly.
  • The pose parameters of the robot do not belong to a Euclidean space.
  • In SE(2) this space may be parameterized in several ways, for example by a rotation matrix R(θ) and translation vector (x, y)^T, or by a rotation angle θ and translation vector (x, y)^T.
  • The rotation angle θ and translation vector (x, y)^T can be selected, and a ⊕ operation is defined to express the incremental relationship between the robot's poses at adjacent times. Since angles do not belong to a Euclidean space, increments cannot simply be added, and the angle must be renormalized after each accumulation. In this patent, given the robot pose $x_t = (x, y, \theta)^T$ and increment $\Delta x = (\Delta x, \Delta y, \Delta\theta)^T$, a composition operation $x_t \oplus \Delta x$ with renormalized angle and its inverse $\ominus$ are defined.
  • Equation (3) is the linearized representation of graph-based SLAM.
  • Equation (4) is its solution.
  • The incremental parameters are used as the update step; in each iteration, the solution x* of the previous iteration is used as the initial pose of the next iteration.
  • Most of the structure in the system is sparse.
  • A sparse matrix can be used to store the Hessian matrix H; because of the symmetry of the information matrix H, only the upper half needs to be computed, improving computational efficiency.
  • Step S62 includes:
  • Convergence in this step means a tendency to approach some specified value; if such a tendency exists, the iteration is convergent.
  • The specified value approached most closely in this step is the solution of the first measurement error function: once the optimized value is found, the loop is exited and the Hessian matrix H is computed in the starting space, at which point x* is the final result of the SLAM system.
  • After step S620 the method includes:
  • When the robot pose and the landmark positions are updated and no convergence is found, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors is defined according to the closed-loop constraint relationship, and the loop judgment continues.
  • Step S1 includes:
  • S10: determining, from the speckle information, whether the photographed landmark is a known speckle landmark.
  • The speckle information of this step includes the speckle position, speckle area, speckle shape and so on; a newly photographed speckle landmark is matched by template against previously photographed speckle landmarks to determine whether it is new: if a previously photographed speckle landmark provides a matching template, it is not a new speckle landmark, and otherwise it is a new one.
  • Each distinct speckle landmark is assigned unique ID information, to facilitate identifying the speckle landmark and estimating the position and pose of the marker relative to the robot.
  • After step S10 the method includes:
  • A method for indoor positioning of a robot according to another embodiment of the present invention includes, before step S1:
  • S101: calibrating the intrinsic and distortion parameters of the camera.
  • The mapping matrix M composed of the camera's intrinsic and distortion parameters is calibrated, so that the position of a landmark can be converted from the image coordinate system to the camera coordinate system, i.e. the robot's corresponding coordinate system.
  • A device for indoor positioning of a robot includes:
  • an identification module 1 configured to identify speckle landmarks preset on an indoor ceiling;
  • a forming module 2 configured to form a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
  • a conversion module 3 configured to convert image coordinates in the topology map into corresponding position information in robot coordinates;
  • a positioning module 4 configured to locate the robot according to the corresponding position information in robot coordinates.
  • A further device for indoor positioning of a robot includes:
  • a determining module 5 configured to determine the closed-loop constraint relationship between the current pose of the robot and the speckle landmarks;
  • an optimization module 6 configured to optimize the robot coordinates according to the closed-loop constraint relationship.
  • The determining module 5 of an embodiment of the present invention includes:
  • a first judging unit 50 configured to judge whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot;
  • a determining unit 51 configured to determine, if they are the same, that the closed-loop constraint relationship exists.
  • The optimization module 6 of an embodiment of the present invention includes:
  • a defining unit 60 configured to define, according to the closed-loop constraint relationship, a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors;
  • a selecting unit 61 configured to select incremental parameters of the pose parameters at adjacent times during the robot's movement;
  • a solving unit 62 configured to solve the first measurement error function iteratively according to the incremental parameters, to obtain the optimized robot coordinates.
  • The solving unit 62 includes:
  • a judging sub-unit 620 configured to judge, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function;
  • a determining sub-unit 621 configured to determine, if so, that the corresponding function value is the solution of the first measurement error function.
  • The solving unit 62 further includes:
  • a redefining sub-unit 622 configured to define, if not, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors, according to the closed-loop constraint relationship.
  • The identification module 1 of an embodiment of the present invention includes:
  • a second judging unit 10 configured to judge, from the speckle information, whether a captured speckle landmark is a known speckle landmark;
  • a retrieving unit 11 configured to retrieve, if so, the ID information of the speckle landmark;
  • an identifying unit 12 configured to identify the speckle landmark by the ID information.
  • The identification module 1 further includes:
  • a matching unit 13 configured to match, if not, corresponding ID information to the speckle landmark.
  • A device for indoor positioning of a robot further includes:
  • a calibration module 101 configured to calibrate the intrinsic and distortion parameters of the camera;
  • a storage module 102 configured to store the mapping matrix composed of the intrinsic and distortion parameters.
  • The device for indoor positioning of a robot and the method for indoor positioning of a robot provided in the above embodiments are based on the same inventive concept.

Abstract

The invention discloses a method and a device for indoor positioning of a robot. The method provided by the invention comprises: identifying speckle landmarks preset on an indoor ceiling; forming a topology map of image positions according to the speckle landmarks and the position changes of the robot; converting image coordinates in the topology map into corresponding positions in robot coordinates; and locating the robot according to the corresponding positions in robot coordinates. By using speckle landmarks preset on the ceiling as position identifiers, the invention facilitates image recognition, reduces the complexity and computational load of the front-end image processing algorithm, improves the real-time performance of the positioning system, and thereby improves the accuracy of indoor robot positioning.

Description

Method and device for indoor positioning of a robot — Technical field
The present invention relates to the field of robots, and in particular to a method and a device for indoor positioning of a robot.
Background art
Simultaneous Localization and Mapping (SLAM) is a well-known problem in the field of robotics. A mobile robot equipped with sensors estimates a map of the environment and the robot's position from the sensors' measurement data. Sensors can be divided into robot-external and internal sensors: external sensors measure the robot's motion relative to the external environment and include cameras, laser scanners, accelerometers and GPS, while internal sensors measure changes in the robot's own state (position) relative to the previous moment and include odometers and gyroscopes. In traditional SLAM methods such as EKF-SLAM (Extended Kalman Filter based Simultaneous Localization and Mapping), internal sensors are used to measure the robot's state changes while external sensors correct the measurement error. Existing SLAM applications have two typical scenarios: tracking, in which the robot's initial position is usually known; and global positioning, in which little or no a priori information about the robot's starting position or environmental features is given. The core problem of SLAM is therefore the posterior estimation of the robot's motion path and the features of its environment, and solving it requires building an appropriate model from which the posterior probability can be computed.
A wide variety of solutions to the core SLAM problem have emerged in the robotics community. Among them, the EKF-SLAM method based on the extended Kalman filter has gained wide recognition and is commonly applied. EKF-SLAM first estimates the state at some moment of the motion, then obtains feedback in the form of (noisy) measurement variables, and finally corrects the estimate according to that feedback. In this way EKF-SLAM can efficiently estimate the past, present and even future states of the motion without detailed knowledge of the robot. These methods still have the following drawbacks, however: landmarks are often placed on the ground, where they are easily disturbed by other robots passing by; simple landmarks are easy to process in images but provide no error correction, while complex landmarks provide full error correction but their patterns are too complicated and their real-time performance is poor; and because measurement errors accumulate and the computational complexity grows rapidly with the number of nodes, the computational load is too large to extend easily to indoor positioning in large environments.
The prior art therefore still awaits improvement.
Technical problem
The main object of the present invention is to provide a method and a device for indoor positioning, aiming to solve the technical problem that landmark data processing in existing indoor-positioning SLAM methods is prone to measurement error and inaccurate positioning.
Technical solution
The present invention provides a method for indoor positioning of a robot, comprising:
identifying speckle landmarks preset on an indoor ceiling;
forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
converting image coordinates in the topology map into corresponding position information in robot coordinates;
locating the robot according to the corresponding position information in robot coordinates.
The present invention also provides a device for indoor positioning of a robot, comprising:
an identification module for identifying speckle landmarks preset on an indoor ceiling;
a forming module for forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
a conversion module for converting image coordinates in the topology map into corresponding position information in robot coordinates;
a positioning module for locating the robot according to the corresponding position information in robot coordinates.
Beneficial effects
Beneficial technical effects of the invention: using speckle landmarks preset on the ceiling as position identifiers facilitates image recognition, reduces the complexity and computational load of the front-end image processing algorithm, improves the real-time performance of the positioning system, and thereby improves the accuracy of indoor robot positioning. Moreover, the invention uses infrared laser projectors to actively project infrared speckle onto the ceiling; this is unaffected by external illumination, works in dark environments, and is suitable for positioning and map construction in large-scale indoor environments.
Brief description of the drawings
FIG. 1 is a schematic flow chart of a method for indoor positioning of a robot according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of an optimization method for indoor positioning of a robot according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of step S5 according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart of step S6 according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of step S62 according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of step S1 according to an embodiment of the present invention;
FIG. 7 is a schematic flow chart of a method for indoor positioning of a robot according to another embodiment of the present invention;
FIG. 8 is a schematic diagram of a topology map according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of closed-loop constraint relationships in a topology map according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a robot indoor positioning system according to an embodiment of the present invention;
FIG. 11 is a schematic structural diagram of a robot indoor positioning device according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of an optimized structure of a robot indoor positioning device according to an embodiment of the present invention;
FIG. 13 is a schematic structural diagram of a determining module according to an embodiment of the present invention;
FIG. 14 is a schematic structural diagram of an optimization module according to an embodiment of the present invention;
FIG. 15 is a schematic structural diagram of a solving unit according to an embodiment of the present invention;
FIG. 16 is a schematic structural diagram of an identification module according to an embodiment of the present invention;
FIG. 17 is a schematic structural diagram of a robot indoor positioning device according to another embodiment of the present invention. The implementation, functional features and advantages of the present invention will be further described with reference to the embodiments and the accompanying drawings.
Best mode of carrying out the invention
It should be understood that the specific embodiments described here are only intended to explain the present invention and are not intended to limit it.
Referring to FIG. 1, a method for indoor positioning of a robot according to an embodiment of the present invention comprises:
S1: identifying speckle landmarks preset on an indoor ceiling.
The speckle landmarks in this step include picture speckle pre-applied to the ceiling and pre-projected light-projection speckle. This embodiment preferably uses infrared speckle projected by an infrared laser speckle projector, which is invisible to the naked eye, harmless to the human body, unaffected by external illumination, usable in dark environments, and suitable for positioning and map construction in large-scale indoor environments. As shown in FIG. 10, this embodiment provides a schematic robot indoor positioning system comprising an infrared laser speckle projector 1, a wide-angle infrared camera 3 fixed on the robot, and a robot chassis 2 equipped with an odometer. In this embodiment, a plurality of infrared laser speckle projectors are preferably placed uniformly indoors to project the speckle landmarks onto the ceiling; the speckle landmarks are randomly distributed, and each speckle landmark pattern is different, for easy identification. The number and placement of the infrared laser speckle projectors are determined by the projection range of each projector, minimizing overlap between infrared speckle regions so as to simplify identification and further simplify computation; the field of view of the corresponding infrared camera and the camera's distance from the ceiling are considered together, so that the infrared camera has no observation dead angle, i.e. the camera can capture at least one speckle landmark anywhere in the scene. This embodiment preferably uses infrared laser projectors with a wavelength of 830 nm, so that the positioning range covered by a single projector is as wide as possible. An infrared wide-angle camera mounted on the robot perpendicular to the ceiling is preferably used for speckle recognition, so that the camera has a sufficiently large field of view. A wide-angle lens with a focal length of 2.8 mm or 2.5 mm is preferred, with a narrow-band filter of center wavelength 830 nm mounted in front of the lens, so that only the light of the infrared speckle is imaged.
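To make the "no observation dead angle" requirement concrete, the footprint that the upward-facing camera sees on the ceiling can be estimated from the lens focal length. The following is a minimal sketch, not part of the patent; the 1/3-inch sensor size (4.8 mm × 3.6 mm) and the 2 m camera-to-ceiling distance are illustrative assumptions. Projector spacing can then be checked so that at least one speckle landmark always falls inside this footprint.

```python
def ceiling_footprint(focal_mm, sensor_w_mm, sensor_h_mm, height_m):
    """Width and height (m) of the ceiling patch imaged by an upward-facing
    pinhole camera mounted height_m below the ceiling."""
    # Similar triangles: footprint = height * sensor_size / focal_length.
    return (height_m * sensor_w_mm / focal_mm,
            height_m * sensor_h_mm / focal_mm)

# Assumed: 2.8 mm lens, 1/3" sensor (4.8 mm x 3.6 mm), ceiling 2 m above the camera.
w, h = ceiling_footprint(2.8, 4.8, 3.6, 2.0)
print(f"ceiling footprint: {w:.2f} m x {h:.2f} m")  # ~3.43 m x 2.57 m
```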
S2: forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot.
The graph-based SLAM method of this embodiment converts abstract raw sensor measurements into a simply constructed graph optimization problem. The raw measurement data are replaced by edges in the graph; an edge between two nodes of the topology map is labeled by the probability distribution of the relative positions of the robot poses and is constrained by the sensor measurements, as shown in FIG. 8, where node $x_t$ denotes the robot pose, $l_i$ denotes the landmark position, $z_{t,i}$ denotes the constraint between a landmark and the robot, and $z_{t-1,t}$ denotes the constraint between the robot poses at time t-1 and time t; $z_{t,i}$ can be measured by the landmark detection sensor, and $z_{t-1,t}$ can be measured by the odometer. This embodiment builds a model from the constraint relationships between landmarks and robot poses, and solves the problems of robot positioning and map construction by graph optimization.
Let $x = \{x_1, \ldots, x_T\}$ denote the pose vector of the nodes in the topology map, where $x_i$ is the pose of the i-th node. As shown in FIG. 9, $z_{ij}$ denotes the observation constraint between node i and node j, $\Omega_{ij}$ denotes the information matrix (the inverse of the error covariance matrix) of that observation constraint, and $\hat z_{ij}(x_i, x_j)$ denotes the prediction: the pose transformation between node i and node j, i.e. the odometer measurement.
S3: converting the image coordinates in the topology map into corresponding position information in robot coordinates.
The pose of the robot comprises its coordinates (x, y) on the plane and its orientation θ, and belongs to the SE(2) space; a landmark can be represented by its coordinates in two-dimensional space, with robot pose $x_t = (x_t, y_t, \theta_t)^T$ and landmark position $l_i = (l_{i,x}, l_{i,y})^T$. Each time the robot moves a certain distance or rotates by a certain angle, its pose at the current moment can be obtained from the odometer measurement, and the landmarks currently detected by the robot are found from the image captured by the camera.
The pose of the robot is obtained from the odometer measurement data. Let the pose of the robot at time t be $x_t = (x_t, y_t, \theta_t)^T$, let the odometer measure a travelled distance Δd between t and t+1, and let the heading change of the robot on the horizontal plane over that interval be ω. The pose $x_{t+1}$ of the robot at time t+1 is then calculated by the dead-reckoning formula:

$$x_{t+1} = \begin{pmatrix} x_{t+1} \\ y_{t+1} \\ \theta_{t+1} \end{pmatrix} = \begin{pmatrix} x_t + \Delta d \cos\theta_t \\ y_t + \Delta d \sin\theta_t \\ \theta_t + \omega \end{pmatrix}$$
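As an illustration, the dead-reckoning update above can be written as a short function. This is a sketch under the assumption that ω is the heading increment accumulated over the odometer interval:

```python
import math

def dead_reckoning(x, y, theta, delta_d, omega):
    """Propagate the SE(2) pose (x, y, theta) by the odometer-measured
    distance delta_d and heading change omega."""
    x_new = x + delta_d * math.cos(theta)
    y_new = y + delta_d * math.sin(theta)
    theta_new = (theta + omega + math.pi) % (2 * math.pi) - math.pi  # keep angle in (-pi, pi]
    return x_new, y_new, theta_new
```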
The robot images the environment in real time; by recognizing the speckle landmarks, the position of each speckle landmark in the image is determined and, according to the camera model, converted into the camera coordinate system to obtain the position of the speckle landmark relative to the robot, which serves as the observation. In this embodiment of the invention, the robot realizes observation by recognizing the infrared speckle.
The image captured by the robot through the infrared camera is normalized; the infrared image captured by the infrared camera is a grayscale image, and normalizing it avoids the influence of speckle intensity on speckle landmark detection.
The normalized image is then binarized. Considering the influence of the projected speckle's illumination intensity, this embodiment of the invention selects an adaptive-threshold method to binarize the grayscale image, which can be implemented with the adaptiveThreshold function of the OpenCV library.
Contour lines are detected in the binarized image and candidate speckle landmark objects are found. Since the individual dots within a speckle pattern are separated, in order to extract the contour of a speckle landmark pattern the invention first applies Gaussian blurring, dilation, erosion and similar operations to the image, so that the dots of the speckle are connected together and each speckle in the image becomes a black-and-white rectangular region. Because the speckle landmarks of this embodiment are rectangular, with closed contours and a specific size, the target objects can be found by detecting contour lines in the image. This embodiment extracts image contours with the FindContours function of the OpenCV library, representing each contour by a stored sequence of the consecutive points that constitute it. Contours that are too small cannot be landmarks, so the contour area is computed with the ContourArea function, and non-speckle contours whose area is too small or too large are culled. The speckle landmarks of this embodiment are rectangular, so the contours should also be rectangular; however, non-quadrilateral contours often remain after culling. The contours are further processed by polygon approximation and the vertices of the approximating polygon are recorded: if the number of vertices is greater than 4, the polygon is not a speckle landmark to be detected and is discarded. Since an infrared camera is used, environmental interference indoors is small, and this processing further determines whether a contour is a speckle landmark. Meanwhile, when the camera observes an object, the resulting image is deformed according to the distance and orientation; the image must therefore be mapped onto a plane perpendicular to the camera to recover the true shape of the objects in it. Using the GetPerspectiveTransform and WarpPerspective functions of the OpenCV library, the image is transformed onto a plane perpendicular to the camera, so that the quadrilaterals retained in the image correspond to the real shapes in the environment. Finally, the minAreaRect function computes the rectangle corresponding to each speckle landmark, and the center point (u, v) of the rectangle is taken as the position of the speckle landmark.
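A minimal OpenCV (Python) sketch of the detection pipeline just described follows. The adaptive-threshold block size, the morphology kernel and the contour-area bounds are illustrative assumptions, not values taken from the patent, and the perspective correction with getPerspectiveTransform/warpPerspective is omitted for brevity:

```python
import cv2
import numpy as np

def detect_speckle_landmarks(gray, min_area=500, max_area=50000):
    """Return image positions (u, v) of candidate speckle landmarks
    in a grayscale infrared frame."""
    # Normalize so speckle intensity variations do not affect detection.
    norm = cv2.normalize(gray, None, 0, 255, cv2.NORM_MINMAX)
    # Adaptive thresholding binarizes despite uneven projection brightness.
    binary = cv2.adaptiveThreshold(norm, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY, 31, -5)
    # Blur, dilate and erode so the separated speckle dots merge into one blob.
    binary = cv2.GaussianBlur(binary, (5, 5), 0)
    kernel = np.ones((9, 9), np.uint8)
    binary = cv2.erode(cv2.dilate(binary, kernel, iterations=2), kernel)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        area = cv2.contourArea(c)
        if not (min_area < area < max_area):   # cull too-small / too-large contours
            continue
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) > 4:                    # landmarks are rectangular
            continue
        (u, v), _, _ = cv2.minAreaRect(c)      # rectangle center = landmark position
        centers.append((u, v))
    return centers
```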
To obtain the positional relationship between a speckle landmark and the robot, the coordinates of the speckle landmark on the image plane are used to determine its three-dimensional position in the camera coordinate system, i.e. pose estimation from two dimensions to three. This embodiment uses the camera projection model

$$s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = M \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$

to convert the position of a landmark from the image coordinate system to the camera coordinate system, i.e. the robot's corresponding coordinate system, where M is the mapping matrix composed of the camera's intrinsic and distortion parameters, (u, v) is a point in the image coordinate system, and (x, y) is a point in the camera coordinate system. Since the distance from the camera to the ceiling is fixed, the coordinate position of the speckle image relative to the camera, projected onto the horizontal plane, is obtained.
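A hedged sketch of this image-to-robot-frame conversion using OpenCV's undistortPoints, which applies the calibrated intrinsics and distortion parameters; the matrix K, the distortion vector and the ceiling distance below are placeholders, not calibrated values from the patent:

```python
import cv2
import numpy as np

K = np.array([[600.0, 0.0, 320.0],   # placeholder intrinsic matrix from calibration
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                   # placeholder distortion coefficients
H_CEILING = 2.0                      # fixed camera-to-ceiling distance (m), assumed

def image_to_camera_plane(u, v):
    """Map an image point (u, v) to metric (x, y) on the ceiling plane."""
    pts = np.array([[[u, v]]], dtype=np.float64)
    # undistortPoints returns normalized camera coordinates (x/z, y/z).
    xn, yn = cv2.undistortPoints(pts, K, dist)[0, 0]
    # With the camera-to-ceiling distance fixed, scale by that depth.
    return xn * H_CEILING, yn * H_CEILING
```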
S4: locating the robot according to the corresponding position information in robot coordinates.
Referring to FIG. 2, further, the method for indoor positioning of a robot according to an embodiment of the present invention comprises, after step S3:
S5: determining the closed-loop constraint relationship between the current pose of the robot and the speckle landmarks.
In this embodiment, the pose relationship of the robot between time t and time t+1 is measured by the odometer as $z_{t,t+1}$, which represents the relative motion of the robot within a fixed time interval. Because the odometer measurement is affected by noise, it usually deviates slightly from the true value. The odometer measurement is a Euclidean transformation in the SE(2) group; assuming the measurement noise is Gaussian white noise, the measurement error is represented by a 3×3 symmetric matrix, the information matrix, which depends on the robot's motion. For example, the more the robot moves, the larger its uncertainty. The edge between nodes $x_t$ and $x_{t+1}$ of the topology map is constrained by the following two quantities: $z_{t,t+1}$, which represents the motion between the nodes, and $\Omega_{t,t+1}$, the covariance matrix of the measurement, a symmetric positive-definite matrix. When the robot detects landmark $l_i$ at position $x_t$, the corresponding edge in the graph can be modeled from the robot's current pose and the landmark's position. The landmark measurement is represented by two-dimensional coordinates (x, y) on the plane. Assuming Gaussian white noise, the noise can be modeled by the inverse of its covariance matrix. The model of an edge from a robot pose to a landmark is represented by the following parameters: $z_{t,i}$, the position of the landmark detected by the robot at $x_t$, and $\Omega_{t,i}$, the covariance matrix of the landmark measurement.
S6: optimizing the robot coordinates according to the closed-loop constraint relationship.
The log-likelihood function $l_{ij}$ of an observation $z_{ij}$ is expressed as

$$l_{ij} \propto \left(z_{ij} - \hat z_{ij}(x_i, x_j)\right)^T \Omega_{ij} \left(z_{ij} - \hat z_{ij}(x_i, x_j)\right)$$

Define the measurement error function $e(x_i, x_j, z_{ij})$ as the error between the expected estimate $\hat z_{ij}(x_i, x_j)$ and the actual observation $z_{ij}$:

$$e(x_i, x_j, z_{ij}) = z_{ij} - \hat z_{ij}(x_i, x_j) \qquad (2)$$

so the measurement error function is expressed as

$$F(x) = \sum_{\langle i,j \rangle} e_{ij}^T \, \Omega_{ij} \, e_{ij} \qquad (3)$$

The optimization goal of the graph in this embodiment is to find the node configuration that minimizes the log-likelihood function F(x) of all measurement errors:

$$x^* = \arg\min_x F(x) \qquad (4)$$

Given a reasonably accurate initial robot pose $\breve x$, equation (4) is solved with the Gauss-Newton numerical optimization algorithm, using a first-order Taylor expansion around $\breve x$ to approximate the error function:

$$e_{ij}(\breve x + \Delta x) \approx e_{ij}(\breve x) + J_{ij} \Delta x \qquad (5)$$

where $J_{ij}$ is the Jacobian of $e_{ij}(x)$. Substituting into equation (3) yields the approximate representation of $F_{ij}$:

$$F_{ij}(\breve x + \Delta x) \approx c_{ij} + 2 b_{ij}^T \Delta x + \Delta x^T H_{ij} \Delta x \qquad (6)$$

with $c_{ij} = e_{ij}^T \Omega_{ij} e_{ij}$, $b_{ij} = J_{ij}^T \Omega_{ij} e_{ij}$ and $H_{ij} = J_{ij}^T \Omega_{ij} J_{ij}$. With this local approximation the function is rewritten as

$$F(\breve x + \Delta x) \approx c + 2 b^T \Delta x + \Delta x^T H \Delta x \qquad (7)$$

where $c = \sum c_{ij}$, $b = \sum b_{ij}$ and $H = \sum H_{ij}$. F(x) is minimized by solving the linear system

$$H \Delta x^* = -b \qquad (8)$$

The information matrix H maps the measurement errors into the robot's trajectory through the Jacobians, so its structure is sparse, the non-zero entries being the observation constraints between poses.
According to equation (8), the system's information matrix H and residual vector b are accumulations of a series of matrices and vectors, each corresponding to one constraint; every constraint contributes an additive term to the system, and the structure of that term depends on the Jacobian of the system's error function. Since the error function of a constraint depends only on the values of the two nodes it connects, the Jacobian in equation (6) has the following form:

$$J_{ij} = \begin{pmatrix} 0 & \cdots & A_{ij} & \cdots & B_{ij} & \cdots & 0 \end{pmatrix} \qquad (9)$$

where $A_{ij}$ and $B_{ij}$ are the derivatives of the error function with respect to node i and node j, respectively. From equation (7), the structure of the matrix blocks of $H_{ij}$ and the coefficient vector $b_{ij}$ is obtained as follows:

$$H_{ij}^{[ii]} = A_{ij}^T \Omega_{ij} A_{ij}, \quad H_{ij}^{[ij]} = A_{ij}^T \Omega_{ij} B_{ij}, \quad H_{ij}^{[ji]} = B_{ij}^T \Omega_{ij} A_{ij}, \quad H_{ij}^{[jj]} = B_{ij}^T \Omega_{ij} B_{ij}; \qquad b_{ij}^{[i]} = A_{ij}^T \Omega_{ij} e_{ij}, \quad b_{ij}^{[j]} = B_{ij}^T \Omega_{ij} e_{ij} \qquad (10)$$

For simplicity, zero terms are omitted from the formulas.
Referring to FIG. 3, further, in the method for indoor positioning of a robot according to an embodiment of the present invention, step S5 comprises:
S50: judging whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot.
In this embodiment, whenever the robot moves more than 0.5 m or rotates more than 0.5°, a new node is inserted into the topology map; as the robot roams the indoor environment, the entire topology map is built up. A solid arrow between nodes at adjacent times represents the odometer constraint between the two nodes, with the odometer estimate refined by speckle-landmark matching between consecutive image frames; a dashed arrow between nodes at non-adjacent times represents a constraint from the camera observations. The robot explores the unknown area and matches its camera observations against past observations, as sketched below.
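The node-insertion and loop-closure logic just described could be sketched as follows; graph.add_pose_node, add_odometry_edge, add_landmark_node and add_observation_edge are hypothetical helpers standing in for whatever graph back end is actually used:

```python
import math

MOVE_THRESH = 0.5                  # meters, threshold from the text
ROT_THRESH = math.radians(0.5)     # 0.5 degrees, threshold from the text

def maybe_insert_node(graph, last_node, pose, observed_ids, landmark_nodes):
    """Insert a pose node once the robot has moved or turned enough; add an
    odometry edge plus one observation edge per visible speckle landmark.
    Re-observing a known landmark ID yields a loop-closure constraint."""
    dx = pose[0] - last_node.pose[0]
    dy = pose[1] - last_node.pose[1]
    dtheta = abs(pose[2] - last_node.pose[2])
    if math.hypot(dx, dy) < MOVE_THRESH and dtheta < ROT_THRESH:
        return last_node
    node = graph.add_pose_node(pose)             # hypothetical graph API
    graph.add_odometry_edge(last_node, node)     # solid arrow between adjacent nodes
    for lid in observed_ids:
        if lid not in landmark_nodes:            # first sighting: create landmark node
            landmark_nodes[lid] = graph.add_landmark_node(lid)
        graph.add_observation_edge(node, landmark_nodes[lid])  # dashed arrow / loop closure
    return node
```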
S51: if they are the same, determining that the closed-loop constraint relationship exists.
If the same speckle landmark is observed from different positions, a closed-loop constraint relationship is formed.
Referring to FIG. 4, further, in the method for indoor positioning of a robot according to an embodiment of the present invention, step S6 comprises:
S60: according to the closed-loop constraint relationship, defining a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors.
Construct the error function between robot pose $x_t$ and landmark $l_i$, where $\hat z_{t,i}$ denotes the virtual measurement, i.e. the position of landmark $l_i$ as observed by the robot from $x_t$, expressed by equation (1) as $\hat z_{t,i}(x_t, l_i)$. Landmarks live in Euclidean coordinates, so the error can be expressed directly as their difference; the error function of a landmark is

$$e(x_t, l_i, z_{t,i}) = \hat z_{t,i}(x_t, l_i) - z_{t,i}$$

Define the measurement error function between the robot's adjacent measured poses $x_t$ and $x_{t+1}$. The odometer measurement lies in the SE(2) space; using the $\ominus$ operation, the measurement function can be written as

$$\hat z_{t,t+1} = x_{t+1} \ominus x_t$$

and likewise the first measurement error function is

$$e(x_t, x_{t+1}, z_{t,t+1}) = (x_{t+1} \ominus x_t) - z_{t,t+1}$$
S61: selecting incremental parameters of the pose parameters at adjacent times during the robot's movement.
The landmark positions of this embodiment are points in Euclidean space, so their increments can be added directly. The robot's pose parameters, however, do not belong to a Euclidean space: in the SE(2) group this space may be parameterized in several ways, for example by a rotation matrix R(θ) and translation vector (x, y)^T, or by a rotation angle θ and translation vector (x, y)^T.
Taking the robot's motion on the two-dimensional plane as the parameter increment, the rotation angle θ and translation vector (x, y)^T can be chosen, and a ⊕ operation is defined to express the incremental relationship between the robot's poses at adjacent times. Since angles do not belong to a Euclidean space, increments cannot simply be added; the angle must be renormalized after every accumulation. In this patent, given the robot pose $x_t = (x, y, \theta)^T$ and increment $\Delta x = (\Delta x, \Delta y, \Delta\theta)^T$, the following operation is defined:

$$x_t \oplus \Delta x = \begin{pmatrix} x + \Delta x \cos\theta - \Delta y \sin\theta \\ y + \Delta x \sin\theta + \Delta y \cos\theta \\ \mathrm{norm}(\theta + \Delta\theta) \end{pmatrix}$$

The corresponding inverse operation is

$$x_{t+1} \ominus x_t = \begin{pmatrix} (x_{t+1} - x_t)\cos\theta_t + (y_{t+1} - y_t)\sin\theta_t \\ -(x_{t+1} - x_t)\sin\theta_t + (y_{t+1} - y_t)\cos\theta_t \\ \mathrm{norm}(\theta_{t+1} - \theta_t) \end{pmatrix}$$
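Transcribed directly from the operators above (as reconstructed from the surrounding text), the composition and its inverse could look like this, with explicit angle renormalization:

```python
import math

def norm_angle(a):
    """Renormalize an angle to (-pi, pi]; angles are not Euclidean."""
    return (a + math.pi) % (2 * math.pi) - math.pi

def oplus(p, d):
    """Compose pose p = (x, y, theta) with increment d expressed in p's frame."""
    x, y, t = p
    dx, dy, dt = d
    return (x + dx * math.cos(t) - dy * math.sin(t),
            y + dx * math.sin(t) + dy * math.cos(t),
            norm_angle(t + dt))

def ominus(p2, p1):
    """Relative pose of p2 expressed in the frame of p1 (inverse of oplus)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    c, s = math.cos(p1[2]), math.sin(p1[2])
    return (c * dx + s * dy,
            -s * dx + c * dy,
            norm_angle(p2[2] - p1[2]))
```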
S62: solving the first measurement error function iteratively according to the incremental parameters, to obtain the optimized robot coordinates.
The graph-based SLAM problem ultimately reduces to solving the linear equation (8). In the present invention the nonlinear problem is solved by iterative linearization, where equation (3) is the linearized representation of graph-based SLAM, equation (4) is its solution, and the incremental parameters are used as the update step; in each iteration, the solution x* of the previous iteration is used as the initial pose of the next. Most of the structure in the system is sparse, so a sparse matrix can be used to store the Hessian matrix H, and because the information matrix H is symmetric, only its upper half needs to be computed, improving efficiency.
First, the robot pose, the offset vector b and the information matrix are initialized.
For all error terms $e_{ij}$ and covariance matrices $\Omega_{ij}$, compute the Jacobians

$$A_{ij} = \frac{\partial e_{ij}(x)}{\partial x_i}, \qquad B_{ij} = \frac{\partial e_{ij}(x)}{\partial x_j}$$

Compute the non-zero blocks of the system's H matrix, which according to equation (10) are

$$H^{[ii]} \mathrel{+}= A_{ij}^T \Omega_{ij} A_{ij}, \quad H^{[ij]} \mathrel{+}= A_{ij}^T \Omega_{ij} B_{ij}, \quad H^{[ji]} \mathrel{+}= B_{ij}^T \Omega_{ij} A_{ij}, \quad H^{[jj]} \mathrel{+}= B_{ij}^T \Omega_{ij} B_{ij}$$

Compute the coefficient vector:

$$b^{[i]} \mathrel{+}= A_{ij}^T \Omega_{ij} e_{ij}, \qquad b^{[j]} \mathrel{+}= B_{ij}^T \Omega_{ij} e_{ij}$$

Finally, solving the linear equation $H \Delta x^* = -b$ optimizes the robot coordinates.
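One Gauss-Newton iteration of this accumulate-and-solve procedure might be sketched as below. The constraint tuple format and the strong prior anchoring the first pose (to remove the gauge freedom) are assumptions of this sketch, and SciPy's sparse solver stands in for whatever sparse back end is actually used:

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import spsolve

def gauss_newton_step(num_nodes, dim, constraints):
    """Accumulate H and b over all constraints per eq. (10), then solve
    H * dx = -b (eq. (8)) for the update step dx.

    Each constraint is a tuple (i, j, e, A, B, Omega): node indices, error
    vector, Jacobians w.r.t. nodes i and j, and the information matrix.
    """
    n = num_nodes * dim
    H = lil_matrix((n, n))               # sparse storage of the Hessian
    b = np.zeros(n)
    for i, j, e, A, B, Om in constraints:
        si, sj = i * dim, j * dim
        H[si:si + dim, si:si + dim] += A.T @ Om @ A
        H[si:si + dim, sj:sj + dim] += A.T @ Om @ B
        H[sj:sj + dim, si:si + dim] += B.T @ Om @ A
        H[sj:sj + dim, sj:sj + dim] += B.T @ Om @ B
        b[si:si + dim] += A.T @ Om @ e
        b[sj:sj + dim] += B.T @ Om @ e
    H[:dim, :dim] += 1e6 * np.eye(dim)   # anchor the first pose (gauge fixing, assumed)
    dx = spsolve(H.tocsr(), -b)
    return dx.reshape(num_nodes, dim)

# Iterate x <- x (+) dx, re-linearizing each time, until the update converges
# (the convergence test of step S620 below).
```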
Referring to FIG. 5, further, in the method for indoor positioning of a robot according to an embodiment of the present invention, step S62 comprises:
S620: when the robot updates its pose, judging whether the function value converges during the iterative solution of the first measurement error function.
Convergence in this step means a tendency to approach some specified value; if such a tendency exists, the iteration is convergent.
S621: if so, determining that the corresponding function value is the solution of the first measurement error function.
The specified value approached most closely in this step is the solution of the first measurement error function: once the optimal value is found, the loop is exited and the Hessian matrix H is computed in the starting space, at which point x* is the final result of the SLAM system.
Further, after step S620, the method comprises:
S622: if not, defining, according to the closed-loop constraint relationship, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors.
In this embodiment, when the robot pose and landmark positions are updated and no convergence is found, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors is defined according to the closed-loop constraint relationship, and the loop judgment continues.
Referring to FIG. 6, further, in the method for indoor positioning of a robot according to an embodiment of the present invention, step S1 comprises:
S10: judging from the speckle information whether a captured speckle landmark is a known speckle landmark.
The speckle information in this step includes the speckle position, speckle area, speckle shape and so on. A newly captured speckle landmark is matched by template against previously captured speckle landmarks to judge whether it is a new landmark: if a previously captured speckle landmark provides a matching template, it is not a new speckle landmark; otherwise it is a new one. A minimal matching sketch follows.
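This is a sketch of the template-matching test using cv2.matchTemplate with a normalized correlation score; the 0.8 acceptance threshold is an assumption, not a value from the patent:

```python
import cv2

def match_landmark_id(patch, templates, threshold=0.8):
    """Match a newly captured speckle patch against stored templates.

    `templates` maps landmark ID -> grayscale template image. Returns the
    best-matching known ID, or None if the patch is a new landmark."""
    best_id, best_score = None, threshold
    for lid, tmpl in templates.items():
        resized = cv2.resize(patch, (tmpl.shape[1], tmpl.shape[0]))
        score = cv2.matchTemplate(resized, tmpl, cv2.TM_CCOEFF_NORMED)[0, 0]
        if score > best_score:
            best_id, best_score = lid, score
    return best_id

# If match_landmark_id returns None, assign a new unique ID (step S13) and
# store the patch as the template for that ID.
```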
S11: if so, retrieving the ID information of the speckle landmark.
Every known speckle landmark has matching ID information, and different speckle landmarks are distinguished by recognizing their IDs.
S12: identifying the speckle landmark by the ID information.
In this embodiment, each distinct speckle landmark is assigned a unique ID, which makes it convenient to identify the speckle landmark and hence to estimate the position and pose of the marker relative to the robot.
Further, after step S10, the method comprises:
S13: if not, matching corresponding ID information to the speckle landmark.
If it is a new speckle landmark, new ID information is assigned to it and saved locally for use in observation.
Referring to FIG. 7, a method for indoor positioning of a robot according to another embodiment of the present invention comprises, before step S1:
S101: calibrating the intrinsic and distortion parameters of the camera.
S102: storing the mapping matrix composed of the intrinsic and distortion parameters.
The mapping matrix M composed of the camera's intrinsic and distortion parameters is calibrated so that the positions of landmarks can be converted from the image coordinate system to the camera coordinate system, i.e. the robot's corresponding coordinate system.
Referring to FIG. 11, a device for indoor positioning of a robot according to an embodiment of the present invention comprises:
an identification module 1 for identifying speckle landmarks preset on an indoor ceiling;
a forming module 2 for forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
a conversion module 3 for converting image coordinates in the topology map into corresponding position information in robot coordinates;
a positioning module 4 for locating the robot according to the corresponding position information in robot coordinates.
Referring to FIG. 12, further, a device for indoor positioning of a robot according to an embodiment of the present invention comprises:
a determining module 5 for determining the closed-loop constraint relationship between the current pose of the robot and the speckle landmarks;
an optimization module 6 for optimizing the robot coordinates according to the closed-loop constraint relationship.
Referring to FIG. 13, further, the determining module 5 of an embodiment of the present invention comprises:
a first judging unit 50 for judging whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot;
a determining unit 51 for determining, if they are the same, that the closed-loop constraint relationship exists.
Referring to FIG. 14, further, the optimization module 6 of an embodiment of the present invention comprises:
a defining unit 60 for defining, according to the closed-loop constraint relationship, a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors;
a selecting unit 61 for selecting incremental parameters of the pose parameters at adjacent times during the robot's movement;
a solving unit 62 for solving the first measurement error function iteratively according to the incremental parameters, to obtain the optimized robot coordinates.
Referring to FIG. 15, further, the solving unit 62 of an embodiment of the present invention comprises:
a judging sub-unit 620 for judging, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function;
a determining sub-unit 621 for determining, if so, that the corresponding function value is the solution of the first measurement error function.
Preferably, the solving unit 62 comprises:
a redefining sub-unit 622 for defining, if not, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors, according to the closed-loop constraint relationship.
Referring to FIG. 16, further, the identification module 1 of an embodiment of the present invention comprises:
a second judging unit 10 for judging, from the speckle information, whether a captured speckle landmark is a known speckle landmark;
a retrieving unit 11 for retrieving, if so, the ID information of the speckle landmark;
an identifying unit 12 for identifying the speckle landmark by the ID information.
Further, the identification module 1 comprises:
a matching unit 13 for matching, if not, corresponding ID information to the speckle landmark.
Referring to FIG. 17, a device for indoor positioning of a robot according to another embodiment of the present invention comprises:
a calibration module 101 for calibrating the intrinsic and distortion parameters of the camera;
a storage module 102 for storing the mapping matrix composed of the intrinsic and distortion parameters.
The device for indoor positioning of a robot and the method for indoor positioning of a robot provided in the above embodiments are based on the same inventive concept; the specific functions of the functional modules/units in the device embodiments can therefore be found in the foregoing method embodiments and are not repeated here.
The above are only preferred embodiments of the present invention and do not thereby limit its patent scope; any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (20)

  1. A method for indoor positioning of a robot, comprising:
    identifying speckle landmarks preset on an indoor ceiling;
    forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
    converting image coordinates in the topology map into corresponding position information in robot coordinates;
    locating the robot according to the corresponding position information in robot coordinates.
  2. The method for indoor positioning of a robot according to claim 1, wherein after the step of converting image coordinates in the topology map into corresponding position information in robot coordinates, the method comprises:
    determining a closed-loop constraint relationship between the current pose of the robot and the speckle landmarks;
    optimizing the robot coordinates according to the closed-loop constraint relationship.
  3. The method for indoor positioning of a robot according to claim 2, wherein the step of determining a closed-loop constraint relationship between the current pose of the robot and the speckle landmarks comprises:
    judging whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot;
    if they are the same, determining that the closed-loop constraint relationship exists.
  4. The method for indoor positioning of a robot according to claim 2, wherein the step of optimizing the robot coordinates according to the closed-loop constraint relationship comprises:
    defining, according to the closed-loop constraint relationship, a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors;
    selecting incremental parameters of the pose parameters at adjacent times during the robot's movement;
    solving the first measurement error function iteratively according to the incremental parameters, to obtain the optimized robot coordinates.
  5. The method for indoor positioning of a robot according to claim 4, wherein the step of solving the first measurement error function iteratively according to the incremental parameters comprises:
    when the robot updates its pose, judging whether the function value converges during the iterative solution of the first measurement error function;
    if so, determining that the corresponding function value is the solution of the first measurement error function.
  6. The method for indoor positioning of a robot according to claim 5, wherein after the step of judging, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function, the method comprises:
    if not, defining, according to the closed-loop constraint relationship, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors.
  7. The method for indoor positioning of a robot according to claim 1, wherein the step of identifying speckle landmarks on an indoor ceiling comprises:
    judging, from the speckle information, whether a captured speckle landmark is a known speckle landmark;
    if so, retrieving the ID information of the speckle landmark;
    identifying the speckle landmark by the ID information.
  8. The method for indoor positioning of a robot according to claim 7, wherein after the step of judging, from the speckle information, whether a captured speckle landmark is a known speckle landmark, the method comprises:
    if not, matching corresponding ID information to the speckle landmark.
  9. The method for indoor positioning of a robot according to claim 1, wherein before the step of identifying speckle landmarks on an indoor ceiling, the method comprises:
    calibrating the intrinsic and distortion parameters of the camera;
    storing the mapping matrix composed of the intrinsic and distortion parameters.
  10. The method for indoor positioning of a robot according to claim 1, wherein the speckle landmarks comprise speckle patterns composed of infrared signals projected by an infrared laser speckle projector.
  11. A device for indoor positioning of a robot, comprising:
    an identification module for identifying speckle landmarks preset on an indoor ceiling;
    a forming module for forming a topology map of image positions according to the speckle landmarks and the pose changes of the robot;
    a conversion module for converting image coordinates in the topology map into corresponding position information in robot coordinates;
    a positioning module for locating the robot according to the corresponding position information in robot coordinates.
  12. The device for indoor positioning of a robot according to claim 11, comprising:
    a determining module for determining a closed-loop constraint relationship between the current pose of the robot and the speckle landmarks;
    an optimization module for optimizing the robot coordinates according to the closed-loop constraint relationship.
  13. The device for indoor positioning of a robot according to claim 12, wherein the determining module comprises:
    a first judging unit for judging whether the speckle landmarks observed at the current pose of the robot are the same as the speckle landmarks observed at a specified pose of the robot;
    a determining unit for determining, if they are the same, that the closed-loop constraint relationship exists.
  14. The device for indoor positioning of a robot according to claim 12, wherein the optimization module comprises:
    a defining unit for defining, according to the closed-loop constraint relationship, a first measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors;
    a selecting unit for selecting incremental parameters of the pose parameters at adjacent times during the robot's movement;
    a solving unit for solving the first measurement error function iteratively according to the incremental parameters, to obtain the optimized robot coordinates.
  15. The device for indoor positioning of a robot according to claim 14, wherein the solving unit comprises:
    a judging sub-unit for judging, when the robot updates its pose, whether the function value converges during the iterative solution of the first measurement error function;
    a determining sub-unit for determining, if so, that the corresponding function value is the solution of the first measurement error function.
  16. The device for indoor positioning of a robot according to claim 15, wherein the solving unit comprises:
    a redefining sub-unit for defining, if not, a second measurement error function between the observed quantities at the robot's adjacent-time poses and the quantities actually measured by the sensors, according to the closed-loop constraint relationship.
  17. The device for indoor positioning of a robot according to claim 11, wherein the identification module comprises:
    a second judging unit for judging, from the speckle information, whether a captured speckle landmark is a known speckle landmark;
    a retrieving unit for retrieving, if so, the ID information of the speckle landmark;
    an identifying unit for identifying the speckle landmark by the ID information.
  18. The device for indoor positioning of a robot according to claim 17, wherein the identification module comprises:
    a matching unit for matching, if not, corresponding ID information to the speckle landmark.
  19. The device for indoor positioning of a robot according to claim 11, comprising:
    a calibration module for calibrating the intrinsic and distortion parameters of the camera;
    a storage module for storing the mapping matrix composed of the intrinsic and distortion parameters.
  20. The device for indoor positioning of a robot according to claim 11, wherein the speckle landmarks comprise speckle patterns composed of infrared signals projected by an infrared laser speckle projector.
PCT/CN2018/071995 2018-01-09 2018-01-09 Method and device for indoor positioning of robot WO2019136613A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/071995 WO2019136613A1 (zh) 2018-01-09 2018-01-09 Method and device for indoor positioning of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/071995 WO2019136613A1 (zh) 2018-01-09 2018-01-09 Method and device for indoor positioning of robot

Publications (1)

Publication Number Publication Date
WO2019136613A1 true WO2019136613A1 (zh) 2019-07-18

Family

ID=67218821

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/071995 WO2019136613A1 (zh) 2018-01-09 2018-01-09 机器人室内定位的方法及装置

Country Status (1)

Country Link
WO (1) WO2019136613A1 (zh)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012071796A1 (zh) * 2010-11-29 2012-06-07 嘉兴亚特园林机械研究所 Mobile robot positioning system and positioning method thereof
CN104330090A (zh) * 2014-10-23 2015-02-04 北京化工大学 Method for creating a robot distributed-representation intelligent semantic map
CN105806337A (zh) * 2014-12-30 2016-07-27 Tcl集团股份有限公司 Positioning method applied to an indoor robot, and indoor robot
CN106153043A (zh) * 2015-04-13 2016-11-23 Tcl集团股份有限公司 Robot indoor positioning method and system based on infrared ranging sensors
CN105987693A (zh) * 2015-05-19 2016-10-05 北京蚁视科技有限公司 Visual positioning device, and three-dimensional surveying system and method based on the device
CN205656497U (zh) * 2016-02-17 2016-10-19 深圳思瑞普科技有限公司 RFID positioning robot
CN205909828U (zh) * 2016-08-06 2017-01-25 中科院合肥技术创新工程院 Infrared road sign for indoor mobile robot positioning

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110411457A (zh) * 2019-08-27 2019-11-05 纵目科技(上海)股份有限公司 Positioning method, system, terminal and storage medium based on travel perception and vision fusion
CN110411457B (zh) * 2019-08-27 2024-04-19 纵目科技(上海)股份有限公司 Positioning method, system, terminal and storage medium based on travel perception and vision fusion
CN112710308A (zh) * 2019-10-25 2021-04-27 阿里巴巴集团控股有限公司 Robot positioning method, apparatus and system
CN112184812A (zh) * 2020-09-23 2021-01-05 广东海洋大学 Method for improving AprilTag recognition and positioning accuracy of a UAV camera, and positioning method and system
CN112184812B (zh) * 2020-09-23 2023-09-22 广东海洋大学 Method for improving AprilTag recognition and positioning accuracy of a UAV camera, and positioning method and system
CN112765548A (zh) * 2021-01-13 2021-05-07 阿里巴巴集团控股有限公司 Covariance determination method for sensor-fusion positioning, positioning method and apparatus
CN112765548B (zh) * 2021-01-13 2024-01-09 阿里巴巴集团控股有限公司 Covariance determination method for sensor-fusion positioning, positioning method and apparatus
CN113203419A (zh) * 2021-04-25 2021-08-03 重庆大学 Neural-network-based correction and positioning method for an indoor inspection robot
CN113203419B (zh) * 2021-04-25 2023-11-10 重庆大学 Neural-network-based correction and positioning method for an indoor inspection robot
CN113781550A (zh) * 2021-08-10 2021-12-10 国网河北省电力有限公司保定供电分公司 Quadruped robot positioning method and system
CN114440892A (zh) * 2022-01-27 2022-05-06 中国人民解放军军事科学院国防科技创新研究院 Self-positioning method based on a topological map and odometer
CN114440892B (zh) * 2022-01-27 2023-11-03 中国人民解放军军事科学院国防科技创新研究院 Self-positioning method based on a topological map and odometer

Similar Documents

Publication Publication Date Title
WO2019136613A1 (zh) Method and device for indoor positioning of robot
CN111076733B (zh) Robot indoor mapping method and system based on vision and laser SLAM
CN108332752B (zh) Method and device for indoor positioning of robot
CN107160395B (zh) Map construction method and robot control system
JP6760114B2 (ja) Information processing device, data management device, data management system, method, and program
WO2021139590A1 (zh) Indoor positioning and navigation device based on Bluetooth and SLAM, and method therefor
CN113436260B (zh) Mobile robot pose estimation method and system based on tightly coupled multi-sensor fusion
CN110146099B (zh) Simultaneous localization and mapping method based on deep learning
JP5480667B2 (ja) Position and orientation measurement apparatus, position and orientation measurement method, and program
WO2012023593A1 (en) Position and orientation measurement apparatus, position and orientation measurement method, and storage medium
WO2022078513A1 (zh) Positioning method and apparatus, self-moving device, and storage medium
CN113096183B (zh) Obstacle detection and measurement method based on lidar and a monocular camera
CN112396656B (zh) Outdoor mobile robot pose estimation method fusing vision and lidar
CN111964680A (zh) Real-time positioning method for an inspection robot
JP6922348B2 (ja) Information processing device, method, and program
Zhang LILO: A Novel Lidar–IMU SLAM System With Loop Optimization
KR102490521B1 (ko) Automatic calibration method through vector registration of the lidar coordinate system and the camera coordinate system
CN112762929B (zh) Intelligent navigation method, apparatus and device
CN111571561B (zh) Mobile robot
WO2019113859A1 (zh) Machine-vision-based virtual wall construction method and apparatus, map construction method, and movable electronic device
JP2018116147A (ja) Map creation device, map creation method, and computer program for map creation
CN111964681B (zh) Real-time positioning system for an inspection robot
CN112598736A (zh) Visual positioning method and apparatus based on map construction
Li et al. CAD-vision-range-based self-localization for mobile robot using one landmark
CN117576200B (zh) Long-period mobile robot positioning method, system, device and medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18899451; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18899451; Country of ref document: EP; Kind code of ref document: A1)