WO2022267285A1 - Method and apparatus for determining robot pose, robot, and storage medium - Google Patents

Method and apparatus for determining robot pose, robot, and storage medium

Info

Publication number
WO2022267285A1
Authority
WO
WIPO (PCT)
Prior art keywords
pose
robot
determining
confidence
grid
Prior art date
Application number
PCT/CN2021/126715
Other languages
English (en)
French (fr)
Inventor
谷雨隆
熊友军
张思民
赵云
Original Assignee
深圳市优必选科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市优必选科技股份有限公司
Publication of WO2022267285A1 publication Critical patent/WO2022267285A1/zh

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/022Optical sensing devices using lasers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1628Programme controls characterised by the control loop
    • B25J9/1653Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages

Definitions

  • the present application belongs to the field of robot control, and in particular relates to a method, a device, a robot and a storage medium for determining a pose of a robot.
  • During robot movement, the problem of relocalization is sometimes encountered. For example, when the position is initialized or a positioning error occurs, the robot needs to be relocalized to determine its pose. In existing methods for determining the pose of a robot, the determined pose cannot be evaluated, and thus the determined pose is not accurate enough.
  • the embodiments of the present application provide a method, a device, a robot, and a storage medium for determining a robot pose, so as to improve the accuracy of the determined robot pose.
  • the first aspect of the embodiments of the present application provides a method for determining the pose of a robot, including:
  • the first pose being the pose of the robot in the map coordinate system
  • when the lidar of the robot performs laser scanning, determining the first position of the laser point corresponding to the lidar in the map coordinate system according to the first pose;
  • the grid being a grid in a probability map, the probability map being a map corresponding to the map coordinate system
  • the determining the matching score between the first position and the grid where the first position is located includes:
  • a matching score between the first position and the grid where the first position is located is determined according to the first position and the mean value of the grid where the first position is located.
  • the number of laser points is at least two, and correspondingly, the number of matching scores is at least two; the determining the first confidence of the first pose according to the matching scores includes:
  • the determining the target pose according to the first confidence level includes:
  • the target pose is determined according to the second pose with the highest second confidence among the at least one second pose.
  • the determining the target pose according to the second pose with the highest second confidence among the at least one second pose includes:
  • before the adjusting the first pose within the adjustment range to obtain at least one second pose, the method further includes:
  • An adjustment range is determined according to the first degree of confidence.
  • the determining the target pose according to the first confidence level includes:
  • a target pose is determined according to the updated first pose.
  • the second aspect of the embodiment of the present application provides a device for determining the pose of a robot, including:
  • An acquisition module configured to acquire a first pose of the robot, where the first pose is the pose of the robot in the map coordinate system;
  • the first calculation module is used to determine the first position of the laser point corresponding to the lidar in the map coordinate system according to the first pose when the lidar of the robot performs laser scanning;
  • the second calculation module is used to determine the matching score between the first position and the grid where the first position is located, the grid being a grid in a probability map, and the probability map being a map corresponding to the map coordinate system;
  • a third calculation module configured to determine the first confidence level of the first pose according to the matching score;
  • a determining module configured to determine the target pose according to the first degree of confidence.
  • the third aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the method for determining the pose of a robot described in the first aspect above is implemented.
  • the fourth aspect of the embodiments of the present application provides a computer-readable storage medium that stores a computer program; when the computer program is executed by a processor, the method for determining the pose of a robot described in the first aspect above is implemented.
  • a fifth aspect of the embodiments of the present application provides a computer program product, which, when the computer program product is run on a terminal device, causes the terminal device to execute the method for determining the pose of a robot described in any one of the above first aspects.
  • the beneficial effect of the embodiments of the present application is: by obtaining the first pose of the robot, determining the first position of the laser point of the lidar in the map coordinate system according to the first pose, determining the matching score between the first position and the grid where the first position is located, determining the first confidence of the first pose according to the matching score, and then determining the target pose according to the first confidence, the first pose can be evaluated through the first confidence and a more accurate target pose can be obtained.
  • Fig. 1 is a schematic diagram of the implementation flow of the method for determining the pose of a robot provided by an embodiment of the present application;
  • Fig. 2 is a schematic diagram of laser scanning performed by a robot provided in an embodiment of the present application
  • Fig. 3 is a schematic diagram of a device for determining a pose of a robot provided in an embodiment of the present application
  • Fig. 4 is a schematic structural diagram of a robot provided by an embodiment of the present application.
  • the currently used relocalization method is the Adaptive Monte Carlo Localization (AMCL) method, which generates many position guesses with a probability model on the basis of the user's estimated position, takes the optimal position guess as the estimated value, and repeats this process until the optimal position guess meets the judgment condition, at which point the optimal position guess is taken as the final positioning result.
  • Another relocation method is the template matching method.
  • the template matching method is to convert the scanning information of the lidar on the robot into machine vision information, and then perform template matching and relocation according to the visual information. After the relocation is performed according to the template matching, various optimization methods can be performed on the basis of the template matching to adjust the matching result.
  • this application provides a method for determining the pose of a robot.
  • the first confidence of the first pose of the robot is determined according to the matching score, and the target pose is then determined according to the first confidence, so that the first pose can be evaluated through the first confidence and a more accurate target pose can be obtained.
  • the method for determining the pose of a robot provided in the present application is exemplarily described below.
  • the method for determining the pose of a robot provided in the embodiment of the present application may be executed on the robot, or may be executed on an electronic device or server communicating with the robot, and the electronic device may be a mobile phone, a computer, a tablet, and the like.
  • the method for determining the pose of a robot provided in the embodiment of the present application will be described by taking the method for determining the pose of the robot provided in the embodiment of the present application to be executed on a robot as an example.
  • the method for determining the pose of a robot includes:
  • S101 Obtain a first pose of the robot, where the first pose is the pose of the robot in a map coordinate system.
  • the map coordinate system is the coordinate system of the space where the robot is located, the x-axis and y-axis of the map coordinate system are located in the plane on which the robot stands, and the z-axis of the map coordinate system is perpendicular to the plane on which the robot stands.
  • the first pose of the robot includes the position and posture of the robot in the map coordinate system.
  • the position of the robot in the map coordinate system is the coordinate of the robot in the map coordinate system.
  • the coordinates may be two-dimensional coordinates, for example, including x-axis coordinates and y-axis coordinates.
  • the coordinates may also be three-dimensional coordinates.
  • the posture of the robot in the map coordinate system refers to the offset angles of the robot, such as the roll angle and the pitch angle.
  • the first pose of the robot can be determined by the robot according to the surrounding environment or the most recent pose. For example, the robot determines the positions of surrounding obstacles based on captured images or a lidar scan, matches those positions against the obstacle positions in the map coordinate system to determine its distance from the obstacles, and then determines the first pose according to the obstacle positions in the map coordinate system. For another example, after the robot is restarted, the pose before the restart is used as the first pose.
  • the first pose of the robot can also be input by the user. For example, the robot displays a map corresponding to the map coordinate system on its display interface; if the user finds that the robot is in the bedroom, the user selects the position of the bedroom on the map and selects the posture of the robot, and the robot takes the position and posture selected by the user as the first pose.
  • a laser radar 21 is provided on the robot, and the laser radar 21 can emit laser light for laser scanning. After the emitted laser touches the obstacle, a laser point will be formed on the obstacle.
  • the lidar receives the laser reflected by the obstacle, and according to the orientation and time of the emitted laser, as well as the orientation and time of the reflected laser, the position of the laser point in the lidar coordinate system can be determined.
  • the position in the first pose of the robot is the position of the center of the robot in the map coordinate system, and the relative position of the center of the robot and the lidar is fixed.
  • the position of the lidar can be determined according to the first pose.
  • the position of the lidar is the origin of the lidar coordinate system, and the center of the robot is the origin of the robot coordinate system, so the relationship between the lidar coordinate system and the robot coordinate system can be determined.
  • the relationship between the robot coordinate system and the map coordinate system can likewise be determined according to the position of the center of the robot in the map coordinate system.
  • after the robot determines the position of the laser point in the lidar coordinate system, the position of the laser point in the robot coordinate system can be determined according to the relationship between the lidar coordinate system and the robot coordinate system; then, according to the position of the laser point in the robot coordinate system and the relationship between the robot coordinate system and the map coordinate system, the first position of the laser point in the map coordinate system can be determined.
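The chain of transforms just described (lidar frame to robot frame to map frame) can be sketched in a few lines; the 2D pose representation and all names below are illustrative assumptions rather than the patent's notation:

```python
import math

def to_map_frame(p_lidar, lidar_in_robot, robot_in_map):
    """Chain two 2D rigid transforms: lidar -> robot -> map.

    Each frame is given as (x, y, theta): its origin and heading expressed
    in the parent frame (an assumed representation for illustration).
    """
    def apply(pose, point):
        x, y, th = pose
        px, py = point
        # Rotate the point by the frame's heading, then translate by its origin.
        return (x + px * math.cos(th) - py * math.sin(th),
                y + px * math.sin(th) + py * math.cos(th))
    return apply(robot_in_map, apply(lidar_in_robot, p_lidar))
```

For example, a laser point 2 m ahead of a lidar mounted at the robot's center maps to approximately (1, 3) when the robot sits at (1, 1) facing the +y direction.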
  • S103 Determine a matching score between the first position and a grid where the first position is located, where the grid is a grid in a probability map, and the probability map is a map corresponding to the map coordinate system.
  • the probability map is a map determined in the map coordinate system, which may be obtained by the robot during map construction, or obtained after conversion according to a pre-built grid map.
  • the grid map divides the space into multiple grids, and each grid corresponds to a probability, which represents the probability of an obstacle in the corresponding grid and is determined according to whether there is a laser point in the corresponding grid during the laser scanning process.
  • each grid corresponds to a mean value and a variance, which are determined according to the number of laser points in the corresponding grid and the positions of those laser points during the laser scanning process. Converting a grid map to a probability map can be achieved using the Normal Distributions Transform (NDT) algorithm.
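A minimal sketch of that conversion, assuming 2D laser points, a hypothetical cell size, and NDT-style per-cell statistics (the text does not fix these details):

```python
import numpy as np

def build_probability_map(points, cell_size=0.2):
    """Bucket laser points into grid cells and store each cell's mean and
    covariance, as the NDT conversion described above does."""
    cells = {}
    for p in points:
        key = (int(p[0] // cell_size), int(p[1] // cell_size))
        cells.setdefault(key, []).append(p)
    stats = {}
    for key, pts in cells.items():
        pts = np.asarray(pts, dtype=float)
        mean = pts.mean(axis=0)
        # Covariance needs at least two points; regularize so it stays invertible.
        if len(pts) > 1:
            cov = np.cov(pts.T) + 1e-6 * np.eye(2)
        else:
            cov = 1e-3 * np.eye(2)
        stats[key] = (mean, cov)
    return stats
```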
  • the grid where the first position is located can be determined according to the position of each grid in the map coordinate system.
  • the matching score between the first position and the grid where the first position is located may be calculated according to the mean value of the grid where the first position is located; in one possible implementation, it is calculated by a formula combining the first position with the grid's mean value.
  • the first position may be a two-dimensional vector, for example, including x-axis coordinates and y-axis coordinates corresponding to the first position.
  • the first position may also be a three-dimensional vector, for example, including x-axis coordinates, y-axis coordinates, and z-axis coordinates corresponding to the first position.
  • if the first position is a two-dimensional vector, the mean value of the grid where the first position is located is a two-dimensional vector formed by the corresponding coordinates; if the first position is a three-dimensional vector, the mean value of the grid where the first position is located is a three-dimensional vector formed by the corresponding coordinates.
  • the difference between the first position and the mean value of the grid where the first position is located may also be used as the matching score between the first position and the grid where the first position is located.
  • the matching score between the first position and the grid where the first position is located may also be calculated according to the variance between the first position and the grid where the first position is located.
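One plausible NDT-style scoring function consistent with this description — an assumption for illustration, not necessarily the patent's exact formula:

```python
import numpy as np

def matching_score(position, cell_mean, cell_cov):
    """Score a laser point against its grid cell: 1.0 when the point sits
    exactly on the cell's mean, decaying with variance-weighted distance."""
    d = np.asarray(position, dtype=float) - np.asarray(cell_mean, dtype=float)
    return float(np.exp(-0.5 * d @ np.linalg.inv(cell_cov) @ d))
```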
  • S104 Determine a first confidence level of the first pose according to the matching score.
  • the number of laser radars installed on the robot is one, and the matching score calculated according to the first position is the first confidence level of the first pose.
  • the number of formed laser spots is at least two.
  • the at least two laser points may be obtained by emitting laser light at least twice from one laser radar, and the directions of the two emitted laser light are different; the at least two laser points may also be obtained by at least two laser radars emitting laser light respectively.
  • the number of lidars installed on the robot is at least two. When the robot performs laser scanning, each lidar emits laser light, and after the emitted laser touches an obstacle, a laser point is formed on the obstacle. Therefore, the corresponding number of laser points is at least two.
  • the robot calculates a matching score corresponding to each first position according to the first position of each laser point, and takes the average of the at least two matching scores as the first confidence degree of the first pose. For example, if the number of lidars is five, the number of corresponding laser points is five; if the matching scores corresponding to the five laser points are 1, 0.8, 0, 0.2, and 0.5 respectively, averaging the five matching scores gives 0.5, so the first confidence level is 0.5.
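The averaging step in this example is simply:

```python
def first_confidence(scores):
    """First confidence of a pose: the mean of the per-point matching scores."""
    return sum(scores) / len(scores)

print(first_confidence([1, 0.8, 0, 0.2, 0.5]))  # 0.5, as in the example above
```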
  • S105 Determine the target pose according to the first confidence level.
  • the pose is adjusted on the basis of the first pose until a pose whose confidence is greater than or equal to a preset value is obtained; that pose is the target pose.
  • the first pose is adjusted within an adjustment range to obtain at least one second pose.
  • the lidar of the robot performs laser scanning to obtain the position of the laser point in the lidar coordinate system.
  • the second position of the laser point in the map coordinate system is determined.
  • the second confidence degree of the second pose is determined, wherein there may be multiple second poses and correspondingly multiple second confidence degrees.
  • the second pose with the highest second confidence is used as the target pose, thereby improving the accuracy of the determined target pose.
  • the adjustment range is determined according to the first confidence level. For example, if the first confidence level is less than a preset value, the accuracy of the first pose is low, and the first range is used as the adjustment range; if the first confidence level is greater than the preset value, the accuracy of the first pose is relatively high, and the second range, which is smaller than the first range, is used as the adjustment range. That is, if the accuracy of the first pose is low, the first pose is adjusted within a large range; if the accuracy is high, it is adjusted within a small range, which improves the calculation speed and quickly yields the target pose.
  • the adjustment range is preset.
  • the adjustment range includes an adjustment range in the x-axis direction, an adjustment range in the y-axis direction, and an adjustment range in angle; for example, the adjustment ranges in the x and y directions are ±1 cm, and the angle adjustment range is ±1 degree.
  • for example, if the first pose is the vector (0,0,0), where the three elements are the x-axis coordinate, the y-axis coordinate, and the angle of the robot's pose, the second poses obtained according to the adjustment range include (1,0,0), (1,1,0), (1,-1,0), (-1,1,0), (1,1,-1) and so on — 27 combinations in total, i.e., 27 poses.
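Enumerating those 27 candidates can be sketched as follows; the default step sizes stand in for the ±1 cm / ±1 degree example values, and the function name is illustrative:

```python
from itertools import product

def candidate_poses(pose, dx=1.0, dy=1.0, dth=1.0):
    """Step each component of (x, y, angle) by -step, 0, or +step,
    yielding the 3 * 3 * 3 = 27 candidate second poses."""
    x, y, th = pose
    return [(x + sx, y + sy, th + sth)
            for sx, sy, sth in product((-dx, 0, dx), (-dy, 0, dy), (-dth, 0, dth))]

print(len(candidate_poses((0, 0, 0))))  # 27
```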
  • a preset algorithm is used to optimize the second pose with the highest second confidence to obtain at least one optimized third pose.
  • the lidar of the robot performs laser scanning to obtain the position of the laser point in the lidar coordinate system. Then, according to the position of the laser point in the lidar coordinate system and the relationship between the map coordinate system and the lidar coordinate system, the third position of the laser point in the map coordinate system is determined. Then, according to the matching score between the third position and the grid where the third position is located, a third confidence degree of the third pose is determined, wherein there may be multiple third poses and correspondingly multiple third confidence degrees. After the multiple third confidence degrees are obtained, the third pose with the highest third confidence is used as the target pose, which further improves the accuracy of the obtained target pose.
  • the preset algorithm is a Gauss-Newton iterative matching algorithm.
  • a third pose is obtained for each iteration, and a third confidence degree corresponding to the third pose is calculated.
  • if the third confidence degree obtained in the current iteration is greater than or equal to that obtained in the previous iteration, the iteration continues; if the third confidence degree obtained in the current iteration is lower than that obtained in the previous iteration, the third confidence degree obtained in the previous iteration is taken as the highest third confidence degree, and the third pose obtained in the previous iteration is used as the target pose.
  • Using the Gauss-Newton iterative matching algorithm can improve the calculation speed and the accuracy of the calculated target pose.
  • the Gauss-Newton iterative algorithm performs iterative optimization through the Hessian matrix. Specifically, the difference between the third position of the laser point corresponding to the third pose in the map coordinate system and the mean value of the grid where the third position is located is used as an error function, and the error function is differentiated to obtain the Jacobian matrix of the corresponding laser point.
  • each laser point corresponds to a Jacobian matrix, and the Jacobian matrices of all laser points corresponding to the third pose are added to obtain the summed Jacobian matrix. According to the summed Jacobian matrix, the Hessian matrix is calculated.
  • the third pose obtained in the next iteration is determined according to the value of the Hessian matrix obtained in the current iteration.
  • after a third pose is obtained, the corresponding third confidence degree is calculated and compared with the third confidence degree obtained in the previous iteration. If the third confidence degree obtained in the current iteration is greater than that obtained in the previous iteration, the iteration continues; if it is less, the third confidence degree obtained in the previous iteration is taken as the highest third confidence degree, and the third pose obtained in the previous iteration is taken as the target pose.
  • the iteration is continued until the maximum number of iterations is reached, and then the iteration is terminated.
  • the third pose with the highest third confidence is determined from the third poses obtained in each iteration, and the third pose with the highest third confidence is the target pose.
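The stop-when-confidence-drops loop described above can be sketched generically; `step_fn` stands in for one Gauss-Newton (or simulated-annealing, gradient, etc.) update and is an assumption of this sketch:

```python
def refine_pose(pose, step_fn, confidence_fn, max_iters=20):
    """Iteratively refine a pose, tracking the confidence of each new pose;
    stop as soon as confidence drops and return the best pose seen."""
    best_pose, best_conf = pose, confidence_fn(pose)
    for _ in range(max_iters):
        pose = step_fn(pose)
        conf = confidence_fn(pose)
        if conf < best_conf:  # confidence dropped: the previous pose wins
            break
        best_pose, best_conf = pose, conf
    return best_pose, best_conf

# Toy check: confidence peaks at pose == 3; each step moves the pose up by 1.
print(refine_pose(0, lambda p: p + 1, lambda p: -abs(p - 3)))  # (3, 0)
```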
  • iterative algorithms such as simulated annealing optimization algorithm and gradient optimization algorithm may also be used to optimize the third pose to obtain the target pose.
  • after the first confidence degree is obtained, if the first confidence degree is greater than a preset value, the accuracy of the first pose is relatively high, and the target pose is determined according to the first pose. If the first confidence degree is less than the preset value, the accuracy of the first pose is low, and adjusting the pose on the basis of the first pose is unlikely to yield the target pose, so an updated first pose is obtained. For example, if the first pose was input by the user, a prompt to re-input the pose may be shown on the robot's display interface; for another example, if the first pose was obtained after the robot was initialized, the robot is re-initialized to obtain the updated first pose.
  • after the updated first pose is obtained, the first position of the laser point corresponding to the laser scan in the map coordinate system is determined again according to the updated first pose, and the first confidence level of the updated first pose is determined from the matching score between the first position and the grid where the first position is located.
  • the target pose is then determined according to the updated first pose and its first confidence degree, which prevents blind optimization of the pose during positioning and thereby improves the calculation speed.
  • the first position of the laser point of the lidar in the map coordinate system is determined according to the first pose, and the matching score between the first position and the grid where the first position is located is determined , determine the first confidence level of the first pose according to the matching score, and then determine the target pose according to the first confidence level, so that the first pose can be evaluated through the first confidence level, and a more accurate target pose can be obtained.
  • Fig. 3 shows a structural block diagram of the device for determining the pose of the robot provided by the embodiment of the present application; for ease of description, only the parts relevant to the embodiment are shown.
  • the device for determining the pose of the robot includes:
  • the obtaining module 10 is used to obtain the first pose of the robot, and the first pose is the pose of the robot under the map coordinate system;
  • the first calculation module 20 is configured to determine the first position of the laser point corresponding to the lidar in the map coordinate system according to the first pose when the lidar of the robot performs laser scanning;
  • the second calculation module 30 is configured to determine the matching score between the first position and the grid where the first position is located, the grid being a grid in a probability map, and the probability map being a map corresponding to the map coordinate system;
  • the third calculation module 40 is configured to determine the first confidence degree of the first pose according to the matching score;
  • a determining module 50 configured to determine the target pose according to the first degree of confidence.
  • the second computing module 30 is specifically configured to:
  • a matching score between the first position and the grid where the first position is located is determined according to an average value of the first position and the grid where the first position is located.
  • the number of laser points is at least two, and correspondingly, the number of matching scores is at least two; the third calculation module 40 is specifically used for:
  • the determining module 50 is specifically configured to:
  • the target pose is determined according to the second pose with the highest second confidence among the at least one second pose.
  • the determining module 50 is specifically further configured to:
  • the determining module 50 is specifically further configured to:
  • An adjustment range is determined according to the first degree of confidence.
  • the determining module 50 is specifically further configured to:
  • a target pose is determined according to the updated first pose.
  • Fig. 4 is a schematic structural diagram of a robot provided by an embodiment of the present application.
  • the robot of this embodiment includes: a processor 11 , a memory 12 and a computer program 13 stored in the memory 12 and operable on the processor 11 .
  • when the processor 11 executes the computer program 13, the steps in the above embodiments of the method for determining the pose of a robot are implemented, for example, steps S101 to S105 shown in Fig. 1.
  • the processor 11 executes the computer program 13, it realizes the functions of the modules/units in the above-mentioned device embodiments, such as the functions of the acquisition module 10 to the determination module 50 shown in FIG. 3 .
  • the computer program 13 can be divided into one or more modules/units, and the one or more modules/units are stored in the memory 12 and executed by the processor 11 to complete this application.
  • the one or more modules/units may be a series of computer program instruction segments capable of accomplishing specific functions, and the instruction segments are used to describe the execution process of the computer program 13 in the terminal device.
  • Fig. 4 is only an example of the robot and does not constitute a limitation on the robot; the robot may include more or fewer components than shown, combine certain components, or have different components. For example, the robot may also include input and output devices, network access devices, a bus, and so on.
  • the processor 11 can be a central processing unit (Central Processing Unit, CPU), and can also be other general-purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), Field-Programmable Gate Array (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory 12 may be an internal storage unit of the robot, such as a hard disk or memory of the robot. The memory 12 may also be an external storage device of the robot, for example a plug-in hard disk, smart media card (SMC), secure digital (SD) card, or flash card equipped on the robot. Further, the memory 12 may include both an internal storage unit of the robot and an external storage device. The memory 12 is used to store the computer program and other programs and data required by the robot, and may also be used to temporarily store data that has been output or will be output.
  • the disclosed apparatus/terminal device and method may be implemented in other ways.
  • the device/terminal device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • an integrated module/unit is realized in the form of a software function unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the computer programs can be stored in a computer-readable storage medium, and the computer When the program is executed by the processor, the steps in the above-mentioned various method embodiments can be realized.
  • the computer program includes computer program code, and the computer program code may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, and a read-only memory (ROM, Read-Only Memory) , Random Access Memory (RAM, Random Access Memory), electrical carrier signal, telecommunication signal and software distribution medium, etc.

Abstract

A method for determining a robot pose, suitable for the field of robot control. The method includes: acquiring a first pose of a robot; when a lidar of the robot performs laser scanning, determining, according to the first pose, a first position in a map coordinate system of a laser point corresponding to the lidar; determining a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map; determining a first confidence of the first pose according to the matching score; and determining a target pose according to the first confidence, so that the first pose can be evaluated through the first confidence and a more accurate target pose can be obtained. Also provided are an apparatus for determining a robot pose, a robot, and a storage medium.

Description

Method and apparatus for determining a robot pose, robot, and storage medium
This application claims priority to Chinese patent application No. 202110709447.6, filed with the China National Intellectual Property Administration on June 25, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
This application belongs to the field of robot control, and in particular relates to a method and apparatus for determining a robot pose, a robot, and a storage medium.
Background
While a robot is moving, the problem of relocalization sometimes arises; for example, when the position is initialized or a localization error occurs, the robot needs to be relocalized to determine its pose. Existing methods for determining a robot's pose cannot evaluate the determined pose, so the determined pose is not sufficiently accurate.
Technical Problem
In view of this, embodiments of this application provide a method and apparatus for determining a robot pose, a robot, and a storage medium, so as to improve the accuracy of the determined robot pose.
Technical Solution
A first aspect of the embodiments of this application provides a method for determining a robot pose, including:
acquiring a first pose of a robot, the first pose being the pose of the robot in a map coordinate system;
when a lidar of the robot performs laser scanning, determining, according to the first pose, a first position of a laser point corresponding to the lidar in the map coordinate system;
determining a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system;
determining a first confidence of the first pose according to the matching score; and
determining a target pose according to the first confidence.
In a possible implementation, determining the matching score between the first position and the grid cell in which the first position is located includes:
determining the matching score between the first position and the grid cell in which the first position is located according to the first position and the mean of that grid cell.
In a possible implementation, there are at least two laser points and, correspondingly, at least two matching scores; determining the first confidence of the first pose according to the matching scores includes:
taking the average of the at least two matching scores as the first confidence of the first pose.
In a possible implementation, determining the target pose according to the first confidence includes:
adjusting the first pose within an adjustment range to obtain at least one second pose;
determining a second confidence of the at least one second pose; and
determining the target pose according to the second pose with the highest second confidence among the at least one second pose.
In a possible implementation, determining the target pose according to the second pose with the highest second confidence among the at least one second pose includes:
optimizing the second pose with the highest second confidence using a preset algorithm to obtain at least one optimized third pose;
determining a third confidence of the at least one third pose; and
taking the third pose with the highest third confidence among the at least one third pose as the target pose.
In a possible implementation, before adjusting the first pose within the adjustment range to obtain at least one second pose, the method further includes:
determining the adjustment range according to the first confidence.
In a possible implementation, determining the target pose according to the first confidence includes:
if the first confidence is less than a preset value, acquiring an updated first pose; and
determining the target pose according to the updated first pose.
A second aspect of the embodiments of this application provides an apparatus for determining a robot pose, including:
an acquisition module, configured to acquire a first pose of a robot, the first pose being the pose of the robot in a map coordinate system;
a first calculation module, configured to determine, when a lidar of the robot performs laser scanning, a first position of a laser point corresponding to the lidar in the map coordinate system according to the first pose;
a second calculation module, configured to determine a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system;
a third calculation module, configured to determine a first confidence of the first pose according to the matching score; and
a determination module, configured to determine a target pose according to the first confidence.
A third aspect of the embodiments of this application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the method for determining a robot pose described in the first aspect above.
A fourth aspect of the embodiments of this application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method for determining a robot pose described in the first aspect above.
A fifth aspect of the embodiments of this application provides a computer program product which, when run on a terminal device, causes the terminal device to perform the method for determining a robot pose described in any implementation of the first aspect above.
Beneficial Effects
Compared with the prior art, the embodiments of this application have the following beneficial effects: by acquiring a first pose of a robot, determining a first position of a laser point of a lidar in a map coordinate system according to the first pose, determining a matching score between the first position and the grid cell in which the first position is located, determining a first confidence of the first pose according to the matching score, and then determining a target pose according to the first confidence, the first pose can be evaluated through the first confidence, and a more accurate target pose can be obtained.
Brief Description of the Drawings
Fig. 1 is a schematic flowchart of a method for determining a robot pose provided by an embodiment of this application;
Fig. 2 is a schematic diagram of a robot performing laser scanning provided by an embodiment of this application;
Fig. 3 is a schematic diagram of an apparatus for determining a robot pose provided by an embodiment of this application;
Fig. 4 is a schematic structural diagram of a robot provided by an embodiment of this application.
Detailed Description of the Embodiments
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of this application. However, it should be clear to those skilled in the art that this application can also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits, and methods are omitted, so that unnecessary detail does not obscure the description of this application.
To illustrate the technical solutions described in this application, specific embodiments are described below.
It should be understood that, when used in this specification and the appended claims, the term "comprise" indicates the presence of the described features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
In addition, in the description of this application, the terms "first", "second", "third", and so on are used only to distinguish between descriptions and should not be understood as indicating or implying relative importance.
While a robot is moving, the problem of relocalization often arises. There are two kinds of relocalization methods. One is user designation, in which the user moves the robot to a fixed position; this method is computationally simple but inefficient and subject to many constraints, so it is rarely used for relocalization. The other is software processing, in which the robot perceives its surroundings, matches them against a map, and determines its current position; this method is computationally complex but efficient and low-cost, and is therefore widely used.
A commonly used relocalization method is adaptive Monte Carlo Localization (AMCL). Starting from a user-estimated position, this method generates many position guesses with a probability model, takes the best guess as the estimate, and repeats the process until the best guess satisfies a termination condition, at which point the best guess is taken as the final localization result. Another relocalization method is template matching, which converts the scan information of the lidar on the robot into machine-vision information and then performs template-matching relocalization based on the vision information. After relocalization by template matching, various optimizations can be applied on top of the template-matching result to adjust it.
The relocalization results of the above two methods are not accurate; the robot pose obtained by relocalization needs to be finely adjusted to determine a more accurate robot pose.
To this end, this application provides a method for determining a robot pose: the laser points generated when the robot's lidar scans are matched against the grid cells in which they fall, a first confidence of the robot's first pose is determined according to the matching scores, and a target pose is then determined according to the first confidence, so that the first pose can be evaluated through the first confidence and a more accurate target pose can be obtained.
The method for determining a robot pose provided by this application is described below by way of example.
The method for determining a robot pose provided by the embodiments of this application may be executed on a robot, or on an electronic device or server communicating with the robot; the electronic device may be a mobile phone, a computer, a tablet, or the like. The method is described below taking execution on a robot as an example.
Referring to Fig. 1, a method for determining a robot pose provided by an embodiment of this application includes:
S101: Acquire a first pose of the robot, the first pose being the pose of the robot in a map coordinate system.
The map coordinate system is the coordinate system of the space in which the robot is located; its x-axis and y-axis lie in the plane on which the robot stands, and its z-axis is perpendicular to that plane. The first pose of the robot includes the robot's position and orientation in the map coordinate system. The position is the robot's coordinates in the map coordinate system, which may be two-dimensional (e.g., including x and y coordinates) or three-dimensional. The orientation refers to the robot's offset angles, e.g., roll and pitch.
The first pose may be determined by the robot from its surroundings or from its most recent pose. For example, the robot determines the positions of surrounding obstacles from captured images or lidar scans, matches the obstacle positions against those in the map coordinate system to determine its distance to the obstacles, and then determines the first pose from the positions of the obstacles in the map coordinate system. As another example, after restarting, the robot takes its pose before the restart as the first pose. The first pose may also be input by the user: for example, the robot displays the map corresponding to the map coordinate system on its display interface; if the user sees that the robot is in the bedroom, the user selects the bedroom position on the map and selects the robot's orientation, and the robot takes the user-selected position and orientation as the first pose.
S102: When the lidar of the robot performs laser scanning, determine, according to the first pose, the first position in the map coordinate system of the laser point corresponding to the lidar.
As shown in Fig. 2, the robot is equipped with a lidar 21, which can emit laser beams to perform laser scanning. When an emitted beam hits an obstacle, it forms a laser point on the obstacle; the lidar receives the beam reflected by the obstacle and, from the direction and time of the emitted beam together with the direction and time of the reflected beam, determines the position of the laser point in the lidar coordinate system.
The position in the robot's first pose is the position of the robot's center in the map coordinate system, and the relative position between the robot's center and the lidar is fixed. After determining the first pose, the robot can determine the lidar's position, which is the origin of the lidar coordinate system, while the robot's center is the origin of the robot coordinate system; this fixes the relationship between the lidar coordinate system and the robot coordinate system. Meanwhile, the relationship between the robot coordinate system and the map coordinate system can be determined from the position of the robot's center in the map coordinate system.
After determining the position of the laser point in the lidar coordinate system, the robot can determine the laser point's position in the robot coordinate system from that position and the relationship between the lidar and robot coordinate systems; then, from the laser point's position in the robot coordinate system and the relationship between the robot and map coordinate systems, the first position of the laser point in the map coordinate system can be determined.
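For the planar case, the chain of coordinate transforms described above (lidar frame to robot frame to map frame) can be sketched as follows. This is a minimal illustration under the assumption of 2D poses written as (x, y, theta); the function names are illustrative, not taken from the patent:

```python
import math

def transform_point(pose, point):
    """Apply a 2D rigid transform (x, y, theta) to a point (px, py)."""
    x, y, theta = pose
    px, py = point
    return (x + px * math.cos(theta) - py * math.sin(theta),
            y + px * math.sin(theta) + py * math.cos(theta))

def laser_point_to_map(robot_pose, lidar_offset, point_in_lidar):
    """Chain lidar -> robot -> map, as described above.

    robot_pose:     robot center in the map frame, (x, y, theta)
    lidar_offset:   fixed mounting pose of the lidar in the robot frame
    point_in_lidar: laser point measured in the lidar frame
    """
    point_in_robot = transform_point(lidar_offset, point_in_lidar)
    return transform_point(robot_pose, point_in_robot)
```

For example, a point 1 m ahead of a lidar mounted at the robot's center, with the robot at (1, 0) facing along +y, lands at (1, 1) in the map frame.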
S103: Determine the matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system.
The probability map is a map determined in the map coordinate system; it may be obtained when the robot builds the map, or by converting a pre-built occupancy grid map. An occupancy grid map divides space into grid cells, each associated with a probability that the cell contains an obstacle, determined by whether laser points fell in the cell during laser scanning. In a probability map, each grid cell is associated with a mean and a variance, determined from the number of times laser points fell in the cell during scanning and the positions of those points. Converting an occupancy grid map into a probability map can be implemented with the Normal Distributions Transform (NDT) algorithm.
After the first position of the laser point in the map coordinate system is determined, the grid cell in which the first position falls can be determined from the positions of the grid cells in the map coordinate system.
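The per-cell statistics of the probability map can be sketched as follows: each cell accumulates the laser points that fall in it and exposes their mean and covariance, in the spirit of NDT. The class and its interface are illustrative assumptions, not taken from the patent:

```python
import numpy as np

class NdtCell:
    """Accumulates the laser points falling in one grid cell (NDT-style)."""

    def __init__(self):
        self.points = []

    def add(self, point):
        self.points.append(np.asarray(point, dtype=float))

    def mean(self):
        return np.mean(self.points, axis=0)

    def covariance(self):
        # rowvar=False: each row of the stacked array is one 2-D point
        return np.cov(np.stack(self.points), rowvar=False)
```

A map built this way stores only a mean vector and a covariance matrix per cell, rather than the raw points, which is what makes the exponential matching score below cheap to evaluate.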
In one embodiment, the matching score between the first position and the grid cell in which the first position is located may be computed from the first position and the mean of that grid cell. In a possible implementation, the matching score is computed according to the formula

score_i = exp(-(X_i - q_i)^T Σ_i^{-1} (X_i - q_i) / 2)

where X_i denotes the first position, q_i denotes the mean of the grid cell in which the first position is located, Σ_i^{-1} denotes the inverse of the covariance of that grid cell, score_i denotes the matching score, which ranges from 0 to 1, "T" denotes the transpose operation, and "exp" denotes the exponential operation. The first position may be a two-dimensional vector, e.g., containing the x and y coordinates of the first position, or a three-dimensional vector, e.g., containing its x, y, and z coordinates. If the first position is a two-dimensional vector, the mean of its grid cell is the two-dimensional vector formed by the corresponding coordinates; if the first position is a three-dimensional vector, the mean is the three-dimensional vector formed by the corresponding coordinates.
In other embodiments, the difference between the first position and the mean of its grid cell may also be used as the matching score, or the matching score may be computed from the first position and the variance of its grid cell.
S104: Determine the first confidence of the first pose according to the matching score.
In one embodiment, the robot carries a single lidar, and the matching score computed from the first position is itself the first confidence of the first pose. In another embodiment, at least two laser points are formed during scanning. The at least two laser points may come from one lidar emitting at least twice in different directions, or from at least two lidars each emitting a beam. For example, if the robot carries at least two lidars, each lidar emits a beam during scanning, and each beam forms a laser point when it hits an obstacle, so there are at least two corresponding laser points. The robot computes a matching score for the first position of each laser point and takes the average of the at least two matching scores as the first confidence of the first pose. For example, with five lidars and five corresponding laser points whose matching scores are 1, 0.8, 0, 0.2, and 0.5, the average of the five scores is 0.5, so the first confidence is 0.5.
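Steps S103 and S104 can be sketched together: the exponential matching score of a laser point against its cell, and the first confidence as the average over laser points. The function names and the array-based interface are illustrative assumptions:

```python
import numpy as np

def match_score(x, q, cov):
    """score = exp(-(x - q)^T cov^{-1} (x - q) / 2), which lies in [0, 1]."""
    d = np.asarray(x, dtype=float) - np.asarray(q, dtype=float)
    return float(np.exp(-0.5 * d @ np.linalg.inv(cov) @ d))

def pose_confidence(scores):
    """First confidence: the average of the per-point matching scores."""
    return sum(scores) / len(scores)
```

With the five example scores above (1, 0.8, 0, 0.2, 0.5), `pose_confidence` yields 0.5, matching the worked example; a laser point lying exactly on its cell's mean scores 1.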
S105: Determine the target pose according to the first confidence.
In one embodiment, if the first confidence is greater than or equal to a preset value, the first pose is taken as the target pose; if the first confidence is less than the preset value, the robot's pose is adjusted starting from the first pose until a pose whose confidence is greater than or equal to the preset value is obtained, and that pose is the target pose.
In one embodiment, after the first confidence of the first pose is obtained, the first pose is adjusted within an adjustment range to obtain at least one second pose. For each second pose, the positions in the lidar coordinate system of the laser points obtained when the robot's lidar scans with the robot in that second pose are computed; then, from the laser points' positions in the lidar coordinate system and the relationship between the map and lidar coordinate systems, the second positions of the laser points in the map coordinate system are determined; and from the second positions and the matching scores against the cells in which they fall, the second confidence of the second pose is determined. There are multiple second poses and, correspondingly, multiple second confidences. After the multiple second confidences are obtained, the second pose with the highest second confidence is taken as the target pose, which improves the accuracy of the determined target pose. Optionally, after the multiple second confidences are computed, the first confidence is first compared with the second confidences: if some second confidence exceeds the first confidence, the second pose with the highest second confidence is taken as the target pose; if no second confidence exceeds the first confidence, the first pose is taken as the target pose.
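The selection rule for the second poses, including the optional fall-back to the first pose when no candidate beats the first confidence, can be sketched as below; `confidence` stands in for the scoring of S103 and S104, and the names are assumptions for illustration:

```python
def select_target_pose(first_pose, first_conf, candidates, confidence):
    """Pick the candidate pose with the highest confidence; keep the first
    pose if no candidate exceeds its confidence (the optional rule above)."""
    best = max(candidates, key=confidence)
    best_conf = confidence(best)
    if best_conf > first_conf:
        return best, best_conf
    return first_pose, first_conf
```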
In one embodiment, the adjustment range is determined according to the first confidence. For example, if the first confidence is less than a preset value, indicating that the first pose is of low accuracy, a first range is used as the adjustment range; if the first confidence is greater than the preset value, indicating that the first pose is of high accuracy, a second range smaller than the first range is used as the adjustment range. That is, a low-accuracy first pose is adjusted over a large range, and a high-accuracy first pose over a small range, which speeds up computation and yields the target pose quickly.
In another embodiment, the adjustment range is preset. Exemplarily, the adjustment range includes an adjustment range along the x-axis, an adjustment range along the y-axis, and an angular adjustment range; the x and y ranges are both ±1 cm, and the angular range is ±1 degree. For example, if the first pose is the vector (0, 0, 0), whose three elements are the x coordinate, the y coordinate, and the angle of the robot's position, the second poses obtained from the adjustment range are the 27 poses formed by the 27 combinations such as (1, 0, 0), (1, 1, 0), (1, -1, 0), (-1, 1, 0), and (1, 1, -1).
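The 27 candidate poses of the example above are simply the Cartesian product of the offsets {-1, 0, +1} in x, y, and angle. A minimal sketch, using the example's units (centimetres and degrees) and an assumed function name:

```python
from itertools import product

def candidate_poses(pose, dx=1.0, dy=1.0, dtheta=1.0):
    """Enumerate the 3*3*3 = 27 poses around `pose` within the adjustment range."""
    x, y, theta = pose
    return [(x + ox, y + oy, theta + ot)
            for ox, oy, ot in product((-dx, 0.0, dx),
                                      (-dy, 0.0, dy),
                                      (-dtheta, 0.0, dtheta))]
```

Note that the unadjusted pose itself is one of the 27 combinations, so the original first pose is always re-scored alongside its neighbours.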
In one embodiment, after the second pose with the highest second confidence is determined, it is optimized with a preset algorithm to obtain at least one optimized third pose. For each third pose, the positions in the lidar coordinate system of the laser points obtained when the robot's lidar scans with the robot in that third pose are computed; from those positions and the relationship between the map and lidar coordinate systems, the third positions of the laser points in the map coordinate system are determined; and from the third positions and the matching scores against the cells in which they fall, the third confidence of the third pose is determined. There are multiple third poses and, correspondingly, multiple third confidences. After the multiple third confidences are obtained, the third pose with the highest third confidence is taken as the target pose, which further improves the accuracy of the obtained target pose.
In one embodiment, the preset algorithm is a Gauss-Newton iterative matching algorithm. While the algorithm runs, each iteration yields a third pose, and the corresponding third confidence is computed. After each iteration, the third confidence obtained in the current iteration is compared with that of the previous iteration: if the current third confidence is greater than or equal to the previous one, iteration continues; if it is smaller, the previous iteration's third confidence is taken as the highest third confidence and the previous iteration's third pose is taken as the target pose. The Gauss-Newton iterative matching algorithm improves both the computation speed and the accuracy of the computed target pose.
The Gauss-Newton iterative algorithm performs iterative optimization through the Hessian matrix. Specifically, the difference between the third position in the map coordinate system of a laser point corresponding to the third pose and the mean of the cell in which that third position falls is taken as the error function, and the error function is differentiated to obtain the Jacobian matrix of the corresponding laser point. For a given third pose, each laser point corresponds to one Jacobian matrix; the Jacobian matrices of all laser points corresponding to the third pose are summed, and the Hessian matrix is computed from the summed Jacobian matrix. After each iteration yields a third pose, the third pose of the next iteration is determined from the value of the Hessian matrix of the current iteration. After each iteration, the corresponding third confidence is computed and compared with that of the previous iteration: if the current third confidence is greater, iteration continues; if it is smaller, the previous iteration's third confidence is taken as the highest third confidence and the previous iteration's third pose is taken as the target pose.
In other embodiments, while the Gauss-Newton iterative algorithm runs, iteration may also continue after each third pose and its third confidence are computed, until a maximum number of iterations is reached, at which point iteration terminates. After termination, the third pose with the highest third confidence among those obtained in all iterations is determined, and that third pose is the target pose.
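The stopping rule described for the Gauss-Newton iteration (continue while the confidence does not decrease; otherwise keep the previous iterate) can be sketched independently of how each Gauss-Newton step is computed. Here `step` and `confidence` are placeholders assumed for illustration, standing for one Gauss-Newton update and the confidence computation of S104:

```python
def optimize_pose(pose, step, confidence, max_iters=30):
    """Iterate `pose <- step(pose)`; stop when the confidence drops
    or after max_iters, returning the best pose seen (the rule above)."""
    best_pose, best_conf = pose, confidence(pose)
    for _ in range(max_iters):
        nxt = step(best_pose)
        conf = confidence(nxt)
        if conf < best_conf:      # confidence decreased: keep previous iterate
            break
        best_pose, best_conf = nxt, conf
    return best_pose, best_conf
```

The same skeleton covers the max-iterations variant described next: dropping the early break and tracking the best iterate yields the run-to-the-limit behaviour.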
In other embodiments, iterative algorithms such as simulated annealing or gradient-based optimization may also be used to optimize the third pose to obtain the target pose.
In one embodiment, after the first confidence is obtained, if it is greater than a preset value, the first pose is of high accuracy, and the target pose is determined from the first pose. If the first confidence is less than the preset value, the first pose is of low accuracy, and adjusting the pose on the basis of the first pose would not easily yield the target pose, so an updated first pose is acquired. For example, if the first pose was input by the user, a prompt to re-enter the pose may be output on the robot's display interface; if the first pose was obtained after robot initialization, the robot is re-initialized to obtain the updated first pose. After the updated first pose is obtained, the first positions in the map coordinate system of the laser points corresponding to laser scanning with the robot in the updated first pose are determined, and the first confidence of the updated first pose is determined from the matching scores between the first positions and the cells in which they fall. This repeats until a first confidence greater than or equal to the preset value is obtained, and the target pose is determined from the updated first pose corresponding to that first confidence, which prevents blind pose optimization during localization and thus improves computation speed.
In the embodiments above, by acquiring the first pose of the robot, determining the first position of the lidar's laser point in the map coordinate system according to the first pose, determining the matching score between the first position and the cell in which it falls, determining the first confidence of the first pose according to the matching score, and then determining the target pose according to the first confidence, the first pose can be evaluated through the first confidence and a more accurate target pose can be obtained.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of this application.
Corresponding to the method for determining a robot pose described in the above embodiments, Fig. 3 shows a structural block diagram of the apparatus for determining a robot pose provided by an embodiment of this application. For ease of description, only the parts relevant to the embodiments of this application are shown.
As shown in Fig. 3, the apparatus for determining a robot pose includes:
an acquisition module 10, configured to acquire a first pose of a robot, the first pose being the pose of the robot in a map coordinate system;
a first calculation module 20, configured to determine, when a lidar of the robot performs laser scanning, a first position of a laser point corresponding to the lidar in the map coordinate system according to the first pose;
a second calculation module 30, configured to determine a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system;
a third calculation module 40, configured to determine a first confidence of the first pose according to the matching score; and
a determination module 50, configured to determine a target pose according to the first confidence.
In a possible implementation, the second calculation module 30 is specifically configured to:
determine the matching score between the first position and the grid cell in which the first position is located according to the first position and the mean of that grid cell.
In a possible implementation, there are at least two laser points and, correspondingly, at least two matching scores; the third calculation module 40 is specifically configured to:
take the average of the at least two matching scores as the first confidence of the first pose.
In a possible implementation, the determination module 50 is specifically configured to:
adjust the first pose within an adjustment range to obtain at least one second pose;
determine a second confidence of the at least one second pose; and
determine the target pose according to the second pose with the highest second confidence among the at least one second pose.
In a possible implementation, the determination module 50 is further specifically configured to:
optimize the second pose with the highest second confidence using a preset algorithm to obtain at least one optimized third pose;
determine a third confidence of the at least one third pose; and
take the third pose with the highest third confidence among the at least one third pose as the target pose.
In a possible implementation, the determination module 50 is further specifically configured to:
determine the adjustment range according to the first confidence.
In a possible implementation, the determination module 50 is further specifically configured to:
if the first confidence is less than a preset value, acquire an updated first pose; and
determine the target pose according to the updated first pose.
It should be noted that the information exchange and execution processes between the above apparatus/units are based on the same conception as the method embodiments of this application; for their specific functions and technical effects, refer to the method embodiments, which are not repeated here.
Fig. 4 is a schematic structural diagram of the robot provided by an embodiment of this application. As shown in Fig. 4, the robot of this embodiment includes a processor 11, a memory 12, and a computer program 13 stored in the memory 12 and executable on the processor 11. When executing the computer program 13, the processor 11 implements the steps in the above method embodiments, such as steps S101 to S105 shown in Fig. 1; alternatively, when executing the computer program 13, the processor 11 implements the functions of the modules/units in the above apparatus embodiments, such as the functions of the acquisition module 10 to the determination module 50 shown in Fig. 3.
Exemplarily, the computer program 13 may be divided into one or more modules/units, which are stored in the memory 12 and executed by the processor 11 to complete this application. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program 13 in the terminal device.
Those skilled in the art can understand that Fig. 4 is only an example of the robot and does not constitute a limitation on it; the robot may include more or fewer components than shown, combine certain components, or use different components. For example, the robot may also include input/output devices, network access devices, buses, and the like.
The processor 11 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 12 may be an internal storage unit of the robot, such as a hard disk or internal memory of the robot. The memory 12 may also be an external storage device of the robot, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the robot. Further, the memory 12 may include both an internal storage unit and an external storage device of the robot. The memory 12 is used to store the computer program and other programs and data required by the robot, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the above functional units and modules is only used as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated units may be implemented in the form of hardware or in the form of software functional units.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the apparatus/terminal device embodiments described above are merely illustrative; for example, the division into modules or units is only a division by logical function, and there may be other divisions in actual implementation, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through interfaces; the indirect coupling or communication connection between apparatuses or units may be electrical, mechanical, or in other forms.
Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
If an integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. With this understanding, this application may implement all or part of the processes of the above method embodiments by instructing the relevant hardware through a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the above method embodiments can be realized. The computer program includes computer program code, which may be in source-code form, object-code form, an executable file, or some intermediate form. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like.
Those of ordinary skill in the art may realize that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
The above embodiments are only used to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or replace some of their technical features with equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application, and shall all be included within the protection scope of this application.

Claims (10)

  1. A method for determining a robot pose, comprising:
    acquiring a first pose of a robot, the first pose being the pose of the robot in a map coordinate system;
    when a lidar of the robot performs laser scanning, determining, according to the first pose, a first position of a laser point corresponding to the lidar in the map coordinate system;
    determining a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system;
    determining a first confidence of the first pose according to the matching score; and
    determining a target pose according to the first confidence.
  2. The method for determining a robot pose according to claim 1, wherein determining the matching score between the first position and the grid cell in which the first position is located comprises:
    determining the matching score between the first position and the grid cell in which the first position is located according to the first position and the mean of that grid cell.
  3. The method for determining a robot pose according to claim 1, wherein there are at least two laser points and, correspondingly, at least two matching scores; and determining the first confidence of the first pose according to the matching scores comprises:
    taking the average of the at least two matching scores as the first confidence of the first pose.
  4. The method for determining a robot pose according to claim 1, wherein determining the target pose according to the first confidence comprises:
    adjusting the first pose within an adjustment range to obtain at least one second pose;
    determining a second confidence of the at least one second pose; and
    determining the target pose according to the second pose with the highest second confidence among the at least one second pose.
  5. The method for determining a robot pose according to claim 4, wherein determining the target pose according to the second pose with the highest second confidence among the at least one second pose comprises:
    optimizing the second pose with the highest second confidence using a preset algorithm to obtain at least one optimized third pose;
    determining a third confidence of the at least one third pose; and
    taking the third pose with the highest third confidence among the at least one third pose as the target pose.
  6. The method for determining a robot pose according to claim 4, wherein before adjusting the first pose within the adjustment range to obtain at least one second pose, the method further comprises:
    determining the adjustment range according to the first confidence.
  7. The method for determining a robot pose according to claim 1, wherein determining the target pose according to the first confidence comprises:
    if the first confidence is less than a preset value, acquiring an updated first pose; and
    determining the target pose according to the updated first pose.
  8. An apparatus for determining a robot pose, comprising:
    an acquisition module, configured to acquire a first pose of a robot, the first pose being the pose of the robot in a map coordinate system;
    a first calculation module, configured to determine, when a lidar of the robot performs laser scanning, a first position of a laser point corresponding to the lidar in the map coordinate system according to the first pose;
    a second calculation module, configured to determine a matching score between the first position and the grid cell in which the first position is located, the grid cell being a cell of a probability map, and the probability map being a map corresponding to the map coordinate system;
    a third calculation module, configured to determine a first confidence of the first pose according to the matching score; and
    a determination module, configured to determine a target pose according to the first confidence.
  9. A robot, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for determining a robot pose according to any one of claims 1 to 7.
  10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method for determining a robot pose according to any one of claims 1 to 7.
PCT/CN2021/126715 2021-06-25 2021-10-27 Method and apparatus for determining robot pose, robot, and storage medium WO2022267285A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110709447.6A CN113510703B (zh) 2021-06-25 2022-09-16 Method and apparatus for determining robot pose, robot, and storage medium
CN202110709447.6 2021-06-25

Publications (1)

Publication Number Publication Date
WO2022267285A1 true WO2022267285A1 (zh) 2022-12-29

Family

ID=78065892

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126715 WO2022267285A1 (zh) 2021-06-25 2021-10-27 Method and apparatus for determining robot pose, robot, and storage medium

Country Status (2)

Country Link
CN (1) CN113510703B (zh)
WO (1) WO2022267285A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116840820A (zh) * 2023-08-29 2023-10-03 上海仙工智能科技有限公司 一种检测2d激光定位丢失的方法及系统、存储介质
CN117066702A (zh) * 2023-08-25 2023-11-17 上海频准激光科技有限公司 一种基于激光器的激光打标控制系统
CN117784120A (zh) * 2024-02-23 2024-03-29 南京新航线无人机科技有限公司 一种无人机飞行状态监测方法及系统

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113510703B (zh) * 2021-06-25 2022-09-16 深圳市优必选科技股份有限公司 机器人位姿的确定方法、装置、机器人及存储介质
CN114035579A (zh) * 2021-11-11 2022-02-11 上海景吾智能科技有限公司 清扫机器人地图加载和切换方法及系统
CN116148879B (zh) * 2021-11-22 2024-05-03 珠海一微半导体股份有限公司 一种机器人提升障碍物标注精度的方法
CN114643579B (zh) * 2022-03-29 2024-01-16 深圳优地科技有限公司 一种机器人定位方法、装置、机器人及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5105368A (en) * 1990-08-01 1992-04-14 At&T Bell Laboratories Method for improving robot accuracy
CN105953798A (zh) * 2016-04-19 2016-09-21 深圳市神州云海智能科技有限公司 移动机器人的位姿确定方法和设备
CN108873001A (zh) * 2018-09-17 2018-11-23 江苏金智科技股份有限公司 一种精准评判机器人定位精度的方法
CN110174894A (zh) * 2019-05-27 2019-08-27 小狗电器互联网科技(北京)股份有限公司 机器人及其重定位方法
CN110900602A (zh) * 2019-11-26 2020-03-24 苏州博众机器人有限公司 一种定位恢复方法、装置、机器人及存储介质
CN111765882A (zh) * 2020-06-18 2020-10-13 浙江大华技术股份有限公司 激光雷达定位方法及其相关装置
CN113510703A (zh) * 2021-06-25 2021-10-19 深圳市优必选科技股份有限公司 机器人位姿的确定方法、装置、机器人及存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108458715B (zh) * 2018-01-18 2020-05-15 亿嘉和科技股份有限公司 一种基于激光地图的机器人定位初始化方法
CN110319834B (zh) * 2018-03-30 2021-04-23 深圳市神州云海智能科技有限公司 一种室内机器人定位的方法及机器人
CN109084732B (zh) * 2018-06-29 2021-01-12 北京旷视科技有限公司 定位与导航方法、装置及处理设备
CN111105454B (zh) * 2019-11-22 2023-05-09 北京小米移动软件有限公司 一种获取定位信息的方法、装置及介质
CN111708047B (zh) * 2020-06-16 2023-02-28 浙江华睿科技股份有限公司 机器人定位评估方法、机器人及计算机存储介质
CN111895989A (zh) * 2020-06-24 2020-11-06 浙江大华技术股份有限公司 一种机器人的定位方法、装置和电子设备
CN111693053B (zh) * 2020-07-09 2022-05-06 上海大学 一种基于移动机器人的重定位方法及系统
CN112462769A (zh) * 2020-11-25 2021-03-09 深圳市优必选科技股份有限公司 机器人定位方法、装置、计算机可读存储介质及机器人


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117066702A (zh) * 2023-08-25 2023-11-17 上海频准激光科技有限公司 一种基于激光器的激光打标控制系统
CN117066702B (zh) * 2023-08-25 2024-04-19 上海频准激光科技有限公司 一种基于激光器的激光打标控制系统
CN116840820A (zh) * 2023-08-29 2023-10-03 上海仙工智能科技有限公司 一种检测2d激光定位丢失的方法及系统、存储介质
CN116840820B (zh) * 2023-08-29 2023-11-24 上海仙工智能科技有限公司 一种检测2d激光定位丢失的方法及系统、存储介质
CN117784120A (zh) * 2024-02-23 2024-03-29 南京新航线无人机科技有限公司 一种无人机飞行状态监测方法及系统

Also Published As

Publication number Publication date
CN113510703A (zh) 2021-10-19
CN113510703B (zh) 2022-09-16

Similar Documents

Publication Publication Date Title
WO2022267285A1 (zh) 机器人位姿的确定方法、装置、机器人及存储介质
CN110307838B (zh) 机器人重定位方法、装置、计算机可读存储介质及机器人
CN112771573B (zh) 基于散斑图像的深度估计方法及装置、人脸识别系统
US8199977B2 (en) System and method for extraction of features from a 3-D point cloud
US10706567B2 (en) Data processing method, apparatus, system and storage media
CN110348454B (zh) 匹配局部图像特征描述符
CN111612841B (zh) 目标定位方法及装置、移动机器人及可读存储介质
WO2020168685A1 (zh) 一种三维扫描视点规划方法、装置及计算机可读存储介质
CN112050751B (zh) 一种投影仪标定方法、智能终端及存储介质
CN112308925A (zh) 可穿戴设备的双目标定方法、设备及存储介质
KR20220062622A (ko) 데이터 처리 방법 및 관련 장치
WO2022143285A1 (zh) 扫地机器人及其测距方法、装置以及计算机可读存储介质
CN111915657A (zh) 一种点云配准方法、装置、电子设备及存储介质
CN113793387A (zh) 单目散斑结构光系统的标定方法、装置及终端
WO2023010565A1 (zh) 单目散斑结构光系统的标定方法、装置及终端
WO2022217794A1 (zh) 一种动态环境移动机器人的定位方法
CN113362445B (zh) 基于点云数据重建对象的方法及装置
CN113192174B (zh) 建图方法、装置及计算机存储介质
Chen et al. Multi-stage matching approach for mobile platform visual imagery
KR20220026423A (ko) 지면에 수직인 평면들의 3차원 재구성을 위한 방법 및 장치
CN115239776B (zh) 点云的配准方法、装置、设备和介质
CN109242941B (zh) 三维对象合成通过使用视觉引导作为二维数字图像的一部分
CN113658156A (zh) 一种去除深度图像中局外点的球体拟合方法和装置
Li et al. Efficient and precise visual location estimation by effective priority matching-based pose verification in edge-cloud collaborative IoT
CN116758205B (zh) 数据处理方法、装置、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21946767

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE