CN108571967B - Positioning method and device - Google Patents
- Publication number
- CN108571967B CN108571967B CN201710147835.3A CN201710147835A CN108571967B CN 108571967 B CN108571967 B CN 108571967B CN 201710147835 A CN201710147835 A CN 201710147835A CN 108571967 B CN108571967 B CN 108571967B
- Authority
- CN
- China
- Prior art keywords
- point cloud
- cloud data
- matching degree
- initial
- historical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention belongs to the technical field of electronics and provides a positioning method and a positioning device, aiming to solve the prior-art problem that a mobile robot using a microprocessor has low positioning efficiency and low positioning accuracy. The method comprises the following steps: calculating the point cloud matching degree between initial point cloud data and historical point cloud data within a preset time period; modifying the initial point cloud data according to a preset number of step variations and calculating the point cloud matching degree between each modified point cloud and the historical point cloud data; taking the point cloud data corresponding to the maximum matching degree as first point cloud data; obtaining second point cloud data from the first point cloud data by the same method; and, if the matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, outputting positioning information according to the first point cloud data. With this technical scheme, a robot carrying a microprocessor can realize positioning quickly and accurately, improving both positioning efficiency and positioning precision.
Description
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to a positioning method and apparatus.
Background
At present, mobile robots are widely used in service industries, and their application scenarios increasingly include indoor environments such as shopping malls, supermarkets, and hospitals. As application environments become more complex, the positioning accuracy required of a mobile robot during motion keeps increasing.
Meanwhile, with the continuous development of microprocessors, more and more robot providers begin to adopt microprocessors as core processing units of mobile robots.
However, the processing capability and resources of a microprocessor are limited, while existing high-accuracy positioning methods typically depend on strong system processing capability and occupy substantial system resources. Positioning on a mobile robot that uses a microprocessor is therefore constrained, resulting in low positioning accuracy and low positioning efficiency.
Disclosure of Invention
The embodiments of the invention provide a positioning method and a positioning device, aiming to solve the prior-art problem that a mobile robot using a microprocessor suffers from low positioning efficiency and low positioning accuracy.
In a first aspect, an embodiment of the present invention provides a positioning method, where the positioning method includes:
acquiring initial point cloud data, wherein the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises position coordinates and direction angles of the sampling points;
calculating the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period;
modifying the initial point cloud data according to a preset number of step size variable quantities to obtain a preset number of initial reference point cloud data, and calculating a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities;
determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data;
modifying the first point cloud data according to the step length variation of the preset number to obtain a preset number of first reference point cloud data, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data;
determining point cloud data corresponding to the maximum point cloud matching degree as second point cloud data according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data;
and if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, outputting positioning information according to the first point cloud data, wherein the positioning information comprises the position and the posture of the robot.
In another aspect, an embodiment of the present invention provides a positioning apparatus, where the positioning apparatus includes:
an acquisition module, used for acquiring initial point cloud data, wherein the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises position coordinates and direction angles of the sampling points;
the calculation module is used for calculating the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period;
the first modification module is used for modifying the initial point cloud data according to a preset number of step size variable quantities to obtain a preset number of initial reference point cloud data, and calculating a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities;
the first selection module is used for determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data;
the second modification module is used for modifying the first point cloud data according to the preset number of step size variable quantities to obtain the preset number of first reference point cloud data, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data;
the second selection module is used for determining point cloud data corresponding to the maximum point cloud matching degree according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data to serve as second point cloud data;
and the first output module is used for outputting positioning information according to the first point cloud data if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, wherein the positioning information comprises the position and the posture of the robot.
According to the embodiments of the invention, initial point cloud data containing the position coordinates and direction angles of sampling points is acquired, and the point cloud matching degree between the initial point cloud data and historical point cloud data within a preset time period is calculated. The initial point cloud data is then modified according to a preset number of step variations to obtain a preset number of initial reference point cloud data, the matching degree between each initial reference point cloud and the historical point cloud data is calculated, and the point cloud data with the maximum matching degree among the initial point cloud data and the initial reference point clouds is selected as first point cloud data. Taking the first point cloud data as the new basis, second point cloud data is obtained by the same method. If the matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, the position and posture of the robot are output according to the first point cloud data. In this way, the current point cloud data is rigidly matched against the historical point cloud data in different directions, and the best-matching direction, found by a gradient-descent-style search, serves as the optimization direction for positioning. This yields fast convergence along the optimization direction, so that a robot carrying a microprocessor can realize positioning quickly and accurately, improving both positioning efficiency and positioning precision.
Drawings
Fig. 1 is a flowchart of a positioning method according to an embodiment of the present invention;
fig. 2 is a flowchart of a positioning method according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a positioning apparatus according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a positioning device according to a fourth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The following detailed description of implementations of the invention refers to the accompanying drawings.
The first embodiment is as follows:
fig. 1 is a flowchart of a positioning method according to an embodiment of the present invention, where an execution subject of the embodiment of the present invention is a robot, which may specifically be a microprocessor device of the robot, and the positioning method illustrated in fig. 1 may specifically include steps S101 to S107, which are detailed as follows:
s101, obtaining initial point cloud data, wherein the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises position coordinates and direction angles of the preset number of sampling points.
The Point Cloud (Point Cloud) is also called laser Point Cloud, and is a set of a series of massive points expressing target space distribution and target surface characteristics, which is obtained by acquiring the space coordinates of each sampling Point on the surface of an object under the same space reference system by using laser, and the Point set is the Point Cloud. The point cloud data may include spatial resolution, point location accuracy, surface normal vectors, and the like.
The point cloud data can be acquired according to information acquired by laser sensors in various scanning ranges or various sensors capable of converting into point cloud output.
In the embodiment of the present invention, the robot may include a robot body, a laser sensor, a microprocessor, and the like, wherein the robot body may include wheels, a vehicle body, an odometer, a motor driver, a robot body motion controller, and the like, the laser sensor is configured to collect a relative distance between the robot body and a reference object in a current environment within a scanning range of the laser sensor, and the microprocessor is configured to process data collected by the laser sensor, complete positioning of the robot, and output positioning information.
Specifically, the microprocessor acquires initial point cloud data with position coordinates and direction angle information according to the acquired sampling information of a preset number of sampling points.
The preset number of the sampling points can be specifically set according to the requirements of practical application, and is not limited here.
S102, calculating the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period.
Specifically, the microprocessor matches the initial point cloud data obtained in step S101 with historical point cloud data in a preset time period, and calculates a point cloud matching degree, where the point cloud matching degree is used to identify a deviation condition of sampling information between a sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data.
The historical point cloud data in the preset time period refers to all point cloud data in the preset time period before the current time. It should be noted that the specific duration of the preset time period may be set according to the needs of practical applications, and is not limited herein.
S103, modifying the initial point cloud data according to the preset number of step size variable quantities to obtain the preset number of initial reference point cloud data, and calculating the point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities.
The step variation comprises a position coordinate variation and a direction angle variation, and each step variation has a fixed increment in a coordinate or in the direction angle. For example, assuming 5 fixed increments are preset for each of the x coordinate, the y coordinate, and the direction angle, there are 5 × 5 × 5 = 125 step variations in total, i.e., the preset number is 125.
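The enumeration of step variations above can be sketched as follows; the concrete increment values are illustrative assumptions, since the patent does not specify them:

```python
import itertools

def generate_step_variations(deltas_x, deltas_y, deltas_theta):
    """Enumerate every combination of per-axis pose perturbations.

    With 5 fixed increments per axis this yields 5 * 5 * 5 = 125
    step variations, matching the example in the text.
    """
    return list(itertools.product(deltas_x, deltas_y, deltas_theta))

# Illustrative increments (assumed values, in metres and radians):
coord_deltas = [-0.02, -0.01, 0.0, 0.01, 0.02]
angle_deltas = [-0.04, -0.02, 0.0, 0.02, 0.04]
variations = generate_step_variations(coord_deltas, coord_deltas, angle_deltas)
print(len(variations))  # 125
```

Each tuple `(dx, dy, dtheta)` is then applied to every sampling point of the point cloud to produce one reference point cloud.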
Specifically, based on the initial point cloud data obtained in step S101, the initial point cloud data is modified according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, and a point cloud matching degree between the initial reference point cloud data and the historical point cloud data at each step size variation is calculated, where a calculation method of the point cloud matching degree is the same as that of the calculation method of the point cloud matching degree in step S102.
And S104, determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data.
Specifically, the maximum point cloud matching degree is selected from the point cloud matching degree of the initial point cloud data calculated in step S102 and the point cloud matching degree of the initial reference point cloud data calculated in step S103 at each step variation, and the point cloud data corresponding to the maximum point cloud matching degree is used as the first point cloud data.
S105, modifying the first point cloud data according to the step length variable quantity of the preset quantity to obtain the first reference point cloud data of the preset quantity, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data.
Specifically, by the same method as in step S103, the first point cloud data obtained in step S104 is modified according to the preset number of step variations to obtain the preset number of first reference point cloud data, and the point cloud matching degree between each first reference point cloud data and the historical point cloud data is calculated; the matching degree is calculated in the same way as in step S102.
And S106, determining point cloud data corresponding to the maximum point cloud matching degree as second point cloud data according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data.
Specifically, the same method as that in step S104 is adopted, the maximum point cloud matching degree is selected from the point cloud matching degree of the first point cloud data calculated in step S104 and the point cloud matching degree of the first reference point cloud data calculated in step S105 under each step variation, and the point cloud data corresponding to the maximum point cloud matching degree is used as the second point cloud data.
And S107, if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, outputting positioning information according to the first point cloud data, wherein the positioning information comprises the position and the posture of the robot.
Specifically, the point cloud matching degree of the first point cloud data is compared with that of the second point cloud data. If the matching degree of the first point cloud data is greater than or equal to that of the second, the first point cloud data is taken as the current optimal positioning data of the robot; the current position and posture of the robot are then determined from the position coordinates and direction angles of the sampling points in the first point cloud data, and positioning information containing the position and posture of the robot is output.
It can be understood that, in the embodiment of the present invention, if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, the positioning information is output according to the first point cloud data, and in other embodiments, the positioning information may also be output according to the first point cloud data if the point cloud matching degree of the first point cloud data is greater than the point cloud matching degree of the second point cloud data.
In this embodiment, initial point cloud data containing the position coordinates and direction angles of sampling points is acquired, and the point cloud matching degree between the initial point cloud data and historical point cloud data within a preset time period is calculated. The initial point cloud data is then modified according to a preset number of step variations to obtain a preset number of initial reference point cloud data, the matching degree between each initial reference point cloud and the historical point cloud data is calculated, and the point cloud data with the maximum matching degree among the initial point cloud data and the initial reference point clouds is selected as first point cloud data. Taking the first point cloud data as the new basis, second point cloud data is obtained by the same method. If the matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, the position and posture of the robot are output according to the first point cloud data. In this way, the current point cloud data is rigidly matched against the historical point cloud data in different directions, and the best-matching direction, found by a gradient-descent-style search, serves as the optimization direction for positioning. This yields fast convergence along the optimization direction, so that a robot carrying a microprocessor can realize positioning quickly and accurately, improving both positioning efficiency and positioning precision.
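Steps S101 to S107, if iterated until the comparison in S107 succeeds, amount to a greedy hill climb over pose perturbations. A minimal sketch under that reading, where `match_degree` and `apply_variation` are hypothetical stand-ins for the matching-degree computation and the step-variation modification described above:

```python
def localize(initial_cloud, history, step_variations,
             match_degree, apply_variation, max_iters=100):
    """Greedy hill climb: keep taking the best-scoring perturbed cloud
    until no perturbation beats the current cloud (S101-S107).
    """
    current = initial_cloud
    current_score = match_degree(current, history)
    for _ in range(max_iters):
        # Candidates are the current cloud plus every perturbed variant.
        candidates = [(current_score, current)]
        for dv in step_variations:
            cand = apply_variation(current, dv)
            candidates.append((match_degree(cand, history), cand))
        best_score, best = max(candidates, key=lambda t: t[0])
        if best_score <= current_score:
            # The "first" cloud already matches at least as well as the
            # "second": output positioning information from it (S107).
            return current
        current, current_score = best, best_score
    return current
```

Because the best candidate of each round includes the current cloud itself, the stopping condition `best_score <= current_score` is exactly the "first matching degree >= second matching degree" test of step S107.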
Example two:
fig. 2 is a flowchart of a positioning method according to a second embodiment of the present invention, where an execution subject of the second embodiment of the present invention is a robot, which may specifically be a microprocessor device of the robot, and the positioning method illustrated in fig. 2 may specifically include steps S201 to S216, which are detailed as follows:
S201, acquiring odometer information and laser sensor information for each of a preset number of sampling points.
An odometer is a sensing device for measuring travel; the odometer mounted on the robot calculates the distance the robot has travelled by recording the number of motor revolutions.
Specifically, the microprocessor collects odometer information of each sampling point through an odometer, and collects laser sensor information such as relative distance between each sampling point and a corresponding reference point in the current environment through a laser sensor.
And S202, calculating the position coordinates and the direction angles of each sampling point according to the odometer information.
Specifically, the microprocessor calculates the position coordinates and the direction angle of each sampling point from the travel data in the odometer information.
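The pose computation in S202 can be sketched with a standard differential-drive dead-reckoning model. The patent only states that position and heading are derived from motor-revolution counts; the kinematic model, parameter names, and values below are assumptions for illustration:

```python
import math

def integrate_odometry(ticks_left, ticks_right, ticks_per_rev,
                       wheel_radius, wheel_base, pose):
    """Dead-reckon a differential-drive pose update from encoder counts.

    pose is (x, y, theta); returns the updated pose. All geometry
    parameters (wheel radius, wheel base) are assumed, not from the patent.
    """
    x, y, theta = pose
    circumference = 2.0 * math.pi * wheel_radius
    d_left = ticks_left / ticks_per_rev * circumference    # left wheel travel
    d_right = ticks_right / ticks_per_rev * circumference  # right wheel travel
    d_center = (d_left + d_right) / 2.0                    # robot travel
    d_theta = (d_right - d_left) / wheel_base              # heading change
    # Advance along the mean heading over the interval.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```

Running this per sampling interval yields the position coordinates and direction angle attached to each sampling point in S203.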
S203, associating the position coordinates and the direction angles of each sampling point with the laser sensor information to generate initial point cloud data, wherein the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises the position coordinates and the direction angles of the sampling points.
Specifically, the microprocessor associates the position coordinates and the direction angles of each sampling point with the laser sensor information, namely, the position coordinates and the direction angles of each sampling point are added to the laser sensor information of the sampling point, the laser sensor information contains the position coordinates and the direction angles of the sampling points, initial point cloud data is generated according to the associated laser sensor information, the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises the position coordinates and the direction angles of the sampling points.
The preset number of the sampling points can be specifically set according to the requirements of practical application, and is not limited here.
And S204, matching the initial point cloud data with historical point cloud data in a preset time period according to a point cloud matching condition to obtain the historical point cloud data meeting the point cloud matching condition, wherein the point cloud matching condition comprises the minimum sum of squares of distances between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data.
The historical point cloud data in the preset time period refers to all point cloud data in the preset time period before the current time. It should be noted that the specific duration of the preset time period may be set according to the needs of practical applications, and is not limited herein.
Specifically, each sampling point in the initial point cloud data has a corresponding sampling point in every historical point cloud. The sum of squared distances between corresponding sampling points is calculated between the initial point cloud data and each historical point cloud. If there are j historical point clouds within the preset time period, one sum of squared distances is obtained for each of them; the minimum of the j sums is selected, and the historical point cloud corresponding to that minimum is the historical point cloud satisfying the point cloud matching condition.
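The selection just described can be sketched as follows, assuming index-wise correspondence between sampling points (how correspondence is established is not specified in the text):

```python
def best_matching_history(initial_points, history_clouds):
    """Pick the historical cloud minimising the sum of squared distances
    between corresponding sampling points (the matching condition of S204).

    Clouds are lists of (x, y) sampling points; index-wise correspondence
    is an assumption made for illustration.
    """
    def ssd(cloud):
        return sum((px - qx) ** 2 + (py - qy) ** 2
                   for (px, py), (qx, qy) in zip(initial_points, cloud))
    return min(history_clouds, key=ssd)
```

With j historical clouds this evaluates j sums and returns the cloud attaining the minimum, exactly mirroring the selection described above.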
S205, calculating an angle deviation and a distance deviation between the initial point cloud data and the historical point cloud data meeting the point cloud matching condition, wherein the distance deviation comprises a distance error between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data meeting the point cloud matching condition.
Specifically, from the historical point cloud data determined in step S204, an angle deviation and a distance deviation between the historical point cloud data and the initial point cloud data generated in step S203 are calculated.
The angle deviation may be a central angle deviation between the initial point cloud data and the historical point cloud data, or a combination of angle deviations between each corresponding sampling point, and a specific calculation manner of the angle deviation may be selected according to a requirement of an actual application, which is not limited herein.
The distance deviation includes a distance error between corresponding sample points in the initial point cloud data generated at step S203 and the historical point cloud data determined at step S204.
S206, calculating the point cloud matching degree between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition according to formula (1) (the formula appears as an image in the original publication and is not reproduced here):
where S is the point cloud matching degree, α is a preset confidence parameter, θ is the angle deviation, β is a preset matching degree parameter, M_i is the distance error corresponding to the i-th of the sampling points whose distance error is smaller than a preset error threshold, and N is the number of sampling points whose distance error is smaller than the preset error threshold.
Specifically, the preset confidence level parameter α, the preset matching level parameter β, and the preset error threshold may be set according to the actual application requirement, and are not limited herein.
The angle deviation θ is an angle deviation between the initial point cloud data calculated in step S205 and the historical point cloud data satisfying the point cloud matching condition.
The number N of sampling points whose distance error is smaller than the error threshold is determined from the distance errors, calculated in step S205, between the sampling points in the initial point cloud data and the corresponding sampling points in the historical point cloud data satisfying the point cloud matching condition; M_i is the distance error corresponding to the i-th of these N sampling points.
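Since formula (1) itself is an image not reproduced in this text, the exact combination of α, θ, β, M_i and N is unknown here. The sketch below is therefore only an assumed form, chosen to illustrate how the defined quantities interact (angle deviation penalised via α, thresholded distance errors aggregated via β); it should not be read as the patent's actual formula:

```python
def matching_degree(theta_dev, distance_errors, alpha, beta, error_threshold):
    """Illustrative matching-degree score (NOT the patent's formula (1)).

    Keeps only distance errors below the threshold (the N points), then
    scores higher for smaller angle deviation and smaller mean error.
    """
    kept = [m for m in distance_errors if m < error_threshold]  # the N points
    n = len(kept)
    if n == 0:
        return 0.0  # no reliable correspondences
    mean_error = sum(kept) / n
    return alpha / (1.0 + abs(theta_dev)) + beta / (1.0 + mean_error)
```

Whatever its exact form, the score must increase as the angle deviation and the kept distance errors shrink, so that the maximum-matching-degree selection in S208 and S210 prefers better-aligned point clouds.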
S207, modifying the initial point cloud data according to the preset number of step size variable quantities to obtain the preset number of initial reference point cloud data, and calculating the point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities.
The step variations comprise position coordinate variations and direction angle variations, and each step variation has a fixed increment in a coordinate or in the direction angle. For example, if 5 fixed increments are preset for each of the x coordinate, the y coordinate, and the direction angle, there are 5 × 5 × 5 = 125 step variations in total, i.e., the preset number is 125.
Specifically, based on the initial point cloud data generated in step S203, the initial point cloud data is modified according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, and a point cloud matching degree between the initial reference point cloud data and the historical point cloud data at each step size variation is calculated, where a calculation method of the point cloud matching degree is the same as a calculation method of the point cloud matching degree calculated in steps S204 to S206.
And S208, determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data.
Specifically, the maximum point cloud matching degree is selected from the point cloud matching degree of the initial point cloud data calculated in step S206 and the point cloud matching degree of the initial reference point cloud data calculated in step S207 at each step variation, and the point cloud data corresponding to the maximum point cloud matching degree is used as the first point cloud data.
S209, modifying the first point cloud data according to the step length variable quantity of the preset quantity to obtain the first reference point cloud data of the preset quantity, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data.
Specifically, by the same method as in step S207, the first point cloud data is modified according to the preset number of step variations to obtain the preset number of first reference point cloud data, and the point cloud matching degree between each first reference point cloud data and the historical point cloud data is calculated; the matching degree is calculated in the same way as in steps S204 to S206.
S210, point cloud data corresponding to the maximum point cloud matching degree is determined according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data, and the point cloud data serves as second point cloud data.
Specifically, the same method as that in step S208 is adopted to select the maximum point cloud matching degree from the point cloud matching degree of the first point cloud data and the point cloud matching degree of the first reference point cloud data at each step variation calculated in step S209, and the point cloud data corresponding to the maximum point cloud matching degree is used as the second point cloud data.
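Steps S207 to S210 amount to one perturb-and-select round: generate candidates by applying each step size variation, score every candidate against the historical point cloud data, and keep the best one. A minimal Python sketch of that round — the concrete delta values, the flat (x, y, θ) sampling-point representation, and the caller-supplied `match_fn` are illustrative assumptions, not the patented implementation:

```python
import itertools

# Hypothetical step size variations: deltas in x, y (position coordinates)
# and theta (direction angle); the patent leaves the concrete values open.
STEP_DELTAS = [d for d in itertools.product((-0.05, 0.0, 0.05),
                                            (-0.05, 0.0, 0.05),
                                            (-0.02, 0.0, 0.02))
               if d != (0.0, 0.0, 0.0)]

def modify(cloud, delta):
    """Apply one step size variation (dx, dy, dtheta) to every sampling point."""
    dx, dy, dth = delta
    return [(x + dx, y + dy, th + dth) for (x, y, th) in cloud]

def best_of(cloud, match_fn):
    """Return the candidate (the cloud itself or a perturbed copy) with the
    maximum point cloud matching degree, mirroring S207-S208 / S209-S210."""
    candidates = [cloud] + [modify(cloud, d) for d in STEP_DELTAS]
    return max(candidates, key=match_fn)
```

Calling `best_of` once on the initial point cloud data selects the first point cloud data; calling it again on that winner selects the second point cloud data.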
S211, judging whether the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data.
Specifically, if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, step S212 is executed, otherwise step S213 is executed.
And S212, outputting positioning information according to the first point cloud data, wherein the positioning information comprises the position and the posture of the robot.
Specifically, if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, the first point cloud data is considered as the current optimal positioning data of the robot, so that the current position and posture of the robot are determined according to the position coordinates and the direction angles of the sampling points in the first point cloud data, and the positioning information containing the position and posture of the robot is output.
It can be understood that, in the embodiment of the present invention, the positioning information is output according to the first point cloud data if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data. In other embodiments, the positioning information may instead be output according to the first point cloud data only if the point cloud matching degree of the first point cloud data is strictly greater than that of the second point cloud data.
And after the microprocessor outputs the positioning information according to the first point cloud data, the process is ended.
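Step S212 reduces the winning point cloud to a single robot pose. The patent states only that the position and posture are determined from the sampling points' position coordinates and direction angles; the reduction below — averaging the positions and taking the circular mean of the direction angles — is one assumed way to do it, not the patented method:

```python
import math

def pose_from_cloud(cloud):
    """Assumed reduction of a point cloud's sampling poses (x, y, theta) to one
    robot pose: average the position coordinates and take the circular mean of
    the direction angles. The patent does not specify this reduction."""
    xs = [x for x, _, _ in cloud]
    ys = [y for _, y, _ in cloud]
    # Circular mean avoids the wrap-around error of naively averaging angles.
    s = sum(math.sin(th) for _, _, th in cloud)
    c = sum(math.cos(th) for _, _, th in cloud)
    return (sum(xs) / len(xs), sum(ys) / len(ys), math.atan2(s, c))
```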
And S213, acquiring the current optimization times.
Specifically, if the point cloud matching degree of the first point cloud data is smaller than the point cloud matching degree of the second point cloud data, the microprocessor obtains the current optimization times, which can be obtained by counting the number of times step S211 has been executed.
S214, judging whether the current optimization times reach a preset time threshold value.
Specifically, it is determined whether the current optimization times reach a preset time threshold; if the current optimization times do not reach the preset time threshold, step S215 is executed, otherwise step S216 is executed.
The preset time threshold may be set to 10 times, but is not limited thereto; the specific threshold may be set according to the needs of the practical application and is not limited herein.
And S215, identifying the second point cloud data as the first point cloud data, and returning to the step S209 to continue execution.
Specifically, if the current optimization times do not reach the preset time threshold, the second point cloud data obtained in step S210 is re-identified as the first point cloud data, and the process returns to step S209 to continue searching for the positioning optimization direction.
And S216, outputting positioning information according to the second point cloud data.
Specifically, if the current optimization times reach the preset time threshold, the search for the positioning optimization direction is stopped, the second point cloud data is directly taken as the current optimal positioning data of the robot, the current position and posture of the robot are determined according to the position coordinates and direction angles of the sampling points in the second point cloud data, and the positioning information containing the position and posture of the robot is output.
And after the microprocessor outputs the positioning information according to the second point cloud data, the process is ended.
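Taken together, steps S209 to S216 form a bounded hill climb: keep replacing the first point cloud data with the better second point cloud data until no improvement occurs or the optimization times reach the threshold. A hedged sketch, where `improve` stands for the caller-supplied perturb-and-select step and `match_fn` for the point cloud matching degree; the default cap of 10 mirrors the example threshold in step S214:

```python
def optimize_position(first_cloud, improve, match_fn, max_iters=10):
    """Bounded hill climb mirroring S209-S216.

    improve(cloud) must return the best candidate among the cloud's
    perturbations (the 'second point cloud data'); match_fn scores a cloud
    against the historical point cloud data. Both are caller-supplied
    assumptions, not the patented implementations.
    """
    for _ in range(max_iters):                  # S213-S214: optimization-count cap
        second_cloud = improve(first_cloud)     # S209-S210
        if match_fn(first_cloud) >= match_fn(second_cloud):
            return first_cloud                  # S211-S212: no further improvement
        first_cloud = second_cloud              # S215: re-identify and loop
    return first_cloud                          # S216: cap reached, output latest
```

Because the latest second point cloud data is assigned to `first_cloud` before the cap check, returning `first_cloud` when the cap is reached matches the output of step S216.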
In this embodiment, odometer information is first collected to calculate the position coordinates and direction angles of the sampling points, which are associated with the laser sensor information to generate initial point cloud data containing the position coordinates and direction angles. The initial point cloud data is then matched with historical point cloud data in a preset time period according to a point cloud matching condition to obtain historical point cloud data satisfying the point cloud matching condition, the angle deviation and the distance deviation between the initial point cloud data and the historical point cloud data are calculated, and the point cloud matching degree between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition is calculated from the angle deviation and the distance deviation according to formula (1). Next, the initial point cloud data is modified according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, the point cloud matching degree between each initial reference point cloud data and the historical point cloud data is calculated, and the point cloud data corresponding to the maximum point cloud matching degree is selected from the initial point cloud data and the initial reference point cloud data as the first point cloud data. Second point cloud data is then obtained from the first point cloud data by the same method. If the point cloud matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, the position and posture of the robot are output according to the first point cloud data; otherwise, the positioning optimization direction continues to be searched on the basis of the second point cloud data until the maximum optimization times are reached. In this way, rigid matching of the current point cloud data against the historical point cloud data in different directions is realized, and the best matching direction is found by gradient-descent-style search and used as the positioning optimization direction, so that rapid convergence in the positioning optimization direction is achieved; the mobile robot carrying the microprocessor can therefore position itself quickly and accurately, improving both positioning efficiency and positioning precision.
Example three:
fig. 3 is a schematic structural diagram of a positioning device according to a third embodiment of the present invention; for convenience of description, only the parts related to the third embodiment are shown. The positioning device illustrated in fig. 3 may be an execution subject of the positioning method provided in the first embodiment, and includes: an acquisition module 31, a calculation module 32, a first modification module 33, a first selection module 34, a second modification module 35, a second selection module 36 and a first output module 37. The detailed description of each functional module is as follows:
the acquisition module 31 is configured to acquire initial point cloud data, where the initial point cloud data includes sampling information of a preset number of sampling points, and the sampling information includes position coordinates and direction angles of the preset number of sampling points;
a calculating module 32, configured to calculate a point cloud matching degree between the initial point cloud data acquired by the acquiring module 31 and historical point cloud data in a preset time period;
the first modification module 33 is configured to modify the initial point cloud data acquired by the acquisition module 31 according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, and calculate a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, where the step size variations include a position coordinate variation and a direction angle variation;
the first selecting module 34 is configured to determine, according to the point cloud matching degree of the initial point cloud data obtained by the calculating module 32 and the point cloud matching degree of each initial reference point cloud data obtained by the first modifying module 33, point cloud data corresponding to the maximum point cloud matching degree as first point cloud data;
the second modification module 35 is configured to modify the first point cloud data obtained by the first selection module 34 according to a preset number of step size variations to obtain a preset number of first reference point cloud data, and calculate a point cloud matching degree between each first reference point cloud data and the historical point cloud data;
the second selecting module 36 is configured to determine, according to the point cloud matching degree of the first point cloud data obtained by the first selecting module 34 and the point cloud matching degree of each first reference point cloud data obtained by the second modifying module 35, point cloud data corresponding to the maximum point cloud matching degree as second point cloud data;
a first output module 37, configured to output positioning information according to the first point cloud data if the point cloud matching degree of the first point cloud data obtained by the first selecting module 34 is greater than or equal to the point cloud matching degree of the second point cloud data obtained by the second selecting module 36, where the positioning information includes a position and a posture of the robot.
The process of implementing each function by each module in the positioning apparatus provided in this embodiment may specifically refer to the description of the embodiment shown in fig. 1, and is not described herein again.
As can be seen from the positioning apparatus illustrated in fig. 3, in this embodiment, initial point cloud data including the position coordinates and direction angles of the sampling points is obtained, and the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period is calculated. The initial point cloud data is modified according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, the point cloud matching degree between each initial reference point cloud data and the historical point cloud data is calculated, and the point cloud data corresponding to the maximum point cloud matching degree is selected from the initial point cloud data and the initial reference point cloud data as the first point cloud data. Second point cloud data is then obtained from the first point cloud data by the same method. If the point cloud matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, the position and posture of the robot are output according to the first point cloud data. By rigidly matching the current point cloud data against the historical point cloud data in different directions and finding the best matching direction by gradient-descent-style search as the positioning optimization direction, rapid convergence in the positioning optimization direction is realized, so that the robot carrying the microprocessor can position itself quickly and accurately, improving positioning efficiency and positioning precision.
Example four:
fig. 4 is a schematic structural diagram of a positioning device according to a fourth embodiment of the present invention; for convenience of description, only the parts related to the fourth embodiment are shown. The positioning device illustrated in fig. 4 may be an execution subject of the positioning method provided in the second embodiment, and includes: an acquisition module 41, a calculation module 42, a first modification module 43, a first selection module 44, a second modification module 45, a second selection module 46 and a first output module 47. The detailed description of each functional module is as follows:
an obtaining module 41, configured to obtain initial point cloud data, where the initial point cloud data includes sampling information of a preset number of sampling points, and the sampling information includes position coordinates and direction angles of the preset number of sampling points;
a calculating module 42, configured to calculate a point cloud matching degree between the initial point cloud data acquired by the acquiring module 41 and historical point cloud data in a preset time period;
a first modifying module 43, configured to modify the initial point cloud data acquired by the acquiring module 41 according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, and calculate a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, where the step size variations include a position coordinate variation and a direction angle variation;
a first selecting module 44, configured to determine, according to the point cloud matching degree of the initial point cloud data obtained by the calculating module 42 and the point cloud matching degree of each initial reference point cloud data obtained by the first modifying module 43, point cloud data corresponding to the maximum point cloud matching degree as first point cloud data;
the second modification module 45 is configured to modify the first point cloud data according to a preset number of step size variations to obtain a preset number of first reference point cloud data, and calculate a point cloud matching degree between each first reference point cloud data and the historical point cloud data;
a second selecting module 46, configured to determine, according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data obtained by the second modifying module 45, point cloud data corresponding to the maximum point cloud matching degree as second point cloud data;
the first output module 47 is configured to output positioning information according to the first point cloud data if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data obtained by the second selection module 46, where the positioning information includes a position and a posture of the robot.
Further, the obtaining module 41 includes:
the acquisition sub-module 411 is used for acquiring odometer information and laser sensor information of each sampling point;
the information calculation submodule 412 is used for calculating the position coordinates and the direction angles of each sampling point according to the odometer information obtained by the acquisition submodule 411;
the generating submodule 413 is configured to associate the position coordinates and the direction angles obtained by the information calculating submodule 412 with the laser sensor information acquired by the acquisition submodule 411, and generate initial point cloud data.
Further, the calculation module 42 includes:
the data matching submodule 421 is configured to match the initial point cloud data generated by the generating submodule 413 with the historical point cloud data in a preset time period according to a point cloud matching condition to obtain historical point cloud data meeting the point cloud matching condition, where the point cloud matching condition includes that a sum of squares of distances between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data is minimum;
a deviation calculating submodule 422, configured to calculate an angle deviation and a distance deviation between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition obtained by the data matching submodule 421, where the distance deviation includes a distance error between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data satisfying the point cloud matching condition;
a matching degree calculation submodule 423, configured to calculate a point cloud matching degree between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition according to formula (2):
wherein S is the point cloud matching degree, α is a preset credibility parameter, θ is the angle deviation calculated by the deviation calculation submodule 422, β is a preset matching degree parameter, Mi is the number of the sampling points of which the distance errors obtained by the deviation calculation submodule 422 are smaller than the preset error threshold, and N is the total number of the sampling points.
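Formula (2) itself appears only in the patent drawings and is not reproduced in this text. Purely as an illustration of how such a matching degree could combine the quantities just defined, the sketch below assumes the common form S = α·cos θ + β·(M/N); the actual formula (2) and its parameter values are those of the patent:

```python
import math

def matching_degree(angle_deviation, distance_errors, alpha=0.5, beta=0.5,
                    error_threshold=0.1):
    """Illustrative matching degree: rewards a small angle deviation and a high
    fraction of sampling points whose distance error beats the threshold.
    The form S = alpha*cos(theta) + beta*(M/N) is an assumption, not formula (2).
    """
    n = len(distance_errors)                                    # N: total sampling points
    m = sum(1 for e in distance_errors if e < error_threshold)  # M: well-matched points
    return alpha * math.cos(angle_deviation) + beta * (m / n)
```

Under this form, a perfectly aligned pair of point clouds (zero angle deviation, every distance error below the threshold) scores α + β, and the score decreases as either deviation grows.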
Further, the positioning device further comprises:
an optimization frequency obtaining module 48, configured to obtain a current optimization frequency if the point cloud matching degree of the first point cloud data is smaller than the point cloud matching degree of the second point cloud data obtained by the second selecting module 46;
and the repeated processing module 49 is configured to identify the second point cloud data as the first point cloud data if the optimization times acquired by the optimization time acquisition module 48 do not reach a preset time threshold, return to the second modification module 45 to modify the first point cloud data according to the preset number of step size variations, obtain a preset number of first reference point cloud data, and calculate a point cloud matching degree between each first reference point cloud data and the historical point cloud data.
Further, the positioning device further comprises:
a second output module 50, configured to output positioning information according to the second point cloud data obtained by the second selecting module 46 if the optimization times obtained by the optimization times obtaining module 48 reaches a preset time threshold.
The process of implementing each function by each module in the positioning apparatus provided in this embodiment may specifically refer to the description of the embodiment shown in fig. 2, and is not repeated here.
As can be seen from the positioning apparatus illustrated in fig. 4, in this embodiment, odometer information is first collected to calculate the position coordinates and direction angles of the sampling points, which are associated with the laser sensor information to generate initial point cloud data containing the position coordinates and direction angles. The initial point cloud data is then matched with historical point cloud data in a preset time period according to a point cloud matching condition to obtain historical point cloud data satisfying the point cloud matching condition, the angle deviation and the distance deviation between the initial point cloud data and the historical point cloud data are calculated, and the point cloud matching degree between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition is calculated from the angle deviation and the distance deviation according to formula (2). Next, the initial point cloud data is modified according to a preset number of step size variations to obtain a preset number of initial reference point cloud data, the point cloud matching degree between each initial reference point cloud data and the historical point cloud data is calculated, and the point cloud data corresponding to the maximum point cloud matching degree is selected from the initial point cloud data and the initial reference point cloud data as the first point cloud data. Second point cloud data is then obtained from the first point cloud data by the same method. If the point cloud matching degree of the first point cloud data is greater than or equal to that of the second point cloud data, the position and posture of the robot are output according to the first point cloud data; otherwise, the positioning optimization direction continues to be searched on the basis of the second point cloud data until the maximum optimization times are reached. In this way, rigid matching of the current point cloud data against the historical point cloud data in different directions is realized, and the best matching direction is found by gradient-descent-style search and used as the positioning optimization direction, so that rapid convergence in the positioning optimization direction is achieved; the mobile robot carrying the microprocessor can therefore position itself quickly and accurately, improving both positioning efficiency and positioning precision.
It should be noted that, in the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
It should be noted that, in the above apparatus embodiment, each included module is only divided according to functional logic, but is not limited to the above division as long as the corresponding function can be implemented; in addition, the specific names of the functional modules are only for convenience of distinguishing from each other and are not used for limiting the protection scope of the present invention.
It will be understood by those skilled in the art that all or part of the steps in the method for implementing the embodiments described above may be implemented by a program instructing associated hardware, and the corresponding program may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (10)
1. A positioning method, characterized in that the positioning method comprises:
acquiring initial point cloud data, wherein the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises position coordinates and direction angles of the sampling points;
calculating the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period;
modifying the initial point cloud data according to a preset number of step size variable quantities to obtain a preset number of initial reference point cloud data, and calculating a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities;
determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data;
modifying the first point cloud data according to the preset number of step size variable quantities to obtain the preset number of first reference point cloud data, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data;
determining point cloud data corresponding to the maximum point cloud matching degree as second point cloud data according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data;
and if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, outputting positioning information according to the first point cloud data, wherein the positioning information comprises the position and the posture of the robot.
2. The method of claim 1, wherein the obtaining initial point cloud data comprises:
acquiring odometer information and laser sensor information of each sampling point;
calculating the position coordinates and the direction angles of each sampling point according to the odometer information;
and associating the position coordinates and the direction angles with the laser sensor information to generate the initial point cloud data.
3. The method according to claim 1, wherein the calculating the point cloud matching degree between the initial point cloud data and the historical point cloud data in a preset time period comprises:
matching the initial point cloud data with historical point cloud data in the preset time period according to a point cloud matching condition to obtain historical point cloud data meeting the point cloud matching condition, wherein the point cloud matching condition comprises the minimum sum of squares of distances between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data;
calculating an angle deviation and a distance deviation between the initial point cloud data and the historical point cloud data meeting the point cloud matching condition, wherein the distance deviation comprises a distance error between each sampling point in the initial point cloud data and a corresponding sampling point in the historical point cloud data meeting the point cloud matching condition; the angle deviation is a central angle deviation between the initial point cloud data and the historical point cloud data, or a combination of the angle deviations between each pair of corresponding sampling points;
calculating the point cloud matching degree between the initial point cloud data and the historical point cloud data meeting the point cloud matching condition according to the following formula:
wherein S is the point cloud matching degree, α is a preset credibility parameter, θ is the angle deviation, β is a preset matching degree parameter, Mi is the number of the sampling points of which the distance errors are smaller than the preset error threshold value, and N is the total number of the sampling points.
4. The positioning method according to any one of claims 1 to 3, wherein if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, after outputting positioning information according to the first point cloud data, the positioning method further comprises:
if the point cloud matching degree of the first point cloud data is smaller than that of the second point cloud data, acquiring the current optimization times;
and if the optimization times do not reach a preset time threshold, identifying the second point cloud data as the first point cloud data, and returning to the step of modifying the first point cloud data according to the preset number of step size variable quantities to obtain the preset number of first reference point cloud data and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data.
5. The method of claim 4, wherein if the point cloud matching degree of the first point cloud data is smaller than the point cloud matching degree of the second point cloud data, after obtaining the current optimization times, the method further comprises:
and if the optimization times reach the time threshold, outputting the positioning information according to the second point cloud data.
6. A positioning device, characterized in that it comprises:
the system comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring initial point cloud data, the initial point cloud data comprises sampling information of a preset number of sampling points, and the sampling information comprises position coordinates and direction angles of the sampling points;
the calculation module is used for calculating the point cloud matching degree between the initial point cloud data and historical point cloud data in a preset time period;
the first modification module is used for modifying the initial point cloud data according to a preset number of step size variable quantities to obtain a preset number of initial reference point cloud data, and calculating a point cloud matching degree between each initial reference point cloud data and the historical point cloud data, wherein the step size variable quantities comprise position coordinate variable quantities and direction angle variable quantities;
the first selection module is used for determining point cloud data corresponding to the maximum point cloud matching degree as first point cloud data according to the point cloud matching degree of the initial point cloud data and the point cloud matching degree of each initial reference point cloud data;
the second modification module is used for modifying the first point cloud data according to the preset number of step size variable quantities to obtain the preset number of first reference point cloud data, and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data;
the second selection module is used for determining point cloud data corresponding to the maximum point cloud matching degree according to the point cloud matching degree of the first point cloud data and the point cloud matching degree of each first reference point cloud data to serve as second point cloud data;
and the first output module is used for outputting positioning information according to the first point cloud data if the point cloud matching degree of the first point cloud data is greater than or equal to the point cloud matching degree of the second point cloud data, wherein the positioning information comprises the position and the posture of the robot.
7. The positioning device of claim 6, wherein the acquisition module comprises:
the acquisition sub-module is used for acquiring the odometer information and the laser sensor information of each sampling point;
the information calculation sub-module is used for calculating the position coordinates and the direction angles of the sampling points according to the odometer information;
and the generation submodule is used for associating the position coordinates and the direction angles with the laser sensor information to generate the initial point cloud data.
8. The positioning device of claim 6, wherein the computing module comprises:
the data matching sub-module is used for matching the initial point cloud data with the historical point cloud data in the preset time period according to a point cloud matching condition to obtain the historical point cloud data satisfying the point cloud matching condition, wherein the point cloud matching condition is that the sum of squared distances between each sampling point in the initial point cloud data and its corresponding sampling point in the historical point cloud data is minimized;
the deviation calculation sub-module is used for calculating an angle deviation and a distance deviation between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition, wherein the distance deviation comprises the distance error between each sampling point in the initial point cloud data and its corresponding sampling point in that historical point cloud data, and the angle deviation is either the central angle deviation between the two point clouds or a combination of the angle deviations between corresponding sampling points;
the matching degree calculation sub-module is used for calculating the point cloud matching degree between the initial point cloud data and the historical point cloud data satisfying the point cloud matching condition according to the following formula,
wherein S is the point cloud matching degree, α is a preset credibility parameter, θ is the angle deviation, β is a preset matching degree parameter, M_i is the distance error of the i-th sampling point, and N is the number of sampling points whose distance error is smaller than the preset error threshold.
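The formula itself appears only as an image in the source and is not reproduced here. The sketch below is therefore an illustrative stand-in assembled from the listed variables only: it rewards a high count N of sampling points under the error threshold and penalises a large angle deviation θ via α and β. The exact expression in the granted claim may differ.

```python
import math

def matching_degree(distance_errors, theta, alpha=1.0, beta=0.5, err_threshold=0.1):
    """Hypothetical matching degree S built from the claim's variables:
    distance_errors are the per-sampling-point errors M_i, theta is the
    angle deviation; alpha (credibility) and beta (matching degree) are
    the preset parameters. NOT the patent's actual formula."""
    if not distance_errors:
        return 0.0
    # N in the claim: sampling points whose distance error is under threshold.
    n = sum(1 for m in distance_errors if m < err_threshold)
    return alpha * math.exp(-beta * abs(theta)) * n / len(distance_errors)
```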
9. The positioning device of any one of claims 6 to 8, further comprising:
the optimization-count obtaining module is used for obtaining the current optimization count if the point cloud matching degree of the first point cloud data is smaller than the point cloud matching degree of the second point cloud data;
and the repeated processing module is used for, if the optimization count has not reached a preset count threshold, taking the second point cloud data as the first point cloud data and returning to the step of modifying the first point cloud data according to the preset number of step-size variations to obtain the preset number of first reference point cloud data and calculating the point cloud matching degree between each first reference point cloud data and the historical point cloud data.
10. The positioning device of claim 9, further comprising:
and the second output module is used for outputting the positioning information according to the second point cloud data if the optimization count reaches the count threshold.
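Claims 9 and 10 bound the hill-climb with an optimization-count threshold: keep replacing the current estimate with a better neighbour until either no neighbour improves (claim 6's first output branch) or the count threshold is hit (claim 10's second output branch). A compact sketch with hypothetical helpers, generic over whatever representation the point cloud takes:

```python
def localize(initial, score, neighbours, max_iters=20):
    """Iterate the modify-and-select step. `score` stands in for the
    matching degree against the historical point cloud data; `neighbours`
    yields the candidates produced by the preset step-size variations;
    `max_iters` is the preset optimization-count threshold."""
    current, current_score = initial, score(initial)
    for _ in range(max_iters):
        best, best_score = current, current_score
        for cand in neighbours(current):
            s = score(cand)
            if s > best_score:
                best, best_score = cand, s
        if best_score <= current_score:
            return current            # no improvement: output from first data
        current, current_score = best, best_score
    return current                    # count threshold reached: output latest
```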
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710147835.3A CN108571967B (en) | 2017-03-13 | 2017-03-13 | Positioning method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108571967A CN108571967A (en) | 2018-09-25 |
CN108571967B true CN108571967B (en) | 2020-06-26 |
Family
ID=63577477
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710147835.3A Active CN108571967B (en) | 2017-03-13 | 2017-03-13 | Positioning method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108571967B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109540142B (en) * | 2018-11-27 | 2021-04-06 | 达闼科技(北京)有限公司 | Robot positioning navigation method and device, and computing equipment |
CN110954108A (en) * | 2019-12-04 | 2020-04-03 | 宁波羽声海洋科技有限公司 | Underwater matching navigation positioning method and device based on ocean current and electronic equipment |
CN111340860B (en) * | 2020-02-24 | 2023-09-19 | 北京百度网讯科技有限公司 | Registration and updating methods, devices, equipment and storage medium of point cloud data |
CN111815706B (en) * | 2020-06-23 | 2023-10-27 | 熵智科技(深圳)有限公司 | Visual identification method, device, equipment and medium for single-item unstacking |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9043069B1 (en) * | 2012-11-07 | 2015-05-26 | Google Inc. | Methods and systems for scan matching approaches for vehicle heading estimation |
CN105806345A (en) * | 2016-05-17 | 2016-07-27 | 杭州申昊科技股份有限公司 | Initialized positioning method for transformer substation inspection robot laser navigation |
CN106023210A (en) * | 2016-05-24 | 2016-10-12 | 百度在线网络技术(北京)有限公司 | Unmanned vehicle, and unmanned vehicle positioning method, device and system |
CN106092104A (en) * | 2016-08-26 | 2016-11-09 | 深圳微服机器人科技有限公司 | The method for relocating of a kind of Indoor Robot and device |
CN106323273A (en) * | 2016-08-26 | 2017-01-11 | 深圳微服机器人科技有限公司 | Robot relocation method and device |
Also Published As
Publication number | Publication date |
---|---|
CN108571967A (en) | 2018-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108571967B (en) | Positioning method and device | |
CN111121754A (en) | Mobile robot positioning navigation method and device, mobile robot and storage medium | |
CN112179353B (en) | Positioning method and device of self-moving robot, robot and readable storage medium | |
CN110501036A (en) | The calibration inspection method and device of sensor parameters | |
CN1705861A (en) | Walker navigation device and program | |
CN110969649A (en) | Matching evaluation method, medium, terminal and device of laser point cloud and map | |
JP6589410B2 (en) | Map generating apparatus and program | |
CN110906924A (en) | Positioning initialization method and device, positioning method and device and mobile device | |
CN114111774B (en) | Vehicle positioning method, system, equipment and computer readable storage medium | |
CN112129282B (en) | Method and device for converting positioning results among different navigation modes | |
CN113932790A (en) | Map updating method, device, system, electronic equipment and storage medium | |
CN112750161A (en) | Map updating method for mobile robot and mobile robot positioning method | |
KR20210083198A (en) | Augmented reality device and positioning method | |
CN115290071A (en) | Relative positioning fusion method, device, equipment and storage medium | |
KR100998709B1 (en) | A method of robot localization using spatial semantics of objects | |
CN101598540B (en) | Three-dimensional positioning method and three-dimensional positioning system | |
CN113203424B (en) | Multi-sensor data fusion method and device and related equipment | |
JP2011511943A (en) | A method for calculating the motion of an object from sensor data with the aid of a computer | |
JP4210763B2 (en) | Method and apparatus for continuous positioning of moving body using both wireless LAN positioning and GPS positioning, and continuous positioning program for moving body | |
Fu et al. | Using RFID and INS for indoor positioning | |
CN111982115A (en) | Feature point map construction method, device and medium based on inertial navigation system | |
CN108955564B (en) | Laser data resampling method and system | |
CN112578369A (en) | Uncertainty estimation method and device, electronic equipment and storage medium | |
JP7214057B1 (en) | DATA PROCESSING DEVICE, DATA PROCESSING METHOD AND DATA PROCESSING PROGRAM | |
Yang et al. | A novel vision localization method of automated micro-polishing robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||
CP03 | Change of name, title or address | Address after: B501, Building F2, TCL Science Park, No. 1001, Zhongshanyuan Road, Shuguang Community, Xili Street, Nanshan District, Shenzhen City, Guangdong Province, 518000. Patentee after: LAUNCH DIGITAL TECHNOLOGY Co.,Ltd. Address before: 518000 Third Floor, Fengyun Building, Galaxy, No. 5 Xinxi Road, North District, Nanshan High-tech Park, Shenzhen City, Guangdong Province. Patentee before: LAUNCH DIGITAL TECHNOLOGY Co.,Ltd. |