CN114136316A - Inertial navigation error elimination method based on point cloud characteristic points, chip and robot - Google Patents


Info

Publication number
CN114136316A
CN114136316A (application CN202111453891.2A)
Authority
CN
China
Prior art keywords
point cloud
angle
characteristic points
inertial navigation
error
Prior art date
Legal status: Pending (the status is an assumption, not a legal conclusion)
Application number
CN202111453891.2A
Other languages
Chinese (zh)
Inventor
严勇显
赖钦伟
Current Assignee
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202111453891.2A
Publication of CN114136316A

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 — Navigation by using measurements of speed or acceleration
    • G01C21/12 — Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 — Navigation by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/183 — Compensation of inertial measurements, e.g. for temperature effects
    • A — HUMAN NECESSITIES
    • A47 — FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L — DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 — Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40 — Parts or details of machines not provided for in groups A47L11/02-A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4061 — Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/70 — Determining position or orientation of objects or cameras
    • A — HUMAN NECESSITIES
    • A47 — FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L — DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00 — Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 — Automatic control of the travelling movement; Automatic obstacle detection
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10028 — Range image; Depth image; 3D point clouds

Abstract

The invention discloses a method, a chip and a robot for eliminating inertial navigation errors based on point cloud feature points. Point cloud data are collected, and the feature points of the point cloud data are extracted and stored as a reference point set. When the same point cloud feature point is detected again, the pose transformation of the feature point between the two observations serves as a reference and is compared with the pose transformation reported by inertial navigation, yielding the deviation between the robot's estimated and actual positions; this deviation is then used to correct the inertial navigation error, improving the robot's navigation accuracy. Because the point cloud feature points comprise only the key points that describe the point cloud data, they are few in number, occupy little memory and are fast to process, which makes them suitable for long-term storage and for correcting the errors the robot accumulates over long operation.

Description

Inertial navigation error elimination method based on point cloud characteristic points, chip and robot
Technical Field
The invention relates to the technical field of navigation, in particular to a method, a chip and a robot for eliminating inertial navigation errors based on point cloud characteristic points.
Background
Inertial navigation is a low-cost and practical navigation method for intelligent mobile robots, but it has a prominent drawback: low navigation accuracy. Gyroscope drift and encoder drift are the main causes. While a sweeping robot operates, factors such as temperature changes or wheel slippage introduce errors into the gyroscope and the encoder, and if these errors are not corrected the robot gradually deviates from its route.
Disclosure of Invention
To solve these problems, the invention provides a method, a chip and a robot for eliminating inertial navigation errors based on point cloud feature points. The inertial navigation error can be corrected quickly through the point cloud feature points, which makes the method particularly suitable for correcting errors the robot accumulates over long operation and for improving its navigation accuracy. The specific technical scheme of the invention is as follows:
the inertial navigation error elimination method based on the point cloud characteristic points comprises the following steps: step S1, the mobile robot randomly collects point cloud data of an area, extracts point cloud characteristic points of the point cloud data and stores the point cloud characteristic points as a reference point set; step S2, in the walking process, if the mobile robot detects point cloud characteristic points which are the same as the point cloud characteristic points in the reference point set, matching the characteristic points to obtain pose transformation of the characteristic points; and step S3, the mobile robot calculates the pose transformation of inertial navigation, then the pose transformation of the characteristic points and the pose transformation of the inertial navigation are subtracted to obtain the inertial navigation error and judge, and if the inertial navigation error is larger than a threshold value, the inertial navigation error is corrected.
Further, in step S1, the mobile robot collects the point cloud data through a 3D-TOF sensor.
Further, step S1 also includes: after collecting point cloud data, the mobile robot checks the amount collected; if the number of points is smaller than a preset value, it collects point cloud data from the next area until the number exceeds the preset value, and only then extracts the point cloud feature points.
Further, in step S2, the mobile robot detects point cloud feature points identical to those in the reference point set as follows: it extracts the point cloud feature points of the currently collected point cloud data, builds descriptors for them, and compares these with the descriptors of the reference point set; if they are the same, the feature points of the current data are the same as those in the reference point set. The reference point set stores both the descriptors and the point cloud feature points corresponding to them.
Further, the pose transformation of inertial navigation in step S3 includes the angle transformation of the gyroscope, the angle transformation of the encoder and the displacement transformation of the encoder. The angle transformation of the gyroscope is the difference between the gyroscope's current angle and its angle at the last correction; the angle transformation of the encoder is obtained by integrating the angle increments recorded by the code disc in the encoder; the displacement transformation of the encoder is obtained by integrating the displacement increments recorded by the code disc.
Further, the inertial navigation error in step S3 includes the angle error of the gyroscope, the angle error of the encoder and the displacement error of the encoder. The angle error of the gyroscope is the difference between the angle transformation of the feature points and that of the gyroscope; the angle error of the encoder is the difference between the angle transformation of the feature points and that of the encoder; the displacement error of the encoder is the difference between the displacement transformation of the feature points and that of the encoder. The angle transformation and displacement transformation of the feature points together constitute the pose transformation of the feature points.
Further, the method for correcting the inertial navigation data in step S3 is: when the angle error of the gyroscope exceeds its threshold, update the gyroscope's current angle to the sum of its current angle and the angle error; when the angle error of the encoder exceeds its threshold, update the angle recorded by the code disc to the sum of the recorded angle and the angle error; when the displacement error of the encoder exceeds its threshold, update the displacement recorded by the code disc to the sum of the recorded displacement and the displacement error.
A chip stores a computer program which, when executed, implements the steps of the above inertial navigation error elimination method based on point cloud feature points.
A mobile robot is equipped with the chip.
The invention has the following beneficial effects. Compared with the prior art, the method collects point cloud data through a 3D-TOF sensor, extracts the feature points of the point cloud data and stores them as a reference point set. When the same point cloud feature point is detected again, the pose transformation of the feature point between the two observations serves as a reference and is compared with the pose transformation reported by inertial navigation, yielding the deviation between the robot's estimated and actual positions; this deviation is then used to correct the inertial navigation error, improving the robot's navigation accuracy. Because the point cloud feature points comprise only the key points that describe the point cloud data, they are few in number, occupy little memory and are fast to process, which makes them suitable for long-term storage and for correcting the errors the robot accumulates over long operation.
Drawings
Fig. 1 is a flowchart of an inertial navigation error elimination method based on point cloud feature points according to an embodiment of the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for describing particular embodiments only and is not intended to limit the example embodiments of the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise; and the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components and/or combinations thereof.
In prior-art inertial navigation sweeping robots, a code disc in the odometer counts the rotations of the driving wheels. If a driving wheel slips or the robot bumps over uneven floor media, the code disc count accumulates, over time, a large error relative to the distance the robot actually travels, so the calculated pose of the robot drifts; the drift shows up directly in the map the robot builds in real time, which becomes skewed and can no longer be matched against the original map. Even when a lidar or a vision camera performs real-time scan positioning, wheel spin, slippage and similar causes can put a large deviation into the positioning result, and in severe cases the robot cannot relocalize itself and stops.
Therefore, as shown in fig. 1, an embodiment of the present invention provides a method for eliminating inertial navigation errors based on point cloud feature points, which is particularly suitable for correcting the accumulated error of a robot's inertial navigation. The method comprises the following steps:
step S1, the mobile robot randomly collects point cloud data of an area, extracts point cloud characteristic points of the point cloud data and stores the point cloud characteristic points as a reference point set. In step S1, the mobile robot collects point cloud data and then determines that the point cloud data is collected, and if the number of the point cloud data is smaller than a preset value, the point cloud data of the next area is collected until the number of the point cloud data is larger than the preset value, and then point cloud feature points are extracted, otherwise, representative point cloud feature points may not be extracted due to too small amount of point cloud data, or only a small amount of point cloud feature points may be extracted, which is not beneficial to subsequently detecting whether the newly obtained point cloud feature points are the same as the reference point set. Preferably, the mobile robot acquires point cloud data by means of a 3D-TOF sensor. It should be noted that the method for extracting the point cloud Feature points includes any one of NARF (Normal Aligned Radial Feature), SIFT (Scale Invariant Feature Transform), or an extraction method based on a neural network, and the obtained Feature points include position, Scale, and direction information.
Step S2: during travel, if the mobile robot detects point cloud feature points identical to those in the reference point set, it matches the feature points to obtain their pose transformation. Note that the descriptors corresponding to the point cloud feature points are also stored in the reference point set. Besides the position, scale and direction of a feature point, its descriptor encodes the surrounding points that contribute to the feature, so that the descriptor remains stable under changes such as illumination or viewing angle. A descriptor is also highly distinctive, which raises the probability of matching feature points correctly. Building a descriptor proceeds roughly as follows: (1) partition the region around the point cloud feature point by a preset size to determine the area over which the descriptor is computed; (2) rotate the coordinate axes into the direction of the feature point to ensure rotation invariance; (3) generate a gradient histogram and apply trilinear interpolation to obtain the feature vector of the feature point; (4) normalize the feature vector; (5) clip gradient values above a set threshold, then normalize again, yielding a highly distinctive descriptor.
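Steps (4) and (5) — normalize the feature vector, clip overly large components at a threshold, and normalize again — can be sketched as follows (a generic SIFT-style normalization; the 0.2 clip threshold is an assumed value, not one the patent fixes):

```python
import math

def normalize_descriptor(vec, clip=0.2):
    """Normalize `vec` to unit length, clip components above `clip`
    (suppressing dominant gradients), then renormalize: steps (4)-(5)
    of the descriptor construction."""
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    clipped = [min(v / norm, clip) for v in vec]
    norm = math.sqrt(sum(v * v for v in clipped)) or 1.0
    return [v / norm for v in clipped]
```

The clipping is what gives the descriptor robustness to illumination change: a few very strong gradients can no longer dominate the comparison.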
While executing step S2, the mobile robot detects point cloud feature points identical to the reference point set as follows: it extracts the feature points of the currently collected point cloud data, builds their descriptors, and compares them with the descriptors of the reference point set; if they are the same, the feature points of the current data are the same as those in the reference point set. Because descriptors are highly distinctive (or representative), using them to decide whether the current feature points match the reference point set greatly improves the accuracy of the decision and avoids the harm of later correcting inertial navigation with wrongly matched data.
While executing step S2, the mobile robot matches feature points with a PnP (Perspective-N-Point) algorithm to obtain the pose transformation of the feature points. Preferably N is 3, i.e. three pairs of feature points are used in the matching computation. Conventional point cloud registration requires two complete point clouds, whereas feature point matching needs only N pairs of points, so the computational load drops greatly. In addition, since the feature points are few, they demand little memory and can be stored in the robot long term; together with the distinctiveness of the descriptors, this makes them particularly suitable for correcting the errors the robot accumulates over long operation. The pose transformation of the feature points comprises their angle transformation and displacement transformation, i.e. the angle difference and displacement difference between the pose at which the mobile robot observes the point cloud feature points the second time and the pose at which it first collected them into the reference point set.
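A full PnP solver needs camera intrinsics and 3D-to-2D correspondences, so as an illustrative stand-in only, the sketch below recovers the angle and displacement difference between two observations of the same feature points in the 2-D case by a least-squares rigid fit (all names hypothetical; this is not the patent's solver):

```python
import math

def rigid_transform_2d(src, dst):
    """Least-squares rotation angle and translation mapping matched
    points `src` onto `dst` (a simplified 2-D stand-in for the
    PnP matching in step S2)."""
    n = len(src)
    cx_s = sum(p[0] for p in src) / n; cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n; cy_d = sum(p[1] for p in dst) / n
    # accumulate cross-covariance terms of the centred point sets
    a = b = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cx_s; ys -= cy_s; xd -= cx_d; yd -= cy_d
        a += xs * xd + ys * yd          # cosine component
        b += xs * yd - ys * xd          # sine component
    theta = math.atan2(b, a)
    # translation = dst centroid minus rotated src centroid
    tx = cx_d - (math.cos(theta) * cx_s - math.sin(theta) * cy_s)
    ty = cy_d - (math.sin(theta) * cx_s + math.cos(theta) * cy_s)
    return theta, (tx, ty)
```

Three point pairs suffice for this fit, echoing the preferred N = 3 above.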
Step S3: the mobile robot calculates the pose transformation of inertial navigation, subtracts it from the pose transformation of the feature points to obtain the inertial navigation error, and, if the error is larger than a threshold, corrects it. The pose transformation of inertial navigation includes the angle transformation of the gyroscope, the angle transformation of the encoder and the displacement transformation of the encoder. The angle transformation of the gyroscope is the difference between the gyroscope's current angle and its angle at the last correction; the angle transformation of the encoder is obtained by integrating the angle increments recorded by the code disc in the encoder; the displacement transformation of the encoder is obtained by integrating the displacement increments recorded by the code disc.
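Under the definitions above, the pose transformation of inertial navigation might be computed as in this minimal sketch (hypothetical names; the increment lists stand in for the code-disc records):

```python
def inertial_pose_transform(gyro_now, gyro_last_corrected,
                            tick_angles, tick_displacements):
    """Pose change reported by inertial navigation (step S3):
    gyro angle change since the last correction, plus the encoder
    angle and displacement integrated from code-disc increments."""
    gyro_delta = gyro_now - gyro_last_corrected
    encoder_angle = sum(tick_angles)          # integrate recorded angles
    encoder_disp = sum(tick_displacements)    # integrate recorded displacements
    return gyro_delta, encoder_angle, encoder_disp
```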
When inertial navigation carries no error, the pose transformation of the feature points and that of inertial navigation are the same, or differ by a negligible amount. After the robot has run for a long time, however, the inertial navigation data inevitably accumulate error. In step S3, the inertial navigation error includes the angle error of the gyroscope, the angle error of the encoder and the displacement error of the encoder: the angle error of the gyroscope is the difference between the angle transformation of the feature points and that of the gyroscope; the angle error of the encoder is the difference between the angle transformation of the feature points and that of the encoder; the displacement error of the encoder is the difference between the displacement transformation of the feature points and that of the encoder.
In step S3, the mobile robot corrects the inertial navigation data as follows: when the angle error of the gyroscope exceeds its threshold, update the gyroscope's current angle to the sum of its current angle and the angle error; when the angle error of the encoder exceeds its threshold, update the angle recorded by the code disc to the sum of the recorded angle and the angle error; when the displacement error of the encoder exceeds its threshold, update the displacement recorded by the code disc to the sum of the recorded displacement and the displacement error. As described above, the mobile robot evaluates the gyroscope's angle error and the encoder's angle and displacement errors, and corrects whichever value deviates seriously. Note that the invention takes the pose transformation of the 3D-TOF point cloud feature points as the reference, i.e. it is assumed to be accurate, with no error or a negligible one. On this basis, comparing the pose transformation of inertial navigation with that of the 3D-TOF point cloud feature points reveals whether the gyroscope and encoder deviate seriously in angle or position, and the difference between the two transformations is precisely the inertial navigation error. Summing the inertial navigation error with the corresponding data therefore eliminates the error.
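The correction rule — difference the two pose transformations and, if the error exceeds the threshold, sum it onto the current reading — can be sketched as follows for any one of the three quantities (hypothetical names; testing the error magnitude with `abs` is an assumption about how "larger than a threshold" is evaluated):

```python
def correct_reading(current, feature_delta, inertial_delta, threshold):
    """Correct one inertial reading (gyro angle, encoder angle, or
    encoder displacement) in step S3: the error is the feature-point
    pose change minus the inertial pose change; if its magnitude
    exceeds the threshold, the error is summed onto the reading."""
    error = feature_delta - inertial_delta
    if abs(error) > threshold:
        return current + error
    return current
```

Applied once each to the gyroscope angle, the code-disc angle and the code-disc displacement, this reproduces the three update rules above.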
Compared with the prior art, the method collects point cloud data through a 3D-TOF sensor, extracts the feature points of the point cloud data and stores them as a reference point set. When the same point cloud feature point is detected again, the pose transformation of the feature point between the two observations serves as a reference and is compared with the pose transformation reported by inertial navigation, yielding the deviation between the robot's estimated and actual positions; this deviation is then used to correct the inertial navigation error, improving the robot's navigation accuracy. Because the point cloud feature points comprise only the key points that describe the point cloud data, they are few in number, occupy little memory and are fast to process, which makes them suitable for long-term storage and for correcting the errors the robot accumulates over long operation.
The invention also discloses a chip storing computer program code which, when executed, implements the steps of the above inertial navigation error elimination method based on point cloud feature points. Any process or method description in the foregoing steps, or otherwise described herein, may be understood as representing a module, segment or portion of code comprising one or more executable instructions that implement specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention also covers implementations in which functions are executed out of the order shown or discussed, including substantially simultaneously or in reverse order depending on the functions involved; for example, steps S2 and S3 may be executed concurrently, or the inertial navigation pose transformation of step S3 may be calculated before the pose transformation of the point cloud feature points. This will be understood by those skilled in the art to which the embodiments of the present invention pertain.
The invention also discloses a mobile robot. Preferably, the mobile robot is equipped with a 3D-TOF sensor used to collect point cloud data, from which the feature points are extracted and stored as a reference point set. When the same point cloud feature point is detected again, the pose transformation of the feature point between the two observations serves as a reference and is compared with the pose transformation reported by inertial navigation, yielding the deviation between the robot's estimated and actual positions; this deviation is then used to correct the inertial navigation error, improving the robot's navigation accuracy.
Obviously, the above-mentioned embodiments are only a part of embodiments of the present invention, not all embodiments, and the technical solutions of the embodiments may be combined with each other. Furthermore, if terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., appear in the embodiments, their indicated orientations or positional relationships are based on those shown in the drawings only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred devices or elements must have a specific orientation or be constructed and operated in a specific orientation, and thus, should not be construed as limiting the present invention. If the terms "first", "second", "third", etc. appear in the embodiments, they are for convenience of distinguishing between related features, and they are not to be construed as indicating or implying any relative importance, order or number of features.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. An inertial navigation error elimination method based on point cloud feature points, characterized by comprising the following steps:
step S1: the mobile robot collects point cloud data of an arbitrary area, extracts the point cloud feature points of the data and stores them as a reference point set;
step S2: during travel, if the mobile robot detects point cloud feature points identical to those in the reference point set, matching the feature points to obtain the pose transformation of the feature points;
step S3: the mobile robot calculates the pose transformation of inertial navigation, subtracts it from the pose transformation of the feature points to obtain the inertial navigation error, and, if the error is larger than a threshold, corrects it.
2. The method for eliminating inertial navigation errors based on point cloud feature points of claim 1, wherein in step S1 the mobile robot acquires the point cloud data through a 3D-TOF sensor.
3. The method for eliminating inertial navigation errors based on point cloud feature points of claim 2, wherein step S1 further includes: after collecting point cloud data, the mobile robot checks the amount collected; if the number of points is smaller than a preset value, it collects point cloud data from the next area until the number exceeds the preset value, and only then extracts the point cloud feature points.
4. The method for eliminating inertial navigation errors based on point cloud feature points of claim 3, wherein in step S2 the mobile robot detects point cloud feature points identical to the reference point set as follows:
the mobile robot extracts the point cloud feature points of the currently collected point cloud data, builds their descriptors, and compares them with the descriptors of the reference point set; if they are the same, the feature points of the current data are the same as those in the reference point set; and the reference point set stores both the descriptors and the point cloud feature points corresponding to them.
5. The inertial navigation error elimination method based on point cloud feature points of claim 1, wherein the pose transformation of inertial navigation in step S3 includes the angle transformation of a gyroscope, the angle transformation of an encoder, and the displacement transformation of the encoder; wherein,
the angle transformation of the gyroscope is obtained by making a difference between the current angle of the gyroscope and the last corrected angle of the gyroscope;
the angle conversion of the encoder is obtained by integrating the angle recorded by a code disc in the current encoder;
the displacement transformation of the encoder is obtained by integrating the displacement recorded by the code scale in the current encoder.
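The three transformations of claim 5 can be sketched as simple difference and integration steps. The function names, the sample-based rectangular integration, and the fixed time step `dt` are assumptions for illustration; the patent only states that the code-disc quantities are integrated.

```python
def gyro_angle_transform(current_angle, last_corrected_angle):
    # angle transformation of the gyroscope: difference between the
    # current angle and the angle at the last correction
    return current_angle - last_corrected_angle

def encoder_angle_transform(rate_samples, dt):
    # angle transformation of the encoder: numerical integration of
    # angular-rate samples recorded by the code disc over the interval
    return sum(r * dt for r in rate_samples)

def encoder_displacement_transform(speed_samples, dt):
    # displacement transformation of the encoder: numerical integration
    # of speed samples recorded by the code disc over the interval
    return sum(v * dt for v in speed_samples)
```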
6. The inertial navigation error elimination method based on point cloud characteristic points according to claim 5, wherein the inertial navigation error in step S3 includes the angle error of the gyroscope, the angle error of the encoder, and the displacement error of the encoder; wherein:
the angle error of the gyroscope is obtained by taking the difference between the angle transformation of the characteristic points and the angle transformation of the gyroscope;
the angle error of the encoder is obtained by taking the difference between the angle transformation of the characteristic points and the angle transformation of the encoder;
the displacement error of the encoder is obtained by taking the difference between the displacement transformation of the characteristic points and the displacement transformation of the encoder;
the angle transformation of the characteristic points and the displacement transformation of the characteristic points belong to the pose transformation of the characteristic points.
7. The inertial navigation error elimination method based on point cloud characteristic points according to claim 6, wherein the method for correcting the inertial navigation data in step S3 includes:
when the angle error of the gyroscope is larger than a threshold value, updating the current angle of the gyroscope to be the sum of the current angle of the gyroscope and the angle error of the gyroscope;
when the angle error of the encoder is larger than a threshold value, updating the angle recorded by the current code disc to the sum of the angle recorded by the current code disc and the angle error of the encoder;
and when the displacement error of the encoder is larger than the threshold value, updating the displacement recorded by the current code disc to the sum of the displacement recorded by the current code disc and the displacement error of the encoder.
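Claims 6 and 7 together can be sketched as one error-and-correction step. The function signature is an assumption, and comparing the error's magnitude against the threshold is also an assumption (the claims say only "larger than a threshold value"); the update rule itself (add the error to the stored value) follows the claims.

```python
def correct_inertial_navigation(feature_angle_delta, feature_disp_delta,
                                gyro_angle, gyro_angle_delta,
                                disc_angle, disc_angle_delta,
                                disc_disp, disc_disp_delta,
                                threshold):
    """Each inertial navigation error is the characteristic-point
    transformation minus the corresponding inertial transformation;
    when an error exceeds the threshold, the stored value is updated
    by adding the error to it (claims 6-7)."""
    gyro_err = feature_angle_delta - gyro_angle_delta
    angle_err = feature_angle_delta - disc_angle_delta
    disp_err = feature_disp_delta - disc_disp_delta
    if abs(gyro_err) > threshold:
        gyro_angle += gyro_err      # corrected gyroscope angle
    if abs(angle_err) > threshold:
        disc_angle += angle_err     # corrected code-disc angle
    if abs(disp_err) > threshold:
        disc_disp += disp_err       # corrected code-disc displacement
    return gyro_angle, disc_angle, disc_disp
```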
8. A chip storing a computer program, wherein the computer program, when executed, implements the steps of the inertial navigation error elimination method based on point cloud characteristic points according to any one of claims 1 to 7.
9. A mobile robot, characterized in that it is equipped with a chip according to claim 8.
CN202111453891.2A 2021-12-01 2021-12-01 Inertial navigation error elimination method based on point cloud characteristic points, chip and robot Pending CN114136316A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111453891.2A CN114136316A (en) 2021-12-01 2021-12-01 Inertial navigation error elimination method based on point cloud characteristic points, chip and robot


Publications (1)

Publication Number Publication Date
CN114136316A true CN114136316A (en) 2022-03-04

Family

ID=80386972

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111453891.2A Pending CN114136316A (en) 2021-12-01 2021-12-01 Inertial navigation error elimination method based on point cloud characteristic points, chip and robot

Country Status (1)

Country Link
CN (1) CN114136316A (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170083662A (en) * 2016-01-08 2017-07-19 서울대학교산학협력단 Map building apparatus being robust in sensor error
US20190235083A1 (en) * 2016-03-11 2019-08-01 Kaarta, Inc. Laser scanner with real-time, online ego-motion estimation
US20180284802A1 (en) * 2016-08-29 2018-10-04 PerceptIn, Inc. Fault Tolerance to Provide Robust Tracking for Autonomous Positional Awareness
US10390003B1 (en) * 2016-08-29 2019-08-20 Perceptln Shenzhen Limited Visual-inertial positional awareness for autonomous and non-autonomous device
CN110906950A (en) * 2018-09-14 2020-03-24 高德软件有限公司 Calibration method and device for laser coordinate system and inertial navigation coordinate system and storage medium
CN110689622A (en) * 2019-07-05 2020-01-14 电子科技大学 Synchronous positioning and composition algorithm based on point cloud segmentation matching closed-loop correction
WO2021057742A1 (en) * 2019-09-27 2021-04-01 Oppo广东移动通信有限公司 Positioning method and apparatus, device, and storage medium
CN110873883A (en) * 2019-11-29 2020-03-10 上海有个机器人有限公司 Positioning method, medium, terminal and device integrating laser radar and IMU
CN110954134A (en) * 2019-12-04 2020-04-03 上海有个机器人有限公司 Gyro offset correction method, correction system, electronic device, and storage medium
WO2021233309A1 (en) * 2020-05-21 2021-11-25 杭州海康威视数字技术股份有限公司 Extrinsic parameter change detection method and apparatus, electronic device, and detection system
CN113701745A (en) * 2020-05-21 2021-11-26 杭州海康威视数字技术股份有限公司 External parameter change detection method and device, electronic equipment and detection system
CN112051590A (en) * 2020-08-31 2020-12-08 广州文远知行科技有限公司 Detection method and related device for laser radar and inertial measurement unit
CN112595323A (en) * 2020-12-08 2021-04-02 深圳市优必选科技股份有限公司 Robot and drawing establishing method and device thereof
CN112859860A (en) * 2021-01-13 2021-05-28 宁波工业互联网研究院有限公司 Robot system and path planning method thereof
CN113587934A (en) * 2021-07-30 2021-11-02 深圳市普渡科技有限公司 Robot, indoor positioning method and device and readable storage medium

Similar Documents

Publication Publication Date Title
US9625912B2 (en) Methods and systems for mobile-agent navigation
CN108303103B (en) Method and device for determining target lane
CN111912416B (en) Method, device and equipment for positioning equipment
CN112179330B (en) Pose determination method and device of mobile equipment
US9157757B1 (en) Methods and systems for mobile-agent navigation
CN109074085B (en) Autonomous positioning and map building method and device and robot
US20160068114A1 (en) Methods and Systems for Mobile-Agent Navigation
CN112667837A (en) Automatic image data labeling method and device
CN110909711B (en) Method, device, electronic equipment and storage medium for detecting lane line position change
JP2020508506A (en) Position estimation using 2D maps that capture vertical structures in 3D point data
JP5385105B2 (en) Image search method and system
CN110827353B (en) Robot positioning method based on monocular camera assistance
CN111427904A (en) High-precision map data updating method and device and electronic equipment
KR20170083662A (en) Map building apparatus being robust in sensor error
CN109739830B (en) Position fingerprint database rapid construction method based on crowdsourcing data
CN113298026A (en) Lane line determination method and system, vehicle, and storage medium
CN114136316A (en) Inertial navigation error elimination method based on point cloud characteristic points, chip and robot
CN111623783A (en) Initial positioning method, visual navigation equipment and warehousing system
CN115683170A (en) Calibration method based on radar point cloud data fusion error
WO2022252482A1 (en) Robot, and environment map construction method and apparatus therefor
CN115077563A (en) Vehicle positioning accuracy evaluation method and device and electronic equipment
CN114037977A (en) Road vanishing point detection method, device, equipment and storage medium
CN113566824A (en) Vehicle positioning method and device, electronic equipment and storage medium
CN117044478B (en) Mower control method and device, mower, electronic equipment and storage medium
Ma et al. FMCW Radar on LiDAR map localization in structural urban environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination