CN112747746A - Point cloud data acquisition method based on single-point TOF, chip and mobile robot - Google Patents

Point cloud data acquisition method based on single-point TOF, chip and mobile robot

Info

Publication number
CN112747746A
Authority
CN
China
Prior art keywords
tof
coordinates
robot
mobile robot
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011559909.2A
Other languages
Chinese (zh)
Inventor
陈卓标
周和文
黄惠保
杨武
赖钦伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Amicro Semiconductor Co Ltd
Original Assignee
Zhuhai Amicro Semiconductor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Amicro Semiconductor Co Ltd filed Critical Zhuhai Amicro Semiconductor Co Ltd
Priority to CN202011559909.2A priority Critical patent/CN112747746A/en
Publication of CN112747746A publication Critical patent/CN112747746A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a point cloud data acquisition method based on single-point TOF, a chip and a mobile robot. The method comprises the following steps: S1: the mobile robot sets basic coordinates and obtains IMU data and TOF data; S2: the mobile robot determines the current robot coordinates based on the basic coordinates and the IMU data; S3: the mobile robot obtains TOF coordinates based on the current robot coordinates; S4: the mobile robot determines point cloud data based on the current robot coordinates, the TOF coordinates and the TOF data. By setting basic coordinates, the mobile robot obtains its own coordinates while moving, and then obtains the coordinates of a series of obstacles relative to the basic coordinates from those coordinates and the detection data, giving high accuracy; using a TOF sensor as the detection sensor keeps the cost low. The obtained point cloud data can be used for mapping, localization and obstacle avoidance of the robot according to the actual situation, and has high practicability.

Description

Point cloud data acquisition method based on single-point TOF, chip and mobile robot
Technical Field
The invention relates to the technical field of electronics, in particular to a point cloud data acquisition method based on single-point TOF, a chip and a mobile robot.
Background
Before moving, a mobile robot first builds a map and localizes itself, and then moves according to the built map and its own position. At present, existing robots use vision, lidar or inertial navigation for mapping and localization. If only a few sensors are installed, the detection data obtained can be used for mapping, localization or edgewise movement of the robot only through complex calculation, and the computational difficulty is high; if many sensors are installed, the production cost is too high.
Disclosure of Invention
In order to solve the problems, the invention provides a point cloud data acquisition method based on single-point TOF, a chip and a mobile robot. The specific technical scheme of the invention is as follows:
A point cloud data acquisition method based on single-point TOF comprises the following steps: S1: the mobile robot sets basic coordinates and obtains IMU data and TOF data; S2: the mobile robot determines the current robot coordinates based on the basic coordinates and the IMU data; S3: the mobile robot obtains TOF coordinates based on the current robot coordinates; S4: the mobile robot determines point cloud data based on the current robot coordinates, the TOF coordinates and the TOF data. By setting basic coordinates, the mobile robot obtains its own coordinates while moving, and then obtains the coordinates of a series of obstacles relative to the basic coordinates from those coordinates and the detection data, giving high accuracy; using a TOF sensor as the detection sensor keeps the cost low. The obtained point cloud data can be used for mapping, localization and obstacle avoidance of the robot according to the actual situation, and has high practicability.
In one or more aspects of the present invention, in step S1, the basic coordinates are based on the center position of the mobile robot at the start of operation. Setting the basic coordinates when the robot starts working improves the accuracy of the data it acquires.
In one or more aspects of the present invention, in step S2, the robot coordinates are the coordinates of the center position of the mobile robot; the mobile robot establishes a world coordinate system with the basic coordinates as the origin, and the current robot coordinates are calculated from the IMU data acquired during movement.
In one or more aspects of the present invention, in step S3, the TOF coordinates are the coordinates of the TOF module. The mobile robot acquires the position of the TOF module relative to the robot center in advance, then establishes a coordinate system with the robot coordinates as the origin and the direction directly in front of the robot as the x-axis, and the TOF coordinates are obtained from the robot coordinates and this position information.
In one or more aspects of the present invention, in step S4, the TOF data is the measured distance between the TOF module and the obstacle. The mobile robot obtains the coordinates of the obstacle from the robot coordinates, the TOF coordinates and the TOF data through a calculation formula, and the obstacle coordinates serve as the point cloud data obtained by the mobile robot. Determining the coordinates of the TOF module from its preset position on the robot and the data acquired during movement gives high accuracy and high practicability.
In one or more aspects of the present invention, the calculation formula is: ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ + tθ); oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ + tθ); where the robot coordinates are (rx, ry, rθ), the TOF coordinates are (tx, ty, tθ), the TOF data is d, and the obstacle coordinates are (ox, oy).
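For illustration, a minimal Python sketch of this calculation follows; the function name and argument layout are assumptions of this write-up, not part of the disclosure:

    import math

    def obstacle_point(robot_pose, tof_pose, d):
        """Obstacle coordinates (ox, oy) from the robot pose (rx, ry, rtheta),
        the TOF module pose (tx, ty, ttheta) in the robot frame, and the TOF distance d."""
        rx, ry, rtheta = robot_pose
        tx, ty, ttheta = tof_pose
        # Rotate the TOF mounting offset into the world frame, translate by the robot
        # position, then project the measured distance along the sensor's absolute heading.
        ox = math.cos(rtheta) * tx - math.sin(rtheta) * ty + rx + d * math.cos(rtheta + ttheta)
        oy = math.sin(rtheta) * tx + math.cos(rtheta) * ty + ry + d * math.sin(rtheta + ttheta)
        return ox, oy

The first two terms rotate and translate the TOF module's mounting offset into the world frame; the last term projects the measured distance along the sensor's absolute heading rθ + tθ.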
A mobile robot comprises a main body provided with a TOF module and an IMU module; the TOF module is used to detect the distance between the TOF module and an obstacle, and the IMU module is used to acquire IMU data. The robot acquires data through the TOF module and the IMU module, with a simple structure and low production cost.
In one or more aspects of the present invention, the IMU module includes a six-axis gyroscope and a code wheel.
A chip has a built-in control program, and the control program is used to control a robot to execute the above point cloud data acquisition method based on single-point TOF. A robot can use the method by being fitted with the chip, which is highly practical.
A mobile robot is equipped with a main control chip, and the main control chip is the above chip. The robot acquires point cloud data through the method, which reduces the production cost of the robot.
Drawings
FIG. 1 is a schematic flow chart of a point cloud data acquisition method based on single point TOF according to the present invention;
fig. 2 is a schematic structural view of coordinates of the mobile robot of the present invention;
fig. 3 is a schematic structural diagram of the mobile robot of the present invention.
Detailed Description
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout.
In the description of the present invention, it should be noted that orientation terms such as "central", "lateral", "longitudinal", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise" and "counterclockwise" indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience in describing the invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore do not limit the scope of protection of the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the invention, "at least" means one or more unless specifically defined otherwise.
In the present invention, unless otherwise expressly specified or limited, the terms "assembled" and "connected" are to be construed broadly: for example, as a fixed connection, a detachable connection or an integral connection; as a mechanical connection; as a direct connection between two elements or a connection through an intermediate medium; or as internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
In the present invention, unless otherwise specified and limited, a first feature being "above" or "below" a second feature may mean that the two features are in direct contact, or that they are not in direct contact but are in contact through another feature between them. Moreover, the first feature being "above", "over" or "on top of" the second feature includes the first feature being directly above or obliquely above the second feature, or merely indicates that the first feature is at a higher level than the second feature; the first feature being "below", "under" or "beneath" the second feature includes the first feature being directly below or obliquely below the second feature, or merely indicates that the first feature is at a lower level than the second feature.
The technical scheme and beneficial effects of the invention are made clearer by further describing specific embodiments of the invention with reference to the accompanying drawings. The embodiments described below are exemplary and are intended to illustrate the invention, but are not to be construed as limiting it.
Referring to fig. 1, a point cloud data acquisition method based on single-point TOF includes the following steps: S1: the mobile robot sets basic coordinates and obtains IMU data and TOF data; S2: the mobile robot determines the current robot coordinates based on the basic coordinates and the IMU data; S3: the mobile robot obtains TOF coordinates based on the current robot coordinates; S4: the mobile robot determines point cloud data based on the current robot coordinates, the TOF coordinates and the TOF data. By setting basic coordinates, the mobile robot obtains its own coordinates while moving, and then obtains the coordinates of a series of obstacles relative to the basic coordinates from those coordinates and the detection data, giving high accuracy; using a TOF sensor as the detection sensor keeps the cost low. The obtained point cloud data can be used for mapping, localization and obstacle avoidance of the robot according to the actual situation, and has high practicability.
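As an illustrative sketch only, the flow of fig. 1 might be tied together roughly as follows; the sensor interfaces imu.read() and tof.read_distance() are hypothetical placeholders rather than the disclosed implementation, and a later sketch expands the per-step pose update:

    import math

    def collect_point_cloud(imu, tof, tof_pose, steps):
        """Sketch of steps S1-S4: accumulate single-point TOF readings into point cloud data."""
        rx, ry, rtheta = 0.0, 0.0, 0.0           # S1: basic coordinates = robot centre at start of work
        tx, ty, ttheta = tof_pose                # preset pose of the TOF module in the robot frame
        cloud = []
        for _ in range(steps):
            dist, dtheta = imu.read()            # hypothetical: distance travelled and yaw change this step
            rtheta += dtheta                     # S2: dead-reckon the current robot coordinates
            rx += dist * math.cos(rtheta)
            ry += dist * math.sin(rtheta)
            d = tof.read_distance()              # TOF data: distance from the TOF module to the obstacle
            # S3/S4: rotate the mounting offset into the world frame and project d along the sensor heading
            ox = math.cos(rtheta) * tx - math.sin(rtheta) * ty + rx + d * math.cos(rtheta + ttheta)
            oy = math.sin(rtheta) * tx + math.cos(rtheta) * ty + ry + d * math.sin(rtheta + ttheta)
            cloud.append((ox, oy))
        return cloud

Each pass of the loop produces one point, so the density of the resulting point cloud depends on how often the robot samples while moving.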
As one embodiment, the basic coordinates are based on the center position of the mobile robot at the start of operation. Setting the basic coordinates when the robot starts working improves the accuracy of the data it acquires.
In one embodiment, the robot coordinates are the coordinates of the center position of the mobile robot; the mobile robot establishes a world coordinate system with the basic coordinates as the origin, and the current robot coordinates are calculated from the IMU data acquired during movement.
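A minimal sketch of this dead-reckoning update, assuming (as the embodiment of fig. 2 below suggests) a yaw gyroscope plus two wheel encoders; the encoder resolution and the function signature are assumptions and not part of the disclosure:

    import math

    TICKS_PER_METER = 2000.0   # hypothetical encoder resolution, not specified in the disclosure

    def propagate_pose(rx, ry, rtheta, left_ticks, right_ticks, gyro_yaw_rate, dt):
        """Update the robot coordinates (rx, ry, rtheta) in the world frame from one sample of
        wheel-encoder counts and the yaw rate about the axis perpendicular to the ground."""
        dist = 0.5 * (left_ticks + right_ticks) / TICKS_PER_METER   # average travelled distance
        rtheta += gyro_yaw_rate * dt                                # integrate the yaw rate
        rx += dist * math.cos(rtheta)                               # advance along the current heading
        ry += dist * math.sin(rtheta)
        return rx, ry, rtheta

Called once per control tick, this keeps (rx, ry, rθ) expressed in the world frame whose origin is the basic coordinates.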
As one embodiment, the TOF coordinates are the coordinates of the TOF module. The mobile robot acquires the position of the TOF module relative to the robot center in advance, then establishes a coordinate system with the robot coordinates as the origin and the direction directly in front of the robot as the x-axis, and the TOF coordinates are obtained from the robot coordinates and this position information.
As one embodiment, the TOF data is the measured distance between the TOF module and an obstacle, and the mobile robot obtains the coordinates of the obstacle from the robot coordinates, the TOF coordinates and the TOF data through a calculation formula; the obstacle coordinates serve as the point cloud data obtained by the mobile robot. Determining the coordinates of the TOF module from its preset position on the robot and the data acquired during movement gives high accuracy and high practicability. The calculation formula is: ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ + tθ); oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ + tθ); where the robot coordinates are (rx, ry, rθ), the TOF coordinates are (tx, ty, tθ), the TOF data is d, and the obstacle coordinates are (ox, oy).
A mobile robot comprises a main body provided with a TOF module and an IMU module; the TOF module is used to detect the distance between the TOF module and an obstacle, and the IMU module is used to acquire IMU data. The robot acquires data through the TOF module and the IMU module, with a simple structure and low production cost. The IMU module includes a six-axis gyroscope and a code wheel.
As can be seen from fig. 2, when the mobile robot starts working it takes its starting position as the basic coordinates, establishes a world coordinate system with the basic coordinates as the origin, and then starts to move. During movement it acquires IMU data through the IMU module; by collecting the IMU data from time t1 to time t2, the relative motion pose of the robot between t1 and t2 can be estimated, so the robot coordinates at time t2 can be calculated from the robot coordinates at time t1. For a robot moving in a plane, the IMU data only needs to include the angular velocity about the axis perpendicular to the ground, and the counts of the two wheel encoders are enough to estimate the robot's pose (this is a publicly known technique commonly used in inertial-navigation robots). Once the mobile robot knows its current robot coordinates, it establishes a robot coordinate system with the current robot coordinates as the origin and the direction directly in front of the robot as the x-axis; the y-axis can be set according to the actual situation, and in the figure the y-axis points to the left side of the mobile robot. The TOF coordinates of the TOF module relative to the robot coordinate system are obtained from the position parameters set when the TOF module was installed; for example, in the figure the robot coordinates are (1.0, 1.0, 1.5707) and the TOF coordinates are (0.035, -0.165, -1.5707). The coordinates of the obstacle are then obtained from the distance between the TOF module and the obstacle, the robot coordinates and the TOF coordinates according to the formulas ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ + tθ) and oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ + tθ), where the robot coordinates are (rx, ry, rθ), the TOF coordinates are (tx, ty, tθ), the TOF data is d, and the obstacle coordinates are (ox, oy). Point cloud data refers to scan data recorded in the form of points; in this application a single-point TOF module is used, so what is obtained is the coordinate of a single point on the obstacle, and this coordinate is the point cloud data. The mobile robot can acquire a series of such point cloud data during movement.
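Plugging the example values of fig. 2 into the above formulas, with a hypothetical measured distance d = 0.5 (in the same units as the coordinates, assumed to be metres; the figure does not specify d), gives approximately:

    import math

    rx, ry, rtheta = 1.0, 1.0, 1.5707        # robot coordinates from fig. 2
    tx, ty, ttheta = 0.035, -0.165, -1.5707  # TOF coordinates from fig. 2
    d = 0.5                                   # hypothetical TOF distance, not given in the figure

    ox = math.cos(rtheta) * tx - math.sin(rtheta) * ty + rx + d * math.cos(rtheta + ttheta)
    oy = math.sin(rtheta) * tx + math.cos(rtheta) * ty + ry + d * math.sin(rtheta + ttheta)
    print(round(ox, 3), round(oy, 3))   # prints approximately 1.665 1.035

That is, with the robot facing along the world y-axis and the sensor looking to its right, the detected point lies about 0.665 m to the robot's right, consistent with the 0.165 m lateral mounting offset plus the assumed 0.5 m measured distance.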
A chip has a built-in control program, and the control program is used to control a robot to execute the above point cloud data acquisition method based on single-point TOF. A robot can use the method by being fitted with the chip, which is highly practical.
A mobile robot is equipped with a main control chip, and the main control chip is the above chip. The robot acquires point cloud data through the method, which reduces the production cost of the robot.
Referring to fig. 3, the point cloud data acquisition structure of the robot comprises a main body 1 and a controller. A single-point TOF module 2 is arranged on the front left side or the front right side of the main body, is electrically connected with the controller, and has a detection direction parallel to the wheel axis of the robot. The detection direction of the TOF module 2 is therefore perpendicular to the wall surface, so the robot can conveniently acquire the distance between itself and the wall and can directly use the acquired data to correct its pose during edgewise movement without complex calculation.
As an example, the TOF module 2 comprises a TOF sensor of model VL6180, which is low in cost and highly practical. The TOF sensor includes a transmitter and a receiver arranged in a horizontal array. The distance between the centerline of the TOF module 2 and the centerline of the main body 1 is 30 mm to 40 mm, the centerline of the main body 1 being parallel to the wheel axis of the robot; this spacing allows the machine to follow columns and wall corners more closely during edgewise movement and makes the acquired data more accurate.
As one embodiment, the front end of the main body 1 is provided with a bumper strip 3, the bumper strip 3 is provided with a round hole 4, and the TOF module 2 is arranged on one side of the round hole 4 and detects obstacles through the round hole 4. The main body 1 comprises an IMU module, the IMU module comprises a six-axis gyroscope and a code wheel, and the six-axis gyroscope and the code wheel are each electrically connected with the controller.
In this specification, reference to "one embodiment", "preferably", "an example", "a specific example" or "some examples" means that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention; such references do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. The connections described in this specification have obvious effects and practical effectiveness.
Based on the above structure and principle, those skilled in the art should understand that the present invention is not limited to the above embodiments; modifications and substitutions based on technology known in the field fall within the scope of protection of the present invention, which shall be defined by the claims.

Claims (10)

1. A point cloud data acquisition method based on single-point TOF is characterized by comprising the following steps:
s1: the mobile robot sets basic coordinates and obtains IMU data and TOF data;
s2: the mobile robot determines the current robot coordinate based on the basic coordinate and the IMU data;
s3: the mobile robot acquires TOF coordinates based on current robot coordinates;
s4: the mobile robot determines point cloud data based on current robot coordinates, TOF coordinates and TOF data.
2. The point cloud data acquisition method based on single-point TOF according to claim 1, wherein in step S1: the basic coordinates are based on the center position of the mobile robot at the start of operation.
3. The point cloud data acquisition method based on single-point TOF according to claim 1, wherein in step S2: the robot coordinates are coordinates of the center position of the mobile robot, the mobile robot establishes a world coordinate system by taking the basic coordinates as an origin, and the current robot coordinates are calculated according to IMU data acquired in the moving process.
4. The point cloud data acquisition method based on single-point TOF according to claim 1, wherein in step S3: the TOF coordinates are the coordinates of the TOF module, the mobile robot acquires position information of the TOF module relative to the center of the robot in advance, then the mobile robot establishes a coordinate system with the robot coordinates as the origin and the direction directly in front of the robot as the x-axis, and the TOF coordinates are obtained from the robot coordinates and the position information.
5. The point cloud data acquisition method based on single-point TOF according to claim 1, wherein in step S4: the TOF data is the measured distance between the TOF module and the obstacle, the mobile robot obtains the coordinates of the obstacle according to the coordinates of the robot, the TOF coordinates and the TOF data through a calculation formula, and the coordinates of the obstacle are used as point cloud data obtained by the mobile robot.
6. The single-point TOF-based point cloud data acquisition method according to claim 5, wherein the calculation formula is:
ox = cos(rθ)*tx - sin(rθ)*ty + rx + d*cos(rθ + tθ);
oy = sin(rθ)*tx + cos(rθ)*ty + ry + d*sin(rθ + tθ);
wherein the robot coordinates are (rx, ry, rθ), the TOF coordinates are (tx, ty, tθ), the TOF data is d, and the obstacle coordinates are (ox, oy).
7. A mobile robot for executing the point cloud data acquisition method based on single-point TOF according to any one of claims 1 to 6, comprising a main body, wherein the main body is provided with a TOF module and an IMU module, the TOF module is used for detecting the distance between the TOF module and an obstacle, and the IMU module is used for acquiring IMU data.
8. The mobile robot of claim 7, wherein the IMU module includes a six-axis gyroscope and a code wheel.
9. A chip with a built-in control program, wherein the control program is configured to control a robot to execute the point cloud data acquisition method based on the single point TOF according to any one of claims 1 to 6.
10. A mobile robot equipped with a master control chip, characterized in that the master control chip is the chip of claim 9.
CN202011559909.2A 2020-12-25 2020-12-25 Point cloud data acquisition method based on single-point TOF, chip and mobile robot Pending CN112747746A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011559909.2A CN112747746A (en) 2020-12-25 2020-12-25 Point cloud data acquisition method based on single-point TOF, chip and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011559909.2A CN112747746A (en) 2020-12-25 2020-12-25 Point cloud data acquisition method based on single-point TOF, chip and mobile robot

Publications (1)

Publication Number Publication Date
CN112747746A (en) 2021-05-04

Family

ID=75647605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011559909.2A Pending CN112747746A (en) 2020-12-25 2020-12-25 Point cloud data acquisition method based on single-point TOF, chip and mobile robot

Country Status (1)

Country Link
CN (1) CN112747746A (en)


Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103968828A (en) * 2014-04-11 2014-08-06 首都师范大学 Mobile measurement method in closed environment
CN104964656A (en) * 2015-06-26 2015-10-07 天津大学 Self-positioning flowing-type rapid scanning measuring device and method based on inertial navigation
CN106056664A (en) * 2016-05-23 2016-10-26 武汉盈力科技有限公司 Real-time three-dimensional scene reconstruction system and method based on inertia and depth vision
CN106123802A (en) * 2016-06-13 2016-11-16 天津大学 A kind of autonomous flow-type 3 D measuring method
CN107655461A (en) * 2014-05-05 2018-02-02 赫克斯冈技术中心 Measure subsystem and measuring system
CN108051837A (en) * 2017-11-30 2018-05-18 武汉大学 Multiple-sensor integration indoor and outdoor mobile mapping device and automatic three-dimensional modeling method
CN108256430A (en) * 2017-12-20 2018-07-06 北京理工大学 Obstacle information acquisition methods, device and robot
CN108406731A (en) * 2018-06-06 2018-08-17 珠海市微半导体有限公司 A kind of positioning device, method and robot based on deep vision
CN208323361U (en) * 2018-06-06 2019-01-04 珠海市一微半导体有限公司 A kind of positioning device and robot based on deep vision
CN208638479U (en) * 2018-07-27 2019-03-22 顺丰科技有限公司 Panoramic picture acquisition device and mobile robot
CN109882244A (en) * 2019-03-29 2019-06-14 安徽延达智能科技有限公司 Intelligent map building system of underground inspection robot
CN109900266A (en) * 2019-03-27 2019-06-18 小驴机器人(武汉)有限公司 A kind of quick identification positioning method and system based on RGB-D and inertial navigation
CN109975817A (en) * 2019-04-12 2019-07-05 南京工程学院 A kind of Intelligent Mobile Robot positioning navigation method and system
CN110196047A (en) * 2019-06-20 2019-09-03 东北大学 Robot autonomous localization method of closing a position based on TOF depth camera and IMU
CN110211228A (en) * 2019-04-30 2019-09-06 北京云迹科技有限公司 For building the data processing method and device of figure
CN110216678A (en) * 2019-06-25 2019-09-10 韦云智 A kind of method of the indoor positioning navigation of robot
CN111123925A (en) * 2019-12-19 2020-05-08 天津联汇智造科技有限公司 Mobile robot navigation system and method
CN111156998A (en) * 2019-12-26 2020-05-15 华南理工大学 Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN111366139A (en) * 2020-04-03 2020-07-03 深圳市赛为智能股份有限公司 Indoor mapping point positioning method and device, computer equipment and storage medium
CN111547085A (en) * 2020-04-22 2020-08-18 中国铁路设计集团有限公司 Self-moving type rail transit three-dimensional scanning system
CN111595332A (en) * 2020-04-13 2020-08-28 宁波深寻信息科技有限公司 Full-environment positioning method integrating inertial technology and visual modeling
CN111624997A (en) * 2020-05-12 2020-09-04 珠海市一微半导体有限公司 Robot control method and system based on TOF camera module and robot


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113900435A (en) * 2021-08-31 2022-01-07 深圳蓝因机器人科技有限公司 Mobile robot obstacle avoidance method, equipment, medium and product based on double cameras
CN113900435B (en) * 2021-08-31 2022-09-27 深圳蓝因机器人科技有限公司 Mobile robot obstacle avoidance method, equipment, medium and product based on double cameras

Similar Documents

Publication Publication Date Title
EP3603372B1 (en) Moving robot, method for controlling the same, and terminal
WO2020253316A1 (en) Navigation and following system for mobile robot, and navigation and following control method
CN105425803B (en) Autonomous obstacle avoidance method, device and system
WO2017149813A1 (en) Sensor calibration system
US11279045B2 (en) Robot pose estimation method and apparatus and robot using the same
CN103270543B (en) Driving assist device
EP3343173B1 (en) Vehicle position estimation device, vehicle position estimation method
CN112189225B (en) Lane line information detection apparatus, method, and computer-readable recording medium storing computer program programmed to execute the method
CN110672093B (en) Vehicle navigation positioning method based on UWB and inertial navigation fusion
CN103592944A (en) Supermarket shopping robot and advancing path planning method thereof
CN111624995B (en) High-precision navigation and positioning method for mobile robot
CN112698654A (en) Single-point TOF-based mapping and positioning method, chip and mobile robot
CN105573310B (en) Coal mine roadway robot positioning and environment modeling method
KR20200015880A (en) Station apparatus and moving robot system
CN110837257B (en) AGV composite positioning navigation system based on iGPS and vision
CN110658828A (en) Autonomous landform detection method and unmanned aerial vehicle
CN112711257A (en) Robot edge method based on single-point TOF, chip and mobile robot
CN108536146B (en) Intelligent control method for positioning charging base of mobile robot based on path and RSSI (received Signal Strength indicator)
WO2020167299A1 (en) Printing systems
CN112747746A (en) Point cloud data acquisition method based on single-point TOF, chip and mobile robot
CN206959854U (en) A kind of dolly based on inertia measurement and laser radar indoor navigation
CN114714357A (en) Sorting and carrying method, sorting and carrying robot and storage medium
CN114137975A (en) Unmanned vehicle navigation deviation rectifying method based on ultrasonic-assisted fusion positioning
CN213748478U (en) Point cloud data acquisition structure of robot and robot
Xu et al. A new positioning method for indoor laser navigation on under-determined condition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.