CN108759822B - Mobile robot 3D positioning system - Google Patents

Mobile robot 3D positioning system

Info

Publication number
CN108759822B
CN108759822B (application CN201810326546.4A)
Authority
CN
China
Prior art keywords
mobile robot
positioning
coordinate system
algorithm
mapping unit
Prior art date
Legal status
Active
Application number
CN201810326546.4A
Other languages
Chinese (zh)
Other versions
CN108759822A (en)
Inventor
平雪良
高文研
刘潇潇
王昕煜
蒋毅
Current Assignee
Jiangnan University
Original Assignee
Jiangnan University
Priority date
Filing date
Publication date
Application filed by Jiangnan University filed Critical Jiangnan University
Priority to CN201810326546.4A priority Critical patent/CN108759822B/en
Publication of CN108759822A publication Critical patent/CN108759822A/en
Application granted granted Critical
Publication of CN108759822B publication Critical patent/CN108759822B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C22/00 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a mobile robot 3D positioning system, which comprises a synchronous positioning and mapping unit and a navigation unit that can be switched between; the synchronous positioning and mapping unit supplies a 2D pose to the navigation unit, and the navigation unit supplies an angle and an initial pose to the synchronous positioning and mapping unit. The navigation unit comprises an algorithm that performs information fusion; the information comprises the inputs of various sensors and the algorithm results of the synchronous positioning and mapping unit. The fusion of algorithm results is realized by describing the state of the mobile robot in space with a 15-dimensional vector X: X = [x, y, z, θr, θp, θy, vx, vy, vz, ωr, ωp, ωy, ax, ay, az]T. When performing information fusion, the system can flexibly configure the sensors and the algorithms employed, can perform 3D pose estimation, and can adapt to a three-dimensional environment.

Description

Mobile robot 3D positioning system
Technical Field
The invention relates to the technical field of robot calibration, in particular to a mobile robot 3D positioning system.
Background
Mobile robot technology is currently a research hotspot in the robotics industry, and as the cost of mobile robots falls, their applications are bound to grow explosively. Because application scenarios and fields differ, the hardware, drive schemes and control systems of different mobile robots vary greatly, which limits algorithm universality and code reusability.
Positioning is a basic element of navigation, and real-time, accurate positioning is key to improving a mobile robot's navigation performance. Many kinds of sensors can currently be mounted on mobile robot equipment; widely used ones include odometers, inertial navigation modules, GPS, laser radar, the Kinect, routers and the like. Their data and methods of use differ, as does the motion-state information they can provide. Moreover, because the accuracy and reliability of information from a single sensor are limited, multi-sensor information fusion is a major trend in mobile robot positioning.
The motion information of a mobile robot can be measured by motion-state sensors such as gyroscopes, or estimated from the visual odometry obtained by processing data from sensors such as the Kinect; similarly, laser radar can provide a global pose estimate of the mobile robot comparable to GPS, as well as odometry. In addition, the algorithms adopted on different platforms have their own advantages, disadvantages and applicable environments, and multi-machine distributed control and distributed computation are now increasingly widespread. Beyond fusing information from various sensors, if the results of different algorithms can also be fused for mobile robot positioning, the effect is clearly superior.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. In this section, as well as in the abstract and the title of the invention of this application, simplifications or omissions may be made to avoid obscuring the purpose of the section, the abstract and the title, and such simplifications or omissions are not intended to limit the scope of the invention.
The present invention has been made in view of the above and/or other drawbacks of the prior art.
It is therefore an object of the present invention, among others, to provide a mobile robot 3D positioning system: a positioning-system framework capable of fusing not only data from various sensors but also the results of different algorithms.
In order to solve the above technical problems, the invention provides the following technical scheme: a mobile robot 3D positioning system comprises a synchronous positioning and mapping unit and a navigation unit that can be switched between; the synchronous positioning and mapping unit supplies a 2D pose to the navigation unit, and the navigation unit supplies an angle and an initial pose to the synchronous positioning and mapping unit; the navigation unit comprises an algorithm that performs information fusion, the information comprising the inputs of various sensors and the algorithm results of the synchronous positioning and mapping unit; the fusion of algorithm results describes the state of the mobile robot in space by a 15-dimensional vector X:
X = [x, y, z, θr, θp, θy, vx, vy, vz, ωr, ωp, ωy, ax, ay, az]T
where x, y, z denote the three-dimensional position; θr, θp, θy denote the orientation angles (roll, pitch, yaw); vx, vy, vz denote the linear velocities along each axis; ωr, ωp, ωy denote the angular velocities about each axis; and ax, ay, az denote the accelerations along each axis.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the synchronous positioning and mapping unit comprises the positioning and the coordinate setting of the mobile robot; the positioning accurately estimates the current position of the mobile robot through global positioning, predicts the motion state of the mobile robot through relative positioning, and continuously corrects it; the coordinate systems comprise a world coordinate system, an odometer coordinate system, a mobile robot center coordinate system, the coordinate systems of the four drive wheels, and the coordinate systems of the individual sensors, the odometer data being calculated from encoders mounted on the four drive wheels so as to determine the odometer frame relative to the world coordinate system.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the mobile robot takes the mobile robot center coordinate system as its reference frame, and its pose in the world coordinate system is related to the current angles θr, θp, θy of the current plane; s denotes the function sin, c denotes the function cos, and r, p and y denote the rotation angles of the mobile robot center coordinate system about the x, y and z axes relative to the world coordinate system; if all members of the 15-dimensional vector X are obtained through the sensor configuration and the associated algorithms, the motion of the mobile robot along each direction of its center coordinate system is projected into the world coordinate system, and when the sampling period is Δt, the fused prediction can be obtained.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the model of the system is determined by arranging the fused prediction formulas into matrix form.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: partial derivatives of the system model are taken to obtain the Jacobian matrix required for the algorithm's iteration [equation image not reproduced], yielding the fusion process of the sensor and algorithm results in the fifteen-dimensional space.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the coordinate systems of the four driving wheels comprise a rear right wheel link, a rear left wheel link, a front right wheel link and a front left wheel link.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the synchronous positioning and mapping unit further comprises a data processing module, a scanning matching module and a mapping module.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the mobile robot adopts multi-machine distributed control and distributed calculation.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the distributed computation comprises a first processor, a second processor, a sensor, a power supply system and a drive executive component; the power supply system powers the drive executive component, the second processor and the sensor; the second processor performs hardware driving, information acquisition and data conversion and then passes the data to the drive executive component and the sensor; the sensor performs laser ranging, attitude measurement and speed detection; and the first processor performs human-machine interaction, motion control, data processing and planning decisions.
As a preferred embodiment of the mobile robot 3D positioning system of the present invention, wherein: the distributed control comprises a microcontroller, an embedded platform and PC decision calculation.
The invention has the following beneficial effects: when performing information fusion, the system can flexibly configure the sensors and the algorithms employed, can perform 3D pose estimation, and can adapt to a three-dimensional environment.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort. Wherein:
FIG. 1 is a schematic diagram of an overall structure of a positioning system framework in an embodiment of a 3D positioning system of a mobile robot according to the present invention;
FIG. 2 is a block diagram of a positioning classification of a mobile robot in an embodiment of the 3D positioning system of the mobile robot of the present invention;
FIG. 3 is a block diagram of the overall architecture of the distributed computation in one embodiment of the mobile robotic 3D positioning system of the present invention;
fig. 4 is a framework diagram of the overall structure of the distributed control in one embodiment of the mobile robot 3D positioning system of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. However, the present invention may be practiced in ways other than those specifically described here without departing from its spirit, as will be readily apparent to those of ordinary skill in the art; the invention is therefore not limited to the specific embodiments disclosed below.
Furthermore, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
Referring to fig. 1, the main body of this embodiment comprises a synchronous positioning and mapping unit 100 and a navigation unit 200 that can be switched between; the synchronous positioning and mapping unit 100 supplies a 2D pose to the navigation unit 200, and the navigation unit 200 supplies an angle and an initial pose to the synchronous positioning and mapping unit 100.
The navigation unit 200 comprises an algorithm 201 and 3D pose information 202, the algorithm 201 performs information fusion, the information comprises input of various sensors and fusion of algorithm results of the synchronous positioning mapping unit 100, and the fused results are fed back to the 3D pose information 202.
It should be noted that, in the algorithm result fusion described here, the state of the mobile robot in space is described by using a 15-dimensional vector X:
X = [x, y, z, θr, θp, θy, vx, vy, vz, ωr, ωp, ωy, ax, ay, az]T
where x, y, z denote the three-dimensional position; θr, θp, θy denote the orientation angles (roll, pitch, yaw); vx, vy, vz denote the linear velocities along each axis; ωr, ωp, ωy denote the angular velocities about each axis; and ax, ay, az denote the accelerations along each axis. Although mobile robots carry many types of sensors, and different sensors and algorithms describe the robot's state differently, the pose-estimation and navigation information they provide all lies within this vector.
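To make the layout concrete, the following is a minimal sketch of how such a 15-dimensional state vector might be indexed in code; the slice names and the example assignment are illustrative conventions, not taken from the patent.

```python
import numpy as np

# Illustrative layout of the 15-dimensional state vector X described above.
# 0-2:   position x, y, z
# 3-5:   orientation angles theta_r, theta_p, theta_y (roll, pitch, yaw)
# 6-8:   linear velocities v_x, v_y, v_z
# 9-11:  angular velocities w_r, w_p, w_y
# 12-14: accelerations a_x, a_y, a_z
POS = slice(0, 3)
ANG = slice(3, 6)
VEL = slice(6, 9)
OMEGA = slice(9, 12)
ACC = slice(12, 15)

X = np.zeros(15)

# Any sensor or algorithm result that observes part of this state (a wheel
# odometer, an IMU, a GPS fix, a SLAM pose) can be fused by mapping its
# measurement onto the corresponding entries of X.
X[VEL] = np.array([0.2, 0.0, 0.0])  # e.g. forward speed from the odometer
```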
Preferably, the synchronous positioning and mapping unit 100 comprises the positioning and the coordinate setting of the mobile robot. For the positioning, pose-calculation methods for mobile robots can generally be divided into two categories; referring to fig. 2, in the first category the initial position of the mobile robot is known, and its relative position is estimated by inertial navigation, ranging and similar methods; this is called relative positioning. Such algorithms have an obvious drawback: because errors accumulate, accuracy degrades over time until the estimate may no longer be usable. However, if the real-time position of the mobile robot can be known, these methods estimate its motion state accurately and thus achieve a good positioning effect. Global positioning was developed to solve the problem of determining the robot's real-time position: when the initial position of the mobile robot is unknown, global positioning locates the robot from the data of external sensors. Of course, every positioning method has advantages, disadvantages and suitable occasions; although global positioning solves some of the problems of relative positioning, its limited computational efficiency and susceptibility to data jumps give it its own range of application and limitations.
Combined positioning is a method that combines the two approaches: the current position of the mobile robot is accurately estimated through global positioning, while its motion state is predicted through relative positioning and continuously corrected, which simplifies the motion model of the mobile robot and achieves an optimal estimate.
Coordinate transformation is an important module when a mobile robot performs positioning. In this embodiment the mobile robot uses four-wheel differential drive, and its coordinate setting comprises a world coordinate system (map), an odometer coordinate system (odom), a mobile robot center coordinate system (base_link), the coordinate systems of the four driving wheels, and the coordinate systems of the individual sensors, for example the laser radar (laser) and the nine-axis IMU module (imu).
It should be noted that the coordinate system of the four driving wheels includes a rear right wheel link, a rear left wheel link, a front right wheel link, and a front left wheel link.
It should be noted that the odometer coordinate system calculates odometer data from the encoders mounted on the four drive wheels, thereby determining its relationship relative to the world coordinate system. Because of wheel slip, the odometer information often differs considerably from the robot's actual position, and this difference is estimated and maintained by the transform from the odometer coordinate system to the vehicle-body center coordinate system. Estimating this transform from global-sensor information and internal-sensor inertial navigation is called dead reckoning; refer to fig. 3.
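As an illustration of the dead-reckoning step just described, here is a minimal sketch of planar odometry integration from incremental encoder distances, assuming a differential-drive approximation of the four-wheel platform; the function and parameter names are illustrative.

```python
import math

def integrate_odometry(x, y, yaw, d_left, d_right, track_width):
    """Propagate the planar odometry pose from incremental encoder distances.

    d_left and d_right are the distances travelled by the left and right
    wheel pairs since the last sample; track_width is their lateral spacing.
    """
    d_center = (d_left + d_right) / 2.0        # forward travel of the body
    d_yaw = (d_right - d_left) / track_width   # heading change over the step
    # Project the body-frame travel into the odometry frame, evaluating the
    # heading at the midpoint of the step for second-order accuracy.
    x += d_center * math.cos(yaw + d_yaw / 2.0)
    y += d_center * math.sin(yaw + d_yaw / 2.0)
    yaw += d_yaw
    return x, y, yaw

# Example: left wheels travel 9.8 mm, right wheels 10.2 mm, 0.4 m track width.
print(integrate_odometry(0.0, 0.0, 0.0, 0.0098, 0.0102, 0.4))
```

Because of the slip noted above, the pose accumulated this way drifts, which is exactly why it is continuously corrected against global measurements in the fusion below.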
In this embodiment, the mobile robot is a typical nonlinear system, and its position and orientation are estimated by an algorithm; preferably, the EKF algorithm is used. The EKF (Extended Kalman Filter) is an efficient recursive filter able to estimate the state of a dynamic system from a series of incomplete and noisy measurements. Its basic idea is to linearize the nonlinear system and then apply Kalman filtering, so the EKF is a suboptimal filter. Various second-order generalized Kalman filtering methods further improve the estimation performance for nonlinear systems: by keeping the quadratic term of the Taylor expansion they reduce the linearization error, but the computational load grows greatly, so in practice these higher-order variants are less widely applied than the first-order EKF.
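For reference, the first-order EKF cycle referred to here has the standard textbook form below; the patent does not reproduce these equations, so this is background, with f the process model, h the measurement model, and F_k, H_k their Jacobians.

```latex
\begin{aligned}
\hat{x}_{k|k-1} &= f(\hat{x}_{k-1|k-1}, u_k)
  &\quad& \text{state prediction} \\
P_{k|k-1} &= F_k P_{k-1|k-1} F_k^{\top} + Q_k
  &\quad& \text{covariance prediction} \\
K_k &= P_{k|k-1} H_k^{\top} \bigl( H_k P_{k|k-1} H_k^{\top} + R_k \bigr)^{-1}
  &\quad& \text{Kalman gain} \\
\hat{x}_{k|k} &= \hat{x}_{k|k-1} + K_k \bigl( z_k - h(\hat{x}_{k|k-1}) \bigr)
  &\quad& \text{state update} \\
P_{k|k} &= (I - K_k H_k) P_{k|k-1}
  &\quad& \text{covariance update}
\end{aligned}
```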
In this embodiment, since the information fusion of the mobile robot is performed on the 15-dimensional vector X, the effects of the robot's roll and pitch angles relative to the horizontal plane on its motion must be considered. The mobile robot takes the mobile robot center coordinate system as its reference frame, and its pose in the world coordinate system is related to the current angles θr, θp, θy of the current plane; s denotes the function sin, c denotes the function cos, and r, p and y denote the rotation angles of the mobile robot center coordinate system about the x, y and z axes relative to the world coordinate system. If all members of the 15-dimensional vector X are obtained through the sensor configuration and the associated algorithms, the motion of the mobile robot along each direction of its center coordinate system is projected into the world coordinate system, and when the sampling period is Δt, the fused prediction can be obtained:
[Equation images: the fused prediction formulas for the three position components; not reproduced in this text.]
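The three formulas themselves appear only as images in the source. Under the stated conventions (s = sin, c = cos; r, p, y rotations about x, y, z), the standard projection of body-frame velocity into the world frame over a sampling period Δt would take the following form; this is a reconstruction under that assumption, not a verbatim copy of the patent's images (acceleration terms omitted):

```latex
\begin{aligned}
x_{k+1} &= x_k + \Delta t \,\bigl[ c_y c_p\, v_x + (c_y s_p s_r - s_y c_r)\, v_y + (c_y s_p c_r + s_y s_r)\, v_z \bigr] \\
y_{k+1} &= y_k + \Delta t \,\bigl[ s_y c_p\, v_x + (s_y s_p s_r + c_y c_r)\, v_y + (s_y s_p c_r - c_y s_r)\, v_z \bigr] \\
z_{k+1} &= z_k + \Delta t \,\bigl[ -s_p\, v_x + c_p s_r\, v_y + c_p c_r\, v_z \bigr]
\end{aligned}
```

The bracketed coefficients are simply the rows of the ZYX rotation matrix from the robot center frame to the world frame.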
Arranging these three formulas into matrix form gives the model of the system:
[Equation images: the system model in matrix form and the definition of its terms; not reproduced in this text.]
partial derivative is calculated for the model of the system to obtain a Jacobian matrix required by EKF algorithm iteration
Figure BDA0001626738940000073
The 3D pose fusion process of the mobile robot is as follows:
Figure BDA0001626738940000081
so far, the fusion of the sensor and the algorithm result on the fifteen-dimensional space is completed.
Preferably, the synchronous positioning and mapping unit 100 further comprises a data processing module 101, a scan matching module 102 and a mapping module 103. Because existing mobile robot platforms are numerous and their sensor configurations differ greatly, dynamic configuration of sensor inputs is a basic requirement for algorithm universality; and since environment-map construction depends on the laser radar, a data processing module for the laser radar is likewise indispensable. Referring to fig. 1, the laser radar first passes its signal to the data processing module 101; after processing by the data processing module 101, the data go to the scan matching module 102 and then to the mapping module 103, with the mapping module 103 and the scan matching module 102 exchanging data bidirectionally.
The navigation unit 200 uses the EKF algorithm to fuse the 2D pose and the various sensor inputs to obtain a 3D pose estimate of the mobile robot. Clearly, any algorithm capable of providing a 3D pose estimate can be connected to the platform to participate in data fusion, without restriction on the algorithm type, and any sensor capable of providing motion information can likewise participate in data fusion.
Because of the functional limitations of the laser radar in this embodiment, only 2D pose estimation can be performed; the 2D pose is calculated in the 2D synchronous positioning and mapping unit 100, and any algorithm that can provide pose-estimation information and perform mapping may be adopted. For a three-dimensional laser radar, the 2D pose-estimation algorithm can be replaced by a corresponding three-dimensional algorithm.
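To illustrate how the 2D pose from the SLAM unit enters the 15-dimensional fusion as a partial measurement, here is a minimal sketch of one EKF correction step; the measurement matrix simply selects x, y and yaw from the state, and all names are illustrative.

```python
import numpy as np

# A 2D SLAM pose observes x (state index 0), y (index 1) and yaw (index 5)
# of the 15-dimensional state, so its measurement matrix is a 3x15 selector.
H = np.zeros((3, 15))
H[0, 0] = 1.0  # x
H[1, 1] = 1.0  # y
H[2, 5] = 1.0  # yaw (theta_y)

def correct_with_2d_pose(x_pred, P_pred, z_2d, R_2d):
    """One EKF correction using a 2D SLAM pose measurement z_2d = [x, y, yaw]."""
    innovation = z_2d - H @ x_pred
    S = H @ P_pred @ H.T + R_2d              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(15) - K @ H) @ P_pred
    return x_new, P_new

# Example: correct a predicted state with a SLAM pose of (1.0 m, 0.5 m, 0.1 rad).
x0, P0 = np.zeros(15), np.eye(15) * 0.1
x1, P1 = correct_with_2d_pose(x0, P0, np.array([1.0, 0.5, 0.1]), np.eye(3) * 0.01)
```

An algorithm that instead provides a full 3D pose would use a 6x15 selector over x, y, z and the three angles in the same way, which is what makes the framework agnostic to the algorithm type.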
Preferably, the information fused by the algorithm 201 also includes IMU, GPS, odometer and other sensors and algorithms.
Preferably, the mobile robot adopts multi-machine distributed control and distributed computation. Referring to fig. 3, the distributed computation comprises a first processor 301, a second processor 302, a sensor 303, a power supply system 304 and a drive executive component 305. The power supply system 304 powers the drive executive component 305, the second processor 302 and the sensor 303; the second processor 302 performs hardware driving, information acquisition and data conversion and then passes the data to the drive executive component 305, the sensor 303 and the first processor 301; the sensor 303 performs laser ranging, attitude measurement and speed detection; the first processor 301 performs human-machine interaction, motion control, data processing and planning decisions; and the drive executive component 305 internally comprises a motor controller, a motor driver and a drive motor.
Preferably, referring to fig. 4, the distributed control comprises a microcontroller, an embedded platform and a PC for decision computation. The microcontroller performs servo closed-loop control of the motors according to instructions from the host computer, samples the encoders at a fixed frequency, and feeds the information back to the host. The embedded platform sends instructions to the microcontroller according to the driving information from the PC, collects the encoder values fed back by the microcontroller at a fixed frequency and forwards them to the PC, communicates with the IMU module over the bus to acquire the corresponding information and send it to the PC, and also sends sensor data to the PC through the serial port. The PC mainly comprises basic modules such as robot kinematics control, inverse kinematics and attitude calculation; complex algorithm modules such as filtering, map construction and navigation; and the human-machine interaction interface.
The PC decision computation is externally connected to a WiFi module, an RJ-45 interface and a USB 2.0 interface. The embedded platform is externally connected to a WiFi module, an RJ-45 interface, a USB 2.0 port, an RS232 interface and an I2C bus, where the I2C bus is connected to the accelerometer, the gyroscope and the electronic compass. The microcontroller is externally provided with an RS232 interface, encoder inputs, PWM output IO and general-purpose IO; the microcontroller is also connected to the rotary encoders and the motor drivers, where each motor driver, motor and rotary encoder are connected in series. The PC decision computation is connected with the embedded platform, and the microcontroller is connected with the embedded platform, through an RS232 interface or a TTL interface, or through the WiFi module or the RJ-45 interface.
It should be noted that, in this embodiment, a power supply is further provided, and the power supply supplies power to the embedded platform, the microcontroller, and the motor driver, respectively.
It should be noted that the above-mentioned embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (8)

1. A mobile robot 3D positioning system, characterized in that it comprises:
a synchronized positioning and mapping unit (100) and a navigation unit (200) that can be switched between, the synchronized positioning and mapping unit (100) supplying a 2D pose to the navigation unit (200), and the navigation unit (200) supplying an angle and an initial pose to the synchronized positioning and mapping unit (100);
the navigation unit (200) comprises an algorithm (201), the algorithm (201) performs information fusion, and the information comprises the input of various sensors and the fusion of the algorithm result of the synchronous positioning mapping unit (100);
the fusion of algorithm results describing the state of the mobile robot in space by a 15-dimensional vector X:
X = [x, y, z, θr, θp, θy, vx, vy, vz, ωr, ωp, ωy, ax, ay, az]T
where x, y, z denote the three-dimensional position; θr, θp, θy denote the orientation angles (roll, pitch, yaw); vx, vy, vz denote the linear velocities along each axis; ωr, ωp, ωy denote the angular velocities about each axis; and ax, ay, az denote the accelerations along each axis;
the synchronous positioning mapping unit (100) comprises positioning and coordinate setting of a mobile robot;
the positioning is to accurately estimate the current position of the mobile robot through global positioning, predict the motion state of the mobile robot through a relative positioning mode and continuously correct the motion state;
the coordinate systems comprise a world coordinate system, an odometer coordinate system, a mobile robot center coordinate system, the coordinate systems of four driving wheels and the coordinate systems of the individual sensors; the odometer coordinate system calculates odometer data from encoders mounted on the four driving wheels so as to determine its relative relationship to the world coordinate system;
the mobile robot takes the mobile robot center coordinate system as its reference frame, and its pose in the world coordinate system is related to the current angles θr, θp, θy of the current plane;
s represents a function sin, c represents a function cos, and r, p and y respectively represent the rotation angles of the central coordinate system of the mobile robot around the x, y and z axes relative to the world coordinate system;
if all members of the 15-dimensional vector X are obtained through the sensor configuration and the associated algorithms, the motion of the mobile robot in each direction of the mobile robot center coordinate system is projected into the world coordinate system, and when the sampling period is Δt, the fused prediction can be obtained;
[Equation images: the fused prediction formulas for the three position components; not reproduced in this text.]
2. The mobile robot 3D positioning system of claim 1, wherein: the model of the system is determined by arranging the three formulas into matrix form [equation images of the matrix-form model and the definition of its terms not reproduced].
3. The mobile robot 3D positioning system of claim 2, wherein: partial derivatives of the system model are taken to obtain the Jacobian matrix required for the algorithm's iteration [equation image not reproduced], and the fusion process of the sensor and algorithm results in the fifteen-dimensional space is obtained [equation image not reproduced].
4. the mobile robot 3D positioning system according to any one of claims 1 to 3, wherein: the coordinate system of the four driving wheels comprises a rear right wheel link, a rear left wheel link, a front right wheel link and a front left wheel link.
5. The mobile robot 3D positioning system according to any one of claims 1 to 3, wherein: the synchronized positioning and mapping unit (100) further comprises a data processing module (101), a scan matching module (102) and a mapping module (103).
6. The mobile robotic 3D positioning system of claim 5, wherein: the mobile robot adopts multi-machine distributed control and distributed calculation.
7. The mobile robotic 3D positioning system of claim 6, wherein: the distributed computing comprises a first processor (301), a second processor (302), a sensor (303), a power supply system (304) and a drive executive (305);
the power supply system (304) supplies power to the driving executive component (305), the second processor (302) and the sensor (303); the second processor (302) carries out hardware driving, information acquisition and data conversion and then passes the data to the driving executive component (305) and the sensor (303); the sensor (303) carries out laser ranging, attitude measurement and speed detection; and the first processor (301) carries out human-machine interaction, motion control, data processing and planning decisions.
8. The mobile robotic 3D positioning system according to claim 6 or 7, characterized in that: the distributed control comprises a microcontroller, an embedded platform and PC decision calculation.
CN201810326546.4A 2018-04-12 2018-04-12 Mobile robot 3D positioning system Active CN108759822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810326546.4A CN108759822B (en) 2018-04-12 2018-04-12 Mobile robot 3D positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810326546.4A CN108759822B (en) 2018-04-12 2018-04-12 Mobile robot 3D positioning system

Publications (2)

Publication Number Publication Date
CN108759822A CN108759822A (en) 2018-11-06
CN108759822B true CN108759822B (en) 2021-04-30

Family

ID=63981720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810326546.4A Active CN108759822B (en) 2018-04-12 2018-04-12 Mobile robot 3D positioning system

Country Status (1)

Country Link
CN (1) CN108759822B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110132327B (en) * 2019-06-05 2021-09-17 知恒科技(天津)有限公司 Photoelectric encoder
CN111007522A (en) * 2019-12-16 2020-04-14 深圳市三宝创新智能有限公司 Position determination system of mobile robot
CN111308490B (en) * 2020-02-05 2021-11-19 浙江工业大学 Balance car indoor positioning and navigation system based on single-line laser radar
CN112461227B (en) * 2020-10-22 2023-07-21 新兴际华集团有限公司 Wheel type chassis robot inspection intelligent autonomous navigation method
CN113325837A (en) * 2021-04-23 2021-08-31 北京启安智慧科技有限公司 Control system and method for multi-information fusion acquisition robot

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
CN107167130A (en) * 2017-05-18 2017-09-15 上海谦尊升网络科技有限公司 Map match localization method and system
CN107369167A (en) * 2017-07-20 2017-11-21 江南大学 A kind of robot self-calibrating method based on biplane constraint error model
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9658070B2 (en) * 2014-07-11 2017-05-23 Regents Of The University Of Minnesota Inverse sliding-window filters for vision-aided inertial navigation systems

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
CN107167130A (en) * 2017-05-18 2017-09-15 上海谦尊升网络科技有限公司 Map match localization method and system
CN107369167A (en) * 2017-07-20 2017-11-21 江南大学 A kind of robot self-calibrating method based on biplane constraint error model
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Robot error compensation method based on plane constraints; Qi Fei et al.; 《机械设计》 (Machine Design); 2017-09-30; Vol. 34, No. 9; pp. 23-27 *

Also Published As

Publication number Publication date
CN108759822A (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN108759822B (en) Mobile robot 3D positioning system
CN113945206A (en) Positioning method and device based on multi-sensor fusion
Xiong et al. G-VIDO: A vehicle dynamics and intermittent GNSS-aided visual-inertial state estimator for autonomous driving
JP2021177168A (en) Vehicle dead-reckoning method, apparatus, device, storage medium and program
CN109141410B (en) Multi-sensor fusion positioning method for AGV (automatic guided vehicle) combined navigation
CN110986988B (en) Track calculation method, medium, terminal and device integrating multi-sensor data
CN108549771A (en) A kind of excavator auxiliary construction system and method
CN106969784B (en) A kind of combined error emerging system for concurrently building figure positioning and inertial navigation
Kang et al. Vins-vehicle: A tightly-coupled vehicle dynamics extension to visual-inertial state estimator
CN109387198B (en) Inertia/vision milemeter combined navigation method based on sequential detection
Zheng et al. An optimization-based UWB-IMU fusion framework for UGV
CN107782304A (en) Mobile robot positioning method and device, mobile robot and storage medium
CN114216456A (en) Attitude measurement method based on IMU and robot body parameter fusion
CN114926547A (en) Calibration method of camera and IMU, electronic device and system
CN114475581B (en) Automatic parking positioning method based on wheel speed pulse and IMU Kalman filtering fusion
CN115752507A (en) Online single-steering-wheel AGV parameter calibration method and system based on two-dimensional code navigation
Chen et al. 3D LiDAR-GPS/IMU calibration based on hand-eye calibration model for unmanned vehicle
CN116202509A (en) Passable map generation method for indoor multi-layer building
CN115727843A (en) Wheel speed determination method, device and equipment for dead reckoning
CN112147599A (en) Spline function-based continuous-time external parameter calibration method for 3D laser radar and inertial sensor
Zhang et al. Self-positioning for mobile robot indoor navigation based on wheel odometry, inertia measurement unit and ultra wideband
CN111912403B (en) Forklift positioning method and forklift
Lee Mobile robot localization using optical mice
Su et al. GR-SLAM: Vision-based sensor fusion SLAM for ground robots on complex terrain
CN115993089B (en) PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant