CN112318507A - Robot intelligent control system based on SLAM technology - Google Patents

Robot intelligent control system based on SLAM technology

Info

Publication number
CN112318507A
CN112318507A (application CN202011175325.5A)
Authority
CN
China
Prior art keywords: speed, laser radar, information, radar sensor, minipc
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011175325.5A
Other languages
Chinese (zh)
Inventor
范海廷
杜云刚
苏欣
陈帅
田莎琦
侯培军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inner Mongolia University of Technology
Original Assignee
Inner Mongolia University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inner Mongolia University of Technology filed Critical Inner Mongolia University of Technology
Priority to CN202011175325.5A
Publication of CN112318507A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/022 Optical sensing devices using lasers
    • B25J19/023 Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot intelligent control system based on SLAM technology, comprising: a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor. The ARM embedded main control is interactively connected with the ZIGBEE module and with the MiniPC, and the omnidirectional wheels are respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor. In use, data from the inertial measurement unit, the laser radar sensor, and the camera are fused so that the sensors' strengths complement one another. This improves the mapping and positioning accuracy of the mobile platform, enables obstacle avoidance and local dynamic obstacle avoidance in complex, crowded environments, and enhances the robustness of the robot's mobile chassis in actual operation.

Description

Robot intelligent control system based on SLAM technology
Technical field:
the invention relates to the technical field of robot control, and in particular to a robot intelligent control system based on SLAM technology.
Background art:
a robot is an intelligent machine that can work semi-autonomously or fully autonomously. Robots can assist or even replace human beings in dangerous, heavy, and complex work, improving work efficiency and quality, serving human life, and expanding or extending the range of human activity and capability. As the intelligence of robots continues to improve, they find increasingly important applications in home services, health care and medical treatment, new media and entertainment, and other fields. A mobile robot usually works in a relatively complex indoor environment and needs multiple sensors to perceive its surroundings and complete tasks such as map construction, positioning, and autonomous navigation, i.e., to realize SLAM. Information fusion of multiple sensors has therefore become an important aspect of mobile robot SLAM research, with an important influence on the realization of robot functions and the level of intelligence.
In existing robot control systems, SLAM algorithms applied to a laser radar alone or to vision alone have certain deficiencies: the robot's positioning accuracy is poor, its obstacle avoidance in complex environments and crowded settings is weak, and its robustness is poor, which is unfavorable for controlling and using the robot.
Summary of the invention:
the invention aims to provide a robot intelligent control system based on SLAM technology to solve the problems raised in the background art.
The invention is implemented by the following technical scheme. A robot intelligent control system based on SLAM technology comprises: a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor. The ARM embedded main control is interactively connected with the ZIGBEE module and with the MiniPC; the omnidirectional wheels are respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor; the laser radar sensor and the vision sensor are connected with the MiniPC; the inertial measurement unit is connected with the MiniPC; the orthogonal code disc and encoder are connected with the ARM embedded main control; and the ARM embedded main control is connected with the omnidirectional wheels;
the MiniPC is used for running the Ubuntu operating system and the ROS robot operating system;
the ARM embedded main control is used for transmitting the data returned by the orthogonal code disc, the encoder, and the laser radar sensor to the MiniPC as auxiliary information for constructing coordinate points, achieving closed-loop information processing;
the laser radar sensor, the inertial measurement unit and the visual sensor are used for acquiring environmental information;
the omnidirectional wheels are used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication;
the orthogonal code disc and encoder are used for transmitting encoder values in preparation for coordinate calculation.
As further preferable in the present technical solution: the camera acquires environmental image information for visual SLAM to obtain a positioning result; according to an extended Kalman filtering method, this visual SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU), wherein the inertial motion data from the IMU drive the EKF state prediction and the visual SLAM positioning result serves as the observation value that completes the EKF correction step, thereby obtaining the EKF-fused positioning result.
As further preferable in the present technical solution: the in-plane distance and angle environmental feature data acquired by the laser radar sensor are projected from the polar coordinate system into a planar rectangular coordinate system, and the projected point cloud data are then transformed, using the fused extended Kalman filtering positioning result, into point cloud data in a unified world coordinate system.
As further preferable in the present technical solution: the ARM embedded main control adopts an ARM embedded processor and an Intel embedded processor. The Intel processor runs the Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like services for a heterogeneous computer cluster; ROS provides localization and mapping, motion planning, perception, and simulation, meeting the mobile robot's requirement of building a map. The ARM processor controls the running of the chassis, receiving and processing the chassis's speed, angle, and distance information to compute the most suitable forward speed. For the speed-control task on the ARM: after the laser radar sensor has built the map, world-coordinate distances are converted into real distances and the information is sent to the ARM, on which a UCOS system is established. The current time is obtained from the UCOS system; with the current time, the total distance, the current initial speed, the set uniform speed, and the set final speed as parameters, a formula gives the acceleration segment length, the uniform-speed segment length, and the deceleration segment length, and the position to move to is calculated. The running speed for the current period is passed through a proportional coefficient, an integral coefficient, and a differential coefficient to work out the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motors. Because external disturbances, and deviations caused by differences among the rotating speeds of the four motors, prevent the motor speed from reaching the target speed, two parameters are maintained: a current value and an expected value. The current value is obtained by converting the readings of the orthogonal code disc and encoder, and the expected value is the calculated total control speed.
As further preferable in the present technical solution: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the environmental data coordinates obtained by the laser radar sensor can be transformed into the world coordinate system using the visual SLAM positioning result fused with the IMU data by extended Kalman filtering, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
As further preferable in the present technical solution: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the omnidirectional wheels and drive the mobile robot to the target location.
As further preferable in the present technical solution: the ZIGBEE module is used for bidirectional wireless data transmission to realize remote communication.
As further preferable in the present technical solution: the camera is a vision camera and is used for acquiring environmental image information.
The invention has the following advantages: in use, addressing the existing deficiencies of SLAM algorithms applied to the laser radar sensor or the vision sensor alone, the invention fuses data from the inertial measurement unit, the laser radar sensor, and the camera so that the sensors' strengths complement one another; this improves the mapping and positioning accuracy of the mobile platform, enables obstacle avoidance and local dynamic obstacle avoidance in complex, crowded environments, and enhances the robustness of the robot's mobile chassis in actual operation.
Description of the drawings:
in order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic flow diagram of the system of the present invention;
FIG. 2 is a schematic diagram of the construction of a point cloud unit according to the present invention.
Detailed description of the embodiments:
the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
Referring to fig. 1-2, the present invention provides a technical solution. A robot intelligent control system based on SLAM technology comprises: a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor. The ARM embedded main control is interactively connected with the ZIGBEE module and with the MiniPC; the omnidirectional wheels are respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor; the laser radar sensor and the vision sensor are connected with the MiniPC; the inertial measurement unit is connected with the MiniPC; the orthogonal code disc and encoder are connected with the ARM embedded main control; and the ARM embedded main control is connected with the omnidirectional wheels;
the MiniPC is used for running the Ubuntu operating system and the ROS robot operating system;
the ARM embedded main control is used for transmitting the data returned by the orthogonal code disc, the encoder, and the laser radar sensor to the MiniPC as auxiliary information for constructing coordinate points, achieving closed-loop information processing;
the laser radar sensor, the inertial measurement unit, and the vision sensor are used for acquiring environmental information;
the omnidirectional wheels are used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication;
the orthogonal code disc and encoder are used for transmitting encoder values in preparation for coordinate calculation.
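For illustration of how encoder values prepare for coordinate and speed calculation, the following Python sketch converts quadrature-encoder count deltas into a wheel speed. It is a minimal, assumption-laden example: the function name, count resolution, and wheel radius are invented for the illustration and are not specified by the patent.

```python
import math

def encoder_to_speed(delta_counts, counts_per_rev, wheel_radius, dt):
    """Convert a quadrature-encoder count delta measured over dt seconds
    into a linear wheel speed in m/s (resolution/radius are example values)."""
    revolutions = delta_counts / counts_per_rev
    return 2.0 * math.pi * wheel_radius * revolutions / dt

# Example: 120 counts in 10 ms on a 2048-count-per-revolution encoder
# driving a wheel of 5 cm radius.
v = encoder_to_speed(delta_counts=120, counts_per_rev=2048,
                     wheel_radius=0.05, dt=0.01)
```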
In this embodiment, specifically: the camera acquires environmental image information for visual SLAM to obtain a positioning result; according to an extended Kalman filtering method, this visual SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU), wherein the inertial motion data from the IMU drive the EKF state prediction and the visual SLAM positioning result serves as the observation value that completes the EKF correction step, thereby obtaining the EKF-fused positioning result.
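A minimal Python/NumPy sketch of the EKF fusion step described above may help fix ideas. Everything concrete here is an assumption for illustration: the planar state layout [x, y, yaw, vx, vy], the constant-velocity process model, the noise values, and the simplified Jacobian (which ignores the yaw dependence of the rotated acceleration) are not taken from the patent.

```python
import numpy as np

x = np.zeros(5)                  # state: [x, y, yaw, vx, vy]
P = np.eye(5) * 0.1              # state covariance
Q = np.eye(5) * 0.01             # process noise (tuning assumption)
R = np.eye(3) * 0.05             # noise of the visual-SLAM pose (assumption)
H = np.hstack([np.eye(3), np.zeros((3, 2))])   # observe [x, y, yaw] only

def ekf_predict(x, P, acc, gyro_z, dt):
    """Prediction driven by IMU data: integrate yaw rate and body acceleration."""
    F = np.eye(5)
    F[0, 3] = F[1, 4] = dt                      # position integrates velocity
    x = F @ x
    x[2] += gyro_z * dt                         # integrate yaw rate
    c, s = np.cos(x[2]), np.sin(x[2])
    x[3] += (c * acc[0] - s * acc[1]) * dt      # body accel rotated to world
    x[4] += (s * acc[0] + c * acc[1]) * dt
    return x, F @ P @ F.T + Q

def ekf_update(x, P, slam_pose):
    """Correction: the visual-SLAM pose [x, y, yaw] is the observation."""
    innovation = slam_pose - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    return x + K @ innovation, (np.eye(5) - K @ H) @ P

x, P = ekf_predict(x, P, acc=np.array([0.1, 0.0]), gyro_z=0.02, dt=0.01)
x, P = ekf_update(x, P, slam_pose=np.array([0.001, 0.0, 0.0002]))
```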
In this embodiment, specifically: the in-plane distance and angle environmental feature data acquired by the laser radar sensor are projected from the polar coordinate system into a planar rectangular coordinate system, and the projected point cloud data are then transformed, using the fused extended Kalman filtering positioning result, into point cloud data in a unified world coordinate system.
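The projection and coordinate transformation just described can be sketched in a few lines; the function name and the (x, y, yaw) pose convention are assumptions for illustration, not the patent's notation.

```python
import numpy as np

def scan_to_world(ranges, angles, pose):
    """Project a 2D lidar scan (polar: range, bearing) into Cartesian
    coordinates, then into the world frame via the EKF-fused pose."""
    px, py, yaw = pose
    xs = ranges * np.cos(angles)                    # polar -> sensor-frame x
    ys = ranges * np.sin(angles)                    # polar -> sensor-frame y
    wx = px + xs * np.cos(yaw) - ys * np.sin(yaw)   # rotate by yaw,
    wy = py + xs * np.sin(yaw) + ys * np.cos(yaw)   # then translate
    return np.column_stack([wx, wy])

# Two beams at 0 and 90 degrees, robot at (0.5, 0) facing along +x.
points = scan_to_world(np.array([1.0, 2.0]),
                       np.array([0.0, np.pi / 2]),
                       pose=(0.5, 0.0, 0.0))
```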
In this embodiment, specifically: the ARM embedded main control adopts an ARM embedded processor and an Intel embedded processor. The Intel processor runs the Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like services for a heterogeneous computer cluster; ROS provides localization and mapping, motion planning, perception, and simulation, meeting the mobile robot's requirement of building a map. The ARM processor controls the running of the chassis, receiving and processing the chassis's speed, angle, and distance information to compute the most suitable forward speed. For the speed-control task on the ARM: after the laser radar sensor has built the map, world-coordinate distances are converted into real distances and the information is sent to the ARM, on which a UCOS system is established. The current time is obtained from the UCOS system; with the current time, the total distance, the current initial speed, the set uniform speed, and the set final speed as parameters, a formula gives the acceleration segment length, the uniform-speed segment length, and the deceleration segment length, and the position to move to is calculated. The running speed for the current period is passed through a proportional coefficient, an integral coefficient, and a differential coefficient to work out the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motor.
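The speed-control task described above pairs a trapezoidal velocity profile with PID regulation. The sketch below is a hedged illustration, not the patent's implementation: the gains, the acceleration value, the 0-to-1 duty-cycle convention, and the assumption that the total distance contains all three segments are invented for the example.

```python
def trapezoid_speed(s, s_total, v0, v_cruise, v_end, a):
    """Expected speed at travelled distance s: accelerate from v0 to
    v_cruise, cruise, then decelerate to v_end (assumes s_total is long
    enough to hold the acceleration and deceleration segments)."""
    d_acc = (v_cruise ** 2 - v0 ** 2) / (2.0 * a)     # acceleration length
    d_dec = (v_cruise ** 2 - v_end ** 2) / (2.0 * a)  # deceleration length
    if s < d_acc:
        return (v0 ** 2 + 2.0 * a * s) ** 0.5
    if s > s_total - d_dec:
        remaining = max(0.0, s_total - s)
        return max(v_end, (v_end ** 2 + 2.0 * a * remaining) ** 0.5)
    return v_cruise

class SpeedPID:
    """PID on (expected - current) speed; output clamped to a PWM duty cycle."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, expected, current, dt):
        err = expected - current          # expected: computed control speed
        self.integral += err * dt         # current: from code disc / encoder
        derivative = (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * derivative
        return min(max(out, 0.0), 1.0)    # clamp to a 0..1 PWM duty cycle

pid = SpeedPID(kp=0.8, ki=0.2, kd=0.05)
expected = trapezoid_speed(s=0.2, s_total=2.0, v0=0.0,
                           v_cruise=0.5, v_end=0.0, a=0.5)
duty = pid.step(expected, current=0.1, dt=0.01)
```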
In this embodiment, specifically: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the environmental data coordinates obtained by the laser radar sensor can be transformed into the world coordinate system using the visual SLAM positioning result fused with the IMU data by extended Kalman filtering, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
In this embodiment, specifically: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the omnidirectional wheels and drive the mobile robot to the target location.
In this embodiment, specifically: the ZIGBEE module is used for bidirectional wireless data transmission to realize remote communication.
In this embodiment, specifically: the camera is a vision camera and is used for acquiring environmental image information.
Working principle: in use, the camera acquires environmental image information for visual SLAM to obtain a positioning result. According to an extended Kalman filtering method, this visual SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU): the inertial motion data from the IMU drive the EKF state prediction, and the visual SLAM positioning result serves as the observation value that completes the EKF correction step, yielding the EKF-fused positioning result. The in-plane distance and angle environmental feature data acquired by the laser radar sensor are projected from the polar coordinate system into a planar rectangular coordinate system, and the projected point cloud data are transformed, using the fused EKF positioning result, into point cloud data in a unified world coordinate system. The ARM processor controls the running of the chassis, receiving and processing the chassis's speed, angle, and distance information to compute the most suitable forward speed.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. An intelligent robot control system based on SLAM technology, comprising: a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor; wherein the ARM embedded main control is interactively connected with the ZIGBEE module and with the MiniPC; the omnidirectional wheels are respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor; the laser radar sensor and the vision sensor are connected with the MiniPC; the inertial measurement unit is connected with the MiniPC; the orthogonal code disc and encoder are connected with the ARM embedded main control; and the ARM embedded main control is connected with the omnidirectional wheels;
the MiniPC is used for running the Ubuntu operating system and the ROS robot operating system;
the ARM embedded main control is used for transmitting the data returned by the orthogonal code disc, the encoder, and the laser radar sensor to the MiniPC as auxiliary information for constructing coordinate points, achieving closed-loop information processing;
the laser radar sensor, the inertial measurement unit and the visual sensor are used for acquiring environmental information;
the omnidirectional wheels are used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication;
the orthogonal code disc and encoder are used for transmitting encoder values in preparation for coordinate calculation.
2. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the camera acquires environmental image information for visual SLAM to obtain a positioning result; according to an extended Kalman filtering method, this visual SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU), wherein the inertial motion data from the IMU drive the EKF state prediction and the visual SLAM positioning result serves as the observation value that completes the EKF correction step, thereby obtaining the EKF-fused positioning result.
3. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the in-plane distance and angle environmental feature data acquired by the laser radar sensor are projected from the polar coordinate system into a planar rectangular coordinate system, and the projected point cloud data are then transformed, using the fused extended Kalman filtering positioning result, into point cloud data in a unified world coordinate system.
4. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the ARM embedded main control adopts an ARM embedded processor and an Intel embedded processor; the Intel processor runs the Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like services for a heterogeneous computer cluster, and ROS provides localization and mapping, motion planning, perception, and simulation, meeting the mobile robot's requirement of building a map; the ARM processor controls the running of the chassis, receiving and processing the chassis's speed, angle, and distance information to compute the most suitable forward speed; for the speed-control task on the ARM, after the laser radar sensor has built the map, world-coordinate distances are converted into real distances and the information is sent to the ARM, on which a UCOS system is established; the current time is obtained from the UCOS system, and with the current time, the total distance, the current initial speed, the set uniform speed, and the set final speed as parameters, a formula gives the acceleration segment length, the uniform-speed segment length, and the deceleration segment length, and the position to move to is calculated; the running speed for the current period is passed through a proportional coefficient, an integral coefficient, and a differential coefficient to work out the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motors; because external disturbances, and deviations caused by differences among the rotating speeds of the four motors, prevent the motor speed from reaching the target speed, two parameters are maintained, a current value and an expected value, wherein the current value is obtained by converting the readings of the orthogonal code disc and encoder, and the expected value is the calculated total control speed.
5. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the environmental data coordinates obtained by the laser radar sensor can be transformed into the world coordinate system using the visual SLAM positioning result fused with the IMU data by extended Kalman filtering, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
6. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the omnidirectional wheels and drive the mobile robot to the target location.
7. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the ZIGBEE module is used for bidirectional wireless data transmission to realize remote communication.
8. The intelligent robot control system based on SLAM technology as claimed in claim 1, wherein: the camera is a vision camera and is used for acquiring environmental image information.
CN202011175325.5A 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology Pending CN112318507A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011175325.5A CN112318507A (en) 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011175325.5A CN112318507A (en) 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology

Publications (1)

Publication Number Publication Date
CN112318507A 2021-02-05

Family

ID=74296483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011175325.5A Pending CN112318507A (en) 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology

Country Status (1)

Country Link
CN (1) CN112318507A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110011424A * 2009-07-28 2011-02-08 Yujin Robot Co., Ltd. Method for recognizing position and controlling movement of a mobile robot, and the mobile robot using the same
CN104062977A (en) * 2014-06-17 2014-09-24 天津大学 Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM
CN109993113A (en) * 2019-03-29 2019-07-09 东北大学 A kind of position and orientation estimation method based on the fusion of RGB-D and IMU information
CN110375738A (en) * 2019-06-21 2019-10-25 西安电子科技大学 A kind of monocular merging Inertial Measurement Unit is synchronous to be positioned and builds figure pose calculation method
CN110617813A (en) * 2019-09-26 2019-12-27 中国科学院电子学研究所 Monocular visual information and IMU (inertial measurement Unit) information fused scale estimation system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
王古超 (Wang Guchao): "Design and Research of an Omnidirectional Mobile Robot System Based on ROS", Anhui University of Science and Technology *
郑国贤 (Zheng Guoxian): "Autonomous Exploration and Map Construction Methods for Robots in Indoor Environments", Control Engineering of China *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112882480A (en) * 2021-03-23 2021-06-01 海南师范大学 System and method for fusing SLAM (simultaneous localization and mapping) by laser and vision aiming at crowd environment
CN112882480B (en) * 2021-03-23 2023-07-21 海南师范大学 System and method for fusing laser and vision for crowd environment with SLAM
CN114211173A (en) * 2022-01-27 2022-03-22 上海电气集团股份有限公司 Method, device and system for determining welding position
CN114211173B (en) * 2022-01-27 2024-05-31 上海电气集团股份有限公司 Method, device and system for determining welding position

Similar Documents

Publication Publication Date Title
WO2020253316A1 (en) Navigation and following system for mobile robot, and navigation and following control method
JP6868028B2 (en) Autonomous positioning navigation equipment, positioning navigation method and autonomous positioning navigation system
Samuel et al. A review of some pure-pursuit based path tracking techniques for control of autonomous vehicle
CN106527432B (en) The indoor mobile robot cooperative system corrected certainly based on fuzzy algorithmic approach and two dimensional code
US11740624B2 (en) Advanced control system with multiple control paradigms
CN110262495A (en) Mobile robot autonomous navigation and pinpoint control system and method can be achieved
WO2021135813A1 (en) Robot joint mapping method and device, and computer-readable storage medium
CN110716549A (en) Autonomous navigation robot system for map-free area patrol and navigation method thereof
CN113190020A (en) Mobile robot queue system and path planning and following method
CN111367285B (en) Wheeled mobile trolley cooperative formation and path planning method
CN112318507A (en) Robot intelligent control system based on SLAM technology
CN111260751B (en) Mapping method based on multi-sensor mobile robot
CN214846390U (en) Dynamic environment obstacle avoidance system based on automatic guided vehicle
CN210835730U (en) Control device of ROS blind guiding robot
Gourley et al. Sensor based obstacle avoidance and mapping for fast mobile robots
CN105818145A (en) Distributed control system and method for humanoid robot
CN115993089B (en) PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
CN204819543U (en) Centralized control formula multirobot motion control system
CN116100565A (en) Immersive real-time remote operation platform based on exoskeleton robot
Her et al. Localization of mobile robot using laser range finder and IR landmark
Li et al. ColAG: A Collaborative Air-Ground Framework for Perception-Limited UGVs' Navigation
de Melo et al. Mobile robot indoor autonomous navigation with position estimation using rf signal triangulation
Wang et al. Agv navigation based on apriltags2 auxiliary positioning
Balasooriya et al. Development of the smart localization techniques for low-power autonomous rover for predetermined environments
CN205651353U (en) Humanoid robot's distributed control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210205