CN114265417A - Robot control system based on laser and visual identification navigation - Google Patents

Robot control system based on laser and visual identification navigation

Info

Publication number
CN114265417A
Authority
CN
China
Prior art keywords
data
module
navigation
robot
base station
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210191166.0A
Other languages
Chinese (zh)
Inventor
梁贵轩
周可佳
高士杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boxue Kuanhang Chengdu Technology Co ltd
Original Assignee
Boxue Kuanhang Chengdu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boxue Kuanhang Chengdu Technology Co ltd filed Critical Boxue Kuanhang Chengdu Technology Co ltd
Priority to CN202210191166.0A priority Critical patent/CN114265417A/en
Publication of CN114265417A publication Critical patent/CN114265417A/en
Pending legal-status Critical Current

Abstract

The invention discloses a robot control system based on laser and visual recognition navigation, which belongs to the technical field of intelligent robots and solves the problem that existing robot systems rely on a single navigation algorithm and therefore suffer navigation deviation. The system comprises a navigation system, a hardware driving module and a hardware module. The navigation system comprises a mobile base station, a map server, a robot state release module, an AMCL algorithm module, a locator, an inertial sensor filter and a laser radar. The inertial sensor filter receives raw inertial sensor data from an IMU gyroscope data node and sends the data to the locator, and the locator sends the data to the mobile base station through the odometer; the laser radar receives data from the laser radar data node and transmits the data to the mobile base station; and the map server transmits the map data scanned and recorded by the robot to the mobile base station. The invention is used to control an autonomously navigating robot and achieves high navigation precision.

Description

Robot control system based on laser and visual identification navigation
Technical Field
The invention belongs to the technical field of intelligent robots, and particularly relates to a robot control system based on laser and visual recognition navigation.
Background
The ability to navigate its environment is essential for any mobile device. A robot must first avoid dangerous situations such as collisions and unsafe conditions (temperature, radiation, exposure to weather, etc.), but if its purpose is tied to specific locations in its environment, it must also be able to find those locations. Robot navigation refers to the ability of a robot to determine its own position in a reference frame and then plan a path to a target location. To navigate its environment, a robot or any other mobile device therefore requires a representation, i.e. a map of the environment, and the ability to interpret that representation.
Vision-based navigation, or optical navigation, uses computer vision algorithms and optical sensors, including laser-based range finders and photometric cameras with CCD arrays, to extract the visual features needed for localization in the surrounding environment.
The automatic walking systems of robots currently on the market use a single navigation algorithm, so problems such as navigation deviation can occur in some scenes.
Disclosure of Invention
The invention aims to:
in order to solve the problems that the robot automatic walking system in the prior art is single in navigation algorithm, navigation deviation occurs in partial scenes and the like, a robot control system based on laser and visual recognition navigation is provided.
The technical scheme adopted by the invention is as follows:
a robot control system based on laser and visual identification navigation comprises a navigation system, a hardware driving module and a hardware module, wherein the hardware module is communicated with the hardware driving module through a ROSSERIAL protocol, and the navigation system is electrically connected with the hardware driving module to transmit data;
the hardware driving module is provided with a chassis communication node, an IMU gyroscope data node and a laser radar data node, and the IMU gyroscope data node receives data sent by the chassis communication node;
the navigation system comprises a mobile base station, a map server, a robot state release module, an AMCL algorithm module, a locator, an inertial sensor filter and a laser radar; the inertial sensor filter receives inertial sensor raw data of an IMU gyroscope data node from the hardware driving module and sends the data to the locator, and the locator sends the data to the mobile base station through the odometer; the laser radar receives data transmitted by a laser radar data node from a hardware driving module and transmits the data to a mobile base station; the map server transmits map data obtained by scanning and recording a map by the robot to the mobile base station; the mobile base station also receives data from the robot state release module and the AMCL algorithm module through communication connection;
the AMCL is a probabilistic positioning system which positions a mobile robot in a 2D mode, and realizes an adaptive (or KLD-sampling) Monte Carlo positioning method, and the pose of the robot in a known map is tracked by using particle filtering.
After data processing, the navigation system sends instructions to the chassis communication node.
Furthermore, the hardware module comprises a motor driving module, an inertial sensor, an ultrasonic ranging module, a serial port, an encoder, a PID algorithm module, a remote control module and a servo gimbal control module; a differential kinematics model is also built into the hardware module.
Furthermore, the differential kinematics model adjusts speed through the PID algorithm module, counts through the encoder, exchanges data with the serial port through the remote control module, and transmits the differential data of robot navigation to the motor driving module.
Furthermore, the serial port receives ranging data from the ultrasonic ranging module, attitude data from the star observation sensor and control information from the servo gimbal control module.
Further, the mobile base station directly receives data from the laser radar data node, and the locator directly receives data from the chassis communication node.
In summary, due to the adoption of the technical scheme, the invention has the beneficial effects that:
1. The system communicates with a server using 5G communication technology, so it can transmit data remotely and monitor the state of the robot in real time. It is linked with an Internet of Things platform, breaking the data barriers between the robot and other peripheral IoT and intelligent devices, so that the robot can exchange data and perform linkage control with any such device in real time. For autonomous navigation it supports multiple mapping algorithms and, together with various sensor devices, can fuse multiple kinds of data from multiple sensors; it uses a depth camera to acquire scene depth image data and fused point cloud data, and, combined with the laser radar, performs accurate navigation, giving the robot's autonomous navigation high reliability.
2. The invention integrates visual following algorithm control and visual line-patrol algorithm control to control the robot's visual recognition navigation. Even after losing its coordinate position, the robot can quickly and accurately recover it, and it adjusts in real time using methods such as visual obstacle avoidance to cope with possible changes in the environment and with navigation errors the robot may produce, thereby achieving global autonomous navigation of the robot.
Drawings
FIG. 1 is a schematic diagram of the communication and control between the elements of the control system of the present invention;
FIG. 2 is a diagram of the hardware module control scheme of the present invention.
The English terms in the figures are explained as follows:
Robot state publication - robot state release module; tf static transform - static coordinate transform; Map server - map server; scan - laser scan; odometry - odometer; IMU - inertial sensor; Move base - mobile base station; EKF localization - EKF locator; IMU filter madgwick - inertial sensor (Madgwick) filter; data - raw data; rplidar/ydlidar - laser radar.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention relates to a robot control system based on laser and visual identification navigation, which comprises a navigation system, a hardware driving module and a hardware module; the hardware module communicates with the hardware driving module through the rosserial protocol, and the navigation system is electrically connected to the hardware driving module to transmit data;
the hardware driving module is provided with a chassis communication node, an IMU gyroscope data node and a laser radar data node, and the IMU gyroscope data node receives data sent by the chassis communication node;
the navigation system comprises a mobile base station, a map server, a robot state release module, an AMCL algorithm module, a locator, an inertial sensor filter and a laser radar; the inertial sensor filter receives inertial sensor raw data of an IMU gyroscope data node from the hardware driving module and sends the data to the locator, and the locator sends the data to the mobile base station through the odometer; the laser radar receives data transmitted by a laser radar data node from a hardware driving module and transmits the data to a mobile base station; the map server transmits map data obtained by scanning and recording a map by the robot to the mobile base station; the mobile base station also receives data from the robot state release module and the AMCL algorithm module through communication connection;
and the navigation system sends the instruction to the chassis communication node after data processing.
For autonomous navigation, the system supports multiple mapping algorithms and, together with the various kinds of sensor equipment it carries, can fuse multiple kinds of data from multiple sensors. The system carries multiple artificial intelligence libraries such as OpenCV, Dlib and TensorFlow, provides image processing and voice processing functions, supports the development of functions such as face detection, human body detection, color block detection and voice recognition, and supports multiple face recognition and target detection and classification algorithms.
The navigation algorithm of the invention integrates multiple SLAM algorithms, such as Gmapping, Hector SLAM, Karto SLAM, Cartographer SLAM and depth camera mapping, to build and navigate a map, so that the robot system can find the optimal navigation algorithm for different environments and application scenarios, overcoming the prior-art defects of a single navigation algorithm and a single application scene.
The driver board MCU of the hardware driving module can be an STM32F103RCT6 with an on-board MPU6050 IMU angular-velocity gyroscope sensor. The board supports closed-loop drive control of four DC motors, one ultrasonic sensor channel, two servo drive channels, one temperature-and-humidity sensor channel, one Bluetooth control channel and an SWD firmware programming port; the remaining IO interfaces are brought out via pin headers, serial lines and the like for the user's secondary development. The board mainly performs power control, IMU data reading, motor drive development, servo drive development and similar work.
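The rosserial link between the hardware module and the driver board frames each message for transport over the serial line. A Python sketch of that framing is shown below; the byte layout follows rosserial's published wire format (sync flag 0xFF, protocol version byte 0xFE, little-endian length plus length checksum, topic id, payload, payload checksum), but the exact constants should be treated as an assumption and checked against the rosserial implementation actually in use.

```python
def rosserial_frame(topic_id: int, payload: bytes) -> bytes:
    """Frame a serialized message for a rosserial-style serial link."""
    sync = bytes([0xFF, 0xFE])                    # sync flag + protocol version
    length = len(payload)
    len_lo_hi = bytes([length & 0xFF, (length >> 8) & 0xFF])
    len_ck = bytes([255 - sum(len_lo_hi) % 256])  # checksum over length bytes
    body = bytes([topic_id & 0xFF, (topic_id >> 8) & 0xFF]) + payload
    body_ck = bytes([255 - sum(body) % 256])      # checksum over topic id + payload
    return sync + len_lo_hi + len_ck + body + body_ck

# Usage: frame a 2-byte payload on a hypothetical topic id 125.
frame = rosserial_frame(125, b"\x01\x02")
```

The two checksums let the MCU reject corrupted frames cheaply: the sum of the checked bytes plus their checksum must equal 255 modulo 256.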
Furthermore, the hardware module comprises a motor driving module, an inertial sensor, an ultrasonic ranging module, a serial port, an encoder, a PID algorithm module, a remote control module and a servo gimbal control module; a differential kinematics model is also built into the hardware module.
Furthermore, the differential kinematics model adjusts speed through the PID algorithm module, counts through the encoder, exchanges data with the serial port through the remote control module, and transmits the differential data of robot navigation to the motor driving module.
Furthermore, the serial port receives ranging data from the ultrasonic ranging module, attitude data from the star observation sensor and control information from the servo gimbal control module.
Further, the mobile base station directly receives data from the laser radar data node, and the locator directly receives data from the chassis communication node.
For the operating system, ROS, which is currently best suited to robot system development, can be selected; the running state of the robot can be checked in real time through Gazebo simulation, RViz visual mapping and similar tools.
In terms of application, the system can be linked with an Internet of Things platform over a 5G signal. Using 5G communication and the IoT platform, it can exchange data in real time with any IoT device connected to the platform; task instructions can also be issued directly to the robot through the platform over the 5G communication protocol, and the robot can likewise be coordinated with other IoT and intelligent devices through the platform.
The control principle of each hardware module as shown in fig. 2 is as follows:
the robot of the invention acquires actual data (such as motor rotating speed, IMU data, ultrasonic data, encoder data and the like) of a hardware module through modules such as a motor drive module, an IMU attitude data acquisition module, an ultrasonic wave module, an encoder module and the like.
The differential kinematics model is the control core: according to the commanded data and the actual data of the hardware module, it drives the four motors so that the robot advances, turns and performs other actions as required.
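The differential kinematics above can be sketched as follows. This minimal Python example treats the four-motor base as left/right wheel pairs (an assumption; the wheel base and wheel radius values are illustrative, not taken from the patent): it converts a commanded body twist (linear velocity v, angular velocity omega) into wheel angular speeds and back.

```python
def wheel_speeds(v, omega, wheel_base=0.3, wheel_radius=0.05):
    """Inverse kinematics: body twist -> (left, right) wheel angular speeds [rad/s]."""
    v_left = v - omega * wheel_base / 2.0
    v_right = v + omega * wheel_base / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

def body_twist(w_left, w_right, wheel_base=0.3, wheel_radius=0.05):
    """Forward kinematics: wheel angular speeds -> (v, omega) of the base."""
    v_left, v_right = w_left * wheel_radius, w_right * wheel_radius
    return (v_left + v_right) / 2.0, (v_right - v_left) / wheel_base

# Usage: a pure forward command spins both wheels equally, and the round
# trip through both functions recovers the commanded twist.
w_l, w_r = wheel_speeds(0.5, 0.0)
v, omega = body_twist(w_l, w_r)
```

The navigation system's velocity command (the twist published to the base) maps through exactly this kind of relation onto the per-motor speed setpoints that the PID loops then track.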
The serial port is the collection point for all kinds of data; all data are transmitted to the kinematics model through the serial port.
The motor drive is the overall drive module of the four motors: it issues speed commands to the four motors, and the motors rotate according to the commanded speed data.
The IMU provides data such as the real-time attitude, heading and travelled distance of the robot; based on the IMU data, the kinematics model issues speed commands to the four motors to ensure that the robot travels along the specified path.
The ultrasonic ranging provides the kinematics model with the real-time situation of surrounding obstacles; according to these data, the kinematics model adjusts the speeds of the four motors in real time to achieve obstacle avoidance and similar functions.
The encoder count mainly provides travel distance data for the kinematics model, which calculates the actual distance travelled by the robot from the encoder count and the motor speed.
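The distance computation described here can be sketched like this; the tick count per revolution, wheel radius and wheel base are illustrative values, not taken from the patent.

```python
import math

def wheel_distance(ticks, ticks_per_rev=1000, wheel_radius=0.05):
    """Distance travelled by one wheel, from its encoder tick count."""
    return (ticks / ticks_per_rev) * 2.0 * math.pi * wheel_radius

def odom_update(x, y, theta, d_left, d_right, wheel_base=0.3):
    """Dead-reckoning pose update from the two wheel distances (midpoint-arc
    approximation over one control interval)."""
    d = (d_left + d_right) / 2.0
    dtheta = (d_right - d_left) / wheel_base
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

# Usage: equal tick counts on both wheels give pure forward motion.
d = wheel_distance(1000)            # one full wheel revolution
x, y, theta = odom_update(0.0, 0.0, 0.0, d, d)
```

Integrating such updates each control cycle yields the odometry stream that the locator fuses with the IMU data before it reaches the mobile base station.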
The PID algorithm assists in adjusting the motor speed according to classical PID control theory.
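A minimal discrete PID speed loop of the kind described here can be sketched in Python; the gains, the output clamp and the first-order motor model are invented for the illustration and are not the patent's parameters.

```python
class PID:
    """Textbook PID controller with output clamping (gains are illustrative)."""
    def __init__(self, kp, ki, kd, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = None

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(self.out_min, min(self.out_max, out))  # clamp to drive range

# Usage: drive a toy integrator motor model toward a 1.0 m/s speed setpoint.
pid = PID(kp=2.0, ki=1.0, kd=0.0)
speed, dt = 0.0, 0.01
for _ in range(1000):
    speed += 5.0 * pid.update(1.0, speed, dt) * dt  # assumed motor gain 5.0
```

In the robot, one such loop per motor tracks the wheel-speed setpoints produced by the differential kinematics model, with the encoder providing the measured speed.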
The robot supports remote control over Bluetooth, WiFi and similar connections; after a remote control device issues a command, the kinematics model adjusts the speeds of the four motors according to the command and their actual speeds, so as to complete the remotely issued command.
The robot can also connect to a cloud server via 5G or another network; the cloud server sends control commands down to the serial port over the 5G network, and the kinematics model adjusts the speeds of the four motors according to the command and their actual speeds, so as to complete the command issued from the cloud.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (5)

1. A robot control system based on laser and visual identification navigation is characterized by comprising a navigation system, a hardware driving module and a hardware module, wherein the hardware module is communicated with the hardware driving module through a ROSSERIAL protocol, and the navigation system is electrically connected with the hardware driving module to transmit data;
the hardware driving module is provided with a chassis communication node, an IMU gyroscope data node and a laser radar data node, and the IMU gyroscope data node receives data sent by the chassis communication node;
the navigation system comprises a mobile base station, a map server, a robot state release module, an AMCL algorithm module, a locator, an inertial sensor filter and a laser radar; the inertial sensor filter receives inertial sensor raw data of an IMU gyroscope data node from the hardware driving module and sends the data to the locator, and the locator sends the data to the mobile base station through the odometer; the laser radar receives data transmitted by a laser radar data node from a hardware driving module and transmits the data to a mobile base station; the map server transmits map data obtained by scanning and recording a map by the robot to the mobile base station; the mobile base station also receives data from the robot state release module and the AMCL algorithm module through communication connection;
and the navigation system sends the instruction to the chassis communication node after data processing.
2. The robot control system based on laser and visual recognition navigation of claim 1, wherein the hardware module comprises a motor driving module, an inertial sensor, an ultrasonic ranging module, a serial port, an encoder, a PID algorithm module, a remote control module and a servo gimbal control module, and a differential kinematics model is further established in the hardware module.
3. The robot control system based on laser and vision recognition navigation of claim 2, wherein the differential kinematic model performs speed adjustment through a PID algorithm module, performs counting through an encoder, performs data exchange with a serial port through a remote control module, and transmits differential data of robot navigation to a motor driving module.
4. The laser and visual recognition navigation-based robot control system of claim 2, wherein the serial port receives ranging data from the ultrasonic ranging module, attitude data from the star observation sensor and control information from the servo gimbal control module.
5. The laser and vision recognition navigation-based robot control system of claim 1, wherein the mobile base station directly receives data from a lidar data node and the locator directly receives data from a chassis communication node.
CN202210191166.0A 2022-03-01 2022-03-01 Robot control system based on laser and visual identification navigation Pending CN114265417A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210191166.0A CN114265417A (en) 2022-03-01 2022-03-01 Robot control system based on laser and visual identification navigation


Publications (1)

Publication Number Publication Date
CN114265417A 2022-04-01

Family

ID=80833788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210191166.0A Pending CN114265417A (en) 2022-03-01 2022-03-01 Robot control system based on laser and visual identification navigation

Country Status (1)

Country Link
CN (1) CN114265417A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108733062A (en) * 2018-06-25 2018-11-02 山东大学 Family accompanies and attends to robot autonomous charging system and method
CN109599945A (en) * 2018-11-30 2019-04-09 武汉大学 A kind of autonomous crusing robot cruising inspection system of wisdom power plant and method
CN111830987A (en) * 2020-07-27 2020-10-27 济南浪潮高新科技投资发展有限公司 Control method and device of inspection robot and inspection system of robot
CN111866337A (en) * 2020-06-30 2020-10-30 北京福瑶科技有限公司 Intelligent inspection robot and inspection method
CN212322113U (en) * 2020-07-01 2021-01-08 湖南工业大学 Trolley obstacle avoidance system based on laser radar
CN113093756A (en) * 2021-04-07 2021-07-09 福州大学 Indoor navigation robot based on laser SLAM under raspberry group platform
CN113821040A (en) * 2021-09-28 2021-12-21 中通服创立信息科技有限责任公司 Robot with depth vision camera and laser radar integrated navigation
CN113848561A (en) * 2021-09-28 2021-12-28 中通服创立信息科技有限责任公司 Depth vision camera and laser radar fused navigation method, system and equipment


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Yang Fang: "Design and Implementation of a Mobile Robot Chassis Based on ROS", China Master's Theses Full-text Database, Information Science and Technology *
Ke Yao: "Design of an Open-Source Mobile Robot Platform Based on ROS", Microcontrollers & Embedded Systems *
Huang Jiajun et al.: "Design of a Control System for an Omnidirectional Mobile Robot Based on Laser SLAM", Bulletin of Science and Technology *

Similar Documents

Publication Publication Date Title
WO2022021739A1 (en) Humanoid inspection operation method and system for semantic intelligent substation robot
WO2020253316A1 (en) Navigation and following system for mobile robot, and navigation and following control method
CN102368158B (en) Navigation positioning method of orchard machine
CN106354161A (en) Robot motion path planning method
CN110163963B (en) Mapping device and mapping method based on SLAM
CN111958592A (en) Image semantic analysis system and method for transformer substation inspection robot
CN108536145A (en) A kind of robot system intelligently followed using machine vision and operation method
CN214520204U (en) Port area intelligent inspection robot based on depth camera and laser radar
CN105082161A (en) Robot vision servo control device of binocular three-dimensional video camera and application method of robot vision servo control device
CN110716549A (en) Autonomous navigation robot system for map-free area patrol and navigation method thereof
CN111123911A (en) Legged intelligent star catalogue detection robot sensing system and working method thereof
WO2022252221A1 (en) Mobile robot queue system, path planning method and following method
CN111958593B (en) Vision servo method and system for inspection operation robot of semantic intelligent substation
Mulky et al. Autonomous scooter navigation for people with mobility challenges
CN113498667A (en) Intelligent mowing robot based on panoramic machine vision
Jensen et al. Laser range imaging using mobile robots: From pose estimation to 3D-models
Mutz et al. Following the leader using a tracking system based on pre-trained deep neural networks
Su et al. A framework of cooperative UAV-UGV system for target tracking
CN112327868A (en) Intelligent robot automatic navigation system
CN114265417A (en) Robot control system based on laser and visual identification navigation
CN116352722A (en) Multi-sensor fused mine inspection rescue robot and control method thereof
CN115237113B (en) Robot navigation method, robot system and storage medium
Sidaoui et al. Collaborative human augmented SLAM
CN113610910B (en) Obstacle avoidance method for mobile robot
CN212193168U (en) Robot head with laser radars arranged on two sides

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220401