CN112318507A - Robot intelligent control system based on SLAM technology - Google Patents

Robot intelligent control system based on SLAM technology

Info

Publication number
CN112318507A
Authority
CN
China
Prior art keywords
speed
sensor
robot
minipc
main control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011175325.5A
Other languages
Chinese (zh)
Inventor
范海廷
杜云刚
苏欣
陈帅
田莎琦
侯培军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inner Mongolia University of Technology
Original Assignee
Inner Mongolia University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inner Mongolia University of Technology filed Critical Inner Mongolia University of Technology
Priority to CN202011175325.5A priority Critical patent/CN112318507A/en
Publication of CN112318507A publication Critical patent/CN112318507A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/022Optical sensing devices using lasers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • B25J19/023Optical sensing devices including video camera means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a robot intelligent control system based on SLAM technology, comprising: a MiniPC, an ARM embedded main control, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor. The ARM embedded main control is interactively connected with the ZIGBEE module and with the MiniPC, and the omnidirectional wheel is respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor. In use, the invention fuses data from the inertial measurement unit, the laser radar sensor, and the camera so that the strengths of the sensors complement one another, improving the mapping and positioning accuracy of the mobile platform, enabling obstacle avoidance and local dynamic obstacle avoidance in complex and crowded environments, and enhancing the robustness of the robot's mobile chassis in actual operation.


Description

Robot intelligent control system based on SLAM technology
The technical field is as follows:
the invention relates to the technical field of robot control, in particular to an SLAM technology-based robot intelligent control system.
Background art:
a robot is an intelligent machine that can work semi-autonomously or fully autonomously. Robots can assist or even replace human beings in dangerous, heavy, and complex work, improving work efficiency and quality, serving human life, and expanding or extending the range of human activity and capability. As the intelligence level of robots continues to improve, robots find increasingly important applications in home service, health care and medical treatment, new media entertainment, and other fields. A mobile robot usually works in a relatively complex indoor environment and needs multiple sensors to sense its surroundings and complete tasks such as map construction, positioning, and autonomous navigation, i.e., to realize SLAM. Information fusion of multiple sensors has therefore become an important aspect of mobile-robot SLAM research, with an important influence on the realization of the robot's functions and on its level of intelligence.
In existing robot control systems, SLAM algorithms that rely on the laser radar alone or on vision alone have certain defects: positioning precision is poor, obstacle-avoidance capability is weak in complex environments and in places with concentrated pedestrian flow, and robustness suffers, all of which hinders the control and use of the robot.
The invention content is as follows:
the invention aims to provide a robot intelligent control system based on SLAM technology to solve the problems in the background technology.
The invention is implemented by the following technical scheme: a robot intelligent control system based on SLAM technology comprises: the system comprises a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, an omnidirectional wheel, an inertia measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera and a 2D laser sensor; the ARM embedded main control is in interactive connection with the ZIGBEE module, the ARM embedded main control is in interactive connection with the MiniPC, the omnidirectional wheel is respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc, the encoder and the visual sensor, the laser radar sensor and the visual sensor are connected with the MiniPC, the inertial measurement unit is connected with the MiniPC, the orthogonal code disc and the encoder are connected with the ARM embedded main control, and the ARM embedded main control is connected with the omnidirectional wheel;
the MiniPC is used for carrying a Ubuntu operating system and a ROS operating system;
the ARM embedded main control is used for transmitting data returned by the orthogonal code disc, the encoder and the laser radar sensor to the MiniPC as auxiliary information for constructing coordinate points so as to achieve the purpose of closed-loop information processing;
the laser radar sensor, the inertial measurement unit and the visual sensor are used for acquiring environmental information;
the omnidirectional wheel is used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication;
the orthogonal code wheel and the encoder are used for transmitting the value of the encoder to prepare for coordinate calculation.
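To make the encoder role above concrete, the following is a hypothetical sketch of converting quadrature-encoder count deltas into wheel travel for the coordinate calculation; the counts-per-revolution and wheel radius are assumptions for illustration, not values given in the patent.

```python
import math

# Assumed encoder and wheel parameters (the patent does not specify them).
COUNTS_PER_REV = 4 * 500        # quadrature decoding yields 4 edges per line
WHEEL_RADIUS = 0.05             # wheel radius in metres (assumed)

def counts_to_distance(delta_counts):
    """Wheel travel (metres) corresponding to an encoder count delta."""
    revolutions = delta_counts / COUNTS_PER_REV
    return revolutions * 2 * math.pi * WHEEL_RADIUS

# One full revolution of the wheel:
print(round(counts_to_distance(2000), 4))   # → 0.3142
```

A real chassis would run this per wheel and feed the per-wheel displacements into the omnidirectional-wheel kinematics before updating the coordinate estimate.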
As further preferable in the present technical solution: the camera acquires environment image information and performs visual SLAM to obtain a positioning result. According to an extended Kalman filter (EKF) method, the visual-SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU): the inertial motion data obtained by the IMU drive the EKF state prediction, and the core visual-SLAM positioning result serves as the observation that completes the EKF correction step, yielding the EKF-fused positioning result.
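The prediction/correction loop described above can be sketched as a minimal one-axis EKF: the IMU acceleration propagates the state, and the visual-SLAM position fix is the observation that corrects it. All dimensions, rates, and noise parameters below are assumptions for illustration; the patent specifies the structure of the fusion, not these numbers.

```python
import numpy as np

# State x = [position, velocity]; IMU acceleration drives the prediction,
# the visual-SLAM position fix is the EKF observation.
dt = 0.01                                  # IMU sample period (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
B = np.array([[0.5 * dt**2], [dt]])        # how acceleration enters the state
H = np.array([[1.0, 0.0]])                 # SLAM observes position only
Q = np.eye(2) * 1e-4                       # process noise (assumed)
R = np.array([[1e-2]])                     # SLAM measurement noise (assumed)

def ekf_predict(x, P, accel):
    """Propagate the state with IMU acceleration (EKF prediction step)."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, slam_pos):
    """Correct the prediction with the visual-SLAM position fix."""
    y = np.array([[slam_pos]]) - H @ x      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.zeros((2, 1)); P = np.eye(2)
for step in range(100):
    x, P = ekf_predict(x, P, accel=1.0)     # constant 1 m/s^2 for the demo
    if step % 10 == 9:                      # SLAM fixes arrive at a lower rate
        t = (step + 1) * dt
        x, P = ekf_update(x, P, 0.5 * 1.0 * t**2)
print(round(float(x[0, 0]), 3))             # fused position ≈ 0.5 m after 1 s
```

Note the rate asymmetry: the IMU prediction runs every cycle, while the (slower, drift-free) visual-SLAM correction arrives only every tenth cycle, which is exactly why the fusion improves on either source alone.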
As further preferable in the present technical solution: projection calculation is performed on the in-plane distance and angle environmental feature data acquired by the laser radar sensor, projecting them from a polar coordinate system into a planar rectangular coordinate system; the projected point cloud data then undergo coordinate conversion using the fused extended-Kalman-filter positioning result, yielding point cloud data in a unified world coordinate system.
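The two coordinate steps above can be sketched in a few lines; the interfaces are assumed for illustration. Each lidar return (range r, bearing theta) is projected from the sensor's polar frame to Cartesian coordinates, then transformed into the world frame using the EKF-fused robot pose (x, y, yaw).

```python
import math

def polar_to_cartesian(r, theta):
    """Project one (range, bearing) lidar return into the sensor frame."""
    return (r * math.cos(theta), r * math.sin(theta))

def to_world(point, pose):
    """Rotate and translate a sensor-frame point by the EKF-fused pose."""
    px, py = point
    x, y, yaw = pose
    wx = x + px * math.cos(yaw) - py * math.sin(yaw)
    wy = y + px * math.sin(yaw) + py * math.cos(yaw)
    return (wx, wy)

scan = [(1.0, 0.0), (2.0, math.pi / 2)]    # (range m, bearing rad) samples
pose = (5.0, 3.0, math.pi / 2)             # robot at (5, 3), facing +y
cloud = [to_world(polar_to_cartesian(r, th), pose) for r, th in scan]
print([(round(x, 2), round(y, 2)) for x, y in cloud])   # → [(5.0, 4.0), (3.0, 3.0)]
```

Because every scan is expressed in the same world frame, clouds from successive poses can be merged directly into one map.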
As further preferable in the present technical solution: the ARM embedded main control uses two processors, an ARM embedded processor and an Intel embedded processor. The Intel processor carries a Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like functions for a heterogeneous computer cluster; ROS realizes localization and mapping, motion planning, perception, and simulation, meeting the mobile robot's mapping requirements. The ARM processor controls the running of the chassis: it receives and processes the chassis's speed, angle, and distance information and calculates the most suitable forward speed. In the speed-control task on the ARM, after the laser radar sensor has built the map, world-coordinate distances are converted into real distances and the information is sent to the ARM, on which a UCOS system is built. Within the UCOS system, the current time is obtained, and the current time, total distance, current initial speed, set uniform speed, and set final speed are taken as parameters and substituted into formulas to calculate the acceleration length, uniform-speed length, and deceleration length, from which the position the robot should currently have reached is derived. The running speed for the current cycle is then computed with a proportional coefficient, an integral coefficient, and a differential coefficient to obtain the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motors. Because external disturbance and the differing speeds of the four motors introduce deviations, the motor speed may not reach the target speed; two parameters are therefore maintained, a current value and an expected value. The current value is converted from the readings of the orthogonal code disc and encoder, and the expected value is the calculated total control speed, processed in PWM duty-cycle form.
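The speed-control loop described above can be sketched as a trapezoidal profile plus a PID correction feeding a PWM duty cycle. All gains, limits, and units below are assumptions for illustration; the patent gives the structure of the loop, not its numbers.

```python
def trapezoid_speed(s, total, v0, v_cruise, v_end, a):
    """Target speed at distance s along a trapezoidal (accel/cruise/decel) profile."""
    d_acc = (v_cruise**2 - v0**2) / (2 * a)       # acceleration length
    d_dec = (v_cruise**2 - v_end**2) / (2 * a)    # deceleration length
    d_cruise = max(total - d_acc - d_dec, 0.0)    # uniform-speed length
    if s < d_acc:
        return (v0**2 + 2 * a * s) ** 0.5
    if s < d_acc + d_cruise:
        return v_cruise
    remaining = max(total - s, 0.0)
    return max((v_end**2 + 2 * a * remaining) ** 0.5, v_end)

class PID:
    """Trim the commanded speed toward the profile speed each control cycle."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i = 0.0
        self.prev = 0.0
    def step(self, expected, current, dt):
        e = expected - current        # expected value minus encoder-derived value
        self.i += e * dt
        d = (e - self.prev) / dt
        self.prev = e
        return self.kp * e + self.ki * self.i + self.kd * d

def to_duty(speed, v_max=2.0):
    """Clamp the total control speed into a 0..1 PWM duty cycle (v_max assumed)."""
    return min(max(speed / v_max, 0.0), 1.0)

# One control cycle: profile speed, PID correction from the encoder reading,
# then conversion to a PWM duty cycle for the motor.
v_target = trapezoid_speed(s=0.5, total=4.0, v0=0.0, v_cruise=1.0,
                           v_end=0.0, a=1.0)
pid = PID(kp=0.8, ki=0.1, kd=0.05)
correction = pid.step(expected=v_target, current=0.9, dt=0.02)
duty = to_duty(v_target + correction)
```

In the patent's terms, `current` is the value converted from the orthogonal code disc and encoder, `expected` is the profile speed, and each of the four motors would run its own copy of this loop.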
As further preferable in the present technical solution: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the positioning result of the visual SLAM fused with IMU data by extended Kalman filtering can be used to convert the environmental data coordinates obtained by the laser radar sensor into the world coordinate system, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
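One plausible reading of the 2D-to-3D construction above is that each planar scan, tagged with the fused pose at scan time, is lifted into a shared 3D cloud; the z source here (a per-pose sensor height) is purely an assumption for illustration, since the patent states the goal rather than the mechanism.

```python
import math

def scan_to_world_3d(scan, pose, sensor_height):
    """Lift one planar (range, bearing) scan into world-frame 3D points."""
    x, y, yaw = pose                           # EKF-fused pose at scan time
    cloud = []
    for r, theta in scan:
        px = r * math.cos(theta)
        py = r * math.sin(theta)
        wx = x + px * math.cos(yaw) - py * math.sin(yaw)
        wy = y + px * math.sin(yaw) + py * math.cos(yaw)
        cloud.append((wx, wy, sensor_height))  # assumed z for the planar hit
    return cloud

# Aggregate scans taken at successive fused poses into one 3D map.
map_cloud = []
trajectory = [((0.0, 0.0, 0.0), 0.3), ((1.0, 0.0, 0.0), 0.3)]
for pose, height in trajectory:
    map_cloud.extend(scan_to_world_3d([(1.0, 0.0)], pose, height))
print(len(map_cloud))   # → 2
```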
As further preferable in the present technical solution: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the operation of the omnidirectional wheels and steer the mobile robot to the target location.
As further preferable in the present technical solution: the ZIGBEE module is used for bidirectional wireless data transmission to realize remote communication.
As further preferable in the present technical solution: the camera is a visual camera and is used for acquiring environment image information.
The invention has the advantages that: in the using process, aiming at the existing defects of the SLAM algorithm applied to the laser radar sensor and the vision sensor, the invention fuses the inertial measurement unit, the laser radar sensor and the camera data, so that the advantages of the sensors are complemented, the mapping and positioning precision of the mobile platform is improved, the obstacle avoidance and the local dynamic obstacle avoidance in the occasions with complex environment and concentrated pedestrian flow can be realized, and the robustness of the mobile chassis of the robot in actual operation is enhanced.
Description of the drawings:
in order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a schematic flow diagram of the system of the present invention;
FIG. 2 is a schematic diagram of the construction of a point cloud unit according to the present invention.
The specific implementation mode is as follows:
the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
Referring to fig. 1-2, the present invention provides a technical solution: a robot intelligent control system based on SLAM technology comprises: the system comprises a MiniPC, an ARM embedded main control unit, a laser radar sensor, a vision sensor, an omnidirectional wheel, an inertia measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera and a 2D laser sensor; the ARM embedded main control is in interactive connection with the ZIGBEE module, the ARM embedded main control is in interactive connection with the MiniPC, the omnidirectional wheel is respectively connected with the laser radar sensor, the ZIGBEE module, the inertia measurement unit, the orthogonal code disc, the encoder and the vision sensor, the laser radar sensor and the vision sensor are connected with the MiniPC, the inertia measurement unit is connected with the MiniPC, the orthogonal code disc and the encoder are connected with the ARM embedded main control, and the ARM embedded main control is connected with the omnidirectional wheel;
the MiniPC is used for carrying a Ubuntu operating system and a ROS operating system;
the ARM embedded master control is used for transmitting data returned by the orthogonal code disc, the encoder and the laser radar sensor to the MiniPC to serve as auxiliary information for constructing a coordinate point so as to achieve the purpose of closed-loop information processing;
the laser radar sensor, the inertia measurement unit and the vision sensor are used for acquiring environmental information;
the omnidirectional wheel is used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication;
orthogonal codewheels and encoders for communicating encoder values in preparation for coordinate calculations.
In this embodiment, specifically: the camera acquires environment image information to perform visual SLAM to obtain a positioning result, the positioning result of the visual SLAM is fused with inertial measurement data such as acceleration and angular velocity obtained by an Inertial Measurement Unit (IMU) according to an extended Kalman filtering method, inertial motion data obtained by the IMU is used for EKF state estimation, and the core positioning result of the visual SLAM is used as an observation value to complete the correction process of EKF, so that the positioning result after EKF fusion is obtained.
In this embodiment, specifically: and performing projection calculation on the in-plane distance and angle environment characteristic data acquired by the laser radar sensor, projecting the in-plane distance and angle environment characteristic data from a polar coordinate system to a plane rectangular coordinate system, and performing coordinate conversion on the projected point cloud data by combining the fused extended Kalman filtering positioning result to obtain point cloud data in a unified world coordinate system.
In this embodiment, specifically: the ARM embedded main control adopts an ARM embedded processor and an Intel embedded processor. The Intel processor carries a Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like functions for a heterogeneous computer cluster; ROS realizes localization and mapping, motion planning, perception, and simulation to meet the mobile robot's mapping requirements. The ARM processor controls the running of the chassis, receives and processes the chassis's speed, angle, and distance information, and calculates the most appropriate forward speed; the speed-control task runs on the ARM. After the laser radar sensor has built the map, the world-coordinate distance is converted into the real distance and the information is transmitted to the ARM, on which a UCOS system is established. The current time is acquired from the UCOS system, and the current time, total distance, current initial speed, set uniform speed, and set final speed are taken as parameters and substituted into formulas to calculate the acceleration length, uniform-speed length, and deceleration length, as well as the position the robot should currently have reached. The running speed for the current cycle is computed with a proportional coefficient, an integral coefficient, and a differential coefficient to obtain the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motors.
In this embodiment, specifically: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the positioning result of the visual SLAM fused with IMU data by extended Kalman filtering can be used to convert the environmental data coordinates obtained by the laser radar sensor into the world coordinate system, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
In this embodiment, specifically: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the operation of the omnidirectional wheels and guide the mobile robot to the target location.
In this embodiment, specifically: the ZIGBEE module is used for bidirectional wireless data transmission to realize remote communication.
In this embodiment, specifically: the camera is a visual camera and is used for acquiring environment image information.
Working principle or structural principle: in use, the camera acquires environment image information and performs visual SLAM to obtain a positioning result. According to the extended Kalman filter method, the visual-SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU): the IMU's inertial motion data drive the EKF state prediction, and the core visual-SLAM positioning result serves as the observation that completes the EKF correction, yielding the EKF-fused positioning result. The in-plane distance and angle environmental feature data obtained by the laser radar sensor are projected from a polar coordinate system into a planar rectangular coordinate system, and the projected point cloud data are coordinate-converted using the fused EKF positioning result to obtain point cloud data in a unified world coordinate system. The ARM processor controls the running of the chassis, receiving and processing its speed, angle, and distance information to calculate the most appropriate forward speed.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (8)

1. A robot intelligent control system based on SLAM technology, characterized by comprising: a MiniPC, an ARM embedded main control, a laser radar sensor, a vision sensor, omnidirectional wheels, an inertial measurement unit, a ZIGBEE module, an orthogonal code disc and encoder, a camera, and a 2D laser sensor; the ARM embedded main control is interactively connected with the ZIGBEE module; the ARM embedded main control is interactively connected with the MiniPC; the omnidirectional wheel is respectively connected with the laser radar sensor, the ZIGBEE module, the inertial measurement unit, the orthogonal code disc and encoder, and the vision sensor; the laser radar sensor and the vision sensor are connected with the MiniPC; the inertial measurement unit is connected with the MiniPC; the orthogonal code disc and encoder are connected with the ARM embedded main control; and the ARM embedded main control is connected with the omnidirectional wheel;
the MiniPC is used for carrying a Ubuntu operating system and a ROS operating system;
the ARM embedded main control is used for transmitting the data returned by the orthogonal code disc and encoder and the laser radar sensor to the MiniPC as auxiliary information for constructing coordinate points, so as to achieve closed-loop information processing;
the laser radar sensor, the inertial measurement unit, and the vision sensor are used for collecting environmental information;
the omnidirectional wheels are used for driving the robot to move;
the ZIGBEE module is used for realizing remote communication; and
the orthogonal code disc and encoder are used for transmitting the encoder values in preparation for coordinate calculation.
2. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: the camera acquires environment image information and performs visual SLAM to obtain a positioning result; according to an extended Kalman filter method, the visual-SLAM positioning result is fused with inertial measurement data, such as acceleration and angular velocity, obtained by the inertial measurement unit (IMU); the inertial motion data obtained by the IMU are used for the EKF state prediction, and the core visual-SLAM positioning result serves as the observation that completes the EKF correction process, thereby obtaining the EKF-fused positioning result.
3. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: projection calculation is performed on the in-plane distance and angle environmental feature data acquired by the laser radar sensor, projecting them from a polar coordinate system into a planar rectangular coordinate system; combined with the fused extended-Kalman-filter positioning result, coordinate conversion is performed on the projected point cloud data to obtain point cloud data in a unified world coordinate system.
4. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: the ARM embedded main control adopts an ARM embedded processor and an Intel embedded processor; the Intel processor carries a Ubuntu operating system, under which the ROS robot operating system is installed to provide operating-system-like functions for a heterogeneous computer cluster; ROS realizes localization and mapping, motion planning, perception, and simulation to meet the mobile robot's mapping requirements; the ARM processor controls the running of the chassis, receives and processes the chassis's speed, angle, and distance information, and calculates the most appropriate forward speed; in the speed-control task on the ARM, after the laser radar sensor has built the map, the world-coordinate distance is converted into the real distance and the information is sent to the ARM, on which a UCOS system is built; the current time is obtained in the UCOS system, and the current time, total distance, current initial speed, set uniform speed, and set final speed are taken as parameters and substituted into formulas to calculate the acceleration length, uniform-speed length, and deceleration length, from which the position that should currently be reached is calculated; the running speed of the current cycle is computed with a proportional coefficient, an integral coefficient, and a differential coefficient to obtain the final total control speed, which is converted into the duty cycle of a PWM wave and output to the motors; because external disturbance and the differing speeds of the four motors cause deviations, the motor speed may not reach the target speed, so two parameters are set, a current value and an expected value; the current value is converted from the values acquired by the orthogonal code disc and encoder, and the expected value is the calculated total control speed, which is converted into PWM duty-cycle form for processing.
5. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: after the laser radar sensor obtains in-plane environmental feature information such as distance and angle, the positioning result of the visual SLAM fused with IMU data by extended Kalman filtering is used to convert the environmental data coordinates obtained by the laser radar sensor into the world coordinate system, so that a three-dimensional laser point cloud map is accurately constructed in real time from the two-dimensional laser data.
6. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: the MiniPC acts as the brain, transmitting instructions to the ARM embedded main control to indirectly control the operation of the omnidirectional wheels and guide the mobile robot to the target location.
7. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: the ZIGBEE module transmits data bidirectionally and wirelessly to realize remote communication.
8. The robot intelligent control system based on SLAM technology according to claim 1, characterized in that: the camera is a visual camera used for acquiring environment image information.
CN202011175325.5A 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology Pending CN112318507A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011175325.5A CN112318507A (en) 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology


Publications (1)

Publication Number Publication Date
CN112318507A true CN112318507A (en) 2021-02-05

Family

ID=74296483

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011175325.5A Pending CN112318507A (en) 2020-10-28 2020-10-28 Robot intelligent control system based on SLAM technology

Country Status (1)

Country Link
CN (1) CN112318507A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110011424A (en) * 2009-07-28 2011-02-08 주식회사 유진로봇 Position recognition and driving control method of mobile robot and mobile robot using same
CN104062977A (en) * 2014-06-17 2014-09-24 天津大学 Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM
CN109993113A (en) * 2019-03-29 2019-07-09 东北大学 A Pose Estimation Method Based on RGB-D and IMU Information Fusion
CN110375738A (en) * 2019-06-21 2019-10-25 西安电子科技大学 A kind of monocular merging Inertial Measurement Unit is synchronous to be positioned and builds figure pose calculation method
CN110617813A (en) * 2019-09-26 2019-12-27 中国科学院电子学研究所 Monocular visual information and IMU (inertial measurement Unit) information fused scale estimation system and method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wang Guchao: "Design and Research of an Omnidirectional Mobile Robot System Based on ROS", Anhui University of Science and Technology *
Zheng Guoxian: "Autonomous Exploration and Map Construction Method for Robots in Indoor Environments", Control Engineering *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112882480A (en) * 2021-03-23 2021-06-01 海南师范大学 System and method for fusing SLAM (simultaneous localization and mapping) by laser and vision aiming at crowd environment
CN112882480B (en) * 2021-03-23 2023-07-21 海南师范大学 Laser and vision fusion SLAM system and method for crowd environment
CN114211173A (en) * 2022-01-27 2022-03-22 上海电气集团股份有限公司 Method, device and system for determining welding position
CN114211173B (en) * 2022-01-27 2024-05-31 上海电气集团股份有限公司 Method, device and system for determining welding position

Similar Documents

Publication Publication Date Title
WO2020253316A1 (en) Navigation and following system for mobile robot, and navigation and following control method
CN110262495B (en) Control system and method for autonomous navigation and precise positioning of mobile robots
CN107167141B (en) Robot autonomous navigation system based on double laser radars
JP6868028B2 (en) Autonomous positioning navigation equipment, positioning navigation method and autonomous positioning navigation system
CN106527432B (en) Indoor mobile robot collaborative system based on fuzzy algorithm and two-dimensional code self-correction
CN111308490B (en) Balance car indoor positioning and navigation system based on single-line laser radar
CN106123890A (en) A kind of robot localization method of Fusion
CN108844543A (en) Indoor AGV navigation control method based on UWB positioning and dead reckoning
CN103207634A (en) Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
CN110673614A (en) Mapping system and mapping method of small robot group based on cloud server
CN102135766A (en) Autonomous operation forestry robot platform
CN110160543A (en) The robot of positioning and map structuring in real time
CN109491383A (en) Multirobot positions and builds drawing system and method
CN107450556A (en) ROS-based autonomous navigation intelligent wheelchair
CN115218891B (en) A mobile robot autonomous positioning and navigation method
CN112318507A (en) Robot intelligent control system based on SLAM technology
CN111766603A (en) Laser SLAM method, system, medium and equipment for mobile robot based on visual aided positioning of AprilTag code
CN214846390U (en) Dynamic environment obstacle avoidance system based on automatic guided vehicle
Mulky et al. Autonomous scooter navigation for people with mobility challenges
CN114527763A (en) Intelligent inspection system and method based on target detection and SLAM composition
CN108646759B (en) Intelligent detachable mobile robot system and control method based on stereo vision
CN115981314A (en) Robot navigation automatic obstacle avoidance method and system based on two-dimensional laser radar positioning
CN111376263A (en) Human-computer cooperation system of compound robot and cross coupling force control method thereof
CN116374041B (en) Land-air multi-mode four-foot bionic robot and control method
Al Arabi et al. 2D mapping and vertex finding method for path planning in autonomous obstacle avoidance robotic system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210205