CN116989796A - Sensor array mobile robot positioning system based on ROS - Google Patents


Publication number
CN116989796A
Authority
CN
China
Prior art keywords
robot
data
sensors
ros
mcu
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311124796.7A
Other languages
Chinese (zh)
Inventor
艾米尔
丁楠
李芬芬
耿君佐
孔荣双
李彪
费宏彦
王德烨
周泓
林剑楚
肖绍章
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huaiyin Institute of Technology
Original Assignee
Huaiyin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huaiyin Institute of Technology
Priority to CN202311124796.7A
Publication of CN116989796A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/88 Lidar systems specially adapted for specific applications

Abstract

A sensor array mobile robot positioning system based on ROS comprises an MCU arranged at the center of a mobile robot and a plurality of sensors connected with the MCU. The driver boards of the sensors are distributed in a ring around the MCU, and all sensing probes extend outward in every direction of the robot, forming a sensor array that acquires the surrounding environment data of the mobile robot. When the system starts, the robot detects whether obstacles exist around it through the ultrasonic sensors arranged on its periphery, detects whether concave ground lies on the forward route through the two cliff sensors at the front lower part, and acquires road information on the travel route through the laser sensor at the front. The detected information is transmitted to the MCU in real time for processing; the MCU communicates the processed information to the master controller through the serial port using ROS, drives the wheel motors using PWM, and returns the encoder values for PID control. The invention makes the navigation of the robot more accurate and reliable.

Description

Sensor array mobile robot positioning system based on ROS
Technical Field
The invention relates to the technical field of robot positioning, in particular to a sensor array mobile robot positioning system based on ROS.
Background
With the development of computer technology and continuously growing social demand, mobile robots are widely used in many fields. In practical applications, mobile robots often require accurate positioning in order to accomplish a given task, such as environmental surveying or article handling. Real-time positioning systems are therefore critical for mobile robot applications. As robots continue to permeate various industries, the external environments they face become more and more complex, placing new requirements on the accuracy, stability and intelligence of robots. However, conventional real-time positioning methods suffer from many problems, such as large error and poor robustness.
Acquiring target information from a plurality of sensors (such as a sensor array) captures the target from multiple sides, but simply adding more sensors continuously increases the complexity of the system, raises the computing-power requirement on the platform, severs the inherent relations among the sensor readings, and loses the deep effective information those readings could yield when organically combined, wasting information resources and possibly even causing decision errors. Under the influence of these factors, the development of multi-sensor fusion techniques has shown its necessity.
In the prior art, although many robot positioning technologies are available, single-modality features are usually extracted and matched directly to specific positions, creating a strong dependence on preset prior conditions among the features. For example, mapping-based positioning provides the basis for subsequent path planning and decision control, and the map is the foundation of the whole system; this strong dependence impairs the stability and precision of rapid deployment of positioning detection and restricts it to a single usage scenario.
Disclosure of Invention
Aiming at the above technical problems, this technical scheme provides a sensor array mobile robot positioning system based on ROS. It uses an MCU to collect and process the data of each sensor, can process the data of a plurality of sensors simultaneously, and integrates the data into one message, thereby reducing the complexity of the upper computer system, the computing-power requirement on the upper computer, and the CPU occupancy even as sensors are added. The above problems can thus be effectively solved.
The invention is realized by the following technical scheme:
a sensor array mobile robot positioning system based on ROS comprises an MCU arranged in the center of a mobile robot and a plurality of sensors connected with the MCU; the driving plates of the plurality of sensors are distributed on the periphery of the MCU in an annular mode, all the sensing probes are arranged in all directions of the robot in a extending mode, and a sensor array is formed to acquire the surrounding environment data of the mobile robot; the method comprises the steps of obtaining all-dimensional environment information through a sensor array, obtaining the current position of a robot according to the surrounding information, and judging whether the robot is in a safe working state or not; the MCU as a main processor receives readings of a plurality of sensors and transmits the readings to the ROS node, and the ROS node receives data from the plurality of sensors and integrates the data into one message which can be subscribed to by other ROS nodes so that the ROS node can use the sensor data for navigation and other tasks; the specific operation steps comprise:
step one: receiving ultrasonic sensor data, transforming the position of each ultrasonic sensor into the corresponding robot coordinate system, and acquiring the ultrasonic sensor data with the robot coordinate system as the reference;
step two: receiving single-point laser radar data, acquiring obstacle information in front of a robot through the single-point laser radar, and giving limitation of the maximum speed of the robot according to the distance between the obstacle and the robot so as to prevent collision caused by the excessively high speed of the robot; and deriving the width of the obstacle according to the following formula:
where w is the width of the obstacle; D_a and D_b are the angles of the leftmost and rightmost ends of the obstacle; and L_a and L_b are the distances to the obstacle at the positions corresponding to the angles D_a and D_b;
step three: receiving IMU data and converting them; the Euler angles, namely the pitch angle, yaw angle and roll angle, are obtained through Euler transformation from the acceleration, angular velocity and magnetic field intensity in the received 9-axis IMU sensor data; the IMU raw data take the form of a quaternion as follows:
Q = q0 + q1*i + q2*j + q3*k;
the IMU initial data in quaternion form are converted into Euler angles as follows and published through nodes:
psi = atan2(2*(q0*q1 + q2*q3), 1 - 2*(q1*q1 + q2*q2))
theta = asin(2*(q0*q2 - q3*q1))
phi = atan2(2*(q0*q3 + q1*q2), 1 - 2*(q2*q2 + q3*q3))
where psi is the yaw angle, theta is the pitch angle, phi is the roll angle, and q0, q1, q2, q3 are the four components of the IMU initial data; atan2 and asin are the two-argument arctangent function and the arcsine function, respectively;
step four: processing the received sensor data and then publishing them through a ROS topic; the received sensor data are filtered using unscented Kalman filtering; after the processed data are obtained, ROS nodes and topics are established through the MCU and the received sensor data are published; the upper computer then receives them, so that the computation is distributed through cooperative work among different hosts;
step five: receiving data of a motor encoder and performing PID control;
receiving the data of the motor encoder, comparing the data with the data of the command issued by the upper computer, ensuring that the actual rotating direction and speed of the motor are always consistent with the command of the upper computer, and carrying out PID correction in real time when errors occur so as to ensure that the robot is always within the control range of the upper computer;
the odometer of the robot can be obtained from the IMU and motor encoder data; the differential-drive robot always moves along a circular arc of radius R; the linear velocity V and angular velocity ω of the robot are given, the left and right wheel speeds are denoted V_L and V_R, the wheel track is denoted D with D = 2d, and the distance from the right wheel to the center of rotation is L; the left and right wheel speeds V_L and V_R are used to perform speed control:
V_L = ω*(L+D) = ω*(R+d) = V + ω*d
V_R = ω*L = ω*(R-d) = V - ω*d
the ideal PID control law of the continuous control system is as follows:
where K_p is the proportional gain, in reciprocal relation with the proportional band; T_I is the integral time constant; T_D is the derivative time constant; u(t) is the output signal of the PID controller; and e(t) is the difference between the given value r(t) and the measured value;
step six: the robot is positioned in real time through the received information on the upper computer, and after the data of each sensor are acquired, the real-time position of the robot is obtained through a motion model of the robot; acquiring the movement direction of the robot through IMU data, acquiring the speed of the robot through an encoder, and acquiring the environmental information around the robot through a sensor array; the required motion model is a combination of position estimation and direction estimation, both using a constant velocity/angular velocity model; the state vector comprises a position, a speed, an Euler angle and a Euler angle speed, wherein the Euler angle is a roll angle alpha, a pitch angle beta and a yaw angle gamma of the robot;
V_local = R_(α,β,γ) [α, β, γ]^T
wherein the variances represent the quality of the speed and direction constraints of the robot.
Further, in the fourth step, the received sensor data is filtered by unscented kalman filtering, which specifically includes:
firstly, sigma points are sampled from the acquired sensor data so as to accurately capture the posterior mean and covariance under any nonlinearity;
the sampling method of the Sigma points is as follows: at time k-1, the mean and covariance of the system probability distribution are x_(k-1) and P_(k-1); n is the dimension of the state space; γ = α²(κ+n) is a scale factor; α characterizes the distribution of the sigma points around x; κ is an adjustment parameter; the result of sampling the i-th sigma point at time k is:
the corresponding weights for Sigma points are:
where W_i^m is the weight for calculating the mean from the result of sampling the i-th sigma point at time k, W_i^c is the corresponding weight for calculating the covariance, and W_0^m and W_0^c are the weights for i = 0; n is the dimension of the state space at the previous sampling step; α, κ, γ, β are unscented transformation parameters: α and κ determine the spread of the sigma points around x, and β is used to incorporate prior knowledge of the distribution of x; these three values provide additional degrees of freedom to "fine-tune" the higher-order moments of the approximation; the commonly used value ranges are κ ≥ 0, α ∈ (0, 1], γ = α²(κ+n) - n and β = 2, which can be used to reduce the overall prediction bias;
substituting the new sigma point set into the observation equation g(x) of the unscented Kalman filter, the mean and covariance of the transformed distribution are then approximated by the mean and covariance of the weighted posterior sigma points:
the estimated mean and covariance of the unscented Kalman filter model can then be obtained as:
where n is the dimension of the state space used at the previous Sigma-point sampling step, W_i^m and W_i^c are the Sigma-point weights obtained in the previous step, and T denotes the matrix transpose;
substituting the estimated mean and covariance into the unscented Kalman filter yields the prediction of the data at time k-1; comparing this prediction with the actually obtained data filters out the noise generated by data fluctuation.
Further, the plurality of sensors comprise a nine-axis IMU sensor arranged on the MCU with the MCU as its center point, a laser sensor arranged at the middle position directly in front of the robot, two cliff sensors arranged at the front-left and front-right of the robot respectively, and six ultrasonic sensors arranged in pairs at the front, the two sides and the rear of the robot.
Further, when the system starts, the robot detects whether obstacles exist around it through the ultrasonic sensors arranged on its periphery, detects whether concave ground lies on the advancing route through the two cliff sensors at the front lower part, and acquires road information on the advancing route through the laser sensor at the front; the detected information is transmitted to the MCU in real time for processing, and the MCU communicates the processed information to the master controller through the serial port using ROS, drives the wheel motors using PWM, and returns the encoder values for PID control.
Further, the nine-axis IMU sensor comprises three sensors: accelerometers, gyroscopes, and magnetometers; the accelerometer measures acceleration of the robot on three axes, the gyroscope measures angular velocity of the robot on the three axes, and the magnetometer measures magnetic field intensity of the position of the robot; the data of acceleration, angular velocity and magnetic field strength help the robot determine its own position, direction and motion state.
Further, acquiring the data of the robot's current inertial measurement unit (IMU) means acquiring physical quantities of the robot, including measurements of motion, acceleration and rotational speed.
Advantageous effects
Compared with the prior art, the sensor array mobile robot positioning system based on the ROS has the following beneficial effects:
(1) The invention uses MCU to collect and process data of each sensor, which can process data of a plurality of sensors at the same time and integrate the data into one message. This makes the navigation of the robot more accurate and reliable. The complexity of the upper computer system, the requirement on the calculation force of the upper computer and the occupancy rate of the CPU under the condition of continuously increasing the sensors are reduced. In addition, the low power consumption and high performance of the MCU make the system more energy-saving and efficient.
(2) The invention adopts unscented Kalman filtering to filter the received sensor data and calibrate the data so as to ensure the accuracy and the reliability of the data. Errors that may exist due to errors in the sensor itself and the effects of environmental factors are reduced.
(3) The invention combines the 9-axis IMU with the sensor array to create real-time positioning for the navigation of the mobile robot, and can ensure that a judgment basis can be provided for the robot when the sensor is affected by electromagnetism and the like due to the stability of the IMU. In addition, the IMU data are used in the scheme, the estimation of the direction of the robot is increased, and the positioning is more accurate and reliable. The method solves the problem that the common motion model estimates errors generated by the position of the robot by assuming that the motion is uniform forward, uniform backward and uniform rotation.
(4) The invention is based on the ROS system, so that the ROS system can be conveniently and quickly embedded into any wheeled robot using ROS. The positioning precision and the working efficiency of the robot are improved, so that the application value of the robot is improved.
(5) The invention is based on an array formed by a plurality of sensors, can acquire data from multiple positions at the same time, and can improve the perception and decision-making capability of the robot, thereby realizing accurate positioning. This enables an AGV, robot or indoor automated transporter to move safely in a dynamic environment, using multiple sensors to monitor obstacles through 360 degrees at the robot's base level. The system has the advantages of automatically and accurately positioning the mobile robot and being simple and fast to operate, and can be connected as a lower computer to any two-wheel-driven robot.
Drawings
Fig. 1 is a schematic overall flow diagram of the present invention.
Fig. 2 is a schematic block diagram of the connection of the sensor in the present invention.
Fig. 3 is a schematic diagram of a positional relationship between a sensor array and an MCU in the present invention.
FIG. 4 is a three-dimensional view of the positional relationship between the sensor array and the MCU in the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. The described embodiments are only some, but not all, embodiments of the invention. Various modifications and improvements of the technical scheme of the invention, which are made by those skilled in the art, are included in the protection scope of the invention without departing from the design concept of the invention.
Example 1:
a sensor array mobile robot positioning system based on ROS comprises an MCU arranged in the center of a mobile robot and a plurality of sensors connected with the MCU; the sensors comprise nine-axis IMU sensors which are arranged on the MCU and take the nine-axis IMU sensors as center points, laser sensors which are arranged in the middle position right in front of the robot, two cliff sensors which are respectively arranged on the left side and the right side in front of the robot, and six ultrasonic sensors which are respectively arranged on the front two sides, the two sides and the rear two sides of the robot.
The driver boards of the sensors are distributed in a ring around the MCU, and all sensing probes extend outward in every direction of the robot, forming a sensor array that acquires the surrounding environment data of the mobile robot. Omnidirectional environment information is obtained through the sensor array, the current position of the robot is obtained from the surrounding information, and whether the robot is in a safe working state is judged.
When the system starts, the robot detects whether obstacles exist around it through the ultrasonic sensors arranged on its periphery, detects whether concave ground lies on the forward route through the two cliff sensors at the front lower part, and acquires road information on the travel route through the laser sensor at the front. The detected information is transmitted to the MCU in real time for processing; the MCU communicates the processed information to the master controller through the serial port using ROS, drives the wheel motors using PWM, and returns the encoder values for PID control.
The MCU as a host processor receives readings from the plurality of sensors and communicates them to the ROS node, which receives data from the plurality of sensors and integrates them into a message that can be subscribed to by other ROS nodes for the ROS node to use the sensor data for navigation and other tasks. The specific operation steps comprise:
step one: receiving ultrasonic sensor data;
the ultrasonic sensor converts an ultrasonic signal into an electrical signal whose vibration frequency is a mechanical wave higher than 20 kHz. After the data of each sensor in the ultrasonic sensor array is acquired, a robot coordinate system, a sensor coordinate system and a world coordinate system are arranged due to a Cartesian coordinate system of the mobile robot; wherein the world coordinate system is a coordinate system describing global information of the robot; the robot coordinate system is a coordinate system describing information of the robot itself; the sensor coordinate system is a coordinate system describing sensor information.
When the robot is positioned, the robot is required to be used as the center, the position of the ultrasonic sensor and the corresponding robot coordinate system are required to be converted, and the data of the ultrasonic sensor are acquired by taking the robot coordinate system as a reference.
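The frame conversion above amounts to a 2D rigid-body transform: each ultrasonic reading, taken along the sensor's boresight, is rotated by the sensor's mounting yaw and translated by its mounting offset to obtain a point in the robot coordinate system. A minimal sketch; the mounting pose values are illustrative assumptions, not values from the patent:

```python
import math

def sensor_to_robot_frame(distance, mount_x, mount_y, mount_yaw):
    """Project a range reading taken along a sensor's boresight into the
    robot coordinate system: rotate by the mount yaw, then translate by
    the mount offset (both expressed in the robot frame)."""
    px = mount_x + distance * math.cos(mount_yaw)
    py = mount_y + distance * math.sin(mount_yaw)
    return px, py

# Hypothetical mount: an ultrasonic sensor 0.10 m ahead of the robot
# center, angled 30 degrees to the left, reporting an obstacle at 0.50 m.
px, py = sensor_to_robot_frame(0.50, 0.10, 0.00, math.radians(30.0))
```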
Step two: receiving single-point laser radar data;
the obstacle information in front of the robot is acquired through the single-point laser radar, and the original data of the laser radar is the distance between the object facing the object returned from each angle and can be used after being processed.
According to the distance between the obstacle and the robot, the limitation of the maximum speed of the robot is given, and the collision of the robot caused by too high speed is prevented; and deriving the width of the obstacle according to the following formula:
where w is the width of the obstacle; D_a and D_b are the angles of the leftmost and rightmost ends of the obstacle; and L_a and L_b are the distances to the obstacle at the positions corresponding to the angles D_a and D_b.
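The width formula itself is not reproduced in this text, but the variables defined above (edge angles D_a, D_b and their corresponding distances L_a, L_b) admit a law-of-cosines reading: the two edge returns and the lidar origin form a triangle. The sketch below assumes that interpretation and is not necessarily the patent's exact formula:

```python
import math

def obstacle_width(l_a, l_b, d_a, d_b):
    """Distance between the two obstacle edge returns, observed at
    distances l_a and l_b under angles d_a and d_b (radians), computed
    via the law of cosines."""
    return math.sqrt(l_a**2 + l_b**2 - 2.0 * l_a * l_b * math.cos(d_a - d_b))

# Edges seen 1 m away at angles 0 and 90 degrees: width is sqrt(2) m.
w = obstacle_width(1.0, 1.0, 0.0, math.pi / 2)
```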
Step three: receiving IMU data and converting;
the IMU is an inertial measurement unit and is used for feeding back the acceleration and the angular speed of the robot in the three-dimensional space so as to obtain the pose information of the robot. The nine-axis IMU includes three sensors: accelerometers, gyroscopes, and magnetometers. Wherein. The accelerometer measures acceleration of the robot in three axes, the gyroscope measures angular velocity of the robot in three axes, and the magnetometer measures magnetic field intensity at the position of the robot. The data from these sensors may help the robot determine its position, orientation and motion status.
The 9-axis IMU can measure the rotation, acceleration and magnetic field of the robot and can provide more accurate positioning information of the robot. This information is integrated into the ROS node so that the robot can be positioned in real time while moving.
Because the accelerometer, gyroscope and magnetometer in the IMU sensor's initial data measure different physical quantities, their data need to be converted to obtain useful information; therefore, the Euler angles are obtained by Euler transformation of the acceleration, angular velocity and magnetic field intensity values in the received 9-axis IMU sensor data, and are divided into pitch angle, yaw angle and roll angle; the IMU raw data take the form of a quaternion as follows:
Q = q0 + q1*i + q2*j + q3*k;
the IMU initial data in quaternion form are converted into Euler angles as follows and published through nodes:
psi = atan2(2*(q0*q1 + q2*q3), 1 - 2*(q1*q1 + q2*q2))
theta = asin(2*(q0*q2 - q3*q1))
phi = atan2(2*(q0*q3 + q1*q2), 1 - 2*(q2*q2 + q3*q3))
where psi is the yaw angle, theta is the pitch angle, phi is the roll angle, and q0, q1, q2, q3 are the four components of the IMU initial data; atan2 and asin are the two-argument arctangent function and the arcsine function, respectively.
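With the parenthesization of the three expressions corrected, the quaternion-to-Euler conversion can be written directly in code; the angle naming follows the text above:

```python
import math

def quat_to_euler(q0, q1, q2, q3):
    """Convert a unit quaternion (q0 = scalar part) to the three Euler
    angles as defined in step three."""
    psi = math.atan2(2.0 * (q0*q1 + q2*q3), 1.0 - 2.0 * (q1*q1 + q2*q2))
    theta = math.asin(2.0 * (q0*q2 - q3*q1))
    phi = math.atan2(2.0 * (q0*q3 + q1*q2), 1.0 - 2.0 * (q2*q2 + q3*q3))
    return psi, theta, phi

# Identity quaternion: all three angles are zero.
angles = quat_to_euler(1.0, 0.0, 0.0, 0.0)
```

For a rotation of t radians about the first axis, the quaternion is (cos(t/2), sin(t/2), 0, 0), and the first returned angle recovers t.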
Step four: processing the received sensor data and then issuing the sensor data through the ROS topic;
when the data is converted, the unscented Kalman filtering is used for filtering the received sensor data, so that errors of the sensor and errors caused by environmental factors are eliminated, and the data is calibrated to ensure the accuracy and reliability of the sensor data. The specific operation mode is as follows:
firstly, sigma points are sampled from the acquired sensor data so as to accurately capture the posterior mean and covariance under any nonlinearity;
the sampling method of the Sigma points is as follows: at time k-1, the mean and covariance of the system probability distribution are x_(k-1) and P_(k-1); n is the dimension of the state space; γ = α²(κ+n) is a scale factor; α characterizes the distribution of the sigma points around x; κ is an adjustment parameter; the result of sampling the i-th sigma point at time k is:
the corresponding weights for Sigma points are:
where W_i^m is the weight for calculating the mean from the result of sampling the i-th sigma point at time k, W_i^c is the corresponding weight for calculating the covariance, and W_0^m and W_0^c are the weights for i = 0; n is the dimension of the state space at the previous sampling step; α, κ, γ, β are unscented transformation parameters: α and κ determine the spread of the sigma points around x, and β is used to incorporate prior knowledge of the distribution of x; these three values provide additional degrees of freedom to "fine-tune" the higher-order moments of the approximation; the commonly used value ranges are κ ≥ 0, α ∈ (0, 1], γ = α²(κ+n) - n and β = 2, which can be used to reduce the overall prediction bias;
substituting the new sigma point set into the observation equation g(x) of the unscented Kalman filter, the mean and covariance of the transformed distribution are then approximated by the mean and covariance of the weighted posterior sigma points:
the estimated mean and covariance of the unscented Kalman filter model can then be obtained as:
where n is the dimension of the state space used at the previous Sigma-point sampling step, W_i^m and W_i^c are the Sigma-point weights obtained in the previous step, and T denotes the matrix transpose;
substituting the estimated mean and covariance into the unscented Kalman filter yields the prediction of the data at time k-1; comparing this prediction with the actually obtained data filters out the noise generated by data fluctuation.
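The sigma-point sampling and the unscented transform described above can be sketched as follows. The weight formulas are the standard unscented-transform ones, which the original (partly garbled) equations appear to correspond to, so treat the exact parameterization as an assumption:

```python
import numpy as np

def sigma_points(x, P, alpha=0.1, kappa=0.0, beta=2.0):
    """Sample 2n+1 sigma points around mean x with covariance P, and
    return them with their mean (wm) and covariance (wc) weights."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n           # scaling parameter
    S = np.linalg.cholesky((n + lam) * P)      # matrix square root
    pts = np.vstack([x] + [x + S[:, i] for i in range(n)]
                        + [x - S[:, i] for i in range(n)])
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + 1.0 - alpha**2 + beta      # beta folds in prior knowledge
    return pts, wm, wc

def unscented_transform(pts, wm, wc, g=lambda x: x):
    """Push sigma points through g(x); recover weighted mean and covariance."""
    Y = np.array([g(p) for p in pts])
    mean = wm @ Y
    diff = Y - mean
    cov = (wc[:, None] * diff).T @ diff
    return mean, cov

# With the identity observation, the transform reproduces x and P exactly.
x = np.array([1.0, 2.0])
P = np.eye(2)
pts, wm, wc = sigma_points(x, P)
mean, cov = unscented_transform(pts, wm, wc)
```

In a full filter, `g` would be the observation equation g(x) and the recovered mean/covariance would feed the k-1 prediction-versus-measurement comparison described above.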
After the processed data are obtained, ROS nodes and topics are established through the MCU, and the received sensor data are published; the upper computer then receives them, so that the computation is distributed through cooperative work among different hosts.
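The publication step amounts to packing all filtered readings into one message. A minimal sketch of that aggregation, using illustrative field names rather than the patent's actual message definition, and plain JSON in place of a ROS message type:

```python
import json
import time

def build_sensor_message(ultrasonic_cm, cliff_flags, lidar_cm, quat):
    """Integrate filtered readings from the whole array into one message
    that a single ROS topic could carry to the upper computer."""
    return {
        "stamp": time.time(),                # acquisition time
        "ranges": list(ultrasonic_cm),       # six ultrasonic distances, cm
        "cliff": list(cliff_flags),          # two cliff-sensor booleans
        "lidar": float(lidar_cm),            # single-point lidar distance, cm
        "imu": dict(zip("wxyz", quat)),      # orientation quaternion
    }

msg = build_sensor_message([52, 60, 75, 80, 61, 55], [False, False],
                           134.0, (1.0, 0.0, 0.0, 0.0))
payload = json.dumps(msg)  # the single message other nodes would subscribe to
```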
Step five: receiving data of a motor encoder and performing PID control;
and receiving the data of the motor encoder, comparing the data with the data of the command issued by the upper computer, ensuring that the actual rotating direction and speed of the motor are always consistent with the command of the upper computer, and carrying out PID correction in real time when errors occur so as to ensure that the robot is always within the control range of the upper computer.
The odometer of the robot can be obtained through the data of the IMU and the motor encoder; the differential model robot always does circular arc motion with R as radius; the linear velocity V and the angular velocity ω of the robot, the left and right wheel speeds are denoted by VL and VR, the wheel distance is denoted by D, d=2d, and the distance from the right wheel to the center of rotation is L. Left and right wheel speed V L and VR To perform speed control:
V_L = ω*(L+D) = ω*(R+d) = V + ω*d
V_R = ω*L = ω*(R-d) = V - ω*d
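The two equations above reduce to V_L = V + ωd and V_R = V - ωd, and invert trivially to recover the body twist from encoder readings for odometry; a sketch:

```python
def wheel_speeds(v, omega, d):
    """Left/right wheel speeds of a differential-drive robot with half
    wheel-track d (track D = 2d): V_L = V + omega*d, V_R = V - omega*d."""
    return v + omega * d, v - omega * d

def body_twist(v_l, v_r, d):
    """Inverse mapping used for odometry: recover (V, omega) from the
    encoder-measured wheel speeds."""
    return (v_l + v_r) / 2.0, (v_l - v_r) / (2.0 * d)

v_l, v_r = wheel_speeds(0.5, 1.0, 0.1)   # 0.5 m/s forward, 1 rad/s turn
v, omega = body_twist(v_l, v_r, 0.1)
```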
the ideal PID control law of the continuous control system is as follows:
where K_p is the proportional gain, in reciprocal relation with the proportional band; T_I is the integral time constant; T_D is the derivative time constant; u(t) is the output signal of the PID controller; and e(t) is the difference between the given value r(t) and the measured value.
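A discrete form of the ideal PID law, u(t) = Kp*(e(t) + (1/T_I) ∫ e dτ + T_D * de/dt), can be sketched as below; the gains and sample time are illustrative, not values from the patent:

```python
class PID:
    """Discrete ideal PID: u = Kp * (e + integral/Ti + Td * de/dt)."""
    def __init__(self, kp, ti, td, dt):
        self.kp, self.ti, self.td, self.dt = kp, ti, td, dt
        self.integral = 0.0
        self.prev_e = 0.0

    def step(self, setpoint, measured):
        e = setpoint - measured
        self.integral += e * self.dt            # rectangular integration
        deriv = (e - self.prev_e) / self.dt     # backward difference
        self.prev_e = e
        return self.kp * (e + self.integral / self.ti + self.td * deriv)

# Illustrative wheel-speed correction: setpoint 1.0, measured 0.0.
pid = PID(kp=2.0, ti=1.0, td=0.0, dt=0.1)
u = pid.step(1.0, 0.0)
```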
Step six: the robot is positioned in real time through the received information on the upper computer, and after the data of each sensor are acquired, the real-time position of the robot is obtained through a motion model of the robot; acquiring the movement direction of the robot through IMU data, acquiring the speed of the robot through an encoder, and acquiring the environmental information around the robot through a sensor array; the required motion model is a combination of position estimation and direction estimation, both using a constant velocity/angular velocity model; the state vector contains the position, the speed, the euler angle and the speed of the euler angle, the euler angle being the roll angle α, the pitch angle β and the yaw angle γ of the robot.
V_local = R_(α,β,γ)·[α, β, γ]^T

where the variances represent the quality of the speed and direction constraints of the robot.
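The constant velocity/angular-velocity prediction described in step six can be illustrated, in a planar (yaw-only) simplification of the full roll/pitch/yaw state, as follows (function name illustrative):

```python
import math

def predict(x, y, yaw, v, omega, dt):
    """One prediction step of a planar constant velocity / constant
    angular-velocity motion model: integrate pose over a time step dt."""
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += omega * dt
    return x, y, yaw
```

Repeated calls with the latest (v, ω) from the encoder and IMU yield a dead-reckoned trajectory.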

Claims (6)

1. A ROS-based sensor array mobile robot positioning system, characterized by: the system comprises an MCU arranged at the center of the mobile robot and a plurality of sensors connected with the MCU; the driver boards of the plurality of sensors are distributed in a ring around the MCU, and all the sensing probes extend outward in every direction of the robot, forming a sensor array that acquires the data of the environment surrounding the mobile robot; omnidirectional environment information is obtained through the sensor array, the current position of the robot is obtained from the surrounding information, and whether the robot is in a safe working state is judged; the MCU, as the main processor, receives the readings of the plurality of sensors and transmits them to a ROS node; the ROS node receives the data from the plurality of sensors and integrates them into one message that other ROS nodes can subscribe to, so that those nodes can use the sensor data for navigation and other tasks; the specific operation steps comprise:
step one: receiving ultrasonic sensor data, converting between the position of each ultrasonic sensor and the corresponding robot coordinate system, and acquiring the ultrasonic sensor data with the robot coordinate system as the reference;
step two: receiving single-point lidar data, acquiring information about obstacles in front of the robot through the single-point lidar, and limiting the maximum speed of the robot according to the distance between the obstacle and the robot so as to prevent collisions caused by excessive speed; the width of the obstacle is derived according to the following formula:
w = √(D_a² + D_b² − 2·D_a·D_b·cos(L_a − L_b))

where w in the above formula is the width of the obstacle, D_a and D_b are the distances to the leftmost and rightmost ends of the obstacle, and L_a and L_b are the angles corresponding to the positions of D_a and D_b;
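Assuming the law-of-cosines form suggested by the variable names (D_a, D_b as ranges to the obstacle's two ends and L_a, L_b the corresponding angles), the width computation can be sketched as:

```python
import math

def obstacle_width(d_a, d_b, l_a, l_b):
    """Obstacle width via the law of cosines, assuming d_a and d_b are the
    measured ranges to the obstacle's two ends and l_a, l_b the
    corresponding angles in radians (an illustrative reconstruction)."""
    return math.sqrt(d_a**2 + d_b**2 - 2.0 * d_a * d_b * math.cos(l_a - l_b))
```

With equal ranges of 1 m at ±30°, the subtended angle is 60° and the width is exactly 1 m.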
step three: receiving and converting the IMU data; the Euler angles, namely the pitch angle, yaw angle and roll angle, are obtained through Euler transformation from the acceleration, angular velocity and magnetic field strength in the received 9-axis IMU sensor data; the raw IMU data take the quaternion form:
Q = q_0 + q_1·i + q_2·j + q_3·k;
the raw IMU data in quaternion form are converted into Euler angles as follows and published through the node:
psi = atan2(2·(q_0·q_1 + q_2·q_3), 1 − 2·(q_1² + q_2²))
theta = asin(2·(q_0·q_2 − q_3·q_1))
phi = atan2(2·(q_0·q_3 + q_1·q_2), 1 − 2·(q_2² + q_3²))
where psi is the yaw angle, theta the pitch angle, phi the roll angle, and q_0, q_1, q_2, q_3 the four components of the raw IMU data; atan2 and asin are the two-argument arctangent function and the arcsine function respectively;
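The quaternion-to-Euler conversion above can be sketched directly (assuming the (q_0, q_1, q_2, q_3) component ordering as given; angles returned in radians):

```python
import math

def quat_to_euler(q0, q1, q2, q3):
    """Convert a unit quaternion to Euler angles with the formulas above;
    returns (psi, theta, phi) in radians."""
    psi = math.atan2(2 * (q0 * q1 + q2 * q3), 1 - 2 * (q1**2 + q2**2))
    theta = math.asin(2 * (q0 * q2 - q3 * q1))
    phi = math.atan2(2 * (q0 * q3 + q1 * q2), 1 - 2 * (q2**2 + q3**2))
    return psi, theta, phi
```

The identity quaternion (1, 0, 0, 0) maps to all-zero angles; a 90° rotation about the third axis yields phi = π/2.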
step four: processing the received sensor data and then publishing them through ROS topics; the received sensor data are filtered using unscented Kalman filtering, and after the processed data are obtained, ROS nodes and topics are established through the MCU and the processed sensor data are published; the data are then received by the upper computer, so that the computational load is distributed through cooperative work among different hosts;
step five: receiving data of a motor encoder and performing PID control;
receiving the data of the motor encoder and comparing them with the command issued by the upper computer, ensuring that the actual rotation direction and speed of the motor always match the upper computer's command; when an error occurs, PID correction is applied in real time so that the robot always remains within the control range of the upper computer;
the odometry of the robot can be obtained from the data of the IMU and the motor encoder; the differential-drive robot always moves along a circular arc of radius R; let V and ω be the linear and angular velocity of the robot, V_L and V_R the left and right wheel speeds, D the wheel separation with D = 2d, and L the distance from the right wheel to the center of rotation; speed control is performed through the left and right wheel speeds V_L and V_R:

V_L = ω·(L + D) = ω·(R + d) = V + ω·d
V_R = ω·L = ω·(R − d) = V − ω·d
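A dead-reckoning odometry sketch that inverts the wheel-speed relations V_L = V + ωd and V_R = V − ωd might look like this (function and parameter names are illustrative only, not part of the claim):

```python
import math

def odom_step(x, y, yaw, v_l, v_r, d, dt):
    """One wheel-odometry update; d is half the wheel separation (D = 2*d).
    Recovers (V, omega) from the wheel speeds, then integrates the pose."""
    v = (v_l + v_r) / 2.0           # V = (V_L + V_R) / 2
    omega = (v_l - v_r) / (2.0 * d)  # omega = (V_L - V_R) / (2*d)
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += omega * dt
    return x, y, yaw
```

Equal wheel speeds produce straight-line motion; unequal speeds produce the circular-arc motion of radius R described above.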
the ideal PID control law of the continuous control system is as follows:

u(t) = K_p·[ e(t) + (1/T_I)·∫ e(t) dt + T_D·de(t)/dt ]

where K_p is the proportional gain, inversely proportional to the proportional band; T_I is the integral time constant; T_D is the derivative time constant; u(t) is the output signal of the PID controller; and e(t) is the difference between the given value r(t) and the measured value;
step six: the robot is positioned in real time on the upper computer from the received information; after the data of each sensor are acquired, the real-time position of the robot is obtained through its motion model; the direction of motion of the robot is obtained from the IMU data, its speed from the encoder, and the environmental information around the robot from the sensor array; the required motion model is a combination of position estimation and direction estimation, both using a constant velocity/angular-velocity model; the state vector contains the position, the velocity, the Euler angles and their rates, the Euler angles being the roll angle α, the pitch angle β and the yaw angle γ of the robot;
V_local = R_(α,β,γ)·[α, β, γ]^T

where the variances represent the quality of the speed and direction constraints of the robot.
2. The ROS-based sensor array mobile robot positioning system of claim 1, wherein: in step four, the received sensor data are filtered using unscented Kalman filtering; the specific operation is as follows:
first, sigma points are sampled from the acquired sensor data so as to accurately capture the posterior mean and covariance under an arbitrary nonlinearity;
the sigma points are sampled as follows: at time k−1, assume the mean and covariance of the system's probability distribution are x_{k−1} and P_{k−1}, n is the dimension of the state space, γ = α²·(κ + n) − n is a scale factor, α determines the spread of the sigma points around x, and κ is a tuning parameter; the i-th sigma point χ_i sampled at time k is:

χ_0 = x_{k−1}
χ_i = x_{k−1} + (√((n + γ)·P_{k−1}))_i, i = 1, …, n
χ_i = x_{k−1} − (√((n + γ)·P_{k−1}))_{i−n}, i = n + 1, …, 2n
the corresponding weights of the sigma points are:

W_0^m = γ/(n + γ)
W_0^c = γ/(n + γ) + (1 − α² + β)
W_i^m = W_i^c = 1/(2·(n + γ)), i = 1, …, 2n
where W_i^m is the weight used to compute the mean from the i-th sigma point sampled at time k, and W_i^c is the weight used to compute the covariance; W_0^m and W_0^c are the weights for i = 0; n is the dimension of the state space at the previous sampling step; α, κ and β are the unscented-transformation parameters: α and κ determine the spread of the sigma points around x, and β is used to incorporate prior knowledge of the distribution of x; these three values provide additional degrees of freedom to fine-tune the higher-order moments of the approximation; the commonly used ranges are κ ≥ 0, α ∈ (0, 1], γ = α²·(κ + n) − n and β = 2, which can be used to reduce the overall prediction error;
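The sigma-point weights described above can be computed as follows (a sketch; the default parameter values follow the commonly used ranges stated in the text and are not mandated by the claim):

```python
def sigma_weights(n, alpha=1e-3, kappa=0.0, beta=2.0):
    """Mean (wm) and covariance (wc) weights for the 2n+1 sigma points of an
    n-dimensional state, using the scale factor gamma = alpha^2*(kappa+n) - n."""
    gamma = alpha**2 * (n + kappa) - n
    wm = [gamma / (n + gamma)] + [1.0 / (2 * (n + gamma))] * (2 * n)
    wc = list(wm)
    wc[0] += 1.0 - alpha**2 + beta  # covariance weight gets the (1 - a^2 + b) term
    return wm, wc
```

The mean weights always sum to one, which keeps the weighted sigma-point mean unbiased.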
the new sigma point set is substituted into the observation equation g(x) of the unscented Kalman filter, and the mean and covariance of y = g(x) are approximated by the weighted posterior sigma points; the estimated mean ŷ and covariance P̂_y of the unscented Kalman filter model are then:

ŷ = Σ_{i=0}^{2n} W_i^m·g(χ_i)
P̂_y = Σ_{i=0}^{2n} W_i^c·(g(χ_i) − ŷ)·(g(χ_i) − ŷ)^T
where n is the dimension of the state space used at the previous sigma-point sampling, W_i^m and W_i^c are the sigma-point weights obtained in the previous step, and the superscript T denotes the matrix transpose;
the estimated mean and covariance are substituted into the unscented Kalman filter to obtain the prediction of the data from time k−1, and the actually obtained data are compared against it so as to filter out the noisy data generated by data fluctuations.
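A one-dimensional unscented transform illustrating the weighted sigma-point mean and covariance described above (a sketch, not the claimed filter; n = 1 for brevity, and parameter defaults are illustrative):

```python
import math

def unscented_transform(mean, var, g, alpha=1.0, kappa=2.0, beta=2.0):
    """Propagate a 1-D Gaussian (mean, var) through a nonlinearity g using
    2n+1 = 3 sigma points and the standard unscented weights."""
    n = 1
    gamma = alpha**2 * (n + kappa) - n
    spread = math.sqrt((n + gamma) * var)
    points = [mean, mean + spread, mean - spread]
    wm = [gamma / (n + gamma), 1 / (2 * (n + gamma)), 1 / (2 * (n + gamma))]
    wc = [wm[0] + 1 - alpha**2 + beta, wm[1], wm[2]]
    ys = [g(p) for p in points]
    y_mean = sum(w * y for w, y in zip(wm, ys))
    y_var = sum(w * (y - y_mean)**2 for w, y in zip(wc, ys))
    return y_mean, y_var
```

For an affine g the transform is exact: g(x) = 2x + 1 applied to N(0, 1) yields mean 1 and variance 4.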
3. The ROS-based sensor array mobile robot positioning system of claim 1, wherein: the sensors comprise a nine-axis IMU sensor mounted on the MCU as the center point, a laser sensor arranged in the middle position directly in front of the robot, two cliff sensors arranged on the front-left and front-right of the robot respectively, and six ultrasonic sensors arranged in pairs at the front, the sides and the rear of the robot.
4. The ROS-based sensor array mobile robot positioning system of claim 3, wherein: when the system is started, the robot detects whether obstacles exist around it through the ultrasonic sensors arranged on its periphery, detects whether the road ahead contains depressions through the two cliff sensors at the front, and acquires road information along the forward route through the laser sensor at the front; the detected information is transmitted to the MCU in real time for processing, and the MCU communicates the processed information with the main controller through the serial port using ROS, drives the wheel motors with PWM, and feeds the encoder values back for PID control.
5. The ROS-based sensor array mobile robot positioning system of claim 3, wherein: the nine-axis IMU sensor comprises three sensors, an accelerometer, a gyroscope and a magnetometer; the accelerometer measures the acceleration of the robot on three axes, the gyroscope measures the angular velocity of the robot on three axes, and the magnetometer measures the magnetic field strength at the robot's position; the acceleration, angular velocity and magnetic field data help the robot determine its own position, orientation and motion state.
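As an illustration of how accelerometer data constrain orientation, roll and pitch can be estimated from a static accelerometer reading (a sketch under the assumption that only gravity is sensed while the robot is at rest; not part of the claim):

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Roll and pitch (radians) from a static accelerometer reading,
    assuming the only sensed acceleration is gravity."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay**2 + az**2))
    return roll, pitch
```

A level robot with gravity entirely on the z-axis gives zero roll and zero pitch; the magnetometer would then supply the remaining yaw angle.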
6. The ROS-based sensor array mobile robot positioning system of claim 5, wherein: the data of the robot's current inertial measurement unit (IMU) are acquired, the data being physical quantities of the robot that include measurements of motion, acceleration and rotation speed.
CN202311124796.7A 2023-09-02 2023-09-02 Sensor array mobile robot positioning system based on ROS Pending CN116989796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311124796.7A CN116989796A (en) 2023-09-02 2023-09-02 Sensor array mobile robot positioning system based on ROS


Publications (1)

Publication Number Publication Date
CN116989796A true CN116989796A (en) 2023-11-03

Family

ID=88532181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311124796.7A Pending CN116989796A (en) 2023-09-02 2023-09-02 Sensor array mobile robot positioning system based on ROS

Country Status (1)

Country Link
CN (1) CN116989796A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination