CN108955679B - High-precision positioning method for intelligent inspection robot of transformer substation - Google Patents

High-precision positioning method for intelligent inspection robot of transformer substation

Info

Publication number
CN108955679B
CN108955679B (application CN201810934182.8A)
Authority
CN
China
Prior art keywords
robot
pose
state
algorithm
positioning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810934182.8A
Other languages
Chinese (zh)
Other versions
CN108955679A (en)
Inventor
左琳
蒋正钢
张昌华
刘宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201810934182.8A
Publication of CN108955679A
Application granted
Publication of CN108955679B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a high-precision positioning method for an intelligent inspection robot of a transformer substation, belonging to the technical field of robot positioning. The method combines an extended Kalman filter (EKF) algorithm, an adaptive Monte Carlo localization (AMCL) algorithm and a scan-matching algorithm, fuses data from an odometer, an inertial measurement unit (IMU) and a two-dimensional laser, and corrects the pose of the robot three times. By applying different algorithms to sensor data of different types at different stages, the method overcomes the limitations of a single sensor and a single algorithm. Experiments show that the positioning method improves the accuracy, reliability and real-time performance of robot pose acquisition and achieves high-precision positioning of a trackless mobile robot in a substation environment.

Description

High-precision positioning method for intelligent inspection robot of transformer substation
Technical Field
The invention belongs to the technical field of robot positioning, and particularly relates to a high-precision positioning method for an intelligent inspection robot of a transformer substation.
Background
Electric energy plays a vital role in the national economy and in people's daily lives; when the power grid fails, the resulting losses are incalculable and normal production and life are seriously affected. Ensuring the safe and reliable operation of the grid concerns the vital interests of the country and its people, and is also a key problem that China's grid urgently needs to solve. The power transmission and transformation system is a key component of the grid, and the safe and reliable operation of substation equipment is essential to guaranteeing the operating quality of the whole transmission and transformation system. With the development of science and technology, the performance requirements on equipment throughout the grid have risen to a new level, and the performance of substation equipment directly affects the stability, reliability, safety and accident resistance of the grid. Ensuring reliable and safe operation of the equipment is therefore an important task: monitoring and management of substation equipment must be taken seriously, and thorough inspection, patrol and real-time monitoring are needed to eliminate potential hazards in time and guarantee stable operation of the substation. Manual inspection of substation faults relies largely on human senses, is inefficient, and exposes personnel to life-threatening accidents; using an intelligent inspection robot to patrol the substation is therefore the mainstream direction of development. Advances in autonomous navigation have promoted the development of substation intelligent inspection robots. Such robots assist operators in inspecting equipment, can promptly detect thermal defects, suspended foreign objects and other abnormalities in power equipment, avoid hidden operational hazards, genuinely improve work efficiency and inspection quality, reduce staffing while increasing efficiency, and effectively advance unattended operation of substations.
At present, most substation inspection robots are rail-mounted, which requires secondary construction in the substation and increases operating costs. As laser technology matures and the price of laser equipment falls, more and more researchers have begun to study trackless robots. In the special environment of a substation, the robot must have high-precision positioning capability to complete its inspection tasks, so finding a new robot positioning method is important. Prior-art research on robot positioning mainly includes the following methods:
(1) The document "A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation" proposes a multi-sensor-fusion EKF (MSF-EKF) that fuses sensor data based on the IEKF equations, and the document "Robust Vision-Aided Navigation Using Sliding-Window Factor Graphs" combines a long-term smoother (iSAM2) with a short-term sliding-window smoother to fuse pose sensors, thereby obtaining a multi-sensor-fused robot pose.
(2) The document "Enhanced Monte Carlo Localization Incorporating a Mechanism for Preventing Premature Convergence" combines particle filtering with probabilistic models of the motion and perception of a mobile robot and proposes the idea of Monte Carlo localization (MCL) for mobile robots.
(3) In the map-based method adopted in the document "Map-Based Indoor Navigation and Localization Using a Laser Range Finder", the mobile robot achieves global positioning by matching a set of vertices extracted from the map with a set of vertices extracted from the laser data.
(4) The document "The Position Acquisition of Mobile Robot Localization Based on Particle Filters Combined with Scan Matching" discloses a hybrid particle-filter and scan-matching method that achieves high-precision navigation and positioning of a mobile robot.
The above methods all solve the robot positioning problem to a certain extent, but they still have many limitations, mainly reflected in the following:
(1) Positioning accuracy is not high. The prior-art methods adopt only one or two algorithms to solve the robot positioning problem and do not apply a dedicated algorithm to each sensor. The EKF algorithm uses a first-order approximation in place of the original state and measurement equations, so the estimated mean has only second-order accuracy, which is far from sufficient for a high-precision system; the EKF also assumes the process noise to be zero-mean Gaussian, which is not the case in a real system. In addition, wheel slip of the odometer and temperature drift of the IMU sensor further aggravate the EKF error. Owing to the limited number of particles, particle degeneracy and the limited resolution of the grid map, the Monte Carlo localization algorithm estimates the pose inaccurately and may even produce an erroneous pose. The scan-matching algorithm accumulates error, and if the initial solution of the ICP iteration is chosen poorly, both the computational burden and the error increase.
(2) Computational efficiency and positioning accuracy cannot both be achieved at the same time. When each of the above methods is used alone, the robot can run in real time but the positioning accuracy is low. To improve the accuracy of EKF data fusion, the order of the linear approximation can be increased, but this increases the computation time. To improve the estimation accuracy of the Monte Carlo localization algorithm, the most direct way is to increase the number of particles; although more particles improve the pose estimate, the computational cost rises and the smoothness of the robot's motion suffers. To improve the matching accuracy of the iterative solution in the scan-matching algorithm, the most direct way is to increase the number of iterations, but the computation time then multiplies. All of the above methods therefore fail to balance computational efficiency against positioning accuracy.
Disclosure of Invention
The invention aims to solve the problems that a positioning algorithm in the prior art has a large error when solving the pose of a robot and is difficult to ensure the completion of a substation inspection task, and provides a high-precision positioning method for a substation intelligent inspection robot.
The technical problem proposed by the invention is solved as follows:
a high-precision positioning method for an intelligent inspection robot of a transformer substation comprises the following steps:
Step 1. According to the odometer kinematic model

x̌_k = f(x̂_{k-1}, v_k)

solve the pose information (x, y, θ) of the robot relative to the starting point, where k is the current time, x̌_k is the estimated state of the odometer motion, x̂_{k-1} is the accurate state of the odometer motion at the previous moment, v_k is the input control quantity at time k, f is the function of the kinematic model, x is the distance along the X axis relative to the starting point, y is the distance along the Y axis, and θ is the relative rotation angle. The inertial measurement unit (IMU) outputs an angle and an angular velocity, from which the pose state (x₁, y₁, θ₁) of the robot in the horizontal plane is calculated.
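The kinematic model f itself appears only as an image in the original document, so for illustration the sketch below assumes the common differential-drive (unicycle) propagation over a sampling interval dt; the names odometry_update and imu_heading_pose, and the inputs v, w and yaw_imu, are placeholders rather than the patent's notation.

```python
import numpy as np

def odometry_update(pose_prev, v, w, dt):
    """Dead reckoning: propagate (x, y, theta) from odometer velocities.

    pose_prev -- accurate pose at time k-1
    v, w      -- linear and angular velocity (input control quantity)
    dt        -- sampling interval
    """
    x, y, theta = pose_prev
    x += v * dt * np.cos(theta)
    y += v * dt * np.sin(theta)
    theta += w * dt
    theta = np.arctan2(np.sin(theta), np.cos(theta))  # wrap heading to (-pi, pi]
    return np.array([x, y, theta])

def imu_heading_pose(pose_prev, v, yaw_imu, dt):
    """Same propagation, but with the heading taken from the IMU yaw output,
    giving the horizontal-plane pose (x1, y1, theta1)."""
    x, y, _ = pose_prev
    return np.array([x + v * dt * np.cos(yaw_imu),
                     y + v * dt * np.sin(yaw_imu),
                     yaw_imu])
```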
Step 2. Assume that the pose state x_k of the robot at time k obeys the Gaussian distribution

p(x_k | x̌_0, y_{0:k}, v_{1:k}) = N(x̂_k, P̂_k),

and that the process and measurement noise obey zero-mean Gaussian distributions, where x̌_0 is the initial state of the robot's motion pose, y_{0:k} is the observed quantity, v_{1:k} is the input control quantity at times 1 to k, x̂_k is the mean, P̂_k is the covariance, p denotes probability and N denotes a Gaussian distribution.

According to the prediction equations of the EKF (extended Kalman filter) algorithm, obtain the predicted value x̌_k of the robot's pose state at the current moment, the predicted noise covariance P̌_k and the Kalman gain K_k; then, using the observed quantity and the update equations of the EKF algorithm, obtain the accurate value x̂_k of the robot's pose state at the current moment and the accurate value P̂_k of the noise covariance.
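The prediction and update equations referenced above are not reproduced in this text, so the following is a minimal generic EKF sketch under the usual assumptions (additive zero-mean Gaussian noise with covariances Q and R); the callables f, F_jac, h and H_jac stand in for the motion model, the measurement model and their Jacobians and are illustrative placeholders.

```python
import numpy as np

def ekf_predict(x_prev, P_prev, v, f, F_jac, Q):
    """Prediction step: propagate the accurate state x_prev with the motion
    model f and its Jacobian F to get the predicted state and covariance."""
    x_pred = f(x_prev, v)                   # predicted pose state
    F = F_jac(x_prev, v)                    # df/dx evaluated at x_prev
    P_pred = F @ P_prev @ F.T + Q           # predicted noise covariance
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, h, H_jac, R):
    """Update step: correct the prediction with the observed quantity z
    (here, e.g., the IMU-derived pose) to get the accurate values."""
    H = H_jac(x_pred)                       # dh/dx evaluated at the prediction
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain K_k
    x_est = x_pred + K @ (z - h(x_pred))    # accurate pose state
    P_est = (np.eye(len(x_pred)) - K @ H) @ P_pred  # accurate noise covariance
    return x_est, P_est
```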
Step 3. For the accurate value x̂_k of the robot's pose state at the current moment, obtained in step 2 from the data of the odometer kinematic model and the inertial measurement unit, the motion update model is p(x_k | x_{k-1}, y_k, v_{k-1}); that is, the pose state x_k of the robot at time k is obtained from the pose state x_{k-1} at time k-1, the observed quantity y_k at time k and the input control quantity v_{k-1} at time k-1. The accurate pose state x̂_k obtained in step 2 is taken as the primary pose of the robot.
Step 4. Combine the motion update model p(x_k | x_{k-1}, y_k, v_{k-1}) and the laser perception model with the Monte Carlo localization algorithm, and estimate the global pose of the robot with a set of weighted particles.

Step 4-1. The prior belief of the state x_k at the current moment is

Bel⁻(x_k) = ∫ p(x_k | x_{k-1}, y_k, v_{k-1}) Bel(x_{k-1}) dx_{k-1},

where Bel(x_{k-1}) is the belief of the state at the previous moment.

The posterior probability distribution of the state x_k at the current moment is

Bel(x_k) = p(z_k | x_k) Bel⁻(x_k) / p(z_k | z_{1:k-1}),

where z_k is the measurement information of the current laser scan, p(z_k | x_k) is the laser perception model, and p(z_k | z_{1:k-1}) is the inverse of the normalization factor.

Step 4-2. Using the idea of KLD (Kullback-Leibler divergence) sampling, introduce an adaptive mechanism for the number of particles in the Monte Carlo localization algorithm, in which the number of samples is proportional to the size of the state space.
Step 4-3. The weighted mean of the particle set {x_k^(i), w_k^(i)}, 1 ≤ i ≤ n, is the minimum-variance estimate of the robot's pose state, where n is the number of particles, x_k^(i) is the i-th element (particle) of x_k and w_k^(i) is the weight corresponding to particle x_k^(i). This estimate is recorded as the pose state of the robot after the second correction; it is global information that truly reflects the actual position and orientation of the robot in the map coordinate system.
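As an illustration of steps 4-1 to 4-3, the sketch below shows one Monte Carlo localization iteration (propagate, weight by the laser perception model, take the weighted mean) together with the standard KLD-sampling bound on the particle count; the bound follows Fox's formula, and epsilon, delta, n_min, n_max, motion_model and sensor_model are illustrative placeholders, not parameters fixed by the patent.

```python
import numpy as np

def kld_sample_size(k, epsilon=0.05, delta=0.01, n_min=100, n_max=5000):
    """KLD bound on the particle count: k is the number of histogram bins
    occupied by the particles (a proxy for the size of the state space),
    epsilon the allowed KL error, delta the confidence parameter."""
    if k <= 1:
        return n_min
    z = 2.3263  # upper (1 - delta) quantile of the standard normal for delta = 0.01
    n = (k - 1) / (2 * epsilon) * (
        1 - 2 / (9 * (k - 1)) + np.sqrt(2 / (9 * (k - 1))) * z) ** 3
    return int(np.clip(n, n_min, n_max))

def mcl_step(particles, weights, motion_model, sensor_model, scan):
    """One MCL iteration: prior belief via the motion update model, posterior
    weights via the laser perception model, pose estimate as the weighted mean."""
    particles = np.array([motion_model(p) for p in particles])       # Bel-(x_k)
    weights = np.array([sensor_model(scan, p) for p in particles])   # p(z_k | x_k)
    weights = weights / weights.sum()                                # normalize
    # weighted mean as the minimum-variance estimate (the heading is averaged
    # naively here for brevity; a circular mean would be more careful)
    pose_estimate = (weights[:, None] * particles).sum(axis=0)
    return particles, weights, pose_estimate
```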
Step 5. Select the iterative closest point (ICP) algorithm from among the scan-matching algorithms as the final correction algorithm for the robot pose. Taking the pose from the second correction as the initial solution of the ICP iteration, match the point cloud observed in real time by the two-dimensional laser with the grid-map point cloud to obtain the rotation and translation between the robot's estimated pose and its true pose, thereby correcting the robot's pose a third time.
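For step 5, the following is a minimal 2-D point-to-point ICP sketch: the scan is transformed with the second-corrected pose as the initial solution, associated to the map point cloud by brute-force nearest neighbours, and refined with a closed-form SVD alignment. The patent does not fix the association strategy, iteration limit or convergence thresholds, so icp_2d, max_iter and tol are illustrative assumptions.

```python
import numpy as np

def icp_2d(scan, map_points, init_pose, max_iter=20, tol=1e-4):
    """Align the live 2-D laser scan (N x 2, robot frame) to the grid-map
    point cloud (M x 2, map frame), starting from the MCL pose estimate."""
    x, y, theta = init_pose
    for _ in range(max_iter):
        # transform the scan into the map frame with the current estimate
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        moved = scan @ R.T + np.array([x, y])
        # nearest-neighbour association (brute force, for clarity only)
        dists = np.linalg.norm(moved[:, None, :] - map_points[None, :, :], axis=2)
        nn = map_points[dists.argmin(axis=1)]
        # closed-form rigid alignment (Kabsch/SVD) of the matched pairs
        mu_s, mu_m = moved.mean(axis=0), nn.mean(axis=0)
        U, _, Vt = np.linalg.svd((moved - mu_s).T @ (nn - mu_m))
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:            # guard against a reflection
            Vt[-1, :] *= -1
            dR = Vt.T @ U.T
        dtrans = mu_m - dR @ mu_s
        # accumulate the incremental rotation/translation into the pose
        dtheta = np.arctan2(dR[1, 0], dR[0, 0])
        theta += dtheta
        x, y = dR @ np.array([x, y]) + dtrans
        if np.hypot(*dtrans) < tol and abs(dtheta) < tol:
            break
    return np.array([x, y, theta])
```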
The invention has the beneficial effects that:
(1) High-precision positioning is realized. The final pose of the robot is determined by combining three algorithms, so their strengths and weaknesses complement one another and the advantages of each sensor are fully exploited. By fusing odometer and IMU data, the EKF algorithm remedies the large deviation of the odometer's angle output and yields more accurate trajectory information. The Monte Carlo localization algorithm uses the EKF to generate and sample the importance density function, which improves the accuracy of the pose estimate. The pose estimated by the Monte Carlo algorithm serves as the initial solution of the scan-matching iteration, and the grid map serves as the reference point cloud, which reduces error accumulation. The method of the invention therefore improves the positioning accuracy of the robot.
(2) The balance between computational efficiency and positioning accuracy is improved. Computational cost must be considered when multiple algorithms are combined to process information from multiple sensors. The EKF algorithm adopted by the invention fuses two relative positioning sensors and is computationally efficient. A particle-number adaptive mechanism is introduced into the Monte Carlo localization algorithm, which improves the efficiency of pose estimation. In the scan-matching algorithm, a maximum number of iterations is set, as few associated point pairs as possible are selected according to the number of two-dimensional laser beams, and an accurate pose is used as the initial iterative solution, which reduces the number of iterations needed to reach convergence. The combined use of the three algorithms improves the balance between computational efficiency and positioning accuracy.
Drawings
FIG. 1 is a block flow diagram of a method of the present invention;
FIG. 2 is a grid map of a substation environment and corresponding test points used by the method of the embodiment;
FIG. 3 is a diagram illustrating test details of an embodiment;
FIG. 4 is a waveform diagram of the distance d_r between the actual arrival position of the robot and the target point in the embodiment;
FIG. 5 is a waveform diagram of the angular deviation θ between the actual arrival position of the robot and the target point in the embodiment;
FIG. 6 is a waveform diagram of the distance d_e between the position estimated by the positioning algorithm and the target point in the embodiment;
fig. 7 is a waveform diagram of the real positioning error d of the robot in the embodiment.
Detailed Description
The invention is further described below with reference to the figures and examples.
This embodiment provides a high-precision positioning method for an intelligent inspection robot of a transformer substation; the flow of the method is shown in FIG. 1. The robot used in this embodiment is a HUSKY A200, equipped internally with an industrial personal computer running the ROS system, a motion control module, an information acquisition module, a power management module, an odometer and an IMU, and carrying externally a SICK two-dimensional laser radar of model LMS151. The test environment of this embodiment is a 200 kV substation; three test points are set in total, and 50 sets of experiments are performed at each point. The recorded positioning data comprise the distance d_r and the angle deviation θ between the robot's actual arrival position and the target point, and the distance d_e and the angle deviation β between the position estimated by the positioning algorithm and the target point. The method of this embodiment comprises the following steps:
Step 1. According to the odometer kinematic model

x̌_k = f(x̂_{k-1}, v_k)

solve the pose information (x, y, θ) of the robot relative to the starting point, where k is the current time, x̌_k is the estimated state of the odometer motion, x̂_{k-1} is the accurate state of the odometer motion at the previous moment, v_k is the input control quantity at time k, f is the function of the kinematic model, x is the distance along the X axis relative to the starting point, y is the distance along the Y axis, and θ is the relative rotation angle. The inertial measurement unit (IMU) outputs an angle and an angular velocity, from which the pose state (x₁, y₁, θ₁) of the robot in the horizontal plane is calculated.
Step 2. Assume that the pose state x_k of the robot at time k obeys the Gaussian distribution

p(x_k | x̌_0, y_{0:k}, v_{1:k}) = N(x̂_k, P̂_k),

and that the process and measurement noise obey zero-mean Gaussian distributions, where x̌_0 is the initial state of the robot's motion pose, y_{0:k} is the observed quantity, v_{1:k} is the input control quantity at times 1 to k, x̂_k is the mean, P̂_k is the covariance, p denotes probability and N denotes a Gaussian distribution.

According to the prediction equations of the EKF (extended Kalman filter) algorithm, obtain the predicted value x̌_k of the robot's pose state at the current moment, the predicted noise covariance P̌_k and the Kalman gain K_k; then, using the observed quantity and the update equations of the EKF algorithm, obtain the accurate value x̂_k of the robot's pose state at the current moment and the accurate value P̂_k of the noise covariance.
Step 3. For the accurate value x̂_k of the robot's pose state at the current moment, obtained in step 2 from the data of the odometer kinematic model and the inertial measurement unit, the motion update model is p(x_k | x_{k-1}, y_k, v_{k-1}); that is, the pose state x_k of the robot at time k is obtained from the pose state x_{k-1} at time k-1, the observed quantity y_k at time k and the input control quantity v_{k-1} at time k-1. The accurate pose state x̂_k obtained in step 2 is taken as the primary pose of the robot.
step 4. update the model p (x) by using the motionk|xk-1,yk,vk-1) The laser perception model is combined with a Monte Carlo positioning algorithm, and the global pose of the robot is estimated by a plurality of weighted particles;
step 4-1. current time state xkPrior confidence of (d):
Figure BDA00017668059800000514
wherein Bel (x)k-1) The state reliability at the last moment is obtained;
current time state xkThe posterior probability distribution of (a):
Bel(xk)=p(zk|xk)Bel-(xk)/p(zk|z1:k-1)
wherein z iskAs measurement information of the current laser, p (z)k|xk) For the laser perception model, p (z)k|z1:k-1) Is the inverse of the normalization factor;
step 4-2, introducing a self-adaptive mechanism to the particle number change in the Monte Carlo positioning algorithm by using a KLD (Kullback-Leibler diversity) sampling idea, wherein the sample number is in direct proportion to the size of a state space;
Step 4-3. The weighted mean of the particle set {x_k^(i), w_k^(i)}, 1 ≤ i ≤ n, is the minimum-variance estimate of the robot's pose state, where n is the number of particles, x_k^(i) is the i-th element (particle) of x_k and w_k^(i) is the weight corresponding to particle x_k^(i). This estimate is recorded as the pose state of the robot after the second correction; it is global information that truly reflects the actual position and orientation of the robot in the map coordinate system.
Step 5. Select the iterative closest point (ICP) algorithm from among the scan-matching algorithms as the final correction algorithm for the robot pose. Taking the pose from the second correction as the initial solution of the ICP iteration, match the point cloud observed in real time by the two-dimensional laser with the grid-map point cloud to obtain the rotation and translation between the robot's estimated pose and its true pose, thereby correcting the robot's pose a third time.
A grid map of the substation environment is built for the robot using a simultaneous localization and mapping (SLAM) algorithm, as shown in FIG. 2.
Three target points are selected on the grid map and stored in the robot's internal memory. The three points are far apart, turns are required to reach them, and the surroundings of each target point differ considerably. This tests whether the robot's positioning error accumulates as the travelled distance increases, and also whether environments with different features affect the positioning accuracy.
Static environment test: there are no dynamic obstacles in the substation environment. The robot starts the inspection task and reaches the three target points in sequence. The test results, expressed as means of absolute values, are shown in Table 1.
TABLE 1
[Table 1: static-environment test results, means of absolute values; reproduced as an image in the original publication]
It can be seen from table 1 that the difference between the actual angle error and the estimated angle error is small, which indicates that the estimated angle can truly reflect the actual angle of the robot.
Dynamic environment test: after the inspection task starts, several people walk around in front of the robot to act as dynamic obstacles. The test results, expressed as means of absolute values, are shown in Table 2.
TABLE 2
[Table 2: dynamic-environment test results, means of absolute values; reproduced as an image in the original publication]
Table 2 shows that the positioning error of the robot in the dynamic environment is larger than in the static environment: the dynamic obstacles affect the two-dimensional laser measurements, so the real-time laser data no longer match the established grid map, which degrades the positioning accuracy.
The data waveforms of d_r, θ and d_e for target point 3 in the static environment are plotted with MATLAB and shown in FIGS. 4, 5 and 6, respectively, and the root mean square errors are calculated from the 50 sets of test data.
Since the robot's angular deviation is small, the angle error is ignored. The value d = |d_r - d_e| is calculated and represents the real positioning error of the robot. The data waveform of d is shown in FIG. 7.
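As a small illustration of how the reported statistics can be obtained (the original analysis was done in MATLAB), the sketch below computes the mean of absolute values and the root mean square over the 50 trials, plus the real positioning error d = |d_r - d_e|; summarize_errors is a placeholder name, not part of the patent.

```python
import numpy as np

def summarize_errors(d_r, d_e):
    """Mean of absolute values and RMS for d_r, d_e and d = |d_r - d_e|."""
    d_r, d_e = np.asarray(d_r, float), np.asarray(d_e, float)
    d = np.abs(d_r - d_e)  # real positioning error per trial
    def stats(v):
        return float(np.mean(np.abs(v))), float(np.sqrt(np.mean(v ** 2)))
    return {"d_r": stats(d_r), "d_e": stats(d_e), "d": stats(d)}
```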
The actual positioning error obtained with the method of this embodiment is compared with that of the prior-art methods; the comparison is shown in Table 3.
TABLE 3
Method                     Positioning error
EKF                        15 cm
MCL                        6 cm
Scan matching              10 cm
Method of the invention    2.5 cm
The method realizes high-precision positioning of the substation intelligent inspection robot. The mean value of d_r is 2.26 cm and its root mean square error is 2.385 cm; FIG. 4 and the mean and root-mean-square results for d_r show that, under the relatively ideal and stable test conditions, the robot's positioning error is small and the data deviate little from one run to the next. The mean value of θ is 1.12° and its root mean square error is 1.35°; FIG. 5 and these results show that the robot's directional error is small, so the estimated angular deviation β need not be computed separately. The mean value of d_e is 1.37 cm and its root mean square error is 1.40 cm; the motion of the robot is controlled according to the value of d_e. The mean value of d is 1.01 cm and its root mean square error is 1.40 cm; d represents the real positioning accuracy of the robot. The real positioning capability of the robot is evaluated by taking the difference between the actual distance error and the estimated distance error; this error excludes the influence of robot hardware, control accuracy and the like, and is introduced only by the positioning algorithm, the map and the sensors.

Claims (1)

1. A high-precision positioning method for an intelligent inspection robot of a transformer substation is characterized by comprising the following steps:
Step 1. According to the odometer kinematic model

x̌_k = f(x̂_{k-1}, v_k)

solve the pose information (x, y, θ) of the robot relative to the starting point, where k is the current time, x̌_k is the estimated state of the odometer motion, x̂_{k-1} is the accurate state of the odometer motion at the previous moment, v_k is the input control quantity at time k, f is the function of the kinematic model, x is the distance along the X axis relative to the starting point, y is the distance along the Y axis, and θ is the relative rotation angle. The inertial measurement unit outputs an angle and an angular velocity, from which the pose state (x₁, y₁, θ₁) of the robot in the horizontal plane is calculated.
Step 2. Assume that the pose state x_k of the robot at time k obeys the Gaussian distribution

p(x_k | x̌_0, y_{0:k}, v_{1:k}) = N(x̂_k, P̂_k),

and that the process and measurement noise obey zero-mean Gaussian distributions, where x̌_0 is the initial state of the robot's motion pose, y_{0:k} is the observed quantity, v_{1:k} is the input control quantity at times 1 to k, p denotes probability and N denotes a Gaussian distribution.

According to the prediction equations of the EKF algorithm, obtain the predicted value x̌_k of the robot's pose state at the current moment, the predicted noise covariance P̌_k and the Kalman gain K_k; then, using the observed quantity and the update equations of the EKF algorithm, obtain the accurate value x̂_k of the robot's pose state at the current moment and the accurate value P̂_k of the noise covariance.
Step 3. For the accurate value x̂_k of the robot's pose state at the current moment, obtained in step 2 from the data of the odometer kinematic model and the inertial measurement unit, the motion update model is p(x_k | x_{k-1}, y_k, v_{k-1}); that is, the pose state x_k of the robot at time k is obtained from the pose state x_{k-1} at time k-1, the observed quantity y_k at time k and the input control quantity v_{k-1} at time k-1. The accurate pose state x̂_k obtained in step 2 is taken as the primary pose of the robot.
Step 4. Combine the motion update model p(x_k | x_{k-1}, y_k, v_{k-1}) and the laser perception model with the Monte Carlo localization algorithm, and estimate the global pose of the robot with a set of weighted particles.

The specific process of step 4 is as follows:

Step 4-1. The prior belief of the state x_k at the current moment is

Bel⁻(x_k) = ∫ p(x_k | x_{k-1}, y_k, v_{k-1}) Bel(x_{k-1}) dx_{k-1},

where Bel(x_{k-1}) is the belief of the state at the previous moment.

The posterior probability distribution of the state x_k at the current moment is

Bel(x_k) = p(z_k | x_k) Bel⁻(x_k) / p(z_k | z_{1:k-1}),

where z_k is the measurement information of the current laser scan, p(z_k | x_k) is the laser perception model, and p(z_k | z_{1:k-1}) is the inverse of the normalization factor.

Step 4-2. Using the KLD sampling idea, introduce an adaptive mechanism for the number of particles in the Monte Carlo localization algorithm, in which the number of samples is proportional to the size of the state space.
Step 4-3. The weighted mean of the particle set {x_k^(i), w_k^(i)}, 1 ≤ i ≤ n, is the minimum-variance estimate of the robot's pose state, where n is the number of particles, x_k^(i) is the i-th element (particle) of x_k and w_k^(i) is the weight corresponding to particle x_k^(i). This estimate is recorded as the pose state of the robot after the second correction; it is global information that truly reflects the actual position and orientation of the robot in the map coordinate system.
Step 5. Select the iterative closest point (ICP) algorithm from among the scan-matching algorithms as the final correction algorithm for the robot pose. Taking the pose corrected in step 4 as the initial solution of the ICP iteration, match the point cloud observed in real time by the two-dimensional laser with the grid-map point cloud to obtain the rotation and translation between the robot's estimated pose and its true pose, thereby correcting the robot's pose a third time.
CN201810934182.8A 2018-08-16 2018-08-16 High-precision positioning method for intelligent inspection robot of transformer substation Active CN108955679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810934182.8A CN108955679B (en) 2018-08-16 2018-08-16 High-precision positioning method for intelligent inspection robot of transformer substation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810934182.8A CN108955679B (en) 2018-08-16 2018-08-16 High-precision positioning method for intelligent inspection robot of transformer substation

Publications (2)

Publication Number Publication Date
CN108955679A CN108955679A (en) 2018-12-07
CN108955679B true CN108955679B (en) 2022-03-15

Family

ID=64469623

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810934182.8A Active CN108955679B (en) 2018-08-16 2018-08-16 High-precision positioning method for intelligent inspection robot of transformer substation

Country Status (1)

Country Link
CN (1) CN108955679B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109443351B (en) * 2019-01-02 2020-08-11 亿嘉和科技股份有限公司 Robot three-dimensional laser positioning method in sparse environment
CN109760018B (en) * 2019-01-03 2021-04-13 中建地下空间有限公司 Power cabin device
CN109579849B (en) * 2019-01-14 2020-09-29 浙江大华技术股份有限公司 Robot positioning method, robot positioning device, robot and computer storage medium
CN111694903B (en) * 2019-03-11 2023-09-12 北京地平线机器人技术研发有限公司 Map construction method, device, equipment and readable storage medium
CN110434859B (en) * 2019-05-30 2022-11-08 上海大学 Intelligent service robot system facing commercial office environment and operation method thereof
CN110262495B (en) * 2019-06-26 2020-11-03 山东大学 Control system and method capable of realizing autonomous navigation and accurate positioning of mobile robot
CN110293563B (en) * 2019-06-28 2022-07-26 炬星科技(深圳)有限公司 Method, apparatus, and storage medium for estimating pose of robot
CN110793543B (en) * 2019-10-21 2023-06-13 国网电力科学研究院有限公司 Positioning navigation precision measuring device and method of electric power inspection robot based on laser scanning
CN110954100A (en) * 2019-12-30 2020-04-03 广东省智能制造研究所 Method for estimating body state of foot type robot based on fusion of laser and inertial navigation
CN111536967B (en) * 2020-04-09 2022-12-16 江苏大学 EKF-based multi-sensor fusion greenhouse inspection robot tracking method
CN111993391B (en) * 2020-08-25 2022-02-15 深圳市优必选科技股份有限公司 Robot pose estimation method and device, humanoid robot and storage medium
CN112033412A (en) * 2020-09-07 2020-12-04 中国南方电网有限责任公司超高压输电公司天生桥局 Method and device for improving positioning precision of inspection robot
CN113075687A (en) * 2021-03-19 2021-07-06 长沙理工大学 Cable trench intelligent inspection robot positioning method based on multi-sensor fusion
CN113091736B (en) * 2021-04-02 2023-04-07 京东科技信息技术有限公司 Robot positioning method, device, robot and storage medium
CN113190002B (en) * 2021-04-25 2022-09-30 上海工程技术大学 Method for realizing automatic inspection by high-speed rail box girder inspection robot
CN113504543B (en) * 2021-06-16 2022-11-01 国网山西省电力公司电力科学研究院 Unmanned aerial vehicle LiDAR system positioning and attitude determination system and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009229394A (en) * 2008-03-25 2009-10-08 Ihi Corp Apparatus and method for determining traveling area of mobile robot
CN105698807A (en) * 2016-02-01 2016-06-22 郑州金惠计算机系统工程有限公司 Laser navigation system applicable to intelligent inspection robot of transformer substation
CN105891865A (en) * 2016-03-28 2016-08-24 南京工程学院 Markov-chain-Monte-Carlo-based particle filter positioning method
CN107063264A (en) * 2017-04-13 2017-08-18 杭州申昊科技股份有限公司 A kind of robot map creating method suitable for extensive substation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"基于改进粒子滤波算法的移动机器人定位";张弦;《万方数据库》;20101231;正文第9-51页 *

Also Published As

Publication number Publication date
CN108955679A (en) 2018-12-07

Similar Documents

Publication Publication Date Title
CN108955679B (en) High-precision positioning method for intelligent inspection robot of transformer substation
CN108181636B (en) Environment modeling and map building device and method for petrochemical plant inspection robot
CN109579849A (en) Robot localization method, apparatus and robot and computer storage medium
CN111536967A (en) EKF-based multi-sensor fusion greenhouse inspection robot tracking method
CN103017771B (en) Multi-target joint distribution and tracking method of static sensor platform
CN103984981A (en) Building environment sensor measuring point optimization method based on Gauss process model
CN114459470A (en) Inspection robot positioning method based on multi-sensor fusion
CN114964212B (en) Multi-machine collaborative fusion positioning and mapping method oriented to unknown space exploration
CN107607091A (en) A kind of method for measuring unmanned plane during flying flight path
Monjazeb et al. Autonomous navigation among large number of nearby landmarks using FastSLAM and EKF-SLAM-A comparative study
Shi et al. Integrated Navigation by a Greenhouse Robot Based on an Odometer/Lidar.
Jiang et al. 3D SLAM based on NDT matching and ground constraints for ground robots in complex environments
CN116659500A (en) Mobile robot positioning method and system based on laser radar scanning information
Lee et al. Development of advanced grid map building model based on sonar geometric reliability for indoor mobile robot localization
Guan et al. A new integrated navigation system for the indoor unmanned aerial vehicles (UAVs) based on the neural network predictive compensation
Kong et al. Hybrid indoor positioning method of BLE and monocular VINS based smartphone
Wu et al. Cooperative localization of network robot system based on improved MPF
Zhang et al. Improved grid mapping technology based on Rao-Blackwellized particle filters and the gradient descent algorithm
Zhao et al. Location technology of indoor robot based on laser sensor
Emharraf et al. Mobile Robot: SLAM Implementation for Unknown Indoor Environment Exploration.
Liu et al. Research on NDT-based positioning for autonomous driving
Huang et al. Correcting of the unexpected localization measurement for indoor automatic mobile robot transportation based on a neural network
Lin et al. Pedestrian Movement Tracking Model in Road Environment Based on UAV Video
CN115047406B (en) Reconstruction method of ground-air link propagation attenuation region
CN115225136B (en) Reconstruction method of satellite-ground link propagation attenuation region

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant