CN112947461B - VSLAM algorithm-based blast furnace tuyere platform inspection robot control method - Google Patents
- Publication number
- CN112947461B CN112947461B CN202110225477.XA CN202110225477A CN112947461B CN 112947461 B CN112947461 B CN 112947461B CN 202110225477 A CN202110225477 A CN 202110225477A CN 112947461 B CN112947461 B CN 112947461B
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
Abstract
The invention provides a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm. The method first extracts image features of the inspection site through the VSLAM algorithm and uses those features to continuously build and refine a model of the site. After an inspection task is received, an inspection route is planned; once the target location is reached, the image collected in real time is feature-matched against the modelling image to confirm accurate positioning. Finally, relevant data are collected by the camera, temperature sensor, infrared sensor, noise detector and combustible gas detector carried by the robot, and are transmitted to host computer software for analysis, processing and the establishment of visual data.
Description
Technical Field
The invention belongs to the field of computer vision and electrical control, and particularly relates to a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm.
Background
The blast furnace tuyere platform is a key area of an iron-making blast furnace and a focus of blast furnace iron-making production safety management. Blast furnace operators closely monitor the working state of the tuyere raceway and use it as one of the important bases for judging and controlling the running condition of the furnace. This work is currently done mainly by manual inspection. Because gas may leak at the tuyere platform, manual inspection carries great safety hazards.
With the rapid development of science and technology, inspection robots are widely used in many fields to replace or assist human beings in inspection work, particularly in environments that are hard to access, harsh and harmful to health, or prone to safety accidents. By application, inspection robots can be divided into autonomous inspection robots for high-voltage transmission lines, intelligent mining inspection robots, industrial plant inspection robots and so on; by form, they can be divided into wheeled, tracked and rail-mounted inspection robots. An inspection robot is usually controlled from a master control room, so it can carry out preset or temporarily assigned inspection tasks in different time periods and different areas of the site. This makes inspection more comprehensive, truthful and reliable, and effectively avoids human-factor interference such as missed inspections and recording errors that occur during manual inspection. Meanwhile, the robot can transmit the acquired data to host computer software for recording, processing and analysis to form visual data, effectively improving the digital management level of the equipment.
At present, the VSLAM algorithm attracts much research attention. Its core method is to sense the surrounding environment through optical sensors working in the visible-light band (such as a video camera or a depth camera) to form image data, to further extract and process that data, and to construct a model of the surroundings by combining scene depth information with camera motion estimation. For path planning, global planning can be performed using the identification of obstacles and the distribution of visual landmarks in the environment; the planning result is then sent to the processor, which finally issues driving commands to realise navigation. For feature matching, because a complete inspection map is established in advance, an image to be matched only needs to be compared and matched against the images in memory. In conclusion, VSLAM combines knowledge from several disciplines such as computer vision and electrical control, and is undoubtedly a popular and advanced research technology at present.
The VSLAM algorithm adopted by the invention can obtain richer information such as images and colour. At the same time, the depth camera it carries is inexpensive, light and easy to install; because it provides the depth information of objects, a captured scene is not flat but carries scale information. This greatly simplifies the modelling of the inspection site and remains applicable even when the working environment changes greatly.
Therefore, the invention adopts a control method for the blast furnace tuyere platform inspection robot based on the VSLAM algorithm, which ensures safety and efficiency while overcoming problems and defects of traditional tuyere platform inspection.
Disclosure of Invention
Although its positioning accuracy is high and its application range wide, the widely used GPS-based inspection robot can only work outdoors where the signal is good; it cannot operate normally in harsh environments and easily loses position information. The lidar-based inspection robot has advantages such as reliable sensing information and strong robustness, but its navigation and positioning fail when the sensed environment is highly repetitive or indistinguishable, its cost is high, and the environmental information and features it acquires are limited. To solve these problems, the invention provides a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
in the initial state the inspection robot sits on the charging station of the master control room. The robot is started, performs a self-check and begins to build the inspection map. If the gray value of a pixel stands out from the surrounding pixels, that pixel is set as the centre pixel P of a detection area, and a detected pixel is denoted I. The detection range is a discrete circle with a radius of 3 pixels, containing 16 pixels in total, and a gray threshold T is set. If the gray difference |I − P| ≥ T, the gray value difference is too large and the point is re-sampled; otherwise the pixel is taken as a feature point and added to the model map. Through continuous sampling and comparison, a high-precision inspection map is established.
Let the inspection area be D. The moments of D are defined as $m_{pq}=\sum_{x,y\in D}x^{p}y^{q}I(x,y)$, with $p,q\in\{0,1\}$ and $x,y\in D$, where $I(x,y)$ is the gray value at point $(x,y)$; these are the moments of the neighbourhood pixels of the feature point P.
The centroid of the inspection area D is $C=\left(\frac{m_{10}}{m_{00}},\frac{m_{01}}{m_{00}}\right)$, where $m_{10}$, $m_{01}$ and $m_{00}$ are computed from the key node coordinates in area D. A vector $\overrightarrow{OC}$ is then formed, where $O$ is the geometric centre and $C$ is the centroid.
The angle between the feature point and the centroid is defined as $\alpha=\arctan\left(m_{01}/m_{10}\right)$. After the direction of the area is determined, brighter points in the area are sampled to acquire $n$ feature points $(x_{n},y_{n})$, giving the feature description matrix $A=\begin{pmatrix}x_{1}&\cdots&x_{n}\\y_{1}&\cdots&y_{n}\end{pmatrix}$.
Rotating $A$ by the angle $\alpha$ with the rotation matrix $R_{\alpha}=\begin{pmatrix}\cos\alpha&-\sin\alpha\\\sin\alpha&\cos\alpha\end{pmatrix}$ simplifies the calculation, $A_{\alpha}=R_{\alpha}A$.
Solving $R_{\alpha}$ yields the principal direction $\theta$ of the feature points in area D; the colour and gray feature values are then collected, completing the feature extraction.
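As a concrete illustration of the sampling and orientation steps above, the sketch below tests a candidate feature point against the 16 pixels of a discrete circle of radius 3, and computes the intensity-centroid angle from the moments $m_{pq}$. This is a minimal, hypothetical Python/NumPy sketch, not the patented implementation; it follows the common FAST/ORB convention in which a point qualifies when enough circle pixels differ from the centre by at least the threshold T.

```python
import numpy as np

def patch_orientation(img, cx, cy, r=3):
    """Intensity-centroid orientation of the patch around (cx, cy),
    using m_pq = sum x^p y^q I(x, y) with p, q in {0, 1}."""
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    patch = img[cy - r:cy + r + 1, cx - r:cx + r + 1].astype(float)
    m10 = (xs * patch).sum()
    m01 = (ys * patch).sum()
    return np.arctan2(m01, m10)  # angle from geometric centre to centroid

def is_feature(img, cx, cy, t, n_required=12):
    """FAST-style test: compare the centre pixel P with the 16 pixels on a
    discrete circle of radius 3; the point qualifies as a feature when
    enough neighbours differ from P by at least the gray threshold t."""
    circle = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2),
              (1, 3), (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1),
              (-2, -2), (-1, -3)]  # Bresenham circle of radius 3, 16 pixels
    p = float(img[cy, cx])
    diffs = [abs(float(img[cy + dy, cx + dx]) - p) >= t for dx, dy in circle]
    return sum(diffs) >= n_required
```

A bright pixel on a dark background passes the test, while a uniform patch does not; on an image whose intensity grows along x, the centroid angle is 0.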
The second step is that: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
in either inspection mode, the robot needs to autonomously establish a path plan to reach the accurate inspection position;
a mathematical method based on a unary linear regression model solved by least squares is adopted.
The sample regression model is set as $Y_{i}=\hat{\beta}_{0}+\hat{\beta}_{1}X_{i}+e_{i}$, where $Y_{i}$ is the dependent variable, $X_{i}$ is the independent variable, $e_{i}$ is the random disturbance term, and $\hat{\beta}_{0}$, $\hat{\beta}_{1}$ are the coefficients of the regression equation.
To fit the regression model, the extreme value of the sum of squared residuals $Q=\sum_{i}\left(Y_{i}-\hat{\beta}_{0}-\hat{\beta}_{1}X_{i}\right)^{2}$ is found by setting its partial derivatives to zero; the resulting regression equation is then used, according to the actual situation, to plan the path for the next moment;
the method is simple to compute and gives stable estimates (although it is sensitive to outliers), which effectively guarantees path precision.
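The least-squares fit described above has a closed form obtained by setting the partial derivatives of the residual sum to zero. The following Python sketch (illustrative only; the function name is not from the patent) computes the two regression coefficients:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = b0 + b1*x. Setting the partial
    derivatives of sum((y - b0 - b1*x)^2) to zero gives the closed form:
    b1 = cov(x, y) / var(x), b0 = mean(y) - b1 * mean(x)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b1 = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
         sum((x - mean_x) ** 2 for x in xs)
    b0 = mean_y - b1 * mean_x
    return b0, b1
```

For points lying exactly on y = 1 + 2x the fit recovers the coefficients exactly.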
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be $G_{1}=(V_{1},E_{1})$ and $G_{2}=(V_{2},E_{2})$, where $V$ is the point set, $E$ is the edge set, and the numbers of nodes are $n_{1}$ and $n_{2}$, i.e. $|V_{1}|=n_{1}$, $|V_{2}|=n_{2}$, with $n_{1}=n_{2}$ in the general case. An assignment matrix $X\in\{0,1\}^{n_{1}\times n_{2}}$ describes the mapping between corresponding nodes of the two graph structures. Under its constraints, the matching value is maximised, the objective being of the form $\max\sum_{i,j}c_{i,j}X_{i,j}+\sum_{i,j,k,l}d_{i,j,k,l}X_{i,j}X_{k,l}$,
where $c_{i,j}$ and $d_{i,j,k,l}$ are the similarity between nodes and the similarity between edges respectively, and $X_{i,j}$, $X_{k,l}$ represent the matching relations between nodes. If $X_{i,j}=1$ in the assignment matrix, the $i$-th node of $G_{1}$ matches the $j$-th node of $G_{2}$; if $X_{i,j}=0$, the two nodes are unrelated. Each node of $G_{1}$ should match at most one node of $G_{2}$; a one-to-many match must not occur.
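To make the matching objective concrete, the sketch below scores an assignment by summing node similarities c[i][j] and edge similarities d[i][j][k][l] over matched pairs, and finds the best one-to-one assignment by brute force. This is an illustrative reading of the objective, feasible only for tiny graphs, and is not the patent's solver.

```python
import itertools

def match_score(C, D, perm):
    """Quadratic-assignment objective: node terms c[i][j] for each match
    i -> perm[i], plus edge terms d[i][j][k][l] for each matched pair."""
    n = len(perm)
    score = sum(C[i][perm[i]] for i in range(n))
    score += sum(D[i][perm[i]][k][perm[k]]
                 for i in range(n) for k in range(n) if i != k)
    return score

def best_match(C, D):
    """Exhaustive search over all one-to-one assignments (tiny graphs only);
    each node of G1 matches exactly one node of G2, never one-to-many."""
    n = len(C)
    return max(itertools.permutations(range(n)),
               key=lambda p: match_score(C, D, p))
```

With an identity-preference node-similarity matrix and zero edge terms, the best assignment keeps node order; swapping the preferences swaps the assignment.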
If the point coordinates of $G_{1}=(V_{1},E_{1})$ are matched against $G_{2}$ by direct global search, the amount of calculation is extremely large, the demand on the processor is high, and delay and downtime are easily caused. Therefore set $G_{1}=(V_{1},E_{1})$ and $G_{2}=(V_{1}+d_{u},E_{1}+d_{v})$, where $[d_{u},d_{v}]$ denotes the pixel displacement; performing feature matching with this incremental algorithm greatly improves the operation speed, reduces the operation burden and shortens the processing time.
Here $V_{k},E_{k}$ are the coordinates of the $k$-th point in graph $G_{1}$, and $[V_{k}+d_{u},E_{k}+d_{v}]$ are the coordinates of the point in $G_{2}$ corresponding to the $k$-th point of $G_{1}$.
$G'_{1,u}$ and $G'_{1,v}$ denote the derivatives with respect to the pixels; solving for the extreme value gives the displacement $[d_{u},d_{v}]$ that minimises the residual, and $[d_{u},d_{v}]^{(x)}$ denotes the pixel displacement after $x$ iterations.
Quantising the above equation into the form of a gradient matrix gives a least-squares problem, which can be iterated step by step from an initial value $[d_{u},d_{v}]^{(0)}$. The error after $x$ iterations is $e_{x}=G_{1}(V_{k},E_{k})-G_{2}\left(V_{k}+d_{u}^{(x)},E_{k}+d_{v}^{(x)}\right)$,
where $[d_{u},d_{v}]^{(x)}$ is the value after $x$ iterations, from which the update direction of the iteration error is defined.
The pixel displacement update can then be expressed in the form $[d_{u},d_{v}]^{(x+1)}=[d_{u},d_{v}]^{(x)}+C^{-1}\,[G'_{1,u},G'_{1,v}]^{\top}e_{x}$, where $C$ is the cost matrix. When $\|e_{x}\|$ is sufficiently small the iteration stops; the pixel displacement relative to $G_{1}=(V_{1},E_{1})$ is obtained, and the location of the feature points of $G_{1}=(V_{1},E_{1})$ within $G_{2}=(V_{2},E_{2})$ can be calculated, completing the feature matching quickly.
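One way to realise this incremental displacement update is a Gauss-Newton least-squares step on the image gradients. The sketch below is a hypothetical Python/NumPy illustration of that idea (the function name and the use of `np.gradient` are assumptions of this sketch, not the patent's):

```python
import numpy as np

def shift_step(ref, cur):
    """One least-squares update for the pixel displacement [du, dv]:
    stack the image gradients into a gradient matrix A, form the
    residual e = ref - cur, and solve the normal equations
    (A^T A) d = A^T e for the update direction d."""
    gy, gx = np.gradient(ref.astype(float))
    A = np.stack([gx.ravel(), gy.ravel()], axis=1)     # gradient matrix
    e = (ref.astype(float) - cur.astype(float)).ravel()  # residual image
    du, dv = np.linalg.solve(A.T @ A, A.T @ e)         # update direction
    return du, dv
```

On a smooth quadratic test surface a single step recovers a small shift almost exactly; real images need the iterate-and-re-warp loop described in the text, stopping when the update becomes sufficiently small.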
The fourth step: positioning:
in fact, when a robot carrying sensors moves in an unknown environment, the continuous motion time is usually discretised into instants $t=1,2,3,\dots,n$, and $x=(x_{1},x_{2},x_{3}\dots x_{n})$ denotes the real poses of the robot, which together constitute its actual trajectory. On the map side, at each instant the sensors measure part of the map features and obtain observation data for them, from which the corresponding trajectory coordinates $\hat{x}=(\hat{x}_{1},\hat{x}_{2},\dots,\hat{x}_{n})$ generated by the algorithm are calculated. The positioning problem is solved by comparing the root mean square error (RMSE) between the real motion trajectory and the algorithm-generated trajectory using the motion measurements and sensor readings:
$\mathrm{RMSE}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left\|\mathrm{trans}(\hat{x}_{i})-\mathrm{trans}(x_{i})\right\|^{2}}$,
where $\mathrm{trans}(\hat{x}_{i})$ is the offset of the algorithm-generated trajectory at the $i$-th point, $\mathrm{trans}(x_{i})$ is the offset of the actual motion trajectory at the $i$-th point, and $n$ is the number of camera poses.
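The RMSE comparison of the two trajectories can be computed directly. The sketch below assumes 2-D poses given as (x, y) tuples, an illustrative simplification since the pose format is not specified in the text:

```python
import math

def trajectory_rmse(est, true):
    """Root mean square error between the algorithm-generated trajectory
    and the real trajectory: sqrt((1/n) * sum ||est_i - true_i||^2)
    over the n poses."""
    n = len(est)
    sq = sum((ex - tx) ** 2 + (ey - ty) ** 2
             for (ex, ey), (tx, ty) in zip(est, true))
    return math.sqrt(sq / n)
```

With one pose matching exactly and one off by a unit in x, the RMSE over two poses is sqrt(1/2).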
The fifth step: collecting data;
the surface temperature of each working element and of the relevant terminals is collected by the temperature detector. A thermodynamic diagram of the inspection site is established by the infrared sensor and compared with the data collected by the temperature detector, forming a closed loop that ensures the accuracy of data collection. Field noise spectrum data are collected by the noise detector; the concentration of hazardous gas on site is collected by the combustible gas detector; and the camera collects the instrument data. Because the inspection robot can accurately position itself at the inspection location, the correlation, consistency and accuracy of the signals it acquires are well guaranteed. The collected data are transmitted by the intelligent control host through the signal exchanger to the master control room computer for analysis and processing.
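The closed loop between the contact temperature probe and the infrared thermal map can be sketched as a simple cross-check. The element names and tolerance value below are illustrative assumptions, not values from the patent:

```python
def flag_disagreements(probe, infrared, tol=5.0):
    """Compare per-element contact-probe temperatures against the infrared
    thermal-map readings; return the elements whose two readings disagree
    by more than tol degrees, so they can be re-sampled."""
    return [name for name, temp in probe.items()
            if abs(temp - infrared.get(name, temp)) > tol]
```

An element whose two readings agree within the tolerance passes silently; a large disagreement flags that element for re-acquisition.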
As a further improvement of the invention, the inspection task comprises two inspection modes;
in a first mode: the main control room computer can enable the inspection robot to automatically start a preset inspection task by sending a command;
and a second mode: and the computer in the master control room remotely controls the inspection robot to perform inspection to a specified place.
Compared with the prior art, the invention has the following advantages:
(1) The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm can accurately locate the robot in real time, and the robot's position can be seen clearly in the host computer software. This effectively improves the positioning precision of the robot and the user's awareness of its state.
(2) The invention adopts the VSLAM algorithm, which has advantages such as being open source and easy to program; it can measure speed and distance, quickly identify map coordinates, and realise long-term, high-precision positioning of the robot.
(3) Compared with GPS and lidar, the invention offers richer visual information, lower hardware cost, a wider visual range and other advantages.
Drawings
FIG. 1 is a schematic diagram of the principles of the present invention;
FIG. 2 is a schematic diagram of the VSLAM algorithm;
fig. 3 is a robot inspection flow chart.
Detailed Description
The present invention will be further illustrated by the following specific examples, which are implemented on the basis of the technical scheme of the invention; it should be understood that these examples only illustrate the invention and do not limit its scope.
The invention provides a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm. Image features of the inspection site are first extracted, and a model of the site is continuously built and refined from those features. After an inspection task is received, an inspection route is planned; once the target inspection position is reached, the image acquired in real time is feature-matched against the modelling image to ensure accurate positioning. Finally, related data are acquired by the camera, temperature sensor, infrared sensor, noise detector and combustible gas detector carried by the robot, and transmitted to the host computer software for analysis, processing and the establishment of visual data.
The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm is described with reference to the schematic diagram of the principles in FIG. 1, the schematic diagram of the VSLAM algorithm in FIG. 2 and the robot inspection flow chart in FIG. 3. The specific embodiment is as follows:
the first step is as follows: establishing a routing inspection map: the inspection robot is located on a main control room charging station when in an initial state. And starting the robot, performing self-checking and establishing a routing inspection map. And setting a central pixel point of the detection area as P, a detected pixel point as I, forming a detection range by taking a certain pixel point as a radius, setting a gray threshold value as T, indicating that sampling is wrong and re-acquisition is carried out if the gray value [ I-P ] of the detected pixel point is more than or equal to T, and otherwise, establishing the sampling point in a model graph.
The second step: path planning: after the modeling of the inspection place is completed, the robot can start to perform inspection tasks, and the invention is provided with two inspection modes. The first mode is as follows: the main control room computer can send a command to enable the inspection robot to automatically start a preset inspection task. And a second mode: and the computer in the master control room remotely controls the inspection robot to an appointed place for inspection. In either case, the robot needs to autonomously establish a path plan to reach the accurate inspection position.
A mathematical method based on a unary linear regression model solved by least squares is adopted.
The sum of squared residuals is $Q=\sum_{i}\left(Y_{i}-\hat{\beta}_{0}-\hat{\beta}_{1}X_{i}\right)^{2}$. To fit the regression model, the extreme value of $Q$ is found by setting its partial derivatives to zero, and the resulting regression equation is used, according to the actual situation, to plan the path for the next moment.
Here $X_{i}$ is the independent variable at the $i$-th point, $Y_{i}$ is the dependent variable related to $X_{i}$, $e_{i}$ is the random disturbance term, and $\hat{\beta}_{0}$, $\hat{\beta}_{1}$ are the coefficients of the regression equation.
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be $G_{1}=(V_{1},E_{1})$ and $G_{2}=(V_{2},E_{2})$, with numbers of nodes $n_{1}$ and $n_{2}$, i.e. $|V_{1}|=n_{1}$, $|V_{2}|=n_{2}$, and $n_{1}=n_{2}$ in the general case. An assignment matrix $X\in\{0,1\}^{n_{1}\times n_{2}}$ describes the mapping between corresponding nodes of the two graph structures. Under its constraints the matching value is maximised, the matching-value function being of the form $\max\sum_{i,j}c_{i,j}X_{i,j}+\sum_{i,j,k,l}d_{i,j,k,l}X_{i,j}X_{k,l}$,
where $c_{i,j}$ and $d_{i,j,k,l}$ are the similarity between nodes and the similarity between edges respectively, and $X_{i,j}$, $X_{k,l}$ represent the matching relations between nodes. If $X_{i,j}=1$ in the assignment matrix, the $i$-th node of $G_{1}$ matches the $j$-th node of $G_{2}$; if $X_{i,j}=0$, the two nodes are unrelated. Each node of $G_{1}$ should match at most one node of $G_{2}$; a one-to-many match must not occur.
If the point coordinates of $G_{1}=(V_{1},E_{1})$ are matched against $G_{2}$ by direct global search, the amount of calculation is extremely large, the demand on the processor is high, and delay and downtime are easily caused. Therefore set $G_{1}=(V_{1},E_{1})$ and $G_{2}=(V_{1}+d_{u},E_{1}+d_{v})$, where $[d_{u},d_{v}]$ denotes the pixel displacement; performing feature matching with this incremental algorithm greatly improves the operation speed, reduces the operation burden and shortens the processing time.
Here $V_{k},E_{k}$ are the coordinates of the $k$-th point in graph $G_{1}$, and $[V_{k}+d_{u},E_{k}+d_{v}]$ are the coordinates of the point in $G_{2}$ corresponding to the $k$-th point of $G_{1}$.
$G'_{1,u}$ and $G'_{1,v}$ denote the derivatives with respect to the pixels; solving for the extreme value gives the displacement $[d_{u},d_{v}]$ that minimises the residual, and $[d_{u},d_{v}]^{(x)}$ denotes the pixel displacement after $x$ iterations.
Quantising the above equation into the form of a gradient matrix gives a least-squares problem, which can be iterated step by step from an initial value $[d_{u},d_{v}]^{(0)}$. The error after $x$ iterations is $e_{x}=G_{1}(V_{k},E_{k})-G_{2}\left(V_{k}+d_{u}^{(x)},E_{k}+d_{v}^{(x)}\right)$,
where $[d_{u},d_{v}]^{(x)}$ is the value after $x$ iterations, from which the update direction of the iteration error is obtained.
The pixel displacement update can then be expressed in the form $[d_{u},d_{v}]^{(x+1)}=[d_{u},d_{v}]^{(x)}+C^{-1}\,[G'_{1,u},G'_{1,v}]^{\top}e_{x}$, where $C$ is the cost matrix. When $\|e_{x}\|$ is sufficiently small the iteration stops; the pixel displacement relative to $G_{1}=(V_{1},E_{1})$ is obtained, and the location of the feature points of $G_{1}=(V_{1},E_{1})$ within $G_{2}=(V_{2},E_{2})$ can be calculated, completing the feature matching quickly.
The fourth step: positioning:
in fact, for a robot carrying a sensor to move in an unknown environment, the time of a continuous movement is usually changed into discrete time t =1,2,3 1 ,x 2 ,x 3 ...x n ) Representing the real trajectory of the robot, which constitutes the actual trajectory of the robot. On the aspect of a map, at each moment, the sensors measure a part of map features and obtain observation data of the map features so as to calculate corresponding track coordinatesTrajectory coordinates generated for the algorithm. The positioning problem is solved by comparing the Root Mean Square Error (RMSE) of the true motion trajectory with the algorithmically generated trajectory with the motion measurements and sensor readings.
RMSE = √((1/n)·Σ_{i=1}^{n} ‖trans(x̂_i) − trans(x_i)‖²), where trans(x̂_i) represents the offset of the algorithm-generated trajectory at the i-th point, trans(x_i) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of camera frames.
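Under the definitions above, the RMSE comparison can be sketched as follows. This is a minimal sketch assuming 2-D translational offsets and a simple list-of-pairs trajectory format, which the patent does not specify.

```python
import math

def trajectory_rmse(true_traj, est_traj):
    """RMSE between the real trajectory points trans(x_i) and the
    algorithm-generated points trans(x_hat_i), averaged over n points."""
    n = len(true_traj)
    assert n == len(est_traj) and n > 0
    total = sum((xh - x) ** 2 + (yh - y) ** 2
                for (x, y), (xh, yh) in zip(true_traj, est_traj))
    return math.sqrt(total / n)
```

A small RMSE indicates that the algorithm-generated trajectory tracks the real one closely, i.e. that the positioning is accurate.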
The fifth step: collecting data. A wheeled industrial robot chassis carrying various sensing devices regularly inspects the equipment of the blast furnace tuyere platform. A temperature detector collects the surface temperature of each working element and of the related terminals; an infrared sensor builds a thermodynamic map of the inspection site, which can be compared with the data collected by the temperature detector, forming closed-loop control that ensures the accuracy of data collection; a noise detector collects on-site noise spectrum data; a combustible gas detector collects the concentration of hazardous gases on site; and the camera collects instrument readings. Because the inspection robot can accurately position itself at each inspection point, the correlation, consistency, and accuracy of the signals it acquires are greatly assured. The collected data are transmitted via the intelligent control host through a signal exchanger to the main control room computer, where they are analyzed, processed, and visualized.
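The closed-loop cross-check between the contact temperature detector and the infrared thermodynamic map could look like the following sketch. The function name and the 5 °C tolerance are illustrative assumptions, not values taken from the patent.

```python
def temperatures_consistent(probe_c, thermal_map_c, tol_c=5.0):
    """Closed-loop validation: accept a reading only when the contact
    probe and the infrared thermal-map value for the same point agree
    within tol_c degrees Celsius; otherwise re-measure the point.
    tol_c = 5.0 is an assumed, illustrative tolerance."""
    return abs(probe_c - thermal_map_c) <= tol_c
```

A disagreement beyond the tolerance would trigger re-collection, which is what closes the loop between the two independent temperature channels.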
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and such modifications and refinements should also be regarded as falling within the protection scope of the present invention.
Claims (1)
1. A control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
the inspection robot is located on the charging station of the main control room in its initial state; the robot is started, self-checks, and builds the inspection map; features of the inspection site are extracted by the VSLAM algorithm: if the gray value of a certain pixel in the image differs sufficiently from the gray values of enough pixels in its surrounding neighborhood, that pixel is set as the center pixel P of a detection area; a detected pixel is denoted I, and the 16 pixels on a discrete circle of radius 3 pixels form the detection range; a gray threshold T is set; if the gray difference |I − P| of a detected pixel is greater than or equal to T, a sampling error is indicated and re-collection is performed; otherwise a sampling point is established in the model map;
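The pixel test in this first step resembles the FAST corner test on the 16-pixel Bresenham circle of radius 3. The sketch below assumes a required count of 12 differing circle pixels; the claim only says "enough", so that count is an assumption.

```python
# Offsets (dr, dc) of the 16 pixels on a discrete (Bresenham) circle of
# radius 3 around the candidate center pixel, as in the FAST detector.
CIRCLE16 = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
            (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_center_pixel(img, r, c, T, required=12):
    """Return True if at least `required` of the 16 circle pixels differ
    from the candidate center pixel P = img[r][c] by the gray threshold T.
    `required=12` is an assumed count; the claim does not fix a number."""
    p = img[r][c]
    hits = sum(1 for dr, dc in CIRCLE16 if abs(img[r + dr][c + dc] - p) >= T)
    return hits >= required
```

The candidate must be at least 3 pixels from the image border so that the whole circle lies inside the image.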
the second step: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
the inspection task comprises two inspection modes;
the first mode is as follows: the main control room computer can enable the inspection robot to automatically start a preset inspection task by sending a command;
and a second mode: the computer in the master control room remotely controls the inspection robot to an appointed place for inspection;
the robot needs to establish a path plan autonomously to reach an accurate inspection position in any mode;
a mathematical method, a unary linear regression model solved by the least-squares method, is adopted;
the sample regression model is set as Ŷ_i = β̂_0 + β̂_1·X_i + e_i, wherein X_i represents the independent variable at the i-th point, Y_i represents the dependent variable corresponding to X_i, e_i is the random disturbance term, and β̂_0 is the regression-equation constant term;
the residual sum of squares is Q = Σ_{i=1}^{n} (Y_i − Ŷ_i)², where Ŷ_i is the estimated value corresponding to the dependent variable Y_i; it effectively describes the path estimated by the algorithm;
to find the extreme value of the regression model that minimizes Q, the partial derivatives of Q with respect to β̂_0 and β̂_1 are set to zero; solving gives the regression model equation, and the path for the next moment is planned according to the actual situation;
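The least-squares step above can be sketched as the closed-form solution obtained from those partial derivatives. This is a generic unary linear regression fit, not the patented planner itself.

```python
def fit_line(xs, ys):
    """Unary linear regression y = b0 + b1*x by least squares: setting the
    partial derivatives of Q = sum((y_i - b0 - b1*x_i)^2) with respect to
    b0 and b1 to zero gives b1 = Sxy / Sxx and b0 = mean(y) - b1*mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)                      # S_xx
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))    # S_xy
    b1 = sxy / sxx
    return my - b1 * mx, b1
```

Fitting recent trajectory points this way yields a line along which the next waypoint can be extrapolated.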
the third step: and (3) feature matching:
the feature matching can be simply understood as relating two graphs of the same scene or of similar scenes; the corresponding nodes of the two graphs are set as:
G_1 = (V_1, E_1), G_2 = (V_1 + d_u, E_1 + d_v), where V denotes the point set and E denotes the edge set, [d_u, d_v] represents the pixel displacement, and the numbers of nodes are n_1 and n_2, i.e. |V_1| = n_1, |V_2| = n_2, with n_1 = n_2 in the general case;
wherein V_k, E_k represent the coordinates of the k-th point in G_1, and V_k + d_u, E_k + d_v represent the coordinates of the point in G_2 corresponding to the k-th point of G_1;
in order to make the pixel displacement, i.e. the increment, minimal and to reduce the computational load, the partial derivative of the above formula is taken, and after x iterations [d_u, d_v]^(x) is obtained;
G′_{1,u}, G′_{1,v} denote the image gradients of the pixel, from which the value [d_u, d_v] is solved, i.e. the pixel displacement that minimizes the error; [d_u, d_v]^(x) denotes the pixel displacement after x iterations;
when the increment is sufficiently small, the iteration stops; the minimum pixel displacement is obtained from the iterative formula, and finally the coordinates of the points of G_2 = (V_2, E_2) corresponding to the points of G_1 = (V_1, E_1) are obtained by calculation;
the fourth step: positioning:
the accuracy of the route is ensured through strict and precise path planning and feature matching; finally, correctness is verified by comparing the root mean square error (RMSE) between the real motion trajectory and the algorithm-generated trajectory, together with the motion measurements and sensor readings; the actual motion trajectory of the robot is set as X = (x_1, x_2, x_3, …, x_k) and the algorithm-generated trajectory as X̂ = (x̂_1, x̂_2, x̂_3, …, x̂_k); whether the positioning is accurate is judged from the RMSE of the two trajectories, RMSE = √((1/n)·Σ_{i=1}^{n} ‖trans(x̂_i) − trans(x_i)‖²), where trans(x̂_i) represents the offset of the algorithm-generated trajectory at the i-th point, trans(x_i) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of camera frames;
the fifth step: collecting data;
a wheeled industrial robot chassis carrying various sensing devices regularly inspects the equipment of the blast furnace tuyere platform; a temperature detector collects the surface temperature of each working component and of the related terminals; an infrared sensor builds a thermodynamic map of the inspection site, which can be compared with the data collected by the temperature detector, forming closed-loop control to ensure the accuracy of data collection; a noise detector collects on-site noise spectrum data; the camera collects instrument data; because the inspection robot can accurately position itself at each inspection point, the correlation, consistency, and accuracy of the signals it acquires are greatly assured; the collected data are transmitted via the intelligent control host through a signal exchanger to the main control room computer for analysis and processing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110225477.XA CN112947461B (en) | 2021-03-01 | 2021-03-01 | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112947461A CN112947461A (en) | 2021-06-11 |
CN112947461B true CN112947461B (en) | 2022-11-04 |
Family
ID=76246933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110225477.XA Active CN112947461B (en) | 2021-03-01 | 2021-03-01 | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112947461B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113581792B (en) * | 2021-08-04 | 2023-04-28 | 三一机器人科技有限公司 | Tray position verification method, device and tray positioning system |
CN117587181A (en) * | 2023-11-23 | 2024-02-23 | 建龙西林钢铁有限公司 | Blast furnace tuyere temperature monitoring device and control method thereof |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6362875B1 (en) * | 1999-12-10 | 2002-03-26 | Cognax Technology And Investment Corp. | Machine vision system and method for inspection, homing, guidance and docking with respect to remote objects |
CN107657640A (en) * | 2017-09-30 | 2018-02-02 | 南京大典科技有限公司 | Intelligent patrol inspection management method based on ORB SLAM |
CN208841421U (en) * | 2018-08-22 | 2019-05-10 | 上海宝宬冶金科技有限公司 | Blast-furnace tuyere automatic crusing robot |
Non-Patent Citations (3)
Title |
---|
Research on Visual SLAM Based on ORB Feature Matching; Wei Xiong; China Masters' Theses Full-text Database, Information Science and Technology; 2020-02-15; Chapter 5 * |
Research on a VSLAM-Based Mobile Robot Control System; Du Yongcheng; China Masters' Theses Full-text Database, Agricultural Science and Technology; 2021-01-15; Sections 3-4 * |
Research on a VSLAM Feature Matching Algorithm Based on an Improved ORB Algorithm; Yang Lichuang; Journal of Hebei University of Technology; 2020-04; Sections 0-1 * |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||