CN112947461A - VSLAM algorithm-based blast furnace tuyere platform inspection robot control method - Google Patents
- Publication number
- CN112947461A (application number CN202110225477.XA)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0221—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
- G05D1/0223—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
- G05D1/0253—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
Abstract
The invention provides a control method for a blast furnace tuyere platform inspection robot based on the VSLAM algorithm. The method first accurately extracts image features of the inspection site with the VSLAM algorithm and uses these features to build and continuously refine a model of the site. After an inspection task is received, an inspection route is planned; once a target location is reached, the images collected in real time are feature-matched against the model images to confirm accurate positioning. Finally, relevant data are collected by the camera, temperature sensor, infrared sensor, noise detector and combustible-gas detector carried on the robot and transmitted to the host-computer software, which analyses and processes them and builds visualized data.
Description
Technical Field
The invention belongs to the field of computer vision and electrical control, and particularly relates to a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm.
Background
The blast furnace tuyere platform is a key area of an iron-making blast furnace and of blast furnace production safety management. Blast furnace operators watch the working state of the tuyere raceway and use it as one of the important bases for judging and controlling the running condition of the furnace. This work is still done mainly by manual inspection. Because gas may leak at the tuyere platform, manual inspection carries a serious safety hazard.
With the rapid development of science and technology, inspection robots are widely used in many fields to replace or assist human beings in inspection work, especially in environments that are harsh or harmful to health, hard for people to reach, or prone to safety accidents. By application, inspection robots can be divided into autonomous inspection robots for high-voltage transmission lines, intelligent mining inspection robots, industrial plant inspection robots and so on; by form, into wheeled, tracked and rail-mounted inspection robots. An inspection robot is usually controlled from a master control room, so it can carry out preset or temporarily assigned inspection tasks in different areas and time periods of the inspection site. This makes the inspection more comprehensive, truthful and reliable, and effectively avoids human-factor interference such as missed inspections and recording errors that occur during manual inspection. Meanwhile, the robot can transmit the acquired data to upper-computer software for recording, processing and analysis to form visualized data, effectively raising the level of digital equipment management.
At present the VSLAM algorithm attracts a great deal of research attention. Its core method is to perceive the surrounding environment through optical sensors in the visible-light band (such as video cameras and depth cameras), form image data, and further extract and process those data; combined with the depth information of the scene and an estimate of the camera motion, a model of the surroundings can be constructed. For path planning, obstacles and visual landmarks identified in the environment support global planning; the planning result is sent to the processor, which finally issues driving commands to realize navigation. For feature matching, since a complete inspection map has been built in advance, the images to be matched only need to be cross-compared with the images in memory. In conclusion, VSLAM draws on the knowledge of several disciplines, including computer vision and electrical control, and is without doubt a currently active and advanced research technology.
The VSLAM algorithm adopted by the invention can obtain rich information such as images and colour. The depth camera it carries is inexpensive, light and easy to install; because it provides depth information, the captured scene is not flat but carries scale information, which greatly simplifies the modelling of the inspection site and keeps the method general even when the working environment changes greatly.
Therefore, the invention adopts a VSLAM-based control method for the blast furnace tuyere platform inspection robot, which ensures safety and efficiency while overcoming problems and defects of traditional tuyere platform inspection.
Disclosure of Invention
The widely used GPS-based inspection robot offers high positioning accuracy and a wide application range, but it can only work in the open where the signal is good, cannot operate normally in harsh environments, and easily loses position information. The lidar-based inspection robot has reliable sensing and strong robustness, but its navigation and positioning fail when the sensed environment is highly repetitive or indistinguishable; it is also costly and acquires relatively little environmental information and few features. To solve these problems, the invention provides a control method for a blast furnace tuyere platform inspection robot based on the VSLAM algorithm.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
In the initial state the inspection robot sits on the charging station of the master control room. The robot is started, performs a self-check and begins building the inspection map. If the grey value of a pixel stands out from the surrounding pixels, that pixel is set as the centre pixel P of a detection area, and a detected pixel is denoted I. The detection range is a discrete circle with a radius of 3 pixels, containing 16 sample pixels in total, and a grey threshold T is set. If the grey difference |I − P| ≥ T, the difference is too large and the point is re-sampled; otherwise the pixel is taken as a feature point and added to the model map. Through continuous sampling and comparison a high-precision inspection map is built.
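As a minimal illustration of the threshold test on the 16-pixel discrete circle (a sketch following the text's rule, not the patent's exact implementation; the image, radius and threshold values below are assumptions for the example):

```python
import math

def circle_offsets(radius=3, n=16):
    """Offsets of n sample points on a discrete circle of the given radius."""
    return [(round(radius * math.cos(2 * math.pi * k / n)),
             round(radius * math.sin(2 * math.pi * k / n))) for k in range(n)]

def is_feature_point(img, x, y, T):
    """Decide whether the centre pixel P at (x, y) is kept as a feature point.

    Following the text: if any sampled pixel I on the circle satisfies
    |I - P| >= T, the grey-level difference is too large and the point is
    re-sampled (rejected); otherwise it is added to the model map.
    """
    P = img[y][x]
    for dx, dy in circle_offsets():
        I = img[y + dy][x + dx]
        if abs(I - P) >= T:
            return False  # difference too large: re-sample this point
    return True  # accepted as a feature point
```

Note that this keeps the acceptance rule exactly as stated in the text, which is inverted relative to the classical FAST corner test.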
Let the inspection area be D. The moments of D are defined as

m_pq = Σ_{(x,y)∈D} x^p y^q I(x, y),  p, q ∈ {0, 1},

where I(x, y) is the grey value at point (x, y); m_pq is the moment of the neighbourhood pixels of feature point P.

The centroid of the inspection area D is

C = ( m10/m00 , m01/m00 ),

where m10, m01 and m00 are the key node coordinates in area D. A vector OC is then constructed, where O is the geometric centre and C is the centroid.

The angle between the feature point and the centroid is defined as θ = arctan(m01/m10). Once the direction of the area is determined, the brighter points in the area are sampled to obtain n feature points (x1, y1), ..., (xn, yn), and the feature description matrix is

A = [ x1 x2 ... xn ; y1 y2 ... yn ].

Rotating A by the angle α gives the rotated matrix R_α A, with the rotation matrix

R_α = [ cos α  −sin α ; sin α  cos α ];

using the rotation matrix simplifies the calculation. Solving R_α for θ gives the principal direction of the feature points in area D; the colour and grey-level feature values are then collected, completing the feature extraction.
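The intensity-centroid computation above (moments m_pq, centroid C and orientation angle) can be sketched as follows; the small synthetic patch in the usage is an assumed input, and `atan2` is used as the numerically robust form of arctan(m01/m10):

```python
import math

def moments(patch):
    """Moments m_pq = sum over D of x^p * y^q * I(x, y), for p, q in {0, 1}."""
    m = {}
    for p in (0, 1):
        for q in (0, 1):
            m[(p, q)] = sum((x ** p) * (y ** q) * patch[y][x]
                            for y in range(len(patch))
                            for x in range(len(patch[0])))
    return m

def centroid_angle(patch):
    """Centroid C = (m10/m00, m01/m00) and orientation theta = atan2(m01, m10)."""
    m = moments(patch)
    C = (m[(1, 0)] / m[(0, 0)], m[(0, 1)] / m[(0, 0)])
    theta = math.atan2(m[(0, 1)], m[(1, 0)])
    return C, theta
```

For a uniform 2x2 patch the centroid is its geometric centre (0.5, 0.5) and the orientation is 45 degrees.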
The second step is that: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
the robot needs to establish a path plan autonomously to reach an accurate inspection position in any mode;
a mathematical method of a unary linear regression model by a least square method is adopted,
the sample regression model is set as:wherein: y represents a dependent variable, X is an independent variable, eiIn order to randomly perturb the terms of the disturbance,is a constant term of a regression equation.
To find a conforming regression modelThe extreme value of (2) can be obtained by calculating partial derivatives, and the regression model equation can be obtained by planning the path of the next moment according to the actual situation;
the method is simple in calculation, stable in estimation value and sensitive to abnormal values, and path precision is effectively guaranteed.
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be G1 = (V1, E1) and G2 = (V2, E2), where V denotes the vertex set and E the edge set, with node counts n1 and n2, i.e. |V1| = n1 and |V2| = n2; in the general case n1 = n2. An assignment matrix X ∈ {0, 1}^(n1×n2) describes the mapping between corresponding nodes of the two graph structures. Under its constraints the matching value is maximized:

max Σ_{i,j} c_{i,j} X_{i,j} + Σ_{i,j,k,l} d_{i,j,k,l} X_{i,j} X_{k,l},

where c_{i,j} is the similarity between nodes, d_{i,j,k,l} the similarity between edges, and X_{i,j}, X_{k,l} represent the matching relations between nodes. If X_{i,j} = 1 in the assignment matrix, the i-th node of G1 matches the j-th node of G2; if X_{i,j} = 0, the two nodes are unrelated. Each node of G1 may match at most one node of G2; one-to-many matches are not allowed.
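The matching objective can be illustrated with a tiny sketch: given node similarities c and edge similarities d, the score of an assignment matrix X is the node term plus the edge term. All similarity values below are assumptions for the example:

```python
def match_value(X, c, d):
    """Score of assignment matrix X: sum c[i][j]*X[i][j] + sum d[i,j,k,l]*X[i][j]*X[k][l].

    X[i][j] = 1 means node i of G1 matches node j of G2; each row of X
    contains at most one 1, so one-to-many matches cannot occur.
    """
    n1, n2 = len(X), len(X[0])
    node_term = sum(c[i][j] * X[i][j] for i in range(n1) for j in range(n2))
    edge_term = sum(d[(i, j, k, l)] * X[i][j] * X[k][l]
                    for (i, j, k, l) in d)
    return node_term + edge_term
```

With the identity assignment on two nodes, node similarities 1 and 2, and one edge similarity of 3 between the matched pairs, the score is 6.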
Directly matching the point coordinates of G1 = (V1, E1) against G2 by global search involves an extremely large amount of computation, places high demands on the processor, and easily causes delays and downtime. Therefore set G1 = (V1, E1) and G2 = (V1 + d_u, E1 + d_v), where (d_u, d_v) denotes the pixel displacement; feature matching with this incremental algorithm greatly increases the operation speed, reducing the computational load and the processing time.

Here V_k, E_k denote the coordinates of the k-th point in graph G1, and [V_k + d_u, E_k + d_v] the coordinates of the corresponding point in G2. G′_{1,u}, G′_{1,v} denote the derivatives with respect to the pixel coordinates; solving the extremum over [d_u, d_v] means finding the pixel displacement (d_u, d_v) that minimizes the error, and (d_u, d_v)^(x) denotes the pixel displacement after x iterations.
Writing the above in gradient-matrix (least-squares) form, the iteration proceeds step by step from an initial value (d_u, d_v)^(0). The error after x iterations is

e_x = G2(V_k + d_u^(x), E_k + d_v^(x)) − G1(V_k, E_k),

where (d_u, d_v)^(x) is the value after x iterations. Defining the update direction Δ_x of the iteration error, the pixel displacement can be expressed as

(d_u, d_v)^(x+1) = (d_u, d_v)^(x) + C·Δ_x  (C is the cost matrix).

When the error e_x is sufficiently small the iteration stops, yielding the pixel displacement of G1 = (V1, E1); from it the location of each feature point of G1 = (V1, E1) in G2 = (V2, E2) is calculated, and the feature matching is completed quickly.
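A minimal stand-in for the incremental matching step: starting from (d_u, d_v) = (0, 0), the displacement is refined one pixel at a time in whichever direction lowers the error, stopping when no neighbouring displacement improves it. This greedy local search over a sum-of-squared-differences error is only an illustration of the iterative idea, not the patent's exact gradient iteration:

```python
def ssd_error(img1, img2, du, dv):
    """Sum of squared differences between img1 and img2 shifted by (du, dv)."""
    h, w = len(img1), len(img1[0])
    err = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dv, x + du
            if 0 <= y2 < h and 0 <= x2 < w:   # only compare the overlap
                err += (img2[y2][x2] - img1[y][x]) ** 2
    return err

def estimate_displacement(img1, img2, max_iters=50):
    """Greedily refine (du, dv) until the error no longer decreases."""
    du = dv = 0
    err = ssd_error(img1, img2, du, dv)
    for _ in range(max_iters):
        improved = False
        for ddu, ddv in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            e = ssd_error(img1, img2, du + ddu, dv + ddv)
            if e < err:
                err, du, dv, improved = e, du + ddu, dv + ddv, True
        if not improved:
            break   # error e_x is minimal: stop the iteration
    return du, dv
```

On a smooth test pattern shifted by one pixel in each direction, the search recovers the displacement (1, 1).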
The fourth step: positioning:
in fact, for a robot carrying a sensor to move in an unknown environment, a time of a continuous movement is usually changed into discrete time t ═ 1, 2, 3.. k, and in these time, X ═ 3.. k1,x2,x3...xn) Representing the real trajectory of the robot, which constitutes the actual trajectory of the robot. On the aspect of a map, at each moment, the sensors measure a part of map features and obtain observation data of the map features so as to calculate corresponding track coordinatesTrajectory coordinates generated for the algorithm. The positioning problem is solved by comparing the Root Mean Square Error (RMSE) of the true motion trajectory with the algorithmically generated trajectory with the motion measurements and sensor readings.
Represents the offset of the algorithm-generated trajectory at the ith point, trans (x)i) And the offset of the actual motion track at the ith position is shown, and n represents the number of cameras.
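The RMSE comparison can be sketched directly from the formula; the trajectories in the test are assumed 2-D positions, and trans(.) is taken here as simply the translation part of each pose:

```python
import math

def trajectory_rmse(true_traj, est_traj):
    """RMSE = sqrt( (1/n) * sum_i || trans(x_hat_i) - trans(x_i) ||^2 ).

    true_traj: list of (x, y) true positions x_i
    est_traj:  list of (x, y) algorithm-generated positions x_hat_i
    """
    n = len(true_traj)
    total = sum((ex - tx) ** 2 + (ey - ty) ** 2
                for (tx, ty), (ex, ey) in zip(true_traj, est_traj))
    return math.sqrt(total / n)
```

An estimated trajectory offset from the truth by one unit at every pose gives an RMSE of exactly 1.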
The fifth step: collecting data;
collecting the surface temperature of each working element and the surface temperature of the related binding post through a temperature detector; an inspection place thermodynamic diagram is established through an infrared sensor and can be compared with data collected by a temperature detector, and closed-loop control is formed to ensure the accuracy of data collection; collecting field noise frequency spectrum data through a noise detector; collecting a concentration value of the on-site dangerous gas through a combustible gas detector; the camera begins to gather instrument data, because the robot that patrols and examines can the accurate positioning patrol and examine the position, because realized patrolling and examining the accurate positioning of position, so patrol and examine the correlation, the uniformity and the accurate performance of the signal that the robot gathered and obtain very big assurance, the data of gathering are passed through the intelligent control host computer and are passed through the signal interchanger and transmit for the master control room computer, carry out analysis and processing.
As a further improvement of the invention, the inspection task comprises two inspection modes;
the first mode is as follows: the main control room computer can enable the inspection robot to automatically start a preset inspection task by sending a command;
and a second mode: and the computer in the master control room remotely controls the inspection robot to an appointed place for inspection.
Compared with the prior art, the invention has the following advantages:
(1) The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm can accurately locate the real-time position of the robot, and that position can be seen clearly in the upper-computer software. This effectively improves the positioning precision of the robot and the operator's awareness of the robot's whereabouts.
(2) The VSLAM algorithm adopted by the invention has the advantages of being open-source and easy to program; it can measure speed and distance, quickly identify map coordinates, and achieve long-term, high-precision positioning of the robot.
(3) Compared with the GPS and the laser radar, the invention has the characteristics of richer visual information, low hardware cost, wide visual range and the like.
Drawings
FIG. 1 is a schematic diagram of the principles of the present invention;
FIG. 2 is a schematic diagram of the VSLAM algorithm;
fig. 3 is a robot inspection flow chart.
Detailed Description
The present invention is further illustrated by the following specific examples, which are implemented on the basis of the technical scheme of the invention. It should be understood that these examples serve only to illustrate the invention and are not intended to limit its scope.
The invention provides a control method for a blast furnace tuyere platform inspection robot based on the VSLAM algorithm. Image features of the inspection site are first extracted and used to build and continuously refine a model of the site. After an inspection task is received, an inspection route is planned; once a target location is reached, the images collected in real time are feature-matched against the model images to confirm accurate positioning. Finally, relevant data are collected by the camera, temperature sensor, infrared sensor, noise detector and combustible-gas detector carried on the robot and transmitted to the host-computer software, which analyses and processes them and builds visualized data.
The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm is illustrated by the schematic diagram of the invention in Fig. 1, the schematic diagram of the VSLAM algorithm in Fig. 2 and the robot inspection flow chart in Fig. 3. The specific embodiment is as follows.
The first step: establishing the inspection map. In the initial state the inspection robot sits on the charging station of the master control room. The robot is started, performs a self-check and builds the inspection map. Let the centre pixel of the detection area be P and a detected pixel be I; the detection range is a discrete circle a few pixels in radius, and a grey threshold T is set. If the grey difference |I − P| ≥ T, the sample is rejected and re-acquired; otherwise the sampled point is added to the model map.
The second step is that: path planning: after the modeling of the inspection place is completed, the robot can start to perform inspection tasks, and the invention is provided with two inspection modes. The first mode is as follows: the main control room computer can send a command to enable the inspection robot to automatically start a preset inspection task. And a second mode: and the computer in the master control room remotely controls the inspection robot to an appointed place for inspection. In either case, the robot needs to autonomously establish a path plan to reach the accurate inspection position.
A unary linear regression model fitted by the least-squares method is adopted.
the sum of the squares of the residuals is:to find a conforming regression modelThe extreme value of (2) is obtained by calculating a partial derivative of the extreme value. And solving the regression model equation to plan the path of the next moment according to the actual situation.
XiAn argument representing the ith point, YiIs represented by the formula XiDependent variable of interest, eiIn order to randomly perturb the terms of the disturbance,is a constant term of a regression equation.
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be G1 = (V1, E1) and G2 = (V2, E2), with node counts n1 and n2, i.e. |V1| = n1 and |V2| = n2; in the general case n1 = n2. An assignment matrix X ∈ {0, 1}^(n1×n2) describes the mapping between corresponding nodes of the two graph structures. Under its constraints the matching value is maximized, the match value function being

max Σ_{i,j} c_{i,j} X_{i,j} + Σ_{i,j,k,l} d_{i,j,k,l} X_{i,j} X_{k,l},

where c_{i,j} is the similarity between nodes, d_{i,j,k,l} the similarity between edges, and X_{i,j}, X_{k,l} represent the matching relations between nodes. If X_{i,j} = 1 in the assignment matrix, the i-th node of G1 matches the j-th node of G2; if X_{i,j} = 0, the two nodes are unrelated. Each node of G1 may match at most one node of G2; one-to-many matches are not allowed.
Directly matching the point coordinates of G1 = (V1, E1) against G2 by global search involves an extremely large amount of computation, places high demands on the processor, and easily causes delays and downtime. Therefore set G1 = (V1, E1) and G2 = (V1 + d_u, E1 + d_v), where (d_u, d_v) denotes the pixel displacement; feature matching with this incremental algorithm greatly increases the operation speed, reducing the computational load and shortening the processing time.

Here V_k, E_k denote the coordinates of the k-th point in graph G1, and V_k + d_u, E_k + d_v the coordinates of the corresponding point in G2. G′_{1,u}, G′_{1,v} denote the derivatives with respect to the pixel coordinates; solving over [d_u, d_v] means finding the pixel displacement (d_u, d_v) that minimizes the error, and (d_u, d_v)^(x) denotes the pixel displacement after x iterations.

Writing the above in gradient-matrix (least-squares) form, the iteration proceeds step by step from an initial value (d_u, d_v)^(0). The error after x iterations is

e_x = G2(V_k + d_u^(x), E_k + d_v^(x)) − G1(V_k, E_k),

where (d_u, d_v)^(x) is the value after x iterations. The update direction Δ_x of the iteration error is obtained, so the pixel displacement can be expressed as

(d_u, d_v)^(x+1) = (d_u, d_v)^(x) + C·Δ_x  (C is the cost matrix).

When the error e_x is sufficiently small the iteration stops, yielding the pixel displacement of G1 = (V1, E1); from it the location of each feature point of G1 = (V1, E1) in G2 = (V2, E2) is calculated, and the feature matching is completed quickly.
The fourth step: positioning:
in fact, for a robot carrying a sensor to move in an unknown environment, a time of a continuous movement is usually changed into discrete time t ═ 1, 2, 3.. k, and in these time, X ═ 3.. k1,x2,x3...xn) Representing the real trajectory of the robot, which constitutes the actual trajectory of the robot. On the aspect of a map, at each moment, the sensors measure a part of map features and obtain observation data of the map features so as to calculate corresponding track coordinatesTrajectory coordinates generated for the algorithm. The positioning problem is solved by comparing the Root Mean Square Error (RMSE) of the true motion trajectory with the algorithmically generated trajectory with the motion measurements and sensor readings.
RMSE(X, X̂) = sqrt( (1/n) · Σᵢ ‖trans(x̂ᵢ) − trans(xᵢ)‖² ), where trans(x̂ᵢ) represents the offset of the algorithm-generated trajectory at the i-th point, trans(xᵢ) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of camera poses.
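The RMSE comparison can be sketched as follows; this is a minimal illustration assuming the trajectories are given as arrays of 2D or 3D positions, and the function name is ours, not the patent's.

```python
import numpy as np

def trajectory_rmse(actual, estimated):
    """Root-mean-square error between the robot's actual trajectory
    and the algorithm-generated trajectory:
    RMSE = sqrt(mean over i of ||trans(x_hat_i) - trans(x_i)||^2).
    Both inputs are (n, 2) or (n, 3) arrays of positions."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    offsets = np.linalg.norm(estimated - actual, axis=1)  # per-point offset
    return float(np.sqrt(np.mean(offsets ** 2)))
```

A small RMSE indicates the positioning is accurate; a large one flags a mismatch between the planned and actual inspection route.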
The fifth step: collecting data. A wheeled industrial robot chassis carrying various sensing devices regularly inspects the equipment of the blast furnace tuyere platform. The surface temperature of each working element and of the related binding posts is collected by a temperature detector; a thermodynamic diagram of the inspection site is established by an infrared sensor and compared with the temperature detector's data, forming closed-loop control that ensures the accuracy of the data collection; field noise spectrum data are collected by a noise detector; the concentration of dangerous gases on site is collected by a combustible gas detector; and the camera collects the instrument data. Because the inspection robot can accurately position the inspection location, the correlation, consistency, and accuracy of the signals it acquires are strongly guaranteed. The collected data are transmitted by the intelligent control host through the signal exchanger to the main control room computer, where they are analysed, processed, and visualised.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements should also be regarded as falling within the protection scope of the present invention.
Claims (2)
1. A control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
the inspection robot is located on the main control room charging station in its initial state; the robot is started, performs a self-check, and establishes the inspection map, extracting features of the inspection site through the VSLAM algorithm; if the gray value of a certain pixel in the image differs sufficiently from the gray values of enough pixels in its surrounding neighborhood, that pixel is set as the central pixel P of a detection area; the detected pixel is I, and the 16 pixels on a circle of radius 3 pixels form the detection range; the gray threshold is set to T; if the gray difference |I − P| of a detected pixel is ≥ T, a sampling error is indicated and the data are re-acquired; otherwise the sampling point is established in the model map;
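The neighbourhood test in this step follows the layout of FAST-style corner detectors: 16 pixels on a circle of radius 3 around the centre pixel P, compared against a grey threshold T. A minimal sketch of the |I − P| ≥ T count is given below; the circle offsets and function name are illustrative assumptions, not taken from the claim.

```python
import numpy as np

# Offsets of the 16 pixels on a circle of radius 3 around the centre
# pixel P, as used by FAST-style detectors (assumed layout).
CIRCLE16 = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1),
            (2, 2), (1, 3), (0, 3), (-1, 3), (-2, 2), (-3, 1),
            (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def circle_diff_count(img, x, y, T):
    """Count how many of the 16 circle pixels I differ from the centre
    pixel P at (x, y) by at least the grey threshold T (|I - P| >= T)."""
    p = int(img[y, x])
    return sum(abs(int(img[y + dy, x + dx]) - p) >= T
               for dx, dy in CIRCLE16)
```

The claim uses this comparison both to select the centre pixel P and to decide whether a sample is accepted into the model map or re-acquired.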
the second step is that: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
the robot needs to establish a path plan autonomously to reach an accurate inspection position in any mode;
a mathematical method, a unary linear regression model solved by the least squares method, is adopted;
the sample regression model is set as Yᵢ = β̂₀ + β̂₁Xᵢ + eᵢ, wherein: Xᵢ represents the independent variable at the i-th point, Yᵢ represents the dependent variable corresponding to Xᵢ, eᵢ is the random disturbance term, and β̂₀ is the constant term of the regression equation.
The residual sum of squares is Q = Σ eᵢ² = Σ (Yᵢ − β̂₀ − β̂₁Xᵢ)²; the fitted value Ŷᵢ of the dependent variable Yᵢ effectively describes the path estimated by the algorithm.
To find the extreme value of Q for a conforming regression model, the partial derivatives with respect to β̂₀ and β̂₁ are set to zero; from the resulting regression equation, the path for the next moment is planned according to the actual situation;
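Setting the partial derivatives of the residual sum of squares to zero gives the familiar closed-form least-squares estimates, sketched below; the function name is illustrative.

```python
import numpy as np

def fit_line(x, y):
    """Least-squares fit of the unary linear regression model
    y_i = b0 + b1 * x_i + e_i used in the path-planning step.
    Setting dQ/db0 = 0 and dQ/db1 = 0 yields the closed form below."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # slope: covariance of x and y over variance of x
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    # intercept: regression line passes through the sample means
    b0 = y.mean() - b1 * x.mean()
    return b0, b1
```

The fitted line can then be evaluated at the next time step to extrapolate the inspection path.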
the third step: and (3) feature matching:
feature matching based on the VSLAM algorithm effectively ensures the accuracy of the inspection route and inspection position; feature matching can be simply understood as two graphs of the same scene or similar scenes that have some correspondence, generally a nonlinear one; the corresponding nodes of the two graphs are set as:
G1=(V1,E1) and G2=(V1+du, E1+dv), where V denotes the point set, E denotes the edge set, and (du, dv) denotes the pixel displacement; the numbers of nodes are n1 and n2, i.e. |V1|=n1, |V2|=n2, and in the general case n1=n2;
Wherein: vk,EkRepresents G1The coordinates of the kth point in the figure. Vk+du,E1+dvRepresents G1At the k-th point in (1) and G2Coordinates of the medium-sized corresponding relation point.
in order to ensure that the pixel displacement, i.e. the increment, is minimal and to reduce the calculation load, the partial derivative of the above formula is taken, giving [du, dv]^(x) after x iterations;
G′1,u and G′1,v represent the partial derivatives of G1 with respect to u and v; they are used to solve for the value of [du, dv], i.e. to find the pixel displacement that minimises the residual, and [du, dv]^(x) represents the pixel displacement after x iterations;
when the increment of the iteration formula is sufficiently small, the iteration stops, and from the minimal pixel displacement [du, dv] obtained by the iteration formula, the coordinates of the points of G2=(V2,E2) corresponding to the points of G1=(V1,E1) are finally calculated;
the fourth step: positioning:
the accuracy of the route is guaranteed by strict and accurate path planning and feature matching; finally, correctness is determined by comparing the root mean square error (RMSE) of the real motion trajectory and the algorithm-generated trajectory, together with the motion measurements and sensor readings; the actual motion trajectory of the robot is set as X = (x1, x2, x3, ..., xn) and the motion trajectory generated by the algorithm as X̂ = (x̂1, x̂2, ..., x̂n); whether the positioning is accurate is judged from RMSE(X, X̂) = sqrt((1/n)·Σᵢ‖trans(x̂ᵢ) − trans(xᵢ)‖²), where trans(x̂ᵢ) represents the offset of the algorithm-generated trajectory at the i-th point, trans(xᵢ) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of camera poses.
The fifth step: collecting data;
a wheeled industrial robot chassis carrying various sensing devices regularly inspects the equipment of the blast furnace tuyere platform; the surface temperature of each working component and of the related binding posts is collected by a temperature detector; a thermodynamic diagram of the inspection site is established by an infrared sensor and compared with the temperature detector's data, forming closed-loop control that ensures the accuracy of the data collection; field noise spectrum data are collected by a noise detector; the camera collects the instrument data; because the inspection robot can accurately position the inspection location, the correlation, consistency, and accuracy of the signals it acquires are strongly guaranteed; the collected data are transmitted by the intelligent control host through the signal exchanger to the main control room computer for analysis and processing.
2. The VSLAM algorithm-based blast furnace tuyere platform inspection robot control method according to claim 1, characterized in that: the inspection task comprises two inspection modes;
the first mode is as follows: the main control room computer can enable the inspection robot to automatically start a preset inspection task by sending a command;
and a second mode: and the computer in the master control room remotely controls the inspection robot to an appointed place for inspection.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110225477.XA CN112947461B (en) | 2021-03-01 | 2021-03-01 | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112947461A true CN112947461A (en) | 2021-06-11 |
CN112947461B CN112947461B (en) | 2022-11-04 |
Family
ID=76246933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110225477.XA Active CN112947461B (en) | 2021-03-01 | 2021-03-01 | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112947461B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113581792A (en) * | 2021-08-04 | 2021-11-02 | 三一机器人科技有限公司 | Tray position checking method and device and tray positioning system |
CN117587181A (en) * | 2023-11-23 | 2024-02-23 | 建龙西林钢铁有限公司 | Blast furnace tuyere temperature monitoring device and control method thereof |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6362875B1 (en) * | 1999-12-10 | 2002-03-26 | Cognax Technology And Investment Corp. | Machine vision system and method for inspection, homing, guidance and docking with respect to remote objects |
CN107657640A (en) * | 2017-09-30 | 2018-02-02 | 南京大典科技有限公司 | Intelligent patrol inspection management method based on ORB SLAM |
CN208841421U (en) * | 2018-08-22 | 2019-05-10 | 上海宝宬冶金科技有限公司 | Blast-furnace tuyere automatic crusing robot |
Non-Patent Citations (3)
Title |
---|
杜永程: "基于VSLAM的移动机器人控制系统研究" (Research on a VSLAM-Based Mobile Robot Control System), China Master's Theses Full-text Database, Agricultural Science and Technology series *
杨立闯: "基于改进ORB算法的VSLAM特征匹配算法研究" (Research on a VSLAM Feature Matching Algorithm Based on an Improved ORB Algorithm), Journal of Hebei University of Technology *
魏雄: "基于ORB特征匹配的视觉SLAM研究" (Research on Visual SLAM Based on ORB Feature Matching), China Master's Theses Full-text Database, Information Science and Technology series *
Also Published As
Publication number | Publication date |
---|---|
CN112947461B (en) | 2022-11-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Kim et al. | SLAM-driven robotic mapping and registration of 3D point clouds | |
CN112947461B (en) | VSLAM algorithm-based blast furnace tuyere platform inspection robot control method | |
CN104062973A (en) | Mobile robot SLAM method based on image marker identification | |
WO2019136714A1 (en) | 3d laser-based map building method and system | |
CN113189977B (en) | Intelligent navigation path planning system and method for robot | |
CN110136186B (en) | Detection target matching method for mobile robot target ranging | |
Safin et al. | Evaluation of visual slam methods in usar applications using ros/gazebo simulation | |
Tsubouchi | Introduction to simultaneous localization and mapping | |
Shim et al. | Remote robotic system for 3D measurement of concrete damage in tunnel with ground vehicle and manipulator | |
Zhang et al. | Factor graph-based high-precision visual positioning for agricultural robots with fiducial markers | |
Pan et al. | Sweeping robot based on laser SLAM | |
Gao et al. | Fully automatic large-scale point cloud mapping for low-speed self-driving vehicles in unstructured environments | |
Fasiolo et al. | Comparing LiDAR and IMU-based SLAM approaches for 3D robotic mapping | |
Sujiwo et al. | Localization based on multiple visual-metric maps | |
Chen et al. | SCL-SLAM: A scan context-enabled LiDAR SLAM using factor graph-based optimization | |
Wang | Autonomous mobile robot visual SLAM based on improved CNN method | |
CN114721377A (en) | Improved Cartogrier based SLAM indoor blind guiding robot control method | |
Gao et al. | A new method for repeated localization and matching of tunnel lining defects | |
Kornilova et al. | Evops benchmark: evaluation of plane segmentation from RGBD and LiDAR data | |
Bayer et al. | On construction of a reliable ground truth for evaluation of visual slam algorithms | |
Peng et al. | Dynamic Visual SLAM Integrated with IMU for Unmanned Scenarios | |
Albrecht et al. | Mapping and automatic post-processing of indoor environments by extending visual slam | |
Liang et al. | An Accurate Visual Navigation Method for Wheeled Robot in Unstructured Outdoor Environment Based on Virtual Navigation Line | |
Park et al. | Mobile robot navigation based on direct depth and color-based environment modeling | |
Radzi et al. | Visual-based and Lidar-based SLAM Study for Outdoor Environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||