CN112947461A - VSLAM algorithm-based blast furnace tuyere platform inspection robot control method - Google Patents


Info

Publication number: CN112947461A (application CN202110225477.XA)
Authority: CN (China)
Prior art keywords: inspection, robot, algorithm, pixel, point
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112947461B
Inventors: Hu Xingliu (胡兴柳), Gu Haihua (顾海华), Fang Ting (方挺), Si Haifei (司海飞)
Assignee (current and original): Jinling Institute of Technology

Events:
Application filed by Jinling Institute of Technology
Priority to CN202110225477.XA
Publication of CN112947461A
Application granted
Publication of CN112947461B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 ... with means for defining a desired trajectory
    • G05D1/0214 ... in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 ... involving a learning process
    • G05D1/0223 ... involving speed control of the vehicle
    • G05D1/0231 ... using optical position detecting means
    • G05D1/0242 ... using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 ... using a video camera in combination with image processing means
    • G05D1/0253 ... extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0257 ... using a radar
    • G05D1/0276 ... using signals provided by a source external to the vehicle
    • G05D1/0278 ... using satellite positioning signals, e.g. GPS

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm. The method first accurately extracts image features of the inspection site with the VSLAM algorithm and uses these features to continuously build and refine a model of the site. On receiving an inspection task, the robot plans an inspection route; after reaching the target location, it matches features between the image collected in real time and the modeling image to ensure accurate positioning. Finally, relevant data are collected through the camera, temperature sensor, infrared sensor, noise detector, and combustible gas detector carried by the robot, and transmitted to host-computer software for analysis, processing, and visualization.

Description

VSLAM algorithm-based blast furnace tuyere platform inspection robot control method
Technical Field
The invention belongs to the field of computer vision and electrical control, and particularly relates to a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm.
Background
The blast furnace tuyere platform is a key area of the iron-making blast furnace and a key area for production safety management in blast furnace iron-making. Blast furnace operators pay close attention to the working state of the tuyere raceway and use it as one of the important bases for judging and controlling the operating condition of the furnace. This work is mainly completed by manual inspection. Because coal gas may leak at the tuyere platform, manual inspection carries great safety hazards.
With the rapid development of science and technology, inspection robots are widely applied in many fields to replace or assist humans in inspection work, especially in environments that are hard for humans to endure, harsh and harmful to health, or prone to safety accidents. By application, inspection robots can be divided into autonomous inspection robots for high-voltage transmission lines, intelligent mining inspection robots, industrial plant inspection robots, and so on. By form, they can be divided into wheeled, tracked, and rail-mounted inspection robots. An inspection robot is usually controlled from a master control room, so it can carry out inspections in different periods and different areas of the site according to preset or temporarily assigned tasks. This makes the inspection more comprehensive, true, and reliable, and effectively avoids human-factor interference such as missed inspections and recording errors during manual inspection. Meanwhile, the inspection robot can transmit the acquired data to host-computer software for recording, processing, and analysis to form visualized data, effectively improving the level of digital equipment management.
At present, the VSLAM algorithm attracts a great deal of research attention. Its core method is to perceive the surrounding environment in the visible-light band through optical sensors (such as a video camera or a depth camera), form image data, extract and process that data, and construct a model of the surroundings by combining the depth information of the scene with camera motion estimation. For path planning, global planning can be performed using the recognition of obstacles in the environment and the distribution of visual landmarks; the planning result is then sent to the processor, and finally driving commands are issued to realize navigation. For feature matching, since a complete inspection map is established in advance, the images to be matched only need to be cross-compared with the images in memory. In conclusion, VSLAM draws on the intersection of multiple disciplines such as computer vision and electrical control, and is undoubtedly a popular and advanced research technology.
The VSLAM algorithm used by the invention can obtain richer information such as images and color. Moreover, because a depth camera is carried, which is inexpensive, light, and easy to install, the depth information of objects can be obtained, so the captured scene is not flat in view but carries scale information. This greatly simplifies the modeling of the inspection site and gives the method generality in the face of large changes in the working environment.
Therefore, the invention adopts a control method for the blast furnace tuyere platform inspection robot based on the VSLAM algorithm, which ensures both safety and efficiency while overcoming problems and defects of traditional blast furnace tuyere platform inspection.
Disclosure of Invention
The widely used GPS-based inspection robots, although high in positioning precision and wide in application range, can only work in open outdoor areas with good signal, cannot operate normally in harsh environments, and easily lose position information. Inspection robots based on lidar have advantages such as reliable sensing information and strong robustness, but when the sensed environment is highly repetitive or indistinguishable, navigation and positioning cannot work normally; they are also costly and acquire relatively little environmental information and few features. To solve these problems, the invention provides a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
The inspection robot sits on the charging station in the main control room in its initial state. The robot is started, performs a self-check, and begins establishing the map. If the gray value of a pixel in the image is more prominent than the surrounding pixels, that pixel is set as the center pixel P of a detection area, and a detected pixel is denoted I. The detection range is a discrete circle with a radius of 3 pixels, 16 pixels in total. A gray threshold T is set: if the gray-value difference |I − P| of a detected pixel is greater than or equal to T, the difference is too large and the point is resampled; otherwise, the pixel is taken as a feature point and entered into the model map. Through continuous sampling and comparison, a high-precision inspection map is established.
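The threshold test on the radius-3 discrete circle can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the image is assumed to be a grayscale NumPy array, `is_candidate_feature` is a hypothetical helper name, and the accept/reject sense follows the patent's wording above (a difference at or above T means the sample is rejected and re-collected).

```python
import numpy as np

# Offsets of the 16 pixels on a discrete circle of radius 3 (the
# Bresenham circle used by FAST-style detectors, as in the text).
CIRCLE16 = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
            (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_candidate_feature(img: np.ndarray, y: int, x: int, T: float) -> bool:
    """Gray-threshold test from the text: compare the center pixel P
    with each detected pixel I on the radius-3 discrete circle."""
    P = float(img[y, x])
    diffs = [abs(float(img[y + dy, x + dx]) - P) for dx, dy in CIRCLE16]
    # Per the patent's wording, any difference >= T rejects the sample;
    # otherwise the point is entered into the model map.
    return all(d < T for d in diffs)
```

On a uniform patch every difference is zero, so the point is accepted; a single bright pixel on the circle rejects it.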
Let the inspection area be D. The moment of D is defined as:

m_pq = Σ_{(x,y)∈D} x^p y^q I(x, y),  p, q ∈ {0, 1}

where I(x, y) is the gray value at point (x, y); m_pq is the moment of the neighborhood pixels of feature point P.
The centroid of the inspection area D is:

C = ( m10 / m00 , m01 / m00 )

where m10, m01, m00 are the key-node moments in area D. Taking O as the geometric center of the area and C as the centroid, the vector OC is constructed. The angle between the feature point and the centroid is defined as

θ = arctan( m01 / m10 )
After the area direction is determined, brighter points in the area are sampled and n feature points are acquired, with (x_n, y_n) representing the feature-point set. The feature description matrix A is:

A = [ x1  x2  ...  xn
      y1  y2  ...  yn ]

Rotating A by the angle α yields the rotation matrix R_α, which simplifies the calculation, where

R_α = [ cos α  −sin α
        sin α   cos α ]
Solving R_α at the orientation θ of region D yields the principal direction of the feature points; the color and gray-level feature values are then collected and feature extraction is complete.
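The moment, centroid, and orientation computation above is the intensity-centroid scheme also used by the ORB descriptor; a minimal sketch under that reading, with `patch_orientation` a hypothetical helper and the patch assumed to be a grayscale NumPy array:

```python
import numpy as np

def patch_orientation(patch: np.ndarray) -> float:
    """Orientation of a grayscale patch from the moments m00, m10, m01
    defined in the text: the angle of the vector from the geometric
    center O to the intensity centroid C."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Put the origin at the geometric center O of the patch.
    xs -= (w - 1) / 2.0
    ys -= (h - 1) / 2.0
    m00 = patch.sum()            # total intensity (unused by arctan2,
    m10 = (xs * patch).sum()     # kept to mirror the text's moments)
    m01 = (ys * patch).sum()
    # C = (m10/m00, m01/m00); arctan2 gives the angle of OC robustly.
    return float(np.arctan2(m01, m10))
```

A patch brighter on its right half has orientation 0; one brighter on its lower rows (y growing downward) has orientation π/2.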
The second step is that: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
The robot needs to autonomously establish a path plan to reach the accurate inspection position in either mode;
A unary linear regression model fitted by the method of least squares is adopted. The sample regression model is set as:

Y_i = β0 + β1 X_i + e_i

where Y is the dependent variable, X the independent variable, e_i the random disturbance term, and β0, β1 the constant term and slope coefficient of the regression equation.
The sum of squared residuals is:

Q = Σ_{i=1}^{n} (Y_i − Ŷ_i)²

where Ŷ_i is the estimated value corresponding to the dependent variable Y_i.
To obtain the regression coefficients, the extremum of Q is found by setting its partial derivatives to zero, giving the normal-equation solution:

β1 = Σ (X_i − X̄)(Y_i − Ȳ) / Σ (X_i − X̄)²
β0 = Ȳ − β1 X̄

The resulting regression equation is used to plan the path for the next moment according to the actual situation.
the method is simple in calculation, stable in estimation value and sensitive to abnormal values, and path precision is effectively guaranteed.
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be G1 = (V1, E1) and G2 = (V2, E2), where V denotes the vertex set and E the edge set, with node counts n1 and n2, i.e. |V1| = n1, |V2| = n2; in the general case n1 = n2. An assignment matrix describes the mapping between corresponding nodes of the two graph structures; it is set as

X ∈ {0, 1}^(n1×n2)

Under its constraints the matching value is maximized; the match-value function is expressed as:

max Σ_{i,j} c_{i,j} X_{i,j} + Σ_{i,j,k,l} d_{i,j,k,l} X_{i,j} X_{k,l}
where c_{i,j} is the similarity between nodes and d_{i,j,k,l} the similarity between edges, while X_{i,j} and X_{k,l} represent matching relationships between nodes. If X_{i,j} = 1 in the assignment matrix, the i-th node of G1 matches the j-th node of G2; if X_{i,j} = 0, the two nodes are unrelated. Each node of G1 should match at most one node of G2; one-to-many matches are not allowed.
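The one-to-one constraint and the node-similarity term of the match-value function can be illustrated by a brute-force search; this is a toy sketch for small n only, not the patent's solver (real systems would use e.g. the Hungarian algorithm or a spectral relaxation), and `best_assignment` is a hypothetical helper:

```python
from itertools import permutations

import numpy as np

def best_assignment(c: np.ndarray):
    """Brute-force search over one-to-one assignments maximizing the
    node-similarity term sum_i c[i, pi(i)] of the match-value function,
    with c[i, j] the similarity between node i of G1 and node j of G2."""
    n = c.shape[0]
    best_score, best_pi = -np.inf, None
    for pi in permutations(range(n)):       # every one-to-one mapping
        score = sum(c[i, pi[i]] for i in range(n))
        if score > best_score:
            best_score, best_pi = score, pi
    return best_pi, float(best_score)
```

With a diagonal-dominant similarity matrix the identity mapping wins, as expected.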
Directly matching the point coordinates of G1 = (V1, E1) against G2 by global search involves an extremely large amount of computation, places high demands on the processor, and easily causes delays and downtime. Therefore set G1 = (V1, E1) and G2 = (V1 + d_u, E1 + d_v), where [d_u, d_v] denotes the pixel displacement. Adopting this incremental formulation for feature matching greatly improves the operation speed, reducing both the computational load and the processing time.

E(d_u, d_v) = Σ_k [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]²

where (V_k, E_k) are the coordinates of the k-th point in G1, and (V_k + d_u, E_k + d_v) are the coordinates of its corresponding point in G2.
We need to find the displacement [d_u, d_v] that makes E(d_u, d_v) minimal. First the partial derivatives are computed:

∂E/∂d_u = −2 Σ_k G′_{1,u} [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]
∂E/∂d_v = −2 Σ_k G′_{1,v} [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]

where G′_{1,u}, G′_{1,v} denote the derivatives with respect to the pixel coordinates. Setting the derivatives to zero solves for the extremum of [d_u, d_v], i.e. finds the smallest E. The pixel displacement is then refined iteratively, with [d_u, d_v]^(x) representing the pixel displacement after x iterations.
The above equations are organized into gradient-matrix form. Writing the per-point gradient as J_k = [G′_{1,u}, G′_{1,v}]ᵀ, the least-squares form allows step-by-step iteration from an initial value [d_u, d_v]^(0). The error after x iterations is:

e^(x) = Σ_k [ G1(V_k, E_k) − G2(V_k + d_u^(x), E_k + d_v^(x)) ]

where [d_u, d_v]^(x) represents the value after x iterations. The update direction of the iteration error is defined as

Δ^(x) = ( Σ_k J_k J_kᵀ )⁻¹ Σ_k J_k e_k^(x)

so that the pixel displacement can be expressed as

[d_u, d_v]^(x+1) = [d_u, d_v]^(x) + C Δ^(x)

(C is the cost matrix). When Δ^(x) is sufficiently small, the iteration of e^(x) stops. From the pixel offsets relative to G1 = (V1, E1), the position of each feature point of G1 = (V1, E1) within G2 = (V2, E2) can be computed, completing the feature matching quickly.
The fourth step: positioning:
In fact, for a robot carrying sensors and moving in an unknown environment, the continuous motion time is usually discretized into t = 1, 2, 3, ..., k. Over these times, X = (x1, x2, x3, ..., xn) represents the real trajectory of the robot, which constitutes its actual path. On the map side, at each moment the sensors measure part of the map features and obtain observation data from them, from which the corresponding trajectory coordinates

X̂ = (x̂1, x̂2, x̂3, ..., x̂n)

generated by the algorithm are computed. The positioning problem is addressed by comparing the root mean square error (RMSE) between the true motion trajectory and the algorithm-generated trajectory, given the motion measurements and sensor readings:

RMSE = sqrt( (1/n) Σ_{i=1}^{n} || trans(x̂_i) − trans(x_i) ||² )

where trans(x̂_i) represents the offset of the algorithm-generated trajectory at the i-th point, trans(x_i) the offset of the actual motion trajectory at the i-th point, and n the number of camera poses.
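The RMSE above is straightforward to compute once both trajectories are available as arrays of translations; `ate_rmse` is a hypothetical helper name for this sketch:

```python
import numpy as np

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """RMSE between algorithm-generated and true trajectory positions,
    est and gt given as (n, 2) or (n, 3) arrays of translations
    trans(x_hat_i) and trans(x_i)."""
    diff = est - gt                                  # per-pose offset
    return float(np.sqrt((np.linalg.norm(diff, axis=1) ** 2).mean()))
```

For a trajectory offset uniformly by a (3, 4) translation, each pose error is 5, so the RMSE is 5.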
The fifth step: collecting data;
The surface temperature of each working element and of the related terminals is collected by a temperature detector. A thermodynamic map of the inspection site is established by an infrared sensor and compared with the data collected by the temperature detector, forming a closed loop that ensures the accuracy of data collection. On-site noise spectrum data are collected by a noise detector; the concentration of hazardous gases on site is collected by a combustible gas detector; and the camera collects instrument readings. Because the inspection robot can accurately position each inspection point, the correlation, consistency, and accuracy of the collected signals are greatly assured. The collected data are transmitted by the intelligent control host through the signal switch to the main control room computer for analysis and processing.
As a further improvement of the invention, the inspection task comprises two inspection modes;
the first mode is as follows: the main control room computer can enable the inspection robot to automatically start a preset inspection task by sending a command;
and a second mode: and the computer in the master control room remotely controls the inspection robot to an appointed place for inspection.
Compared with the prior art, the invention has the following advantages:
(1) The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm can accurately determine the real-time position of the robot, which can be clearly seen in the host-computer software. The positioning precision of the robot and the user's sense of interaction are effectively improved.
(2) The VSLAM algorithm adopted by the invention has advantages such as open source code and ease of programming; it can measure speed and distance, quickly identify map coordinates, and achieve long-term, high-precision positioning of the robot.
(3) Compared with the GPS and the laser radar, the invention has the characteristics of richer visual information, low hardware cost, wide visual range and the like.
Drawings
FIG. 1 is a schematic diagram of the principles of the present invention;
FIG. 2 is a schematic diagram of the VSLAM algorithm;
fig. 3 is a robot inspection flow chart.
Detailed Description
The present invention will be further illustrated by the following specific examples, which are carried out on the premise of the technical scheme of the present invention, and it should be understood that these examples are only for illustrating the present invention and are not intended to limit the scope of the present invention.
The invention aims to provide a control method for a blast furnace tuyere platform inspection robot based on a VSLAM algorithm. First, image features of the inspection site are extracted, and the inspection-site model is continuously established and refined through these features. After an inspection task is received, an inspection route is planned; after the target location is reached, features of the image collected in real time are matched against the modeling image to ensure accurate positioning. Finally, relevant data are collected through the camera, temperature sensor, infrared sensor, noise detector, and combustible gas detector carried by the robot, and transmitted to host-computer software for analysis, processing, and visualization.
The control method of the blast furnace tuyere platform inspection robot based on the VSLAM algorithm disclosed by the invention is illustrated by the schematic diagram of the invention in FIG. 1, the schematic diagram of the VSLAM algorithm in FIG. 2, and the robot inspection flow chart in FIG. 3. The specific embodiment is as follows:
The first step: establishing the inspection map. The inspection robot sits on the charging station in the main control room in its initial state. The robot is started, performs a self-check, and establishes the inspection map. The center pixel of a detection area is set as P and a detected pixel as I; the detection range is formed by taking a certain number of pixels as the radius, and a gray threshold T is set. If the gray-value difference |I − P| of a detected pixel is greater than or equal to T, the sample is considered wrong and re-acquired; otherwise, the sampled point is entered into the model map.
The second step: path planning. After modeling of the inspection site is complete, the robot can begin inspection tasks; the invention provides two inspection modes. Mode one: the main control room computer sends a command that makes the inspection robot automatically start a preset inspection task. Mode two: the main control room computer remotely controls the inspection robot to a designated place for inspection. In either case, the robot needs to autonomously establish a path plan to reach the accurate inspection position.
A unary linear regression model fitted by the method of least squares is adopted. The sample regression model is set as:

Y_i = β0 + β1 X_i + e_i

The sum of squared residuals is:

Q = Σ_{i=1}^{n} (Y_i − Ŷ_i)²

To obtain the regression coefficients, the extremum of Q is found by calculating its partial derivatives and setting them to zero. Solving the resulting regression equations plans the path for the next moment according to the actual situation. Here X_i denotes the independent variable at the i-th point, Y_i the dependent variable corresponding to X_i, e_i the random disturbance term, and β0, β1 the constant term and slope coefficient of the regression equation.
The third step: and (3) feature matching:
the good feature matching visual algorithm can be applied to path planning, and can also be used for secondarily determining whether the specified position is reached after the specified position is reached.
Let the structures of the two graphs be G1 = (V1, E1) and G2 = (V2, E2), with node counts n1 and n2, i.e. |V1| = n1, |V2| = n2; in the general case n1 = n2. An assignment matrix describes the mapping between corresponding nodes of the two graph structures; it is set as

X ∈ {0, 1}^(n1×n2)

Under its constraints the matching value is maximized; the match-value function is expressed as:

max Σ_{i,j} c_{i,j} X_{i,j} + Σ_{i,j,k,l} d_{i,j,k,l} X_{i,j} X_{k,l}

where c_{i,j} is the similarity between nodes and d_{i,j,k,l} the similarity between edges, while X_{i,j} and X_{k,l} represent matching relationships between nodes. If X_{i,j} = 1 in the assignment matrix, the i-th node of G1 matches the j-th node of G2; if X_{i,j} = 0, the two nodes are unrelated. Each node of G1 should match at most one node of G2; one-to-many matches are not allowed.
Directly matching the point coordinates of G1 = (V1, E1) against G2 by global search involves an extremely large amount of computation, places high demands on the processor, and easily causes delays and downtime. Therefore set G1 = (V1, E1) and G2 = (V1 + d_u, E1 + d_v), where [d_u, d_v] denotes the pixel displacement. Adopting this incremental formulation for feature matching greatly improves the operation speed, reduces the computational load, and shortens the processing time.

E(d_u, d_v) = Σ_k [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]²

where (V_k, E_k) are the coordinates of the k-th point in G1, and (V_k + d_u, E_k + d_v) are the coordinates of its corresponding point in G2.
We need to find the displacement [d_u, d_v] that makes E(d_u, d_v) minimal. First the partial derivatives are computed:

∂E/∂d_u = −2 Σ_k G′_{1,u} [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]
∂E/∂d_v = −2 Σ_k G′_{1,v} [ G1(V_k, E_k) − G2(V_k + d_u, E_k + d_v) ]

where G′_{1,u}, G′_{1,v} denote the derivatives with respect to the pixel coordinates. Setting the derivatives to zero solves for [d_u, d_v], i.e. finds the smallest E. The pixel displacement is then refined iteratively, with [d_u, d_v]^(x) representing the pixel displacement after x iterations.
The above equations are organized into gradient-matrix form. Writing the per-point gradient as J_k = [G′_{1,u}, G′_{1,v}]ᵀ, the least-squares form allows step-by-step iteration from an initial value [d_u, d_v]^(0). The error after x iterations is:

e^(x) = Σ_k [ G1(V_k, E_k) − G2(V_k + d_u^(x), E_k + d_v^(x)) ]

where [d_u, d_v]^(x) represents the value after x iterations. The update direction of the iteration error is:

Δ^(x) = ( Σ_k J_k J_kᵀ )⁻¹ Σ_k J_k e_k^(x)

so that the pixel displacement can be expressed as

[d_u, d_v]^(x+1) = [d_u, d_v]^(x) + C Δ^(x)

(C is the cost matrix). When Δ^(x) is sufficiently small, the iteration of e^(x) stops. From the pixel offsets relative to G1 = (V1, E1), the position of each feature point of G1 = (V1, E1) within G2 = (V2, E2) can be computed, completing the feature matching quickly.
The fourth step: positioning:
In practice, for a robot carrying sensors and moving in an unknown environment, the continuous motion time is usually discretized into times t = 1, 2, 3, ..., k. Over these times, X = (x1, x2, x3, ..., xn) denotes the real trajectory of the robot. On the map side, at each moment the sensors measure part of the map features and obtain observation data of them, from which the corresponding trajectory coordinates X̂ = (x̂1, x̂2, x̂3, ..., x̂n), the trajectory generated by the algorithm, are computed. The positioning problem is solved by comparing the root-mean-square error (RMSE) between the real motion trajectory and the algorithm-generated trajectory, together with the motion measurements and sensor readings:

RMSE = sqrt( (1/n) Σi=1..n ‖trans(x̂i) − trans(xi)‖² )

trans(x̂i) represents the offset of the algorithm-generated trajectory at the i-th point, trans(xi) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of cameras.
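The RMSE comparison above amounts to a few lines of array arithmetic. A minimal sketch, assuming the per-point offsets trans(·) of the two trajectories are already available as n-by-2 arrays (the helper name is illustrative, not from the patent):

```python
import numpy as np

def trajectory_rmse(est, true):
    """RMSE between the algorithm-generated trajectory and the real one.
    est, true: (n, 2) arrays of per-point offsets trans(x_hat_i), trans(x_i).
    Illustrative helper; the patent gives only the formula, not code."""
    err = np.asarray(est) - np.asarray(true)      # per-point offset error
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```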
The fifth step: collecting data. A wheeled industrial-robot chassis carries various sensing devices and regularly inspects the equipment of the blast furnace tuyere platform. The surface temperature of each working element and of the related terminals is collected by a temperature detector; a thermal map of the inspection site is established by an infrared sensor and can be compared with the data collected by the temperature detector, forming closed-loop control that ensures the accuracy of data collection; on-site noise spectrum data are collected by a noise detector; the concentration of hazardous gases on site is collected by a combustible-gas detector; and the camera collects the instrument readings. Because the inspection robot can accurately locate each inspection position, the correlation, consistency, and accuracy of the signals it acquires are largely guaranteed. The collected data are transmitted through the intelligent control host and the signal switch to the main-control-room computer, where they are analyzed, processed, and visualized.
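The closed-loop check between the contact temperature detector and the infrared thermal map can be illustrated as a simple tolerance comparison at each inspection point. The function name and the tolerance value below are assumptions for illustration, not values from the patent:

```python
def cross_check_temperature(probe_temp, thermal_map_temp, tol=5.0):
    """Sketch of the closed-loop check described above: the contact
    temperature-detector reading is compared with the infrared thermal-map
    value at the same inspection point; a large disagreement flags the
    sample for re-acquisition. The tolerance (deg C) is an assumption."""
    return abs(probe_temp - thermal_map_temp) <= tol
```

In use, a reading that fails the check would be discarded and re-acquired, which is how the two sensors close the loop on data-collection accuracy.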
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and such improvements and refinements shall also fall within the protection scope of the present invention.

Claims (2)

1. A control method of a blast furnace tuyere platform inspection robot based on a VSLAM algorithm is characterized by comprising the following specific steps:
the first step is as follows: establishing a routing inspection map:
in the initial state the inspection robot is located at the main-control-room charging station; the robot is started, performs a self-check, and establishes the inspection map; the features of the inspection site are extracted by the VSLAM algorithm: if the gray value of a pixel in the image differs sufficiently from the gray values of enough pixels in its surrounding neighborhood, that pixel is set as the center pixel P of a detection area; the detected pixel is I, and the 16 pixels on a discretized circle of radius 3 pixels form the detection range; a gray threshold T is set, and if the gray difference |I − P| of a detected pixel is greater than or equal to T, a sampling error is indicated and the sample is re-acquired; otherwise the sampling point is established in the model map;
the second step is that: path planning:
after modeling of the inspection place is completed, the robot starts to perform an inspection task;
the robot needs to establish a path plan autonomously to reach an accurate inspection position in any mode;
a mathematical method using a least-squares unary linear regression model is adopted;
the sample regression model is set as:

Yi = β̂0 + β̂1·Xi + ei

wherein: Xi denotes the independent variable at the i-th point, Yi denotes the dependent variable corresponding to Xi, ei is the random disturbance term, and β̂0, β̂1 are the constant terms of the regression equation.
The sum of squared residuals is:

Q = Σi=1..n (Yi − Ŷi)²

Ŷi = β̂0 + β̂1·Xi is the estimate of the dependent variable Yi, effectively describing the path estimated by the algorithm.
To find the extremum of Q that fits the regression model, partial derivatives are taken; the regression model equation so obtained plans the path of the next moment according to the actual situation;
the third step: feature matching:
feature matching based on the VSLAM algorithm effectively ensures the accuracy of the inspection route and the inspection position; feature matching can be understood simply as two graphs of the same or similar scenes having some correspondence, generally a nonlinear relationship; the related nodes of the two graphs are set as:
G1=(V1,E1), G2=(V1+du,E1+dv), where V denotes the point set, E denotes the edge set, [du,dv]T denotes the pixel displacement, and the numbers of nodes are n1, n2, i.e. |V1|=n1, |V2|=n2, with n1=n2 in the general case;

min over [du,dv] of Σk ‖G1(Vk+du, Ek+dv) − G2(Vk,Ek)‖²

wherein: Vk, Ek represent the coordinates of the k-th point in G1, and Vk+du, Ek+dv represent the coordinates of the point in G2 corresponding to the k-th point of G1;
in order to ensure that the pixel displacement, i.e., the increment, is minimum and to reduce the computational load, the partial derivatives of the above formula are taken; after x iterations:

∂E/∂du = 2 Σk (G1(Vk+du(x), Ek+dv(x)) − G2(Vk,Ek)) · G′1,u
∂E/∂dv = 2 Σk (G1(Vk+du(x), Ek+dv(x)) − G2(Vk,Ek)) · G′1,v

G′1,u and G′1,v denote the derivatives of G1 in the u and v directions, used to solve for the [du,dv] value, i.e., to find the smallest pixel displacement; [du(x), dv(x)]T denotes the pixel displacement after x iterations;
when the iteration increment is sufficiently small, the iteration is stopped, and the minimum pixel displacement [du,dv]T is obtained from the iteration formula; finally, the coordinates of the points in G2=(V2,E2) corresponding to the points of G1=(V1,E1) are obtained by calculation;
the fourth step: positioning:
the accuracy of the route is guaranteed by strict and accurate path planning and feature matching; finally, correctness is determined by comparing the root-mean-square error (RMSE) between the real motion trajectory and the algorithm-generated trajectory, together with the motion measurements and sensor readings; the actual motion trajectory of the robot is set as X = (x1, x2, x3, ..., xn), and the motion trajectory generated by the algorithm is X̂ = (x̂1, x̂2, x̂3, ..., x̂n); whether the positioning is accurate is judged from the RMSE between the real motion trajectory and the algorithm-generated trajectory:

RMSE = sqrt( (1/n) Σi=1..n ‖trans(x̂i) − trans(xi)‖² )

trans(x̂i) represents the offset of the algorithm-generated trajectory at the i-th point, trans(xi) represents the offset of the actual motion trajectory at the i-th point, and n represents the number of cameras;
The fifth step: collecting data;
a wheeled industrial-robot chassis carries various sensing devices and regularly inspects the equipment of the blast furnace tuyere platform; the surface temperature of each working component and of the related terminals is collected by a temperature detector; a thermal map of the inspection site is established by an infrared sensor and can be compared with the data collected by the temperature detector, forming closed-loop control that ensures the accuracy of data collection; on-site noise spectrum data are collected by a noise detector; the camera collects the instrument readings; because the inspection robot can accurately locate the inspection position, the correlation, consistency, and accuracy of the signals acquired by the inspection robot are largely guaranteed; the collected data are transmitted through the intelligent control host and the signal switch to the main-control-room computer for analysis and processing.
2. The VSLAM algorithm-based blast furnace tuyere platform inspection robot control method according to claim 1, characterized in that: the inspection task comprises two inspection modes;
the first mode: by sending a command, the main-control-room computer makes the inspection robot automatically start a preset inspection task;
the second mode: the main-control-room computer remotely controls the inspection robot to a designated place for inspection.
CN202110225477.XA 2021-03-01 2021-03-01 VSLAM algorithm-based blast furnace tuyere platform inspection robot control method Active CN112947461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110225477.XA CN112947461B (en) 2021-03-01 2021-03-01 VSLAM algorithm-based blast furnace tuyere platform inspection robot control method


Publications (2)

Publication Number Publication Date
CN112947461A true CN112947461A (en) 2021-06-11
CN112947461B CN112947461B (en) 2022-11-04

Family

ID=76246933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110225477.XA Active CN112947461B (en) 2021-03-01 2021-03-01 VSLAM algorithm-based blast furnace tuyere platform inspection robot control method

Country Status (1)

Country Link
CN (1) CN112947461B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113581792A (en) * 2021-08-04 2021-11-02 三一机器人科技有限公司 Tray position checking method and device and tray positioning system
CN117587181A (en) * 2023-11-23 2024-02-23 建龙西林钢铁有限公司 Blast furnace tuyere temperature monitoring device and control method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6362875B1 (en) * 1999-12-10 2002-03-26 Cognax Technology And Investment Corp. Machine vision system and method for inspection, homing, guidance and docking with respect to remote objects
CN107657640A (en) * 2017-09-30 2018-02-02 南京大典科技有限公司 Intelligent patrol inspection management method based on ORB SLAM
CN208841421U (en) * 2018-08-22 2019-05-10 上海宝宬冶金科技有限公司 Blast-furnace tuyere automatic crusing robot


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DU YONGCHENG: "Research on a Mobile Robot Control System Based on VSLAM", China Master's Theses Full-text Database, Agricultural Science and Technology Series *
YANG LICHUANG: "Research on a VSLAM Feature-Matching Algorithm Based on an Improved ORB Algorithm", Journal of Hebei University of Technology *
WEI XIONG: "Research on Visual SLAM Based on ORB Feature Matching", China Master's Theses Full-text Database, Information Science and Technology Series *



Similar Documents

Publication Publication Date Title
Kim et al. SLAM-driven robotic mapping and registration of 3D point clouds
CN112947461B (en) VSLAM algorithm-based blast furnace tuyere platform inspection robot control method
CN104062973A (en) Mobile robot SLAM method based on image marker identification
WO2019136714A1 (en) 3d laser-based map building method and system
CN113189977B (en) Intelligent navigation path planning system and method for robot
CN110136186B (en) Detection target matching method for mobile robot target ranging
Safin et al. Evaluation of visual slam methods in usar applications using ros/gazebo simulation
Tsubouchi Introduction to simultaneous localization and mapping
Shim et al. Remote robotic system for 3D measurement of concrete damage in tunnel with ground vehicle and manipulator
Zhang et al. Factor graph-based high-precision visual positioning for agricultural robots with fiducial markers
Pan et al. Sweeping robot based on laser SLAM
Gao et al. Fully automatic large-scale point cloud mapping for low-speed self-driving vehicles in unstructured environments
Fasiolo et al. Comparing LiDAR and IMU-based SLAM approaches for 3D robotic mapping
Sujiwo et al. Localization based on multiple visual-metric maps
Chen et al. SCL-SLAM: A scan context-enabled LiDAR SLAM using factor graph-based optimization
Wang Autonomous mobile robot visual SLAM based on improved CNN method
CN114721377A (en) Improved Cartogrier based SLAM indoor blind guiding robot control method
Gao et al. A new method for repeated localization and matching of tunnel lining defects
Kornilova et al. Evops benchmark: evaluation of plane segmentation from RGBD and LiDAR data
Bayer et al. On construction of a reliable ground truth for evaluation of visual slam algorithms
Peng et al. Dynamic Visual SLAM Integrated with IMU for Unmanned Scenarios
Albrecht et al. Mapping and automatic post-processing of indoor environments by extending visual slam
Liang et al. An Accurate Visual Navigation Method for Wheeled Robot in Unstructured Outdoor Environment Based on Virtual Navigation Line
Park et al. Mobile robot navigation based on direct depth and color-based environment modeling
Radzi et al. Visual-based and Lidar-based SLAM Study for Outdoor Environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant