CN117745537A - Tunnel equipment temperature detection method, device, computer equipment and storage medium - Google Patents

Tunnel equipment temperature detection method, device, computer equipment and storage medium

Info

Publication number: CN117745537A (application CN202410191026.2A)
Authority: CN (China)
Prior art keywords: tunnel, spliced, fusion, images, point cloud
Legal status: Granted; Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN117745537B
Inventors: 穆阳, 张文祥, 梁耀聪, 刘志州
Assignee (current and original): Microbrand Technology Zhejiang Co ltd
Application filed by Microbrand Technology Zhejiang Co ltd
Priority: CN202410191026.2A
Publications: CN117745537A (application publication), CN117745537B (granted patent)

Landscapes

  • Radiation Pyrometers (AREA)

Abstract

The application relates to the technical field of tunnel equipment monitoring, and provides a tunnel equipment temperature detection method, a tunnel equipment temperature detection device, computer equipment and a storage medium. The method comprises the following steps: screening a plurality of images to be spliced from multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images, where each tunnel fusion image is obtained by fusing the tunnel point cloud image and the tunnel thermodynamic diagram of the same frame; matching the characteristic points of the tunnel planar targets and of the tunnel linear targets in the images to be spliced to obtain a transformation matrix between the images to be spliced; splicing the plurality of images to be spliced based on the transformation matrix to obtain a spliced image; fusing the point cloud overlapping areas in the spliced image to obtain a tunnel line graph; and obtaining a tunnel equipment temperature detection result based on the tunnel line graph. The method achieves higher splicing accuracy and lower time delay.

Description

Tunnel equipment temperature detection method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of tunnel equipment monitoring technologies, and in particular, to a method and apparatus for detecting the temperature of tunnel equipment, a computer device, a storage medium, and a computer program product.
Background
Temperature detection of tunnel equipment helps ensure the safe operation of the equipment, maintain the comfort of the tunnel's internal environment, and discover and prevent potential safety hazards in advance; it is therefore necessary.
When detecting the temperature of tunnel equipment, a fusion map can be obtained by means of radar scanning and thermal imaging technologies. If the fusion maps are spliced directly with existing splicing algorithms, low splicing accuracy and high time delay easily occur.
Disclosure of Invention
Based on this, it is necessary to provide a tunnel device temperature detection method, apparatus, computer device, computer readable storage medium and computer program product in order to solve the above technical problems.
In a first aspect, the present application provides a method for detecting a temperature of a tunnel device, including:
screening a plurality of images to be spliced from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images; the tunnel fusion map is obtained by carrying out fusion processing on a tunnel point cloud map and a tunnel thermodynamic diagram of the same frame;
matching the characteristic points of the tunnel planar targets in the images to be spliced, and matching the characteristic points of the tunnel linear targets in the images to be spliced to obtain a transformation matrix between the images to be spliced;
based on the transformation matrix, splicing a plurality of images to be spliced to obtain a spliced image;
fusing the point cloud overlapping areas in the spliced graph to obtain a tunnel line graph;
and obtaining a tunnel equipment temperature detection result based on the tunnel line graph.
In one embodiment, before a plurality of images to be spliced are screened out from the multi-frame tunnel fusion image based on the device acquisition positions corresponding to the tunnel fusion image, the method further includes:
determining a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology;
and correcting and optimizing the preliminary equipment acquisition position corresponding to the tunnel fusion map according to the beacon positioning technology and in combination with the speed of the network inspection vehicle, to obtain the equipment acquisition position corresponding to the tunnel fusion map.
In one embodiment, the selecting a plurality of graphs to be spliced in the multi-frame tunnel fusion graph based on the device acquisition position corresponding to the tunnel fusion graph includes:
obtaining a screening interval;
and screening a plurality of pictures to be spliced from the multi-frame tunnel fusion pictures based on the equipment acquisition positions and the screening intervals corresponding to the tunnel fusion pictures.
In one embodiment, the matching the characteristic points of the tunnel planar target in each graph to be spliced and the matching the characteristic points of the tunnel linear target in each graph to be spliced to obtain the transformation matrix between the graphs to be spliced includes:
extracting tunnel planar target feature points and tunnel linear target feature points from the images to be spliced;
matching the characteristic points of the tunnel planar targets on the pictures to be spliced, and matching the characteristic points of the tunnel linear targets on the pictures to be spliced to obtain a preliminary transformation matrix between the pictures to be spliced;
and obtaining a transformation matrix between the pictures to be spliced according to the preliminary transformation matrix.
In one embodiment, the matching the characteristic points of the tunnel planar target in each graph to be spliced and the matching the characteristic points of the tunnel linear target in each graph to be spliced to obtain a preliminary transformation matrix between the graphs to be spliced includes:
matching the tunnel planar target characteristic points of each graph to be spliced to obtain a transformation matrix to be optimized;
and optimizing the transformation matrix to be optimized according to the characteristic points of the tunnel linear targets in the images to be spliced to obtain a preliminary transformation matrix among the images to be spliced.
In one embodiment, the fusing the overlapping areas of the point clouds in the mosaic to obtain the tunnel line graph includes:
determining a point cloud overlapping area of the continuous multi-frame spliced graph;
acquiring temperature data of the point cloud overlapping area in a continuous multi-frame splice graph;
and carrying out averaging on the temperature data of the point cloud overlapping region in the continuous multi-frame spliced graph to obtain the tunnel line graph.
In one embodiment, after obtaining a temperature detection result of the tunnel equipment based on the tunnel line graph, the method further includes:
determining a device of interest based on the detection information input by the user;
and feeding back a temperature detection result of the device of interest to the user based on the tunnel line graph.
In a second aspect, the present application further provides a tunnel device temperature detection apparatus, including:
the to-be-spliced image screening module is used for screening a plurality of to-be-spliced images from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images; the tunnel fusion map is obtained by carrying out fusion processing on a tunnel point cloud map and a tunnel thermodynamic diagram of the same frame;
the transformation matrix acquisition module is used for matching the characteristic points of the tunnel planar targets in the pictures to be spliced and matching the characteristic points of the tunnel linear targets in the pictures to be spliced to obtain a transformation matrix between the pictures to be spliced;
the spliced graph acquisition module is used for splicing a plurality of graphs to be spliced based on the transformation matrix to obtain a spliced graph;
the spliced graph fusion module is used for fusing the point cloud overlapping areas in the spliced graph to obtain a tunnel line graph;
and the temperature acquisition module is used for obtaining a tunnel equipment temperature detection result based on the tunnel line graph.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor which, when executing the computer program, performs the method described above.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the above method.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, performs the above method.
According to the tunnel equipment temperature detection method, the tunnel equipment temperature detection device, the computer equipment, the storage medium and the computer program product, a plurality of images to be spliced are screened out from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images, where each tunnel fusion image is obtained by fusing the tunnel point cloud image and the tunnel thermodynamic diagram of the same frame. The characteristic points of the tunnel planar targets and of the tunnel linear targets in the images to be spliced are matched to obtain a transformation matrix between the images to be spliced; based on the transformation matrix, the plurality of images to be spliced are spliced to obtain a spliced image; and the point cloud overlapping areas in the spliced image are fused to obtain a tunnel line graph, from which a tunnel equipment temperature detection result is obtained. Because the splicing relies on matched planar and linear targets rather than a general-purpose splicing algorithm, the splicing accuracy is high and the time delay is low.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the related art, the drawings required in the description of the embodiments or the related art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
FIG. 1 is an application environment diagram of a tunnel device temperature detection method in one embodiment;
FIG. 2 is a flow chart of a method for detecting tunnel equipment temperature in one embodiment;
FIG. 3 is a schematic diagram of a radar and thermal imager mounting structure in one embodiment;
FIG. 4 is a schematic view of a region in one embodiment;
FIG. 5 is a flow chart of a method for detecting temperature of a tunnel device according to another embodiment;
FIG. 6 is a block diagram of a tunnel equipment temperature sensing device in one embodiment;
fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly understand that the embodiments described herein may be combined with other embodiments.
The embodiment of the application provides a method for detecting the temperature of tunnel equipment, which can be executed by computer equipment. As shown in fig. 1, the computer equipment can acquire multi-frame tunnel point cloud images collected by a radar and multi-frame tunnel thermodynamic diagrams collected by a thermal imager, and fuse the tunnel point cloud image and the tunnel thermodynamic diagram of the same frame to obtain a tunnel fusion image. A plurality of images to be spliced are then screened out from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images; the characteristic points of the tunnel planar targets and of the tunnel linear targets in the images to be spliced are matched to obtain a transformation matrix between the images to be spliced; based on the transformation matrix, the images to be spliced are spliced to obtain a spliced image; the point cloud overlapping areas in the spliced image are fused to obtain a tunnel line graph; and a tunnel equipment temperature detection result is obtained based on the tunnel line graph. It will be appreciated that the computer equipment may be implemented by a server, a terminal, or an interactive system between a terminal and a server. In this embodiment, the method includes the steps shown in fig. 2:
Step S201, screening a plurality of pictures to be spliced from a multi-frame tunnel fusion picture based on equipment acquisition positions corresponding to the tunnel fusion picture; the tunnel fusion map is obtained by carrying out fusion processing on the tunnel point cloud map and the tunnel thermodynamic diagram of the same frame.
In this application, the radar and the thermal imager can be mounted one above the other with a spacing of 5 cm, with the principal axes of their fields of view vertically aligned. This arrangement maximizes the overlap ratio of the detection key points of the radar and the thermal imager, minimizes the non-overlapping area and the invalid area, and speeds up the later fusion while improving its accuracy; the specific mounting structure is shown in fig. 3. The radar may be a 3D lidar, and the thermal imager may be an uncooled (non-cryogenic) array thermal imager.
With the radar and the thermal imager installed in this way, multi-frame tunnel point cloud images and multi-frame tunnel thermodynamic diagrams can be obtained. The tunnel point cloud image and the tunnel thermodynamic diagram of the same frame are fused to obtain a tunnel fusion image.
The equipment acquisition position corresponding to each tunnel fusion map is obtained according to the inertial navigation positioning technology and the beacon positioning technology, in combination with the speed of the network inspection vehicle. A screening interval is determined according to actual conditions, and a plurality of images to be spliced are screened out of the multi-frame tunnel fusion maps based on the equipment acquisition positions corresponding to the tunnel fusion maps and the screening interval.
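The position-based screening step above can be sketched as follows; the frame spacing, the interval value, and the function name are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def screen_frames(positions, interval):
    """Select one fusion-map frame each time the acquisition position
    advances by at least `interval` metres along the tunnel.

    positions: 1-D array of per-frame device acquisition positions (metres),
               assumed monotonically non-decreasing along the line.
    interval:  screening interval in metres.
    Returns the indices of the selected frames (the images to be spliced).
    """
    selected = [0]                      # always keep the first frame
    last = positions[0]
    for i, p in enumerate(positions[1:], start=1):
        if p - last >= interval:
            selected.append(i)
            last = p
    return selected

# toy example: frames every 1.25 m, screened at a 5 m interval
positions = np.arange(0, 500) * 1.25
idx = screen_frames(positions, 5.0)     # every 4th frame is kept
```

Screening by position rather than by frame count keeps the spliced frames evenly spaced along the tunnel even when the inspection vehicle's speed varies.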
Step S202, matching characteristic points of tunnel planar targets on the pictures to be spliced and matching characteristic points of tunnel linear targets on the pictures to be spliced to obtain a transformation matrix between the pictures to be spliced.
The tunnel planar target can be a tunnel wall surface target, and the tunnel linear target can be the slab surface of a ballastless track, a civil air defense passage, or a double-line rail target.
The same tunnel planar target characteristic points and tunnel linear target characteristic points are matched across different images to be spliced. For example, the correspondence between a large ground plane and double-line rails shared by two images to be spliced is found, and a transformation matrix is calculated by detecting and matching the directions and positions of the large ground plane and the double-line rails, realizing a preliminary alignment between frame data.
And step S203, based on the transformation matrix, splicing the plurality of images to be spliced to obtain a spliced image.
Each frame of the images to be spliced is mapped into a global coordinate system by applying the transformation matrix, ensuring correct alignment among the images to be spliced.
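Mapping a frame into the global coordinate system with the transformation matrix amounts to one homogeneous-coordinate multiplication; a minimal sketch (function name and toy values are assumptions):

```python
import numpy as np

def apply_transform(T, points):
    """Map an N x 3 point cloud into the global frame with a 4 x 4
    homogeneous transformation matrix T (rotation + translation)."""
    homo = np.hstack([points, np.ones((len(points), 1))])  # N x 4
    return (homo @ T.T)[:, :3]

# toy example: translate a frame 5 m along the tunnel axis (x)
T = np.eye(4)
T[0, 3] = 5.0
pts = np.array([[0.0, 1.0, 2.0], [1.0, 1.0, 2.0]])
global_pts = apply_transform(T, pts)    # x-coordinates shifted by 5
```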
And step S204, fusing the point cloud overlapping areas in the spliced graph to obtain a tunnel line graph.
Illustratively, according to the spliced graph, the temperature data of three consecutive frames of the spliced graph (the previous, current, and next frames) are matched, and the point clouds in the point cloud overlapping area are averaged and unified to obtain a tunnel line graph.
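The three-frame averaging described above, assuming the overlap points have already been matched one-to-one across the frames (a simplifying assumption), can be sketched as:

```python
import numpy as np

def fuse_overlap_temperature(prev_t, curr_t, next_t):
    """Average the temperature of points in the overlap region across the
    previous, current, and next spliced frames.  Each array holds the
    temperatures of the same matched points in one frame; averaging
    element-wise suppresses per-frame measurement noise."""
    return np.mean(np.stack([prev_t, curr_t, next_t]), axis=0)

prev_t = np.array([20.0, 21.0, 30.0])
curr_t = np.array([21.0, 21.0, 33.0])
next_t = np.array([22.0, 21.0, 36.0])
fused = fuse_overlap_temperature(prev_t, curr_t, next_t)  # [21., 21., 33.]
```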
Step S205, based on the tunnel line diagram, obtaining a tunnel equipment temperature detection result.
In the tunnel equipment temperature detection method, characteristic points of tunnel planar targets in all the pictures to be spliced are matched, characteristic points of tunnel linear targets in all the pictures to be spliced are matched, and a transformation matrix among the pictures to be spliced is obtained; based on the transformation matrix, a plurality of images to be spliced are spliced to obtain a spliced image, and the point cloud overlapping areas in the spliced image are fused to obtain a tunnel line image, so that the splicing accuracy is high and the time delay is low.
In one embodiment, before a plurality of images to be spliced are screened out from the multi-frame tunnel fusion maps based on the equipment acquisition positions corresponding to the tunnel fusion maps, the method provided by the application further includes: determining a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology; and correcting and optimizing the preliminary equipment acquisition position corresponding to the tunnel fusion map according to the beacon positioning technology and in combination with the speed of the network inspection vehicle, to obtain the equipment acquisition position corresponding to the tunnel fusion map.
The inertial navigation positioning technology is realized by an inertial navigation system (Inertial Navigation System, INS), an autonomous navigation technology that calculates its own position, velocity and attitude by measuring the acceleration and angular velocity of the device. The inertial navigation system comprises three gyroscopes and three accelerometers, which measure the angular velocity and the linear acceleration of the device about and along three axes, respectively. The position, velocity and attitude information of the device can be continuously updated using Newton's laws of motion and the attitude transformation represented by Euler angles (or quaternions). However, due to sensor noise and drift, the measurement errors of an inertial navigation system accumulate over time, so correction and optimization using beacon positioning technology is required.
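A toy one-dimensional dead-reckoning sketch illustrates why INS errors accumulate: a small constant accelerometer bias, integrated twice, produces a position error that grows roughly quadratically with time. All numbers are illustrative assumptions:

```python
import numpy as np

def dead_reckon(x0, v0, accel, dt):
    """1-D strap-down position update: integrate the measured
    acceleration once into velocity and again into position."""
    x, v = x0, v0
    for a in accel:
        v += a * dt
        x += v * dt
    return x

dt, n = 0.01, 1000            # 10 s of samples at 100 Hz
true_accel = np.zeros(n)      # vehicle actually moves at constant 2 m/s
bias = 0.05                   # assumed 0.05 m/s^2 accelerometer bias
x_true = dead_reckon(0.0, 2.0, true_accel, dt)          # 20 m travelled
x_ins = dead_reckon(0.0, 2.0, true_accel + bias, dt)
drift = x_ins - x_true        # ~2.5 m of error after only 10 s
```

This accumulated drift is what the periodic beacon fixes, combined with the vehicle speed, are used to cancel.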
Beacon positioning technology generally relies on preset landmarks to determine the location of equipment; currently, embedded positioning beacon devices are installed along subway tunnel lines. When the train's positioning sensing device approaches a beacon, it can receive and analyze the beacon's signal to obtain accurate position information.
The preliminary equipment acquisition position corresponding to the tunnel fusion map is determined according to the inertial navigation positioning technology, and is then corrected and optimized according to the beacon positioning technology in combination with the speed of the network inspection vehicle, to obtain the equipment acquisition position corresponding to the tunnel fusion map.
In this embodiment, according to the inertial navigation positioning technology and the beacon positioning technology and combining the speed of the network inspection vehicle, a more accurate device acquisition position corresponding to the tunnel fusion map is obtained.
In one embodiment, based on the device acquisition position corresponding to the tunnel fusion map, a plurality of maps to be spliced are screened from the multi-frame tunnel fusion map, and the specific steps are as follows: obtaining a screening interval; and screening a plurality of images to be spliced from the multi-frame tunnel fusion images based on the equipment acquisition positions and the screening intervals corresponding to the tunnel fusion images.
For example, a screening interval of 5 meters is obtained, and based on the equipment acquisition positions corresponding to the tunnel fusion maps and the screening interval, 200 images to be spliced are screened out of the 500 frames of tunnel fusion maps, one every 5 meters.
In this embodiment, based on the device acquisition position and the screening interval corresponding to the tunnel fusion map, a plurality of to-be-spliced maps are screened out from the multi-frame tunnel fusion map, and representative to-be-spliced maps are screened out, so that the calculation amount is reduced.
In one embodiment, matching the characteristic points of the tunnel planar targets in each graph to be spliced and matching the characteristic points of the tunnel linear targets in each graph to be spliced to obtain the transformation matrix between the graphs to be spliced comprises the following specific steps: extracting tunnel planar target feature points and tunnel linear target feature points from each graph to be spliced; matching the characteristic points of the tunnel planar targets in the images to be spliced, and matching the characteristic points of the tunnel linear targets in the images to be spliced, to obtain a preliminary transformation matrix between the images to be spliced; and obtaining a transformation matrix between the images to be spliced according to the preliminary transformation matrix.
Referring to algorithms in the Point Cloud Library (PCL), target feature point identification is performed using target clustering, track fitting and plane segmentation techniques, and tunnel planar target feature points and tunnel linear target feature points are extracted from each graph to be spliced. Using the 3D Scale-Invariant Feature Transform (SIFT3D) technique, feature points of the same target are found in the point clouds of each graph to be spliced; the characteristic points of the tunnel planar targets and of the tunnel linear targets are matched, and the directions and positions of these targets are detected, to obtain a preliminary transformation matrix between the graphs to be spliced. Each graph to be spliced is mapped into a global coordinate system by applying the preliminary transformation matrix, ensuring correct alignment among the graphs to be spliced. The original computational complexity of the traditional Iterative Closest Point (ICP) matching algorithm is simplified; the improved closest-point matching algorithm is used to accurately register the point clouds of all the graphs to be spliced and to accurately calculate the geometric transformation required to convert each graph to be spliced from its local coordinate system to the global coordinate system, yielding the transformation matrix between the graphs to be spliced.
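The patent's pipeline uses SIFT3D feature matching followed by an improved ICP registration; the sketch below shows only the common core of both steps, estimating the rigid 4 x 4 transformation matrix from already-matched feature-point pairs via the standard SVD (Kabsch) solution. This is an illustration of the underlying mathematics, not the patent's improved algorithm:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid transform aligning matched feature points
    src -> dst (both N x 3) via the SVD (Kabsch) solution -- the step
    that feature-based alignment and each ICP iteration both solve."""
    c_src, c_dst = src.mean(0), dst.mean(0)
    H = (src - c_src).T @ (dst - c_dst)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                             # proper rotation only
    t = c_dst - R @ c_src
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t                     # 4 x 4 transform matrix
    return T

# verify on points related by a known rotation about z plus a translation
rng = np.random.default_rng(0)
src = rng.random((10, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
dst = src @ Rz.T + np.array([5.0, 0.0, 0.2])
T = estimate_rigid_transform(src, dst)
```

With noise-free correspondences the recovered transform is exact; in practice ICP alternates this solve with re-matching of closest points.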
In the embodiment, according to the matching of the characteristic points of the tunnel planar targets on each graph to be spliced and the matching of the characteristic points of the tunnel linear targets on each graph to be spliced, a preliminary transformation matrix between the graphs to be spliced is obtained; and obtaining a transformation matrix between the pictures to be spliced according to the preliminary transformation matrix.
In one embodiment, matching the characteristic points of the tunnel planar targets in each graph to be spliced and matching the characteristic points of the tunnel linear targets in each graph to be spliced to obtain a preliminary transformation matrix between the graphs to be spliced comprises the following specific steps: matching the tunnel planar target characteristic points of each graph to be spliced to obtain a transformation matrix to be optimized; and optimizing the transformation matrix to be optimized according to the characteristic points of the tunnel linear targets in each graph to be spliced to obtain a preliminary transformation matrix among the graphs to be spliced.
The tunnel planar target characteristic points of each graph to be spliced are matched, and the direction and position of the tunnel planar target are detected, to obtain a transformation matrix to be optimized. The transformation matrix to be optimized is then refined according to the characteristic points of the tunnel linear targets in each graph to be spliced, detecting the direction and position of the tunnel linear target, to obtain the preliminary transformation matrix between the graphs to be spliced.
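One way to read this two-stage scheme: matching the planar target (e.g. the ground plane) aligns the plane normal, and matching the linear target (e.g. the rails) then fixes the remaining rotation about that normal. The sketch below implements only that last refinement under those assumptions; the decomposition is an illustration, not the patent's exact procedure:

```python
import numpy as np

def refine_about_normal(n, d_src, d_dst):
    """Given a matched ground-plane normal n (already aligned between
    frames), find the rotation about n that aligns the matched rail
    direction d_src with d_dst -- the degree of freedom that a plane
    match alone cannot fix."""
    n = n / np.linalg.norm(n)
    # project rail directions into the plane and measure the signed angle
    p_src = d_src - np.dot(d_src, n) * n
    p_dst = d_dst - np.dot(d_dst, n) * n
    p_src = p_src / np.linalg.norm(p_src)
    p_dst = p_dst / np.linalg.norm(p_dst)
    angle = np.arctan2(np.dot(np.cross(p_src, p_dst), n),
                       np.dot(p_src, p_dst))
    # Rodrigues' rotation about n by that angle
    K = np.array([[0, -n[2], n[1]], [n[2], 0, -n[0]], [-n[1], n[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

n = np.array([0.0, 0.0, 1.0])           # ground-plane normal (z up)
d_src = np.array([1.0, 0.0, 0.0])       # rail direction, current frame
d_dst = np.array([np.cos(0.1), np.sin(0.1), 0.0])  # reference frame
R = refine_about_normal(n, d_src, d_dst)
```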
In this embodiment, according to matching the characteristic points of the tunnel planar target in each graph to be spliced and matching the characteristic points of the tunnel linear target in each graph to be spliced, a relatively accurate preliminary transformation matrix between the graphs to be spliced is obtained.
In one embodiment, the overlapping areas of the point clouds in the spliced graph are fused to obtain a tunnel line graph, which specifically includes the following steps: determining a point cloud overlapping area of the continuous multi-frame spliced graph; acquiring temperature data of the point cloud overlapping area in the continuous multi-frame spliced graph; and averaging the temperature data of the point cloud overlapping area in the continuous multi-frame spliced graph to obtain a tunnel line graph.
Specifically, the point cloud overlapping area of three consecutive spliced frames (previous, current and next) is determined; the temperature data of the three consecutive frames in the point cloud overlapping area are acquired; and these temperature data are averaged and unified to obtain the tunnel line graph.
In this embodiment, by fusing the overlapping areas of the point clouds in the spliced graph, a relatively accurate tunnel line graph is obtained.
In one embodiment, after obtaining the temperature detection result of the tunnel equipment based on the tunnel line graph, the method provided by the application further includes: determining a device of interest based on detection information input by the user; and feeding back the temperature detection result of the device of interest to the user based on the tunnel line graph.
The detection information input by the user can include the detection area, detection category, identification level and alarm temperature information, from which the device of interest is determined. Based on the tunnel line graph, the temperature detection result of the device of interest is fed back to the user; the result can include over-temperature alarms, a temperature comparison line chart and panoramic map information. The application can also feed back a detailed report to the user, including when and where an alarm for a certain device was generated and a comparison of historical data changes.
In this embodiment, the device of interest is determined according to the detection information input by the user, and the temperature detection result of the device of interest is fed back to the user based on the tunnel line graph.
In one embodiment, the multi-frame tunnel fusion map may be obtained as follows: acquiring multi-frame tunnel point cloud images collected by the radar and multi-frame tunnel thermodynamic diagrams collected by the thermal imager, and fusing the tunnel point cloud image and the tunnel thermodynamic diagram of the same frame to obtain the multi-frame tunnel fusion map. During the fusion of a same-frame tunnel point cloud image and tunnel thermodynamic diagram: for the first-class region in each group, the temperature values of the first-class real point cloud of the region are obtained based on the temperature values of the first-class thermodynamic points of the region; for the second-class region in each group, the region is filled with point clouds based on how the temperature values of its second-class thermodynamic points vary, giving a second-class mixed point cloud, and a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the first-class region in the same group; multi-scale mapping is performed between the point cloud to be mapped and the second-class thermodynamic points to obtain temperature calibration values of the second-class thermodynamic points, and the temperature values of the second-class mixed point cloud are obtained from those calibration values. The first-class region and the second-class region in the same group share common radar waves: a first-class region is a region where radar waves and thermal imaging waves coincide, and a second-class region is a region between adjacent radar waves where thermal imaging waves exist.
After the multi-frame tunnel fusion map is obtained, a temperature detection result of the tunnel equipment can be obtained based on the multi-frame tunnel fusion map.
As shown in fig. 4, the region where the radar wave and the thermal imaging wave overlap is referred to as a first-class region, the region where the thermal imaging wave exists between adjacent radar waves is referred to as a second-class region, and the first-class region and the second-class region having the radar wave in common are regarded as a group.
For distinguishing, the real point cloud in the first type of area is called as a first type of real point cloud, and the thermodynamic point of the first type of area is called as a first type of thermodynamic point; the real point cloud in the second-class area is called a second-class real point cloud, and the thermodynamic point of the second-class area is called a second-class thermodynamic point.
Extrinsic calibration is performed on the radar and the thermal imager, and a spatial coordinate registration operation is carried out according to the coincidence characteristics of their main views and reference points, so that the first-class real point cloud of a first-class region essentially corresponds to the first-class thermal points of the same region. For the first-class region in each group, the temperature value and radiation value of the first-class thermal points are appended to the four-dimensional coordinates of the first-class real point cloud, thereby obtaining the temperature values of the first-class real point cloud of that region; the four dimensions of the real point cloud are the spatial abscissa, spatial ordinate, spatial vertical coordinate, and echo intensity.
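The augmentation of the four-dimensional point records with thermal data can be sketched as follows; the coordinate values and thermal readings are illustrative, and the index-for-index correspondence between points and thermal samples is assumed to result from the registration step described above:

```python
import numpy as np

# Hypothetical registered data: each first-class real point (x, y, z, echo
# intensity) already corresponds index-for-index to one first-class thermal
# point after extrinsic calibration and spatial registration.
points = np.array([
    [1.0, 2.0, 0.5, 0.8],   # spatial abscissa, ordinate, vertical, echo
    [1.1, 2.0, 0.5, 0.7],
])
thermal = np.array([
    [36.5, 120.0],          # temperature value, radiation value
    [37.0, 118.0],
])

# Append temperature and radiation to the 4-D coordinates, giving 6-D records.
fused = np.hstack([points, thermal])
print(fused.shape)  # (2, 6)
```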
For the second-class region in each group, the second-class region is divided into a gentle area, a gradient area, and a convex-concave area based on the temperature-value variation of its second-class thermal points. The filling density of each area is determined according to the distance values corresponding to the second-class real point cloud, and the second-class region is filled with point cloud at these densities to obtain the second-class mixed point cloud; the point cloud to be mapped is then obtained from the second-class mixed point cloud together with the first-class real point cloud of the same group. Multi-scale mapping is performed between the point cloud to be mapped and the second-class thermal points to obtain temperature calibration values of the second-class thermal points, and the second-class thermal points are mapped to the second-class mixed point cloud, so that the temperature values of the second-class mixed point cloud are obtained from those calibration values.
Each frame of tunnel fusion map is divided into regions according to the echo intensity and temperature-value variation of its point cloud. The device corresponding to each region is determined; the marked surfaces of the device are located to obtain the point sets and mass points of its different surfaces; the poles of the device are derived from these point sets and mass points, and a temperature marking operation is performed to obtain the device temperature detection result of each frame of tunnel fusion map. The frames of tunnel fusion map are spliced to obtain the tunnel line map, the tunnel fusion maps of adjacent frames in the tunnel line map are determined, and their device temperature detection results are fused to obtain the temperature detection result of the tunnel device.
In this tunnel device temperature detection method, when a same-frame tunnel point cloud map and tunnel thermodynamic diagram are fused, the second-class region in each group is filled with point cloud according to the temperature-value variation of its second-class thermal points to obtain the second-class mixed point cloud, and the point cloud to be mapped is obtained from the second-class mixed point cloud together with the first-class real point cloud of the same group. Multi-scale mapping between the point cloud to be mapped and the second-class thermal points yields temperature calibration values of the second-class thermal points, from which the temperature values of the second-class mixed point cloud are obtained. This enriches the point cloud data of the tunnel fusion map, reduces the influence of radar blind areas on the accuracy of the temperature detection result, and improves that accuracy. Through the fusion recognition technique of the radar and the thermal imager, the single-sensor characteristics of the tunnel equipment are amplified, and the superposition of multiple sensors can generate a third-modality feature, enabling multi-angle recognition, breaking through the obstacles of temperature detection and the limitation of the radar scanning interval, and achieving a good effect in actual use.
The tunnel point cloud map belongs to one modality and the tunnel thermodynamic diagram to another; fusing them generates a third, original composite modality. The tunnel fusion map obtained after fusion therefore carries the characteristics of all three modalities: target features are more salient, the shortcomings of each sensor are mutually compensated, the computational complexity of single-sensor processing is reduced, and the accuracy of the temperature detection result is improved.
In one embodiment, based on the temperature values of a class of thermodynamic points of a class of regions, the temperature values of a class of real point clouds of the class of regions are obtained, and the specific steps are as follows: carrying out space coordinate registration operation on the tunnel point cloud map and the tunnel thermodynamic diagram in the same frame to determine initial temperature values of a class of real point clouds of a class of areas based on temperature values of a class of thermodynamic points of the class of areas; and calibrating the initial temperature value of the real point cloud according to the difference between the distance value corresponding to the thermal point and the distance value corresponding to the real point cloud to obtain the temperature value of the real point cloud.
Extrinsic calibration is performed on the radar and the thermal imager, and a spatial coordinate registration operation is carried out according to the coincidence characteristics of their main views and reference points, so that the first-class real point cloud of a first-class region essentially corresponds to the first-class thermal points of that region; the temperature value and radiation value of the first-class thermal points are appended to the four-dimensional coordinates of the first-class real point cloud, giving the initial temperature values of the first-class real point cloud. A distance difference is then obtained from the difference between the distance value corresponding to a first-class thermal point and the distance value corresponding to the first-class real point cloud; this difference is substituted into a temperature correction algorithm to obtain a corrected offset value, and the initial temperature values of the first-class real point cloud are calibrated with this offset to obtain more accurate temperature values.
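The temperature correction algorithm itself is not specified here; a minimal sketch, assuming a simple linear offset model with a hypothetical coefficient `k`, might look like:

```python
def calibrate_temperature(initial_temp, thermal_dist, cloud_dist, k=0.05):
    """Illustrative calibration: the distance difference between the thermal
    point and the real point cloud is turned into a corrected offset value.
    The linear model offset = k * (d_thermal - d_cloud) and the coefficient
    k (deg C per metre) are assumptions, not the patent's algorithm."""
    offset = k * (thermal_dist - cloud_dist)
    return initial_temp + offset

# A thermal point measured slightly farther than its point cloud neighbour.
print(round(calibrate_temperature(36.5, 10.2, 10.0), 2))  # 36.51
```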
In this embodiment, based on the temperature values of the first type of thermal points of the first type of regions, an initial temperature value of the first type of real point clouds of the first type of regions is obtained, and according to the difference between the distance value corresponding to the first type of thermal points and the distance value corresponding to the first type of real point clouds, the initial temperature value of the first type of real point clouds is calibrated, so as to obtain a more accurate temperature value of the first type of real point clouds.
In one embodiment, based on the temperature value change condition of the two-class thermodynamic points of the two-class region, the two-class region is filled with point cloud to obtain two-class mixing point cloud, and the specific steps are as follows: dividing the second-class area into a plurality of subareas based on the temperature value change condition of the second-class thermodynamic point of the second-class area; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud; according to the filling density of each subarea, respectively filling the subareas with point cloud to obtain second-class virtual point cloud; and obtaining a second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud.
Gradient identification is performed on the second-class region with an edge-detection image algorithm to obtain the temperature-value variation of its second-class thermal points. Based on this variation, the second-class region is divided into three sub-areas: a gentle area, a convex-concave area, and a gradient area. The filling density of each sub-area is determined according to the distance values corresponding to the second-class real point cloud, so the sub-areas have different densities. A corresponding filling algorithm is selected to fill each sub-area with point cloud at its density; the filling method may be an odd-even scan-conversion algorithm based on a bucket model, which fills in different ways according to the local characteristics of the temperature-value distribution. The second-class virtual point cloud can be obtained by performing continuous, non-repeated region mapping between the second-class virtual point count and the second-class thermal points and then filling proportionally; the second-class mixed point cloud is obtained from the second-class real point cloud and the second-class virtual point cloud.
If the irradiation target is a television, the screen in the middle of the television heats uniformly, and the edge detection image algorithm is used for identifying that the middle area of the television belongs to a gentle area, the edge and background area of the television belongs to a gradient area with abrupt temperature change, and the power indicator area of the television belongs to a convex-concave area with concave temperature in the central area. In the filling process, firstly, demarcating is conducted according to thermodynamic diagram characteristics of the two kinds of areas (the demarcation of the gentle area is a rectangle, the demarcation of the gradient area is a line segment, the demarcation of the convex-concave area is a closed curve), the filling density of each subarea is determined according to the distance value corresponding to the two kinds of real point clouds, and then the filling position is determined according to an algorithm.
The gentle area is delimited according to the overall temperature value and filled uniformly in the plane by an algorithm. The gradient area is delimited according to its boundary and filled with a pole-linear filling algorithm, so that the filled second-class virtual point cloud adheres more closely to the boundary line and its positions better represent the sharpness of the object. The convex-concave area is delimited by a closed curve and filled with a multi-stage gradient terrain filling algorithm, so that the irregular boundary forms contour-line-like rings as on a topographic map, with different spacings between contours and correspondingly different filling densities.
The gentle region filling is mainly used for processing the region with smaller temperature value variation in the second-class region, and the filling is uniformly-spaced in the rectangular region. The implementation steps are as follows: (1) areas which are continuous and have small changes in temperature values (or gray values) are identified from the data of the thermal imager, and these areas can be regarded as flat areas. (2) And in a blank area between radar scanning lines, generating a second-class virtual point cloud according to real point cloud data on adjacent radar scanning lines and thermal point data interpolation of a thermal imager. (3) Linear interpolation, bilinear interpolation, or other suitable interpolation methods may be employed to estimate the second class virtual point cloud data for the blank region.
Assume a two-dimensional image I (x, y), where x and y are pixel coordinates and I (x, y) is the corresponding temperature value. Bilinear interpolation may be used to estimate the second class virtual point cloud locations of the empty regions between radar scan lines. The bilinear interpolation formula is as follows:
I(x', y') = (1 − α)(1 − β)·I(x, y) + α(1 − β)·I(x+1, y) + (1 − α)β·I(x, y+1) + αβ·I(x+1, y+1)
where (x ', y') is a point where interpolation is required, (x, y), (x+1, y), (x, y+1), and (x+1, y+1) are four known pixel points around it, and α= (x '-x) and β= (y' -y) are interpolation parameters.
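A direct transcription of the bilinear interpolation formula above, with illustrative corner temperatures:

```python
def bilinear(I, xq, yq):
    """Bilinear interpolation of temperature image I at fractional (xq, yq):
    the four neighbours (x, y)..(x+1, y+1) are weighted by the interpolation
    parameters a = xq - x and b = yq - y, matching the formula above."""
    x, y = int(xq), int(yq)
    a, b = xq - x, yq - y
    return ((1 - a) * (1 - b) * I[x][y]
            + a * (1 - b) * I[x + 1][y]
            + (1 - a) * b * I[x][y + 1]
            + a * b * I[x + 1][y + 1])

# Temperatures at four grid corners; the midpoint is their average.
I = [[20.0, 22.0],
     [24.0, 26.0]]
print(bilinear(I, 0.5, 0.5))  # 23.0
```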
The convex-concave area filling is used for processing the area with obvious fluctuation change of the temperature value in the closed loop area of the second-class area, and is simply understood as filling of the inner and outer ring change area, and belongs to tight filling at the boundary line. The implementation steps are as follows: (1) edges or feature points with large changes in temperature values (or gray values) are identified from the thermal imager data, which may be convex or concave boundaries. (2) More complex interpolation methods, such as spline interpolation, radial basis function interpolation, are used to better capture and model the shape of the convex-concave region. (3) And generating second-class virtual point clouds in a blank area between radar scanning lines according to the interpolation methods, and ensuring that the second-class virtual point clouds can accurately reflect the fluctuation change of the terrain. Spline interpolation or radial basis function interpolation may be used for convex-concave regions to better capture and simulate the shape of the terrain. For spline interpolation, a suitable spline function (e.g., cubic spline, B-spline) can be selected and the following objective functions are minimized:
E(s) = Σ_i w_i · (s(x_i) − I(x_i))²

where s(x) is the spline function, I(x) is the input temperature value, x_i are the data-point coordinates, and w_i are the weights.
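A weighted least-squares spline of exactly this form can be fitted with SciPy's `UnivariateSpline`, which minimizes Σ w_i (s(x_i) − I(x_i))² subject to a smoothing budget; the temperature profile below is illustrative:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Thermal samples along a scan line crossing a convex bump (illustrative).
x = np.linspace(0.0, 4.0, 9)
temps = 20.0 + 5.0 * np.exp(-(x - 2.0) ** 2)   # peak of 25 deg C at x = 2
w = np.ones_like(x)                             # uniform weights w_i

# UnivariateSpline minimises sum(w_i * (s(x_i) - I(x_i))**2) subject to a
# smoothing budget s; s=0 forces the spline through every sample.
spl = UnivariateSpline(x, temps, w=w, k=3, s=0.0)
print(round(float(spl(2.0)), 2))   # 25.0 (the peak temperature)
```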
The gradient region filling is a filling method based on non-closed-loop linear region gradient information, and the filling of the left and right change regions is simply understood. The implementation steps are as follows: (1) calculating gradient information for each point in the thermal imager data may be accomplished by comparing temperature (or gray scale) differences for adjacent pixels. (2) And in the blank area between the radar scanning lines, deducing gradient distribution of the blank area according to known radar real point cloud data and gradient information of the thermal imager. (3) And generating a second-class virtual point cloud conforming to gradient characteristics according to gradient distribution and a possible surface model (such as a plane, a cone, an ellipsoid and the like). (4) An iterative optimization process may be required to ensure that the generated second class virtual point cloud matches the slope characteristics of the actual terrain.
Assuming that gradient information G(x, y) has been calculated for each point in the thermal imager data, this can be achieved with the Sobel operator, the Prewitt operator, or another edge-detection operator.
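A sketch of this gradient computation using SciPy's Sobel filter on an illustrative thermal image with a single vertical temperature step:

```python
import numpy as np
from scipy import ndimage

# Thermal image with a temperature step between columns 2 and 3 (illustrative).
T = np.array([[20.0, 20.0, 20.0, 30.0, 30.0]] * 4)

gx = ndimage.sobel(T, axis=1)   # temperature gradient along columns
gy = ndimage.sobel(T, axis=0)   # temperature gradient along rows (zero here)
G = np.hypot(gx, gy)            # gradient magnitude G(x, y)

# The step edge carries a large gradient; the flat interior carries none.
print(G[1][1] < G[1][2])        # True
```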
The gradient distribution of the blank area can be deduced according to the known radar real point cloud data and gradient information of the thermal imager. This can be achieved by constructing an optimization problem, for example:
min_s Σ_i w_i · (G_s(x_i, y_i) − G(x_i, y_i))² + λ · ‖∇s‖²    s.t.  s(x_j, y_j) = z_radar(x_j, y_j) for (x_j, y_j) ∈ radar points
where s(x, y) is the surface height function to be estimated, z_radar(x, y) is the height data of the radar real point cloud, G_s(x, y) is the gradient computed from s(x, y), G(x, y) is the gradient data of the thermal imager, w_i are optional weights for emphasizing the importance of certain points, λ is the regularization parameter, ∇s is the gradient of s(x, y), and "radar points" is the coordinate set of the radar real point cloud.
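A one-dimensional sketch of this optimization: unknown heights along a scan line are recovered from thermal gradient observations, with one radar real point fixing the absolute height (here enforced softly via a heavily weighted row rather than a hard `s.t.` constraint); all data are illustrative:

```python
import numpy as np

# Unknown heights s_0..s_4 along a scan line; thermal-gradient observations
# g_i ~ s_{i+1} - s_i; one radar real point constrains s_0 (illustrative).
n = 5
g = np.array([1.0, 1.0, 0.0, -1.0])     # observed gradients between samples
anchor_idx, anchor_z = 0, 10.0          # radar real point height constraint

# Least-squares system: one row per gradient residual, plus a heavily
# weighted row standing in for the hard radar constraint.
A = np.zeros((n - 1 + 1, n))
b = np.zeros(n - 1 + 1)
for i in range(n - 1):
    A[i, i], A[i, i + 1] = -1.0, 1.0
    b[i] = g[i]
A[-1, anchor_idx] = 1e6                 # constraint weight
b[-1] = 1e6 * anchor_z

s = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.round(s, 3))                   # [10. 11. 12. 12. 11.]
```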
In this embodiment, based on the temperature value change condition of the two-class thermodynamic points of the two-class region, the two-class region is subjected to point cloud filling to obtain two-class mixing point cloud.
In one embodiment, the second-class region is divided into a plurality of sub-areas based on the temperature-value variation of its second-class thermal points, with the following specific steps: a second-class area whose second-class thermal points show small, uniformly distributed temperature variation within a rectangular range is divided into a gentle area to be filled; a second-class area whose second-class thermal points show a temperature distribution with an abrupt change in a single direction is divided into a gradient area to be filled; and a convex or concave second-class area whose second-class thermal points show abrupt temperature changes in multiple directions is divided into a convex-concave area to be filled.
Gradient identification is performed on the second-class region with an edge-detection image algorithm to obtain the temperature-value variation of its second-class thermal points. A second-class area whose thermal points have continuous temperature values with variation below a threshold is divided into a gentle area; a second-class area whose thermal points show an abrupt temperature change is divided into a gradient area; and a convex or concave second-class area whose thermal points show temperature variation above the threshold is divided into a convex-concave area.
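The three-way division can be sketched as a simple rule on a one-dimensional temperature profile; the threshold `flat_tol` and the profiles are hypothetical:

```python
import numpy as np

def classify_subregion(temps, flat_tol=0.5):
    """Illustrative split of a second-class region by its thermal profile:
    'gentle' if the temperature is nearly uniform, 'gradient' if it changes
    monotonically in one direction, 'convex-concave' otherwise.
    flat_tol is a hypothetical uniformity threshold in deg C."""
    d = np.diff(np.asarray(temps, dtype=float))
    if np.all(np.abs(d) < flat_tol):
        return "gentle"
    if np.all(d >= 0) or np.all(d <= 0):
        return "gradient"
    return "convex-concave"

print(classify_subregion([20.0, 20.1, 20.0, 20.2]))   # gentle
print(classify_subregion([20.0, 22.0, 24.0, 26.0]))   # gradient
print(classify_subregion([20.0, 25.0, 20.0, 25.0]))   # convex-concave
```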
In this embodiment, the second-class region is divided into a plurality of sub-regions according to the temperature value change condition of the second-class thermal point based on the second-class region.
In one embodiment, the multi-scale mapping is performed on the point cloud to be mapped and the second class of thermodynamic points to obtain the temperature calibration value of the second class of thermodynamic points, and the specific steps are as follows: for each two-class thermodynamic point, acquiring an initial mapping area of the two-class thermodynamic point; the initial mapping area of the two-class thermodynamic point is an area where the corresponding two-class virtual point cloud is located based on each two-class thermodynamic point when the two-class area is filled with point clouds; according to different scales, carrying out region division on the initial mapping region to obtain the mapping region of each two classes of thermodynamic points under each scale; mapping each two types of thermodynamic points and point clouds to be mapped in mapping areas under different corresponding scales; and aiming at each two-class thermodynamic point, calibrating the temperature value of the two-class thermodynamic point based on the distance and the temperature value corresponding to the point cloud with the mapping of the two-class thermodynamic point, and obtaining the temperature calibration value of the two-class thermodynamic point.
For each second-class thermal point, its initial mapping area is acquired; the initial mapping area is the area occupied by the second-class virtual point cloud generated for that thermal point when the second-class region was filled. For each scale, a window of the corresponding size is obtained; different window sizes are designed according to the random sample consensus (RANdom SAmple Consensus, RANSAC) algorithm, and the sliding range of the window is restricted to follow the window boundary while the central area is excluded from sliding. The point cloud is divided by an octree-based multi-scale algorithm and organized into detail areas of different levels, and the mapping areas at each scale are obtained by sliding the window over the point cloud to be mapped. Each second-class thermal point is mapped to the point cloud within its mapping areas at the different scales, and an average is taken; for each second-class thermal point, reverse projection or reverse mapping is performed through the hierarchy information of the octree, based on the distances and temperature values of the point cloud mapped to that thermal point, to calibrate its temperature value and obtain its temperature calibration value.
For example, 9 second-class virtual point clouds in a partial area are screened out of 16 second-class virtual point clouds, 7 surrounding second-class real point clouds are added, and the set of 16 second-class point clouds is recombined; the temperature values of the 4 second-class thermal points are then mapped based on the distances and temperature values of the 16 second-class point clouds that have a mapping relationship with them. The temperature values of these 4 second-class thermal points are more accurate than the original values.
Suppose the ratio of thermal points in a tunnel thermodynamic diagram to real point clouds in a tunnel point cloud map is 500:300, and 100 points are in one-to-one correspondence between first-class real point clouds and first-class thermal points; these hundred real point clouds and temperature values need no filling or multi-scale mapping. The ratio of second-class thermal points to second-class real point clouds is therefore 400:200. After 1400 second-class virtual point clouds are filled in, the ratio of second-class thermal points to second-class mixed point clouds becomes 400:1600. Relying on the 300 real point clouds (100 of which are the first-class real point clouds in one-to-one correspondence), the 300 real point cloud data are fused into the 1300 second-class virtual point clouds, and region division and multi-scale mapping yield 400 more accurate temperature values, finally producing a tunnel fusion map of 1700 point clouds with temperature data, which is a 6D data map.
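The point-count bookkeeping in this example can be checked directly:

```python
# Point-count bookkeeping from the ratio example above (500 thermal points,
# 300 real points, 100 of them in one-to-one first-class correspondence).
thermal_total, real_total = 500, 300
first_class = 100                              # matched one-to-one, no filling

second_thermal = thermal_total - first_class   # second-class thermal points
second_real = real_total - first_class         # second-class real points
virtual_filled = 1400                          # filled virtual points
second_mixed = second_real + virtual_filled    # second-class mixed point cloud

fused_total = first_class + second_mixed       # points in the fusion map
print(second_thermal, second_mixed, fused_total)  # 400 1600 1700
```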
In this embodiment, the point cloud to be mapped and the second class of thermal points are mapped in a multi-scale manner, so as to obtain the temperature calibration value of the second class of thermal points.
In one embodiment, the initial mapping area is divided into areas according to different scales to obtain the mapping area of each two types of thermodynamic points under each scale, and the specific steps are as follows: sliding along four sides of the initial mapping area based on windows with different scales to obtain mapping areas with different scales; the mapping region comprises a part of two-class virtual point clouds of the initial mapping region and real point clouds around the initial mapping region.
For each initial mapping area, rectangular windows of different scales can be designed according to the random sample consensus algorithm. Sliding is performed along the four sides of the initial mapping area with these windows; the sliding range is restricted to follow the window boundary, and the central area is excluded from sliding. During sliding the window may cover some real point cloud information, and the point cloud is divided according to the octree multi-scale algorithm to obtain the mapping area at each scale. The mapping area comprises part of the second-class virtual point clouds of the initial mapping area and the real point clouds around it.
In this embodiment, according to different scales, the point cloud to be mapped is divided into regions, so as to obtain the mapping region under each scale.
In one embodiment, based on a multi-frame tunnel fusion map, a temperature detection result of the tunnel device is obtained, and the specific steps are as follows: acquiring a device temperature detection result of each frame of tunnel fusion map; splicing each frame of tunnel fusion map to obtain a tunnel line map; determining a tunnel fusion map of adjacent frames in the tunnel line map; and fusing the device temperature detection results of the tunnel fusion graphs of the adjacent frames to obtain the temperature detection results of the tunnel devices.
The device temperature detection result of each frame of tunnel fusion map is acquired. Positioning information for each frame is obtained by combining inertial navigation and beacon positioning with the speed of the inspection vehicle; the data sources are corrected and calibrated, the three-dimensional coordinates of the tunnel fusion maps are registered into the same coordinate system for spatial consistency, and the frames are spliced to obtain the tunnel line map. Data are merged in time order with a maximum-likelihood-estimation matching algorithm; noise and the overlapping areas of the tunnel line map are removed, and fitting reduces distortion so that the splicing is accurate. The temperature data of three consecutive tunnel fusion maps (before, during, and after the splice) are matched, and the point clouds of the overlapping area are averaged and unified; the non-overlapping part is matched directly from the original tunnel fusion map data to obtain the temperature detection result of the tunnel device.
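The averaging of co-located points in the overlap area might be sketched as follows; the grid-bucket matching and the cell size are assumptions for illustration, not the patent's matching algorithm:

```python
import numpy as np

def merge_overlap(cloud_a, cloud_b, grid=0.1):
    """Illustrative overlap fusion for stitched frames: points from two
    adjacent fusion maps are bucketed on a coarse grid and the temperatures
    of co-located points are averaged (grid is a hypothetical cell size)."""
    buckets = {}
    for x, y, t in np.vstack([cloud_a, cloud_b]):
        key = (round(x / grid), round(y / grid))
        buckets.setdefault(key, []).append(t)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

a = np.array([[1.00, 2.00, 30.0]])      # x, y, temperature
b = np.array([[1.02, 2.01, 32.0]])      # same physical point, next frame
merged = merge_overlap(a, b)
print(list(merged.values()))            # [31.0]
```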
In this embodiment, a temperature detection result of the tunnel device is obtained according to the multi-frame tunnel fusion map.
In one embodiment, the device temperature detection result of each frame of tunnel fusion map is obtained, and the specific steps are as follows: dividing the areas of each frame of tunnel fusion map according to the echo intensity of the point cloud in each frame of tunnel fusion map and the change condition of the temperature value, and obtaining each area; determining equipment corresponding to each area; positioning the mark surface of the equipment to obtain point sets and particle points of different surfaces of the equipment; and obtaining poles of the equipment according to the point sets and the mass points of different surfaces of the equipment, and performing temperature marking operation to obtain an equipment temperature detection result of each frame of tunnel fusion map.
According to the echo intensity and temperature-value variation of the point cloud in each frame of tunnel fusion map, contour detection of the temperature field, abrupt-change detection of the point cloud data, point cloud filtering, and clustering can be combined, and each frame of tunnel fusion map is finally divided into regions by a Boundary Sensitive Network (BSN) algorithm; the divided regions may overlap in category. The device corresponding to each region is determined, and targeted detection and temperature measurement are performed on the tunnel devices in each region. For example, to detect a cable, a cylinder detection algorithm is applied, the data are filtered and normalized and then input into a voxel network (VoxelNet) model for detection and identification, after which the corresponding temperature values are matched to the cable according to the result. Marked-surface positioning is performed on the cable to obtain the point sets and mass points of its different surfaces; the poles of the cable (which may be treated as a cuboid) are then derived from these point sets and mass points, and temperature and feature marking is performed to obtain the cable temperature detection result of each frame of tunnel fusion map. In this embodiment, each frame of tunnel fusion map is divided into regions according to the echo intensity and temperature-value variation of its point cloud; the marked surfaces of the device are located to obtain the poles of the device, and the temperature marking operation yields the device temperature detection result of each frame of tunnel fusion map.
The marking operation records the point set and the particle information of the target equipment, so that the size, the volume, the reflectivity and the temperature information of the target equipment can be conveniently obtained in use, and the computational waste of repeated detection and repeated identification is reduced.
In order to better understand the above method, an application embodiment of the tunnel device temperature detection method of the present application is described in detail below, as shown in fig. 5.
The multi-frame tunnel point cloud maps acquired by the radar and the multi-frame tunnel thermodynamic diagrams acquired by the thermal imager are obtained. Extrinsic calibration is performed on the radar and the thermal imager, and spatial coordinate registration is carried out according to the coincidence characteristics of their main views and reference points, so that the first-class real point cloud of a first-class region essentially corresponds to the first-class thermal points of the same region. For the first-class region in each group, the temperature values of the first-class thermal points are appended to the four-dimensional coordinates of the first-class real point cloud, giving the temperature values of the first-class real point cloud. For the second-class region in each group, the second-class region is divided into a gentle area, a gradient area, and a convex-concave area based on the temperature-value variation of its second-class thermal points; the filling density of each area is determined according to the distance values corresponding to the second-class real point cloud, and the second-class region is filled with point cloud at these densities to obtain the second-class mixed point cloud.
The point clouds to be mapped are obtained from the class-two mixed point clouds and the class-one real point clouds of the class-one regions in the same group. Multi-scale mapping is performed between the point clouds to be mapped and the class-two thermodynamic points to obtain temperature calibration values for the class-two thermodynamic points, and the class-two thermodynamic points are mapped to the class-two mixed point clouds, so that the temperature values of the class-two mixed point clouds are obtained from those calibration values. Each frame of tunnel fusion map is then divided into regions according to the echo intensity and temperature variation of its point cloud. The equipment corresponding to each region is determined; the marking surfaces of the equipment are positioned to obtain the point sets and mass points of its different surfaces; the poles of the equipment are obtained from these point sets and mass points, and a temperature marking operation is performed to obtain the equipment temperature detection result of each frame of tunnel fusion map. The frames of the tunnel fusion map are spliced to obtain a tunnel line graph; the tunnel fusion maps of adjacent frames in the tunnel line graph are determined, and their equipment temperature detection results are fused to obtain the temperature detection results of the tunnel equipment. A device of interest is determined based on detection information input by the user, and its temperature detection result is fed back to the user based on the tunnel line graph.
In this tunnel equipment temperature detection method, when the tunnel point cloud map and the tunnel thermodynamic diagram of the same frame are fused, each class-two region in each group is filled with point clouds based on the temperature variation of its class-two thermodynamic points to obtain class-two mixed point clouds, and the point clouds to be mapped are obtained from the class-two mixed point clouds and the class-one real point clouds of the class-one regions in the same group. Multi-scale mapping between the point clouds to be mapped and the class-two thermodynamic points yields temperature calibration values for the class-two thermodynamic points, from which the temperature values of the class-two mixed point clouds are obtained. This enriches the point cloud data of the tunnel fusion map, reduces the influence of radar blind areas on the accuracy of the temperature detection result, and thereby improves that accuracy.
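The region classification and fill-density choice described above can be illustrated with a small sketch: a class-two region is bucketed as gentle, gradient or convex-concave from the spread and monotonicity of its thermodynamic-point temperatures, and the point-cloud filling density is scaled down with sensor distance. The thresholds, the classification test and the density formula are all invented for illustration; the patent does not specify them.

```python
def classify_region(temps):
    """Bucket a region by how strongly its temperatures vary.
    Assumed rule: small spread -> gentle; monotone rise/fall -> gradient;
    otherwise convex-concave."""
    spread = max(temps) - min(temps)
    if spread < 1.0:
        return "gentle"
    diffs = [b - a for a, b in zip(temps, temps[1:])]
    if all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs):
        return "gradient"
    return "convex-concave"

# Illustrative base densities (points per unit area) per area type.
BASE_DENSITY = {"gentle": 10, "gradient": 40, "convex-concave": 80}

def fill_density(temps, distance_m):
    """Richer filling for complex temperature fields, sparser far away."""
    return BASE_DENSITY[classify_region(temps)] / max(distance_m, 1.0)

print(classify_region([20.1, 20.2, 20.0]))   # gentle
print(classify_region([20.0, 24.0, 29.0]))   # gradient
print(fill_density([20.0, 24.0, 29.0], 2.0)) # 20.0
```

The point of scaling by distance is that far-away radar returns are sparse to begin with, so filling there at full density would invent more structure than the sensor actually observed.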
It should be understood that, although the steps in the flowcharts of the embodiments described above are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be executed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and not necessarily sequentially but possibly in turn or alternately with other steps or with sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application also provides a tunnel equipment temperature detection device for implementing the above tunnel equipment temperature detection method. The solution implemented by the device is similar to that described in the method above, so for specific limitations in the embodiments of the tunnel equipment temperature detection device provided below, reference may be made to the limitations of the tunnel equipment temperature detection method above, which are not repeated here.
In an exemplary embodiment, as shown in fig. 6, there is provided a tunnel equipment temperature detection apparatus, wherein:
the to-be-spliced graph screening module 601 is configured to screen a plurality of to-be-spliced graphs from the multi-frame tunnel fusion graph based on the device acquisition position corresponding to the tunnel fusion graph; the tunnel fusion map is obtained by carrying out fusion processing on a tunnel point cloud map and a tunnel thermodynamic diagram of the same frame;
the transformation matrix acquisition module 602 is configured to match the characteristic points of the tunnel planar target in each graph to be spliced and match the characteristic points of the tunnel linear target in each graph to be spliced, so as to obtain a transformation matrix between the graphs to be spliced;
a mosaic acquisition module 603, configured to splice a plurality of graphs to be spliced based on the transformation matrix to obtain a mosaic;
the stitching graph fusion module 604, configured to fuse the point cloud overlapping areas in the mosaic to obtain a tunnel line graph;
and the temperature obtaining module 605 is configured to obtain a tunnel device temperature detection result based on the tunnel line graph.
In one embodiment, the apparatus further comprises an acquisition position acquisition module configured to: determine a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology; and correct and optimize the preliminary equipment acquisition position corresponding to the tunnel fusion map according to a beacon positioning technology, in combination with the speed of the inspection vehicle, to obtain the equipment acquisition position corresponding to the tunnel fusion map.
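As a rough illustration of this correction, the sketch below dead-reckons the vehicle's chainage by integrating inertial speed (accumulating a small drift) and snaps the estimate back at a beacon of known chainage, carrying it forward with the vehicle speed. The drift model, beacon layout and all numbers are assumptions for illustration, not the patent's method.

```python
def dead_reckon(start_m, speed_mps, dt_s, drift_per_step=0.02):
    """Inertial estimate: integrate speed once per second,
    accumulating a small per-step bias (the drift)."""
    pos = start_m
    for _ in range(int(dt_s)):
        pos += speed_mps + drift_per_step
    return pos

def beacon_correct(est_m, beacon_m, speed_mps, since_beacon_s):
    """Discard the drifted estimate: beacon chainage plus
    speed times the time elapsed since passing the beacon."""
    return beacon_m + speed_mps * since_beacon_s

raw = dead_reckon(0.0, speed_mps=2.0, dt_s=100)   # ~202 m, includes drift
fixed = beacon_correct(raw, beacon_m=198.0, speed_mps=2.0, since_beacon_s=1)
```

Without beacons the inertial error grows without bound along the tunnel; each beacon resets it to the beacon's surveyed position, which is why the two technologies are combined.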
In one embodiment, the to-be-spliced graph screening module 601 is further configured to: obtain a screening interval; and screen a plurality of graphs to be spliced from the multi-frame tunnel fusion maps based on the equipment acquisition positions corresponding to the tunnel fusion maps and the screening interval.
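A minimal sketch of this screening step: walk the frames in acquisition order and keep one whenever the vehicle has advanced at least the screening interval since the last kept frame, so the graphs to be spliced cover the tunnel without being redundant. The interval value and data layout are illustrative assumptions.

```python
def screen_frames(positions_m, interval_m):
    """Return indices of frames whose acquisition positions are spaced
    at least interval_m apart; the first frame is always kept."""
    kept = [0]
    for i, pos in enumerate(positions_m[1:], start=1):
        if pos - positions_m[kept[-1]] >= interval_m:
            kept.append(i)
    return kept

positions = [0.0, 0.4, 0.9, 1.6, 2.1, 3.0, 3.2]
print(screen_frames(positions, interval_m=1.0))  # [0, 3, 5]
```

A smaller interval yields more overlap between adjacent graphs (easier feature matching, more computation); a larger one does the opposite, which is the trade-off the screening interval controls.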
In one embodiment, the transformation matrix acquisition module 602 is further configured to: extract tunnel planar target feature points and tunnel linear target feature points from the graphs to be spliced; match the feature points of the tunnel planar targets in the graphs to be spliced, and match the feature points of the tunnel linear targets in the graphs to be spliced, to obtain a preliminary transformation matrix between the graphs to be spliced; and obtain a transformation matrix between the graphs to be spliced according to the preliminary transformation matrix.
In one embodiment, the transformation matrix acquisition module 602 is further configured to: match the feature points of the tunnel planar targets in each graph to be spliced to obtain a transformation matrix to be optimized; and optimize the transformation matrix to be optimized according to the feature points of the tunnel linear targets in the graphs to be spliced, to obtain a preliminary transformation matrix between the graphs to be spliced.
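A toy sketch of this coarse-then-refine structure, reduced to a 2D translation for brevity: planar-target feature matches give a coarse offset (the "transformation matrix to be optimized"), and linear-target matches then refine it. A real implementation would estimate a full homography, typically with RANSAC; everything here, including the blending weight, is an illustrative assumption.

```python
def mean_offset(matches):
    """Average (dx, dy) over matched feature-point pairs ((x, y), (x', y'))."""
    n = len(matches)
    dx = sum(b[0] - a[0] for a, b in matches) / n
    dy = sum(b[1] - a[1] for a, b in matches) / n
    return dx, dy

def estimate_transform(planar_matches, linear_matches, w=0.5):
    """Coarse offset from planar targets, refined by linear targets
    via a simple weighted blend (w is an assumed tuning weight)."""
    cx, cy = mean_offset(planar_matches)   # transform to be optimized
    lx, ly = mean_offset(linear_matches)   # refinement from linear targets
    return ((1 - w) * cx + w * lx, (1 - w) * cy + w * ly)

planar = [((0, 0), (10, 1)), ((5, 5), (15, 6))]
linear = [((2, 0), (12, 0)), ((2, 9), (12, 10))]
print(estimate_transform(planar, linear))  # (10.0, 0.75)
```

The design rationale mirrors the claim: planar targets (signs, panels) are plentiful but locally ambiguous, while linear targets (cables, rails) constrain the transform along the tunnel axis, so using both stages stabilizes the estimate.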
In one embodiment, the stitching graph fusion module 604 is further configured to: determine the point cloud overlapping area of consecutive multi-frame mosaics; acquire the temperature data of the point cloud overlapping area in the consecutive multi-frame mosaics; and average the temperature data of the point cloud overlapping area in the consecutive multi-frame mosaics to obtain the tunnel line graph.
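The overlap fusion just described can be sketched minimally: for each point appearing in the overlap of consecutive stitched frames, average its per-frame temperature readings to get the value used in the tunnel line graph. The dict-keyed-by-point data layout is an assumption for illustration.

```python
def fuse_overlap(frames):
    """frames: list of {point_id: temperature} mappings, one per stitched
    frame; points seen in several frames get the mean of their readings."""
    fused = {}
    for frame in frames:
        for pid, t in frame.items():
            fused.setdefault(pid, []).append(t)
    return {pid: sum(ts) / len(ts) for pid, ts in fused.items()}

frames = [
    {"p1": 20.0, "p2": 22.0},
    {"p2": 24.0, "p3": 30.0},   # p2 lies in the overlap of both frames
]
print(fuse_overlap(frames))  # {'p1': 20.0, 'p2': 23.0, 'p3': 30.0}
```

Averaging the overlap suppresses per-frame measurement noise at exactly the points where two independent readings exist, which is why the seams of the tunnel line graph end up smoother than either source frame.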
In one embodiment, the apparatus further comprises a feedback module configured to: determine a device of interest based on detection information input by the user; and feed back the temperature detection result of the device of interest to the user based on the tunnel line graph.
The modules in the above tunnel equipment temperature detection device may be implemented in whole or in part by software, hardware or a combination thereof. The above modules may be embedded in the processor of the computer device in hardware form or be independent of it, or may be stored in the memory of the computer device in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one exemplary embodiment, a computer device is provided, which may be a server whose internal structure may be as shown in fig. 7. The computer device includes a processor, a memory, an input/output (I/O) interface and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store data of the tunnel equipment temperature detection method. The input/output interface of the computer device is used to exchange information between the processor and external devices. The communication interface of the computer device is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements a tunnel equipment temperature detection method.
It will be appreciated by those skilled in the art that the structure shown in fig. 7 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or fully authorized by all parties, and the collection, use and processing of the related data must comply with the relevant regulations.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum computing-based data processing logic units, etc., without being limited thereto.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered within the scope of this description.
The above embodiments represent only a few implementations of the present application; their descriptions are specific and detailed but are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art may make various modifications and improvements without departing from the concept of the present application, all of which fall within its scope of protection. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (10)

1. A method for detecting a temperature of a tunnel device, the method comprising:
screening a plurality of images to be spliced from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images; the tunnel fusion map is obtained by carrying out fusion processing on a tunnel point cloud map and a tunnel thermodynamic diagram of the same frame;
matching the characteristic points of the tunnel planar targets in the images to be spliced, and matching the characteristic points of the tunnel linear targets in the images to be spliced to obtain a transformation matrix between the images to be spliced;
based on the transformation matrix, splicing a plurality of images to be spliced to obtain a spliced image;
fusing the point cloud overlapping areas in the spliced graph to obtain a tunnel line graph;
and obtaining a tunnel equipment temperature detection result based on the tunnel line graph.
2. The method of claim 1, wherein before screening a plurality of graphs to be spliced in the multi-frame tunnel fusion graph based on the device acquisition position corresponding to the tunnel fusion graph, the method further comprises:
determining a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology;
and correcting and optimizing the preliminary equipment acquisition position corresponding to the tunnel fusion map according to the beacon positioning technology, in combination with the speed of the inspection vehicle, to obtain the equipment acquisition position corresponding to the tunnel fusion map.
3. The method of claim 1, wherein the selecting a plurality of graphs to be spliced in the multi-frame tunnel fusion graph based on the device acquisition location corresponding to the tunnel fusion graph comprises:
obtaining a screening interval;
and screening a plurality of pictures to be spliced from the multi-frame tunnel fusion pictures based on the equipment acquisition positions and the screening intervals corresponding to the tunnel fusion pictures.
4. The method of claim 1, wherein the matching the characteristic points of the tunnel planar object in each graph to be spliced and the matching the characteristic points of the tunnel linear object in each graph to be spliced to obtain the transformation matrix between the graphs to be spliced comprises:
extracting tunnel planar target feature points and tunnel linear target feature points from the images to be spliced;
matching the characteristic points of the tunnel planar targets on the pictures to be spliced, and matching the characteristic points of the tunnel linear targets on the pictures to be spliced to obtain a preliminary transformation matrix between the pictures to be spliced;
and obtaining a transformation matrix between the pictures to be spliced according to the preliminary transformation matrix.
5. The method of claim 4, wherein the matching the characteristic points of the tunnel planar targets in each image to be spliced and the matching the characteristic points of the tunnel linear targets in each image to be spliced to obtain the preliminary transformation matrix between the images to be spliced comprises:
matching the characteristic points of the tunnel planar targets in each image to be spliced to obtain a transformation matrix to be optimized;
and optimizing the transformation matrix to be optimized according to the characteristic points of the tunnel linear targets in the images to be spliced to obtain a preliminary transformation matrix among the images to be spliced.
6. The method of claim 1, wherein the fusing the point cloud overlapping areas in the spliced image to obtain the tunnel line graph comprises:
determining a point cloud overlapping area of the continuous multi-frame mosaic;
acquiring temperature data of the point cloud overlapping area in a continuous multi-frame splice graph;
and carrying out averaging on the temperature data of the point cloud overlapping region in the continuous multi-frame spliced graph to obtain the tunnel line graph.
7. The method according to claim 1, wherein after obtaining the tunnel equipment temperature detection result based on the tunnel line graph, the method further comprises:
determining a device of interest based on the detection information input by the user;
and feeding back a temperature detection result of the interested device to the user based on the tunnel line graph.
8. A tunnel equipment temperature detection apparatus, the apparatus comprising:
the to-be-spliced image screening module is used for screening a plurality of to-be-spliced images from the multi-frame tunnel fusion images based on the equipment acquisition positions corresponding to the tunnel fusion images; the tunnel fusion map is obtained by carrying out fusion processing on a tunnel point cloud map and a tunnel thermodynamic diagram of the same frame;
the transformation matrix acquisition module, used for matching the characteristic points of the tunnel planar targets in the images to be spliced and matching the characteristic points of the tunnel linear targets in the images to be spliced, to obtain a transformation matrix between the images to be spliced;
the splicing diagram acquisition module is used for splicing a plurality of diagrams to be spliced based on the transformation matrix to obtain splicing diagrams;
the splicing diagram fusion module is used for fusing the point cloud overlapping areas in the splicing diagram to obtain a tunnel line diagram;
and the temperature acquisition module, used for obtaining a tunnel equipment temperature detection result based on the tunnel line graph.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202410191026.2A 2024-02-21 Tunnel equipment temperature detection method, device, computer equipment and storage medium Active CN117745537B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410191026.2A CN117745537B (en) 2024-02-21 Tunnel equipment temperature detection method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410191026.2A CN117745537B (en) 2024-02-21 Tunnel equipment temperature detection method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117745537A true CN117745537A (en) 2024-03-22
CN117745537B CN117745537B (en) 2024-05-17


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117740186A (en) * 2024-02-21 2024-03-22 微牌科技(浙江)有限公司 Tunnel equipment temperature detection method and device and computer equipment
CN117740186B (en) * 2024-02-21 2024-05-10 微牌科技(浙江)有限公司 Tunnel equipment temperature detection method and device and computer equipment

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105550995A (en) * 2016-01-27 2016-05-04 武汉武大卓越科技有限责任公司 Tunnel image splicing method and system
CN108109112A (en) * 2018-01-16 2018-06-01 上海同岩土木工程科技股份有限公司 A kind of tunnel spread figure splicing parameter processing method based on Sift features
CN111627007A (en) * 2020-05-27 2020-09-04 电子科技大学 Spacecraft defect detection method based on self-optimization matching network image stitching
CN112308776A (en) * 2020-09-30 2021-02-02 香港理工大学深圳研究院 Method for solving occlusion and error mapping image sequence and point cloud data fusion
CN112767248A (en) * 2021-01-13 2021-05-07 深圳瀚维智能医疗科技有限公司 Infrared camera picture splicing method, device and equipment and readable storage medium
CN113112403A (en) * 2021-03-31 2021-07-13 国网山东省电力公司枣庄供电公司 Infrared image splicing method, system, medium and electronic equipment
WO2021184302A1 (en) * 2020-03-19 2021-09-23 深圳市大疆创新科技有限公司 Image processing method and apparatus, imaging device, movable carrier, and storage medium
CN113808098A (en) * 2021-09-14 2021-12-17 丰图科技(深圳)有限公司 Road disease identification method and device, electronic equipment and readable storage medium
WO2022095596A1 (en) * 2020-11-09 2022-05-12 Oppo广东移动通信有限公司 Image alignment method, image alignment apparatus and terminal device
CN115856829A (en) * 2023-02-06 2023-03-28 山东矩阵软件工程股份有限公司 Image data identification method and system for radar three-dimensional data conversion
CN116466350A (en) * 2023-04-23 2023-07-21 凌云光技术股份有限公司 Tunnel obstacle detection method and device
CN116579923A (en) * 2023-04-28 2023-08-11 广州海格通信集团股份有限公司 Image stitching method, device and storage medium
CN116958099A (en) * 2023-07-27 2023-10-27 微牌科技(浙江)有限公司 Cable abrasion detection method, system, device and computer equipment
CN117350919A (en) * 2022-06-28 2024-01-05 中兴通讯股份有限公司 Image fusion method, device and storage medium
CN117579753A (en) * 2024-01-16 2024-02-20 思看科技(杭州)股份有限公司 Three-dimensional scanning method, three-dimensional scanning device, computer equipment and storage medium

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105550995A (en) * 2016-01-27 2016-05-04 武汉武大卓越科技有限责任公司 Tunnel image splicing method and system
CN108109112A (en) * 2018-01-16 2018-06-01 上海同岩土木工程科技股份有限公司 A kind of tunnel spread figure splicing parameter processing method based on Sift features
WO2021184302A1 (en) * 2020-03-19 2021-09-23 深圳市大疆创新科技有限公司 Image processing method and apparatus, imaging device, movable carrier, and storage medium
CN111627007A (en) * 2020-05-27 2020-09-04 电子科技大学 Spacecraft defect detection method based on self-optimization matching network image stitching
CN112308776A (en) * 2020-09-30 2021-02-02 香港理工大学深圳研究院 Method for solving occlusion and error mapping image sequence and point cloud data fusion
WO2022095596A1 (en) * 2020-11-09 2022-05-12 Oppo广东移动通信有限公司 Image alignment method, image alignment apparatus and terminal device
CN112767248A (en) * 2021-01-13 2021-05-07 深圳瀚维智能医疗科技有限公司 Infrared camera picture splicing method, device and equipment and readable storage medium
CN113112403A (en) * 2021-03-31 2021-07-13 国网山东省电力公司枣庄供电公司 Infrared image splicing method, system, medium and electronic equipment
CN113808098A (en) * 2021-09-14 2021-12-17 丰图科技(深圳)有限公司 Road disease identification method and device, electronic equipment and readable storage medium
CN117350919A (en) * 2022-06-28 2024-01-05 中兴通讯股份有限公司 Image fusion method, device and storage medium
CN115856829A (en) * 2023-02-06 2023-03-28 山东矩阵软件工程股份有限公司 Image data identification method and system for radar three-dimensional data conversion
CN116466350A (en) * 2023-04-23 2023-07-21 凌云光技术股份有限公司 Tunnel obstacle detection method and device
CN116579923A (en) * 2023-04-28 2023-08-11 广州海格通信集团股份有限公司 Image stitching method, device and storage medium
CN116958099A (en) * 2023-07-27 2023-10-27 微牌科技(浙江)有限公司 Cable abrasion detection method, system, device and computer equipment
CN117579753A (en) * 2024-01-16 2024-02-20 思看科技(杭州)股份有限公司 Three-dimensional scanning method, three-dimensional scanning device, computer equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Leng Pengbo: "Tunnel image stitching method based on geometric calibration and feature detection", China Master's Theses Electronic Journals, 15 January 2018 (2018-01-15) *
Wang Zhaoyuan; Cao Min; Wang Yi; Wu Weidi; Li Taosheng: "Scene- and data-driven tunnel image stitching method", Journal of Hubei University of Technology, no. 04, 15 August 2020 (2020-08-15) *


Similar Documents

Publication Publication Date Title
Rupnik et al. MicMac–a free, open-source solution for photogrammetry
Wu et al. Integration of aerial oblique imagery and terrestrial imagery for optimized 3D modeling in urban areas
CN108369743B (en) Mapping a space using a multi-directional camera
US9483703B2 (en) Online coupled camera pose estimation and dense reconstruction from video
JP6057298B2 (en) Rapid 3D modeling
JP4685313B2 (en) Method for processing passive volumetric image of any aspect
CN110927708B (en) Calibration method, device and equipment of intelligent road side unit
US10636168B2 (en) Image processing apparatus, method, and program
US20100020074A1 (en) Method and apparatus for detecting objects from terrestrial based mobile mapping data
US20110110557A1 (en) Geo-locating an Object from Images or Videos
US20170352163A1 (en) Method and system for determining cells traversed by a measuring or visualization axis
Zhang et al. Leveraging vision reconstruction pipelines for satellite imagery
US8264537B2 (en) Photogrammetric networks for positional accuracy
KR20210119417A (en) Depth estimation
CN112862890B (en) Road gradient prediction method, device and storage medium
CN112396640A (en) Image registration method and device, electronic equipment and storage medium
CN112233246B (en) Satellite image dense matching method and system based on SRTM constraint
US20210264196A1 (en) Method, recording medium and system for processing at least one image, and vehicle including the system
US20170278268A1 (en) Systems, Methods, and Devices for Generating Three-Dimensional Models
CN114898044B (en) Imaging method, device, equipment and medium for detection object
Ren et al. Automated SAR reference image preparation for navigation
Yoo et al. True orthoimage generation by mutual recovery of occlusion areas
CN117740186B (en) Tunnel equipment temperature detection method and device and computer equipment
CN117745537B (en) Tunnel equipment temperature detection method, device, computer equipment and storage medium
CN107765257A (en) A kind of laser acquisition and measuring method based on the calibration of reflected intensity accessory external

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant