CN117740186B - Tunnel equipment temperature detection method, device, and computer equipment - Google Patents

Tunnel equipment temperature detection method, device, and computer equipment

Info

Publication number
CN117740186B
CN117740186B · CN202410191391.3A
Authority
CN
China
Prior art keywords
class
tunnel
point
point cloud
temperature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410191391.3A
Other languages
Chinese (zh)
Other versions
CN117740186A
Inventor
穆阳
张文祥
梁耀聪
刘志州
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microbrand Technology Zhejiang Co ltd
Original Assignee
Microbrand Technology Zhejiang Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microbrand Technology Zhejiang Co ltd filed Critical Microbrand Technology Zhejiang Co ltd
Priority to CN202410191391.3A priority Critical patent/CN117740186B/en
Publication of CN117740186A publication Critical patent/CN117740186A/en
Application granted granted Critical
Publication of CN117740186B publication Critical patent/CN117740186B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radiation Pyrometers (AREA)

Abstract

The application relates to the technical field of tunnel equipment monitoring and provides a tunnel equipment temperature detection method, apparatus, and computer device. The method comprises: fusing each same-frame tunnel point cloud map with its tunnel thermal map to obtain multi-frame tunnel fusion maps, and detecting the temperature of the equipment in the tunnel based on those fusion maps. During fusion, for the first-class region in each group, the temperature values of the first-class real point cloud are obtained based on the temperature values of the first-class thermal points of the region; for the second-class region in each group, point-cloud filling is performed based on the temperature-value distribution of its second-class thermal points to obtain a second-class mixed point cloud, and a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the same group; multi-scale mapping between the point cloud to be mapped and the second-class thermal points yields temperature calibration values for the second-class thermal points, from which the temperature values of the second-class mixed point cloud are obtained. With this method, the accuracy of the temperature detection result can be improved.

Description

Tunnel equipment temperature detection method, device, and computer equipment
Technical Field
The present application relates to the field of tunnel equipment monitoring technology, and in particular, to a tunnel equipment temperature detection method, apparatus, computer equipment, storage medium, and computer program product.
Background
Temperature detection of tunnel equipment helps ensure its safe operation, maintain the comfort of the tunnel's internal environment, and discover and prevent potential safety hazards in advance; it is therefore necessary.
Radar scanning and thermal imaging techniques can both be used to detect the temperature of tunnel equipment. However, the gaps between adjacent radar scan lines are radar blind areas, which reduce the accuracy of the temperature detection result.
Disclosure of Invention
Based on this, it is necessary to provide a tunnel equipment temperature detection method, apparatus, computer device, computer-readable storage medium, and computer program product that address the above technical problem.
In a first aspect, the present application provides a tunnel equipment temperature detection method, including:
acquiring multi-frame tunnel point cloud maps collected by a radar and multi-frame tunnel thermal maps collected by a thermal imager;
fusing each same-frame tunnel point cloud map with its tunnel thermal map to obtain multi-frame tunnel fusion maps;
obtaining a temperature detection result of the tunnel equipment based on the multi-frame tunnel fusion maps;
wherein, when a same-frame tunnel point cloud map and tunnel thermal map are fused: for the first-class region in each group, the temperature values of the first-class real point cloud of the region are obtained based on the temperature values of the first-class thermal points of the region; for the second-class region in each group, point-cloud filling is performed on the region based on the temperature-value distribution of its second-class thermal points to obtain a second-class mixed point cloud, and a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the first-class region in the same group; multi-scale mapping is performed between the point cloud to be mapped and the second-class thermal points to obtain temperature calibration values of the second-class thermal points, and the temperature values of the second-class mixed point cloud are obtained from those calibration values. A first-class region and a second-class region sharing a common radar wave form one group; a first-class region is a region where a radar wave and the thermal imaging field coincide, and a second-class region is a region between adjacent radar waves that is covered only by the thermal imaging field.
In one embodiment, obtaining the temperature values of the first-class real point cloud of the first-class region based on the temperature values of the first-class thermal points of the region includes:
performing a spatial coordinate registration operation on the same-frame tunnel point cloud map and tunnel thermal map to determine initial temperature values of the first-class real point cloud based on the temperature values of the first-class thermal points; and
calibrating the initial temperature values of the first-class real point cloud according to the difference between the distance value of each first-class thermal point and the distance value of the corresponding first-class real point cloud, to obtain the temperature values of the first-class real point cloud.
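To make the calibration step concrete, a minimal Python sketch is given below. The text only states that the distance difference feeds a correction algorithm; the linear correction model and the coefficient `k` used here are assumptions for illustration.

```python
# Hypothetical sketch of the distance-based temperature calibration: the initial
# temperature is offset by a term derived from the difference between the thermal
# point's range and the real point cloud's range. The linear form is assumed.
def calibrate_temperature(initial_temp, thermal_point_dist, point_cloud_dist, k=0.02):
    """Return the calibrated temperature for one first-class real point."""
    distance_diff = thermal_point_dist - point_cloud_dist
    offset = k * distance_diff  # assumed linear correction coefficient
    return initial_temp + offset
```

When the two distance values agree, the offset vanishes and the initial temperature is kept unchanged.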
In one embodiment, performing point-cloud filling on the second-class region based on the temperature-value distribution of its second-class thermal points to obtain the second-class mixed point cloud includes:
dividing the second-class region into a plurality of sub-regions based on the temperature-value distribution of the second-class thermal points;
determining the filling density of each sub-region according to the distance values of the second-class real point cloud;
filling each sub-region with point clouds at its filling density to obtain a second-class virtual point cloud; and
obtaining the second-class mixed point cloud from the second-class real point cloud and the second-class virtual point cloud.
In one embodiment, dividing the second-class region into a plurality of sub-regions based on the temperature-value distribution of its second-class thermal points includes:
dividing any rectangular range whose second-class thermal points have small, evenly distributed temperature variations into a gentle region to be filled;
dividing areas whose second-class thermal points show an abrupt temperature change in a single direction into gradient regions to be filled; and
dividing raised or recessed areas whose second-class thermal points show abrupt temperature changes in multiple directions into convex-concave regions to be filled.
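The three-way classification above can be sketched as follows; the flatness tolerance and the monotonicity test used as a proxy for "same-direction abrupt change" are assumptions, not details given in the text.

```python
# Illustrative sketch (not from the patent text) of classifying a sub-region of
# a second-class region by its thermal-point temperature distribution:
# roughly uniform values -> "gentle"; a monotonic change in one consistent
# direction -> "gradient"; changes in several directions -> "convex-concave".
def classify_subregion(temps, flat_tol=0.5):
    """temps: 2-D list of temperature values for one candidate sub-region."""
    flat = [t for row in temps for t in row]
    if max(flat) - min(flat) <= flat_tol:
        return "gentle"
    # per-row and per-column differences approximate the temperature gradient
    row_diffs = [temps[i + 1][j] - temps[i][j]
                 for i in range(len(temps) - 1) for j in range(len(temps[0]))]
    col_diffs = [temps[i][j + 1] - temps[i][j]
                 for i in range(len(temps)) for j in range(len(temps[0]) - 1)]
    def monotone(diffs):
        return all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)
    if monotone(row_diffs) and monotone(col_diffs):
        return "gradient"        # change in a single, consistent direction
    return "convex-concave"      # changes in multiple directions (bulge/recess)
```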
In one embodiment, performing multi-scale mapping between the point cloud to be mapped and the second-class thermal points to obtain the temperature calibration values of the second-class thermal points includes:
for each second-class thermal point, acquiring its initial mapping region, i.e., the region occupied by the second-class virtual point cloud generated from that thermal point during point-cloud filling;
dividing the initial mapping region at different scales to obtain the mapping region of each second-class thermal point at each scale;
mapping each second-class thermal point to the points of the point cloud to be mapped within its mapping regions at the different scales; and
for each second-class thermal point, calibrating its temperature value based on the distances and temperature values of the points mapped to it, to obtain its temperature calibration value.
In one embodiment, dividing the initial mapping region at different scales to obtain the mapping region of each second-class thermal point at each scale includes:
sliding windows of different scales along the four sides of the initial mapping region to obtain the mapping region at each scale, where each mapping region contains part of the second-class virtual point cloud of the initial mapping region and the real point clouds around the initial mapping region.
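The window generation can be sketched as below. The geometry conventions (square windows centred on the rectangle's edges, a fixed sliding step) are assumptions; the key property kept from the text is that each window straddles the boundary and so covers both virtual points inside the region and real points just outside it.

```python
# Rough sketch of multi-scale window generation: for each scale, a square window
# slides along the four sides of the initial mapping rectangle (x0..x1, y0..y1).
def perimeter_windows(x0, y0, x1, y1, scales, step=1.0):
    """Yield (cx, cy, half_width) window descriptors centred on the edges."""
    windows = []
    for scale in scales:
        half = scale / 2.0
        x = x0
        while x <= x1:                    # top and bottom edges
            windows.append((x, y0, half))
            windows.append((x, y1, half))
            x += step
        y = y0 + step
        while y < y1:                     # left and right edges (corners done above)
            windows.append((x0, y, half))
            windows.append((x1, y, half))
            y += step
    return windows
```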
In one embodiment, obtaining the temperature detection result of the tunnel equipment based on the multi-frame tunnel fusion maps includes:
acquiring the equipment temperature detection result of each frame's tunnel fusion map;
splicing the tunnel fusion maps of all frames to obtain a tunnel line map;
determining adjacent-frame tunnel fusion maps in the tunnel line map; and
fusing the equipment temperature detection results of the adjacent-frame tunnel fusion maps to obtain the temperature detection result of the tunnel equipment.
In one embodiment, acquiring the equipment temperature detection result of each frame's tunnel fusion map includes:
dividing each frame's tunnel fusion map into regions according to the echo intensity and temperature-value variation of its point cloud;
determining the equipment corresponding to each region;
locating the marking surfaces of the equipment to obtain the point sets and centroids of the equipment's different surfaces; and
obtaining the poles of the equipment from the point sets and centroids of its different surfaces and performing a temperature marking operation, to obtain the equipment temperature detection result of each frame's tunnel fusion map.
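The surface-marking step can be sketched as follows. Reading the "pole" as the surface point farthest from the centroid is an interpretation of the machine-translated text, not something the source states explicitly.

```python
# Hedged sketch of the surface-marking step: for one labelled device surface,
# compute the centroid of its point set and the "pole" (here taken to be the
# point farthest from the centroid), whose temperature can then be marked.
def surface_pole(points):
    """points: list of (x, y, z, temperature) tuples for one device surface."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    def dist2(p):
        return (p[0] - cx) ** 2 + (p[1] - cy) ** 2 + (p[2] - cz) ** 2
    pole = max(points, key=dist2)
    return (cx, cy, cz), pole
```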
In a second aspect, the present application further provides a tunnel equipment temperature detection apparatus, including:
an image acquisition module, configured to acquire multi-frame tunnel point cloud maps collected by a radar and multi-frame tunnel thermal maps collected by a thermal imager;
an image fusion module, configured to obtain multi-frame tunnel fusion maps by fusing each same-frame tunnel point cloud map with its tunnel thermal map; and
a temperature detection module, configured to obtain a temperature detection result of the tunnel equipment based on the multi-frame tunnel fusion maps;
wherein the image fusion module is further configured to, when fusing a same-frame tunnel point cloud map and tunnel thermal map: for the first-class region in each group, obtain the temperature values of the first-class real point cloud of the region based on the temperature values of the first-class thermal points of the region; for the second-class region in each group, perform point-cloud filling on the region based on the temperature-value distribution of its second-class thermal points to obtain a second-class mixed point cloud, and obtain a point cloud to be mapped from the second-class mixed point cloud and the first-class real point cloud of the first-class region in the same group; and perform multi-scale mapping between the point cloud to be mapped and the second-class thermal points to obtain temperature calibration values of the second-class thermal points, from which the temperature values of the second-class mixed point cloud are obtained. A first-class region and a second-class region sharing a common radar wave form one group; a first-class region is a region where a radar wave and the thermal imaging field coincide, and a second-class region is a region between adjacent radar waves that is covered only by the thermal imaging field.
In a third aspect, the present application further provides a computer device comprising a memory storing a computer program and a processor that executes the computer program to perform the method described above.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the method described above.
In a fifth aspect, the present application further provides a computer program product comprising a computer program which, when executed by a processor, performs the method described above.
In the above tunnel equipment temperature detection method, apparatus, computer device, storage medium, and computer program product, multi-frame tunnel point cloud maps collected by a radar and multi-frame tunnel thermal maps collected by a thermal imager are acquired; each same-frame tunnel point cloud map is fused with its tunnel thermal map to obtain multi-frame tunnel fusion maps; and the temperature of the equipment in the tunnel is detected based on those fusion maps. During fusion, for the first-class region in each group, the temperature values of the first-class real point cloud are obtained based on the temperature values of the first-class thermal points; for the second-class region in each group, point-cloud filling is performed based on the temperature-value distribution of the second-class thermal points to obtain a second-class mixed point cloud, and a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the same group; multi-scale mapping between the point cloud to be mapped and the second-class thermal points yields the temperature calibration values of the second-class thermal points, from which the temperature values of the second-class mixed point cloud are obtained. A first-class region and a second-class region sharing a common radar wave form one group; a first-class region is a region where a radar wave and the thermal imaging field coincide, and a second-class region is a region between adjacent radar waves that is covered only by the thermal imaging field.
By filling the second-class regions with point clouds and calibrating their temperatures through multi-scale mapping, this scheme enriches the point cloud data of the tunnel fusion map, reduces the influence of radar blind areas on the accuracy of the temperature detection result, and thereby improves that accuracy.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the related art, the drawings required for describing the embodiments or the related art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without inventive effort.
FIG. 1 is an application environment diagram of a tunnel device temperature detection method in one embodiment;
FIG. 2 is a flow chart of a method for detecting tunnel equipment temperature in one embodiment;
FIG. 3 is a schematic diagram of a radar and thermal imager mounting structure in one embodiment;
FIG. 4 is a schematic view of a region in one embodiment;
FIG. 5 is a flow chart of a method for detecting temperature of a tunnel device according to another embodiment;
FIG. 6 is a block diagram of a tunnel equipment temperature sensing device in one embodiment;
fig. 7 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
The embodiment of the application provides a tunnel equipment temperature detection method that can be executed by a computer device. As shown in fig. 1, the computer device acquires multi-frame tunnel point cloud maps collected by a radar and multi-frame tunnel thermal maps collected by a thermal imager, fuses each same-frame tunnel point cloud map with its tunnel thermal map to obtain multi-frame tunnel fusion maps, and detects the temperature of the equipment in the tunnel based on those fusion maps. The fusion processing treats the first-class and second-class regions in each group as set out in the first aspect: temperature values of the first-class real point cloud are obtained from the first-class thermal points; the second-class regions are filled with point clouds according to the temperature-value distribution of their second-class thermal points to form second-class mixed point clouds; and multi-scale mapping between the resulting point cloud to be mapped and the second-class thermal points yields the temperature calibration values from which the second-class mixed point cloud's temperature values are obtained.
It will be appreciated that the computer device may be implemented by a server, a terminal, or an interactive system between the terminal and the server. In this embodiment, the method includes the steps shown in fig. 2:
Step S201: acquire multi-frame tunnel point cloud maps collected by a radar and multi-frame tunnel thermal maps collected by a thermal imager.
In the application, the radar and the thermal imager can be mounted 5 cm apart vertically with the principal sight lines of their fields of view aligned. This maximizes the coincidence of the two sensors' detection key points, keeps the non-overlapping area small, reduces the invalid area to a minimum, and speeds up and improves the later fusion; the specific installation structure is shown in fig. 3. The radar may be a 3D lidar, and the thermal imager may be an uncooled focal-plane-array thermal imager.
With the radar and thermal imager installed in this way, multi-frame tunnel point cloud maps and multi-frame tunnel thermal maps can be obtained.
Step S202: obtain multi-frame tunnel fusion maps by fusing each same-frame tunnel point cloud map with its tunnel thermal map. During fusion, for the first-class region in each group, the temperature values of the first-class real point cloud of the region are obtained based on the temperature values of its first-class thermal points. For the second-class region in each group, point-cloud filling is performed on the region based on the temperature-value distribution of its second-class thermal points to obtain a second-class mixed point cloud, and a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the same group. Multi-scale mapping is then performed between the point cloud to be mapped and the second-class thermal points to obtain the temperature calibration values of the second-class thermal points, and the temperature values of the second-class mixed point cloud are obtained from those calibration values. A first-class region and a second-class region sharing a common radar wave form one group; a first-class region is a region where a radar wave and the thermal imaging field coincide, and a second-class region is a region between adjacent radar waves that is covered only by the thermal imaging field.
As shown in fig. 4, a region where a radar wave and the thermal imaging field coincide is referred to as a first-class region, a region between adjacent radar waves that contains only the thermal imaging field is referred to as a second-class region, and a first-class region and a second-class region sharing a common radar wave are regarded as one group.
For distinction, the real point cloud in a first-class region is called the first-class real point cloud and the thermal points of a first-class region the first-class thermal points; likewise, the real point cloud in a second-class region is called the second-class real point cloud and its thermal points the second-class thermal points.
Extrinsic calibration is performed on the radar and the thermal imager, and a spatial coordinate registration operation is performed according to the coinciding principal views and reference points of the two sensors, so that the first-class real point cloud of a first-class region essentially corresponds to the first-class thermal points of that region. For the first-class region in each group, the temperature value and radiation value of each first-class thermal point are added to the parameter values of the four-dimensional coordinates of the corresponding first-class real point cloud, thereby obtaining the temperature values of the first-class real point cloud; the four dimensions are the spatial x, y, and z coordinates and the echo intensity.
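A minimal sketch of attaching thermal values to the registered point cloud is given below. The nearest-neighbour association in the registered image plane is an assumption; the text only states that, after registration, first-class real points "essentially correspond" to first-class thermal points.

```python
# Sketch: extend each registered point's (x, y, z, echo_intensity) record with
# the temperature and radiation values of the nearest first-class thermal point.
def fuse_class_one(points, thermal_points):
    """points: [(x, y, z, echo)]; thermal_points: [(u, v, temp, radiation)],
    where after registration x~u and y~v share one image-plane frame."""
    fused = []
    for x, y, z, echo in points:
        # nearest thermal point in the registered image plane (assumed rule)
        u, v, temp, rad = min(thermal_points,
                              key=lambda t: (t[0] - x) ** 2 + (t[1] - y) ** 2)
        fused.append((x, y, z, echo, temp, rad))
    return fused
```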
For the second-class region in each group, the region is divided into gentle, gradient, and convex-concave areas based on the temperature-value distribution of its second-class thermal points; the filling density of each area is determined according to the distance values of the second-class real point cloud, and the region is filled with point clouds at those densities to obtain the second-class mixed point cloud. A point cloud to be mapped is then obtained from the second-class mixed point cloud and the first-class real point cloud of the same group. Multi-scale mapping between the point cloud to be mapped and the second-class thermal points yields the temperature calibration values of the second-class thermal points, and the second-class thermal points are mapped to the second-class mixed point cloud, so that the temperature values of the second-class mixed point cloud are obtained from those calibration values.
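The distance-dependent filling density can be sketched as below. The text only says density is determined from the second-class real point cloud's distance values; the inverse-proportional form, base density, and floor are all assumptions for illustration.

```python
# Hypothetical filling-density rule: real points at larger ranges are sparser,
# so here the virtual-point density tapers off with distance down to a floor.
def fill_density(distance, base_density=100.0, min_density=4.0):
    """Virtual points per unit area for a sub-region whose class-two real
    points lie at `distance` metres (all constants are assumed)."""
    return max(base_density / max(distance, 1.0), min_density)
```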
Step S203: obtain a temperature detection result of the tunnel equipment based on the multi-frame tunnel fusion maps.
Each frame's tunnel fusion map is divided into regions according to the echo intensity and temperature-value variation of its point cloud, and the equipment corresponding to each region is determined. The marking surfaces of the equipment are located to obtain the point sets and centroids of its different surfaces; the poles of the equipment are obtained from these point sets and centroids, and a temperature marking operation is performed to obtain the equipment temperature detection result of each frame's tunnel fusion map. The tunnel fusion maps of all frames are then spliced into a tunnel line map, adjacent-frame tunnel fusion maps are determined in the line map, and their equipment temperature detection results are fused to obtain the temperature detection result of the tunnel equipment.
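The final fusion across adjacent frames can be sketched as follows. The text does not specify the fusion rule for readings of the same device seen in adjacent frames; the mean used here is an assumption, as are the device-id keys.

```python
# Illustrative sketch of fusing per-frame equipment temperature results across
# adjacent frames of the stitched tunnel line map: readings sharing a device id
# are merged (here by arithmetic mean).
def fuse_adjacent(frame_results):
    """frame_results: list of dicts {device_id: temperature}, one per frame."""
    merged = {}
    for frame in frame_results:
        for dev, temp in frame.items():
            merged.setdefault(dev, []).append(temp)
    return {dev: sum(ts) / len(ts) for dev, ts in merged.items()}
```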
In this tunnel equipment temperature detection method, when a same-frame tunnel point cloud map and tunnel thermal map are fused, the second-class region in each group is filled with point clouds based on the temperature-value distribution of its second-class thermal points to obtain a second-class mixed point cloud, a point cloud to be mapped is obtained from the second-class mixed point cloud and the first-class real point cloud of the same group, and multi-scale mapping between the point cloud to be mapped and the second-class thermal points yields the temperature calibration values of the second-class thermal points, from which the temperature values of the second-class mixed point cloud are obtained. This enriches the point cloud data of the tunnel fusion map, reduces the influence of radar blind areas on the accuracy of the temperature detection result, and improves that accuracy. By fusing the radar and the thermal imager, features of tunnel equipment that are weak in a single sensor are amplified, and the superposition of multiple sensors produces third-modality features, enabling multi-angle recognition, overcoming the limitations imposed by obstacles on temperature detection and by radar scan gaps, and achieving good results in actual use.
The tunnel point cloud map belongs to one modality and the tunnel thermal map to another; fusing them produces a third, composite modality. The tunnel fusion map therefore carries the features of all three modalities: target features become more distinct, the deficiencies of each sensor are mutually compensated, the computational complexity borne by a single sensor is reduced, and the accuracy of the temperature detection result is improved.
In one embodiment, based on the temperature values of a class of thermodynamic points of a class of regions, the temperature values of a class of real point clouds of the class of regions are obtained, and the specific steps are as follows: carrying out space coordinate registration operation on the tunnel point cloud map and the tunnel thermodynamic diagram in the same frame to determine initial temperature values of a class of real point clouds of a class of areas based on temperature values of a class of thermodynamic points of the class of areas; and calibrating the initial temperature value of the real point cloud according to the difference between the distance value corresponding to the thermal point and the distance value corresponding to the real point cloud to obtain the temperature value of the real point cloud.
Performing external parameter calibration on the radar and the thermal imager, performing spatial coordinate registration operation according to the superposition characteristics of the main view and the reference point of the radar and the thermal imager, enabling a class of real point clouds of a class of areas to basically correspond to a class of heating points of a class of areas, and adding the temperature value and the radiation value of the class of heating points to the parameter values of four-dimensional coordinates of the class of real point clouds, so as to obtain the initial temperature value of the class of real point clouds of the class of areas. Obtaining a distance difference value according to the difference between the distance value corresponding to the type of thermal point and the distance value corresponding to the type of real point cloud, substituting the distance difference value into a temperature correction algorithm to obtain a corrected offset value, and calibrating the initial temperature value of the type of real point cloud according to the offset value to obtain a more accurate temperature value of the type of real point cloud.
In this embodiment, based on the temperature values of the first type of thermal points of the first type of regions, an initial temperature value of the first type of real point clouds of the first type of regions is obtained, and according to the difference between the distance value corresponding to the first type of thermal points and the distance value corresponding to the first type of real point clouds, the initial temperature value of the first type of real point clouds is calibrated, so as to obtain a more accurate temperature value of the first type of real point clouds.
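As an illustration only, the distance-based calibration described above can be sketched as follows; the linear correction model, the coefficient `k`, and the function name are assumptions of this sketch and are not specified by the embodiment:

```python
def calibrate_temperature(initial_temp, thermal_dist, cloud_dist, k=0.02):
    """Calibrate the initial temperature value of a class-one real point cloud
    using the difference between the distance value of the thermal point and
    the distance value of the real point cloud (illustrative linear model)."""
    distance_diff = cloud_dist - thermal_dist      # distance difference value
    offset = k * distance_diff                     # corrected offset value
    return initial_temp - offset                   # calibrated temperature
```

A real temperature correction algorithm would be derived from the radiometric model of the thermal imager; the linear form above only illustrates the data flow from distance difference to offset to calibrated value.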
In one embodiment, based on the temperature value distribution condition of the two-class thermodynamic points of the two-class area, the two-class area is filled with point clouds to obtain two-class mixing point clouds, and the specific steps are as follows: dividing the second-class area into a plurality of subareas based on the temperature value distribution condition of the second-class thermodynamic points of the second-class area; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud; according to the filling density of each subarea, respectively filling the subareas with point cloud to obtain second-class virtual point cloud; and obtaining a second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud.
And carrying out gradient identification on the second-class area through an edge detection image algorithm to obtain the temperature value distribution condition of the second-class thermal points of the second-class area. Dividing the two-class area into three sub-areas of a gentle area, a convex-concave area and a gradient area based on the temperature value distribution condition of the two-class thermodynamic points of the two-class area; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud, wherein the filling densities of the subareas are different from each other according to the distance value; and selecting a corresponding filling algorithm to fill the sub-areas with point clouds respectively according to the filling density of the sub-areas, wherein the filling method can be a parity scan conversion algorithm based on a bucket model and mainly fills in different modes according to the local characteristics of temperature value distribution. The second-class virtual point cloud can be obtained by performing continuous non-repeated region mapping according to the second-class virtual point cloud quantity and the second-class thermal point and then performing proportional filling; and obtaining a second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud.
If the irradiation target is a television, the screen in the middle of the television heats uniformly, and the edge detection image algorithm is used for identifying that the middle area of the television belongs to a gentle area, the edge and background area of the television belongs to a gradient area with abrupt temperature change, and the power indicator area of the television belongs to a convex-concave area with concave temperature in the central area. In the filling process, firstly, demarcating is conducted according to thermodynamic diagram characteristics of the two kinds of areas (the demarcation of the gentle area is a rectangle, the demarcation of the gradient area is a line segment, the demarcation of the convex-concave area is a closed curve), the filling density of each subarea is determined according to the distance value corresponding to the two kinds of real point clouds, and then the filling position is determined according to an algorithm.
The gentle area is defined according to the overall temperature value, and the plane is uniformly filled by an algorithm; the gradient region is defined according to its boundary, and a pole linear filling algorithm is used so that the filled second-class virtual point clouds adhere more closely to the boundary line, and the filled positions of the second-class virtual point clouds better represent the sharpness of the object; the convex-concave area is defined by the limit of the closed curve, and a multi-stage gradient topography filling algorithm is used so that the edge of the irregular limit forms rings of contour lines similar to a topographic map, with different spacings between the contour lines and correspondingly different filling densities.
The gentle region filling mainly processes regions of the second-class area where the temperature value varies little, filling at uniform spacing within a rectangular region. The implementation steps are as follows: ① Areas that are continuous and show small changes in temperature value (or gray value) are identified from the thermal imager data; these areas can be regarded as gentle areas. ② In the blank area between radar scanning lines, a second-class virtual point cloud is generated by interpolating the real point cloud data on adjacent radar scanning lines and the thermal point data of the thermal imager. ③ Linear interpolation, bilinear interpolation, or other suitable interpolation methods may be employed to estimate the second-class virtual point cloud data for the blank region.
Assume a two-dimensional image I(x, y), where x and y are pixel coordinates and I(x, y) is the corresponding temperature value. Bilinear interpolation may be used to estimate the second-class virtual point cloud values in the empty regions between radar scan lines. The bilinear interpolation formula is as follows:
I(x', y') = (1-α)(1-β)·I(x, y) + α(1-β)·I(x+1, y) + (1-α)β·I(x, y+1) + αβ·I(x+1, y+1)
Where (x', y') is the point to be interpolated, (x, y), (x+1, y), (x, y+1) and (x+1, y+1) are the four known pixel points around it, and α = x' - x and β = y' - y are the interpolation parameters.
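The bilinear formula above can be implemented in a few lines; the grid layout `I[y][x]` is an assumption of this sketch, and the function is valid for points strictly inside the grid:

```python
def bilinear(I, xp, yp):
    """Bilinear interpolation of the temperature grid I (indexed I[y][x])
    at the fractional position (xp, yp) between radar scan lines."""
    x, y = int(xp), int(yp)        # lower-left known pixel (x, y)
    a, b = xp - x, yp - y          # interpolation parameters α, β
    return ((1 - a) * (1 - b) * I[y][x] + a * (1 - b) * I[y][x + 1]
            + (1 - a) * b * I[y + 1][x] + a * b * I[y + 1][x + 1])
```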
The convex-concave area filling is used for processing the area with obvious fluctuation of the temperature value in the closed-loop part of the second-class area, and is simply understood as filling of the inner-and-outer ring change area, that is, tight filling at the boundary line. The implementation steps are as follows: ① Edges or feature points with large changes in temperature value (or gray value) are identified from the thermal imager data; these may be convex or concave boundaries. ② More complex interpolation methods, such as spline interpolation or radial basis function interpolation, are used to better capture and model the shape of the convex-concave region. ③ Second-class virtual point clouds are generated in the blank area between radar scanning lines according to these interpolation methods, ensuring that the second-class virtual point clouds accurately reflect the fluctuation of the surface. Spline interpolation or radial basis function interpolation may be used for convex-concave regions to better capture and simulate the surface shape. For spline interpolation, a suitable spline function (e.g., cubic spline, B-spline) can be selected and the following objective function is minimized:
E(s) = ∑i wi·(s(xi) - I(xi))²
Where s(x) is the spline function, I(xi) is the input temperature value at data point xi, xi is the coordinate of the data point, and wi is the weight.
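As a minimal sketch of minimizing E(s), the closed-form weighted least-squares solution is shown below for a degree-one s(x); a full spline fit would simply use more basis functions (e.g., cubic B-splines). The function name and the linear form are assumptions of this sketch:

```python
def weighted_fit(xs, temps, ws):
    """Minimize E(s) = Σ w_i · (s(x_i) - I(x_i))² for s(x) = c0 + c1·x
    (weighted least squares; stands in for the spline fit of the text)."""
    W = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / W        # weighted mean of x
    mt = sum(w * t for w, t in zip(ws, temps)) / W     # weighted mean of I
    c1 = (sum(w * (x - mx) * (t - mt) for w, x, t in zip(ws, xs, temps))
          / sum(w * (x - mx) ** 2 for w, x in zip(ws, xs)))
    c0 = mt - c1 * mx
    return c0, c1
```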
The gradient region filling is a filling method based on gradient information of a non-closed-loop linear region, and is simply understood as filling of left-and-right change regions. The implementation steps are as follows: ① Gradient information is calculated for each point in the thermal imager data, which may be accomplished by comparing temperature (or gray-scale) differences of adjacent pixels. ② In the blank area between radar scanning lines, the gradient distribution of the blank area is deduced from the known radar real point cloud data and the gradient information of the thermal imager. ③ A second-class virtual point cloud conforming to the gradient characteristics is generated according to the gradient distribution and a possible surface model (such as a plane, a cone, an ellipsoid, and the like). ④ An iterative optimization process may be required to ensure that the generated second-class virtual point cloud matches the slope characteristics of the actual surface.
Assuming that gradient information G(x, y) for each point in the thermal imager data has been calculated, this can be achieved by the Sobel operator, the Prewitt operator, or another edge detection operator.
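A minimal Sobel sketch on a temperature image (pure Python, interior pixels only; the grid layout `I[y][x]` is an assumption of this sketch):

```python
def sobel_gradient(I, x, y):
    """Sobel gradient (Gx, Gy) of the temperature image I at interior pixel (x, y)."""
    gx = (I[y - 1][x + 1] + 2 * I[y][x + 1] + I[y + 1][x + 1]
          - I[y - 1][x - 1] - 2 * I[y][x - 1] - I[y + 1][x - 1])
    gy = (I[y + 1][x - 1] + 2 * I[y + 1][x] + I[y + 1][x + 1]
          - I[y - 1][x - 1] - 2 * I[y - 1][x] - I[y - 1][x + 1])
    return gx, gy
```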
The gradient distribution of the blank area can be deduced according to the known radar real point cloud data and gradient information of the thermal imager. This can be achieved by constructing an optimization problem, for example:
min_s ∑i wi·(Gs(xi, yi) - G(xi, yi))² + λ·‖∇s‖²,  s.t.  zradar(xj, yj) = s(xj, yj) for (xj, yj) ∈ radar points
Where s(x, y) is the surface height function to be estimated, zradar(x, y) is the height data of the radar real point cloud, Gs(x, y) is the gradient calculated from s(x, y), G(x, y) is the gradient data of the thermal imager, wi is an optional weight used to emphasize the importance of certain points, λ is the regularization parameter, ∇s is the gradient of s(x, y), and radar points is the coordinate set of the radar real point cloud.
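A one-dimensional toy version of this constrained problem can be solved by projected gradient descent: heights at radar points are pinned as hard constraints, and the remaining heights move down the gradient of the objective. All names and parameter values below are illustrative assumptions, not part of the embodiment:

```python
def fit_slope_profile(g, anchors, lam=0.01, lr=0.1, iters=2000):
    """Estimate heights s minimizing Σ (Δs_i - g_i)² + λ·Σ (Δs_i)²
    subject to s[j] = z_j at the radar anchor indices (1-D sketch of the
    surface estimation problem; Δs_i = s[i+1] - s[i] plays the role of Gs)."""
    n = len(g) + 1
    s = [0.0] * n
    for j, z in anchors.items():              # impose the radar constraints
        s[j] = z
    for _ in range(iters):
        grad = [0.0] * n
        for i, gi in enumerate(g):
            d = s[i + 1] - s[i]
            r = 2 * (d - gi) + 2 * lam * d    # ∂objective/∂Δs_i
            grad[i + 1] += r
            grad[i] -= r
        for j in range(n):
            if j not in anchors:              # radar heights stay fixed
                s[j] -= lr * grad[j]
    return s
```

With λ small, each free slope converges to g/(1+λ), i.e. the thermal-imager gradient lightly shrunk by the regularizer, while anchored heights match the radar data exactly.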
In this embodiment, based on the temperature value distribution condition of the two-class thermodynamic points in the two-class region, the two-class region is subjected to point cloud filling to obtain two-class mixing point cloud.
In one embodiment, based on the temperature value distribution condition of the two-class thermodynamic points of the two-class region, the two-class region is divided into a plurality of sub-regions, and the specific steps are as follows: dividing a second-class area where second-class heating points with small temperature value change and uniform distribution in any rectangular range are located into a gentle area to be filled; dividing a second-class area where the second-class thermal points with temperature value distribution conforming to the same-direction abrupt change condition are located into gradient areas to be filled; dividing a second-class area of the bulge or the recess where the second-class heating power points with the temperature value distribution conforming to the multi-direction mutation situation are located into convex-concave areas to be filled.
Gradient identification is carried out on the two-class area through an edge detection image algorithm, so that the temperature value distribution condition of the two-class thermodynamic points of the two-class area is obtained, and the two-class area where the two-class thermodynamic points with small temperature value change and uniform distribution in any rectangular range are located is divided into gentle areas to be filled; dividing a second-class area where the second-class thermal points with temperature value distribution conforming to the same-direction abrupt change condition are located into gradient areas to be filled; dividing a second-class area of the bulge or the recess where the second-class heating power points with the temperature value distribution conforming to the multi-direction mutation situation are located into convex-concave areas to be filled.
In this embodiment, the second-class region is divided into a plurality of sub-regions according to the temperature value distribution condition of the second-class thermal points based on the second-class region.
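The three-way division can be illustrated with a toy classifier over a one-dimensional strip of second-class thermal point temperatures; the 1 °C threshold and the function name are arbitrary assumptions of this sketch:

```python
def classify_subregion(temps):
    """'gentle': small uniform variation; 'gradient': same-direction change;
    'convex-concave': an interior bump or dip (multi-direction change)."""
    diffs = [b - a for a, b in zip(temps, temps[1:])]
    if max(temps) - min(temps) < 1.0:         # illustrative threshold
        return "gentle"
    if all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs):
        return "gradient"
    return "convex-concave"
```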
In one embodiment, the multi-scale mapping is performed on the point cloud to be mapped and the second class of thermodynamic points to obtain the temperature calibration value of the second class of thermodynamic points, and the specific steps are as follows: for each two-class thermodynamic point, acquiring an initial mapping area of the two-class thermodynamic point; the initial mapping area of the two-class thermodynamic point is an area where the corresponding two-class virtual point cloud is located based on each two-class thermodynamic point when the two-class area is filled with point clouds; according to different scales, carrying out region division on the initial mapping region to obtain the mapping region of each two classes of thermodynamic points under each scale; mapping each two types of thermodynamic points and point clouds to be mapped in mapping areas under different corresponding scales; and aiming at each two-class thermodynamic point, calibrating the temperature value of the two-class thermodynamic point based on the distance and the temperature value corresponding to the point cloud with the mapping of the two-class thermodynamic point, and obtaining the temperature calibration value of the two-class thermodynamic point.
For each second-class thermal point, an initial mapping area of that point is acquired; the initial mapping area is the area where the corresponding second-class virtual point cloud generated from that thermal point is located when the second-class area is filled with point clouds. For each scale, a window corresponding to that scale is obtained: different window sizes are designed according to a Random Sample Consensus (RANSAC) algorithm, and the sliding range of each window is defined so that the window slides along the region boundary while its central area is discarded. The point cloud is divided by an octree-based multi-scale algorithm and organized into detail areas of different levels, and the mapping area under each scale is obtained as the window slides over the point cloud to be mapped. Each second-class thermal point is then mapped to the point clouds in its mapping areas under the different scales, and an average value is taken; for each second-class thermal point, reverse projection (reverse mapping) is carried out through the hierarchical information of the octree based on the distances and temperature values corresponding to the point clouds mapped to that thermal point, and the temperature value of the thermal point is calibrated to obtain its temperature calibration value.
For example, 9 second-class virtual point clouds in a partial area are screened out of 16 second-class virtual point clouds, 7 surrounding second-class real point clouds are added, and a range of 16 second-class point clouds is recombined; the temperature values of 4 second-class thermal points are then calibrated based on the distances and temperature values corresponding to the 16 second-class point clouds that have a mapping relation with those 4 thermal points. The temperature values of the 4 second-class thermal points obtained in this way are more accurate than the original temperature values.
For example, suppose the ratio of thermal points in a tunnel thermodynamic diagram to real point clouds in a tunnel point cloud map is 500:300, and 100 class-one real point clouds correspond one-to-one with class-one thermal points; these hundred real point clouds and temperature values need no filling or multi-scale mapping. The ratio of second-class thermal points to second-class real point clouds is therefore 400:200. After 1400 second-class virtual point clouds are filled in, the ratio of second-class thermal points to second-class mixed point clouds becomes 400:1600. At this point, relying on the 300 real point clouds (of which 100 are the class-one points already matched), the 300 real point cloud data are fused with the 1400 second-class virtual point clouds, region division and multi-scale mapping are performed to obtain 400 more accurate temperature values, and a tunnel fusion map of 1700 point clouds with temperature data, which is a 6D data map, is finally obtained.
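The bookkeeping of this example can be checked directly (numbers taken from the paragraph above; variable names are illustrative):

```python
thermal_pts, real_pts = 500, 300     # thermodynamic diagram : point cloud map
matched = 100                        # class-one pairs, no filling or mapping needed
c2_thermal = thermal_pts - matched   # 400 second-class thermal points
c2_real = real_pts - matched         # 200 second-class real point clouds
c2_virtual = 1400                    # filled second-class virtual point clouds
c2_mixed = c2_real + c2_virtual      # 1600 second-class mixed point clouds
fusion_total = c2_mixed + matched    # 1700 points in the final 6D fusion map
```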
In this embodiment, the point cloud to be mapped and the second class of thermal points are mapped in a multi-scale manner, so as to obtain the temperature calibration value of the second class of thermal points.
In one embodiment, the initial mapping area is divided into areas according to different scales to obtain the mapping area of each two types of thermodynamic points under each scale, and the specific steps are as follows: sliding along four sides of the initial mapping area based on windows with different scales to obtain mapping areas with different scales; the mapping region comprises a part of two-class virtual point clouds of the initial mapping region and real point clouds around the initial mapping region.
For each initial mapping area, rectangular windows of different scales can be designed according to a Random Sample Consensus algorithm. Sliding is carried out along the four sides of the initial mapping area with these rectangular windows, the sliding range of each window being limited to the region boundary while the central area is discarded; during sliding the window can cover partial real point cloud information, and the point cloud is divided according to the octree multi-scale algorithm to obtain the mapping area under each scale. The mapping region comprises part of the second-class virtual point clouds of the initial mapping region and the real point clouds around the initial mapping region.
In this embodiment, according to different scales, the point cloud to be mapped is divided into regions, so as to obtain the mapping region under each scale.
In one embodiment, based on a multi-frame tunnel fusion map, a temperature detection result of the tunnel device is obtained, and the specific steps are as follows: acquiring a device temperature detection result of each frame of tunnel fusion map; splicing each frame of tunnel fusion map to obtain a tunnel line map; determining a tunnel fusion map of adjacent frames in the tunnel line map; and fusing the device temperature detection results of the tunnel fusion graphs of the adjacent frames to obtain the temperature detection results of the tunnel devices.
Acquiring a device temperature detection result of each frame of tunnel fusion map; acquiring positioning information of each frame of tunnel fusion map by combining the speed of the network inspection vehicle by means of inertial navigation positioning and beacon positioning technology, correcting and calibrating a data source, registering data after the three-dimensional coordinates of the tunnel fusion map are in the same coordinate system and are in spatial consistency, and splicing each frame of tunnel fusion map to obtain a tunnel line map; and carrying out data merging according to a time sequence and a maximum likelihood estimation matching algorithm, removing noise and an overlapping area of the tunnel line graph, fitting and reducing distortion to achieve the accuracy of data splicing, respectively matching temperature data of three continuous tunnel fusion graphs before, during and after the splicing of the tunnel line graph, and carrying out averaging and unification on point clouds of the point cloud overlapping area. And directly matching the non-overlapping part according to the original tunnel fusion map data to obtain a temperature detection result of the tunnel equipment.
In this embodiment, a temperature detection result of the tunnel device is obtained according to the multi-frame tunnel fusion map.
In one embodiment, the device temperature detection result of each frame of tunnel fusion map is obtained, and the specific steps are as follows: dividing the areas of each frame of tunnel fusion map according to the echo intensity of the point cloud in each frame of tunnel fusion map and the change condition of the temperature value, and obtaining each area; determining equipment corresponding to each area; positioning the mark surface of the equipment to obtain point sets and particle points of different surfaces of the equipment; and obtaining poles of the equipment according to the point sets and the mass points of different surfaces of the equipment, and performing temperature marking operation to obtain an equipment temperature detection result of each frame of tunnel fusion map.
According to the echo intensity of the point cloud in each frame of tunnel fusion map and the change condition of the temperature value, contour detection of a temperature field, mutation detection of point cloud data, filtering of the point cloud and clustering algorithm processing can be combined, and finally region division is carried out on each frame of tunnel fusion map according to a Boundary Sensitive Network (BSN) algorithm to obtain each region. The divided regions may have category overlapping areas. The equipment corresponding to each region is determined, and targeted detection and temperature measurement are performed on the tunnel equipment in each region. For example, to detect cables, a cylinder detection algorithm is adopted, the data are filtered and normalized and input into a voxel network (VoxelNet) model for detection and identification, and the cables are finally matched with corresponding temperature values according to the result. Marking surface positioning is carried out on the cable to obtain point sets and mass points of its different surfaces, the poles of the cable (which can be regarded as a cuboid) are then solved from those point sets and mass points, and temperature and characteristic marking operations are carried out on the cable to obtain a cable temperature detection result for each frame of tunnel fusion map.

In this embodiment, region division is carried out on each frame of tunnel fusion map according to the echo intensity and temperature value change condition of its point cloud to obtain each region; the mark surface of the equipment is positioned to obtain the pole of the equipment, and the temperature marking operation is performed to obtain the equipment temperature detection result of each frame of tunnel fusion map.
The marking operation records the point set and the particle information of the target equipment, so that the size, the volume, the reflectivity and the temperature information of the target equipment can be conveniently obtained in use, and the computational waste of repeated detection and repeated identification is reduced.
In one embodiment, a plurality of images to be spliced are screened from a multi-frame tunnel fusion image based on the equipment acquisition position corresponding to the tunnel fusion image; the tunnel fusion map is obtained by carrying out fusion processing on the tunnel point cloud map and the tunnel thermodynamic diagram of the same frame. And matching the characteristic points of the tunnel planar targets in the images to be spliced, and matching the characteristic points of the tunnel linear targets in the images to be spliced to obtain a transformation matrix between the images to be spliced. And based on the transformation matrix, splicing the plurality of images to be spliced to obtain a spliced image. And fusing the point cloud overlapping areas in the spliced graph to obtain a tunnel line graph. And obtaining a tunnel equipment temperature detection result based on the tunnel line graph.
And obtaining the equipment acquisition position corresponding to the tunnel fusion map according to the inertial navigation positioning technology and the beacon positioning technology and combining the speed of the network inspection vehicle. And determining screening intervals according to actual conditions, and screening a plurality of images to be spliced in the multi-frame tunnel fusion map based on equipment acquisition positions and the screening intervals corresponding to the tunnel fusion map.
The tunnel planar target can be a tunnel wall surface target, and the tunnel linear target can be a plate surface of a ballastless track, a civil air defense channel and a double-line rail target. Matching the same tunnel planar target feature points and tunnel linear target feature points in different pictures to be spliced, for example, finding out the corresponding relation between a large ground plane and double-line rails shared by the two pictures to be spliced, and calculating a transformation matrix by detecting and matching the directions and positions of the large ground plane and the double-line rails to realize the preliminary alignment between frame data. And mapping each frame of the images to be spliced into a global coordinate system by applying a transformation matrix, so as to ensure the correct alignment among the images to be spliced. Illustratively, according to the splice graph, temperature data of three continuous frames of the splice graph before, during and after are respectively matched, and point clouds in the overlapping area of the point clouds are averaged and unified to obtain a tunnel line graph.
In the tunnel equipment temperature detection method, characteristic points of tunnel planar targets in all the pictures to be spliced are matched, characteristic points of tunnel linear targets in all the pictures to be spliced are matched, and a transformation matrix among the pictures to be spliced is obtained; based on the transformation matrix, a plurality of images to be spliced are spliced to obtain a spliced image, and the point cloud overlapping areas in the spliced image are fused to obtain a tunnel line image, so that the splicing accuracy is high and the time delay is low.
In one embodiment, before a plurality of images to be spliced are screened out from the multi-frame tunnel fusion image based on the device acquisition position corresponding to the tunnel fusion image, the method provided by the application further comprises the following steps: determining a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology; and correcting and optimizing the initial equipment acquisition position corresponding to the tunnel fusion map according to the beacon positioning technology and combining the speed of the network inspection vehicle to obtain the equipment acquisition position corresponding to the tunnel fusion map.
The inertial navigation positioning technology is realized by an Inertial Navigation System (INS), an autonomous navigation technology that calculates its own position, velocity and attitude by measuring the acceleration and angular velocity of the equipment. The inertial navigation system comprises three gyroscopes and three accelerometers for measuring the angular velocities and linear accelerations of the device along three axes, respectively. The position, velocity and attitude information of the device can be continuously updated using Newton's laws of motion and the attitude transformation represented by Euler angles (or quaternions). However, due to sensor noise and drift, the measurement errors of inertial navigation systems accumulate over time. Therefore, correction and optimization using beacon positioning techniques is required.
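A single one-axis dead-reckoning update illustrates why INS errors accumulate: every step integrates the (noisy) measured acceleration, so a constant bias in `accel` grows quadratically in position over time. The function is a sketch, not part of this embodiment:

```python
def ins_step(pos, vel, accel, dt):
    """One inertial dead-reckoning update along a single axis: integrate the
    measured acceleration into velocity, then velocity into position."""
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel
```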
Beacon positioning technology generally relies on preset landmarks to determine the location of equipment, and currently, embedded positioning beacon devices are installed in subway tunnel lines. When the train positioning sensing device approaches the beacon, the signal of the beacon can be received and analyzed, so that accurate position information is obtained.
Determining a preliminary equipment acquisition position corresponding to the tunnel fusion map according to an inertial navigation positioning technology; and correcting and optimizing the initial equipment acquisition position corresponding to the tunnel fusion map according to the beacon positioning technology and combining the speed of the network inspection vehicle to obtain the equipment acquisition position corresponding to the tunnel fusion map.
In this embodiment, according to the inertial navigation positioning technology and the beacon positioning technology and combining the speed of the network inspection vehicle, a more accurate device acquisition position corresponding to the tunnel fusion map is obtained.
In one embodiment, based on the device acquisition position corresponding to the tunnel fusion map, a plurality of maps to be spliced are screened from the multi-frame tunnel fusion map, and the specific steps are as follows: obtaining a screening interval; and screening a plurality of images to be spliced from the multi-frame tunnel fusion images based on the equipment acquisition positions and the screening intervals corresponding to the tunnel fusion images.
For example, a screening interval of 5 meters is obtained, and 200 images to be spliced are screened out every 5 meters in the 500-frame tunnel fusion map based on the equipment acquisition position and the screening interval corresponding to the tunnel fusion map.
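The interval screening of this example can be sketched as follows; per-frame chainage positions in metres are assumed as input, and the function name is illustrative:

```python
def screen_frames(positions, interval=5.0):
    """Keep the first frame, then every frame lying at least `interval` metres
    beyond the previously kept one; returns the kept frame indices."""
    kept, last = [], None
    for i, p in enumerate(positions):
        if last is None or p - last >= interval:
            kept.append(i)
            last = p
    return kept
```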
In this embodiment, based on the device acquisition position and the screening interval corresponding to the tunnel fusion map, a plurality of to-be-spliced maps are screened out from the multi-frame tunnel fusion map, and representative to-be-spliced maps are screened out, so that the calculation amount is reduced.
In one embodiment, the method comprises the following specific steps of: extracting tunnel planar target feature points and tunnel linear target feature points from each graph to be spliced; matching the characteristic points of the tunnel planar targets in the images to be spliced, and matching the characteristic points of the tunnel linear targets in the images to be spliced to obtain a primary transformation matrix between the images to be spliced; and obtaining a transformation matrix between the pictures to be spliced according to the preliminary transformation matrix.
Referring to algorithms in the Point Cloud Library (PCL), target feature point identification is performed according to target clustering, track fitting and plane segmentation techniques, and tunnel planar target feature points and tunnel linear target feature points are extracted from each graph to be spliced. The 3D Scale-Invariant Feature Transform (SIFT3D) technique is used to find feature points of the same target in the point clouds of each graph to be spliced; the tunnel planar target feature points of the graphs to be spliced are matched, the tunnel linear target feature points are matched, and the directions and positions of the tunnel planar and linear targets are detected to obtain a preliminary transformation matrix between the graphs to be spliced. The preliminary transformation matrix is applied to map each graph to be spliced into a global coordinate system, ensuring correct alignment among the graphs to be spliced. The computational complexity of the traditional Iterative Closest Point (ICP) algorithm is simplified, and the improved closest-point matching algorithm is used to accurately register the point clouds of all the graphs to be spliced and to accurately calculate the geometric transformation required to convert each graph from its local coordinate system to the global coordinate system, thereby obtaining the transformation matrix between the graphs to be spliced.
In the embodiment, according to the matching of the characteristic points of the tunnel planar targets on each graph to be spliced and the matching of the characteristic points of the tunnel linear targets on each graph to be spliced, a preliminary transformation matrix between the graphs to be spliced is obtained; and obtaining a transformation matrix between the pictures to be spliced according to the preliminary transformation matrix.
In one embodiment, matching the tunnel planar target feature points and the tunnel linear target feature points across the graphs to be spliced to obtain a preliminary transformation matrix between the graphs specifically includes the following steps: matching the tunnel planar target feature points of the graphs to be spliced to obtain a transformation matrix to be optimized; and optimizing the transformation matrix to be optimized according to the tunnel linear target feature points of the graphs to be spliced to obtain the preliminary transformation matrix between the graphs.
The tunnel planar target feature points of the graphs to be spliced are matched, and the direction and position of the planar targets are detected, to obtain the transformation matrix to be optimized; the transformation matrix to be optimized is then refined according to the tunnel linear target feature points of the graphs to be spliced, with the direction and position of the linear targets detected, to obtain the preliminary transformation matrix between the graphs to be spliced.
In this embodiment, a relatively accurate preliminary transformation matrix between the graphs to be spliced is obtained by matching the tunnel planar target feature points and the tunnel linear target feature points across the graphs.
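Detecting the direction and position of a planar target can be illustrated by fitting a plane to its feature points. A minimal SVD-based sketch: the centroid gives the target's position, and the singular vector of the smallest singular value gives its direction (the plane normal).

```python
import numpy as np

def fit_plane(points):
    """Fit a plane to a planar target's feature points.
    Returns (centroid, unit normal): the target's position and direction."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # the smallest right singular vector of the centred points is the normal
    _, _, Vt = np.linalg.svd(pts - centroid)
    normal = Vt[-1]
    return centroid, normal / np.linalg.norm(normal)
```

Matching the centroids and normals of corresponding planar targets across two graphs constrains the rotation and translation between them.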
In one embodiment, fusing the point cloud overlapping areas of the spliced graphs to obtain a tunnel line graph specifically includes the following steps: determining the point cloud overlapping area of consecutive multi-frame spliced graphs; acquiring the temperature data of the point cloud overlapping area in the consecutive multi-frame spliced graphs; and averaging the temperature data of the point cloud overlapping area across the consecutive multi-frame spliced graphs to obtain the tunnel line graph.
For example, the point cloud overlapping area of three consecutive spliced graphs (front, middle, and rear) is determined, the temperature data of the three graphs in that overlapping area is acquired, and the temperature data is averaged across the three graphs to obtain the tunnel line graph.
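The averaging step can be sketched as follows, assuming each frame stores temperatures keyed by quantised point coordinates (the keying scheme is an assumption for illustration):

```python
def fuse_overlap_temps(frames, overlap_keys):
    """Average per-point temperatures of the overlap region across frames.
    frames: list of dicts mapping a quantised (x, y, z) key -> temperature.
    overlap_keys: keys of the points lying in the overlapping area."""
    fused = {}
    for k in overlap_keys:
        vals = [f[k] for f in frames if k in f]
        fused[k] = sum(vals) / len(vals)   # mean over the frames that saw it
    return fused
```

Averaging over several consecutive observations suppresses per-frame measurement noise in the overlap region.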
In this embodiment, by fusing the overlapping areas of the point clouds in the spliced graph, a relatively accurate tunnel line graph is obtained.
In one embodiment, after the temperature detection result of the tunnel equipment is obtained based on the tunnel line graph, the method provided by the application further includes: determining a device of interest based on detection information input by a user; and feeding back the temperature detection result of the device of interest to the user based on the tunnel line graph.
The device of interest is determined based on the detection information input by the user; the detection information may include the detection area, detection category, identification level, and alarm temperature information. The temperature detection result of the device of interest is fed back to the user based on the tunnel line graph, and may include over-temperature alarms, a temperature comparison line graph, and panoramic map information. The application can also feed back a detailed report to the user, including when and where an alarm was generated on a given device and a comparison of historical data changes.
In this embodiment, the device of interest is determined according to the detection information input by the user, and the temperature detection result of the device of interest is fed back to the user based on the tunnel line graph.
For a better understanding of the above method, an application example of the tunnel equipment temperature detection method of the present application is described in detail below, as shown in fig. 5.
A multi-frame tunnel point cloud map acquired by the radar and a multi-frame tunnel thermodynamic diagram acquired by the thermal imager are obtained. Extrinsic calibration is performed on the radar and the thermal imager, and a spatial coordinate registration operation is performed according to the coincidence characteristics of the main views and datum points of the radar and the thermal imager, so that each class-one real point cloud of a class-one region basically corresponds to a class-one thermal point of that region. For the class-one region in each group, the temperature value of the class-one thermal point is appended to the coordinate parameters of the class-one real point cloud as a fourth dimension, thereby obtaining the temperature values of the class-one real point clouds of the class-one region. For the class-two region in each group, the class-two region is divided into a gentle area, a gradient area, and a convex-concave area based on the temperature value distribution of its class-two thermal points; the filling density of each area is determined according to the distance values corresponding to the class-two real point clouds; and the class-two region is filled with point clouds according to the filling density of each area, yielding the class-two mixed point cloud.
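The division of a class-two region into gentle, gradient, and convex-concave areas can be sketched as a classification of the local temperature gradients. The threshold and the decision rule below are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def classify_region(temps, flat_tol=0.5):
    """Classify a 2-D grid of class-two thermal-point temperatures as
    'gentle', 'gradient', or 'convex-concave'. flat_tol (in degrees per
    grid step) is an assumed threshold for "small, uniform variation"."""
    gy, gx = np.gradient(temps.astype(float))
    mag = np.hypot(gx, gy)
    if mag.max() < flat_tol:                 # small change, uniform distribution
        return "gentle"
    mask = mag >= flat_tol
    signs_x, signs_y = set(np.sign(gx[mask])), set(np.sign(gy[mask]))
    # abrupt change in one consistent direction -> gradient area
    if len(signs_x) <= 1 and len(signs_y) <= 1:
        return "gradient"
    return "convex-concave"                  # multi-directional abrupt change
```

A bump or pit produces gradients pointing in opposing directions around its peak, which is what the multi-directional branch detects.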
The point cloud to be mapped is obtained based on the class-two mixed point cloud and the class-one real point clouds of the class-one region in the same group. Multi-scale mapping is performed between the point cloud to be mapped and the class-two thermal points to obtain the temperature calibration values of the class-two thermal points, and the class-two thermal points are mapped to the class-two mixed point cloud, so that the temperature values of the class-two mixed point cloud are obtained from those calibration values. Each frame of the tunnel fusion map is then divided into regions according to the echo intensity and the variation of the temperature values of its point cloud, and the equipment corresponding to each region is determined. The marked surfaces of the equipment are located to obtain the point sets and particle points (centroids) of the different surfaces of the equipment; the poles of the equipment are obtained from these point sets and particle points, and a temperature marking operation is performed to obtain the equipment temperature detection result of each frame of the tunnel fusion map. The frames of the tunnel fusion map are spliced to obtain the tunnel line graph; the tunnel fusion maps of adjacent frames in the tunnel line graph are determined; and the equipment temperature detection results of the adjacent frames are fused to obtain the temperature detection result of the tunnel equipment. Finally, the device of interest is determined based on detection information input by the user, and its temperature detection result is fed back to the user based on the tunnel line graph.
In this tunnel equipment temperature detection method, when the same-frame tunnel point cloud map and tunnel thermodynamic diagram are fused, the class-two region in each group is filled with point clouds based on the temperature value distribution of its class-two thermal points to obtain the class-two mixed point cloud, and the point cloud to be mapped is obtained from the class-two mixed point cloud and the class-one real point clouds of the class-one region in the same group. Multi-scale mapping between the point cloud to be mapped and the class-two thermal points yields the temperature calibration values of the class-two thermal points, from which the temperature values of the class-two mixed point cloud are obtained. This enriches the point cloud data of the tunnel fusion map, reduces the influence of radar blind areas on the accuracy of the temperature detection result, and improves the accuracy of the temperature detection result.
It should be understood that, although the steps in the flowcharts of the above embodiments are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least some of the steps in the flowcharts may comprise multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or in alternation with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the application further provides a tunnel equipment temperature detection apparatus for implementing the above tunnel equipment temperature detection method. The implementation of the solution provided by the apparatus is similar to that described for the method above, so for the specific limitations in the apparatus embodiments below, reference may be made to the limitations of the tunnel equipment temperature detection method above, which are not repeated here.
In an exemplary embodiment, as shown in fig. 6, there is provided a tunnel equipment temperature detection apparatus, wherein:
The image acquisition module 601 is configured to acquire a multi-frame tunnel point cloud image acquired by a radar and a multi-frame tunnel thermodynamic diagram acquired by a thermal imager;
The image fusion module 602 is configured to obtain a multi-frame tunnel fusion map based on fusion processing performed on the same-frame tunnel point cloud map and the tunnel thermodynamic diagram;
The temperature detection module 603 is configured to obtain a temperature detection result of the tunnel device based on the multi-frame tunnel fusion map;
The image fusion module 602 is further configured to obtain, for a class of regions in each group, a temperature value of a class of real point clouds of the class of regions based on a temperature value of a class of thermodynamic points of the class of regions when fusion processing is performed on the same-frame tunnel point cloud map and the tunnel thermodynamic map; aiming at the two-class areas in each group, filling point cloud of the two-class areas based on the temperature value distribution condition of the two-class thermal points of the two-class areas to obtain two-class mixing point cloud, and obtaining point cloud to be mapped based on the two-class mixing point cloud and one kind of real point cloud of the one-class area in the same group; performing multi-scale mapping on the point cloud to be mapped and the two types of heating points to obtain temperature calibration values of the two types of heating points, and obtaining the temperature values of the two types of mixing point clouds according to the temperature calibration values of the two types of heating points; the first-class area and the second-class area in the same group have common radar waves; the two types of areas are areas where radar waves and thermal imaging waves coincide, and the two types of areas are areas formed by adjacent radar waves with the thermal imaging waves between the two types of areas.
In one embodiment, the image fusion module 602 is further configured to: performing space coordinate registration operation on the same-frame tunnel point cloud graph and the tunnel thermodynamic diagram to determine initial temperature values of a class of real point clouds of the class of regions based on the temperature values of a class of thermodynamic points of the class of regions; and calibrating the initial temperature value of the class of real point clouds according to the difference between the distance value corresponding to the class of thermal points and the distance value corresponding to the class of real point clouds to obtain the temperature value of the class of real point clouds.
In one embodiment, the image fusion module 602 is further configured to: dividing the second-class region into a plurality of subareas based on the temperature value distribution condition of the second-class thermodynamic points of the second-class region; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud; according to the filling density of each subarea, respectively filling the subareas with point clouds to obtain second-class virtual point clouds; and obtaining the second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud.
In one embodiment, the image fusion module 602 is further configured to: dividing a second-class area where the second-class thermal points are located, wherein the temperature value change is not large and the distribution is uniform, in an arbitrary rectangular range into a gentle area to be filled; dividing a second-class area where the second-class heating points with temperature value distribution conforming to the same-direction abrupt change condition are located into gradient areas to be filled; dividing a second-class area of the bulge or the recess where the second-class heating power points with temperature value distribution conforming to the multi-directional mutation condition are located into convex-concave areas to be filled.
In one embodiment, the image fusion module 602 is further configured to: for each two-class thermodynamic point, acquiring an initial mapping area of the two-class thermodynamic point; the initial mapping area of the two-class thermodynamic point is an area where a corresponding two-class virtual point cloud is located based on each two-class thermodynamic point when the two-class thermodynamic point is filled with point clouds; dividing the initial mapping region according to different scales to obtain the mapping region of each two classes of thermodynamic points under each scale; mapping each two classes of thermodynamic points with the point cloud of the point cloud to be mapped in the mapping area under the corresponding different scales; and aiming at each two-class thermodynamic point, calibrating the temperature values of the two-class thermodynamic point based on the distance and the temperature value corresponding to the point cloud with the mapping of the two-class thermodynamic point, and obtaining the temperature calibration value of the two-class thermodynamic point.
In one embodiment, the image fusion module 602 is further configured to: sliding along four sides of the initial mapping area based on windows with different scales to obtain the mapping area under each scale; the mapping region comprises part of second-class virtual point clouds of the initial mapping region and real point clouds around the initial mapping region.
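Sliding a window of a given scale along the four sides of the initial mapping area can be sketched as follows; the exact window geometry and stepping are assumptions for illustration:

```python
def side_windows(x0, y0, w, h, scale, step):
    """Slide a square window of size `scale` along the four sides of the
    initial mapping rectangle (x0, y0, w, h); returns window origins.
    Each window straddles the boundary region, so it covers part of the
    class-two virtual point cloud inside and real points just outside."""
    wins = []
    x = x0
    while x + scale <= x0 + w:
        wins.append((x, y0))                  # top edge
        wins.append((x, y0 + h - scale))      # bottom edge
        x += step
    y = y0
    while y + scale <= y0 + h:
        wins.append((x0, y))                  # left edge
        wins.append((x0 + w - scale, y))      # right edge
        y += step
    return wins
```

Repeating this for several values of `scale` produces the mapping regions at each scale used in the multi-scale mapping.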
In one embodiment, the temperature detection module 603 is further configured to: acquiring a device temperature detection result of each frame of tunnel fusion map; splicing the tunnel fusion graphs of each frame to obtain the tunnel line graph; determining a tunnel fusion map of adjacent frames in the tunnel line map; and fusing the device temperature detection results of the adjacent frame tunnel fusion graphs to obtain the temperature detection results of the tunnel devices.
In one embodiment, the temperature detection module 603 is further configured to: dividing the areas of each frame of tunnel fusion map according to the echo intensity of the point cloud in each frame of tunnel fusion map and the change condition of the temperature value, and obtaining each area; determining equipment corresponding to each area; positioning the mark surface of the equipment to obtain point sets and particle points of different surfaces of the equipment; and obtaining poles of the equipment according to the point sets and the mass points of different surfaces of the equipment, and performing temperature marking operation to obtain an equipment temperature detection result of each frame of tunnel fusion map.
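Obtaining the particle point (centroid) and poles (axis-aligned extreme points) of a surface's point set can be sketched as:

```python
import numpy as np

def surface_centroid_and_poles(points):
    """Centroid ('particle point') and axis-aligned extreme points ('poles')
    of one device surface's point set, for the temperature marking step."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    poles = {
        "min": pts[pts.argmin(axis=0)],   # the points attaining min x, y, z
        "max": pts[pts.argmax(axis=0)],   # the points attaining max x, y, z
    }
    return centroid, poles
```

The centroid anchors the temperature label for the surface, while the poles bound the extent of the device in the fusion map.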
The modules in the tunnel equipment temperature detection device can be realized in whole or in part by software, hardware and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one exemplary embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 7. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing data of the tunnel device temperature detection method. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements a tunnel device temperature detection method.
It will be appreciated by those skilled in the art that the structure shown in FIG. 7 is merely a block diagram of some of the structures associated with the present inventive arrangements and is not limiting of the computer device to which the present inventive arrangements may be applied, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are both information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data are required to meet the related regulations.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory can include Random Access Memory (RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can take various forms such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in relative detail, but they are not to be construed as limiting the scope of the application. It should be noted that several variations and modifications can be made by those skilled in the art without departing from the spirit of the application, all of which fall within the protection scope of the application. Accordingly, the protection scope of the application shall be determined by the appended claims.

Claims (10)

1. A method for detecting a temperature of a tunnel device, the method comprising:
Acquiring a multi-frame tunnel point cloud image acquired by a radar and a multi-frame tunnel thermodynamic diagram acquired by a thermal imager;
Based on fusion processing of the same-frame tunnel point cloud image and the tunnel thermodynamic diagram, a multi-frame tunnel fusion image is obtained;
Based on the multi-frame tunnel fusion map, obtaining a temperature detection result of the tunnel equipment;
When fusion processing is carried out on the same-frame tunnel point cloud images and the tunnel thermodynamic diagrams, aiming at one type of region in each group, carrying out space coordinate registration operation on the same-frame tunnel point cloud images and the tunnel thermodynamic diagrams so as to determine initial temperature values of one type of real point clouds of the one type of region based on temperature values of one type of thermodynamic points of the one type of region; calibrating initial temperature values of the class of real point clouds according to the difference between the distance values corresponding to the class of thermal points and the distance values corresponding to the class of real point clouds to obtain temperature values of the class of real point clouds; dividing a second-class region where the second-class thermal points with small temperature value change and uniform distribution in any rectangular range are located into gentle regions to be filled aiming at the second-class regions in each group; dividing a second-class area where the second-class heating points with temperature value distribution conforming to the same-direction abrupt change condition are located into gradient areas to be filled; dividing a second-class area of the bulge or the recess where the second-class heating power points with temperature value distribution conforming to the multi-direction mutation situation are located into convex-concave areas to be filled; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud; according to the filling density of each subarea, respectively filling the subareas with point clouds to obtain second-class virtual point clouds; obtaining a second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud; obtaining point clouds to be mapped based on the two kinds of mixed point clouds and 
one kind of real point clouds of the one kind of areas in the same group; for each two-class thermodynamic point, acquiring an initial mapping area of the two-class thermodynamic point; the initial mapping area of the two-class thermodynamic point is an area where a corresponding two-class virtual point cloud is located based on each two-class thermodynamic point when the two-class thermodynamic point is filled with point clouds; sliding along four sides of the initial mapping area based on windows with different scales to obtain mapping areas with different scales; the mapping area comprises part of second-class virtual point clouds of the initial mapping area and real point clouds around the initial mapping area; mapping each two classes of thermodynamic points with the point cloud of the point cloud to be mapped in the mapping area under the corresponding different scales; aiming at each two-class thermodynamic point, calibrating the temperature value of the two-class thermodynamic point based on the distance and the temperature value corresponding to the point cloud with the mapping of the two-class thermodynamic point to obtain a temperature calibration value of the two-class thermodynamic point; obtaining the temperature value of the two-class mixing point cloud according to the temperature calibration value of the two-class heating point; the first-class area and the second-class area in the same group have common radar waves; the two types of areas are areas where radar waves and thermal imaging waves coincide, and the two types of areas are areas formed by adjacent radar waves with the thermal imaging waves between the two types of areas.
2. The method of claim 1, wherein the obtaining the temperature detection result of the tunnel device based on the multi-frame tunnel fusion map comprises:
Acquiring a device temperature detection result of each frame of tunnel fusion map;
splicing the tunnel fusion graphs of each frame to obtain a tunnel line graph;
Determining a tunnel fusion map of adjacent frames in the tunnel line map;
And fusing the device temperature detection results of the adjacent frame tunnel fusion graphs to obtain the temperature detection results of the tunnel devices.
3. The method of claim 2, wherein the obtaining the device temperature detection result of the per-frame tunnel fusion map comprises:
Dividing the areas of each frame of tunnel fusion map according to the echo intensity of the point cloud in each frame of tunnel fusion map and the change condition of the temperature value, and obtaining each area;
Determining equipment corresponding to each area;
positioning the mark surface of the equipment to obtain point sets and particle points of different surfaces of the equipment;
And obtaining poles of the equipment according to the point sets and the mass points of different surfaces of the equipment, and performing temperature marking operation to obtain an equipment temperature detection result of each frame of tunnel fusion map.
4. The method of claim 2, wherein the splicing the tunnel fusion map per frame to obtain a tunnel line map includes:
determining a point cloud overlapping area of a continuous multi-frame tunnel fusion map;
Acquiring temperature data of the point cloud overlapping region in a continuous multi-frame tunnel fusion map;
And carrying out averaging on the temperature data of the point cloud overlapping region in the continuous multi-frame tunnel fusion map to obtain the tunnel line map.
5. A tunnel equipment temperature detection apparatus, the apparatus comprising:
The image acquisition module is used for acquiring a multi-frame tunnel point cloud image acquired by a radar and a multi-frame tunnel thermodynamic diagram acquired by a thermal imager;
The image fusion module is used for obtaining a multi-frame tunnel fusion map based on fusion processing of the same-frame tunnel point cloud map and the tunnel thermodynamic diagram;
the temperature detection module is used for obtaining a temperature detection result of the tunnel equipment based on the multi-frame tunnel fusion map;
The image fusion module is further used for carrying out space coordinate registration operation on the same-frame tunnel point cloud images and the tunnel thermodynamic diagrams aiming at one type of areas in each group when carrying out fusion processing on the same-frame tunnel point cloud images and the tunnel thermodynamic diagrams so as to determine initial temperature values of one type of real point clouds of the one type of areas based on temperature values of one type of thermodynamic points of the one type of areas; calibrating initial temperature values of the class of real point clouds according to the difference between the distance values corresponding to the class of thermal points and the distance values corresponding to the class of real point clouds to obtain temperature values of the class of real point clouds; dividing a second-class region where the second-class thermal points with small temperature value change and uniform distribution in any rectangular range are located into gentle regions to be filled aiming at the second-class regions in each group; dividing a second-class area where the second-class heating points with temperature value distribution conforming to the same-direction abrupt change condition are located into gradient areas to be filled; dividing a second-class area of the bulge or the recess where the second-class heating power points with temperature value distribution conforming to the multi-direction mutation situation are located into convex-concave areas to be filled; determining the filling density of each subarea according to the distance value corresponding to the second-class real point cloud; according to the filling density of each subarea, respectively filling the subareas with point clouds to obtain second-class virtual point clouds; obtaining a second-class mixed point cloud according to the second-class real point cloud and the second-class virtual point cloud; obtaining point clouds to be mapped based on the 
two kinds of mixed point clouds and one kind of real point clouds of the one kind of areas in the same group; for each two-class thermodynamic point, acquiring an initial mapping area of the two-class thermodynamic point; the initial mapping area of the two-class thermodynamic point is an area where a corresponding two-class virtual point cloud is located based on each two-class thermodynamic point when the two-class thermodynamic point is filled with point clouds; sliding along four sides of the initial mapping area based on windows with different scales to obtain mapping areas with different scales; the mapping area comprises part of second-class virtual point clouds of the initial mapping area and real point clouds around the initial mapping area; mapping each two classes of thermodynamic points with the point cloud of the point cloud to be mapped in the mapping area under the corresponding different scales; aiming at each two-class thermodynamic point, calibrating the temperature value of the two-class thermodynamic point based on the distance and the temperature value corresponding to the point cloud with the mapping of the two-class thermodynamic point to obtain a temperature calibration value of the two-class thermodynamic point; obtaining the temperature value of the two-class mixing point cloud according to the temperature calibration value of the two-class heating point; the first-class area and the second-class area in the same group have common radar waves; the two types of areas are areas where radar waves and thermal imaging waves coincide, and the two types of areas are areas formed by adjacent radar waves with the thermal imaging waves between the two types of areas.
6. The apparatus of claim 5, wherein the temperature detection module is further configured to: acquire a device temperature detection result for each frame of the tunnel fusion map; splice the frames of the tunnel fusion map to obtain a tunnel line map; determine adjacent-frame tunnel fusion maps in the tunnel line map; and fuse the device temperature detection results of the adjacent-frame tunnel fusion maps to obtain the temperature detection result of the tunnel device.
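The fusion of per-frame detection results in claim 6 might, for example, average the readings a device receives across the frames in which it appears. A simplified sketch, in which the per-device keying and the averaging rule are assumptions rather than the claim's actual fusion method:

```python
from collections import defaultdict

def fuse_adjacent_detections(frames):
    """frames: list of per-frame detection dicts {device_id: temperature},
    ordered along the tunnel line. A device detected in several (adjacent)
    frames gets the mean of its readings; a device seen once keeps its value."""
    acc = defaultdict(list)
    for detections in frames:
        for dev, temp in detections.items():
            acc[dev].append(temp)
    return {dev: sum(temps) / len(temps) for dev, temps in acc.items()}
```

For instance, a fan detected at 40.0 °C in one frame and 42.0 °C in the next would be reported at 41.0 °C.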
7. The apparatus of claim 6, wherein the temperature detection module is further configured to: divide each frame of the tunnel fusion map into regions according to the echo intensity and the temperature-value variation of its point cloud; determine the device corresponding to each region; locate the marker surfaces of the device to obtain the point sets and mass points (centroids) of its different surfaces; and obtain the poles of the device from the point sets and mass points of its different surfaces and perform a temperature marking operation, obtaining the device temperature detection result of each frame of the tunnel fusion map.
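The per-surface geometry step in claim 7 can be sketched as computing a centroid for each surface point set and extracting extreme points. The claim does not define its "poles", so the axis-aligned extremes below are a stand-in assumption, as is the function name:

```python
import numpy as np

def surface_centroid_and_poles(points: np.ndarray):
    """Given one device surface as an (N, 3) point set, return its centroid
    and six axis-aligned extreme points (used here as a stand-in for the
    claim's 'poles'; the patented pole definition may differ)."""
    centroid = points.mean(axis=0)
    poles = [points[points[:, k].argmin()] for k in range(3)] + \
            [points[points[:, k].argmax()] for k in range(3)]
    return centroid, np.array(poles)
```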
8. The apparatus of claim 6, wherein the temperature detection module is further configured to: determine the point cloud overlap region of consecutive multi-frame tunnel fusion maps; acquire the temperature data of the point cloud overlap region in the consecutive frames; and average the temperature data of the point cloud overlap region across the consecutive frames to obtain the tunnel line map.
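The averaging step of claim 8 reduces to an element-wise mean over the overlap region's temperature samples across consecutive frames. A minimal sketch, assuming the samples of the overlap region are already aligned point-for-point between frames:

```python
import numpy as np

def average_overlap(temps_per_frame):
    """temps_per_frame: list of 1-D arrays, the temperature samples of the
    same point-cloud overlap region in consecutive tunnel fusion-map frames,
    aligned by index. Returns the element-wise mean used for the line map."""
    return np.mean(np.stack(temps_per_frame), axis=0)
```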
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 4.
10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 4.
CN202410191391.3A 2024-02-21 2024-02-21 Tunnel equipment temperature detection method and device and computer equipment Active CN117740186B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410191391.3A CN117740186B (en) 2024-02-21 2024-02-21 Tunnel equipment temperature detection method and device and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410191391.3A CN117740186B (en) 2024-02-21 2024-02-21 Tunnel equipment temperature detection method and device and computer equipment

Publications (2)

Publication Number Publication Date
CN117740186A CN117740186A (en) 2024-03-22
CN117740186B CN117740186B (en) 2024-05-10

Family

ID=90283476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410191391.3A Active CN117740186B (en) 2024-02-21 2024-02-21 Tunnel equipment temperature detection method and device and computer equipment

Country Status (1)

Country Link
CN (1) CN117740186B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021102795A1 (en) * 2019-11-27 2021-06-03 大连港森立达木材交易中心有限公司 Temperature surveying and mapping system and method for log inactivation bin
CN113343745A (en) * 2021-02-26 2021-09-03 北京中科慧眼科技有限公司 Binocular camera-based remote target detection method and system and intelligent terminal
CN114263501A (en) * 2021-12-28 2022-04-01 江苏亚冠轨道交通科技有限公司 Tunnel water seepage and water burst monitoring system based on image recognition and thermal infrared combination
CN115359021A (en) * 2022-08-29 2022-11-18 上海大学 Target positioning detection method based on laser radar and camera information fusion
CN115731545A (en) * 2022-12-06 2023-03-03 国网江苏省电力有限公司 Cable tunnel inspection method and device based on fusion perception
WO2023040247A1 (en) * 2021-09-18 2023-03-23 浙江大学 Road area image recognition method based on image and point cloud fusion network
CN115876198A (en) * 2022-11-28 2023-03-31 烟台艾睿光电科技有限公司 Target detection and early warning method, device, system and medium based on data fusion
CN116702264A (en) * 2023-04-18 2023-09-05 中铁四局集团有限公司 Tunnel deformation analysis method, system, computer device and readable storage medium
WO2023185069A1 (en) * 2022-04-01 2023-10-05 北京京东乾石科技有限公司 Object detection method and apparatus, and computer-readable storage medium and unmanned vehicle
CN117011830A (en) * 2023-08-16 2023-11-07 微牌科技(浙江)有限公司 Image recognition method, device, computer equipment and storage medium
CN117146733A (en) * 2023-06-07 2023-12-01 中国空气动力研究与发展中心超高速空气动力研究所 Comprehensive measurement method for high Wen Mubiao three-dimensional morphology and temperature field
CN117197586A (en) * 2023-10-18 2023-12-08 中南大学 Tunnel guniting intelligent detection method and system based on neural network and point cloud processing
CN117309856A (en) * 2023-08-30 2023-12-29 中国科学院空天信息创新研究院 Smoke screen effect monitoring method and device, electronic equipment and storage medium
CN117557731A (en) * 2023-12-07 2024-02-13 中南大学 Forging temperature field distribution and three-dimensional size integrated reconstruction method
CN117553919A (en) * 2023-10-09 2024-02-13 河北工业大学 Relay temperature field measurement method based on three-dimensional point cloud and thermal image
CN117745537A (en) * 2024-02-21 2024-03-22 微牌科技(浙江)有限公司 Tunnel equipment temperature detection method, device, computer equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10796403B2 (en) * 2017-09-14 2020-10-06 The Regents Of The University Of Colorado, A Body Corporate Thermal-depth fusion imaging
CN108663677A (en) * 2018-03-29 2018-10-16 上海智瞳通科技有限公司 A kind of method that multisensor depth integration improves target detection capabilities
US20220285009A1 (en) * 2019-08-16 2022-09-08 Z Imaging Systems and methods for real-time multiple modality image alignment
CN114093142B (en) * 2020-08-05 2023-09-01 安霸国际有限合伙企业 Object-perceived temperature anomaly monitoring and early warning by combining visual sensing and thermal sensing


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Application of a new point cloud registration method in tunnel deformation monitoring; 张瀚 (Zhang Han); 高飞 (Gao Fei); Information & Communications; 2016-05-15 (No. 05); full text *
Rapid ovality detection method for metro tunnels based on 3D laser scanning; 袁辉 (Yuan Hui); Urban Mass Transit Research; 2020-07-10 (No. 07); full text *
Registration of a thermal imager and 3D lidar using a temperature-controlled hollowed-out heating mesh; 宗民 (Zong Min); 杨毅 (Yang Yi); 朱昊 (Zhu Hao); 付梦印 (Fu Mengyin); 汪顺亭 (Wang Shunting); Infrared and Laser Engineering; 2014-12-25 (No. 12); full text *
A survey of 3D air-corridor visualization for low-altitude safety; 冯登超 (Feng Dengchao); Electronic Measurement Technology; 2018-04-27 (No. 09); full text *

Also Published As

Publication number Publication date
CN117740186A (en) 2024-03-22

Similar Documents

Publication Publication Date Title
KR102126724B1 (en) Method and apparatus for restoring point cloud data
CN108369743B (en) Mapping a space using a multi-directional camera
US9483703B2 (en) Online coupled camera pose estimation and dense reconstruction from video
CN117745537B (en) Tunnel equipment temperature detection method, device, computer equipment and storage medium
JP4685313B2 (en) Method for processing passive volumetric image of any aspect
JP6057298B2 (en) Rapid 3D modeling
US10134152B2 (en) Method and system for determining cells traversed by a measuring or visualization axis
Zhao et al. Geometric-constrained multi-view image matching method based on semi-global optimization
US20110110557A1 (en) Geo-locating an Object from Images or Videos
Zhang et al. Leveraging vision reconstruction pipelines for satellite imagery
US8264537B2 (en) Photogrammetric networks for positional accuracy
CN104019829A (en) Vehicle-mounted panorama camera based on POS (position and orientation system) and external parameter calibrating method of linear array laser scanner
CN112396640A (en) Image registration method and device, electronic equipment and storage medium
CN114937081B (en) Internet vehicle position estimation method and device based on independent non-uniform incremental sampling
CN112233246B (en) Satellite image dense matching method and system based on SRTM constraint
CN113205604A (en) Feasible region detection method based on camera and laser radar
CN114898044B (en) Imaging method, device, equipment and medium for detection object
CN112862890B (en) Road gradient prediction method, device and storage medium
CN117740186B (en) Tunnel equipment temperature detection method and device and computer equipment
CN114913500B (en) Pose determination method and device, computer equipment and storage medium
Liu et al. Displacement field reconstruction in landslide physical modeling by using a terrain laser scanner–Part 1: Methodology, error analysis and validation
CN113593026A (en) Lane line marking auxiliary map generation method and device and computer equipment
CN112183378A (en) Road slope estimation method and device based on color and depth image
US20230334688A1 (en) Multi-view height estimation from satellite images
Wang et al. Upsampling method for sparse light detection and ranging using coregistered panoramic images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant