CN114596506A - Unmanned aerial vehicle inspection equipment and image fusion method - Google Patents
- Publication number
- CN114596506A (application CN202210207347.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- unmanned aerial
- aerial vehicle
- visible light
- infrared
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D47/00—Equipment not otherwise provided for
- B64D47/08—Arrangements of cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/48—Thermography; Techniques using wholly visual means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J2005/0077—Imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20132—Image cropping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20192—Edge enhancement; Edge preservation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
Abstract
The invention relates to the technical field of unmanned aerial vehicle image processing, and provides unmanned aerial vehicle inspection equipment and an image fusion method. The unmanned aerial vehicle inspection equipment comprises a dual-light thermometric camera, a gimbal, an unmanned aerial vehicle body, and an optical system. The dual-light thermometric camera is fixedly connected to the gimbal; the gimbal is provided with a gimbal interface position, which is mechanically mounted on the unmanned aerial vehicle body; the optical system is mounted on the dual-light thermometric camera and electrically connected to it. By combining the dual-light thermometric camera with the optical system, the equipment achieves accurate imaging of the inspection environment, and the gimbal allows it to be mounted on unmanned aerial vehicle bodies of any type, further widening the range of unmanned aerial vehicle inspection applications. In addition, the dual-light thermometric camera enables fusion of multiple imaging processes, improving imaging accuracy.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle image processing, and in particular to unmanned aerial vehicle inspection equipment and an image fusion method.
Background
At present, in data acquisition for unmanned aerial vehicle inspection, an infrared camera and a visible light camera are generally used to acquire infrared images and visible light images respectively. The optical system of an infrared camera is usually a sealed design, and the filter it is paired with cannot cover every scene; for different detection targets, the whole detector assembly often has to be replaced, which greatly reduces the infrared camera's ease of use.
In the visible/infrared image fusion methods applied in this field, an infrared lens and a visible light lens acquire the infrared image and the visible light image separately, and the fusion algorithms applied on unmanned aerial vehicles are mainly based on multi-scale decomposition. Multi-scale decomposition extracts hand-crafted features, whereas deep learning extracts deep features that better express image texture information, so its fusion results are better than those of multi-scale decomposition. However, deep learning algorithms place high demands on processor performance.
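For context, the multi-scale decomposition idea can be illustrated with a minimal two-scale sketch (a simplification for illustration only, not the algorithm referenced in the patent): each image is split into a smoothed base layer and a detail layer, the base layers are averaged, and the stronger detail is kept.

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box filter (odd k), with edge padding to keep the shape."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    kernel = np.ones(k) / k
    tmp = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, tmp)

def two_scale_fuse(vis, ir, k=5):
    """Fuse two grayscale images: average the base layers, take the
    larger-magnitude detail coefficient at each pixel."""
    base_v, base_i = box_blur(vis, k), box_blur(ir, k)
    det_v, det_i = vis - base_v, ir - base_i
    base = 0.5 * (base_v + base_i)
    detail = np.where(np.abs(det_v) >= np.abs(det_i), det_v, det_i)
    return base + detail
```

Real multi-scale methods repeat this decomposition over several pyramid levels; the two-scale case just shows why such features are "hand-crafted": the decomposition and fusion rules are fixed rather than learned.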
Disclosure of Invention
To solve these problems in the prior art, the invention provides unmanned aerial vehicle inspection equipment and an image fusion method.
To this end, the invention provides the following scheme:
unmanned aerial vehicle inspection equipment, comprising: a dual-light thermometric camera, a gimbal, an unmanned aerial vehicle body, and an optical system;
the dual-light thermometric camera is fixedly connected to the gimbal; the gimbal is provided with a gimbal interface position; the gimbal interface is mechanically mounted on the unmanned aerial vehicle body; and the optical system is mounted on the dual-light thermometric camera and electrically connected to it.
Preferably, the dual-light thermometric camera comprises: a housing, a data acquisition module, a standard processing module, and an image fusion module;
the housing is fixedly connected to the gimbal and provided with hole sites for mounting the optical system; the data acquisition module is electrically connected to the optical system and to the standard processing module; the standard processing module is electrically connected to the image fusion module;
the data acquisition module is configured to acquire optical data from the optical system, the optical data comprising visible light image data and infrared image data;
the standard processing module is configured to preprocess the optical data, the preprocessing comprising image cropping and image registration;
and the image fusion module is configured to fuse the preprocessed optical data using a CPCT fusion algorithm to obtain a fused image, the fused image being the inspection image.
Preferably, the housing comprises: a first sub-housing, a second sub-housing, and a third sub-housing;
the first sub-housing is mounted at one end of the second sub-housing and the third sub-housing at the other end; the first sub-housing is fixedly connected to the gimbal; the three sub-housings together form an accommodating space, in which the data acquisition module, the standard processing module, and the image fusion module are all arranged;
and the third sub-housing is provided with a first hole site, a second hole site, and a third hole site.
Preferably, the optical system includes: a visible light imager and an infrared imager;
the visible light imager and the infrared imager are both electrically connected with the data acquisition module.
Preferably, the visible light imager comprises: a visible light lens and a visible light imaging circuit;
the visible light lens is fixedly mounted in the third hole site and electrically connected to the visible light imaging circuit, which is in turn electrically connected to the data acquisition module.
Preferably, the infrared imager comprises an infrared lens, a chopper, an infrared detector, and an infrared imaging circuit, electrically connected in sequence;
the infrared lens is mounted in the second hole site, and the infrared imaging circuit is electrically connected to the data acquisition module.
Preferably, the chopper is a plug-in chopper.
Preferably, the plug-in chopper comprises: a chopper base and an optical filter;
the chopper base has a clip-type structure for clamping the optical filter.
Preferably, the equipment further comprises an LED supplementary light;
the LED supplementary light is mounted in the first hole site.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects:
By combining the dual-light thermometric camera with the optical system, the unmanned aerial vehicle inspection equipment achieves accurate imaging of the inspection environment, and the gimbal allows it to be mounted on unmanned aerial vehicle bodies of any type, further widening the range of unmanned aerial vehicle inspection applications. In addition, the dual-light thermometric camera enables fusion of multiple imaging processes, improving imaging accuracy.
The invention also provides an image fusion method, applied to the above unmanned aerial vehicle inspection equipment; the image fusion method comprises the following steps:
acquiring optical data, the optical data comprising visible light image data and infrared image data;
cropping the visible light image in the visible light image data, taking the resolution of the infrared image in the infrared image data as the reference;
extracting first feature points and second feature points, where the first feature points are feature points of the visible light image and the second feature points are feature points of the infrared image;
using a feature-point registration algorithm, obtaining first image coordinate transformation parameters from the first feature points and second image coordinate transformation parameters from the second feature points;
registering the cropped visible light image with the first image coordinate transformation parameters to obtain a first registered image, and registering the infrared image with the second image coordinate transformation parameters to obtain a second registered image;
and fusing the first registered image and the second registered image using a CPCT fusion algorithm based on a reference template to obtain the inspection image.
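The registration step above can be sketched as a least-squares fit of coordinate transformation parameters from matched feature points. The patent does not specify the transformation model, so the 2x3 affine form and the helper names below are illustrative assumptions:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine transform mapping src_pts onto dst_pts.
    Both inputs are (N, 2) arrays of matched points, N >= 3, not collinear."""
    n = src_pts.shape[0]
    A = np.hstack([src_pts, np.ones((n, 1))])          # [x, y, 1] rows
    params, *_ = np.linalg.lstsq(A, dst_pts, rcond=None)
    return params.T                                    # (2, 3): [linear | translation]

def warp_points(pts, M):
    """Apply the 2x3 transform M to (N, 2) points."""
    return pts @ M[:, :2].T + M[:, 2]
```

In practice the matched points would come from a feature detector/matcher run on the visible and infrared images; once `M` is estimated, the whole image grid is resampled with the same transform.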
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of the unmanned aerial vehicle inspection equipment provided by the invention;
FIG. 2 is a schematic structural diagram of a dual-light thermometric camera according to an embodiment of the invention;
fig. 3 is a schematic structural diagram of a plug-in chopper according to an embodiment of the present invention;
FIG. 4 is a flowchart of an image fusion method provided by the present invention;
fig. 5 is a flowchart of a CPCT image fusion based on YUV space according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a fused image according to an embodiment of the present invention.
Description of reference numerals:
1 - gimbal, 2 - first sub-housing, 3 - second sub-housing, 4 - third sub-housing, 5 - first hole site, 6 - second hole site, 7 - third hole site, 8 - gimbal interface position, 11 - chopper base, 12 - optical filter.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide unmanned aerial vehicle inspection equipment and an image fusion method that improve the shooting precision of inspection images and the universality of unmanned aerial vehicle applications in the inspection field.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The invention provides unmanned aerial vehicle inspection equipment comprising: a dual-light thermometric camera, a gimbal 1, an unmanned aerial vehicle body, and an optical system.
The dual-light thermometric camera is fixedly connected to the gimbal 1. The gimbal 1 is provided with a gimbal interface position 8, which is mechanically mounted on the unmanned aerial vehicle body. The optical system is mounted on the dual-light thermometric camera and electrically connected to it.
To further improve the accuracy of inspection imaging, as shown in Fig. 1 and Fig. 2, the dual-light thermometric camera comprises: a housing, a data acquisition module, a standard processing module, and an image fusion module.
The housing is fixedly connected to the gimbal 1 and provided with hole sites for mounting the optical system. The data acquisition module is electrically connected to the optical system and to the standard processing module, and the standard processing module is electrically connected to the image fusion module. The dual-light thermometric camera can also be mounted at the payload position of the unmanned aerial vehicle through a connecting frame and the stabilizing vibration damping of the gimbal 1; the specific mounting position is chosen according to actual needs.
The data acquisition module acquires optical data from the optical system; the optical data comprises visible light image data and infrared image data. In actual use, the module collects a raw data set from the optical system that contains a visible light image set and an infrared image set.
The standard processing module preprocesses the optical data; the preprocessing comprises image cropping and image registration. In specific use, the standard processing module performs image registration and image cropping on the visible light and infrared image sets to obtain a registered, size-standardized standard data set.
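As an illustration of the cropping step, a center crop of the higher-resolution visible frame to match the infrared frame might look like the sketch below. The centered crop and the `fov_ratio` scale between the two fields of view are assumptions for illustration, not details from the patent:

```python
import numpy as np

def center_crop_to_ir(vis, ir_shape, fov_ratio=1.0):
    """Crop the visible frame around its center to the infrared frame size,
    optionally scaled by an assumed field-of-view ratio."""
    h_ir, w_ir = ir_shape
    h, w = vis.shape[:2]
    ch, cw = int(h_ir * fov_ratio), int(w_ir * fov_ratio)
    top, left = (h - ch) // 2, (w - cw) // 2
    return vis[top:top + ch, left:left + cw]
```

After cropping, the two frames share a resolution and can be passed to feature-based registration.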
The image fusion module fuses the preprocessed optical data with a CPCT fusion algorithm to obtain the fused image, which is the inspection image. The image fusion module is designed around the YUV color space and uses an efficient color-transfer method between visible and infrared images, achieving end-to-end infrared conversion and fusion conveniently and quickly. Adding visible light information to the texture-poor infrared image yields a fused image set with enhanced observability, which allows operators to accumulate temperature data and track temperature changes in the inspected equipment.
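A minimal sketch of YUV-space fusion, assuming BT.601-style conversion matrices and a simple luminance blend; the patent does not disclose the exact CPCT blend rule, so the `alpha` weight and the replace-Y strategy are assumptions:

```python
import numpy as np

# BT.601 full-range RGB -> YUV matrix (approximate) and its inverse
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.147, -0.289,  0.436],
                    [ 0.615, -0.515, -0.100]])
YUV2RGB = np.linalg.inv(RGB2YUV)

def fuse_yuv(vis_rgb, ir_gray, alpha=0.5):
    """Blend the infrared intensity into the Y channel while keeping the
    visible image's chrominance (U, V), then convert back to RGB."""
    yuv = vis_rgb @ RGB2YUV.T
    yuv[..., 0] = (1 - alpha) * yuv[..., 0] + alpha * ir_gray
    return yuv @ YUV2RGB.T
```

Keeping U and V from the visible image is what preserves natural colors; the infrared contribution enters only through luminance, where heat sources stand out.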
To improve the convenience of maintenance and installation, the housing comprises: a first sub-housing 2, a second sub-housing 3, and a third sub-housing 4.
The first sub-housing 2 is mounted at one end of the second sub-housing 3 and the third sub-housing 4 at the other end. The first sub-housing 2 is fixedly connected to the gimbal 1. Together, the three sub-housings form an accommodating space in which the data acquisition module, the standard processing module, and the image fusion module are arranged. The third sub-housing 4 is provided with a first hole site 5, a second hole site 6, and a third hole site 7.
Further, to improve the image quality of the inspection equipment, the optical system comprises a visible light imager and an infrared imager; for example, a 640-resolution staring uncooled infrared detector and a 1080p high-definition visible light camera. Combining the infrared camera's ability to find heat sources with the visible light camera's ability to observe detail, and integrating a high-resolution module, makes the infrared image clearer and locates heat spots accurately, so the device can also use the fusion function in dark environments such as inside pipelines.
The visible light imager and the infrared imager are both electrically connected with the data acquisition module.
The visible light imager comprises: a visible light lens and a visible light imaging circuit. The visible light lens is fixedly mounted in the third hole site 7 and electrically connected to the visible light imaging circuit, which is in turn electrically connected to the data acquisition module. The other components of the imager are housed in the accommodating space to reduce the overall volume of the unmanned aerial vehicle inspection equipment.
The infrared imager comprises an infrared lens, a chopper, an infrared detector, and an infrared imaging circuit, electrically connected in sequence. The infrared radiation of the target object is focused by the lens, band-selected by the chopper, and then imaged on the focal plane of the detector.
The infrared lens is mounted in the second hole site 6, and the infrared imaging circuit is electrically connected to the data acquisition module. To simplify installation of the infrared imager, a dedicated mounting structure can also be provided, mainly for mounting and fixing components such as the lens, the filter-switching device, the infrared detector, the infrared imaging circuit, the display, and the power supply. This structure comprises an internal support frame and a shell, the internal frame mainly supporting and fixing each module.
During installation, the visible light lens and the infrared lens are fixed in the horizontal and vertical directions and mounted side by side with parallel, coaxial optical axes, with no angular deviation about the X, Y, or Z axes after fixing. The angular deviation between the two optical axes therefore needs no extensive manual adjustment, which greatly reduces the assembly cost of the dual-light fusion camera. This solves the prior-art problem that the center positions of the thermal infrared camera and the visible light camera are difficult to keep consistent during assembly, otherwise requiring extensive manual correction of the angle and distance errors between the optical axes.
To adapt to the selection of different wavebands, the chopper used in the invention is preferably a plug-in chopper. As shown in Fig. 3, the plug-in chopper comprises: a chopper base 11 and an optical filter 12.
The chopper base 11 has a clip-type structure with a through-hole reserved at its lower end for clamping the optical filter 12. To switch to a filter 12 with a different absorption waveband, only the chopper base 11 needs to be opened, which markedly extends the range of band imaging and simplifies operation. For example, a through-hole 26 mm in diameter is reserved in the lower part of the chopper base 11; filters thinner than 1 mm with a diameter of 25.4 mm or 23.0 mm can be mounted.
In addition, the unmanned aerial vehicle inspection equipment provided by the invention is fitted with an LED supplementary light, mounted in the first hole site 5 to provide illumination compensation when ambient light is dim.
The embedded image fusion technique thus combines infrared thermal imaging, a visible light camera, and edge computing into one dual-light thermometric camera; the infrared imaging technology, together with the supplementary light, allows the unmanned aerial vehicle inspection equipment to use the fusion function even in dark environments.
The technical scheme provided by the invention is explained in detail below, taking the case where the unmanned aerial vehicle inspection equipment is connected via the gimbal 1 to a DJI M300 or an unmanned aerial vehicle of the same series.
The dual-light thermometric camera is mounted at the payload position directly below the unmanned aerial vehicle through an external gimbal.
Furthermore, the dual-light thermometric camera mainly comprises a camera housing, a visible light lens, an infrared lens, an LED supplementary light, and a microprocessor. The camera housing is an integrated aluminum-alloy shell that carries every component of the camera. The front face of the front housing 4 is provided with the visible light lens mounting position 7 and the infrared lens mounting position 6, the visible light and infrared lenses being mounted side by side and coaxially. The side of the middle housing 3 is provided with the gimbal interface position 8.
The LED supplementary light is mounted on the front face of the housing via the internal support structure. When the environment is dark, turning on the supplementary light brightens the surroundings so that the high-definition images shot by the camera are clearer, further improving the accuracy of heat-source judgment and allowing the device to perform stable inspection work in dark environments such as inside pipelines.
The visible light lens and the infrared lens are fixed in the horizontal and vertical directions and mounted side by side with parallel, coaxial axes, with no angular deviation about the X, Y, or Z axes after fixing. Further, the infrared and visible light lenses are connected to the housing through air damping with an isolation frequency of 70 to 200 Hz. The front of the housing carries the dual-light lens mounting positions, with the visible light and infrared lenses mounted side by side and coaxially, and the side of the housing carries the gimbal interface position.
As noted, the LED supplementary light mounted on the front of the housing via the internal support structure enables the device to perform stable inspection work in dark environments such as inside pipelines.
The standard processing module is arranged inside the housing via the internal support structure. It is designed around a deep learning network and uses a feature-conversion method between visible and infrared images, which is convenient and fast. By applying deep learning and big-data algorithms, visible light information is added to the texture-poor infrared image to obtain a fused image set, which enhances the observability of the images and allows operators to accumulate temperature data and track temperature changes in the inspected equipment.
The infrared lens and the chopper form an infrared imager. The infrared radiation of the inspection target is focused by the lens, and imaging is performed after the chopper selects the waveband range. The infrared lens is a transmission-type optical system made of germanium and coated with an antireflection film, covering the 3-5 μm waveband. The chopper assembly adopts a plug-in structure, so that the corresponding optical filter can be replaced according to the wavelength range of the gas to be detected.
Based on the above description, the unmanned aerial vehicle inspection equipment provided by the invention is based on infrared temperature measurement technology and combines the heat-source-finding advantage of infrared imaging with the detail-observation advantage of visible light into a dual-light fusion temperature measurement detector. The high resolution of the two lenses makes the infrared image clearer and locates heating points accurately, and, in cooperation with the LED fill light, enables the unmanned aerial vehicle to output fused images stably during inspection work in dark environments such as pipelines. Operators can conveniently accumulate temperature data; specific image data and position information can be prompted to operators so that fault points can be tracked, alarms confirmed and faults eliminated; and the equipment can adapt to most unmanned-aerial-vehicle inspection application scenarios.
The adopted visible light and infrared image fusion technology uses constant parameter color transfer, i.e. the CPCT method: an image fusion method obtained by analyzing the mean and variance of a reference image.
Based on the processing concept, the invention also provides an image fusion method which is applied to the unmanned aerial vehicle inspection equipment. As shown in fig. 4, the image fusion method includes:
step 100: optical data is acquired. The optical data includes: visible light image data and infrared image data. In a specific application process, visible light images and infrared images are collected through a visible light lens and an infrared lens in a double-light temperature measuring camera carried by the unmanned aerial vehicle. The visible image has a larger field of view than the infrared image.
Step 101: cropping the visible light image in the visible light image data with the resolution of the infrared image in the infrared image data as a reference.
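As an illustrative sketch of this cropping step (the patent does not fix the crop position, so a simple NumPy center crop is assumed here):

```python
import numpy as np

def center_crop_to(image, target_h, target_w):
    """Center-crop an (H, W, C) visible-light frame to the infrared resolution."""
    h, w = image.shape[:2]
    top = (h - target_h) // 2
    left = (w - target_w) // 2
    return image[top:top + target_h, left:left + target_w]

# e.g. a 1920x1080 visible frame cropped to a 640x512 infrared frame
visible = np.zeros((1080, 1920, 3), dtype=np.uint8)
cropped = center_crop_to(visible, 512, 640)  # shape (512, 640, 3)
```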
Step 102: and extracting the first characteristic point and the second characteristic point. The first feature point is a feature point of a visible light image in the visible light image data. The second characteristic point is a characteristic point of the infrared image in the infrared image data.
Step 103: and obtaining a first image coordinate transformation parameter according to the first characteristic point and a second image coordinate transformation parameter according to the second characteristic point by adopting a characteristic point registration algorithm.
Step 104: registering the cropped visible light image with the first image coordinate transformation parameters to obtain a first registered image, and registering the infrared image with the second image coordinate transformation parameters to obtain a second registered image. For example, the visible light image is first cropped to the resolution of the acquired infrared image. Representative parts of each image are extracted as feature points; a registration algorithm based on feature-point matching computes similarities to find matched feature-point pairs; the image coordinate transformation parameters are obtained from the matched pairs; and finally the cropped visible light image and the infrared image are registered with these coordinate transformation parameters, giving pixel-registered visible light and infrared images.
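Steps 103-104 can be sketched as follows. The patent names neither the feature detector nor the transform model, so this minimal example assumes the feature-point pairs are already matched and estimates a 2x3 affine transform (one common choice of "image coordinate transformation parameters") by least squares:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine transform from matched feature-point pairs.
    src_pts, dst_pts: (N, 2) arrays of corresponding points, N >= 3."""
    n = src_pts.shape[0]
    A = np.zeros((2 * n, 6))
    b = np.zeros(2 * n)
    A[0::2, 0:2] = src_pts   # row 2i:   [x, y, 1, 0, 0, 0] -> dst x
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src_pts   # row 2i+1: [0, 0, 0, x, y, 1] -> dst y
    A[1::2, 5] = 1.0
    b[0::2] = dst_pts[:, 0]
    b[1::2] = dst_pts[:, 1]
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)

def apply_affine(M, pts):
    """Map (N, 2) points through the 2x3 affine matrix M."""
    return pts @ M[:, :2].T + M[:, 2]
```

In practice the matched pairs would come from a detector/matcher (e.g. an OpenCV pipeline), and an outlier-robust estimator would replace plain least squares.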
Step 105: and performing image fusion on the first registration image and the second registration image by using a CPCT fusion algorithm based on the reference template to obtain a patrol shot image.
The specific implementation process of step 105 is as follows:
The mean and variance of the reference image are analyzed, and the CPCT transfer method is applied to the first registered image and the second registered image.
Further, the preprocessed first registered image and the preprocessed second registered image are obtained, and the pre-fused image S to be processed is obtained through the following formulas:
P_S(Y) = (P_Vis(Y) + P_IR(Y)) / 2,

P_S(U) = P_Vis(Y) − P_IR(Y),

P_S(V) = P_IR(Y) − P_Vis(Y),

where P_Vis is the visible light image and P_IR is the infrared image.
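The three pre-fusion formulas can be transcribed directly into NumPy (the luminance channels P_Vis(Y) and P_IR(Y) are assumed to be float arrays of identical shape):

```python
import numpy as np

def prefuse(vis_y, ir_y):
    """Build the pre-fused image S from the luminance channels of the
    registered visible (P_Vis) and infrared (P_IR) images."""
    s_y = (vis_y + ir_y) / 2.0   # P_S(Y) = (P_Vis(Y) + P_IR(Y)) / 2
    s_u = vis_y - ir_y           # P_S(U) = P_Vis(Y) - P_IR(Y)
    s_v = ir_y - vis_y           # P_S(V) = P_IR(Y) - P_Vis(Y)
    return s_y, s_u, s_v
```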
Further, attention here is mainly focused on the YUV color space of the first and second registered images. The conversion between the YUV color space and the RGB color space is linear, so the computational cost of the color conversion is greatly reduced. In the YUV color space, luminance is independent of chrominance, i.e., scene detail is independent of color. An exemplary process of applying the CPCT algorithm in the YUV color space is shown in fig. 5.
Here IR is the preprocessed second registered image and Vis is the preprocessed first registered image. The "Y" in YUV represents luminance, i.e., the gray-scale value, while "U" and "V" represent chrominance and saturation and specify the color of a pixel. RGB is an industry color standard in which various colors are obtained by varying the three primary color channels red (R), green (G) and blue (B), also called the three primary color light channels, and superimposing them on one another.
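The linearity of the YUV/RGB conversion can be illustrated with the common BT.601 coefficient matrix (the patent only states that the conversion is linear; the specific coefficients below are an assumption):

```python
import numpy as np

# BT.601 RGB -> YUV coefficient matrix (one common linear conversion)
RGB2YUV = np.array([
    [ 0.299,    0.587,    0.114  ],
    [-0.14713, -0.28886,  0.436  ],
    [ 0.615,   -0.51499, -0.10001],
])

def rgb_to_yuv(rgb):
    """Linear conversion RGB -> YUV; rgb has shape (..., 3)."""
    return rgb @ RGB2YUV.T

def yuv_to_rgb(yuv):
    """Inverse linear conversion YUV -> RGB."""
    return yuv @ np.linalg.inv(RGB2YUV).T
```

Because both directions are single matrix multiplications, converting between the two spaces adds very little cost to the fusion pipeline.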
Unlike other color-transfer image fusion algorithms, the CPCT fusion algorithm is performed on the basis of color-transfer process analysis of a reference image.
The specific color transfer formula is as follows:

P = (σ_t / σ_s) · (P_S − μ_s) + μ_t,

where P denotes the YUV three-channel values of the fused image, μ_t is the mean of the reference image, μ_s is the mean of the image S to be processed, P_S denotes the YUV three-channel values of the image S to be processed, σ_s is the variance of the image S to be processed and σ_t is the variance of the reference image; the subscript s refers to the image S to be processed and the subscript t to the reference image.
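A per-channel sketch of this transfer (following the document's convention that σ denotes the variance, so the scale factor is the square root of the variance ratio):

```python
import numpy as np

def cpct_channel(channel, mu_t, var_t):
    """Replace the channel's mean and variance with the constant reference
    parameters mu_t (mean) and var_t (variance), per the formula above."""
    mu_s = channel.mean()
    var_s = channel.var()
    return np.sqrt(var_t / var_s) * (channel - mu_s) + mu_t
```

Applying this to each of the Y, U and V channels of the pre-fused image S, with constant reference parameters, is what makes the method "constant parameter" color transfer.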
Further, one or two of μ_Y, μ_U, μ_V, σ_Y, σ_U and σ_V are changed while the others are held constant, and the influence of each parameter on the fusion result is observed. From this analysis, the rules for the means and variances of the three channels Y, U and V are concluded as follows.
1) When μ_Y is 80-110, the brightness of the fused image is good; when σ_Y is 800-1200, the boundary information of the fused image is well preserved.
2) μ_U and μ_V determine the color of the fused image. When |μ_U| and |μ_V| are less than 6, the color of the fused image is not rich enough; when |μ_U| or |μ_V| is greater than 20, the naturalness of the fused image is low. To ensure the naturalness of the image, ||μ_U| − |μ_V|| < 8; as |μ_U| and |μ_V| increase, ||μ_U| − |μ_V|| should be reduced.
3) When σ_U and σ_V are 600-1000, the color and naturalness of the fused image are guaranteed. When σ_U > σ_V the fused image is greenish, and otherwise reddish. To ensure naturalness, |σ_U − σ_V| < 400.
In summary, when μ_Y is taken as 80-110 the brightness of the fused image is good, and when σ_Y is taken as 800-1200 the boundary information of the fused image is well preserved.
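The empirical rules 1)-3) can be collected into a simple validity check (a hypothetical helper; the thresholds are exactly those stated above):

```python
def cpct_params_natural(mu_y, mu_u, mu_v, var_y, var_u, var_v):
    """True if the constant CPCT reference parameters satisfy the empirical
    brightness, colorfulness and naturalness rules summarized above."""
    ok = 80 <= mu_y <= 110                                 # rule 1: brightness
    ok &= 800 <= var_y <= 1200                             # rule 1: boundary info
    ok &= 6 < abs(mu_u) <= 20 and 6 < abs(mu_v) <= 20      # rule 2: colorfulness
    ok &= abs(abs(mu_u) - abs(mu_v)) < 8                   # rule 2: naturalness
    ok &= 600 <= var_u <= 1000 and 600 <= var_v <= 1000    # rule 3: color range
    ok &= abs(var_u - var_v) < 400                         # rule 3: color cast
    return bool(ok)
```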
Color transfer actually replaces the mean and variance of the image S to be processed with those of the reference image, so that the color distribution of the processed image resembles that of the reference image; the final fusion result is then obtained and stored. An example of the fused image is shown in fig. 6.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the invention; meanwhile, a person skilled in the art may, following the idea of the present invention, modify the specific embodiments and the application range. In summary, the content of this specification should not be construed as limiting the invention.
Claims (10)
1. An unmanned aerial vehicle inspection apparatus, characterized by comprising: a double-light temperature measurement camera, a holder, an unmanned aerial vehicle body and an optical system;
the double-light temperature measurement camera is fixedly connected with the holder; the holder is provided with a holder interface position; the holder interface is mechanically mounted on the unmanned aerial vehicle body; the optical system is mounted on the double-light temperature measurement camera and is electrically connected with the double-light temperature measurement camera.
2. The unmanned aerial vehicle inspection apparatus of claim 1, wherein the double-light temperature measurement camera comprises: a housing, a data acquisition module, a standard processing module and an image fusion module;
the shell is fixedly connected with the holder; the shell is provided with a hole site for mounting the optical system; the data acquisition module is electrically connected with the optical system and the standard processing module respectively; the standard processing module is electrically connected with the image fusion module;
the data acquisition module is used for acquiring optical data in the optical system; the optical data comprises visible light image data and infrared image data;
the standard processing module is used for preprocessing the optical data; the preprocessing comprises: image cropping and image registration;
the image fusion module is used for fusing the preprocessed optical data by adopting a CPCT fusion algorithm to obtain a fusion image; and the fused image is the patrol inspection shooting image.
3. The unmanned aerial vehicle inspection apparatus of claim 2, wherein the housing includes: the first sub-shell, the second sub-shell and the third sub-shell;
the first sub-shell is mounted at one end of the second sub-shell; the third sub-shell is arranged at the other end of the second sub-shell; the first sub-shell is fixedly connected with the holder; the first sub-shell, the second sub-shell and the third sub-shell form an accommodating space; the data acquisition module, the standard processing module and the image fusion module are all arranged in the accommodating space;
and the third sub-shell is provided with a first hole site, a second hole site and a third hole site.
4. The unmanned aerial vehicle inspection apparatus of claim 3, wherein the optical system includes: a visible light imager and an infrared imager;
the visible light imager and the infrared imager are both electrically connected with the data acquisition module.
5. The unmanned aerial vehicle inspection apparatus of claim 4, wherein the visible light imager includes: the visible light imaging device comprises a visible light lens and a visible light imaging circuit;
the visible light lens is fixedly arranged in the third hole; the visible light lens is electrically connected with the visible light imaging circuit; the visible light imaging circuit is electrically connected with the data acquisition module.
6. The unmanned aerial vehicle inspection apparatus according to claim 4, wherein the infrared imager includes an infrared lens, a chopper, an infrared detector and an infrared imaging circuit electrically connected in sequence;
the infrared lens is arranged in the second hole site; the infrared imaging circuit is electrically connected with the data acquisition module.
7. The unmanned aerial vehicle inspection apparatus of claim 6, wherein the chopper is a plug-in chopper.
8. An unmanned aerial vehicle inspection apparatus according to claim 7, wherein the pluggable chopper includes: a chopper base and an optical filter;
the chopper base is of a clip type structure and is used for clamping the optical filter.
9. The unmanned aerial vehicle inspection apparatus of claim 3, further comprising an LED fill light;
the LED light supplement lamp is installed in the first hole site.
10. An image fusion method is characterized by being applied to the unmanned aerial vehicle inspection equipment according to any one of claims 1 to 9; the image fusion method comprises the following steps:
acquiring optical data; the optical data includes: visible light image data and infrared image data;
cropping a visible light image in the visible light image data with the resolution of the infrared image in the infrared image data as a reference;
extracting a first characteristic point and a second characteristic point; the first characteristic point is a characteristic point of a visible light image in the visible light image data; the second characteristic points are characteristic points of the infrared image in the infrared image data;
obtaining a first image coordinate transformation parameter according to the first characteristic point by adopting a characteristic point registration algorithm, and obtaining a second image coordinate transformation parameter according to the second characteristic point;
registering the cut visible light image by adopting the first image coordinate transformation parameter to obtain a first registered image, and registering the infrared image by adopting a second image coordinate transformation parameter to obtain a second registered image;
and performing image fusion on the first registration image and the second registration image by using a CPCT fusion algorithm based on a reference template to obtain a patrol shot image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210207347.8A CN114596506A (en) | 2022-03-04 | 2022-03-04 | Unmanned aerial vehicle inspection equipment and image fusion method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114596506A true CN114596506A (en) | 2022-06-07 |
Family
ID=81807625
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115914792A (en) * | 2022-12-22 | 2023-04-04 | 长春理工大学 | Real-time multidimensional imaging self-adaptive adjustment system and method based on deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||