CN116363026A - Image fusion method for quickly finding target under complex environmental condition - Google Patents
- Publication number: CN116363026A (application CN202211673361.3A)
- Authority: CN (China)
- Prior art keywords: image, channel, light, infrared, micro
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T 5/50 — Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
- G06T 7/13 — Edge detection
- G06T 7/90 — Determination of colour characteristics
- G06T 2207/10048 — Infrared image
- G06T 2207/20221 — Image fusion; Image merging
Abstract
The invention relates to an image fusion method for quickly finding a target under complex environmental conditions. The method adopts a dual-channel detection design, the two channels being a low-light channel and an infrared detection channel, and comprises the following steps: on start-up, an image processing module receives the infrared-channel image and the low-light-channel image, extracts feature contours from the infrared image, and assigns them to the R channel of an OLED screen; the low-light image is enhanced and assigned to the G channel of the OLED screen, and the character information displayed by the system is assigned to the B channel; the fused result is then observed and adjusted. By mapping the low-light channel to the green channel of the OLED screen and the infrared channel to the red channel, the method realizes color-separation-driven image fusion and makes targets stand out against a complex background.
Description
Technical Field
The invention relates to the field of image processing, and in particular to an image fusion method for quickly finding a target under complex environmental conditions.
Background
Low-light night vision is currently the most widely applied night-vision technology. It has many advantages, but its most obvious weakness is that detection capability falls as illumination and contrast decrease, to the point where detection may become impossible. With the rapid development of infrared thermal imaging in recent years, its advantages over low-light night vision have become apparent, and it can effectively compensate for the shortcomings of low-light imaging. However, every detector has its own strengths and usage limitations; all-weather target detection and engagement in complex scenes necessarily requires combining multiple detection means so that their advantages complement one another. Multiband image fusion is therefore the inevitable direction of development and is becoming a routine means of upgrading electro-optical imaging detection systems. Color image fusion equipment has developed very rapidly in recent years; image fusion, in parallel with color night vision, is a key technology that must be developed with priority to maintain the lead in night-vision capability under conditions where both sides share the darkness.
At present, digital color fusion methods and optical image fusion methods based on image features are most commonly used. The color image fusion method performs multiband fusion by extracting image features from the thermal channel and the low-light channel, so that targets can be found and identified effectively without losing information from the observed scene.
Most color image fusion algorithms are either time-domain or frequency-domain fusion methods. During feature extraction and fusion processing, some information from each single-channel image is inevitably lost for certain scenes, so that useful target information is lost or targets are not clearly distinguishable when observing fused imagery against a complex background.
Disclosure of Invention
To solve these technical problems, the image fusion method for quickly finding a target under complex environmental conditions adopts a dual-channel detection design, the two channels being a low-light channel and an infrared detection channel, and comprises the following steps:
step S1: power-on; the uncooled infrared core assembly and the low-illumination CMOS sensor are powered up;
step S2: the image processing module receives the infrared-channel image and the low-light-channel image, extracts the feature contours of typical thermal targets in the infrared image using a Canny-based edge detection algorithm, and assigns the extracted feature-region targets to the R channel of the OLED screen;
step S3: the image processing module denoises and enhances the received low-light image using a 3×3 median filter, assigns the processed low-light image to the G channel of the OLED screen, and assigns the character information displayed by the system to the B channel;
step S4: an observer views the OLED image through the eyepiece and can select different fusion display modes from a menu to adjust the display proportions of the infrared and low-light images.
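The channel-assignment pipeline of steps S2 and S3 can be sketched as follows. This is a minimal illustration, not the patented implementation: a plain gradient-magnitude threshold stands in for the full Canny detector, the threshold value and the contrast-stretch enhancement are assumptions, and the function names are invented for the example.

```python
import numpy as np

def median3x3(img):
    """3x3 median filter (the step S3 denoising), built from numpy views.
    Borders are handled by edge padding."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    stack = np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    return np.median(stack, axis=0).astype(img.dtype)

def fuse_channels(ir_img, lowlight_img, text_mask=None):
    """Assemble the RGB display buffer: R = infrared feature contours
    (step S2), G = enhanced low-light image (step S3), B = system text."""
    # Step S2 stand-in: gradient-magnitude edges instead of full Canny;
    # the threshold 32 is an illustrative assumption.
    gy, gx = np.gradient(ir_img.astype(float))
    edges = ((np.hypot(gx, gy) > 32) * 255).astype(np.uint8)

    # Step S3: 3x3 median denoising, then a simple contrast stretch
    # as the enhancement step.
    den = median3x3(lowlight_img)
    lo, hi = int(den.min()), int(den.max())
    enhanced = ((den.astype(float) - lo) * 255.0 / max(hi - lo, 1)).astype(np.uint8)

    fused = np.zeros(ir_img.shape + (3,), dtype=np.uint8)
    fused[..., 0] = edges          # R channel: infrared contours
    fused[..., 1] = enhanced       # G channel: low-light image
    if text_mask is not None:
        fused[..., 2] = text_mask  # B channel: character overlay
    return fused
```

In a real device the edge stage would also filter contours by size or temperature so that only "typical thermal targets" reach the R channel; here every strong edge is passed through.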
In one embodiment of the invention, the infrared channel in steps S2 and S3 uses a 12 μm, 640×512 uncooled infrared core assembly for infrared detection, and the low-light channel uses an 18 μm, 800×600 low-illumination CMOS sensor for low-light detection.
In one embodiment of the invention, the OLED screen is a 0.61-inch high-brightness color display used for the fused image, and an eyepiece optical system is used for viewing the fused display in step S4.
In one embodiment of the invention, the image processing module can drive the OLED screen on separate RGB channels; it is also connected to a wireless transceiver module, keys, and an external interface, and is powered from a battery box.
In one embodiment of the invention, the battery box consists of four 18650 cells and a power board, the positive and negative terminals of the 18650 cells being held against the power board by fixing clips.
Compared with the prior art, the technical solution of the invention has the following advantages: the image fusion method maps the low-light channel to the green channel of the OLED screen and the infrared channel to the red channel, realizing color-separation-driven image fusion that highlights targets against a complex background. This improves the system's ability to quickly find and capture targets in complex environments such as jungle, desert, and snowfield.
Drawings
In order that the invention may be more readily understood, a more particular description of the invention will be rendered by reference to specific embodiments thereof that are illustrated in the appended drawings.
FIG. 1 is a schematic diagram of the principle of the image fusion method for quickly finding a target under complex environmental conditions;
fig. 2 is a schematic diagram of the system design of the image fusion method according to the present invention.
Detailed Description
As shown in fig. 1 and fig. 2, this embodiment provides an image fusion method for quickly finding a target under complex environmental conditions. The method adopts a dual-channel detection design, the two channels being a low-light channel and an infrared detection channel, and comprises the following steps:
step S1: power-on; the uncooled infrared core assembly and the low-illumination CMOS sensor are powered up;
step S2: the image processing module receives the infrared-channel image and the low-light-channel image, extracts the feature contours of typical thermal targets in the infrared image using a Canny-based edge detection algorithm, and assigns the extracted feature-region targets to the R channel of the OLED screen;
step S3: the image processing module denoises and enhances the received low-light image using a 3×3 median filter, assigns the processed low-light image to the G channel of the OLED screen, and assigns the character information displayed by the system to the B channel;
step S4: an observer views the OLED image through the eyepiece and can select different fusion display modes from a menu to adjust the display proportions of the infrared and low-light images.
The image fusion method includes an optical system design comprising a detection system and a display system. The detection system comprises a 12 μm, 640×512 uncooled infrared core assembly for the infrared channel and an 18 μm, 800×600 low-illumination CMOS sensor for the low-light channel; by combining multi-channel detection with a fused display, the user's environmental awareness is effectively improved.
The display system comprises the OLED screen and a fused-display module built around the image processing module; the OLED screen is a 0.61-inch high-brightness color display used for the fused image.
In step S4, an eyepiece optical system is used to view the fused display. The color-separated, channel-driven fusion method effectively highlights targets against a complex background, so that they can be found and captured quickly.
The image processing module can drive the OLED screen on separate RGB channels; it is also connected to a wireless transceiver module, keys, and an external interface, and is powered from the battery box. The battery box consists of four 18650 cells and a power board, the positive and negative terminals of the 18650 cells being held against the power board by fixing clips.
The principle of this embodiment is as follows.
as shown in fig. 2, a dual-channel detection design is adopted, namely a low-light-level channel and an infrared detection channel. The image processing module maps the low-light-level image and the infrared image to different pixels of the color OLED screen with the resolution of 800 multiplied by 600 of 0.61 inch according to different proportions, maps the infrared image to R pixels of the color OLED screen, maps the low-light-level image to a G channel of the color OLED screen, and forms a color fusion image in human eyes through an ocular. In order to realize the adaptability of the system to complex environments such as jungle, desert, snowfield and the like, the low-light-level image and the thermal image are mapped to different color channels according to different proportions, so that the adaptability of the system to various complex environments is effectively improved.
It is apparent that the above examples are given by way of illustration only and do not limit the embodiments. Other variations and modifications will be apparent to those of ordinary skill in the art in light of the foregoing description; it is neither necessary nor possible to enumerate all embodiments here, and obvious variations or modifications derived therefrom are intended to fall within the scope of the present invention.
Claims (5)
1. An image fusion method for quickly finding a target under complex environmental conditions, the method adopting a dual-channel detection design, the two channels being a low-light channel and an infrared detection channel, characterized by comprising the following steps:
step S1: power-on; the uncooled infrared core assembly and the low-illumination CMOS sensor are powered up;
step S2: the image processing module receives the infrared-channel image and the low-light-channel image, extracts the feature contours of typical thermal targets in the infrared image using a Canny-based edge detection algorithm, and assigns the extracted feature-region targets to the R channel of the OLED screen;
step S3: the image processing module denoises and enhances the received low-light image using a 3×3 median filter, assigns the processed low-light image to the G channel of the OLED screen, and assigns the character information displayed by the system to the B channel;
step S4: an observer views the OLED image through the eyepiece and can select different fusion display modes from a menu to adjust the display proportions of the infrared and low-light images.
2. The image fusion method of claim 1, wherein the infrared channel in steps S2 and S3 uses a 12 μm, 640×512 uncooled infrared core assembly for infrared detection, and the low-light channel uses an 18 μm, 800×600 low-illumination CMOS sensor for low-light detection.
3. The image fusion method of claim 1, wherein the OLED screen is a 0.61-inch high-brightness color display used for the fused image, and an eyepiece optical system is used for viewing the fused display in step S4.
4. The image fusion method of claim 1, wherein the image processing module can drive the OLED screen on separate RGB channels, is also connected to a wireless transceiver module, keys, and an external interface, and is powered from a battery box.
5. The image fusion method of claim 4, wherein the battery box consists of four 18650 cells and a power board, the positive and negative terminals of the 18650 cells being held against the power board by fixing clips.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202211673361.3A | 2022-12-26 | 2022-12-26 | Image fusion method for quickly finding target under complex environmental condition |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN116363026A | 2023-06-30 |
Family
ID=86905809

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202211673361.3A | Image fusion method for quickly finding target under complex environmental condition | 2022-12-26 | 2022-12-26 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN116363026A |

2022-12-26: application CN202211673361.3A filed; publication CN116363026A, status pending.
Legal Events

| Code | Title |
|---|---|
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |