CN112950520A - Multi-frame image fusion method and device for removing motion smear - Google Patents

Multi-frame image fusion method and device for removing motion smear

Info

Publication number
CN112950520A
Authority
CN
China
Prior art keywords
image
images
noise
filtering
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110456615.5A
Other languages
Chinese (zh)
Inventor
吉贝贝
王冠
王林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Hailichuang Technology Co ltd
Original Assignee
Shanghai Hailichuang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Hailichuang Technology Co ltd filed Critical Shanghai Hailichuang Technology Co ltd
Priority to CN202110456615.5A
Publication of CN112950520A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/215 Motion-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Picture Signal Circuits (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a multi-frame image fusion method and device for removing motion smear. In one embodiment, the method comprises the following steps: acquiring multiple groups of images of the same scene at different moments; selecting one image from the multiple groups of images as a reference map; filtering each image in the multiple groups of images by mean filtering to obtain a filtered smooth image of each image; calculating, using the filtered smooth images, a mutual noise estimation value between each image other than the reference map and the reference map; performing edge detection on the mutual noise estimation values and removing the detected edges; and calculating a noise reduction composite image of the multiple groups of images according to the mutual noise estimation values with the edges removed.

Description

Multi-frame image fusion method and device for removing motion smear
Technical Field
The invention belongs to the technical field of image processing, and relates to a multi-frame image fusion method and a device for removing motion smear.
Background
Noise is one of the important criteria for evaluating picture quality. In practice, a digital image can hardly avoid being affected by the imaging equipment and the external environment during digitization and transmission, which introduces a large amount of noise, so noise reduction algorithms are particularly important.
Conventional noise reduction algorithms mainly include spatial filtering, sparse representation, and the like. These algorithms usually process a single picture, so achieving a good noise reduction effect generally requires high computational and spatial complexity, as in bilateral filtering and NLmeans. In practical applications, however, both the effect and the computational performance of the algorithm are often demanding. Multi-frame stacking is a noise reduction method widely used in practice: several pictures are shot and superimposed, and the random nature of noise is exploited to suppress it; experiments show that multi-frame stacking controls noise simply and effectively. However, when an object in the multi-frame photos is moving, direct superposition introduces motion smear, which degrades the photo.
At present, two motion-handling approaches are commonly used. One first detects the motion region and then processes it separately instead of superimposing it directly, which increases the computational complexity of the algorithm and makes the noise in the motion region difficult to handle. The other follows the idea of NLmeans: using the redundant information of similar content across frames, pixels similar to the current pixel are found at corresponding positions in the multiple frames and averaged with weights, which yields a cleaner image, but the algorithmic complexity is extremely high, so this approach is currently impractical for real engineering use.
Disclosure of Invention
The object of the invention is to provide a multi-frame image fusion method and device for removing motion smear, which eliminate random noise effectively while avoiding superimposed smear in motion regions.
An embodiment of the application discloses a multi-frame image fusion method for removing motion smear, which comprises the following steps:
acquiring a plurality of groups of images of the same scene at different moments;
selecting one image from the multiple groups of images as a reference image;
filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image;
calculating a mutual noise estimation value between each image other than the reference map and the reference map;
performing edge detection on the mutual noise estimation value and removing the detected edge;
and calculating the noise reduction composite image of the plurality of groups of images according to the mutual noise estimation value of the removed edge.
Preferably, in the step of acquiring a plurality of sets of images of the same scene at different times, at least three images of the same scene at different times are acquired.
Preferably, the reference map is a first image of the plurality of sets of images.
Preferably, the step of filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image further includes:
selecting the size of a filtering window according to the noise level, and calculating the filtered smooth image X̂_i = meanfilter(X_i), where X_i is the i-th image of the n images in the multiple groups of images and X̂_i is its filtered smooth image.
Preferably, the filter window size is 5.
Preferably, the step of calculating a mutual noise estimation value between each image other than the reference map and the reference map further includes:
calculating the mutual noise estimation value as (X_i - X̂_i) - (X_ref - X̂_ref), where X_i is the i-th image, X̂_i is the filtered smooth image of the i-th image, X_i - X̂_i is the noise estimate of the i-th image, X_ref is the reference map, X̂_ref is the filtered smooth image of the reference map, and X_ref - X̂_ref is the noise estimate of the reference map.
Preferably, the step of performing edge detection on the mutual noise estimation values and removing the detected edges further includes:
setting a positive edge threshold and a negative edge threshold, and detecting, in the mutual noise estimation values, the regions whose pixel values are greater than the positive edge threshold and the regions whose pixel values are less than the negative edge threshold, respectively;
and setting the pixel values of the detected regions to 0.
Preferably, the step of calculating a noise reduction composite map of the plurality of sets of images based on the mutual noise estimation values with the edges removed further includes:
calculating the noise reduction composite map of the plurality of sets of images according to
X = X_ref + (1/n) Σ_{i≠ref} f( (X_i - X̂_i) - (X_ref - X̂_ref) ),
where X_i is the i-th image, X̂_i is the filtered smooth image of the i-th image, X_i - X̂_i is the noise estimate of the i-th image, X_ref is the reference map, X̂_ref is the filtered smooth image of the reference map, X_ref - X̂_ref is the noise estimate of the reference map, ref is the index of the reference map, and f denotes the edge removal of the preceding step.
The present application discloses in another embodiment a multi-frame image fusion apparatus for removing motion smear, including:
the image acquisition module is used for acquiring a plurality of groups of images of the same scene at different moments;
the reference selecting module is used for selecting one image from the multiple groups of images as a reference image;
the filtering module is used for filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image;
the mutual noise estimation module is used for calculating the mutual noise estimation value of each image except the reference image and the reference image;
an edge removal module, configured to perform edge detection on the inter-noise estimation value and remove a detected edge;
and the synthesis module is used for calculating the noise reduction synthesis images of the plurality of groups of images according to the mutual noise estimation values of the removed edges.
The present application also discloses a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the steps in the method as described hereinbefore.
Compared with the prior art, the method has the following beneficial effects:
in the method, noise estimation is performed on the multi-frame images and the multi-frame noise estimates are superimposed, so that random noise is effectively eliminated while superposition smear in motion areas is avoided; the method is reasonable and efficient enough to be used in engineering practice.
Drawings
FIG. 1 is a flow chart of a multi-frame image fusion method for removing motion smear according to an embodiment of the present invention.
Fig. 2-4 are schematic diagrams of three images acquired in one embodiment of the invention.
Fig. 5 is a diagram illustrating normalized cross-noise estimates in accordance with an embodiment of the present invention.
Fig. 6 is a diagram illustrating cross-noise estimation values for edge processing according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating multi-frame image fusion with motion blur removed, in accordance with an embodiment of the present invention.
Fig. 8 is a schematic diagram of multi-frame image fusion with motion smear removal for edge processing according to another embodiment of the present invention.
Fig. 9 is a schematic diagram of directly performing multi-frame image fusion in the prior art.
Detailed Description
In the following description, numerous technical details are set forth in order to provide a better understanding of the present application. However, it will be understood by those skilled in the art that the technical solutions claimed in the present application can be implemented without these technical details and with various changes and modifications based on the following embodiments.
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
An embodiment of the present application discloses a multi-frame image fusion method for removing motion smear, and fig. 1 shows a flowchart of the multi-frame image fusion method for removing motion smear in an embodiment, where the method includes:
step 101, acquiring multiple groups of images of the same scene at different times. In an embodiment, in the step of acquiring a plurality of sets of images of the same scene at different times, at least three images of the same scene at different times are acquired.
Step 102, selecting one image from the multiple groups of images as a reference map. In one embodiment, the reference map is the first image of the multiple groups of images.
Step 103, filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image.
In an embodiment, this step further includes: selecting the size of the filtering window according to the noise level, and calculating the filtered smooth image X̂_i = meanfilter(X_i), where X_i is the i-th image of the n images in the multiple groups of images. In one embodiment, the filtering window size is 5.
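As an illustration only, a minimal sketch of this filtering step in Python (assuming OpenCV; the function name filtered_smooth_map and the default window are illustrative choices, not part of the invention):

import cv2
import numpy as np

def filtered_smooth_map(image, window=5):
    # Mean (box) filtering; the window size is chosen according to the
    # noise level, and the experiments described below use a 5x5 window.
    return cv2.blur(image.astype(np.float32), (window, window))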
Step 104, calculating the mutual noise estimation value between each image other than the reference map and the reference map, and calculating a noise reduction composite image from these estimates.
In an embodiment, the step of performing mutual noise estimation between the multiple groups of images and the reference map and calculating a noise reduction composite image further includes:
calculating the mutual noise estimation value as (X_i - X̂_i) - (X_ref - X̂_ref), where X_i is the i-th image, X̂_i is the filtered smooth image of the i-th image, X_i - X̂_i is the noise estimate of the i-th image, X_ref is the reference map, X̂_ref is the filtered smooth image of the reference map, and X_ref - X̂_ref is the noise estimate of the reference map; and
calculating the noise reduction composite map of the multiple groups of images according to X = X_ref + (1/n) Σ_{i≠ref} [ (X_i - X̂_i) - (X_ref - X̂_ref) ], where ref is the index of the reference map.
Step 105, performing edge detection on the mutual noise estimation values and removing the detected edges.
In an embodiment, the step of performing edge detection on the mutual noise estimation values and removing the detected edges further includes:
setting a positive edge threshold and a negative edge threshold, and detecting, in the mutual noise estimation values, the regions whose pixel values are greater than the positive edge threshold and the regions whose pixel values are less than the negative edge threshold, respectively;
and setting the pixel values of the detected regions to 0.
Step 106, calculating the noise reduction composite map of the multiple groups of images according to the mutual noise estimation values with the edges removed.
In one embodiment, this step further includes:
calculating the noise reduction composite map of the multiple groups of images according to X = X_ref + (1/n) Σ_{i≠ref} f( (X_i - X̂_i) - (X_ref - X̂_ref) ), where f denotes the edge removal of step 105.
The method of the present application is described in detail below with a specific embodiment; it should be understood that the present application is not limited thereto.
(1) Multi-frame stacking with motion smear removed
Suppose that there are n noisy pictures X_1, X_2, X_3, ..., X_n. Noise reduction using the classical stacking method is as follows:
X = (1/n) Σ_{i=1}^{n} X_i    (1)
wherein X is the picture after noise reduction. Since noise reduction using equation (1) inevitably produces motion smear, we rewrite equation (1) in the form:
X = X_1 + (1/n) Σ_{i=2}^{n} (X_i - X_1)    (2)
wherein X_1 is the noise reduction reference map we select. Equations (1) and (2) are fully equivalent. The multi-frame stacking method of the present application is derived based on formula (2), as detailed below.
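As a quick check of this equivalence (the algebra is added here for clarity and is not part of the original text):

X_1 + \frac{1}{n}\sum_{i=2}^{n}\bigl(X_i - X_1\bigr)
  = X_1 + \frac{1}{n}\sum_{i=2}^{n} X_i - \frac{n-1}{n}X_1
  = \frac{1}{n}\sum_{i=1}^{n} X_i = X .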
In formula (2), the first term X_1 is the chosen noise reduction reference map, and the second term is the noise estimate of the remaining images relative to the reference map.
We break the second term of equation (2) apart, i.e., consider X_2 - X_1, X_3 - X_1, ..., X_n - X_1. The images X_2, X_3, ..., X_n and the reference picture X_1 are images of the same scene at different times. Ideally, assuming there is no moving object in the scene, that all static objects are completely aligned, and (as is usual) that the image noise is additive, X_i - X_1 can be regarded as the difference between the noises of the two images (the mutual noise), i.e.
X_n - X_1 = (X + n_n) - (X + n_1) = n_n - n_1    (3)
wherein X is the original image without noise, n_n is the noise of image X_n, and n_1 is the noise of image X_1. For the same reason,
X_i - X_1 = n_i - n_1,  i = 2, 3, ..., n    (4)
We therefore turn the original problem into a noise estimation problem. A method of noise estimation is as follows:
n̂_i = X_i - X̂_i    (5)
wherein X̂_i is an estimate of the clean image X; we obtain X̂_i by mean filtering X_i. Equation (2) can then finally be written as
X = X_1 + (1/n) Σ_{i=2}^{n} [ (X_i - X̂_i) - (X_1 - X̂_1) ]    (6)
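For illustration, a minimal NumPy/OpenCV sketch of equations (5) and (6), assuming the frames are already aligned and the first frame is taken as the reference map X_1; the function and variable names are illustrative only:

import cv2
import numpy as np

def fuse_without_motion_smear(frames, window=5):
    # frames: list of n aligned noisy images of the same scene
    n = len(frames)
    ref = frames[0].astype(np.float32)                       # reference map X_1
    smooth = [cv2.blur(f.astype(np.float32), (window, window)) for f in frames]
    noise_ref = ref - smooth[0]                              # equation (5): noise estimate of X_1
    acc = np.zeros_like(ref)
    for i in range(1, n):
        noise_i = frames[i].astype(np.float32) - smooth[i]   # noise estimate of X_i
        acc += noise_i - noise_ref                           # mutual noise estimate
    return ref + acc / n                                     # equation (6)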
The method is illustrated below with a specific example, which comprises the following steps:
Step 1: suppose there are n noisy pictures X_1, X_2, X_3, ..., X_n, for example the 3 images of fig. 2 to fig. 4. They are processed according to the algorithm described above as follows.
Step 2: select a reference map X_1. The noise reduction reference picture is generally selected according to the sharpness of the pictures; alternatively, X_1 can simply be set as the reference map.
Step 3: mean-filter each picture, calculating X̂_i = meanfilter(X_i) for i = 1, 2, ..., n. The size of the filtering window can be selected according to the noise level; the experimental data of the invention use a window size of 5.
Step 4: calculate the cross-noise (mutual noise) estimates
(X_i - X̂_i) - (X_1 - X̂_1),  i = 2, 3, ..., n.
Step 5: from the mutual noise estimates, the noise reduction composite map is calculated by the following formula, giving fig. 7:
X = X_1 + (1/n) Σ_{i=2}^{n} [ (X_i - X̂_i) - (X_1 - X̂_1) ].
Because direct stacking of the original images is avoided in this embodiment, the influence of moving objects is removed, and multi-frame noise reduction without motion smear is realized.
(2) Edge-processed multi-frame stacking with motion smear removed
The analysis above expresses multi-frame noise reduction by the new formula (6), which effectively removes motion ghosting. However, the noise estimation of formula (5) is only an approximation: because simple mean filtering is used, the noise estimate is not accurate, and some image edges appear in the noise estimation result. This is not pure noise and degrades the final effect of the algorithm, so the edges should be removed as far as possible.
Since the images are assumed to be substantially aligned except for moving objects (the basic assumption of multi-frame noise reduction), the edges of stationary objects cancel out in the mutual noise n_i - n_1 of equation (3); the relative edges left by moving objects are the part that degrades the result of the algorithm, so edge processing must be applied to the result of equation (3).
Analysis shows that the residual edge portions are mostly distributed in continuous patches whose pixel values are either brighter (positive edges) or darker (negative edges) than their surroundings. We therefore filter out and eliminate these edge effects by simple connected-domain detection:
a positive edge threshold u1 and a negative edge threshold u2 are set, the connected regions with pixel values greater than u1 and the connected regions with pixel values less than u2 are detected respectively, and the pixel values meeting these conditions are set to 0, thereby removing the residual edges.
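A sketch of this edge-removal step (assuming SciPy for connected-domain labelling; the thresholds u1, u2 and the optional minimum patch size are illustrative parameters — the text only specifies thresholding and zeroing of the detected regions):

import numpy as np
from scipy import ndimage

def remove_edge_residue(mutual_noise, u1, u2, min_size=0):
    # Detect connected regions whose values are greater than the positive
    # edge threshold u1 or less than the negative edge threshold u2, and
    # set them to 0; min_size can be used to ignore tiny isolated regions
    # (an assumption, not specified in the text).
    out = mutual_noise.copy()
    for mask in (mutual_noise > u1, mutual_noise < u2):
        labels, num = ndimage.label(mask)
        for k in range(1, num + 1):
            region = labels == k
            if region.sum() > min_size:
                out[region] = 0.0
    return out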
Specifically, the method comprises the following steps:
step 6, detecting and removing the residual edge of the mutual noise.
FIG. 5 shows the results of equation (3) for one of the images in the experimental data of the present invention, normalized to 0-255 showing the mutual noise, and it can be seen that there is a significant relative edge residual in the moving part. After the edge processing procedure in the second step, fig. 6 is obtained. It can be seen that the noise margin has been substantially removed. The residual edge trace of the moving object on the noise reduction result graph can be eliminated in the process, and therefore the quality of the noise reduction result graph is effectively improved.
Step 7: from the edge-removed mutual noise estimates, the noise reduction composite map is calculated by the following formula, giving fig. 8:
X = X_1 + (1/n) Σ_{i=2}^{n} f( (X_i - X̂_i) - (X_1 - X̂_1) ),
wherein f(·) is the edge removal function described above.
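Combining the two sketches above, step 7 can be written as follows (illustrative only; remove_edge_residue is the hypothetical helper from the previous sketch):

import cv2
import numpy as np

def fuse_edge_processed(frames, u1, u2, window=5):
    # Same fusion as before, but each mutual noise estimate passes through
    # the edge removal function f before being accumulated.
    n = len(frames)
    ref = frames[0].astype(np.float32)
    smooth = [cv2.blur(f.astype(np.float32), (window, window)) for f in frames]
    noise_ref = ref - smooth[0]
    acc = np.zeros_like(ref)
    for i in range(1, n):
        mutual = (frames[i].astype(np.float32) - smooth[i]) - noise_ref
        acc += remove_edge_residue(mutual, u1, u2)   # f(...) in the formula above
    return ref + acc / n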
Fig. 9 shows the result of direct multi-frame superposition. In contrast, the motion blur of direct multi-frame stacking is the most serious, and the fusion method based on noise estimation provided by the invention is obviously superior to the traditional multi-frame stacking in motion processing.
The noise-estimation-based noise reduction algorithm thus uses the information of multiple images for noise processing while avoiding the influence of moving targets, and therefore produces a high-quality noise-reduced image.
The present application discloses in another embodiment a multi-frame image fusion apparatus for removing motion smear, which comprises an image acquisition module, a reference selection module, a filtering module, a mutual noise calculation module, an edge removal module and a synthesis module, wherein:
the image acquisition module is used for acquiring a plurality of groups of images of the same scene at different moments;
the reference selecting module is used for selecting one image from the multiple groups of images as a reference image;
the filtering module is used for filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtering smooth image of each image;
the mutual noise calculation module is used for calculating a mutual noise estimation value of each image except the reference image and the reference image;
the edge removal module is used for carrying out edge detection on the mutual noise estimation value and removing the detected edge;
and the synthesis module is used for calculating the noise reduction synthesis images of the plurality of groups of images according to the mutual noise estimation values of the removed edges.
The present embodiment is an apparatus embodiment corresponding to the method of the first embodiment; the relevant details of that embodiment are incorporated herein by reference.
Accordingly, other embodiments of the present application may also provide a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the method embodiments of the present application. Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
It is noted that, in the present patent application, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.

Claims (10)

1. A multi-frame image fusion method for removing motion smear is characterized by comprising the following steps:
acquiring a plurality of groups of images of the same scene at different moments;
selecting one image from the multiple groups of images as a reference image;
filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image;
calculating a mutual noise estimation value between each image other than the reference map and the reference map;
performing edge detection on the mutual noise estimation value and removing the detected edge;
and calculating the noise reduction composite image of the plurality of groups of images according to the mutual noise estimation value of the removed edge.
2. The multi-frame image fusion method for removing motion smear according to claim 1, wherein in the step of acquiring multiple groups of images of the same scene at different times, at least three images of the same scene at different times are acquired.
3. The multi-frame image fusion method for removing motion smear of claim 1 wherein said reference map is the first image of said plurality of sets of images.
4. The multi-frame image fusion method for removing motion smear according to claim 1, wherein the step of filtering each image in the plurality of sets of images by using mean filtering to obtain a filtered smooth image of each image further comprises:
selecting the size of a filtering window according to the noise level, and calculating the filtered smooth image X̂_i = meanfilter(X_i), where X_i is the i-th image of the n images in the plurality of sets of images and X̂_i is its filtered smooth image.
5. The multi-frame image fusion method for removing motion smear according to claim 4, wherein the filtering window size is 5.
6. The multi-frame image fusion method for removing motion smear according to claim 1, wherein the step of calculating the mutual noise estimation value between each image other than the reference map and the reference map further comprises:
calculating the mutual noise estimation value as (X_i - X̂_i) - (X_ref - X̂_ref), wherein X_i is the i-th image, X̂_i is the filtered smooth image of the i-th image, X_i - X̂_i is the noise estimate of the i-th image, X_ref is the reference map, X̂_ref is the filtered smooth image of the reference map, and X_ref - X̂_ref is the noise estimate of the reference map.
7. The multi-frame image fusion method for removing motion smear according to claim 1, wherein the step of performing edge detection on the mutual noise estimation values and removing the detected edges further comprises:
setting a positive edge threshold and a negative edge threshold, and detecting, in the mutual noise estimation values, the regions whose pixel values are greater than the positive edge threshold and the regions whose pixel values are less than the negative edge threshold, respectively;
and setting the pixel values of the detected regions to 0.
8. The multi-frame image fusion method for removing motion smear according to claim 1, wherein the step of calculating a noise reduction composite map of the plurality of sets of images based on the mutual noise estimation values with the edges removed further comprises:
calculating the noise reduction composite map of the plurality of sets of images according to
X = X_ref + (1/n) Σ_{i≠ref} f( (X_i - X̂_i) - (X_ref - X̂_ref) ),
wherein X_i is the i-th image, X̂_i is the filtered smooth image of the i-th image, X_i - X̂_i is the noise estimate of the i-th image, X_ref is the reference map, X̂_ref is the filtered smooth image of the reference map, X_ref - X̂_ref is the noise estimate of the reference map, ref is the index of the reference map, and f denotes the edge removal of the preceding step.
9. A multi-frame image fusion device for removing motion smear, comprising:
the image acquisition module is used for acquiring a plurality of groups of images of the same scene at different moments;
the reference selecting module is used for selecting one image from the multiple groups of images as a reference image;
the filtering module is used for filtering each image in the multiple groups of images by adopting mean filtering to obtain a filtered smooth image of each image;
the mutual noise estimation module is used for calculating the mutual noise estimation value of each image except the reference image and the reference image;
an edge removal module, configured to perform edge detection on the inter-noise estimation value and remove a detected edge;
and the synthesis module is used for calculating the noise reduction synthesis images of the plurality of groups of images according to the mutual noise estimation values of the removed edges.
10. A computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the steps in the method of any one of claims 1 to 8.
CN202110456615.5A 2021-04-27 2021-04-27 Multi-frame image fusion method and device for removing motion smear Pending CN112950520A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110456615.5A CN112950520A (en) 2021-04-27 2021-04-27 Multi-frame image fusion method and device for removing motion smear

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110456615.5A CN112950520A (en) 2021-04-27 2021-04-27 Multi-frame image fusion method and device for removing motion smear

Publications (1)

Publication Number Publication Date
CN112950520A true CN112950520A (en) 2021-06-11

Family

ID=76233509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110456615.5A Pending CN112950520A (en) 2021-04-27 2021-04-27 Multi-frame image fusion method and device for removing motion smear

Country Status (1)

Country Link
CN (1) CN112950520A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307274A (en) * 2011-08-31 2012-01-04 南京南自信息技术有限公司 Motion detection method based on edge detection and frame difference
CN110189285A (en) * 2019-05-28 2019-08-30 北京迈格威科技有限公司 A kind of frames fusion method and device
CN110599523A (en) * 2019-09-10 2019-12-20 江南大学 ViBe ghost suppression method fused with interframe difference method
WO2020097836A1 (en) * 2018-11-15 2020-05-22 深圳市欢太科技有限公司 Image processing method and apparatus, and computer device and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307274A (en) * 2011-08-31 2012-01-04 南京南自信息技术有限公司 Motion detection method based on edge detection and frame difference
WO2020097836A1 (en) * 2018-11-15 2020-05-22 深圳市欢太科技有限公司 Image processing method and apparatus, and computer device and storage medium
CN110189285A (en) * 2019-05-28 2019-08-30 北京迈格威科技有限公司 A kind of frames fusion method and device
CN110599523A (en) * 2019-09-10 2019-12-20 江南大学 ViBe ghost suppression method fused with interframe difference method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SAMUEL W. HASINOFF ET AL.: "Burst photography for high dynamic range and low-light imaging on mobile cameras", ACM TRANSACTIONS ON GRAPHICS, vol. 35, no. 6, 5 December 2016 (2016-12-05), pages 1-12, XP055559092, DOI: 10.1145/2980179.2980254 *

Similar Documents

Publication Publication Date Title
EP2819091B1 (en) Method and apparatus for processing a gray image
CN108090886B (en) High dynamic range infrared image display and detail enhancement method
EP2164040B1 (en) System and method for high quality image and video upscaling
US8107750B2 (en) Method of generating motion vectors of images of a video sequence
KR20150037369A (en) Method for decreasing noise of image and image processing apparatus using thereof
CN102256056A (en) Image processing apparatus, image processing method, and program
JP2015225665A (en) Image noise removal method and image noise removal device
WO2017100971A1 (en) Deblurring method and device for out-of-focus blurred image
CN111340732B (en) Low-illumination video image enhancement method and device
CN111402111B (en) Image blurring method, device, terminal and computer readable storage medium
WO2023273868A1 (en) Image denoising method and apparatus, terminal, and storage medium
CN112802020A (en) Infrared dim target detection method based on image inpainting and background estimation
CN113436112A (en) Image enhancement method, device and equipment
JP2020031422A (en) Image processing method and device
CN112907467B (en) Rainbow pattern removing method and device and electronic equipment
CN117830134A (en) Infrared image enhancement method and system based on mixed filtering decomposition and image fusion
CN105574823A (en) Deblurring method and device for out-of-focus blurred image
WO2024001538A1 (en) Scratch detection method and apparatus, electronic device, and readable storage medium
CN117408886A (en) Gas image enhancement method, gas image enhancement device, electronic device and storage medium
CN106027853B (en) Image processing apparatus, photographic device and image processing method
JP6738053B2 (en) Image processing apparatus for reducing staircase artifacts from image signals
CN112950520A (en) Multi-frame image fusion method and device for removing motion smear
CN114648469B (en) Video image denoising method, system, device and storage medium thereof
CN104168405B (en) Noise suppressing method and its image processing apparatus
US8077926B2 (en) Method of motion detection using adaptive threshold

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination